Science.gov

Sample records for open radioactive sources

  1. Dynamic radioactive particle source

    DOEpatents

    Moore, Murray E.; Gauss, Adam Benjamin; Justus, Alan Lawrence

    2012-06-26

    A method and apparatus are provided for producing a timed, synchronized, dynamic alpha or beta particle source for testing the response of continuous air monitors (CAMs) for airborne alpha or beta emitters. The method includes providing a radioactive source, placing the radioactive source inside the detection volume of a CAM, and introducing an alpha- or beta-emitting isotope while the CAM is in a normal functioning mode.

  2. Sources of radioactive ions

    SciTech Connect

    Alonso, J.R.

    1985-05-01

    Beams of unstable nuclei can be formed by direct injection of the radioactive atoms into an ion source, or by using the momentum of the primary production beam as the basis for the secondary beam. The effectiveness of this latter mechanism in secondary beam formation, i.e., the quality of the emerging beam (emittance, intensity, energy spread), depends critically on the nuclear reaction kinematics, and on the magnitude of the incident beam energy. When this beam energy significantly exceeds the energies typical of the nuclear reaction process, many of the qualities of the incident beam can be passed on to the secondary beam. Factors affecting secondary beam quality are discussed, along with techniques for isolating and purifying a specific secondary product. The ongoing radioactive beam program at the Bevalac is used as an example, with applications, present performance and plans for improvements.

  3. PORTABLE SOURCE OF RADIOACTIVITY

    DOEpatents

    Goertz, R.C.; Ferguson, K.R.; Rylander, E.W.; Safranski, L.M.

    1959-06-16

    A portable source for radiography or radiotherapy is described. It consists of a Tl-170 or Co-60 source mounted in a rotatable tungsten alloy plug. The plug rotates within a brass body to positions of safety or exposure. Provision is made for reloading and carrying the device safely. (T.R.H.)

  4. Automatic Searching Radioactive Sources by Airborne Radioactive Survey Using Multicopter

    NASA Astrophysics Data System (ADS)

    Rim, H.; Eun, S. B.; Kim, K.; Park, S.; Jung, H. K.

    2015-12-01

    In order to prepare in advance for an emergency in which a dangerous radioactive source is lost, and to search for such a source automatically, we developed an airborne radiation survey system based on a multicopter. The system consists of a small, portable, customized BGO (bismuth germanate) detector, a video recording unit, a wireless link to the ground pilot, GPS, and additional equipment for automatic flight. The system can perform search flights along preprogrammed lines. The detection system was tested by searching for an intentionally hidden source; detection performance was demonstrated at very low flight altitude, although it depends on the strength of the source. The advantage of the multicopter, a type of UAV (unmanned aerial vehicle), is that its fully automatic search capability avoids the need for personnel to closely approach a dangerous radioactive source. In this paper, we introduce our multicopter system for detecting radioactive sources and present a synthetic case history demonstrating the system.

  5. RADIOACTIVE CONCENTRATOR AND RADIATION SOURCE

    DOEpatents

    Hatch, L.P.

    1959-12-29

    A method is presented for forming a permeable ion exchange bed using Montmorillonite clay to absorb and adsorb radioactive ions from liquid radioactive wastes. A paste is formed of clay, water, and a material that forms with clay a stable aggregate in the presence of water. The mixture is extruded into a volume of water to form clay rods. The rods may then be used to remove radioactive cations from liquid waste solutions. After use, the rods are removed from the solution and heated to a temperature of 750 to 1000 deg C to fix the radioactive cations in the clay.

  6. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  7. Creating Open Source Conversation

    ERIC Educational Resources Information Center

    Sheehan, Kate

    2009-01-01

    Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…

  8. Low-level radioactive waste from commercial nuclear reactors. Volume 3. Bibliographic abstracts of significant source documents. Part 1. Open-literature abstracts for low-level radioactive waste

    SciTech Connect

    Bowers, M.K.; Rodgers, B.R.; Jolley, R.L.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 3 of this series is a collection of abstracts of most of the reference documents used for this study. Because of the large volume of literature, the abstracts have been printed in two separate parts. Volume 3, part 1 presents abstracts of the open literature relating to LLRW treatment methodologies. Some of these references pertain to treatment processes for hazardous wastes that may also be applicable to LLRW management. All abstracts have been limited to 21 lines (for brevity), but each abstract contains sufficient information to enable the reader to determine the potential usefulness of the source document and to locate each article. The abstracts are arranged alphabetically by author or organization, and indexed by keyword.

  9. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io.

  10. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. PMID:27631126

  11. Radiation Safety of Sealed Radioactive Sources

    SciTech Connect

    Pryor, Kathryn H.

    2015-01-29

    Sealed radioactive sources are used in a wide variety of occupational settings and under differing regulatory/licensing structures. The definition of a sealed radioactive source varies between US regulatory authorities and standard-setting organizations. Potential problems with sealed sources cover a range of risks and impacts. The loss of control of high activity sealed sources can result in very high or even fatal doses to members of the public who come in contact with them. Sources that are not adequately sealed, and that fail, can cause spread of contamination and potential intake of radioactive material. There is also the possibility that sealed sources may be (or threatened to be) used for terrorist purposes and disruptive opportunities. Until fairly recently, generally-licensed sealed sources and devices received little, if any, regulatory oversight, and were often forgotten, lost or unaccounted for. Nonetheless, generally licensed devices can contain fairly significant quantities of radioactive material and there is some potential for exposure if a device is treated in a way for which it was never designed. Industrial radiographers use and handle high activity and/or high-dose rate sealed sources in the field with a high degree of independence and minimal regulatory oversight. Failure to follow operational procedures and properly handle radiography sources can and has resulted in serious injuries and death. Industrial radiographers have experienced a disproportionately large fraction of incidents that result in unintended exposure to radiation. Sources do not have to contain significant quantities of radioactive material to cause problems in the event of their failure. A loss of integrity can cause the spread of contamination and potential exposure to workers and members of the public. The NCRP has previously provided recommendations on select aspects of sealed source programs. Future efforts to provide recommendations for sealed source programs are discussed.

  12. Radiation safety of sealed radioactive sources.

    PubMed

    Pryor, Kathryn H

    2015-02-01

    Sealed radioactive sources are used in a wide variety of occupational settings and under differing regulatory/licensing structures. The definition of a sealed radioactive source varies between U.S. regulatory authorities and standard-setting organizations. Potential problems with sealed sources cover a range of risks and impacts. The loss of control of high activity sealed sources can result in very high or even fatal doses to members of the public who come in contact with them. Sources that are not adequately sealed and that fail can cause spread of contamination and potential intake of radioactive material. There is also the possibility that sealed sources may be (or threaten to be) used for terrorist purposes and disruptive opportunities. Until fairly recently, generally licensed sealed sources and devices received little, if any, regulatory oversight and were often forgotten, lost or unaccounted for. Nonetheless, generally licensed devices can contain fairly significant quantities of radioactive material, and there is some potential for exposure if a device is treated in a way for which it was never designed. Industrial radiographers use and handle high activity and/or high dose-rate sealed sources in the field with a high degree of independence and minimal regulatory oversight. Failure to follow operational procedures and properly handle radiography sources can and has resulted in serious injuries and death. Industrial radiographers have experienced a disproportionately large fraction of incidents that have resulted in unintended exposure to radiation. Sources do not have to contain significant quantities of radioactive material to cause problems in the event of their failure. A loss of integrity can cause the spread of contamination and potential exposure to workers and members of the public. The National Council on Radiation Protection and Measurements has previously provided recommendations on select aspects of sealed source programs. Future efforts to provide recommendations for sealed source programs are discussed.

  13. 10 CFR 835.1201 - Sealed radioactive source control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Sealed radioactive source control. 835.1201 Section 835.1201 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1201 Sealed radioactive source control. Sealed radioactive sources shall be used, handled,...

  14. 10 CFR 835.1201 - Sealed radioactive source control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Sealed radioactive source control. 835.1201 Section 835.1201 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1201 Sealed radioactive source control. Sealed radioactive sources shall be used, handled,...

  15. 10 CFR 835.1201 - Sealed radioactive source control.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Sealed radioactive source control. 835.1201 Section 835.1201 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1201 Sealed radioactive source control. Sealed radioactive sources shall be used, handled,...

  16. 10 CFR 835.1201 - Sealed radioactive source control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Sealed radioactive source control. 835.1201 Section 835.1201 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1201 Sealed radioactive source control. Sealed radioactive sources shall be used, handled,...

  17. 10 CFR 835.1201 - Sealed radioactive source control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Sealed radioactive source control. 835.1201 Section 835.1201 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1201 Sealed radioactive source control. Sealed radioactive sources shall be used, handled,...

  18. 10 CFR 835.1202 - Accountable sealed radioactive sources.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Accountable sealed radioactive sources. 835.1202 Section 835.1202 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1202 Accountable sealed radioactive sources. (a) Each accountable sealed radioactive...

  19. 10 CFR 835.1202 - Accountable sealed radioactive sources.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Accountable sealed radioactive sources. 835.1202 Section 835.1202 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1202 Accountable sealed radioactive sources. (a) Each accountable sealed radioactive...

  20. 10 CFR 835.1202 - Accountable sealed radioactive sources.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Accountable sealed radioactive sources. 835.1202 Section 835.1202 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1202 Accountable sealed radioactive sources. (a) Each accountable sealed radioactive...

  1. 10 CFR 835.1202 - Accountable sealed radioactive sources.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Accountable sealed radioactive sources. 835.1202 Section 835.1202 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1202 Accountable sealed radioactive sources. (a) Each accountable sealed radioactive...

  2. 10 CFR 835.1202 - Accountable sealed radioactive sources.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Accountable sealed radioactive sources. 835.1202 Section 835.1202 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Sealed Radioactive Source Control § 835.1202 Accountable sealed radioactive sources. (a) Each accountable sealed radioactive...

  3. A Sensitive Cloud Chamber without Radioactive Sources

    ERIC Educational Resources Information Center

    Zeze, Syoji; Itoh, Akio; Oyama, Ayu; Takahashi, Haruka

    2012-01-01

    We present a sensitive diffusion cloud chamber which does not require any radioactive sources. A major difference from commonly used chambers is the use of a heat sink as its bottom plate. The result of a performance test of the chamber is given. (Contains 8 figures.)

  4. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032
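
    The abstract above does not spell out the measurement math, but colorimeters of this kind generally convert a detector reading to absorbance and then to concentration through a calibration curve. The sketch below is our own illustration of that idea, not code or calibration data from the paper; the function names and constants are invented placeholders.

```python
# Illustrative only: Beer-Lambert absorbance from raw sensor intensities,
# followed by a linear calibration to chemical oxygen demand (COD).
# The calibration slope/intercept below are invented placeholders.
import math

def absorbance(sample_intensity: float, blank_intensity: float) -> float:
    """A = -log10(I / I0), with the blank (reagent, no sample) as I0."""
    return -math.log10(sample_intensity / blank_intensity)

def cod_mg_per_l(a: float, slope: float = 400.0, intercept: float = 0.0) -> float:
    """Map absorbance to COD (mg/L) via a straight-line calibration."""
    return slope * a + intercept

print(cod_mg_per_l(absorbance(742.0, 1023.0)))  # ~56 mg/L with these placeholder numbers
```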

  5. Open Source in Education

    ERIC Educational Resources Information Center

    Lakhan, Shaheen E.; Jhunjhunwala, Kavita

    2008-01-01

    Educational institutions have rushed to put their academic resources and services online, bringing the global community onto a common platform and awakening the interest of investors. Despite continuing technical challenges, online education shows great promise. Open source software offers one approach to addressing the technical problems in…

  6. Evaluating Open Source Portals

    ERIC Educational Resources Information Center

    Goh, Dion; Luyt, Brendan; Chua, Alton; Yee, See-Yong; Poh, Kia-Ngoh; Ng, How-Yeu

    2008-01-01

    Portals have become indispensable for organizations of all types trying to establish themselves on the Web. Unfortunately, there have only been a few evaluative studies of portal software and even fewer of open source portal software. This study aims to add to the available literature in this important area by proposing and testing a checklist for…

  7. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.

  8. Obtaining and Investigating Unconventional Sources of Radioactivity

    NASA Astrophysics Data System (ADS)

    Lapp, David R.

    2010-02-01

    This paper provides examples of naturally radioactive items that are likely to be found in most communities. Additionally, there is information provided on how to acquire many of these items inexpensively. I have found that the presence of these materials in the classroom is not only useful for teaching about nuclear radiation and debunking the "nuclear free" myth, but also for helping students to understand the history of some of the commercial uses of radioactive materials since the early 20th century. Finally, the activity of each source (relative to background radiation) is provided.

  9. Open-Source GIS

    SciTech Connect

    Vatsavai, Raju; Burk, Thomas E; Lime, Steve

    2012-01-01

    The components making up an Open Source GIS are explained in this chapter. A map server (Sect. 30.1) can broadly be defined as a software platform for dynamically generating spatially referenced digital map products. The University of Minnesota MapServer (UMN Map Server) is one such system. Its basic features are visualization, overlay, and query. Section 30.2 names and explains many of the geospatial open source libraries, such as GDAL and OGR. The other libraries are FDO, JTS, GEOS, JCS, MetaCRS, and GPSBabel. The application examples include derived GIS-software and data format conversions. Quantum GIS, its origin, and its applications are explained in detail in Sect. 30.3. The features include a rich GUI, attribute tables, vector symbols, labeling, editing functions, projections, georeferencing, GPS support, analysis, and Web Map Server functionality. Future developments will address mobile applications, 3-D, and multithreading. The origins of PostgreSQL are outlined and PostGIS discussed in detail in Sect. 30.4. It extends PostgreSQL by implementing the Simple Feature standard. Section 30.5 details the most important open source licenses such as the GPL, the LGPL, the MIT License, and the BSD License, as well as the role of the Creative Commons.

  10. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  11. Ion sources and targets for radioactive beams

    SciTech Connect

    Schiffer, J.P.; Back, B.B.; Ahmad, I.

    1995-08-01

    A high-intensity ISOL-type radioactive beam facility depends critically on the performance of the target/ion source system. We developed a concept for producing high-intensity secondary beams of fission fragments, such as {sup 132}Sn, using a two-part target and ion source combination. The idea involves stopping a 1000-kW beam of 200-MeV deuterons in a target of Be or U to produce a secondary beam of neutrons. Just behind the neutron production target is a second target, typically a porous form of UC, coupled to an ISOL-type ion source. In December 1994, we tested this concept with 200-MeV deuterons at low intensity in an experiment at the NSCL. The yields of characteristic gamma rays were measured and confirmed our predictions.

  12. Particle beam generator using a radioactive source

    DOEpatents

    Underwood, D.G.

    1993-03-30

    The apparatus of the present invention selects from particles emitted by a radioactive source those particles having momentum within a desired range and focuses the selected particles in a beam having at least one narrow cross-dimension, and at the same time attenuates potentially disruptive gamma rays and low energy particles. Two major components of the present invention are an achromatic bending and focusing system, which includes sector magnets and quadrupole, and a quadrupole doublet final focus system. Permanent magnets utilized in the apparatus are constructed of a ceramic (ferrite) material which is inexpensive and easily machined.

  13. Particle beam generator using a radioactive source

    DOEpatents

    Underwood, David G.

    1993-01-01

    The apparatus of the present invention selects from particles emitted by a radioactive source those particles having momentum within a desired range and focuses the selected particles in a beam having at least one narrow cross-dimension, and at the same time attenuates potentially disruptive gamma rays and low energy particles. Two major components of the present invention are an achromatic bending and focusing system, which includes sector magnets and quadrupole, and a quadrupole doublet final focus system. Permanent magnets utilized in the apparatus are constructed of a ceramic (ferrite) material which is inexpensive and easily machined.

  14. Compton scattering with low intensity radioactive sources

    NASA Astrophysics Data System (ADS)

    Quarles, Carroll

    2012-03-01

    Compton scattering experiments with gamma rays typically require a "hot" source (~5 mCi of Cs-137) to observe the scattering as a function of angle. (See Ortec AN34 Experiment #10, Compton Scattering.) Here a way is described to investigate Compton scattering with microcurie-level radioactive sources that are more commonly available in the undergraduate laboratory. A vertical-looking 2-inch coaxial HPGe detector, collimated with a 2-inch-thick lead shield, was used. Cylindrical Al targets of various thicknesses were placed over the collimator and several available sources were placed around the target so that the average Compton scattering angle into the collimator was 90 deg. A peak could be observed at the expected energy for 90 deg Compton scattering by doing 24-hour target-in minus target-out runs. The peak was broadened by the spread in the scattering angle due to the variation in the angle of the incoming gamma ray and the angular acceptance of the collimator. A rough analysis can be done by modeling the angular spread due to the geometry and correcting for the gamma-ray absorption from the target center. Various target materials and sources can be used, and some variation in the average Compton scattering angle can be obtained by adjusting the geometry of the source and target.
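
    The expected position of the scattered peak follows from the Compton relation E' = E / (1 + (E / m_e c^2)(1 - cos θ)); the short sketch below (ours, not from the paper) evaluates it for the 662 keV Cs-137 line at a 90 deg scattering angle.

```python
# Compton-scattered photon energy; for 662 keV at 90 degrees this gives ~288 keV,
# which is where the target-in minus target-out difference spectrum should peak.
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def compton_scattered_energy(e_kev: float, theta_deg: float) -> float:
    """Photon energy after Compton scattering through theta_deg."""
    theta = math.radians(theta_deg)
    return e_kev / (1.0 + (e_kev / M_E_C2_KEV) * (1.0 - math.cos(theta)))

print(round(compton_scattered_energy(662.0, 90.0), 1))  # ~288.4 keV
```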

  15. Management of Disused Radioactive Sealed Sources in Egypt - 13512

    SciTech Connect

    Mohamed, Y.T.; Hasan, M.A.; Lasheen, Y.F.

    2013-07-01

    The future safe development of nuclear energy and the progressively increasing use of sealed sources in medicine, research, industry, and other fields in Egypt depend on the safe and secure management of disused radioactive sealed sources. Experience in recent years has demonstrated the necessity of formulating and applying an integrated management program for radioactive sealed sources to assure the harmless and ecologically rational management of disused sealed sources in Egypt. The waste management system in Egypt comprises operational and regulatory capabilities. Both of these activities are performed under legislation. The Hot Laboratories and Waste Management Center (HLWMC) is designated as the centralized radioactive waste management facility in Egypt by Law 7/2010. (authors)

  16. 10 CFR Appendix E to Part 835 - Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and Labeling Requirements E Appendix E to Part 835 Energy... Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and...

  17. 10 CFR Appendix E to Part 835 - Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and Labeling Requirements E Appendix E to Part 835 Energy... Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and...

  18. 10 CFR Appendix E to Part 835 - Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and Labeling Requirements E Appendix E to Part 835 Energy... Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and...

  19. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    An investigation into the current state of the art of open source real-time programming practices. This document covers what technologies are available, how easy it is to obtain, configure, and use them, and some performance measurements made on the different systems. A matrix of vendors and their products is included as part of this investigation, but this is not an exhaustive list and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches are investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, vendor-provided software with source available, supplied for our evaluation.

  20. Obtaining and Investigating Unconventional Sources of Radioactivity

    ERIC Educational Resources Information Center

    Lapp, David R.

    2010-01-01

    This paper provides examples of naturally radioactive items that are likely to be found in most communities. Additionally, there is information provided on how to acquire many of these items inexpensively. I have found that the presence of these materials in the classroom is not only useful for teaching about nuclear radiation and debunking the…

  1. Radioactive target and source development at Argonne National Laboratory

    SciTech Connect

    Greene, J.P.; Ahmad, I.; Thomas, G.E.

    1992-01-01

    An increased demand for low-level radioactive targets has created the need for a laboratory dedicated to the production of these foils. A description is given of the radioactive target produced as well as source development work being performed at the Physics Division target facility of Argonne National Laboratory (ANL). Highlights include equipment used and the techniques employed. In addition, some examples of recent source preparation are given as well as work currently in progress.

  2. Radioactive target and source development at Argonne National Laboratory

    SciTech Connect

    Greene, J.P.; Ahmad, I.; Thomas, G.E.

    1992-10-01

    An increased demand for low-level radioactive targets has created the need for a laboratory dedicated to the production of these foils. A description is given of the radioactive target produced as well as source development work being performed at the Physics Division target facility of Argonne National Laboratory (ANL). Highlights include equipment used and the techniques employed. In addition, some examples of recent source preparation are given as well as work currently in progress.

  3. THE OPEN SOURCING OF EPANET

    EPA Science Inventory

    A proposal was made at the 2009 EWRI Congress in Kansas City, MO to establish an Open Source Project (OSP) for the widely used EPANET pipe network analysis program. This would be an ongoing collaborative effort among a group of geographically dispersed advisors and developers, wo...

  4. Radioactivity as a significant energy source in prebiotic synthesis.

    PubMed

    Garzón, L; Garzón, M L

    2001-01-01

    Radioactivity in the continental crust (due mainly to the isotopes 238U, 235U, 232Th and 40K), as an energy source for chemical evolution in the early Archean (between 3.5 and approximately 4 Ga bp), is reviewed. The most important radioactive source in the continental crust is due to the production and accumulation of radioactive gases within the crust voids (porosity). The study of this mechanism has allowed us to reach a deeper understanding of the nature of the radioactive source and to describe its behavior, particularly with regard to prebiotic chemical evolution. An effective total energy of 3 x 10(18) J a(-1) has been obtained for a depth of 1 km, 4 Ga ago. If a depth of 30 km is taken, the obtained value is almost equal to the UV solar energy radiation (lambda < 150 nm). Within the voids the radioactive source of the continental crust played a relevant role in prebiotic synthesis. In uranium deposits of the same age, the role of radioactivity must have been even more relevant in favoring chemical evolution. PMID:11296523

  5. Radioactive heat sources in the lunar interior.

    NASA Technical Reports Server (NTRS)

    Hays, J. F.

    1972-01-01

    Published models for the moon's thermal history typically imply present day central temperatures far too high to be consistent with the recently proposed lunar temperature profile of Sonett et al. (1971). Furthermore, chemical data on Apollo samples show that the moon is depleted relative to chondrites in volatile elements, and possibly enriched relative to chondrites in refractory elements. Additional thermal models have therefore been investigated in order to set upper limits on lunar radioactivity consistent with the proposed temperature distribution. For an initially cold, uniform moon, devoid of potassium, a maximum uranium content of 23 parts per billion is inferred.

  6. Method for fabricating thin californium-containing radioactive source wires

    DOEpatents

    Gross, Ian G; Pierce, Larry A

    2006-08-22

    A method for reducing the cross-sectional diameter of a radioactive californium-containing cermet wire while simultaneously improving the wire diameter to a more nearly circular cross section. A collet fixture is used to reduce the wire diameter by controlled pressurization pulses while simultaneously improving the wire cross-sectional diameter. The method is especially suitable for use in hot cells for the production of optimized cermet brachytherapy sources that contain large amounts of radioactive californium-252.

  7. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    ERIC Educational Resources Information Center

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  8. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed with respect to CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects. FLOSS projects simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  9. Radioactivity as a Significant Energy Source in Prebiotic Synthesis

    NASA Astrophysics Data System (ADS)

    Garzón, León; Garzón, M. Luisa

    2001-02-01

    Radioactivity in the continental crust (due mainly to the isotopes ^238U, ^235U, ^232Th and ^40K), as an energy source for chemical evolution in the early Archean (between 3.5 and ~4 Ga bp), is reviewed. The most important radioactive source in the continental crust is due to the production and accumulation of radioactive gases within the crust voids (porosity). The study of this mechanism has allowed us to reach a deeper understanding of the nature of the radioactive source and to describe its behavior, particularly with regard to prebiotic chemical evolution. An effective total energy of 3 × 10^18 J a^-1 has been obtained for a depth of 1 km, 4 Ga ago. If a depth of 30 km is taken, the obtained value is almost equal to the UV solar energy radiation (λ < 150 nm). Within the voids the radioactive source of the continental crust played a relevant role in prebiotic synthesis. In uranium deposits of the same age, the role of radioactivity must have been even more relevant in favoring chemical evolution.

  10. METHOD OF PREPARING RADIOACTIVE CESIUM SOURCES

    DOEpatents

    Quinby, T.C.

    1963-12-17

    A method of preparing a cesium-containing radiation source with physical and chemical properties suitable for high-level use is presented. Finely divided silica is suspended in a solution containing cesium, normally the fission-product isotope cesium 137. Sodium tetraphenyl boron is then added to quantitatively precipitate the cesium. The cesium-containing precipitate is converted to borosilicate glass by heating to the melting point and cooling. Up to 60 weight percent cesium, with a resulting source activity of up to 21 curies per gram, is incorporated in the glass. (AEC)

  11. Registration for the Hanford Site: Sources of radioactive emissions

    SciTech Connect

    Silvia, M.J.

    1993-04-01

    This Registration Application serves to renew the registration for all Hanford Site sources of radioactive air emissions routinely reported to the State of Washington Department of Health (DOH). The current registration expires on August 15, 1993. The Application is submitted pursuant to the Washington Administrative Code (WAC) Chapter 246-247, and is consistent with guidance provided by DOH for renewal. The Application subdivides the Hanford Site into six major production, processing, or research areas: the 100 Area, 200 East Area, 200 West Area, 300 Area, 400 Area, and 600 Area. Each major group of point sources within the six areas listed above is represented by a Source Registration for Radioactive Air Emissions form. Annual emissions for the sources are listed in the "Radionuclide Air Emissions Report for the Hanford Site," published annually. It is a requirement that the following Statement of Compliance be provided: "The radioactive air emissions from the above sources do meet the emissions standards contained in Chapter 173-480-040 WAC, Ambient Air Quality Standards and Emissions Limits for Radionuclides." As the Statement of Compliance pertains to this submittal, the phrase "above sources" is to be understood as meaning the combined air emissions from all sources registered by this submittal.

  12. Don't Throw Away Your Radioactive Sources!

    ERIC Educational Resources Information Center

    Tracy, Charles; Cunningham, Elizabeth

    2014-01-01

    This article reports on a plea directed to schools in England that changed status to an "academy" and thus lost their Local Authority Radiation Protection Adviser (RPA) service. These schools have been encouraged to do all that they can to hang on to their sources (radioactive equipment used in classroom experiments to investigate…

  13. Analysis of radioactive metals by spark source mass spectrometry.

    PubMed

    Johnson, A J; Kozy, A; Morris, R N

    1969-04-01

    A spark source mass spectrograph with photographic plate recording has been adapted for the analysis of plutonium and americium metals. Over seventy elements can be determined simultaneously in these metals. A comparison has been made between results obtained by mass spectrography and by conventional methods for impurity elements. The operations involved in handling radioactive materials in the mass spectrograph are also discussed. PMID:18960537

  14. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, L.; Pers, C.; Isberg, K.; Nyström, K.; Arheimer, B.

    2013-12-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided in up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. HYPE has been successfully used in many hydrological applications at SMHI. For Europe, we currently have three different models: the S-HYPE model for Sweden, the BALT-HYPE model for the Baltic Sea, and the E-HYPE model for the whole of Europe. These models simulate hydrological conditions and nutrients for their respective areas and are used for characterization, forecasts, and scenario analyses. Model data can be downloaded from hypeweb.smhi.se. In addition, we provide models for the Arctic region, the Arab (Middle East and Northern Africa) region, India, the Niger River basin, and the La Plata Basin. This demonstrates the applicability of the HYPE model for large scale modeling in different regions of the world. An important goal with our work is to make our data and tools available as open data and services. For this aim we created the HYPE Open Source Community (OSC) that makes the source code of HYPE available for anyone interested in further development of HYPE. The HYPE OSC (hype.sourceforge.net) is an open source initiative under the Lesser GNU Public License taken by SMHI to strengthen international collaboration in hydrological modeling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from.

  15. DRIBS II: A Source of Radioactive Nuclei

    NASA Astrophysics Data System (ADS)

    Szöllős, O.; Kliman, J.; Oganessian, Yu. Ts.; Itkis, M. G.; Dmitriev, S. N.; Mishinsky, G. V.; Zhemenik, V. I.

    2002-12-01

    The facility for neutron-rich nuclei production, as a specific part of the project DRIBS - phase II, based on the use of photofission, and the characteristics of the microtron MT-25 as a source of bremsstrahlung photons are described. Target selection with regard to maximum yield of strongly neutron-rich fission products and with respect to safety and availability is discussed. The Geant4 simulation toolkit was used to obtain the bremsstrahlung γ-beam characteristics, and the calculated angular spread was compared with measurements on the MT-25 microtron. Next, calculations of the fission rate and fission density in a 238U target at the nominal operating values of the MT-25 microtron are presented. Calculations of the yield from photofission based on Wahl's ZP model for the charge distribution of fission fragments are compared with experimental data for the measured independent yields of xenon isotopes. As a result, the independent and cumulative yields from photofission of 238U for the microtron MT-25, as a driver accelerator for DRIBS II, were obtained for up to 890 isotopes and their isomers. The main characteristics of this compact facility are compared with those of some other RIB projects.

  16. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  17. The HYPE Open Source Community

    NASA Astrophysics Data System (ADS)

    Strömbäck, Lena; Arheimer, Berit; Pers, Charlotta; Isberg, Kristina

    2013-04-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model (Lindström et al., 2010). It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. In the model, the landscape is divided into classes according to soil type, vegetation and altitude. The soil representation is stratified and can be divided in up to three layers. Water and substances are routed through the same flow paths and storages (snow, soil, groundwater, streams, rivers, lakes) considering turn-over and transformation on the way towards the sea. In Sweden, the model is used by water authorities to fulfil the Water Framework Directive and the Marine Strategy Framework Directive. It is used for characterization, forecasts, and scenario analyses. Model data can be downloaded for free from three different HYPE applications: Europe (www.smhi.se/e-hype), Baltic Sea basin (www.smhi.se/balt-hype), and Sweden (vattenweb.smhi.se) The HYPE OSC (hype.sourceforge.net) is an open source initiative under the Lesser GNU Public License taken by SMHI to strengthen international collaboration in hydrological modelling and hydrological data production. The hypothesis is that more brains and more testing will result in better models and better code. The code is transparent and can be changed and learnt from. New versions of the main code will be delivered frequently. The main objective of the HYPE OSC is to provide public access to a state-of-the-art operational hydrological model and to encourage hydrologic expertise from different parts of the world to contribute to model improvement. HYPE OSC is open to everyone interested in hydrology, hydrological modelling and code development - e.g. scientists, authorities, and consultancies. The HYPE Open Source Community was initiated in November 2011 by a kick-off and workshop with 50 eager participants

  18. Integrated Management Program Radioactive Sealed Sources in Egypt

    SciTech Connect

    Hasan, A.; Cochran, J. R.; El-Adham, K.; El-Sorougy, R.

    2003-02-26

    The radioactive materials in "public" locations are typically contained in small, stainless steel capsules known as sealed radiation sources (RS). These capsules seal in the radioactive materials, but not the radiation, because it is the radiation that is needed for a wide variety of applications at hospitals, medical clinics, manufacturing plants, universities, construction sites, and other facilities in the public sector. Radiation sources are readily available, and worldwide there are hundreds of thousands of RS. The IMPRSS Project is a cooperative development between the Egyptian Atomic Energy Authority (EAEA), Egyptian Ministry of Health (MOH), Sandia National Laboratories (SNL), New Mexico Tech University (NMT), and Agriculture Cooperative Development International (ACDI/VOCA). SNL will coordinate the work scope between the participant organizations.

  19. COMPARISON OF RECURSIVE ESTIMATION TECHNIQUES FOR POSITION TRACKING RADIOACTIVE SOURCES

    SciTech Connect

    K. MUSKE; J. HOWSE

    2000-09-01

    This paper compares the performance of recursive state estimation techniques for tracking the physical location of a radioactive source within a room based on radiation measurements obtained from a series of detectors at fixed locations. Specifically, the extended Kalman filter, algebraic observer, and nonlinear least squares techniques are investigated. The results of this study indicate that recursive least squares estimation significantly outperforms the other techniques due to the severe model nonlinearity.

  20. Cable attachment for a radioactive brachytherapy source capsule

    DOEpatents

    Gross, Ian G; Pierce, Larry A

    2006-07-18

    In cancer brachytherapy treatment, a small californium-252 neutron source capsule is attached to a guide cable using a modified crimping technique. The guide cable has a solid cylindrical end, and the attachment employs circumferential grooves micromachined in the solid cable end. The attachment was designed and tested, and hardware fabricated for use inside a radioactive hot cell. A welding step typically required in other cable attachments is avoided.

  1. Resonant Ionization Laser Ion Source for Radioactive Ion Beams

    SciTech Connect

    Liu, Yuan; Beene, James R; Havener, Charles C; Vane, C Randy; Gottwald, T.; Wendt, K.; Mattolat, C.; Lassen, J.

    2009-01-01

    A resonant ionization laser ion source based on all-solid-state, tunable Ti:Sapphire lasers is being developed for the production of pure radioactive ion beams. It consists of a hot-cavity ion source and three pulsed Ti:Sapphire lasers operating at a 10 kHz pulse repetition rate. Spectroscopic studies are being conducted to develop ionization schemes that lead to ionizing an excited atom through an auto-ionization or a Rydberg state for numerous elements of interest. Three-photon resonant ionization of 12 elements has been recently demonstrated. The overall efficiency of the laser ion source measured for some of these elements ranges from 1 to 40%. The results indicate that Ti:Sapphire lasers could be well suited for laser ion source applications. The time structures of the ions produced by the pulsed lasers are investigated. The information may help to improve the laser ion source performance.

  2. Active radiometric calorimeter for absolute calibration of radioactive sources

    SciTech Connect

    Stump, K.E.; DeWerd, L.A.; Rudman, D.A.; Schima, S.A.

    2005-03-01

    This report describes the design and initial noise floor measurements of a radiometric calorimeter designed to measure therapeutic medical radioactive sources. The instrument demonstrates a noise floor of approximately 2 nW. This low noise floor is achieved by using high temperature superconducting (HTS) transition edge sensor (TES) thermometers in a temperature-control feedback loop. This feedback loop will be used to provide absolute source calibrations based upon the electrical substitution method. Other unique features of the calorimeter are (a) its ability to change sources for calibration without disrupting the vacuum of the instrument, and (b) the ability to measure the emitted power of a source in addition to the total contained source power.
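
    The electrical substitution method mentioned above reduces, at its core, to a power balance at a fixed setpoint temperature. The sketch below is a hedged illustration of that bookkeeping only; the readings are invented, and the real instrument's superconducting-thermometer feedback loop is far more involved.

```python
# Electrical substitution, schematically: with the calorimeter held at a constant
# temperature by feedback, the heat deposited by the source equals the reduction
# in electrical heater power needed to hold the setpoint.
def source_power_w(heater_power_no_source_w: float, heater_power_with_source_w: float) -> float:
    """Absolute source power inferred by electrical substitution (watts)."""
    return heater_power_no_source_w - heater_power_with_source_w

# Example with invented readings: 2.500 mW without the source, 1.875 mW with it.
print(source_power_w(2.500e-3, 1.875e-3))  # 6.25e-04 W deposited by the source
```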

  3. Flowsheets and source terms for radioactive waste projections

    SciTech Connect

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF/sub 6/ conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  4. 2011 Radioactive Materials Usage Survey for Unmonitored Point Sources

    SciTech Connect

    Sturgeon, Richard W.

    2012-06-27

    This report provides the results of the 2011 Radioactive Materials Usage Survey for Unmonitored Point Sources (RMUS), which was updated by the Environmental Protection (ENV) Division's Environmental Stewardship (ES) at Los Alamos National Laboratory (LANL). ES classifies LANL emission sources into one of four Tiers, based on the potential effective dose equivalent (PEDE) calculated for each point source. Detailed descriptions of these tiers are provided in Section 3. The usage survey is conducted annually; in odd-numbered years the survey addresses all monitored and unmonitored point sources and in even-numbered years it addresses all Tier III and various selected other sources. This graded approach was designed to ensure that the appropriate emphasis is placed on point sources that have higher potential emissions to the environment. For calendar year (CY) 2011, ES has divided the usage survey into two distinct reports, one covering the monitored point sources (to be completed later this year) and this report covering all unmonitored point sources. This usage survey includes the following release points: (1) all unmonitored sources identified in the 2010 usage survey, (2) any new release points identified through the new project review (NPR) process, and (3) other release points as designated by the Rad-NESHAP Team Leader. Data for all unmonitored point sources at LANL is stored in the survey files at ES. LANL uses this survey data to help demonstrate compliance with Clean Air Act radioactive air emissions regulations (40 CFR 61, Subpart H). The remainder of this introduction provides a brief description of the information contained in each section. Section 2 of this report describes the methods that were employed for gathering usage survey data and for calculating usage, emissions, and dose for these point sources. It also references the appropriate ES procedures for further information. Section 3 describes the RMUS and explains how the survey results are

  5. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  6. Management of spent sealed radioactive sources in Turkey.

    PubMed

    Osmanlioglu, Ahmet Erdal

    2006-09-01

    Spent radioactive sources (SRS) have been generated from industrial applications, research, and medicine in Turkey. In this study, management of SRS (Co, Cs) at the Cekmece Waste Processing and Storage Facility (CWPSF) is described. Eleven Cs sources (total 851 GBq) and four Co sources (total 27.75 GBq) that had been used as level and density gauges were conditioned. Reinforced metal drums (200 L in volume) and a cement matrix were used for conditioning these sources. In this way, greater confinement was achieved for long-term storage. Maximum dose rates at the surface of the conditioned waste packages were determined. In addition to information about the conditioning stages of the sources, various calculations that were done for shielding are presented. Surface dose rates of the waste packages were 1.60 mSv/h for Cs and 1.63 mSv/h for Co. Measurements of the final waste packages were presented to fulfill the requirements (<2 mSv/h) of transportation according to regulations for the safe transport of radioactive material. PMID:16891901

  7. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make, and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  8. The 2016 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC (http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083

  9. 7 Questions to Ask Open Source Vendors

    ERIC Educational Resources Information Center

    Raths, David

    2012-01-01

    With their budgets under increasing pressure, many campus IT directors are considering open source projects for the first time. On the face of it, the savings can be significant. Commercial emergency-planning software can cost upward of six figures, for example, whereas the open source Kuali Ready might run as little as $15,000 per year when…

  10. Open Source 2010: Reflections on 2007

    ERIC Educational Resources Information Center

    Wheeler, Brad

    2007-01-01

    Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…

  11. Recursive Estimation for the Tracking of Radioactive Sources

    SciTech Connect

    Howse, J.W.; Muske, K.R.; Ticknor, L.O.

    1999-02-01

    This paper describes a recursive estimation algorithm used for tracking the physical location of radioactive sources in real-time as they are moved around in a facility. The algorithm is a nonlinear least squares estimation that minimizes the change in the source location and the deviation between measurements and model predictions simultaneously. The measurements used to estimate position consist of four count rates reported by four different gamma ray detectors. There is an uncertainty in the source location due to the variance of the detected count rate. This work represents part of a suite of tools which will partially automate security and safety assessments, allow some assessments to be done remotely, and provide additional sensor modalities with which to make assessments.
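
    The abstract outlines the estimation idea without implementation details. The following is a minimal illustrative sketch, not the authors' code: it assumes an inverse-square detector response and a quadratic penalty on movement between updates, mirroring the stated objective of minimizing both the measurement mismatch and the change in source location. The detector layout, source strength, background rate, and penalty weight are hypothetical.

    # Illustrative recursive nonlinear least-squares tracker (Python, not the paper's code).
    import numpy as np
    from scipy.optimize import least_squares

    detectors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # hypothetical positions (m)
    S = 5.0e4    # hypothetical source strength term (counts * m^2 / s)
    bkg = 2.0    # hypothetical background count rate (counts / s)
    lam = 0.5    # hypothetical weight penalizing change in source location

    def predicted_rates(pos):
        """Inverse-square model of the four detector count rates."""
        r2 = np.sum((detectors - pos) ** 2, axis=1)
        return S / r2 + bkg

    def residuals(pos, measured, prev_pos):
        """Measurement mismatch plus a penalty on movement from the previous estimate."""
        return np.concatenate([predicted_rates(pos) - measured,
                               np.sqrt(lam) * (pos - prev_pos)])

    def track(measurement_stream, x0):
        """Recursively update the source position estimate for each set of count rates."""
        est = np.asarray(x0, dtype=float)
        history = []
        for measured in measurement_stream:
            est = least_squares(residuals, est, args=(measured, est)).x
            history.append(est.copy())
        return np.array(history)

    # Simulated example: a source drifting across the room
    true_path = [np.array([3.0, 4.0]), np.array([3.5, 4.2]), np.array([4.0, 4.5])]
    rng = np.random.default_rng(0)
    stream = [rng.poisson(predicted_rates(p)).astype(float) for p in true_path]
    print(track(stream, x0=[5.0, 5.0]))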

  12. Recursive estimation for the tracking of radioactive sources

    SciTech Connect

    Howse, J.W.; Ticknor, L.O.; Muske, K.R.

    1998-12-31

    This paper describes a recursive estimation algorithm used for tracking the physical location of radioactive sources in real-time as they are moved around in a facility. The algorithm is related to a nonlinear least squares estimation that minimizes the change in the source location and the deviation between measurements and model predictions simultaneously. The measurements used to estimate position consist of four count rates reported by four different gamma ray detectors. There is an uncertainty in the source location due to the large variance of the detected count rate. This work represents part of a suite of tools which will partially automate security and safety assessments, allow some assessments to be done remotely, and provide additional sensor modalities with which to make assessments.

  13. A concentrated radioactive beam source for atom cooling and trapping

    SciTech Connect

    Maddi, J.; Dinneen, T.; Ghiorso, A.; Gould, H.

    1996-05-01

    The authors describe a novel oven to obtain concentrated beams of radioactive atoms. The Orthotropic oven works by ionizing atoms on its interior walls and electrostatically concentrating them on a neutralizer. Once neutralized the atoms can escape from the oven and form a narrow beam. Atoms that fail to escape become ionized again and repeat the cycle. The authors demonstrate the operation of this oven using {sup 221}Fr and compare both the theoretical and experimental efficiency of this source with standard effusive and channeled ovens.

  14. End of Life Decisions for Sealed Radioactive Sources.

    PubMed

    Pryor, Kathryn H

    2016-02-01

    Sealed radioactive sources are encountered in a wide variety of settings, from household smoke detectors and instrument check sources through fixed industrial gauges, industrial radiography, and well logging sources, to irradiators and medical teletherapy devices. In general, the higher the level of activity in the sealed source, the stricter the regulatory control that is applied to its use, control, and ultimate disposition. Lower levels of attention and oversight can and do lead to sources ending up in the wrong place--as orphan sources in uncontrolled storage, disposed in a sanitary landfill, melted down in metal recycling operations and incorporated into consumer products, or handled by an unsuspecting member of the public. There is a range of issues that contribute to the problem of improper disposal of sealed sources and, in particular, to disused source disposal. Generally licensed sources and devices are particularly at risk of being disposed incorrectly. Higher activity generally licensed sources, although required to be registered with the NRC or an Agreement State, receive limited regulatory oversight and are not tracked on a national scale. Users frequently do not consider the full life-cycle costs when procuring sources or devices and discover that they cannot afford and/or are unwilling to pay the associated costs to package, transport and dispose of their sources properly. The NRC requirements for decommissioning funding plans and financial assurance are not adequate to cover sealed source transport and disposal costs fully. While there are regulatory limits for storage of disused sources, enforcement is limited, and there are only limited financial incentives in a small number of states for owners to dispose of the sources. In some cases, the lack of availability of approved Type B shipping casks presents an additional barrier to sealed source disposal. The report of the Disused Sources Working Group does an excellent job of framing these issues

  16. Low-level radioactive waste source terms for the 1992 integrated data base

    SciTech Connect

    Loghry, S L; Kibbey, A H; Godbee, H W; Icenhour, A S; DePaoli, S M

    1995-01-01

    This technical manual presents updated generic source terms (i.e., unitized amounts and radionuclide compositions) which have been developed for use in the Integrated Data Base (IDB) Program of the U.S. Department of Energy (DOE). These source terms were used in the IDB annual report, Integrated Data Base for 1992: Spent Fuel and Radioactive Waste Inventories, Projections, and Characteristics, DOE/RW-0006, Rev. 8, October 1992. They are useful as a basis for projecting future amounts (volume and radioactivity) of low-level radioactive waste (LLW) shipped for disposal at commercial burial grounds or sent for storage at DOE solid-waste sites. Commercial fuel cycle LLW categories include boiling-water reactor, pressurized-water reactor, fuel fabrication, and uranium hexafluoride (UF6) conversion. Commercial nonfuel cycle LLW includes institutional/industrial (I/I) waste. The LLW from DOE operations is categorized as uranium/thorium, fission product, induced activity, tritium, alpha, and "other". Fuel cycle commercial LLW source terms are normalized on the basis of net electrical output [MW(e)-year], except for UF6 conversion, which is normalized on the basis of heavy metal requirement [metric tons of initial heavy metal]. The nonfuel cycle commercial LLW source term is normalized on the basis of volume (cubic meters) and radioactivity (curies) for each subclass within the I/I category. The DOE LLW is normalized in a manner similar to that for commercial I/I waste. The revised source terms are based on the best available historical data through 1992.
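
    As a small illustration of the normalization basis described above (and not figures from the manual), the sketch below scales a unitized source term, expressed per MW(e)-year of net electrical output, by a projected output to obtain projected waste volume and activity; the numeric values are placeholders.

    # Hedged sketch: scaling a normalized LLW source term by projected output (Python).
    def project_llw(source_term, mwe_years):
        """Scale unitized volume (m^3) and activity (Ci) by net electrical output."""
        return {"volume_m3": source_term["m3_per_MWe_yr"] * mwe_years,
                "activity_Ci": source_term["Ci_per_MWe_yr"] * mwe_years}

    # Hypothetical normalized source term for one reactor category
    bwr_term = {"m3_per_MWe_yr": 0.05, "Ci_per_MWe_yr": 2.0}
    print(project_llw(bwr_term, mwe_years=800.0))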

  17. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    SciTech Connect

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services.

  18. Current Situation for Management of Disused Sealed Radioactive Sources in Japan - 13025

    SciTech Connect

    Kusama, Keiji; Miyamoto, Yoichi

    2013-07-01

    Most of the sealed radioactive sources currently used in Japan are imported from overseas; the U.S., Canada, Germany, the Netherlands, Belgium, and the Czech Republic are the main exporting States. Many disused sealed radioactive sources are returned to the exporting States, and those that cannot be returned are kept appropriately in a domestic storage facility, so there is no major problem with the long-term management of disused sealed radioactive sources in Japan. However, there are some difficulties with repatriation. One is securing a means of transport: the number of sea shipments that carry radioactive sources is decreasing owing to the reduced movement of international cargo, and denial of shipment also occurs. Another is that the manufacturer may have already withdrawn from the business and cannot take back disused sealed radioactive sources, or the manufacturer cannot be identified, so the disused sources cannot be returned. The disused sealed radioactive sources that cannot be repatriated represent only a small amount of radioactivity. A national final disposal facility for disused sealed radioactive sources has not yet been established in Japan, and doing so remains difficult. Since there are many countries for which installation of a final disposal facility for disused sealed radioactive sources is difficult, the countries that manufacture sources should respond positively to taking back sources they manufactured and sold in the past. (authors)

  19. Experiments with radioactive samples at the Advanced Photon Source.

    SciTech Connect

    Veluri, V. R.; Justus, A.; Glagola, B.; Rauchas, A.; Vacca, J.

    2000-11-01

    The Advanced Photon Source (APS) at Argonne National Laboratory is a national synchrotron-radiation light source research facility. The 7 GeV electron Storage Ring is currently delivering intense high brilliance x-ray beams to a total of 34 beamlines with over 120 experiment stations to members of the international scientific community to carry out forefront basic and applied research in several scientific disciplines. Researchers come to the APS either as members of Collaborative Access Teams (CATs) or as Independent Investigators (IIs). Collaborative Access Teams comprise a large number of investigators from universities, industry, and research laboratories with common research objectives. These teams are responsible for the design, construction, funding, and operation of beamlines. They are the owners of their experimental enclosures ("hutches") designed and built to meet their specific research needs. Fig. 1 gives a plan view of the location of the Collaborative Access Teams by Sector and Discipline. In the past two years, over 2000 individual experiments were conducted at the APS facility. Of these, about 60 experiments involved the use of radioactive samples, which is less than 3% of the total. However, there is an increase in demand for experiment stations to accommodate the use of radioactive samples in different physical forms embedded in various matrices with activity levels ranging from trace amounts of naturally occurring radionuclides to MBq (mCi) quantities including transuranics. This paper discusses in some detail the steps in the safety review process for experiments involving radioactive samples and how the ALARA philosophy is invoked at each step and implemented.

  20. Ion source test stand for radioactive beams (abstract)

    NASA Astrophysics Data System (ADS)

    Nolen, J. A.; Decrock, P.; Portillo, M.; Mullen, T. P.; Geraci, A. A.; Barlow, T. A.; Greene, J. P.; Gomes, I.; Batson, C. H.; Saremba, S. E.

    1998-02-01

    A test stand for development of ion sources for radioactive beams is currently being commissioned at Argonne. It is located at the Physics Division's Dynamitron accelerator which will be used as a neutron generator with a flux of up to 10^11 neutrons per second created by reactions of 4 MeV deuterons on various targets with beam currents of up to 100 μA. The primary targets will be located adjacent to heated secondary targets inside an on-line ion source. With this neutron-generator facility it will be possible to produce radioactive beams of various isotopes, such as 6He, 24Na, and neutron-rich fission fragments. For example, with a secondary target of uranium carbide containing 25 g of natural or depleted uranium the yields of individual isotopes in the target will be about 10^7/s for isotopes such as 132Sn, 140Xe, and 142Cs, near the peak of the fission distribution. The ion sources to be evaluated will be located within a shielded cave with walls consisting of 30 cm of steel plus 60 cm of concrete to attenuate the prompt neutron radiation by a factor of about 10^4. Secondary beams of radioactive fission fragments with intensities on the order of 10^6/s per isotope will be extracted in the 1+ charge state at energies of 20 keV and mass separated with a Danfysik mass separator. Light isotopes, such as 6He and 24Na, can be produced via (n,α) and (n,p) reactions on appropriate target materials. Commissioning began with measurements of fission yields from primary targets of C, Be, BeO, and BN. A surface ionization source which is a variation of the one used in the TRISTAN on-line mass separator facility at Brookhaven National Laboratory has been installed and tested with stable Rb and Cs beams. The isotope separator was also commissioned with these beams. The development program will include emittance measurements and source optimization, initially with stable beams, and target-delay-time and release-efficiency measurements for various target/secondary-beam systems.

  1. Artificial neural networks optimization method for radioactive source localization

    SciTech Connect

    Wacholder, E.; Elias, E.; Merlis, Y.

    1995-05-01

    An artificial neural network optimization model is developed for solving the ill-posed inverse transport problem associated with localizing radioactive sources in a medium with known properties and dimensions. The model is based on the recurrent (or feedback) Hopfield network with fixed weights. The source distribution is determined based on the response of a limited number of external detectors of known spatial deployment in conjunction with a radiation transport model. The algorithm is tested and evaluated for a large number of simulated two-dimensional cases. Computations are carried out at different noise levels to account for statistical errors encountered in engineering applications. The sensitivity to noise is found to depend on the number of detectors and on their spatial deployment. A pretest empirical procedure is, therefore, suggested for determining an effective arrangement of detectors for a given problem.
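
    The abstract does not reproduce the network equations. As a hedged illustration of the general idea (not the paper's model), the sketch below uses a fixed-weight feedback relaxation, in the spirit of a Hopfield network, to drive a candidate source distribution toward the minimum of the quadratic mismatch energy between a linear detector-response model and the detector readings; the grid size, response matrix, and noise level are placeholders.

    # Illustrative fixed-weight feedback (Hopfield-style) relaxation for source localization (Python).
    import numpy as np

    rng = np.random.default_rng(1)
    n_detectors, n_cells = 8, 25                              # hypothetical problem size
    R = rng.uniform(0.1, 1.0, size=(n_detectors, n_cells))    # placeholder detector-response matrix

    true_s = np.zeros(n_cells)
    true_s[12] = 4.0                                          # hypothetical single source in cell 12
    m = R @ true_s + rng.normal(0.0, 0.05, n_detectors)       # noisy detector readings

    W = R.T @ R                        # fixed network weights
    b = R.T @ m                        # fixed external input
    eta = 0.9 / np.linalg.norm(W, 2)   # step size chosen for stable relaxation

    s = np.zeros(n_cells)
    for _ in range(2000):              # feedback iterations: gradient flow on E(s) = 0.5*||R s - m||^2
        s = np.clip(s - eta * (W @ s - b), 0.0, None)         # clamp activities to be nonnegative

    print("strongest cell:", int(np.argmax(s)), "estimated strength:", round(float(s.max()), 2))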

  2. "Open-Sourcing" Personal Learning

    ERIC Educational Resources Information Center

    Fiedler, Sebastian H.D.

    2014-01-01

    This article offers a critical reflection on the contemporary Open Educational Resource (OER) movement, its unquestioned investment in a collective "content fetish" and an educational "problem description" that focuses on issues of scarcity, access, and availability of quality materials. It also argues that OER proponents fail…

  3. Automated management of radioactive sources in Saudi Arabia

    SciTech Connect

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.

    2014-09-30

    To use radioactive substances, a facility has to register with, and obtain a license from, the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority that manages radioactive sources and licenses organizations to use them is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the National Center of Radiation Protection. To provide 24×7 access to all NCRP customers, the system is developed as a web-based application offering online registration, license requests, license renewal, request status checks, and viewing of historical data and reports; these and other features are provided as Electronic Services accessible to users via the internet. The system was also designed to streamline and optimize the internal operations of the NCRP, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to the manual payment option, the system is also integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with the manual payment mechanism. Using the SADAD payment option, the license fee can be paid through the internet, an ATM machine, or a branch of any designated bank; payment is instantly notified to the NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  4. Automated management of radioactive sources in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.

    2014-09-01

    To use radioactive substances, a facility has to register with, and obtain a license from, the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority that manages radioactive sources and licenses organizations to use them is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the National Center of Radiation Protection. To provide 24×7 access to all NCRP customers, the system is developed as a web-based application offering online registration, license requests, license renewal, request status checks, and viewing of historical data and reports; these and other features are provided as Electronic Services accessible to users via the internet. The system was also designed to streamline and optimize the internal operations of the NCRP, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to the manual payment option, the system is also integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with the manual payment mechanism. Using the SADAD payment option, the license fee can be paid through the internet, an ATM machine, or a branch of any designated bank; payment is instantly notified to the NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  5. Extraction simulations and emittance measurements of a Holifield Radioactive Ion Beam Facility electron beam plasma source for radioactive ion beams

    SciTech Connect

    Mendez, II, Anthony J; Liu, Yuan

    2010-01-01

    The Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory has a variety of ion sources used to produce radioactive ion beams (RIBs). Of these, the workhorse is an electron beam plasma (EBP) ion source. The recent addition of a second RIB injector, the Injector for Radioactive Ion Species 2 (IRIS2), for the HRIBF tandem accelerator prompted new studies of the optics of the beam extraction from the EBP source. The source was modeled using SIMION V8.0, and results will be presented, including comparison of the emittances as predicted by simulation and as measured at the HRIBF offline ion source test facilities. Also presented will be the impact on phase space shape resulting from extraction optics modifications implemented at IRIS2.

  6. Extraction simulations and emittance measurements of a Holifield Radioactive Ion Beam Facility electron beam plasma source for radioactive ion beams

    SciTech Connect

    Mendez, A. J. II; Liu, Y.

    2010-02-15

    The Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory has a variety of ion sources used to produce radioactive ion beams (RIBs). Of these, the workhorse is an electron beam plasma (EBP) ion source. The recent addition of a second RIB injector, the Injector for Radioactive Ion Species 2 (IRIS2), for the HRIBF tandem accelerator prompted new studies of the optics of the beam extraction from the EBP source. The source was modeled using SIMION V8.0, and results will be presented, including comparison of the emittances as predicted by simulation and as measured at the HRIBF offline ion source test facilities. Also presented will be the impact on phase space shape resulting from extraction optics modifications implemented at IRIS2.

  7. Extraction Simulations and Emittance Measurements of a Holifield Radioactive Ion Beam Facility Electron Beam Plasma Source for Radioactive Ion Beams

    SciTech Connect

    Mendez, II, Anthony J; Liu, Yuan

    2010-01-01

    The Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory has a variety of ion sources used to produce radioactive ion beams (RIBs). Of these, the workhorse is an electron beam plasma (EBP) ion source. The recent addition of a second RIB injector, the Injector for Radioactive Ion Species 2 (IRIS2), for the HRIBF tandem accelerator prompted new studies of the optics of the beam extraction from the EBP source. The source was modeled using SIMION V8.0, and results will be presented, including comparison of the emittances as predicted by simulation and as measured at the HRIBF offline ion source test facilities. Also presented will be the impact on phase space shape resulting from extraction optics modifications implemented at IRIS2.

  8. Radioactive source materials in Los Estados Unidos de Venezuela

    USGS Publications Warehouse

    Wyant, Donald G.; Sharp, William N.; Rodriguez, Carlos Ponte

    1953-01-01

    This report summarizes the data available on radioactive source materials in Los Estados Unidos de Venezuela accumulated by geologists of the Direccion Tecnica de Geologia and antecedent agencies prior to June 1951, and by the writers from June to November 1951. The investigation comprised preliminary study, field examination, office studies, and the preparation of this report, in which the areas and localities examined are described in detail, the uranium potentialities of Venezuela are summarized, and recommendations are made. Preliminary study was made to select areas and rock types that were known or reported to be radioactive or that geologic experience suggests would be favorable hosts for uranium deposits. In the office, a study of gamma-ray well logs was started as one means of amassing general radiometric data and of rapidly scanning many of the rocks in northern Venezuela; gamma-ray logs from about 140 representative wells were examined and their peaks of gamma intensity evaluated; in addition, samples were analyzed radiometrically and petrographically. Radiometric reconnaissance was made in the field during about 3 months of 1951, of about 12 areas, including over 100 localities in the States of Miranda, Carabobo, Yaracuy, Falcon, Lara, Trujillo, Zulia, Merida, Tachira, Bolivar, and the Territory Delta Amacuro. During the course of the investigation, both in the field and office, information was given about the geology of uranium deposits and the techniques used in prospecting and analysis. All studies and this report are designed to supplement and strengthen the Direccion Tecnica de Geologia's program of investigation of radioactive sources in Venezuela now in progress. The uranium potentialities of Los Estados Unidos de Venezuela are excellent for large, low-grade deposits of uraniferous phosphatic shales containing from 0.002 to 0.027 percent uranium; fair, for small or moderate-sized, low-grade placer deposits of thorium, rare-earth, and uranium minerals; poor, for

  9. OpenStudio: An Open Source Integrated Analysis Platform; Preprint

    SciTech Connect

    Guglielmetti, R.; Macumber, D.; Long, N.

    2011-12-01

    High-performance buildings require an integrated design approach for all systems to work together optimally; systems integration needs to be incorporated in the earliest stages of design for efforts to be cost and energy-use effective. Building designers need a full-featured software framework to support rigorous, multidisciplinary building simulation. An open source framework - the OpenStudio Software Development Kit (SDK) - is being developed to address this need. In this paper, we discuss the needs that drive OpenStudio's system architecture and goals, provide a development status report (the SDK is currently in alpha release), and present a brief case study that illustrates its utility and flexibility.

  10. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes of what makes an open source imaging application successful, and point to opportunities for further operability that should greatly accelerate future cell biology discovery.

  11. Assessment on security system of radioactive sources used in hospitals of Thailand

    NASA Astrophysics Data System (ADS)

    Jitbanjong, Petchara; Wongsawaeng, Doonyapong

    2016-01-01

    Unsecured radioactive sources have caused deaths and serious injuries in many parts of the world. In Thailand, there are 17 hospitals that use teletherapy with cobalt-60 radioactive sources. These sources need to be secured in order to prevent unauthorized removal, sabotage, and use of such materials by terrorists in a radiological weapon. The security of radioactive sources in Thailand is regulated by the Office of Atoms for Peace in compliance with the Global Threat Reduction Initiative (GTRI) of the U.S. DOE, which has been implemented since 2010. This study aims to perform an assessment of the security systems for radioactive sources used in hospitals in Thailand, and the results can be used as recommended baseline data for development or improvement of hospital security systems for radioactive sources at the national regulatory and policy levels. Results from questionnaires reveal that, in 11 out of 17 hospitals (64.70%), there were few differences in conditions between hospitals using radioactive sources with the security system installed and those without it. Also, personnel working with radioactive sources did not clearly understand the nuclear security law. Thus, government organizations should be encouraged to arrange training on nuclear security to increase the level of understanding. In the future, it is recommended that the responsible government organization issue a minimum requirement of nuclear security for every medical facility using radioactive sources.

  12. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  13. Demonstration and development of control mechanism for radioactive sources in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Al-Kheliewi, A. S.

    2012-06-01

    Saudi Arabia has no nuclear industry. Nevertheless, many radioactive sources, for different purposes, have been used in the country, and there is an upswing in the number of companies that employ nuclear technology in their daily work. The National Center for Radiation Protection (NCRP) takes full responsibility for monitoring and regulating the movement of radioactive sources in the country. The NCRP issues the licenses for import, export, and use of radioactive sources. It also protects the country from radiation crossing its borders through a sizable network of early warning and radiation monitoring stations along the borders of Saudi Arabia. This paper describes the procedures for licensing, importing, and exporting radioactive sources. It also sheds light on the ways radioactive sources are employed in different practices, encompassing medicine, industry, and research. The NCRP has established an electronic web site to ease communication with all users in the country. This site is still in the experimental stage.

  14. Demonstration and development of control mechanism for radioactive sources in Saudi Arabia

    SciTech Connect

    Al-Kheliewi, A. S.

    2012-06-06

    Saudi Arabia has no nuclear industry. Nevertheless, many radioactive sources, for different purposes, have been used in the country, and there is an upswing in the number of companies that employ nuclear technology in their daily work. The National Center for Radiation Protection (NCRP) takes full responsibility for monitoring and regulating the movement of radioactive sources in the country. The NCRP issues the licenses for import, export, and use of radioactive sources. It also protects the country from radiation crossing its borders through a sizable network of early warning and radiation monitoring stations along the borders of Saudi Arabia. This paper describes the procedures for licensing, importing, and exporting radioactive sources. It also sheds light on the ways radioactive sources are employed in different practices, encompassing medicine, industry, and research. The NCRP has established an electronic web site to ease communication with all users in the country. This site is still in the experimental stage.

  15. The SAMI2 Open Source Project

    NASA Astrophysics Data System (ADS)

    Huba, J. D.; Joyce, G.

    2001-05-01

    In the past decade, the Open Source Model for software development has gained popularity and has had numerous major achievements: emacs, Linux, the Gimp, and Python, to name a few. The basic idea is to provide the source code of the model or application, a tutorial on its use, and a feedback mechanism with the community so that the model can be tested, improved, and archived. Given the success of the Open Source Model, we believe it may prove valuable in the development of scientific research codes. With this in mind, we are `Open Sourcing' the low to mid-latitude ionospheric model that has recently been developed at the Naval Research Laboratory: SAMI2 (Sami2 is Another Model of the Ionosphere). The model is comprehensive and uses modern numerical techniques. The structure and design of SAMI2 make it relatively easy to understand and modify: the numerical algorithms are simple and direct, and the code is reasonably well-written. Furthermore, SAMI2 is designed to run on personal computers; prohibitive computational resources are not necessary, thereby making the model accessible and usable by virtually all researchers. For these reasons, SAMI2 is an excellent candidate to explore and test the open source modeling paradigm in space physics research. We will discuss various topics associated with this project. Research supported by the Office of Naval Research.

  16. Statistical methods for the detection and analysis of radioactive sources

    NASA Astrophysics Data System (ADS)

    Klumpp, John

    We consider four topics from areas of radioactive statistical analysis in the present study: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
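
    As a concrete illustration of sequentially updating a probability distribution for a sample count rate (a hedged sketch using the standard conjugate Gamma-Poisson model, not necessarily the exact algorithm developed in this work), the example below updates the posterior after each counting interval; the prior parameters and simulated counts are placeholders.

    # Hedged sketch: sequential Bayesian update of a count rate with a Gamma-Poisson model (Python).
    from scipy import stats

    def sequential_update(counts, dt, alpha=1.0, beta=1.0):
        """Update a Gamma(alpha, rate=beta) posterior after each counting interval of dt seconds."""
        for n in counts:
            alpha += n      # add observed counts
            beta += dt      # add observed counting time
            yield alpha, beta

    counts = [3, 2, 5, 9, 12]                     # hypothetical counts per 10 s interval
    for alpha, beta in sequential_update(counts, dt=10.0):
        post = stats.gamma(a=alpha, scale=1.0 / beta)
        print(f"mean rate = {post.mean():.3f} cps, P(rate > 0.5 cps) = {post.sf(0.5):.3f}")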

  17. Web Server Security on Open Source Environments

    NASA Astrophysics Data System (ADS)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of their attackers. Up until now this kind of defense was a privilege of the few; out-budgeted and low-cost solutions left the defender vulnerable to the rise of innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never imagine fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century in adopting open solutions to E-Commerce and privacy issues. This paper describes substantial security precautions in facing privacy and authentication issues in a totally open source web environment. Our goal is to state and face the most known problems in data handling and consequently propose the most appealing techniques to face these challenges through an open solution.

  18. Safety and Security of Radioactive Sealed and Disused/Orphan Sources in Ukraine - German Contribution - 13359

    SciTech Connect

    Brasser, Thomas; Hertes, Uwe; Meyer, Thorsten; Uhlenbruck, Hermann; Shevtsov, Alexey

    2013-07-01

    Within the scope of 'Nuclear Security of Radioactive Sources', the German government implemented the modernization of the Ukrainian State Production Company's transport and storage facility for radioactive sources (TSF) in Kiev. The overall management of optimizing the physical protection of the storage facility (including the construction of a hot cell for handling the radioactive sources) is currently carried out by the German Federal Foreign Office (AA). The AA has assigned Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Germany's leading expert institution in the area of nuclear safety and waste management, to implement the project and to ensure transparency by financial and technical monitoring. Sealed radioactive sources are widely used in industry, medicine and research. Their life cycle starts with the production and finally ends with the interim/long-term storage of the disused sources. In Ukraine, IZOTOP is responsible for all radioactive sources throughout their life cycle. IZOTOP's transport and storage facility (TSF) is the only Ukrainian storage facility for factory-fresh radioactive sources up to an activity of about 1 million Ci (3.7 × 10^16 Bq). The TSF is specially designed for the storage and handling of radioactive sources. Storage began in 1968, and is licensed by the Ukrainian state authorities. Besides the outdated state of the TSF's physical protection and the vulnerability of the facility linked with it, the lack of a hot cell for handling and repacking radioactive sources on the site itself represents an additional potential hazard. The project, financed by the German Federal Foreign Office, aims to significantly improve the security of radioactive sources during their storage and handling at the TSF site. Main tasks of the project are a) the modernization of the physical protection of the TSF itself in order to prevent any unauthorized access to radioactive sources as well as b) the construction of a hot cell to reduce the number of

  19. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    ERIC Educational Resources Information Center

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  20. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited numbers of open-source software in China, "Red Flag Linux" stands out most strikingly, commanding 30 percent share of Chinese software market. Unlike the spontaneity of open-source movement in North America, open-source software development in China, such as…

  1. Communal Resources in Open Source Software Development

    ERIC Educational Resources Information Center

    Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit

    2008-01-01

    Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities underlying Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…

  2. Implementing Rakim: Open Source Chat Reference Software

    ERIC Educational Resources Information Center

    Caraway, Shawn; Payne, Susan

    2005-01-01

    This article describes the conception, implementation, and current status of Rakim open source software at Midlands Technical College in Columbia, SC. Midlands Technical College (MTC) is a 2-year school in Columbia, S.C. It has two large campuses and three smaller campuses. Although the library functions as a single unit, there are separate…

  3. Open-source syringe pump library.

    PubMed

    Wijnen, Bas; Hunt, Emily J; Anzalone, Gerald C; Pearce, Joshua M

    2014-01-01

    This article explores a new open-source method for developing and manufacturing high-quality scientific equipment suitable for use in virtually any laboratory. A syringe pump was designed using freely available open-source computer aided design (CAD) software and manufactured using an open-source RepRap 3-D printer and readily available parts. The design, bill of materials and assembly instructions are globally available to anyone wishing to use them. Details are provided covering the use of the CAD software and the RepRap 3-D printer. The use of an open-source Raspberry Pi computer as a wireless control device is also illustrated. Performance of the syringe pump was assessed and the methods used for assessment are detailed. The cost of the entire system, including the controller and web-based control interface, is on the order of 5% or less than one would expect to pay for a commercial syringe pump having similar performance. The design should suit the needs of a given research activity requiring a syringe pump including carefully controlled dosing of reagents, pharmaceuticals, and delivery of viscous 3-D printer media among other applications. PMID:25229451
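
    To illustrate the carefully controlled dosing aspect mentioned above, the sketch below converts a requested dose volume into stepper-motor steps for a lead-screw-driven syringe pump. The syringe diameter, screw pitch, and motor resolution are hypothetical placeholders, not the values of the published design.

    # Hedged sketch: dose volume to stepper-motor steps for a lead-screw syringe pump (Python).
    import math

    SYRINGE_DIAMETER_MM = 14.5     # hypothetical syringe inner diameter
    SCREW_PITCH_MM = 0.8           # hypothetical lead-screw travel per revolution
    STEPS_PER_REV = 200 * 16       # hypothetical motor steps x microstepping

    def steps_for_volume(volume_ml):
        """Number of motor steps needed to dispense the requested volume."""
        area_mm2 = math.pi * (SYRINGE_DIAMETER_MM / 2.0) ** 2
        travel_mm = (volume_ml * 1000.0) / area_mm2   # 1 mL = 1000 mm^3
        return round(travel_mm / SCREW_PITCH_MM * STEPS_PER_REV)

    print(steps_for_volume(0.25))   # steps for a hypothetical 0.25 mL dose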

  4. Of Birkenstocks and Wingtips: Open Source Licenses

    ERIC Educational Resources Information Center

    Gandel, Paul B.; Wheeler, Brad

    2005-01-01

    The notion of collaborating to create open source applications for higher education is rapidly gaining momentum. From course management systems to ERP financial systems, higher education institutions are working together to explore whether they can in fact build a better mousetrap. As Lois Brooks, of Stanford University, recently observed, the…

  5. Open-Source Syringe Pump Library

    PubMed Central

    Wijnen, Bas; Hunt, Emily J.; Anzalone, Gerald C.; Pearce, Joshua M.

    2014-01-01

    This article explores a new open-source method for developing and manufacturing high-quality scientific equipment suitable for use in virtually any laboratory. A syringe pump was designed using freely available open-source computer aided design (CAD) software and manufactured using an open-source RepRap 3-D printer and readily available parts. The design, bill of materials and assembly instructions are globally available to anyone wishing to use them. Details are provided covering the use of the CAD software and the RepRap 3-D printer. The use of an open-source Raspberry Pi computer as a wireless control device is also illustrated. Performance of the syringe pump was assessed and the methods used for assessment are detailed. The cost of the entire system, including the controller and web-based control interface, is on the order of 5% or less than one would expect to pay for a commercial syringe pump having similar performance. The design should suit the needs of a given research activity requiring a syringe pump including carefully controlled dosing of reagents, pharmaceuticals, and delivery of viscous 3-D printer media among other applications. PMID:25229451

  6. Open source OCR framework using mobile devices

    NASA Astrophysics Data System (ADS)

    Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan

    2008-02-01

    Mobile phones have evolved from passive one-to-one communication devices to powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet, and much more. Exciting new social applications are emerging on the mobile landscape, like business card readers, sign detectors, and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, with all these advancements we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none are available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection & recognition and the Flite speech synthesis module for adding text-to-speech ability.
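
    As a hedged desktop illustration of the OCR-to-speech pipeline described above (not the paper's mobile implementation), the sketch below chains the Tesseract OCR command-line tool and the Flite synthesizer via subprocess calls; it assumes both tools are installed and on the PATH, and the input image name is hypothetical.

    # Hedged sketch: image -> Tesseract OCR -> Flite text-to-speech (desktop Python).
    import os
    import subprocess
    import tempfile

    def image_to_speech(image_path):
        """Recognize text in an image with Tesseract, then speak it with Flite."""
        with tempfile.TemporaryDirectory() as tmp:
            out_base = os.path.join(tmp, "ocr_result")
            subprocess.run(["tesseract", image_path, out_base], check=True)   # writes ocr_result.txt
            with open(out_base + ".txt", encoding="utf-8") as f:
                text = f.read().strip()
            if text:
                subprocess.run(["flite", "-t", text], check=True)             # speak the recognized text
            return text

    print(image_to_speech("business_card.png"))   # hypothetical input image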

  7. Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    PubMed Central

    2011-01-01

    Background: The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results: This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions: We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342

  8. Dynamic Radioactive Source for Evaluating and Demonstrating Time-dependent Performance of Continuous Air Monitors.

    PubMed

    McLean, Thomas D; Moore, Murray E; Justus, Alan L; Hudston, Jonathan A; Barbé, Benoît

    2016-11-01

    Evaluation of continuous air monitors in the presence of a plutonium aerosol is time intensive, expensive, and requires a specialized facility. The Radiation Protection Services Group at Los Alamos National Laboratory has designed a Dynamic Radioactive Source, intended to replace plutonium aerosol challenge testing. The Dynamic Radioactive Source is small enough to be inserted into the sampler filter chamber of a typical continuous air monitor. Time-dependent radioactivity is introduced from electroplated sources for real-time testing of a continuous air monitor where a mechanical wristwatch motor rotates a mask above an alpha-emitting electroplated disk source. The mask is attached to the watch's minute hand, and as it rotates, more of the underlying source is revealed. The measured alpha activity increases with time, simulating the arrival of airborne radioactive particulates at the air sampler inlet. The Dynamic Radioactive Source allows the temporal behavior of puff and chronic release conditions to be mimicked without the need for radioactive aerosols. The new system is configurable to different continuous air monitor designs and provides an in-house testing capability (benchtop compatible). It is a repeatable and reusable system and does not contaminate the tested air monitor. Test benefits include direct user control, realistic (plutonium) aerosol spectra, and iterative development of continuous air monitor alarm algorithms. Data obtained using the Dynamic Radioactive Source has been used to elucidate alarm algorithms and to compare the response time of two commercial continuous air monitors. PMID:27682903
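
    As an idealized illustration of the time-dependent response such a rotating-mask source could produce (a sketch under simple geometric assumptions, not the device's measured behavior), the exposed fraction of the electroplated disk, and hence the expected alpha count rate, grows roughly linearly as the minute hand completes one revolution per hour; the full-exposure count rate and background below are placeholders.

    # Hedged sketch: expected count rate vs. time for a linearly opening rotating mask (Python).
    FULL_SOURCE_RATE_CPS = 50.0   # hypothetical count rate with the disk fully exposed
    BACKGROUND_CPS = 0.2          # hypothetical detector background

    def expected_rate(t_seconds):
        """Expected count rate t seconds after the mask starts to open (one revolution per hour)."""
        exposed_fraction = min(t_seconds / 3600.0, 1.0)
        return BACKGROUND_CPS + exposed_fraction * FULL_SOURCE_RATE_CPS

    for t in (0, 600, 1800, 3600):
        print(f"t = {t:4d} s  ->  {expected_rate(t):6.2f} cps")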

  9. Open Source, Open Standards, and Health Care Information Systems

    PubMed Central

    2011-01-01

    Recognition of the improvements in patient safety, quality of patient care, and efficiency that health care information systems have the potential to bring has led to significant investment. Globally the sale of health care information systems now represents a multibillion dollar industry. As policy makers, health care professionals, and patients, we have a responsibility to maximize the return on this investment. To this end we analyze alternative licensing and software development models, as well as the role of standards. We describe how licensing affects development. We argue for the superiority of open source licensing to promote safer, more effective health care information systems. We claim that open source licensing in health care information systems is essential to rational procurement strategy. PMID:21447469

  10. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  11. From open source communications to knowledge

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Roberts, Colin; Rogers, David; Webberley, Will; Innes, Martin; Braines, Dave

    2016-05-01

    Rapid processing and exploitation of open source information, including social media sources, in order to shorten decision-making cycles, has emerged as an important issue in intelligence analysis in recent years. Through a series of case studies and natural experiments, focussed primarily upon policing and counter-terrorism scenarios, we have developed an approach to information foraging and framing to inform decision making, drawing upon open source intelligence, in particular Twitter, due to its real-time focus and frequent use as a carrier for links to other media. Our work uses a combination of natural language (NL) and controlled natural language (CNL) processing to support information collection from human sensors, linking and schematising of collected information, and the framing of situational pictures. We illustrate the approach through a series of vignettes, highlighting (1) how relatively lightweight and reusable knowledge models (schemas) can rapidly be developed to add context to collected social media data, (2) how information from open sources can be combined with reports from trusted observers, for corroboration or to identify conflicting information; and (3) how the approach supports users operating at or near the tactical edge, to rapidly task information collection and inform decision-making. The approach is supported by bespoke software tools for social media analytics and knowledge management.

  12. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be of high resolution, process based with real-time capability to assess changing resource issues critical to short, medium and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight-integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  13. Review of work related to ion sources and targets for radioactive beams at Argonne

    SciTech Connect

    Nolen, J.A.

    1995-12-01

    A group including many ANL Physics Division staff and ATLAS outside users has discussed the possibilities for research with radioactive ion beams and prepared a working paper entitled "Concept for an Advanced Exotic Beam Facility Based on ATLAS." Several subgroups have been working on issues related to ion sources and targets which could be used in the production and ionization of radionuclides with high power primary beams. Present activities include: (a) setting up an ion source test stand to measure emittances and energy spreads of ISOL-type ion sources, (b) experiments to evaluate methods of containing liquid uranium for production targets, (c) experimental evaluation of geometries for the generation of secondary neutron beams for production of radionuclides, (d) setting up an ISOL-type ion source at a neutron generator facility to measure fission fragment release times and efficiencies, and (e) computer simulations of an electron-beam charge-state amplifier to increase the charge states of 1+ secondary beams to 2+, 3+ or 4+. The present status of these projects and future plans are reported below.

  14. Comparison of open source visual analytics toolkits.

    SciTech Connect

    Crossno, Patricia Joyce; Harger, John R.

    2010-11-01

    We present the results of the first stage of a two-stage evaluation of open source visual analytics packages. This stage is a broad feature comparison over a range of open source toolkits. Although we had originally intended to restrict ourselves to comparing visual analytics toolkits, we quickly found that very few were available. So we expanded our study to include information visualization, graph analysis, and statistical packages. We examine three aspects of each toolkit: visualization functions, analysis capabilities, and development environments. With respect to development environments, we look at platforms, language bindings, multi-threading/parallelism, user interface frameworks, ease of installation, documentation, and whether the package is still being actively developed.

  15. Open Source Real Time Operating Systems Overview

    SciTech Connect

    Straumann, Till

    2001-12-11

    Modern control system applications are often built on top of a real-time operating system (RTOS), which provides the necessary hardware abstraction as well as scheduling, networking and other services. Several open source RTOS solutions are publicly available, which is very attractive both from an economic (no licensing fees) and from a technical (control over the source code) point of view. This contribution gives an overview of the RTLinux and RTEMS systems (architecture, development environment, API, etc.). Both systems support most popular CPUs and several APIs (including POSIX), and offer networking, portability and optional commercial support. Some performance figures are presented, focusing on interrupt latency and context-switching delay.

  16. Open Source Approach to Urban Growth Simulation

    NASA Astrophysics Data System (ADS)

    Petrasova, A.; Petras, V.; Van Berkel, D.; Harmon, B. A.; Mitasova, H.; Meentemeyer, R. K.

    2016-06-01

    Spatial patterns of land use change due to urbanization and its impact on the landscape are the subject of ongoing research. Urban growth scenario simulation is a powerful tool for exploring these impacts and empowering planners to make informed decisions. We present FUTURES (FUTure Urban - Regional Environment Simulation) - a patch-based, stochastic, multi-level land change modeling framework as a case showing how what was once a closed and inaccessible model benefited from integration with open source GIS. We will describe our motivation for releasing this project as open source and the advantages of integrating it with GRASS GIS, a free, libre and open source GIS and research platform for the geospatial domain. GRASS GIS provides efficient libraries for FUTURES model development as well as standard GIS tools and graphical user interface for model users. Releasing FUTURES as a GRASS GIS add-on simplifies the distribution of FUTURES across all main operating systems and ensures the maintainability of our project in the future. We will describe FUTURES integration into GRASS GIS and demonstrate its usage on a case study in Asheville, North Carolina. The developed dataset and tutorial for this case study enable researchers to experiment with the model, explore its potential or even modify the model for their applications.
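
    For readers unfamiliar with GRASS GIS add-ons, the sketch below shows roughly how such an add-on might be installed and invoked from GRASS's Python scripting interface; the r.futures.pga module name is taken as representative, the parameter names are illustrative assumptions rather than the documented FUTURES interface, and the snippet only runs inside a GRASS session.

        # Rough sketch of driving a GRASS GIS add-on from Python; parameter
        # names below are assumptions, not the documented FUTURES interface.
        import grass.script as gs

        # One-off step: install the add-on inside a GRASS session.
        gs.run_command("g.extension", extension="r.futures")

        # Run a single patch-growing simulation scenario.
        gs.run_command(
            "r.futures.pga",
            developed="urban_2011",        # raster of currently developed cells
            subregions="counties",         # administrative subregions raster
            predictors="slope,road_dist",  # development suitability predictors
            demand="demand.csv",           # projected per-subregion demand
            output="urban_2030",           # simulated future development raster
        )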

  17. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
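
    To give a flavour of the two protocol styles mentioned above, the sketch below shows what a simple CSV setpoint schedule and a Python-based conditional rule might look like; the CSV column names, the controller object and its read_ph/set_feed_rate methods are purely hypothetical and do not reflect the actual BiofloSoftware file formats or API.

        # Hypothetical CSV setpoint schedule (column names invented):
        #   time_min,agitation_rpm,temperature_c,ph_setpoint
        #   0,200,37.0,7.0
        #   60,400,37.0,7.0
        #   120,400,30.0,6.8

        class DummyController:
            """Stand-in for the bioreactor interface (hypothetical)."""
            def __init__(self, ph=6.8):
                self._ph = ph
                self.feed_rate = 0.0
            def read_ph(self):
                return self._ph
            def set_feed_rate(self, rate_ml_h):
                self.feed_rate = rate_ml_h

        def conditional_feed(controller, ph_threshold=6.9, feed_rate_ml_h=5.0):
            """Start feeding once pH drops below a threshold: the kind of logic
            a Python protocol can express beyond a static CSV schedule."""
            if controller.read_ph() < ph_threshold:
                controller.set_feed_rate(feed_rate_ml_h)
            else:
                controller.set_feed_rate(0.0)

        if __name__ == "__main__":
            c = DummyController(ph=6.8)
            conditional_feed(c)
            print(c.feed_rate)   # 5.0, because pH is below the threshold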

  18. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.

  19. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  20. Open source portal to distributed image repositories

    NASA Astrophysics Data System (ADS)

    Tao, Wenchao; Ratib, Osman M.; Kho, Hwa; Hsu, Yung-Chao; Wang, Cun; Lee, Cason; McCoy, J. M.

    2004-04-01

    In a large institution's PACS, patient data may often reside in multiple separate systems. While most systems tend to be DICOM compliant, none of them offers the flexibility of seamless integration of multiple DICOM sources through a single access point. We developed a generic portal system with a web-based interactive front-end as well as an application programming interface (API) that allows both web users and client applications to query and retrieve image data from multiple DICOM sources. A set of software tools was developed to allow access to several DICOM archives through a single point of access. An interactive web-based front-end allows users to search image data seamlessly across the different archives and to display the results or route the image data to another DICOM-compliant destination. An XML-based API allows other software programs to easily benefit from this portal to query and retrieve image data as well. Various techniques are employed to minimize the performance overhead inherent in the DICOM protocol. The system is integrated with a hospital-wide HIPAA-compliant authentication and auditing service that provides centralized management of access to patient medical records. The system is provided under a free open-source license and developed using open-source components (Apache Tomcat as the web server, MySQL as the database, OJB for object/relational data mapping, etc.). The portal paradigm offers a convenient and effective solution for accessing multiple image data sources in a given healthcare enterprise and can easily be extended to multiple institutions through appropriate security and encryption mechanisms.
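
    The portal's own XML API is not reproduced in the abstract, but the following sketch, written against the open-source pynetdicom library, illustrates the underlying pattern of fanning a single patient query out to several DICOM archives and merging the results; the host names, ports and AE titles are placeholders, not the systems described above.

        # Sketch: query several DICOM archives for one patient and merge results.
        from pydicom.dataset import Dataset
        from pynetdicom import AE
        from pynetdicom.sop_class import PatientRootQueryRetrieveInformationModelFind

        ARCHIVES = [("pacs1.example.org", 104, "ARCHIVE1"),   # placeholders
                    ("pacs2.example.org", 104, "ARCHIVE2")]

        def find_studies(patient_id):
            query = Dataset()
            query.QueryRetrieveLevel = "STUDY"
            query.PatientID = patient_id
            query.StudyInstanceUID = ""    # ask each archive to return this field

            results = []
            for host, port, called_ae in ARCHIVES:
                ae = AE(ae_title="PORTAL")
                ae.add_requested_context(PatientRootQueryRetrieveInformationModelFind)
                assoc = ae.associate(host, port, ae_title=called_ae)
                if not assoc.is_established:
                    continue               # skip archives that are unreachable
                for status, identifier in assoc.send_c_find(
                        query, PatientRootQueryRetrieveInformationModelFind):
                    if status and identifier is not None:
                        results.append((called_ae,
                                        getattr(identifier, "StudyInstanceUID", "")))
                assoc.release()
            return results

        if __name__ == "__main__":
            for archive, study_uid in find_studies("12345"):
                print(archive, study_uid)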

  1. 77 FR 64435 - Branch Technical Position on the Import of Non-U.S. Origin Radioactive Sources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-22

    ... greater than the number of new sources exported. \\3\\ Import and Export of Radioactive Waste, 60 FR 37556... Radioactive Sources AGENCY: Nuclear Regulatory Commission. ACTION: Request for comment. SUMMARY: In 2010, the... definition of ``radioactive waste''. The phrase was added to the final rule in response to a public...

  2. Beyond Open Source: According to Jim Hirsch, Open Technology, Not Open Source, Is the Wave of the Future

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.

  3. Open Source Testing Capability for Geospatial Software

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2013-12-01

    Geospatial software enables scientists to discover, access and process information for better understanding of the Earth. Hundreds, if not thousands, of geospatial software packages exist today. Many of these implement open standards. The OGC Implementation Statistics page [1] reports, for example, more than 450 software products that implement the OGC Web Map Service (WMS) 1.1.1 standard. Even though organizations voluntarily report their products as implementing the WMS standard, not all of these implementations can interoperate with each other. For example, a WMS client may not interact with all these WMS servers in the same functional way. Making software work with other software, even when both implement the same standard, remains a challenge, and the main reason is that not all implementations implement the standard correctly. The Open Geospatial Consortium (OGC) Compliance Program provides a testing infrastructure to test for the correct implementation of OGC standards in interfaces and encodings that enable communication between geospatial clients and servers. The OGC testing tool and the tests are all freely available, including the source code and access to the testing facility. The Test, Evaluation, And Measurement (TEAM) Engine is a test harness that executes test suites written using the OGC Compliance Testing Language (CTL) or the TestNG framework. TEAM Engine is available on SourceForge. OGC hosts an official stable [2] deployment of TEAM Engine with the approved test suites. OGC also hosts a Beta TEAM Engine [3] with the tests in Beta and with new TEAM Engine functionality. Both deployments are freely available to everyone. The OGC testing infrastructure not only enables developers to test OGC standards, but it can be configured to test profiles of OGC standards and community-developed application agreements. These agreements can be any interface and encoding agreement, not only OGC based. The OGC Compliance Program is thus an important

  4. Open Knee: Open Source Modeling and Simulation in Knee Biomechanics.

    PubMed

    Erdemir, Ahmet

    2016-02-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical functions of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. Interested parties can immediately utilize readily available models for scientific discovery and clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes the detailed anatomical representation of the joint's major tissue structures and their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators are also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  5. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  6. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while less than…

  7. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  8. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  9. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  10. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    NASA Technical Reports Server (NTRS)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  11. Open-source solutions for SPIMage processing.

    PubMed

    Schmied, Christopher; Stamataki, Evangelia; Tomancak, Pavel

    2014-01-01

    Light sheet microscopy is an emerging technique allowing comprehensive visualization of dynamic biological processes at high spatial and temporal resolution without significant damage to the sample by the imaging process itself. It thus lends itself to time-lapse observation of fluorescently labeled molecular markers over long periods of time in a living specimen. In combination with sample rotation, light sheet microscopy, and in particular its selective plane illumination microscopy (SPIM) flavor, enables imaging of relatively large specimens, such as embryos of animal model organisms, in their entirety. The benefits of SPIM multiview imaging come at the cost of image data postprocessing necessary to deliver the final output that can be analyzed. Here, we provide a set of practical recipes that walk biologists through the complex processes of SPIM data registration, fusion, deconvolution, and time-lapse registration using publicly available open-source tools. We explain, in plain language, the basic principles behind SPIM image-processing algorithms that should enable users to make informed decisions during parameter tuning of the various processing steps applied to their own datasets. Importantly, the protocols presented here are applicable equally to processing of multiview SPIM data from the commercial Zeiss Lightsheet Z.1 microscope and from open-access SPIM platforms such as OpenSPIM. PMID:24974045

  12. Spatial rainfall data in open source environment

    NASA Astrophysics Data System (ADS)

    Schuurmans, Hanneke; Maarten Verbree, Jan; Leijnse, Hidde; van Heeringen, Klaas-Jan; Uijlenhoet, Remko; Bierkens, Marc; van de Giesen, Nick; Gooijer, Jan; van den Houten, Gert

    2013-04-01

    Since January 2013 the Netherlands has had access to innovative, high-quality rainfall data for water managers. This product is innovative for the following reasons. (i) The product is developed in a 'golden triangle' construction, a cooperation between government, business and research. (ii) The rainfall products are developed under the open-source GPL license. The initiative comes from a group of water boards in the Netherlands that joined forces to fund the development of a new rainfall product. Not only data from the Dutch radar stations (as is currently used by the Dutch meteorological organization KNMI) is used, but also data from radars in Germany and Belgium. After a radar composite is made, it is adjusted using data from rain gauges (ground truth). This results in 9 different rainfall products that provide the best rainfall estimate for each moment. Specific knowledge is necessary to develop this kind of product; therefore a pool of experts (KNMI, Deltares and 3 universities) participated in the development. The philosophy of the developing partners is that products like this should be developed as open source. This way knowledge is shared and the whole community is able to make suggestions for improvement. In our opinion this is the only way to make real progress in product development. Furthermore, the financial resources of government organizations are used more efficiently. More info (in Dutch): www.nationaleregenradar.nl
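
    The gauge-adjustment step mentioned above can take many forms; as a simplified illustration (not the algorithm used by the operational Dutch product), the sketch below applies a mean-field bias correction in which the radar composite is scaled by the ratio of mean gauge rainfall to mean radar rainfall at the gauge locations.

        # Simplified mean-field bias adjustment of a radar composite with rain
        # gauges; an illustration of the general idea, not the operational method.
        import numpy as np

        def mean_field_adjust(radar, gauge_values, gauge_pixels):
            """radar        : 2-D array of radar rainfall estimates (mm)
               gauge_values : 1-D array of gauge accumulations (mm)
               gauge_pixels : list of (row, col) radar pixels co-located with gauges"""
            radar_at_gauges = np.array([radar[r, c] for r, c in gauge_pixels])
            valid = radar_at_gauges > 0.1      # ignore near-zero radar pixels
            if not np.any(valid):
                return radar                   # nothing to adjust against
            bias = gauge_values[valid].mean() / radar_at_gauges[valid].mean()
            return radar * bias

        if __name__ == "__main__":
            radar = np.array([[1.0, 2.0], [0.5, 4.0]])
            gauges = np.array([1.4, 5.2])
            print(mean_field_adjust(radar, gauges, [(0, 1), (1, 1)]))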

  13. Open Source Hardware for DIY Environmental Sensing

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Hicks, S. D.; Damiano, S. G.; Montgomery, D. S.

    2014-12-01

    The Arduino open source electronics platform has been very popular within the DIY (Do It Yourself) community for several years, and it is now providing environmental science researchers with an inexpensive alternative to commercial data logging and transmission hardware. Here we present the designs for our latest series of custom Arduino-based dataloggers, which include wireless communication options like self-meshing radio networks and cellular phone modules. The main Arduino board uses a custom interface board to connect to various research-grade sensors to take readings of turbidity, dissolved oxygen, water depth and conductivity, soil moisture, solar radiation, and other parameters. Sensors with SDI-12 communications can be directly interfaced to the logger using our open Arduino-SDI-12 software library (https://github.com/StroudCenter/Arduino-SDI-12). Different deployment options are shown, like rugged enclosures to house the loggers and rigs for mounting the sensors in both fresh water and marine environments. After the data has been collected and transmitted by the logger, the data is received by a MySQL-PHP stack running on a web server that can be accessed from anywhere in the world. Once there, the data can be visualized on web pages or served through REST requests and WaterOneFlow (WOF) services. Since one of the main benefits of using open source hardware is the easy collaboration between users, we are introducing a new web platform for discussion and sharing of ideas and plans for hardware and software designs used with DIY environmental sensors and data loggers.

  14. An Open Source Tool to Test Interoperability

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and handling of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software is reduced. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open-source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schema- and Schematron-based assertions for any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. This presentation will provide an overview of TEAM Engine and an introduction to testing via the OGC Testing web site and
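
    Real OGC test suites are written in CTL or TestNG rather than Python, but the toy check below conveys the kind of assertion such a suite makes against an HTTP response: request a WMS 1.1.1 GetCapabilities document and verify the status code, content type and root element. The endpoint URL is a placeholder.

        # Toy compliance-style assertions against a WMS GetCapabilities response;
        # the endpoint URL is a placeholder and the checks are far less thorough
        # than a real CTL/TestNG suite.
        import xml.etree.ElementTree as ET
        import requests

        WMS_ENDPOINT = "https://example.org/wms"   # placeholder service under test

        def check_getcapabilities(endpoint):
            failures = []
            resp = requests.get(endpoint, params={
                "SERVICE": "WMS", "REQUEST": "GetCapabilities", "VERSION": "1.1.1"})
            if resp.status_code != 200:
                failures.append("expected HTTP 200, got %d" % resp.status_code)
            if "xml" not in resp.headers.get("Content-Type", ""):
                failures.append("response is not XML")
            else:
                root = ET.fromstring(resp.content)
                if not root.tag.endswith("WMT_MS_Capabilities"):
                    failures.append("unexpected root element " + root.tag)
            return failures

        if __name__ == "__main__":
            for failure in check_getcapabilities(WMS_ENDPOINT):
                print("FAIL:", failure)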

  15. Predicting induced radioactivity for the accelerator operations at the Taiwan Photon Source.

    PubMed

    Sheu, R J; Jiang, S H

    2010-12-01

    This study investigates the characteristics of induced radioactivity due to the operations of a 3-GeV electron accelerator at the Taiwan Photon Source (TPS). According to the beam loss analysis, the authors set two representative irradiation conditions for the activation analysis. The FLUKA Monte Carlo code has been used to predict the isotope inventories, residual activities, and remanent dose rates as a function of time. The calculation model itself is simple but conservative for the evaluation of induced radioactivity in a light source facility. This study highlights the importance of beam loss scenarios and demonstrates the great advantage of using FLUKA in comparing the predicted radioactivity with corresponding regulatory limits. The calculated results lead to the conclusion that, due to fairly low electron consumption, the radioactivity induced in the accelerator components and surrounding concrete walls of the TPS is rather moderate and manageable, while the possible activation of air and cooling water in the tunnel and their environmental releases are negligible.
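
    For context, activation analyses of this kind ultimately rest on the standard single-nuclide irradiation/decay relation: a saturation term during irradiation followed by exponential decay during cool-down. The generic textbook form below is given only as background and is not a formula taken from the cited study.

        % A = induced activity, N = number of target atoms, \sigma = production
        % cross section, \phi = particle flux, \lambda = decay constant,
        % t_{\mathrm{irr}} = irradiation time, t_c = cooling time.
        A(t_{\mathrm{irr}}, t_c) = N \sigma \phi \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right) e^{-\lambda t_c}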

  16. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  17. Enrico Fermi's Discovery of Neutron-Induced Artificial Radioactivity: Neutrons and Neutron Sources

    NASA Astrophysics Data System (ADS)

    Guerra, Francesco; Leone, Matteo; Robotti, Nadia

    2006-09-01

    We reconstruct and analyze the path leading from James Chadwick’s discovery of the neutron in February 1932 through Frédéric Joliot and Irène Curie’s discovery of artificial radioactivity in January 1934 to Enrico Fermi’s discovery of neutron-induced artificial radioactivity in March 1934. We show, in particular, that Fermi’s innovative construction and use of radon-beryllium neutron sources permitted him to make his discovery.

  18. Calibration of a DSSSD detector with radioactive sources

    SciTech Connect

    Guadilla, V.; Tain, J. L.; Agramunt, J.; Algora, A.; Domingo-Pardo, C.; Rubio, B.

    2013-06-10

    The energy calibration of a DSSSD is carried out with the spectra produced by a 207Bi conversion electron source, a 137Cs gamma source and a 239Pu/241Am/244Cm triple alpha source, as well as by employing a precision pulse generator over the whole dynamic range. Multiplicity and coincidence of signals in different strips for the same event are also studied.
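
    As a rough illustration of the calibration step (not the authors' actual procedure), a linear channel-to-energy relation can be fitted to the three main alpha lines of the triple source; the energies below are nominal literature values, while the channel centroids are invented for the example.

        # Illustrative linear energy calibration from a 239Pu/241Am/244Cm triple
        # alpha source; channel centroids are invented, energies are nominal
        # literature values for the main alpha lines.
        import numpy as np

        alpha_energies_kev = np.array([5157.0, 5486.0, 5805.0])  # 239Pu, 241Am, 244Cm
        measured_channels  = np.array([2578.0, 2743.0, 2902.0])  # hypothetical peaks

        # Least-squares fit of E = gain * channel + offset
        gain, offset = np.polyfit(measured_channels, alpha_energies_kev, 1)
        print("gain = %.3f keV/channel, offset = %.1f keV" % (gain, offset))

        def channel_to_energy(channel):
            """Convert a channel number to energy (keV) with the fitted calibration."""
            return gain * channel + offset

        print(channel_to_energy(2800.0))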

  19. Application of radioactive sources in analytical instruments for planetary exploration.

    PubMed

    Economou, Thanasis E

    2010-01-01

    Radioactive isotopes have been used in analytical instrumentation for planetary exploration since the very beginning of the space age. An Alpha Scattering Instrument (ASI) on board the Surveyor 5, 6 and 7 spacecraft used the isotope (242)Cm to obtain the chemical composition of the lunar surface material in the 1960s. The Alpha Proton X-ray Spectrometers (APXS) used on several missions to Mars (Pathfinder, Mars-96, the Mars Exploration Rovers (MER), the Mars Science Laboratory (MSL, the next mission to Mars in 2011) and the Rosetta mission to a comet) are improved derivatives of the original ASI, complemented with an X-ray mode and using the longer-lived (244)Cm isotope. (57)Co, (55)Fe and many other radioisotopes have been used in several missions carrying XRF and Mössbauer instruments. In addition, the (238)Pu isotope is used almost exclusively for heating and power generation in most space missions. PMID:19850487

  20. An open source simulator for water management

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Selby, Philip; Mohammed, Khaled; Khadem, Majed; Padula, Silvia; Harou, Julien; Rosenberg, David; Rheinheimer, David

    2015-04-01

    Descriptive modelling of water resource systems requires the representation of different aspects in one model: the physical system including hydrological inputs and engineered infrastructure, and human management, including social, economic and institutional behaviours and constraints. Although most water resource systems share some characteristics such as the ability to represent them as a network of nodes and links, geographical, institutional and other differences mean that invariably each water system functions in a unique way. A diverse group is developing an open source simulation framework which will allow model developers to build generalised water management models that are customised to the institutional, physical and economical components they are seeking to model. The framework will allow the simulation of complex individual and institutional behaviour required for the assessment of real-world resource systems. It supports the spatial and hierarchical structures commonly found in water resource systems. The individual infrastructures can be operated by different actors while policies are defined at a regional level by one or more institutional actors. The framework enables building multi-agent system simulators in which developers can define their own agent types and add their own decision making code. Developers using the framework have two main tasks: (i) Extend the core classes to represent the aspects of their particular system, and (ii) write model structure files. Both are done in Python. For task one, users must either write new decision making code for each class or link to an existing code base to provide functionality to each of these extension classes. The model structure file links these extension classes in a standardised way to the network topology. The framework will be open-source and written in Python and is to be available directly for download through standard installer packages. Many water management model developers are unfamiliar
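
    To make the two developer tasks above concrete, the sketch below shows what extending a core node class with custom decision-making code might look like; since the framework described here is not named or released in the abstract, the Node base class, the Reservoir extension and the step method are entirely hypothetical rather than the framework's real API.

        # Entirely hypothetical sketch of the "extend a core class" pattern
        # described in the abstract; names are invented.
        class Node:
            """Minimal stand-in for a core network node class."""
            def __init__(self, name):
                self.name = name
                self.storage = 0.0
            def step(self, inflow):
                raise NotImplementedError

        class Reservoir(Node):
            """Developer extension: a reservoir with a simple release rule."""
            def __init__(self, name, capacity, target_release):
                super().__init__(name)
                self.capacity = capacity
                self.target_release = target_release
            def step(self, inflow):
                # Store the inflow, cap at capacity, then release towards the target.
                self.storage = min(self.storage + inflow, self.capacity)
                release = min(self.target_release, self.storage)
                self.storage -= release
                return release

        if __name__ == "__main__":
            res = Reservoir("upper_dam", capacity=100.0, target_release=8.0)
            for inflow in [12.0, 5.0, 0.0]:
                print(res.step(inflow))

    A model structure file would then link instances of such classes into the node-and-link network topology, and institutional actors could be represented in the same way, with their decision logic operating across groups of nodes.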

  1. Ion sources for initial use at the Holifield Radioactive Ion Beam Facility

    SciTech Connect

    Alton, G.D.

    1993-12-31

    The Holifield Radioactive Ion Beam Facility (HRIBF) now under construction at the Oak Ridge National Laboratory will use the 25-MV tandem accelerator for the acceleration of radioactive ion beams to energies appropriate for research in nuclear physics; negative ion beams are, therefore, required for injection into the tandem accelerator. Because charge exchange is an efficient means for converting initially positive ion beams to negative ion beams, both positive and negative ion sources are viable options for use at the facility; the choice of the type of ion source will depend on the overall efficiency for generating the radioactive species of interest. A high-temperature version of the CERN-ISOLDE positive ion source has been selected and a modified version of the source designed and fabricated for initial use at the HRIBF because of its low emittance, relatively high ionization efficiencies and species versatility, and because it has been engineered for remote installation, removal and servicing as required for safe handling in a high-radiation-level ISOL facility. Prototype plasma-sputter negative ion sources and negative surface-ionization sources are also under design consideration for generating negative radioactive ion beams from high-electron-affinity elements. The design features of these sources and expected efficiencies and beam qualities (emittances) will be described in this report.

  2. An Affordable Open-Source Turbidimeter

    PubMed Central

    Kelley, Christopher D.; Krolick, Alexander; Brunner, Logan; Burklund, Alison; Kahn, Daniel; Ball, William P.; Weber-Shirk, Monroe

    2014-01-01

    Turbidity is an internationally recognized criterion for assessing drinking water quality, because the colloidal particles in turbid water may harbor pathogens, chemically reduce oxidizing disinfectants, and hinder attempts to disinfect water with ultraviolet radiation. A turbidimeter is an electronic/optical instrument that assesses turbidity by measuring the scattering of light passing through a water sample containing such colloidal particles. Commercial turbidimeters cost hundreds or thousands of dollars, putting them beyond the reach of low-resource communities around the world. An affordable open-source turbidimeter based on a single light-to-frequency sensor was designed and constructed, and evaluated against a portable commercial turbidimeter. The final product, which builds on extensive published research, is intended to catalyze further developments in affordable water and sanitation monitoring. PMID:24759114

  3. An open source business model for malaria.

    PubMed

    Årdal, Christine; Røttingen, John-Arne

    2015-01-01

    Greater investment is required in developing new drugs and vaccines against malaria in order to eradicate malaria. These precious funds must be carefully managed to achieve the greatest impact. We evaluate existing efforts to discover and develop new drugs and vaccines for malaria to determine how best malaria R&D can benefit from an enhanced open source approach and how such a business model may operate. We assess research articles, patents, clinical trials and conducted a smaller survey among malaria researchers. Our results demonstrate that the public and philanthropic sectors are financing and performing the majority of malaria drug/vaccine discovery and development, but are then restricting access through patents, 'closed' publications and hidden away physical specimens. This makes little sense since it is also the public and philanthropic sector that purchases the drugs and vaccines. We recommend that a more "open source" approach is taken by making the entire value chain more efficient through greater transparency which may lead to more extensive collaborations. This can, for example, be achieved by empowering an existing organization like the Medicines for Malaria Venture (MMV) to act as a clearing house for malaria-related data. The malaria researchers that we surveyed indicated that they would utilize such registry data to increase collaboration. Finally, we question the utility of publicly or philanthropically funded patents for malaria medicines, where little to no profits are available. Malaria R&D benefits from a publicly and philanthropically funded architecture, which starts with academic research institutions, product development partnerships, commercialization assistance through UNITAID and finally procurement through mechanisms like The Global Fund to Fight AIDS, Tuberculosis and Malaria and the U.S.' President's Malaria Initiative. We believe that a fresh look should be taken at the cost/benefit of patents particularly related to new malaria

  4. Safety and security management of disused sealed radioactive sources in Thailand

    NASA Astrophysics Data System (ADS)

    Ya-anant, N.; Nuanjan, P.; Phattanasub, A.; Akharawutchayanon, T.; O-manee, A.; Prasertchiewchan, N.; Benitez-Navarro, J. C.

    2015-05-01

    When sealed radioactive sources are no longer in use, they should be returned to the country of origin. However, most of them cannot be returned; therefore, disused sealed radioactive sources (DSRS) have to be managed locally to ensure safety and security during long-term storage before final disposal. The Radioactive Waste Management Center, Thailand Institute of Nuclear Technology, is authorized to operate the treatment, conditioning and storage of DSRS in Thailand. This paper describes the operational procedures for characterization, identification of unknown sources, volume reduction, re-packaging, registration and record keeping of DSRS. As a result, record keeping for DSRS has been developed and the national inventory of stored DSRS has been brought up to date. The results confirmed that quality control at the DSRS storage facility at the Thailand Institute of Nuclear Technology was established and well implemented to ensure safe and secure management.

  5. Development of Approach for Long-Term Management of Disused Sealed Radioactive Sources - 13630

    SciTech Connect

    Kinker, M.; Reber, E.; Mansoux, H.; Bruno, G.

    2013-07-01

    Radioactive sources are used widely throughout the world in a variety of medical, industrial, research and military applications. When such radioactive sources are no longer used and are not intended to be used for the practice for which an authorization was granted, they are designated as 'disused sources'. Whether appropriate controls are in place during the useful life of a source or not, the end of this useful life is often a turning point after which it is more difficult to ensure the safety and security of the source over time. For various reasons, many disused sources cannot be returned to the manufacturer or the supplier for reuse or recycling. When these attempts fail, disused sources should be declared as radioactive waste and should be managed as such, in compliance with relevant international legal instruments and safety standards. However, disposal remains an unresolved issue in many countries, due in part to limited public acceptance, insufficient funding, and a lack of practical examples of strategies for determining suitable disposal options. As a result, disused sources are often stored indefinitely at the facilities where they were once used. In order to prevent disused sources from becoming orphan sources, each country must develop and implement a comprehensive waste management strategy that includes disposal of disused sources. The International Atomic Energy Agency (IAEA) fosters international cooperation between countries and encourages the development of a harmonized 'cradle to grave' approach to managing sources consistent with international legal instruments, IAEA safety standards, and international good practices. This 'cradle to grave' approach requires the development of a national policy and implementing strategy, an adequate legal and regulatory framework, and adequate resources and infrastructure that cover the entire life cycle, from production and use of radioactive sources to disposal. (authors)

  6. The Open Source Snowpack modelling ecosystem

    NASA Astrophysics Data System (ADS)

    Bavay, Mathias; Fierz, Charles; Egger, Thomas; Lehning, Michael

    2016-04-01

    As a large number of numerical snow models are available, a few stand out as quite mature and widespread. One such model is SNOWPACK, the Open Source model that is developed at the WSL Institute for Snow and Avalanche Research SLF. Over the years, various tools have been developed around SNOWPACK in order to expand its use or to integrate additional features. Today, the model is part of a whole ecosystem that has evolved to both offer seamless integration and high modularity so each tool can easily be used outside the ecosystem. Many of these Open Source tools experience their own, autonomous development and are successfully used in their own right in other models and applications. There is Alpine3D, the spatially distributed version of SNOWPACK, that forces it with terrain-corrected radiation fields and optionally with blowing and drifting snow. This model can be used on parallel systems (either with OpenMP or MPI) and has been used for applications ranging from climate change to reindeer herding. There is the MeteoIO pre-processing library that offers fully integrated data access, data filtering, data correction, data resampling and spatial interpolations. This library is now used by several other models and applications. There is the SnopViz snow profile visualization library and application that supports both measured and simulated snow profiles (relying on the CAAML standard) as well as time series. This JavaScript application can be used standalone without any internet connection or served on the web together with simulation results. There is the OSPER data platform effort with a data management service (built on the Global Sensor Network (GSN) platform) as well as a data documenting system (metadata management as a wiki). There are several distributed hydrological models for mountainous areas in ongoing development that require very little information about the soil structure based on the assumption that in steep terrain, the most relevant information is

  7. Conditioning and Repackaging of Spent Radioactive Cs-137 and Co-60 Sealed Sources in Egypt - 13490

    SciTech Connect

    Hasan, M.A.; Selim, Y.T.; El-Zakla, T.

    2013-07-01

    Radioactive sealed sources (RSSs) are widely used all over the world in medicine, agriculture, industry, research, etc. Accidental misuse of and exposure to RSSs has caused significant environmental contamination, serious injuries and many deaths. The high specific activity of the materials in many RSSs means that the spread of as little as microgram quantities can generate significant risk to human health and inhibit the use of buildings and land. Conditioning of such sources is a must to protect humans and the environment from the hazards of ionizing radiation and contamination. Conditioning also increases the security of these sources by decreasing the probability of theft and/or use in terrorist attacks. According to Law No. 7/2010, the Egyptian Atomic Energy Authority, represented by the Hot Laboratories and Waste Management Center (centralized waste facility, HLWMC), has the responsibility for collecting, conditioning, storing and managing all types of radioactive waste from all Egyptian territory, including spent radioactive sealed sources (SRSSs). This paper explains the conditioning procedures for two of the most common SRSSs, Cs-137 and Co-60 sources, which make up more than 90% of the total spent radioactive sealed sources stored in our centralized waste facility, as one of the major activities of the Hot Laboratories and Waste Management Center. Conditioning has to meet three main objectives: the conditioned sources must be acceptable for storage, safe to transport, and compliant with disposal requirements. (authors)

  8. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are introduced--these…

  9. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

  10. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  11. Internet as a Source of Misconception: "Radiation and Radioactivity"

    ERIC Educational Resources Information Center

    Acar Sesen, Burcin; Ince, Elif

    2010-01-01

    The purpose of this study is to examine students' usage styles of the Internet for seeking information and to investigate whether information obtained from the Internet is a source of misconceptions. For this reason, a two-stage study was conducted. At the first stage, a questionnaire was developed to get information about students' Internet usage…

  12. An Open Source Business Model for Malaria

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2015-01-01

    Greater investment is required in developing new drugs and vaccines against malaria in order to eradicate malaria. These precious funds must be carefully managed to achieve the greatest impact. We evaluate existing efforts to discover and develop new drugs and vaccines for malaria to determine how best malaria R&D can benefit from an enhanced open source approach and how such a business model may operate. We assess research articles, patents, clinical trials and conducted a smaller survey among malaria researchers. Our results demonstrate that the public and philanthropic sectors are financing and performing the majority of malaria drug/vaccine discovery and development, but are then restricting access through patents, ‘closed’ publications and hidden away physical specimens. This makes little sense since it is also the public and philanthropic sector that purchases the drugs and vaccines. We recommend that a more “open source” approach is taken by making the entire value chain more efficient through greater transparency which may lead to more extensive collaborations. This can, for example, be achieved by empowering an existing organization like the Medicines for Malaria Venture (MMV) to act as a clearing house for malaria-related data. The malaria researchers that we surveyed indicated that they would utilize such registry data to increase collaboration. Finally, we question the utility of publicly or philanthropically funded patents for malaria medicines, where little to no profits are available. Malaria R&D benefits from a publicly and philanthropically funded architecture, which starts with academic research institutions, product development partnerships, commercialization assistance through UNITAID and finally procurement through mechanisms like The Global Fund to Fight AIDS, Tuberculosis and Malaria and the U.S.’ President’s Malaria Initiative. We believe that a fresh look should be taken at the cost/benefit of patents particularly related to new

  13. Fast pulsed operation of a small non-radioactive electron source with continuous emission current control.

    PubMed

    Cochems, P; Kirk, A T; Bunert, E; Runge, M; Goncalves, P; Zimmermann, S

    2015-06-01

    Non-radioactive electron sources are of great interest in any application requiring the emission of electrons at atmospheric pressure, as they offer better control over emission parameters than radioactive electron sources and are not subject to legal restrictions. Recently, we published a simple electron source consisting only of a vacuum housing, a filament, and a single control grid. In this paper, we present improved control electronics that utilize this control grid in order to focus and defocus the electron beam, thus pulsing the electron emission at atmospheric pressure. This allows short emission pulses and excellent stability of the emitted electron current due to continuous control, both during pulsed and continuous operations. As an application example, this electron source is coupled to an ion mobility spectrometer. Here, the pulsed electron source allows experiments on gas phase ion chemistry (e.g., ion generation and recombination kinetics) and can even remove the need for a traditional ion shutter.

  14. Fast pulsed operation of a small non-radioactive electron source with continuous emission current control

    SciTech Connect

    Cochems, P.; Kirk, A. T.; Bunert, E.; Runge, M.; Goncalves, P.; Zimmermann, S.

    2015-06-15

    Non-radioactive electron sources are of great interest in any application requiring the emission of electrons at atmospheric pressure, as they offer better control over emission parameters than radioactive electron sources and are not subject to legal restrictions. Recently, we published a simple electron source consisting only of a vacuum housing, a filament, and a single control grid. In this paper, we present improved control electronics that utilize this control grid in order to focus and defocus the electron beam, thus pulsing the electron emission at atmospheric pressure. This allows short emission pulses and excellent stability of the emitted electron current due to continuous control, both during pulsed and continuous operations. As an application example, this electron source is coupled to an ion mobility spectrometer. Here, the pulsed electron source allows experiments on gas phase ion chemistry (e.g., ion generation and recombination kinetics) and can even remove the need for a traditional ion shutter.

  15. Quantitative radiochemical methods for determination of the sources of natural radioactivity

    USGS Publications Warehouse

    Rosholt, J.N.

    1957-01-01

    Study of the state of equilibrium of any natural radioactive source requires determination of several key nuclides or groups of nuclides to find their contribution to the total amount of radioactivity. Alpha activity measured by scintillation counting is used for determination of protactinium-231, thorium-232, thorium-230, and radium-226. The chemical procedures for the separations of the specific elements are described, as well as the measurement techniques used to determine the abundances of the individual isotopes. To correct for deviations in the ore standards, an independent means of evaluating the efficiencies of the individual separations and measurements is used. The development of these methods of radiochemical analysis facilitates detailed investigation of the major sources of natural radioactivity.

  16. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  17. Closed-Loop, Open-Source Electrophysiology

    PubMed Central

    Rolston, John D.; Gross, Robert E.; Potter, Steve M.

    2010-01-01

    Multiple extracellular microelectrodes (multi-electrode arrays, or MEAs) effectively record rapidly varying neural signals, and can also be used for electrical stimulation. Multi-electrode recording can serve as artificial output (efferents) from a neural system, while complex spatially and temporally targeted stimulation can serve as artificial input (afferents) to the neuronal network. Multi-unit or local field potential (LFP) recordings can not only be used to control real world artifacts, such as prostheses, computers or robots, but can also trigger or alter subsequent stimulation. Real-time feedback stimulation may serve to modulate or normalize aberrant neural activity, to induce plasticity, or to serve as artificial sensory input. Despite promising closed-loop applications, commercial electrophysiology systems do not yet take advantage of the bidirectional capabilities of multi-electrodes, especially for use in freely moving animals. We addressed this lack of tools for closing the loop with NeuroRighter, an open-source system including recording hardware, stimulation hardware, and control software with a graphical user interface. The integrated system is capable of multi-electrode recording and simultaneous patterned microstimulation (triggered by recordings) with minimal stimulation artifact. The potential applications of closed-loop systems as research tools and clinical treatments are broad; we provide one example where epileptic activity recorded by a multi-electrode probe is used to trigger targeted stimulation, via that probe, to freely moving rodents. PMID:20859448
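
    NeuroRighter itself is an integrated hardware/software system; as a language-agnostic illustration of the closed-loop idea it describes (record, detect, stimulate), here is a minimal Python sketch in which a detector watches a simulated multi-electrode buffer and triggers a stimulation callback when activity crosses a threshold. The signal model, detection rule, and function names are assumptions for illustration only.

    ```python
    import numpy as np

    # Minimal closed-loop sketch (illustrative): threshold detection on a recorded
    # buffer triggers a stimulation callback, mimicking record -> detect -> stimulate.
    rng = np.random.default_rng(0)

    def read_buffer(n_channels=4, n_samples=250):
        """Stand-in for a hardware read: background noise with occasional large events."""
        data = rng.normal(0.0, 1.0, (n_channels, n_samples))
        if rng.random() < 0.2:                       # occasionally inject a burst on one channel
            ch = rng.integers(n_channels)
            data[ch, 100:120] += 8.0
        return data

    def stimulate(channel):
        """Stand-in for issuing a patterned microstimulation pulse on a channel."""
        print(f"stimulating channel {channel}")

    def closed_loop(n_iterations=20, threshold_sd=5.0):
        for _ in range(n_iterations):
            data = read_buffer()
            # Robust noise estimate per channel (median absolute deviation), then
            # flag channels whose peak exceeds threshold_sd times that noise level.
            noise = np.median(np.abs(data), axis=1) / 0.6745
            peaks = np.abs(data).max(axis=1)
            for ch in np.where(peaks > threshold_sd * noise)[0]:
                stimulate(ch)                        # close the loop: recording triggers stimulation

    if __name__ == "__main__":
        closed_loop()
    ```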

  18. An open-source laser electronics suite

    NASA Astrophysics Data System (ADS)

    Pisenti, Neal C.; Reschovsky, Benjamin J.; Barker, Daniel S.; Restelli, Alessandro; Campbell, Gretchen K.

    2016-05-01

    We present an integrated set of open-source electronics for controlling external-cavity diode lasers and other instruments in the laboratory. The complete package includes a low-noise circuit for driving high-voltage piezoelectric actuators, an ultra-stable current controller based on a previously published design, and a high-performance, multi-channel temperature controller capable of driving thermo-electric coolers or resistive heaters. Each circuit (with the exception of the temperature controller) is designed to fit in a Eurocard rack equipped with a low-noise linear power supply capable of driving up to 5 A at +/- 15 V. A custom backplane allows signals to be shared between modules, and a digital communication bus makes the entire rack addressable by external control software over TCP/IP. The modular architecture makes it easy for additional circuits to be designed and integrated with existing electronics, providing a low-cost, customizable alternative to commercial systems without sacrificing performance.
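
    The abstract notes that the rack is addressable by external control software over TCP/IP but does not specify the protocol; the short Python sketch below only illustrates what such remote control could look like, assuming a hypothetical line-based ASCII command set. The host, port, and commands shown are invented; the project's actual interface may differ.

    ```python
    import socket

    # Illustrative only: a hypothetical line-based ASCII protocol for addressing
    # modules in a controller rack over TCP/IP. Host, port, and command names are
    # invented; consult the actual project documentation for the real interface.

    def send_command(command, host="192.168.1.50", port=5000, timeout=2.0):
        """Open a TCP connection, send one newline-terminated command, return the reply line."""
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall((command + "\n").encode("ascii"))
            return sock.makefile().readline().strip()

    if __name__ == "__main__":
        # Hypothetical examples: set the diode current on module 2, then read a temperature.
        print(send_command("MOD2:CURRENT 85.0"))
        print(send_command("MOD3:TEMP?"))
    ```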

  19. Closed-loop, open-source electrophysiology.

    PubMed

    Rolston, John D; Gross, Robert E; Potter, Steve M

    2010-01-01

    Multiple extracellular microelectrodes (multi-electrode arrays, or MEAs) effectively record rapidly varying neural signals, and can also be used for electrical stimulation. Multi-electrode recording can serve as artificial output (efferents) from a neural system, while complex spatially and temporally targeted stimulation can serve as artificial input (afferents) to the neuronal network. Multi-unit or local field potential (LFP) recordings can not only be used to control real world artifacts, such as prostheses, computers or robots, but can also trigger or alter subsequent stimulation. Real-time feedback stimulation may serve to modulate or normalize aberrant neural activity, to induce plasticity, or to serve as artificial sensory input. Despite promising closed-loop applications, commercial electrophysiology systems do not yet take advantage of the bidirectional capabilities of multi-electrodes, especially for use in freely moving animals. We addressed this lack of tools for closing the loop with NeuroRighter, an open-source system including recording hardware, stimulation hardware, and control software with a graphical user interface. The integrated system is capable of multi-electrode recording and simultaneous patterned microstimulation (triggered by recordings) with minimal stimulation artifact. The potential applications of closed-loop systems as research tools and clinical treatments are broad; we provide one example where epileptic activity recorded by a multi-electrode probe is used to trigger targeted stimulation, via that probe, to freely moving rodents. PMID:20859448

  20. XNAT Central: Open sourcing imaging research data.

    PubMed

    Herrick, Rick; Horton, William; Olsen, Timothy; McKay, Michael; Archie, Kevin A; Marcus, Daniel S

    2016-01-01

    XNAT Central is a publicly accessible medical imaging data repository based on the XNAT open-source imaging informatics platform. It hosts a wide variety of research imaging data sets. The primary motivation for creating XNAT Central was to provide a central repository to host and provide access to a wide variety of neuroimaging data. In this capacity, XNAT Central hosts a number of data sets from research labs and investigative efforts from around the world, including the OASIS Brains imaging studies, the NUSDAST study of schizophrenia, and more. Over time, XNAT Central has expanded to include imaging data from many different fields of research, including oncology, orthopedics, cardiology, and animal studies, but continues to emphasize neuroimaging data. Through the use of XNAT's DICOM metadata extraction capabilities, XNAT Central provides a searchable repository of imaging data that can be referenced by groups, labs, or individuals working in many different areas of research. The future development of XNAT Central will be geared towards greater ease of use as a reference library of heterogeneous neuroimaging data and associated synthetic data. It will also become a tool for making data available supporting published research and academic articles.

  1. Zherlock: an open source data analysis software.

    PubMed

    Alsberg, B K; Kirkhus, L; Hagen, R; Knudsen, O; Tangstad, T; Anderssen, E

    2003-01-01

    Zherlock is an open source software that provides state-of-the-art data analysis tools to the user in an intuitive and flexible way. It is a front-end to different numerical "engines" to produce a seamless integration of algorithms written in different computer languages. Of particular interest is creating an interface to high-level scientific languages such as Octave (a Matlab clone) and R (an S-PLUS clone) to enable efficient porting of new data analytical methods. Zherlock uses advanced scientific visualization tools in 2-D and 3-D and has been extended to work on virtual reality (VR) systems. Central to Zherlock is a visual programming environment (VPE) which enables diagram based programming. These diagrams consist of nodes and connection lines where each node is an operator or a method and lines describe the flow of data between nodes. A VPE is chosen for Zherlock because it forms an effective way to control the processing pipeline in complex data analyses. The VPE is similar in functionality to other programs such as IRIS Explorer, AVS or LabVIEW. PMID:14758979

  2. Using Radioactive Fallout Cesium (137Cs) to Distinguish Sediment Sources in an Agricultural Watershed

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Radioactive fallout Cesium (Cs-137) has been used to quantify sources of accumulating sediment in water bodies and to determine the rates and patterns of soil erosion. The objectives of this research are to use Cs-137 as a tracer to determine patterns of soil erosion and deposition of eroding soi...

  3. Commissioning and initial operation of a radioactive beam ion source at ISAC

    NASA Astrophysics Data System (ADS)

    Dombsky, M.; Bishop, D.; Bricault, P.; Dale, D.; Hurst, A.; Jayamanna, K.; Keitel, R.; Olivo, M.; Schmor, P.; Stanford, G.

    2000-02-01

    In November of 1998, the ISAC radioactive beam facility at TRIUMF started delivering on-line isotope separated radioactive beams to experiments. A surface ionization source developed for ISAC has been used to commission the mass separator and beam transport systems and is providing radioactive beams to the first generation of ISAC experiments. The ion source is integral with the radioactive beam production target and is designed to be simple, radiation hard, inexpensive, and easily exchanged by remote-handling techniques. The ion source and its extraction column are suspended at the bottom of ~2 m of steel shielding incorporated in the target module. The module is suspended in a vacuum tank with primary and secondary vacuum systems. All services for the target/ion source and beam extraction system are ducted through the module shielding. The first sets of beam transport elements and beam diagnostic devices are similarly suspended in vacuum at the bottom of two additional shielded modules. Ion beam characteristics can be routinely monitored during on-line operation by a system of Faraday cups, wire scanners, "harp" monitors, and a novel emittance measurement apparatus that can measure beam emittance in both horizontal and vertical planes. The diagnostic devices are capable of resolving beam signals down to the 10 pA range.

  4. The adequacy of current import and export controls on sealed radioactive sources.

    SciTech Connect

    Longley, Susan W.; Cochran, John Russell; Price, Laura L.; Lipinski, Kendra J.

    2003-10-01

    Millions of sealed radioactive sources (SRSs) are being used for a wide variety of beneficial purposes throughout the world. Security experts are now concerned that these beneficial SRSs could be used in a radiological dispersion device to terrorize and disrupt society. The greatest safety and security threat is from those highly radioactive Category 1 and 2 SRSs. Without adequate controls, it may be relatively easy to legally purchase a Category 1 or 2 SRS on the international market under false pretenses. Additionally, during transfer, SRSs are particularly susceptible to theft since the sources are in a shielded and mobile configuration, transportation routes are predictable, and shipments may not be adequately guarded. To determine if government controls on SRS are adequate, this study was commissioned to review the current SRS import and export controls of six countries. Canada, the Russian Federation, and South Africa were selected as the exporting countries, and Egypt, the Philippines, and the United States were selected as importing countries. A detailed review of the controls in each country is presented. The authors found that Canada and Russia are major exporters, and are exporting highly radioactive SRSs without first determining if the recipient is authorized by the receiving country to own and use the SRSs. Available evidence was used to estimate that on average there are tens to possibly hundreds of intercountry transfers of highly radioactive SRSs each day. Based on these and other findings, this reports recommends stronger controls on the export and import of highly radioactive SRSs.

  5. A simple method to prolong the service life of radioactive sources for external radiotherapy.

    PubMed

    Xu, Yingjie; Tian, Yuan; Dai, Jianrong

    2014-01-01

    A radioactive source is usually replaced and disposed of after being used for a certain amount of time (usually a half-life). In this study, a simple method is proposed to prolong its service life. Instead of replacing the used source with a new source of full activity, a new source of lower activity is added in the source holder in front of the used one, so that the total activity of the two sources is equal to or even higher than the initial activity of the used source. Similarly, more sources can be added in front of the previous ones. The attenuation of the front source(s) on the back source(s) was evaluated with the exponential attenuation equation, and the variation of the source-focus distance (SFD) with the inverse square law, for a Leksell 4C Gamma Knife, which served as an example of an external radiotherapy unit. When the number of front sources increased from 1 to 3, the relative air kerma decreased from 36.5% to 5.0%. Both the attenuation effect and the SFD variation contributed to the decrease in air kerma, with the former being the major factor. If the height of the source can be decreased in some way, such as by increasing the specific activity of the sources, the sources can be used more efficiently. The method prolongs the service life of sources several-fold and reduces the expense of source exchange and reclamation. PMID:25207406
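
    As a rough numerical illustration of the two effects described above (exponential attenuation through the added front sources and the inverse-square change in source-focus distance), the sketch below evaluates the relative air kerma as front sources are stacked. The attenuation coefficient, capsule height, and source-focus distance are assumed round numbers for illustration, not the values from the paper.

    ```python
    import math

    # Illustrative estimate of relative air kerma when n new sources are stacked in
    # front of a used source. Assumed parameters (NOT the paper's values):
    MU = 0.3        # linear attenuation coefficient of the source material, 1/cm
    H = 2.0         # height of one source capsule, cm
    SFD0 = 40.0     # original source-focus distance, cm

    def relative_air_kerma(n_front):
        """Relative air kerma of the back (used) source with n_front sources stacked in front."""
        attenuation = math.exp(-MU * n_front * H)          # exponential attenuation term
        geometry = (SFD0 / (SFD0 + n_front * H)) ** 2      # inverse-square term from the longer SFD
        return attenuation * geometry

    if __name__ == "__main__":
        for n in range(4):
            print(f"{n} front source(s): relative air kerma = {relative_air_kerma(n):.3f}")
    ```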

  6. An ion source module for the Beijing Radioactive Ion-beam Facility

    SciTech Connect

    Cui, B.; Huang, Q.; Tang, B.; Ma, R.; Chen, L.; Ma, Y.

    2014-02-15

    An ion source module is developed for the Beijing Radioactive Ion-beam Facility. The ion source module is designed to meet the requirements of remote handling. The connection and disconnection of the electricity, cooling and vacuum between the module and peripheral units can be executed without on-site manual work. The primary test of the target ion source has been carried out and a Li+ beam has been extracted. Details of the ion source module and its primary test results are described.

  7. Radioactive source localization inside pipes using a long-range alpha detector

    NASA Astrophysics Data System (ADS)

    Wu, Xue-Mei; Tuo, Xian-Guo; Li, Zhe; Liu, Ming-Zhe; Zhang, Jin-Zhao; Dong, Xiang-Long; Li, Ping-Chuan

    2013-08-01

    Long-range alpha detectors (LRADs) are attracting much attention in the decommissioning of nuclear facilities because of the difficulty of locating source positions on interior surfaces during pipe decommissioning. Exploiting the fact that an LRAD detects alpha particles by collecting the air ions they generate, this article applies a method that localizes the radioactive source through the fluid transport of those ions. By obtaining the ion travel time and the airspeed distribution in the pipe, the source position can be determined; the method thus works even though the collected ion signal itself carries no positional information. Experimental results indicate that this method can approximately localize a source inside the pipe, and the calculated positions are in good agreement with the experimental results.
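
    As an illustration of the localization principle described above (not the authors' algorithm), the following sketch infers a source position from the measured ion travel time and an airspeed profile along the pipe: ions created at the source are swept toward the outlet at the local airspeed, so the travel time equals the integral of dx/v(x) from the source to the outlet. The pipe length, airspeed profile, and numbers here are all assumed.

    ```python
    import numpy as np

    # Illustrative sketch: locate an alpha source in a pipe from the ion travel time.
    # Ions born at position x_s drift with the air flow; travel time to the outlet at
    # x = L is  t = integral from x_s to L of dx / v(x).  Invert this numerically.

    L = 5.0                                   # pipe length, m (assumed)
    x = np.linspace(0.0, L, 1001)             # positions along the pipe
    v = 0.8 + 0.4 * np.sin(np.pi * x / L)     # assumed measured airspeed profile, m/s

    def travel_time_from(x_source):
        """Travel time from x_source to the outlet, by trapezoidal integration of 1/v."""
        mask = x >= x_source
        return np.trapz(1.0 / v[mask], x[mask])

    def locate_source(measured_time):
        """Find the position whose predicted travel time matches the measured one."""
        times = np.array([travel_time_from(xi) for xi in x])
        return x[np.argmin(np.abs(times - measured_time))]

    if __name__ == "__main__":
        true_position = 1.7
        t_measured = travel_time_from(true_position)      # pretend this was measured
        print(f"measured travel time: {t_measured:.2f} s")
        print(f"estimated source position: {locate_source(t_measured):.2f} m (true {true_position} m)")
    ```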

  8. 78 FR 53020 - Branch Technical Position on the Import of Non-U.S. Origin Radioactive Sources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-28

    ... BTP on the Import of Non-U.S. Origin Radioactive Sources,'' 77 FR 2924 (January 20, 2012), and... Non-U.S. Origin Radioactive Sources,'' 77 FR 64435 (October 22, 2012), and received eight comment..., 2010, the NRC published a final rule in the Federal Register (75 FR 44072) that amended...

  9. ENKI - An Open Source environmental modelling platform

    NASA Astrophysics Data System (ADS)

    Kolberg, S.; Bruland, O.

    2012-04-01

    The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for evaluation and comparison of distributed hydrological model compositions, ENKI can be used for simulating any time-evolving process over a spatial domain. The core approach is to connect a set of user-specified subroutines into a complete simulation model, and to provide all administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration and uncertainty estimation, etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke separately compiled subroutines built as dynamic-link libraries (DLLs). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface allowing the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, that time series exist for input variables, that states are initialised, that GIS data sets exist for static map data, and that manually or automatically calibrated values exist for parameters. By using function calls and memory data structures to invoke routines and facilitate information flow, ENKI provides good performance. For a typical distributed hydrological model setup in a spatial domain of 25000 grid cells, 3-4 time steps simulated per second should be expected. Future adaptation to parallel processing may further increase this speed. New modifications to ENKI include a full separation of API and user interface
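
    ENKI itself loads separately compiled routines as DLLs through a narrow C++ interface; as a language-neutral analogy of the architecture described above, the Python sketch below shows a framework that asks each routine to declare its input and output variables, allocates those variables on a common grid, and steps the composed model through time. The class names and example routines are invented for illustration and are not ENKI's API.

    ```python
    # Analogy sketch (not ENKI's API): a tiny framework that discovers each routine's
    # declared inputs/outputs and runs a composed model over a spatial domain.
    import numpy as np

    class Routine:
        inputs, outputs = (), ()
        def step(self, state):            # state: dict mapping variable name -> grid (numpy array)
            raise NotImplementedError

    class SnowMelt(Routine):
        inputs = ("temperature", "snow")
        outputs = ("snow", "melt")
        def step(self, state):
            melt = np.clip(state["temperature"], 0, None) * 0.5     # assumed degree-day factor
            melt = np.minimum(melt, state["snow"])
            state["snow"] -= melt
            state["melt"] = melt

    class Runoff(Routine):
        inputs = ("melt",)
        outputs = ("discharge",)
        def step(self, state):
            state["discharge"] = 0.9 * state["melt"]                 # assumed runoff coefficient

    class Framework:
        def __init__(self, routines, grid_shape):
            self.routines = routines
            self.state = {}
            # Administrative service: allocate every declared variable on the common grid.
            for r in routines:
                for name in r.inputs + r.outputs:
                    self.state.setdefault(name, np.zeros(grid_shape))
        def run(self, forcing, n_steps):
            for t in range(n_steps):
                self.state["temperature"] = forcing(t)
                for r in self.routines:
                    r.step(self.state)
                yield self.state["discharge"].mean()

    if __name__ == "__main__":
        model = Framework([SnowMelt(), Runoff()], grid_shape=(100, 100))
        model.state["snow"][:] = 50.0
        for mean_q in model.run(lambda t: np.full((100, 100), 2.0), n_steps=3):
            print(f"mean discharge: {mean_q:.2f}")
    ```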

  10. Open Source Initiative Powers Real-Time Data Streams

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  11. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies. This book…

  12. Open Source Library Management Systems: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Balnaves, Edmund

    2008-01-01

    Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…

  13. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. Although it was originally designed for programming competitions, Mooshak has also…

  14. Open-Source Unionism: New Workers, New Strategies

    ERIC Educational Resources Information Center

    Schmid, Julie M.

    2004-01-01

    In "Open-Source Unionism: Beyond Exclusive Collective Bargaining," published in fall 2002 in the journal Working USA, labor scholars Richard B. Freeman and Joel Rogers use the term "open-source unionism" to describe a form of unionization that uses Web technology to organize in hard-to-unionize workplaces. Rather than depend on the traditional…

  15. Open Source Communities in Technical Writing: Local Exigence, Global Extensibility

    ERIC Educational Resources Information Center

    Conner, Trey; Gresham, Morgan; McCracken, Jill

    2011-01-01

    By offering open-source software (OSS)-based networks as an affordable technology alternative, we partnered with a nonprofit community organization. In this article, we narrate the client-based experiences of this partnership, highlighting the ways in which OSS and open-source culture (OSC) transformed our students' and our own expectations of…

  16. Migrations of the Mind: The Emergence of Open Source Education

    ERIC Educational Resources Information Center

    Glassman, Michael; Bartholomew, Mitchell; Jones, Travis

    2011-01-01

    The authors describe an Open Source approach to education. They define Open Source Education (OSE) as a teaching and learning framework where the use and presentation of information is non-hierarchical, malleable, and subject to the needs and contributions of students as they become "co-owners" of the course. The course transforms itself into an…

  17. Open Source as Appropriate Technology for Global Education

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Honour, Leslie

    2002-01-01

    Economic arguments for the adoption of "open source" software in business have been widely discussed. In this paper we draw on personal experience in the UK, South Africa and Southeast Asia to forward compelling reasons why open source software should be considered as an appropriate and affordable alternative to the currently prevailing dependency…

  18. Getting Open Source Software into Schools: Strategies and Challenges

    ERIC Educational Resources Information Center

    Hepburn, Gary; Buley, Jan

    2006-01-01

    In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…

  19. German Support Program for Retrieval and Safe Storage of Disused Radioactive Sealed Sources in Ukraine - 13194

    SciTech Connect

    Pretzsch, Gunter; Salewski, Peter; Sogalla, Martin

    2013-07-01

    The German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU), on behalf of the Government of the Federal Republic of Germany, supports the State Nuclear Regulatory Inspectorate of Ukraine (SNRIU) in enhancing nuclear safety and radiation protection and in strengthening physical protection. One of the main objectives of the agreement concluded by these parties in 2008 was the retrieval and safe interim storage of disused orphan highly radioactive sealed sources in Ukraine. At present, the Ukrainian National Registry does not account for all highly active radiation sources, but only for about 70-80%. GRS, charged by BMU with executing the program, has since 2008 concluded subcontracts with the waste management and interim storage facilities RADON in different regions of Ukraine, as well as with the waste management and interim storage facility IZOTOP at Kiev. Selected examples of the removal of highly active Co-60 and Cs-137 sources from irradiation facilities at research institutes are described below. By the end of 2012, removal and safe interim storage of 12,000 disused radioactive sealed sources with a total activity of more than 5.7 × 10^14 Bq had been achieved within the frame of this program. The German support program will be continued up to the end of 2013 with the aim of removing and safely storing almost all disused radioactive sealed sources in Ukraine. (authors)

  20. Methods and apparatus for safely handling radioactive sources in measuring-while-drilling tools

    SciTech Connect

    Wraight, P.D.

    1989-07-04

    This patent describes a method for removing a chemical radioactive source from an MWD tool which is coupled in a drill string supported by a drilling rig while a borehole is drilled and which includes logging means for measuring formation characteristics in response to irradiation of the adjacent formations by the radioactive source during the drilling operation. The steps of the method are: halting the drilling operation and then removing the drill string from the borehole to move the MWD tool to a work station at the surface, where the source is at a safe working distance from the drilling rig and will be accessible by way of one end of the MWD tool; positioning a radiation shield at a location adjacent to that end of the MWD tool, where the shield is ready to receive the source as it is moved away from the other end of the MWD tool, and then moving the source away from the other end of the MWD tool to enclose the source within the shield; and, once the source is enclosed within the shield, removing the shield together with the enclosed source from the MWD tool for transfer of the enclosed source to another work station.

  1. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-01

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  2. Efficient Open Source Lidar for Desktop Users

    NASA Astrophysics Data System (ADS)

    Flanagan, Jacob P.

    Lidar --- Light Detection and Ranging --- is a remote sensing technology that uses a device similar to a rangefinder to determine the distance to a target. A laser pulse is shot at an object and the time it takes for the pulse to return is measured; the distance to the object is then easily calculated from the speed of light. For lidar, this laser is moved (primarily in a rotational movement, usually accompanied by a translational movement) and distances to objects are recorded several thousand times per second. From this, a 3-dimensional structure can be obtained in the form of a point cloud. A point cloud is a collection of 3-dimensional points with at least an x, a y and a z attribute; these three attributes represent the position of a single point in 3-dimensional space. Other attributes can be associated with the points, such as the intensity of the return pulse, the color of the target, or even the time the point was recorded. Another very useful, post-processed attribute is point classification, where a point is associated with the type of object it represents (e.g., ground). Lidar has gained popularity, and advancements in the technology have made its collection easier and cheaper, creating larger and denser datasets. The need to handle these data more efficiently has become pressing; processing, visualizing or even simply loading lidar can be computationally intensive due to its very large size. Standard remote sensing and geographical information systems (GIS) software (ENVI, ArcGIS, etc.) was not originally built for optimized point cloud processing, so its implementation is an afterthought and therefore inefficient. Newer, more optimized software for point cloud processing (QTModeler, TopoDOT, etc.) usually lacks more advanced processing tools, requires higher-end computers and is very costly. Existing open source lidar approaches the loading and processing of lidar in an iterative fashion that requires
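
    As a small worked example of the ranging principle described above (with assumed numbers, not values from the thesis), the round-trip time of a returned pulse is converted to a distance with d = c·t/2, and a pair of scanner angles then places the return as an (x, y, z) point.

    ```python
    import math

    C = 299_792_458.0          # speed of light, m/s

    def range_from_time(round_trip_seconds):
        """Distance to the target: the pulse travels out and back, hence the factor 1/2."""
        return C * round_trip_seconds / 2.0

    def point_from_return(round_trip_seconds, azimuth_deg, elevation_deg):
        """Convert one lidar return (time + scan angles) to an (x, y, z) point."""
        r = range_from_time(round_trip_seconds)
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        return (r * math.cos(el) * math.cos(az),
                r * math.cos(el) * math.sin(az),
                r * math.sin(el))

    if __name__ == "__main__":
        # A pulse returning after about 667 ns corresponds to a target roughly 100 m away.
        print(f"range: {range_from_time(667e-9):.1f} m")
        print("point:", point_from_return(667e-9, azimuth_deg=30.0, elevation_deg=-5.0))
    ```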

  3. The 2015 Bioinformatics Open Source Conference (BOSC 2015)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar

    2016-01-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  4. Study on effect of geometrical configuration of radioactive source material to the radiation intensity of betavoltaic nuclear battery

    NASA Astrophysics Data System (ADS)

    Badrianto, Muldani Dwi; Riupassa, Robi D.; Basar, Khairul

    2015-09-01

    Nuclear batteries have strategic applications and very high economic potential. One important problem in the application of betavoltaic nuclear batteries is their low efficiency; the current efficiency of a betavoltaic nuclear battery reaches only around 2%. One aspect that can influence the efficiency of a betavoltaic nuclear battery is the geometrical configuration of the radioactive source. In this study we discuss the effect of the geometrical configuration of the radioactive source material on the radiation intensity received by the detector in a betavoltaic nuclear battery system. By obtaining the optimum configurations, the optimum usage of radioactive materials can be determined. Various geometrical configurations of the radioactive source material are simulated. It is found that the usage of the radioactive source is optimal for a circular configuration.
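
    The abstract does not give the simulation details; the sketch below is only one plausible way to compare source footprints, assuming a planar source emitting isotropically into the upper half-space and a circular detector a fixed distance above it, and counting the fraction of emitted particles that reach the detector. The geometry and sample counts are arbitrary assumptions, not the paper's model.

    ```python
    import numpy as np

    # Illustrative Monte Carlo (assumed geometry): compare the fraction of
    # isotropically emitted particles reaching a detector disk for a circular vs. a
    # square planar source of equal area.
    rng = np.random.default_rng(1)

    AREA = 1.0          # source area, cm^2
    DET_RADIUS = 1.0    # detector radius, cm
    DET_HEIGHT = 0.5    # detector height above the source plane, cm
    N = 200_000

    def sample_points(shape):
        if shape == "circle":
            r = np.sqrt(AREA / np.pi) * np.sqrt(rng.random(N))
            phi = 2 * np.pi * rng.random(N)
            return r * np.cos(phi), r * np.sin(phi)
        side = np.sqrt(AREA)                                  # square of equal area
        return (rng.random(N) - 0.5) * side, (rng.random(N) - 0.5) * side

    def hit_fraction(shape):
        x0, y0 = sample_points(shape)
        cos_t = rng.uniform(1e-9, 1.0, N)                     # isotropic over the upper hemisphere
        phi = 2 * np.pi * rng.random(N)
        sin_t = np.sqrt(1.0 - cos_t**2)
        # Propagate each particle up to the detector plane and test whether it lands on the disk.
        scale = DET_HEIGHT / cos_t
        x = x0 + scale * sin_t * np.cos(phi)
        y = y0 + scale * sin_t * np.sin(phi)
        return np.mean(x**2 + y**2 <= DET_RADIUS**2)

    if __name__ == "__main__":
        for shape in ("circle", "square"):
            print(f"{shape:6s}: fraction reaching detector = {hit_fraction(shape):.3f}")
    ```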

  5. Study on effect of geometrical configuration of radioactive source material to the radiation intensity of betavoltaic nuclear battery

    SciTech Connect

    Badrianto, Muldani Dwi; Riupassa, Robi D.; Basar, Khairul

    2015-09-30

    Nuclear batteries have strategic applications and very high economic potential. One important problem in the application of betavoltaic nuclear batteries is their low efficiency; the current efficiency of a betavoltaic nuclear battery reaches only around 2%. One aspect that can influence the efficiency of a betavoltaic nuclear battery is the geometrical configuration of the radioactive source. In this study we discuss the effect of the geometrical configuration of the radioactive source material on the radiation intensity received by the detector in a betavoltaic nuclear battery system. By obtaining the optimum configurations, the optimum usage of radioactive materials can be determined. Various geometrical configurations of the radioactive source material are simulated. It is found that the usage of the radioactive source is optimal for a circular configuration.

  6. Management of Disused Radioactive Sealed Sources in the Slovak Republic - 12100

    SciTech Connect

    Salzer, Peter

    2012-07-01

    After the splitting-up of the Czechoslovak Federation in 1993, the system for the management of institutional radioactive waste, of which disused sources represent a significant part, had to be built from the beginning, since all corresponding activities had remained in the Czech part of the Federation. The paper presents the development of the legislative and institutional framework for the management of disused radioactive sealed sources, the development of the national inventory, and the development of management practices. According to the Governmental decision of 1994, the management of disused sealed sources, and of institutional radioactive waste as a whole, was based on maximal utilization of facilities inside nuclear facilities, particularly in the NPP A1 (shut down in the past, currently under decommissioning). This approach has recently been changed by a Governmental decision of 2009 to construct a 'non-nuclear facility', a central storage for the remaining disused sealed sources collected from their places of use, where in some cases they had been stored for tens of years. The approaches to siting and construction of this storage facility will be presented, as well as the current approaches to the final disposal of disused radioactive sources. The environmental impact assessment process for the given facility/activity is slowly drawing to a close. The final statement of the Ministry of Environment can be expected in January or February 2012, probably recommending option 1 as preferred [6]. According to Slovak legislation, the final statement has the status of a recommendation for the ongoing processes leading to the siting license. Very recently, in December 2012, the Government of the Slovak Republic decided to postpone putting the facility into operation until the end of June 2014. (author)

  7. A singly charged ion source for radioactive 11C ion acceleration

    NASA Astrophysics Data System (ADS)

    Katagiri, K.; Noda, A.; Nagatsu, K.; Nakao, M.; Hojo, S.; Muramatsu, M.; Suzuki, K.; Wakui, T.; Noda, K.

    2016-02-01

    A new singly charged ion source using electron impact ionization has been developed to realize an isotope separation on-line system for simultaneous positron emission tomography imaging and heavy-ion cancer therapy using radioactive 11C ion beams. Low-energy electron beams are used in the electron impact ion source to produce singly charged ions. Ionization efficiency was calculated in order to decide the geometric parameters of the ion source and to determine the required electron emission current for obtaining high ionization efficiency. Based on these considerations, the singly charged ion source was designed and fabricated. In testing, the fabricated ion source was found to have favorable performance as a singly charged ion source.

  8. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REQUIREMENTS FOR WELL LOGGING Equipment § 39.43 Inspection, maintenance, and opening of a source or source holder. (a) Each licensee shall visually check source holders, logging tools, and source handling tools... holders, logging tools, injection tools, source handling tools, storage containers, transport...

  9. Quantitative radiochemical method for determination of major sources of natural radioactivity in ores and minerals

    USGS Publications Warehouse

    Rosholt, J.N.

    1954-01-01

    When an ore sample contains radioactivity other than that attributable to the uranium series in equilibrium, a quantitative analysis of the other emitters must be made in order to determine the source of this activity. Thorium-232, radon-222, and lead-210 have been determined by isolation and subsequent activity analysis of some of their short-lived daughter products. The sulfides of bismuth and polonium are precipitated out of solutions of thorium or uranium ores, and the α-particle activity of polonium-214, polonium-212, and polonium-210 is determined by scintillation-counting techniques. Polonium-214 activity is used to determine radon-222, polonium-212 activity for thorium-232, and polonium-210 for lead-210. The development of these methods of radiochemical analysis will facilitate the rapid determination of some of the major sources of natural radioactivity.

  10. Experimental evidence that potassium is a substantial radioactive heat source in planetary cores.

    PubMed

    Murthy, V Rama; van Westrenen, Wim; Fei, Yingwei

    2003-05-01

    The hypothesis that (40)K may be a significant radioactive heat source in the Earth's core was proposed on theoretical grounds over three decades ago, but experiments have provided only ambiguous and contradictory evidence for the solubility of potassium in iron-rich alloys. The existence of such radioactive heat in the core would have important implications for our understanding of the thermal evolution of the Earth and global processes such as the generation of the geomagnetic field, the core-mantle boundary heat flux and the time of formation of the inner core. Here we provide experimental evidence to show that the ambiguous results obtained from earlier experiments are probably due to previously unrecognized experimental and analytical difficulties. The high-pressure, high-temperature data presented here show conclusively that potassium enters iron sulphide melts in a strongly temperature-dependent fashion and that (40)K can serve as a substantial heat source in the cores of the Earth and Mars.

  11. A compact ultra-clean system for deploying radioactive sources inside the KamLAND detector

    NASA Astrophysics Data System (ADS)

    Banks, T. I.; Freedman, S. J.; Wallig, J.; Ybarrolaza, N.; Gando, A.; Gando, Y.; Ikeda, H.; Inoue, K.; Kishimoto, Y.; Koga, M.; Mitsui, T.; Nakamura, K.; Shimizu, I.; Shirai, J.; Suzuki, A.; Takemoto, Y.; Tamae, K.; Ueshima, K.; Watanabe, H.; Xu, B. D.; Yoshida, H.; Yoshida, S.; Kozlov, A.; Grant, C.; Keefer, G.; Piepke, A.; Bloxham, T.; Fujikawa, B. K.; Han, K.; Ichimura, K.; Murayama, H.; O'Donnell, T.; Steiner, H. M.; Winslow, L. A.; Dwyer, D. A.; McKeown, R. D.; Zhang, C.; Berger, B. E.; Lane, C. E.; Maricic, J.; Miletic, T.; Batygov, M.; Learned, J. G.; Matsuno, S.; Sakai, M.; Horton-Smith, G. A.; Downum, K. E.; Gratta, G.; Efremenko, Y.; Perevozchikov, O.; Karwowski, H. J.; Markoff, D. M.; Tornow, W.; Heeger, K. M.; Detwiler, J. A.; Enomoto, S.; Decowski, M. P.

    2015-01-01

    We describe a compact, ultra-clean device used to deploy radioactive sources along the vertical axis of the KamLAND liquid-scintillator neutrino detector for purposes of calibration. The device worked by paying out and reeling in precise lengths of a hanging, small-gauge wire rope (cable); an assortment of interchangeable radioactive sources could be attached to a weight at the end of the cable. All components exposed to the radiopure liquid scintillator were made of chemically compatible UHV-cleaned materials, primarily stainless steel, in order to avoid contaminating or degrading the scintillator. To prevent radon intrusion, the apparatus was enclosed in a hermetically sealed housing inside a glove box, and both volumes were regularly flushed with purified nitrogen gas. An infrared camera attached to the side of the housing permitted real-time visual monitoring of the cable's motion, and the system was controlled via a graphical user interface.

  12. A compact ultra-clean system for deploying radioactive sources inside the KamLAND detector

    SciTech Connect

    Banks, T. I.; Freedman, S. J.; Wallig, J.; Ybarrolaza, N.; Gando, A.; Gando, Y.; Ikeda, H.; Inoue, K.; Kishimoto, Y.; Koga, M.; Mitsui, T.; Nakamura, K.; Shimizu, I.; Shirai, J.; Suzuki, A.; Takemoto, Y.; Tamae, K.; Ueshima, K.; Watanabe, H.; Xu, B. D.; Yoshida, H.; Yoshida, S.; Kozlov, A.; Grant, C.; Keefer, G.; Piepke, A.; Bloxham, T.; Fujikawa, B. K.; Han, K.; Ichimura, K.; Murayama, H.; O'Donnell, T.; Steiner, H. M.; Winslow, L. A.; Dwyer, D. A.; McKeown, R. D.; Zhang, C.; Berger, B. E.; Lane, C. E.; Maricic, J.; Miletic, T.; Batygov, M.; Learned, J. G.; Matsuno, S.; Sakai, M.; Horton-Smith, G. A.; Downum, K. E.; Gratta, G.; Efremenko, Y.; Perevozchikov, O.; Karwowski, H. J.; Markoff, D. M.; Tornow, W.; Heeger, K. M.; Detwiler, J. A.; Enomoto, S.; Decowski, M. P.

    2014-10-14

    We describe a compact, ultra-clean device used to deploy radioactive sources along the vertical axis of the KamLAND liquid-scintillator neutrino detector for purposes of calibration. The device worked by paying out and reeling in precise lengths of a hanging, small-gauge wire rope (cable); an assortment of interchangeable radioactive sources could be attached to a weight at the end of the cable. All components exposed to the radiopure liquid scintillator were made of chemically compatible UHV-cleaned materials, primarily stainless steel, in order to avoid contaminating or degrading the scintillator. To prevent radon intrusion, the apparatus was enclosed in a hermetically sealed housing inside a glove box, and both volumes were regularly flushed with purified nitrogen gas. Finally, an infrared camera attached to the side of the housing permitted real-time visual monitoring of the cable’s motion, and the system was controlled via a graphical user interface.

  13. Guidelines for the implementation of an open source information system

    SciTech Connect

    Doak, J.; Howell, J.A.

    1995-08-01

    This work was initially performed for the International Atomic Energy Agency (IAEA) to help with the Open Source Task of the 93 + 2 Initiative; however, the information should be of interest to anyone working with open sources. The authors cover all aspects of an open source information system (OSIS) including, for example, identifying relevant sources, understanding copyright issues, and making information available to analysts. They foresee this document as a reference point that implementors of a system could augment for their particular needs. The primary organization of this document focuses on specific aspects, or components, of an OSIS; they describe each component and often make specific recommendations for its implementation. This document also contains a section discussing the process of collecting open source data and a section containing miscellaneous information. The appendix contains a listing of various providers, producers, and databases that the authors have come across in their research.

  14. A Low-Tech, Low-Budget Storage Solution for High Level Radioactive Sources

    SciTech Connect

    Brett Carlsen; Ted Reed; Todd Johnson; John Weathersby; Joe Alexander; Dave Griffith; Douglas Hamelin

    2014-07-01

    The need for safe, secure, and economical storage of radioactive material becomes increasingly important as beneficial uses of radioactive material expand (increasing inventory), as political instability rises (increasing threat), and as final disposal and treatment facilities are delayed (increasing inventory and storage duration). Several vendor-produced storage casks are available for this purpose but are often costly due to the required design, analysis, and licensing work. Thus the relatively high costs of currently accepted storage solutions may inhibit substantial improvements in safety and security that might otherwise be achieved. This is particularly true in areas of the world where the economic and/or the regulatory infrastructure may not provide the means and/or the justification for such an expense. This paper considers a relatively low-cost, low-technology radioactive material storage solution. The basic concept consists of a simple shielded storage container that can be fabricated locally using a steel pipe and a corrugated steel culvert as forms enclosing a concrete annulus. Benefits of such a system include 1) a low-tech solution that utilizes materials and skills available virtually anywhere in the world, 2) a readily scalable design that easily adapts to specific needs such as the geometry and radioactivity of the source term material, 3) flexible placement, allowing free-standing above-ground or in-ground (i.e., below grade or bermed) installation, 4) the ability for future relocation without direct handling of sources, and 5) a long operational lifetime. ‘Le mieux est l’ennemi du bien’ (translated: The best is the enemy of good) applies to the management of radioactive materials – particularly where the economic and/or regulatory justification for additional investment is lacking. Development of a low-cost alternative that considerably enhances safety and security may lead to a greater overall risk reduction than insisting on

  15. Identification of Low-level Point Radioactive Sources using a sensor network

    SciTech Connect

    Chin, J. C.; Rao, Nageswara S.; Yao, David K. Y.; Shankar, Mallikarjun; Yang, Yong; Hou, J. C.; Srivathsan, Sri; Iyengar, S. Sitharama

    2010-09-01

    Identification of a low-level point radioactive source amidst background radiation is achieved by a network of radiation sensors using a two-step approach. Based on measurements from three or more sensors, a geometric difference triangulation method or an N-sensor localization method is used to estimate the location and strength of the source. Then a sequential probability ratio test based on current measurements and estimated parameters is employed to finally decide: (1) the presence of a source with the estimated parameters, or (2) the absence of the source, or (3) the insufficiency of measurements to make a decision. This method achieves specified levels of false alarm and missed detection probabilities, while ensuring a close-to-minimal number of measurements for reaching a decision. This method minimizes the ghost-source problem of current estimation methods, and achieves a lower false alarm rate compared with current detection methods. This method is tested and demonstrated using: (1) simulations, and (2) a test-bed that utilizes the scaling properties of point radioactive sources to emulate high intensity ones that cannot be easily and safely handled in laboratory experiments.
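
    The sketch below is a simplified rendition of the two-step idea described in the abstract, not the authors' exact algorithms: a grid search fits an inverse-square-law source (location and strength) to Poisson counts from several sensors, and a sequential probability ratio test on subsequent counts then decides between "source present with the estimated parameters" and "background only". All counts, thresholds, and sensor positions are assumed.

    ```python
    import numpy as np

    # Simplified two-step sketch (assumed setup): localize a point source from sensor
    # counts, then run a sequential probability ratio test (SPRT) on new counts.
    rng = np.random.default_rng(2)

    SENSORS = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])     # positions, m
    BACKGROUND = 5.0                                   # expected background counts per interval
    TRUE_POS, TRUE_A = np.array([6.0, 3.0]), 400.0     # "unknown" source position and strength

    def expected_counts(pos, strength):
        d2 = np.sum((SENSORS - pos) ** 2, axis=1) + 1e-6
        return BACKGROUND + strength / d2              # inverse-square law plus background

    def localize(counts, grid_step=0.25):
        """Grid search over candidate positions; strength is fit by least squares at each point."""
        best = (np.inf, None, None)
        for gx in np.arange(0, 10.01, grid_step):
            for gy in np.arange(0, 10.01, grid_step):
                g = 1.0 / (np.sum((SENSORS - [gx, gy]) ** 2, axis=1) + 1e-6)
                a = max(0.0, np.dot(counts - BACKGROUND, g) / np.dot(g, g))
                resid = np.sum((counts - BACKGROUND - a * g) ** 2)
                if resid < best[0]:
                    best = (resid, np.array([gx, gy]), a)
        return best[1], best[2]

    def sprt(pos, strength, alpha=0.01, beta=0.01, max_rounds=50):
        """Poisson SPRT: H1 = source at (pos, strength) vs. H0 = background only."""
        upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
        lam1, lam0 = expected_counts(pos, strength), np.full(len(SENSORS), BACKGROUND)
        llr = 0.0
        for _ in range(max_rounds):
            c = rng.poisson(expected_counts(TRUE_POS, TRUE_A))     # new measurement round
            llr += np.sum(c * np.log(lam1 / lam0) - (lam1 - lam0))
            if llr >= upper:
                return "source present"
            if llr <= lower:
                return "no source"
        return "insufficient measurements"

    if __name__ == "__main__":
        first_counts = rng.poisson(expected_counts(TRUE_POS, TRUE_A))
        est_pos, est_a = localize(first_counts)
        print(f"estimated position {est_pos}, strength {est_a:.0f}")
        print("SPRT decision:", sprt(est_pos, est_a))
    ```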

  16. Open Source, Meet "User-Generated Science"

    ERIC Educational Resources Information Center

    Huwe, Terence K.

    2009-01-01

    This article discusses Research Blogging, a community-run nonprofit organization that is promoting a suite of blogging software to scholars. Research Blogging itself does two things. First, it extends an invitation to a community, and it is open to anyone. Second, it requires its users to follow guidelines. The combination of rigorous guidelines…

  17. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  18. Radon Adsorbed in Activated Charcoal--A Simple and Safe Radiation Source for Teaching Practical Radioactivity in Schools and Colleges

    ERIC Educational Resources Information Center

    Al-Azmi, Darwish; Mustapha, Amidu O.; Karunakara, N.

    2012-01-01

    Simple procedures for teaching practical radioactivity are presented in a way that attracts students' attention and does not make them apprehensive about their safety. The radiation source is derived from the natural environment. It is based on the radioactivity of radon, a ubiquitous inert gas, and the adsorptive property of activated charcoal.…

  19. Selection of targets and ion sources for RIB generation at the Holifield Radioactive Ion Beam Facility

    SciTech Connect

    Alton, G.D.

    1995-12-31

    In this report, the authors describe the performance characteristics for a selected number of target ion sources that will be employed for initial use at the Holifield Radioactive Ion Beam Facility (HRIBF) as well as prototype ion sources that show promise for future use for RIB applications. A brief review of present efforts to select target materials and to design composite target matrix/heat-sink systems that simultaneously incorporate the short diffusion lengths, high permeabilities, and controllable temperatures required to effect fast and efficient diffusion release of the short-lived species is also given.

  20. Radiation exposure modeling for apartment living spaces with multiple radioactive sources.

    PubMed

    Hwang, J S; Chan, C C; Wang, J D; Chang, W P

    1998-03-01

    Since late 1992, over 100 building complexes in Taiwan, including both public and private schools and 1,000 apartments, have been identified as emitting elevated levels of gamma-radiation. These high levels of gamma-radiation have been traced to construction steel contaminated with 60Co. Accurate reconstruction of the radiation exposure dosage among residents is complicated by the discovery of multiple radioactive sources within the living spaces and by the lack of comprehensive information about residents' life-styles and occupancy patterns within these contaminated spaces. The objective of this study was to evaluate the sensitivity of the current dose reconstruction approach employed in an epidemiological study of the health effects on these occupants. We apply a statistical method of local smoothing to dose rate estimation and examine factors that are closely associated with radiation exposure from multiple radioactive sources in an apartment. Two examples are used: a simulated measurement in a hypothetical room with three radioactive sources, and a real apartment in Ming-Shan Villa, one of the contaminated buildings. The simulated and estimated means are compared at 5-10 selected measurement points: by the local smoothing approach, with the furniture-adjusted space, and with the occupancy time-weighted mean. We found that the local smoothing approach came much closer to the theoretical values. The local smoothing approach may serve as a refined method of radiation dose distribution modeling in exposure estimation. Before environmental exposure assessment, "highly occupied zones" (HOZs) in the contaminated spaces must be identified. Estimates of the time spent in these HOZs are essential to obtain accurate dosage values. These results will facilitate a more accurate dose reconstruction in the assessment of residential exposure in apartments with elevated levels of radioactivity.
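
    The paper does not spell out its smoothing kernel; as one common form of local smoothing, the sketch below estimates the dose rate at arbitrary points in a room from scattered measurements using Gaussian-kernel (Nadaraya-Watson) weighting. The measurement locations, values, and bandwidth are invented for illustration.

    ```python
    import numpy as np

    # Illustrative local smoothing of dose-rate measurements (Nadaraya-Watson kernel
    # regression). Measurement locations/values and the bandwidth are assumed.

    points = np.array([[0.5, 0.5], [2.0, 1.0], [3.5, 0.8], [1.0, 3.0], [3.0, 3.2]])  # positions, m
    dose   = np.array([ 120.0,      45.0,       300.0,      60.0,       85.0     ])  # uSv/h

    def smoothed_dose(query, bandwidth=0.8):
        """Kernel-weighted average of nearby measurements at a query point (x, y)."""
        d2 = np.sum((points - np.asarray(query)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        return np.sum(w * dose) / np.sum(w)

    if __name__ == "__main__":
        # Estimate the dose rate in a "highly occupied zone", e.g. near a bed at (3.2, 1.0).
        print(f"estimated dose rate: {smoothed_dose((3.2, 1.0)):.1f} uSv/h")
    ```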

  1. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high latency communications links to experiment how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. Tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet requirements.

  2. Open Source Solutions for Libraries: ABCD vs Koha

    ERIC Educational Resources Information Center

    Macan, Bojan; Fernandez, Gladys Vanesa; Stojanovski, Jadranka

    2013-01-01

    Purpose: The purpose of this study is to present an overview of the two open source (OS) integrated library systems (ILS)--Koha and ABCD (ISIS family), to compare their "next-generation library catalog" functionalities, and to give comparison of other important features available through ILS modules. Design/methodology/approach: Two open source…

  3. Modular Open-Source Software for Item Factor Analysis

    ERIC Educational Resources Information Center

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  4. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.

  5. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of the open-source electronics prototyping platform is illustrated as the control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation both as research and teaching platforms than previous proprietary methods.

  6. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

    R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible, so it plays an important role in spatial analysis within the Open Source environment. Implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an apt illustration of the functions of R is given in this paper through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment, evaluating the spatial correlation of land price and estimating it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. In the end, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages or design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
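
    As background for the land-price application described above (standard geostatistics, not equations reproduced from the paper), ordinary kriging predicts the value at an unsampled location s_0 as a weighted sum of the n observed values:

        \[ \hat{Z}(s_0) = \sum_{i=1}^{n} \lambda_i \, Z(s_i), \qquad \sum_{i=1}^{n} \lambda_i = 1 \]

    with the weights \lambda_i obtained by solving a linear system built from the fitted variogram, so that the prediction is unbiased and has minimum variance.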

  7. Open Source Software Licenses for Livermore National Laboratory

    SciTech Connect

    Busby, L.

    2000-08-10

    This paper attempts to develop supporting material in an effort to provide new options for licensing Laboratory-created software. Where employees and the Lab wish to release software codes as so-called ''Open Source'', they need, at a minimum, new licensing language for their released products. Several open source software licenses are reviewed to understand their common elements, and develop recommendations regarding new language.

  8. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated as a controller for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience than previous proprietary methods to participate in optical experimentation, on both research and teaching platforms. PMID:23544104

  9. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

    Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated as a controller for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience than previous proprietary methods to participate in optical experimentation, on both research and teaching platforms. PMID:23544104

  10. Open source electronic health records and chronic disease management

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status for homeless patients. Discussion The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management, controlled hypertension and increases in tuberculosis vaccinations were assisted through the use of these open source systems. Conclusions The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary and needed chronic disease care among populations served by CHC. PMID:23813566

  11. [The radioecological problems of Eurasia and the sources of radioactive environmental contamination in the former USSR].

    PubMed

    Polikarpov, G G; Aarkrog, A

    1993-01-01

    There are three major sites of radioactive environmental contamination in the former USSR: the Chelyabinsk region in the Urals, the Chernobyl NPP in Ukraine, and Novaya Zemlya in the Arctic Ocean. The first is the most important with regard to local (potential) contamination, while the last dominates the global contamination. A number of sites and sources are less well known with regard to environmental contamination; this is the case for the plutonium production factories at Tomsk and Dodonovo. More information on nuclear reactors in lost or dumped submarines is also needed. From a global point of view, reliable assessments of the radioactive run-off from land and of the deposits of nuclear waste in the Arctic Ocean are particularly pertinent. PMID:8469738

  12. Radionuclide sources and radioactive decay figures pertinent to the Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Heeb, C.M.

    1991-03-01

    The origin and radioactive decay schemes of radionuclides currently expected to be the major contributors to potential radiation doses that populations might have received as a result of nuclear operations at the Hanford Site since 1944 are identified and illustrated in this report. The reactions considered include actinide neutron capture and decay sequences, fission product decays, and neutron activation reactions. It is important to note that the radioactive half-life of a given nuclide does not, by itself, fully determine the significance of a given radionuclide as a potential source term. This report does not address environmental transport mechanisms, behavior in the environment, or radiological dose impact of any of the radionuclides shown. 1 ref., 10 figs.

  13. Development of Laser Light Sources for Trapping Radioactive Francium Atoms Toward Tests of Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Harada, Ken-ichi; Ezure, Saki; Hayamizu, Tomohiro; Kato, Ko; Kawamura, Hirokazu; Inoue, Takeshi; Arikawa, Hiroshi; Ishikawa, Taisuke; Aoki, Takahiro; Uchiyama, Aiko; Itoh, Masatoshi; Ando, Shun; Aoki, Takatoshi; Hatakeyama, Atsushi; Hatanaka, Kichiji; Imai, Kenichi; Murakami, Tetsuya; Shimizu, Yasuhiro; Sato, Tomoya; Wakasa, Tomotsugu; Yoshida, Hidetomo P.; Sakemi, Yasuhiro

    We have developed laser light sources and a magneto-optical trap system for cooling and trapping radioactive francium (Fr) atoms. Because Fr is the heaviest alkali element, a Fr atom exhibits high sensitivity to symmetry violation effects such as atomic parity nonconservation (APNC) and the electron electric dipole moment (eEDM). A laser cooling and trapping technique reduces the systematic errors due to the Doppler effect and the motion-induced magnetic field effect caused by the velocity of atoms. Thus, optically cooled and trapped Fr atoms are among a few promising candidates considered for APNC and eEDM measurements. Frequency stabilization of laser light is required for any stable measurement involving trapped radioactive atoms, including Fr. Since the hyperfine splitting in iodine molecules (127I2) is close to the resonance frequency of the Fr D2 line, we performed frequency modulation spectroscopy of hyperfine structures of I2.

  14. Radon adsorbed in activated charcoal—a simple and safe radiation source for teaching practical radioactivity in schools and colleges

    NASA Astrophysics Data System (ADS)

    Al-Azmi, Darwish; Mustapha, Amidu O.; Karunakara, N.

    2012-07-01

    Simple procedures for teaching practical radioactivity are presented in a way that attracts students' attention and does not make them apprehensive about their safety. The radiation source is derived from the natural environment. It is based on the radioactivity of radon, a ubiquitous inert gas, and the adsorptive property of activated charcoal. Radon gas from ambient air in the laboratory was adsorbed into about 70 g of activated charcoal inside metallic canisters. Gamma radiation was subsequently emitted from the canisters, following the radioactive decay of radon and its progeny. The intensities of the emitted gamma-rays were measured at suitable intervals using a NaI gamma-ray detector. The counts obtained were analysed and used to demonstrate the radioactive decay law and determine the half-life of radon. In addition to learning the basic properties of radioactivity, the students also gain practical experience of the existence of natural sources of radiation in the environment.
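
    A minimal sketch of the analysis step described above: fitting the exponential decay law to the canister count rates and converting the fitted decay constant into a half-life. The count data below are invented placeholders, not measurements from the article; only numpy and scipy are assumed.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical net count rates (counts per minute) recorded at daily intervals.
        t_days = np.array([0, 1, 2, 3, 4, 5, 6, 7], dtype=float)
        counts = np.array([1200, 1010, 840, 700, 590, 490, 410, 345], dtype=float)

        def decay(t, n0, lam):
            """Radioactive decay law N(t) = N0 * exp(-lambda * t)."""
            return n0 * np.exp(-lam * t)

        popt, _ = curve_fit(decay, t_days, counts, p0=(counts[0], 0.2))
        n0_fit, lam_fit = popt
        half_life = np.log(2) / lam_fit   # T_1/2 = ln 2 / lambda

        print(f"fitted decay constant: {lam_fit:.3f} per day")
        print(f"estimated half-life:   {half_life:.2f} days")  # Rn-222 is about 3.8 days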

  15. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough so that models can interact even if the model is coded in a different language, represent
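
    The coupling pattern sketched in the abstract (create a model instance, exchange variable values during time stepping, free the model when assimilation finishes) can be illustrated with a generic Python interface. The class and method names below are invented for illustration and are not the actual OpenMI or OpenDA APIs.

        from typing import Dict, List

        class AssimilableModel:
            """Illustrative stand-in for a model exposing an OpenMI/OpenDA-style interface."""

            def __init__(self, parameters: Dict[str, float]):
                self.state: Dict[str, List[float]] = {"soil_moisture": [0.30] * 10}
                self.parameters = parameters
                self.time = 0.0

            def get_values(self, item: str) -> List[float]:
                # Expose a model variable so a DA engine can compare it with observations.
                return list(self.state[item])

            def set_values(self, item: str, values: List[float]) -> None:
                # Let the DA engine push an updated (analysed) state back into the model.
                self.state[item] = list(values)

            def update(self, dt: float) -> None:
                # Propagate the model one time step (a trivial relaxation, purely illustrative).
                k = self.parameters.get("relaxation", 0.1)
                self.state["soil_moisture"] = [v + k * (0.25 - v) for v in self.state["soil_moisture"]]
                self.time += dt

            def finish(self) -> None:
                # Release resources once data assimilation is completed.
                self.state.clear()

        # A DA driver would then loop: update(), get_values(), blend the state with
        # observations (e.g. an ensemble Kalman filter analysis), and set_values().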

  16. Management of Spent and Disused Sealed Radioactive Sources in the Czech Republic - 12124

    SciTech Connect

    Podlaha, J.

    2012-07-01

    The Czech Republic is a country with a well-developed peaceful utilization of nuclear energy and ionizing radiation. Sealed Radioactive Sources (further also SRS) are broadly used in many areas in the Czech Republic, e.g. in research, industry, medicine, education, agriculture, etc. Legislation in the field of ionizing radiation source utilization has been fully harmonized with European Community legislation. SRS utilization demands a proper system which must ensure the safe use of SRS, including the management of disused (spent) and orphaned SRS. In the Czech Republic, a comprehensive system of SRS management has been established that is comparable with systems in other developed countries. The system covers both legal and institutional aspects. The Central Register of Ionizing Radiation Sources is an important part of the system. It is a tracking system that covers all activities related to SRS, from their production or import to the end of their use (recycling or disposal). Many spent SRS are recycled and can be used for other purposes after inspection, repacking or reprocessing. When the disused SRS are not intended for further use, they are managed as radioactive waste (RAW). The system of SRS management also ensures the suitable resolution of situations connected with improper SRS handling (in the case of orphaned sources, accidents, etc.). (author)

  17. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  18. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  19. The Imagery Exchange (TIE): Open Source Imagery Management System

    NASA Astrophysics Data System (ADS)

    Alarcon, C.; Huang, T.; Thompson, C. K.; Roberts, J. T.; Hall, J. R.; Cechini, M.; Schmaltz, J. E.; McGann, J. M.; Boller, R. A.; Murphy, K. J.; Bingham, A. W.

    2013-12-01

    NASA's Global Imagery Browse Services (GIBS) is the Earth Observation System (EOS) imagery solution for delivering global, full-resolution satellite imagery in a highly responsive manner. GIBS consists of two major subsystems, OnEarth and The Imagery Exchange (TIE). TIE is the GIBS horizontally scaled imagery workflow manager component, an Open Archival Information System (OAIS) responsible for orchestrating the acquisition, preparation, generation, and archiving of imagery to be served by OnEarth. TIE is an extension of the Data Management and Archive System (DMAS), a high performance data management system developed at the Jet Propulsion Laboratory by leveraging open source tools and frameworks, which include Groovy/Grails, Restlet, Apache ZooKeeper, Apache Solr, and other open source solutions. This presentation focuses on the application of Open Source technologies in developing a horizontally scaled data system like DMAS and TIE. As part of our commitment to contributing back to the open source community, TIE is in the process of being open sourced. This presentation will also cover our current effort to get TIE into the hands of the community from which we have benefited.

  20. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and modes of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  1. Advanced concept proof-of-principle demonstration: Switchable radioactive neutron source

    SciTech Connect

    Rhodes, E.A.; Bowers, D.L.; Boyar, R.E.; Dickerman, C.E.

    1995-10-01

    An advanced concept proof-of-principle demonstration was successfully performed to show the feasibility of a practical switchable radioactive neutron source (SRNS) that can be switched on and off like an accelerator, but without requiring accelerator equipment such as a high voltage supply, control unit, etc. This source concept would provide a highly portable neutron source for field radiation measurement applications. Such a source would require minimal, if any, shielding when not in use. The SRNS, previously patented by Argonne staff, provides a means of constructing the alpha-emitting and light-element components of a radioactive neutron source in such a fashion that these two components can be brought together to turn the source "on" and then be separated to turn the source "off". An SRNS could be used for such field applications as active neutron interrogation of objects to detect fissile materials or to measure their concentration, and to excite gamma-ray emission for detection of specific elements that indicate toxic chemicals, drugs, explosives, etc. The demonstration was performed using Pu-238 as the alpha emitter and Be as the light element, in an air-atmosphere glovebox having no atmosphere purification capability. A stable, thin film of Pu-238 oxide was deposited on a stainless steel planchet. The "on" output of the demonstration Pu-238 film was measured to be 2.5 × 10^6 neutrons/sec per gram of Pu-238. The measured "off" neutron rate was satisfactory, only about 5% of the "on" output, after two weeks of exposure to the glovebox atmosphere. After several weeks of additional exposure, the "off" rate had increased to about 15%. This work demonstrates the feasibility of constructing practical, highly portable SRNS units with very low gamma-ray dose in the "off" position.
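
    The neutron production mechanism that such an alpha-emitter/light-element pairing relies on is the well-known (alpha, n) reaction on beryllium (general nuclear physics, not a detail quoted from the report):

        \[ ^{9}\mathrm{Be} + \alpha \;\rightarrow\; ^{12}\mathrm{C} + n \]

    so physically separating the Pu-238 alpha emitter from the beryllium converter removes the neutron yield, which is what allows the source to be switched "off".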

  2. Tools for Inspecting and Sampling Waste in Underground Radioactive Storage Tanks with Small Access Riser Openings

    SciTech Connect

    Nance, T.A.

    1998-12-17

    Underground storage tanks with 2-inch to 3-inch diameter access ports at the Department of Energy's Savannah River Site have been used to store radioactive solvents and sludge. In order to close these tanks, their contents need to first be quantified in terms of volume and chemical and radioactive characteristics. To provide information on the volume of waste contained within the tanks, a small remote inspection system was needed. This inspection system was designed to provide lighting, pan and tilt capability, zoom, and color video in an inexpensive package. The system also needed to be operated inside a plastic tent built over the access port to contain any contamination exiting from the port, and it had to be built to travel through the small port opening and riser pipe, into the tank's evacuated space, and back out of the riser pipe and access port with no possibility of being caught and blocking the access riser. Long thin plates were found in many access riser pipes that blocked the inspection system from penetrating into the tank interiors. Retrieval tools were developed to clear the plates from the tanks, along with sampling devices that provide safe containment for the samples. This paper will discuss the inspection systems, the tools for clearing access pipes, and the solvent sampling tools developed to evaluate the contents of the underground solvent storage tanks.

  3. Ion source developments for the production of radioactive isotope beams at TRIUMF

    SciTech Connect

    Ames, F.; Bricault, P.; Heggen, H.; Kunz, P.; Lassen, J.; Mjøs, A.; Raeder, S.; Teigelhöfer, A.

    2014-02-15

    At the ISAC facility at TRIUMF radioactive ions are produced by bombarding solid targets with up to 100 μA of 500 MeV protons. The reaction products have to diffuse out of the hot target into an ion source. Normally, singly charged ions are extracted. They can be transported either directly to experiments or via an ECR charge state breeder to a post accelerator. Several different types of ion sources have to be used in order to deliver a large variety of rare isotope beams. At ISAC those are surface ion sources, forced electron beam arc discharge (FEBIAD) ion sources and resonant laser ionization sources. Recent development activities concentrated on increasing the selectivity for the ionization to suppress isobaric contamination in the beam. Therefore, a surface ion rejecting resonant laser ionization source (SIRLIS) has been developed to suppress ions from surface ionization. For the FEBIAD ion source a cold transfer line has been introduced to prevent less volatile components from reaching the ion source.

  4. A Framework for the Systematic Collection of Open Source Intelligence

    SciTech Connect

    Pouchard, Line Catherine; Trien, Joseph P; Dobson, Jonathan D

    2009-01-01

    Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.

  5. Open Source Radiation Hardened by Design Technology

    NASA Technical Reports Server (NTRS)

    Shuler, Robert

    2016-01-01

    The proposed technology allows use of the latest microcircuit technology with lowest power and fastest speed, with minimal delay and engineering costs, through new Radiation Hardened by Design (RHBD) techniques that do not require extensive process characterization, technique evaluation and re-design at each Moore's Law generation. The separation of critical node groups is explicitly parameterized so it can be increased as microcircuit technologies shrink. The technology will be open access to radiation tolerant circuit vendors. INNOVATION: This technology would enhance computation intensive applications such as autonomy, robotics, advanced sensor and tracking processes, as well as low power applications such as wireless sensor networks. OUTCOME / RESULTS: 1) Simulation analysis indicates feasibility. 2) Compact voting latch 65 nanometer test chip designed and submitted for fabrication, 7/2016. INFUSION FOR SPACE / EARTH: This technology may be used in any digital integrated circuit in which a high level of resistance to Single Event Upsets is desired, and has the greatest benefit outside low earth orbit where cosmic rays are numerous.

  6. A Neutron Source Facility for Neutron Cross-Section Measurements on Radioactive Targets at RIA

    SciTech Connect

    Ahle, L E; Bernstein, L; Rusnak, B; Berio, R

    2003-05-20

    The stockpile stewardship program is interested in neutron cross-section measurements on nuclei that are a few nucleons away from stability. Since neutron targets do not exist, radioactive targets are the only way to directly perform these measurements. This requires a facility that can provide high production rates for these short-lived nuclei as well as a source of neutrons. The Rare Isotope Accelerator (RIA) promises these high production rates. Thus, adding a co-located neutron source facility to the RIA project baseline would allow these neutron cross-section measurements to be made. A conceptual design for such a neutron source has been developed, which would use two accelerators, a Dynamitron and a linac, to create the neutrons through a variety of reactions (d-d, d-t, deuteron break-up, p-Li). This range of reactions is needed in order to provide the desired energy range from tens of keV to 20 MeV. The facility would also have hot cells to perform chemistry on the radioactive material both before and after neutron irradiation. The present status of this design and direction of future work will be discussed.

  7. Calibration of a time-resolved hard-x-ray detector using radioactive sources

    NASA Astrophysics Data System (ADS)

    Stoeckl, C.; Theobald, W.; Regan, S. P.; Romanofsky, M. H.

    2016-11-01

    A four-channel, time-resolved, hard x-ray detector (HXRD) has been operating at the Laboratory for Laser Energetics for more than a decade. The slope temperature of the hot-electron population in direct-drive inertial confinement fusion experiments is inferred by recording the hard x-ray radiation generated in the interaction of the electrons with the target. Measuring the energy deposited by hot electrons requires an absolute calibration of the hard x-ray detector. A novel method to obtain an absolute calibration of the HXRD using single photons from radioactive sources was developed, which uses a thermoelectrically cooled, low-noise, charge-sensitive amplifier.

  8. Open Source Drug Discovery in Practice: A Case Study

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project. However, both have enabled high quality

  9. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  10. Comparison of open-source linear programming solvers.

    SciTech Connect

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.; Jones, Katherine A.; Martin, Nathaniel; Detry, Richard Joseph

    2013-10-01

    When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
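
    As a concrete illustration of the kind of problem such solvers handle, the sketch below poses and solves a small LP in Python with SciPy's linprog and its open-source HiGHS backend; note that HiGHS is not one of the solvers benchmarked in the report, it is simply a convenient open-source example.

        from scipy.optimize import linprog

        # Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
        # linprog minimizes, so the objective coefficients are negated.
        c = [-3.0, -2.0]
        A_ub = [[1.0, 1.0],
                [1.0, 3.0]]
        b_ub = [4.0, 6.0]

        result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")

        print("optimal (x, y):    ", result.x)      # expected roughly [4, 0]
        print("optimal objective: ", -result.fun)   # expected roughly 12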

  11. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions, including OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by the different solutions, (2) OpenNebula has the fastest internal network while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.

  12. A compact ultra-clean system for deploying radioactive sources inside the KamLAND detector

    DOE PAGES Beta

    Banks, T. I.; Freedman, S. J.; Wallig, J.; Ybarrolaza, N.; Gando, A.; Gando, Y.; Ikeda, H.; Inoue, K.; Kishimoto, Y.; Koga, M.; et al

    2014-10-14

    We describe a compact, ultra-clean device used to deploy radioactive sources along the vertical axis of the KamLAND liquid-scintillator neutrino detector for purposes of calibration. The device worked by paying out and reeling in precise lengths of a hanging, small-gauge wire rope (cable); an assortment of interchangeable radioactive sources could be attached to a weight at the end of the cable. All components exposed to the radiopure liquid scintillator were made of chemically compatible UHV-cleaned materials, primarily stainless steel, in order to avoid contaminating or degrading the scintillator. To prevent radon intrusion, the apparatus was enclosed in a hermetically sealed housing inside a glove box, and both volumes were regularly flushed with purified nitrogen gas. Finally, an infrared camera attached to the side of the housing permitted real-time visual monitoring of the cable's motion, and the system was controlled via a graphical user interface.

  13. Bioclipse: an open source workbench for chemo- and bioinformatics

    PubMed Central

    Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl ES

    2007-01-01

    Background There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Results Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Conclusion Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at . PMID:17316423

  14. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools; should I buy off-the-shelf tools or should I develop my own. Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  15. 10 CFR Appendix E to Part 835 - Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Appendix E to Part 835—Values for Establishing Sealed Radioactive Source Accountability and Radioactive Material Posting and Labeling Requirements (DEPARTMENT OF ENERGY, OCCUPATIONAL RADIATION PROTECTION). The data presented in appendix E are to be used for identifying accountable sealed...

  16. The patient as a radioactive source: an intercomparison of survey meters for measurements in nuclear medicine.

    PubMed

    Uhrhan, K; Drzezga, A; Sudbrock, F

    2014-11-01

    In this work, the radiation exposure in nuclear medicine is evaluated by measuring dose rates in the proximity of patients and in close contact with sources like capsules and syringes. A huge number of different survey meters (SMs) are offered commercially. This topic has recently gained interest since dosemeters and active personal dosemeters (APD) for the new dose quantities (ambient and directional dose equivalent) have become available. One main concern is the practical use of SMs and APD in daily clinical routines. Therefore, the radiation field of four common radiopharmaceuticals containing (18)F, (90)Y, (99m)Tc and (131)I, in radioactive sources or after application to the patient, was determined. Measurements were carried out with different SMs and at several distances. Dose rates decline significantly with increasing distance from the patient, and with some restrictions, APD can be used as SMs.
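
    The distance dependence noted in the abstract follows, for an idealized unshielded point source, the familiar health-physics relation (a textbook approximation, not a formula from the study):

        \[ \dot{D}(r) \approx \frac{\Gamma A}{r^{2}} \]

    where A is the activity, \Gamma the nuclide-specific dose rate constant and r the distance; doubling the distance from the patient therefore cuts the dose rate to roughly one quarter, although scatter and the extended geometry of a real patient make measured values deviate from this ideal.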

  17. SIMPLIFIED PRACTICAL TEST METHOD FOR PORTABLE DOSE METERS USING SEVERAL SEALED RADIOACTIVE SOURCES.

    PubMed

    Mikamoto, Takahiro; Yamada, Takahiro; Kurosawa, Tadahiro

    2016-09-01

    Sealed radioactive sources with small activities were employed for the determination of response and for tests of the non-linearity and energy dependence of detector responses. Close source-to-detector geometry (0.3 m or less) was employed in practical tests of portable dose meters to accumulate statistically sufficient ionization currents. The difference between the response in the present experimentally studied field and that in a reference field compliant with ISO 4037, arising from the non-uniformity of the radiation fluence at close geometry, was corrected by use of Monte Carlo simulation. As a consequence, the corrected results were consistent with the results obtained in the ISO 4037 reference field within their uncertainties. PMID:27521204

  18. CARTOGAM - a portable gamma camera for remote localisation of radioactive sources in nuclear facilities

    NASA Astrophysics Data System (ADS)

    Gal, O.; Izac, C.; Jean, F.; Lainé, F.; Lévêque, C.; Nguyen, A.

    2001-03-01

    We have developed a compact gamma-imaging system, CARTOGAM, for remote localisation of radioactive sources in nuclear facilities. This system is under industrial development and commercialisation by the firm EURISYS Mesures. The most specific characteristics of CARTOGAM lie in its size (8 cm in diameter) and mass (15 kg for the detection head, including the shield), which make it portable by a person. As an example, CARTOGAM detects a 660 keV source producing a 0.4 μGy/h dose rate at the camera location in 10 min. The angular resolution at that energy ranges from 1° to 3°, depending on the field of view (30° or 50°) and scintillator thickness (2 or 4 mm). We present here a review of the specifications of the camera and show a few images illustrating its performance.

  19. Freeing Crop Genetics through the Open Source Seed Initiative

    PubMed Central

    Luby, Claire H.; Goldman, Irwin L.

    2016-01-01

    For millennia, seeds have been freely available to use for farming and plant breeding without restriction. Within the past century, however, intellectual property rights (IPRs) have threatened this tradition. In response, a movement has emerged to counter the trend toward increasing consolidation of control and ownership of plant germplasm. One effort, the Open Source Seed Initiative (OSSI, www.osseeds.org), aims to ensure access to crop genetic resources by embracing an open source mechanism that fosters exchange and innovation among farmers, plant breeders, and seed companies. Plant breeders across many sectors have taken the OSSI Pledge to create a protected commons of plant germplasm for future generations. PMID:27093567

  20. Human genome and open source: balancing ethics and business.

    PubMed

    Marturano, Antonio

    2011-01-01

    The Human Genome Project has been completed thanks to a massive use of computer techniques, as well as the adoption of the open-source business and research model by the scientists involved. This model won over the proprietary model and allowed a quick propagation of, and feedback on, research results among peers. In this paper, the author will analyse some ethical and legal issues arising from the use of such a computing model in Human Genome property rights. The author will argue that Open Source is the best business model, as it is able to balance business and human rights perspectives.

  1. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  2. Open source and DIY hardware for DNA nanotechnology labs

    PubMed Central

    Damase, Tulsi R.; Stephens, Daniel; Spencer, Adam; Allen, Peter B.

    2015-01-01

    A set of instruments and specialized equipment is necessary to equip a laboratory to work with DNA. Reducing the barrier to entry for DNA manipulation should enable and encourage new labs to enter the field. We present three examples of open source/DIY technology with significantly reduced costs relative to commercial equipment. This includes a gel scanner, a horizontal PAGE gel mold, and a homogenizer for generating DNA-coated particles. The overall cost savings obtained by using open source/DIY equipment was between 50 and 90%. PMID:26457320

  3. Development of a surface ionization source for the production of radioactive alkali ion beams in SPIRAL

    NASA Astrophysics Data System (ADS)

    Eléon, C.; Jardin, P.; Gaubert, G.; Saint-Laurent, M.-G.; Alcántara-Núñez, J.; Alvès Condé, R.; Barué, C.; Boilley, D.; Cornell, J.; Delahaye, P.; Dubois, M.; Jacquot, B.; Leherissier, P.; Leroy, R.; Lhersonneau, G.; Marie-Jeanne, M.; Maunoury, L.; Pacquet, J. Y.; Pellemoine, F.; Pierret, C.; Thomas, J. C.; Villari, A. C. C.

    2008-10-01

    In the framework of the production of radioactive alkali ion beams by the isotope separation on-line (ISOL) method in SPIRAL I, a surface ionization source has been developed at GANIL to produce singly-charged ions of Li, Na and K. This new source has been designed to work in a hostile environment whilst having a long lifetime. This new system of production has two ohmic heating components: the first for the target oven and the second for the ionizer. The latter, made of carbon, offers high reliability and competitive ionization efficiency. This surface ionization source has been tested on-line using a 48Ca primary beam at 60.3 A MeV with an intensity of 0.14 pA. The ionization efficiencies obtained for Li, Na and K are significantly better than the theoretical values of the ionization probability per contact. The enhanced efficiency, due to the polarization of the ionizer, is shown to be very important also for short-lived isotopes. In the future, this source will be associated with the multicharged electron-cyclotron-resonance (ECR) ion source NANOGAN III for the production of multicharged alkali ions in SPIRAL. The preliminary tests of the setup are also presented in this contribution.
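
    The "ionization probability per contact" mentioned above is conventionally estimated from the Saha-Langmuir relation (standard surface-ionization theory, not an equation reproduced from the paper):

        \[ \alpha = \frac{n^{+}}{n^{0}} = \frac{g^{+}}{g^{0}} \exp\!\left( \frac{\phi - E_i}{k_B T} \right), \qquad \beta = \frac{\alpha}{1 + \alpha} \]

    where \phi is the work function of the ionizer surface, E_i the ionization potential of the atom, g^{+}/g^{0} the ratio of statistical weights of the ionic and atomic states, and \beta the resulting ionization efficiency per wall contact. Efficiencies above this single-contact estimate, as reported here, are usually attributed to repeated wall contacts and to the extracting polarization of the ionizer.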

  4. NASA's Open Source Software for Serving and Viewing Global Imagery

    NASA Astrophysics Data System (ADS)

    Roberts, J. T.; Alarcon, C.; Boller, R. A.; Cechini, M. F.; Gunnoe, T.; Hall, J. R.; Huang, T.; Ilavajhala, S.; King, J.; McGann, M.; Murphy, K. J.; Plesea, L.; Schmaltz, J. E.; Thompson, C. K.

    2014-12-01

    The NASA Global Imagery Browse Services (GIBS), which provides open access to an enormous archive of historical and near real time imagery from NASA supported satellite instruments, has also released most of its software to the general public as open source. The software packages, originally developed at the Jet Propulsion Laboratory and Goddard Space Flight Center, currently include: 1) the Meta Raster Format (MRF) GDAL driver—GDAL support for a specialized file format used by GIBS to store imagery within a georeferenced tile pyramid for exceptionally fast access; 2) OnEarth—a high performance Apache module used to serve tiles from MRF files via common web service protocols; 3) Worldview—a web mapping client to interactively browse global, full-resolution satellite imagery and download underlying data. Examples that show developers how to use GIBS with various mapping libraries and programs are also available. This stack of tools is intended to provide an out-of-the-box solution for serving any georeferenced imagery. Scientists as well as the general public can use the open source software for their own applications such as developing visualization interfaces for improved scientific understanding and decision support, hosting a repository of browse images to help find and discover satellite data, or accessing large datasets of geo-located imagery in an efficient manner. Open source users may also contribute back to NASA and the wider Earth Science community by taking an active role in evaluating and developing the software. This presentation will discuss the experiences of developing the software in an open source environment and useful lessons learned. To access the open source software repositories, please visit: https://github.com/nasa-gibs/
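
    A minimal sketch of reading a georeferenced raster with GDAL's Python bindings, the library into which the MRF driver plugs (the filename is a placeholder, and opening an MRF file requires a GDAL build that includes that driver):

        from osgeo import gdal

        # Placeholder path; any GDAL-supported raster works, including MRF where the driver is available.
        dataset = gdal.Open("global_imagery.mrf")
        if dataset is None:
            raise RuntimeError("GDAL could not open the file (missing file or driver)")

        print("driver:       ", dataset.GetDriver().ShortName)
        print("size:         ", dataset.RasterXSize, "x", dataset.RasterYSize)
        print("geotransform: ", dataset.GetGeoTransform())

        band = dataset.GetRasterBand(1)
        tile = band.ReadAsArray(0, 0, 256, 256)   # read a 256x256 window from the top-left corner
        print("tile shape:   ", tile.shape)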

  5. Is Open Source the ERP Cure-All?

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2008-01-01

    Conventional and hosted applications thrive, but open source ERP (enterprise resource planning) is coming on strong. In many ways, the evolution of the ERP market is littered with ironies. When Oracle began buying up customer relationship management (CRM) and ERP companies, some universities worried that they would be left with fewer choices and…

  6. Faculty/Student Surveys Using Open Source Software

    ERIC Educational Resources Information Center

    Kaceli, Sali

    2004-01-01

    This session will highlight an easy survey package which lets non-technical users create surveys, administer surveys, gather results, and view statistics. This is an open source application all managed online via a web browser. By using phpESP, the faculty is given the freedom of creating various surveys at their convenience and link them to their…

  7. Bioconductor: an open source framework for bioinformatics and computational biology.

    PubMed

    Reimers, Mark; Carey, Vincent J

    2006-01-01

    This chapter describes the Bioconductor project and details of its open source facilities for analysis of microarray and other high-throughput biological experiments. Particular attention is paid to concepts of container and workflow design, connections of biological metadata to statistical analysis products, support for statistical quality assessment, and calibration of inference uncertainty measures when tens of thousands of simultaneous statistical tests are performed.

  8. Analysing the Social Networks constituted by Open Source communities

    NASA Astrophysics Data System (ADS)

    Concas, Giulio; Lisci, Manuela; Pinna, Sandro; Porruvecchio, Guido; Uras, Selene

    2008-11-01

    In this study on Open Source communities, we examined some important aspects of communication among participants in developers' mailing lists. We utilized a Social Network Analysis approach to analyse this particular kind of community, in order to better understand interaction among members, identify communication flows, and discover whether a corresponding community coordination and control exists.
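
    A minimal sketch of the kind of analysis described: building a directed reply graph from mailing-list interactions and computing a simple centrality measure with networkx. The sender/replier pairs are invented placeholders, not data from the study.

        import networkx as nx

        # Hypothetical (replier -> original poster) interactions harvested from a developers mailing list.
        replies = [
            ("alice", "bob"), ("alice", "carol"), ("bob", "alice"),
            ("dave", "alice"), ("dave", "bob"), ("carol", "alice"),
        ]

        G = nx.DiGraph()
        for replier, poster in replies:
            # Accumulate a weight so repeated exchanges count more heavily.
            if G.has_edge(replier, poster):
                G[replier][poster]["weight"] += 1
            else:
                G.add_edge(replier, poster, weight=1)

        # In-degree centrality: who attracts the most replies (a rough proxy for coordination roles).
        centrality = nx.in_degree_centrality(G)
        for member, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{member:>6}: {score:.2f}")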

  9. Open Source Software: Fully Featured vs. "The Devil You Know"

    ERIC Educational Resources Information Center

    Hotrum, Michael; Ludwig, Brian; Baggaley, Jon

    2005-01-01

    The "ILIAS" learning management system (LMS) was evaluated, following its favourable rating in an independent evaluation study of open source software (OSS) products. The current review found "ILIAS" to have numerous features of value to distance education (DE) students and teachers, as well as problems for consideration in the system's ongoing…

  10. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  11. Higher Education Sub-Cultures and Open Source Adoption

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2011-01-01

    Successful adoption of new teaching and learning technologies in higher education requires the consensus of two sub-cultures, namely the technologist sub-culture and the academic sub-culture. This paper examines trends in adoption of open source software (OSS) for teaching and learning by comparing the results of a 2009 survey of 285 Chief…

  12. Chinese Localisation of Evergreen: An Open Source Integrated Library System

    ERIC Educational Resources Information Center

    Zou, Qing; Liu, Guoying

    2009-01-01

    Purpose: The purpose of this paper is to investigate various issues related to Chinese language localisation in Evergreen, an open source integrated library system (ILS). Design/methodology/approach: A Simplified Chinese version of Evergreen was implemented and tested and various issues such as encoding, indexing, searching, and sorting…

  13. Digital Preservation in Open-Source Digital Library Software

    ERIC Educational Resources Information Center

    Madalli, Devika P.; Barve, Sunita; Amin, Saiful

    2012-01-01

    Digital archives and digital library projects are being initiated all over the world for materials of different formats and domains. To organize, store, and retrieve digital content, many libraries as well as archiving centers are using either proprietary or open-source software. While it is accepted that print media can survive for centuries with…

  14. Color science demonstration kit from open source hardware and software

    NASA Astrophysics Data System (ADS)

    Zollers, Michael W.

    2014-09-01

    Color science is perhaps the most universally tangible discipline within the optical sciences for people of all ages. Excepting a small and relatively well-understood minority, we can see that the world around us consists of a multitude of colors; yet, describing the "what", "why", and "how" of these colors is not an easy task, especially without some sort of equally colorful visual aids. While static displays (e.g., poster boards, etc.) serve their purpose, there is a growing trend, aided by the recent permeation of small interactive devices into our society, for interactive and immersive learning. However, for the uninitiated, designing software and hardware for this purpose may not be within the purview of all optical scientists and engineers. Enter open source. Open source "anything" are those tools and designs -- hardware or software -- that are available and free to use, often without any restrictive licensing. Open source software may be familiar to some, but the open source hardware movement is relatively new. These are electronic circuit board designs that are provided for free and can be implemented in physical hardware by anyone. This movement has led to the availability of some relatively inexpensive, but quite capable, computing power for the creation of small devices. This paper will showcase the design and implementation of the software and hardware that was used to create an interactive demonstration kit for color. Its purpose is to introduce and demonstrate the concepts of color spectra, additive color, color rendering, and metamers.

  15. The Case for Open Source Software in Digital Forensics

    NASA Astrophysics Data System (ADS)

    Zanero, Stefano; Huebner, Ewa

    In this introductory chapter we discuss the importance of the use of open source software (OSS), and in particular of free software (FLOSS) in computer forensics investigations including the identification, capture, preservation and analysis of digital evidence; we also discuss the importance of OSS in computer forensics

  16. Open Source Projects in Software Engineering Education: A Mapping Study

    ERIC Educational Resources Information Center

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  17. Critical Analysis on Open Source LMSs Using FCA

    ERIC Educational Resources Information Center

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  18. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that allows users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, iii) run and explore simulations in many ways: using the OpenFLUID software interfaces for users (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models licensed under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network

  19. Analysis of source term modeling for low-level radioactive waste performance assessments

    SciTech Connect

    Icenhour, A.S.

    1995-03-01

    Site-specific radiological performance assessments are required for the disposal of low-level radioactive waste (LLW) at both commercial and US Department of Energy facilities. This work explores source term modeling of LLW disposal facilities by using two state-of-the-art computer codes, SOURCE1 and SOURCE2. An overview of the performance assessment methodology is presented, and the basic processes modeled in the SOURCE1 and SOURCE2 codes are described. Comparisons are made between the two advective models for a variety of radionuclides, transport parameters, and waste-disposal technologies. These comparisons show that, in general, the zero-order model predicts undecayed cumulative fractions leached that are slightly greater than or equal to those of the first-order model. For long-lived radionuclides, results from the two models eventually reach the same value. By contrast, for short-lived radionuclides, the zero-order model predicts a slightly higher undecayed cumulative fraction leached than does the first-order model. A new methodology, based on sensitivity and uncertainty analyses, is developed for predicting intruder scenarios. This method is demonstrated for 137Cs in a tumulus-type disposal facility. The sensitivity and uncertainty analyses incorporate input-parameter uncertainty into the evaluation of a potential time of intrusion and the remaining radionuclide inventory. Finally, conclusions from this study are presented, and recommendations for continuing work are made.
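
    To make the zero-order versus first-order comparison concrete, the Python sketch below evaluates generic textbook forms of the two leach models. This is a minimal illustration under an arbitrary assumed rate constant, not the SOURCE1/SOURCE2 implementation; it only shows that the zero-order cumulative fraction leached is greater than or equal to the first-order one before both approach unity.

        # Hedged sketch of generic leach models, not the SOURCE1/SOURCE2 codes.
        # Zero-order: constant leach rate until the inventory is exhausted.
        # First-order: leach rate proportional to the remaining inventory.
        import math

        k = 0.05  # illustrative fractional leach rate, 1/year (assumed value)

        for t in range(0, 101, 10):
            zero_order = min(k * t, 1.0)           # cumulative fraction leached
            first_order = 1.0 - math.exp(-k * t)   # cumulative fraction leached
            print(f"t={t:3d} y  zero-order={zero_order:.3f}  first-order={first_order:.3f}")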

  20. Implementation of an imaging spectrometer for localization and identification of radioactive sources

    NASA Astrophysics Data System (ADS)

    Lemaire, H.; Khalil, R. Abou; Amgarou, K.; Angélique, J.-C.; Bonnet, F.; De Toro, D.; Carrel, F.; Giarmana, O.; Gmar, M.; Menaa, N.; Menesguen, Y.; Normand, S.; Patoz, A.; Schoepff, V.; Talent, P.; Timi, T.

    2014-11-01

    Spatial localization of radioactive sources is currently a main issue interesting nuclear industry as well as homeland security applications and can be achieved using gamma cameras. For several years, CEA LIST has been designing a new system, called GAMPIX, with improved sensitivity, portability and ease of use. The main remaining limitation of this system is the lack of spectrometric information, preventing the identification of radioactive materials. This article describes the development of an imaging spectrometer based on the GAMPIX technology. Experimental tests have been carried out according to both spectrometric methods enabled by the pixelated Timepix chip used in the GAMPIX gamma camera. The first method is based on the size of the impacts produced by a gamma-ray energy deposition in the detection matrix. The second one uses the Time over Threshold (ToT) mode of the Timepix chip and deals with time spent by pulses generated by charge preamplifiers over a user-specified threshold. Both energy resolution and sensitivity studies demonstrated the superiority of the ToT approach which will consequently be further explored. Energy calibration, tests of different pixel sizes for the Timepix chip and use of the Medipix3 chip are future milestones to improve performances of the newly implemented imaging spectrometer.

  1. Radioactive sealed sources: Reasonable accountability, exemption, and licensing activity thresholds -- A technical basis

    SciTech Connect

    Lee, D.W.; Shingleton, K.L.

    1996-07-01

    Perhaps owing to their small size and portability, some radiation accidents/incidents have involved radioactive sealed sources (RSSs). As a result, programs for the control and accountability of RSSs have come to be recommended and emplaced that essentially require RSSs to be controlled in a manner different from bulk, unsealed radioactive material. The individual RSS activity above which manpower-intensive radiation protection surveillance is required, and below which such effort is not considered cost effective, crucially determines the total number of RSSs for which that surveillance is provided. Individual RSS activity thresholds are typically determined through scenarios which impart a chosen internal or external limiting dose to Reference Man under specified exposure conditions. The resultant RSS threshold activity levels have meaning commensurate with the assumed scenario exposure parameters, i.e., only to the extent that those parameters are realistic and technically based. A review of how the Department of Energy (DOE), the International Atomic Energy Agency (IAEA), and the Nuclear Regulatory Commission (NRC) have determined their respective accountability, exemption, and licensing threshold activity values is provided. Finally, a fully explained method, using references readily available to practicing health physicists, is developed with realistic, technically based calculation parameters by which RSS threshold activities may be locally generated.

  2. Radiation Field of Packages Carrying Spent Co-60 Radioactive Sources - 12437

    SciTech Connect

    Marzo, Giuseppe A.; Giorgiantoni, Giorgio; Sepielli, Massimo

    2012-07-01

    Among the diverse radioactive sources commonly exploited in medical and industrial applications, Co-60 is increasingly used as a strong gamma emitter. Over time, source manufacturers favored Co-60 over other gamma emitters because of its relatively short half-life (5.27 years), which minimizes issues related to the management of disused sources. Disused Co-60 sources can retain a significant amount of radioactivity (from hundreds of MBq to several GBq) that still poses safety concerns for their handling and transportation. In this context, a detailed knowledge of their radiation field would provide the necessary information for taking actions to prevent unnecessary doses to workers and the population by optimizing transportation procedures and handling operations. We modeled the geometry and the materials constituting a transportation packaging of a spent Co-60 source which had an original maximum activity of a few GBq and was enclosed in a small lead irradiator. Then we applied a Monte Carlo transport code (MCNP5) for tracking the gamma photons emitted by the source, including the secondary photons resulting from the interaction of the source photons with the surrounding materials. This allowed for the evaluation of the radiation field inside and outside the packaging, and the corresponding equivalent dose, useful for checking compliance with the regulations and the health risk of possible radiation exposure. We found that a typical 60-liter drum carrying a spent Co-60 source, enclosed in its original irradiator, with a residual activity of 300 MBq could already exceed an equivalent dose rate of 0.2 mSv/h on the drum external surface, which is the maximum equivalent dose at any point of the surface for this packaging as prescribed by local regulations. This condition is even more apparent when the source is slightly displaced with respect to the rotation axis of the drum, an easily occurring condition for sources not properly packaged, generating non
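
    For orientation only, the hedged sketch below estimates the unshielded point-source dose rate from a 300 MBq Co-60 source at a few distances and compares it with the 0.2 mSv/h surface limit quoted above. The dose-rate constant is an approximate, commonly quoted value taken here as an assumption, and the calculation ignores the lead irradiator and the drum entirely, so it is no substitute for the Monte Carlo model described in the abstract.

        # Hedged back-of-envelope estimate; not the MCNP5 model from the abstract.
        # GAMMA_CONSTANT is an assumed nominal dose-rate constant for Co-60
        # (~0.35 mSv*m^2/(h*GBq)); check an authoritative table before relying on it.
        GAMMA_CONSTANT = 0.35   # mSv * m^2 / (h * GBq), assumed
        ACTIVITY_GBQ = 0.3      # 300 MBq residual activity
        SURFACE_LIMIT = 0.2     # mSv/h, limit quoted in the abstract

        for distance_m in (1.0, 0.5, 0.3):
            dose_rate = GAMMA_CONSTANT * ACTIVITY_GBQ / distance_m ** 2  # inverse-square law
            flag = "exceeds" if dose_rate > SURFACE_LIMIT else "below"
            print(f"{distance_m:.1f} m: {dose_rate:.2f} mSv/h ({flag} the 0.2 mSv/h limit)")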

  3. OpenDA: Open Source Generic Data Assimilation Environment and its Application in Geophysical Process Models

    NASA Astrophysics Data System (ADS)

    Weerts, A.; van Velzen, N.; Verlaan, M.; Sumihar, J.; Hummel, S.; El Serafy, G.; Dhondia, J.; Gerritsen, H.; Vermeer-Ooms, S.; Loots, E.; Markus, A.; Kockx, A.

    2011-12-01

    Data assimilation techniques are essential elements in state-of-the-art development of models and their use in operational forecasting and real-time control in the fields of groundwater, surface water and soil systems. In meteorological and atmospheric sciences, steady improvements in numerical weather forecasting and climate prediction over the last couple of decades have been enabled to a large degree by the development of community-based models and data assimilation systems. The hydrologic community should learn from the experiences of the meteorological and atmospheric communities by accelerating the transition of hydrologic DA research into operations and developing community-supported, open-source modeling and forecasting systems and data assimilation tools. In 2010, a community-based open source initiative named OpenDA was started. The OpenDA initiative bears similarities with the well-known OpenMI initiative. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modeling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water flow, rainfall-runoff, unsaturated flow, groundwater flow, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications.

  4. Simulated Performance of Algorithms for the Localization of Radioactive Sources from a Position Sensitive Radiation Detecting System (COCAE)

    SciTech Connect

    Karafasoulis, K.; Zachariadou, K.; Seferlis, S.; Kaissas, I.; Potiriadis, C.; Lambropoulos, C.; Loukas, D.

    2011-12-13

    Simulation studies are presented regarding the performance of algorithms that localize point-like radioactive sources detected by a position sensitive portable radiation instrument (COCAE). The source direction is estimated by using the List Mode Maximum Likelihood Expectation Maximization (LM-ML-EM) imaging algorithm. Furthermore, the source-to-detector distance is evaluated by three different algorithms based on the photo-peak count information of each detecting layer, the quality of the reconstructed source image, and the triangulation method. These algorithms have been tested on a large number of simulated photons over a wide energy range (from 200 keV to 2 MeV) emitted by point-like radioactive sources located at different orientations and source-to-detector distances.
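
    The abstract does not reproduce the algorithms, but the basic idea behind distance estimation from count information can be illustrated with a much simpler textbook relation: for an isotropic point source, count rates N1 and N2 measured at two positions a known baseline apart along the line to the source fix its distance through the inverse-square law. The sketch below implements only that generic relation with assumed count rates; it is not the COCAE code.

        # Hedged illustration of inverse-square distance estimation along a line;
        # not the COCAE algorithms. The count rates are assumed example values.
        import math

        def source_distance(n_near: float, n_far: float, baseline_m: float) -> float:
            """Distance from the near position to a point source, given count
            rates at two positions separated by baseline_m along the source
            direction: n_near / n_far = ((r + baseline) / r) ** 2."""
            ratio = math.sqrt(n_near / n_far)
            return baseline_m / (ratio - 1.0)

        # Example: 400 counts/s at the near position, 100 counts/s one metre further back.
        print(f"estimated distance: {source_distance(400.0, 100.0, 1.0):.2f} m")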

  5. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  6. Open Source Clinical NLP - More than Any Single System.

    PubMed

    Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang

    2014-01-01

    The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.

  7. How Open Source Can Still Save the World

    NASA Astrophysics Data System (ADS)

    Behlendorf, Brian

    Many of the world's major problems - economic distress, natural disaster responses, broken health care systems, education crises, and more - are not fundamentally information technology issues. However, in every case mentioned and more, there exist opportunities for Open Source software to uniquely change the way we can address these problems. At times this is about addressing a need for which no sufficient commercial market exists. For others, it is in the way Open Source licenses free the recipient from obligations to the creators, creating a relationship of mutual empowerment rather than one of dependency. For yet others, it is in the way the open collaborative processes that form around Open Source software provide a neutral ground for otherwise competitive parties to find a greatest common set of mutual needs to address together rather than in parallel. Several examples of such software exist today and are gaining traction. Governments, NGOs, and businesses are beginning to recognize the potential and are organizing to meet it. How far can this be taken?

  8. Effect of geometrical configuration of radioactive sources on radiation intensity in beta-voltaic nuclear battery system: A preliminary result

    NASA Astrophysics Data System (ADS)

    Basar, Khairul; Riupassa, Robi D.; Bachtiar, Reza; Badrianto, Muldani D.

    2014-09-01

    It is known that one main problem in the application of the beta-voltaic nuclear battery system is its low efficiency. The efficiency of the beta-voltaic nuclear battery system mainly depends on three aspects: the source of radioactive radiation, the interfaces between materials in the system, and the process of converting electron-hole pairs to electric current in the semiconductor material. In this work, we show the effect of the geometrical configuration of radioactive sources on the radiation intensity of a beta-voltaic nuclear battery system.
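
    As a hedged illustration of why geometry matters, the sketch below computes the textbook geometric efficiency (the fraction of isotropically emitted particles that intercept a circular converter) for an on-axis point source facing a disc. The radius and gaps are arbitrary assumed values, and the formula ignores backscatter, self-absorption and the extended-source geometries actually studied in the paper.

        # Hedged geometric-efficiency sketch for an on-axis point source and a
        # circular converter of radius R at gap d; dimensions are assumed examples.
        import math

        def geometric_efficiency(radius_mm: float, gap_mm: float) -> float:
            """Fraction of isotropic emissions hitting the disc:
            omega / (4*pi) = 0.5 * (1 - d / sqrt(d**2 + R**2))."""
            return 0.5 * (1.0 - gap_mm / math.hypot(gap_mm, radius_mm))

        for gap in (0.1, 0.5, 1.0, 5.0):  # source-converter gap in mm, assumed
            eff = geometric_efficiency(5.0, gap)
            print(f"gap {gap:4.1f} mm -> geometric efficiency {eff:.3f}")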

  9. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  10. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease. PMID:27575624
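
    A minimal usage sketch of the accompanying Python bindings is given below. It assumes the pyopenms package is installed and uses a hypothetical file name; the method names follow the project's published examples but should be verified against the current pyOpenMS documentation.

        # Hedged sketch, assuming the pyopenms bindings are installed;
        # "sample.mzML" is a hypothetical placeholder file.
        import pyopenms as oms

        exp = oms.MSExperiment()
        oms.MzMLFile().load("sample.mzML", exp)   # parse an mzML file into memory

        print("spectra loaded:", exp.getNrSpectra())
        for spectrum in exp:
            mz, intensity = spectrum.get_peaks()  # peak arrays for one spectrum
            print(spectrum.getRT(), len(mz))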

  11. openBIS ELN-LIMS: an open-source database for academic laboratories

    PubMed Central

    Barillari, Caterina; Ottoz, Diana S. M.; Fuentes-Serna, Juan Mariano; Ramakrishnan, Chandrasekhar; Rinn, Bernd; Rudolf, Fabian

    2016-01-01

    Summary: The open-source platform openBIS (open Biology Information System) offers an Electronic Laboratory Notebook and a Laboratory Information Management System (ELN-LIMS) solution suitable for the academic life science laboratories. openBIS ELN-LIMS allows researchers to efficiently document their work, to describe materials and methods and to collect raw and analyzed data. The system comes with a user-friendly web interface where data can be added, edited, browsed and searched. Availability and implementation: The openBIS software, a user guide and a demo instance are available at https://openbis-eln-lims.ethz.ch. The demo instance contains some data from our laboratory as an example to demonstrate the possibilities of the ELN-LIMS (Ottoz et al., 2014). For rapid local testing, a VirtualBox image of the ELN-LIMS is also available. Contact: brinn@ethz.ch or fabian.rudolf@bsse.ethz.ch PMID:26508761

  13. Concentrating Radioactivity

    ERIC Educational Resources Information Center

    Herrmann, Richard A.

    1974-01-01

    By concentrating radioactivity contained on luminous dials, a teacher can make a high reading source for classroom experiments on radiation. The preparation of the source and its uses are described. (DT)

  14. Using RSS feeds to track open source radiology informatics projects

    NASA Astrophysics Data System (ADS)

    Nagy, Paul; Daly, Mark; Warnock, Michael; Siddiqui, Khan; Siegel, Eliot

    2005-04-01

    There are over 40 open source projects in the field of radiology informatics. Because these are organized and written by volunteers, the development speed varies greatly from one project to the next. To keep track of updates, users must constantly check in on each project's Web page. Many projects remain dormant for years, and ad hoc checking becomes both an inefficient and unreliable means of determining when new versions are available. The result is that most end users track only a few projects and are unaware when others that may be more germane to their interests leapfrog in development. RSS feeds provide a machine-readable XML format to track software project updates. Currently only 8 of the 40 projects provide RSS feeds for automatic propagation of news updates. We have built a news aggregation engine around open source projects in radiology informatics.
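
    A hedged sketch of the kind of polling such an aggregation engine performs is shown below, using the widely available feedparser library. The feed URLs are hypothetical placeholders, and the snippet is not the engine described in the paper.

        # Hedged sketch of polling project RSS feeds; the URLs are hypothetical
        # placeholders, not the actual radiology informatics project feeds.
        import feedparser

        FEEDS = [
            "https://example.org/project-a/releases.rss",   # hypothetical
            "https://example.org/project-b/news.xml",       # hypothetical
        ]

        for url in FEEDS:
            feed = feedparser.parse(url)
            source_title = feed.feed.get("title", url)
            for entry in feed.entries[:3]:                  # newest few items
                print(source_title, "|", entry.get("title"), "|", entry.get("published", "n/a"))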

  15. OPERA: Open-source Pipeline for Espadons Reduction and Analysis

    NASA Astrophysics Data System (ADS)

    Teeple, Douglas

    2014-11-01

    OPERA (Open-source Pipeline for Espadons Reduction and Analysis) is an open-source collaborative software reduction pipeline for ESPaDOnS data. ESPaDOnS is a bench-mounted high-resolution echelle spectrograph and spectro-polarimeter designed to obtain a complete optical spectrum (from 370 to 1,050 nm) in a single exposure with a mode-dependent resolving power between 68,000 and 81,000. OPERA is fully automated, calibrates on two-dimensional images and reduces data to produce one-dimensional intensity and polarimetric spectra. Spectra are extracted using an optimal extraction algorithm. Though designed for CFHT ESPaDOnS data, the pipeline is extensible to other echelle spectrographs.

  16. Long distance education for croatian nurses with open source software.

    PubMed

    Radenovic, Aleksandar; Kalauz, Sonja

    2006-01-01

    The Croatian Nursing Informatics Association (CNIA) was established as a result of continuing work on promoting nursing informatics in Croatia. The main goals of CNIA are to promote nursing informatics and to educate nurses about nursing informatics and the use of information technology in the nursing process. At the start of its work, CNIA developed three nursing informatics courses, all designed with support for long-distance education using open source software. The courses are: A - 'From Data to Wisdom', B - 'Introduction to Nursing Informatics' and C - 'Nursing Informatics I'. Courses A and B are prerequisites for course C. The technology used to implement these online courses is based on the open source Learning Management System (LMS) Claroline, a free online collaborative learning platform. Each course is divided into two modules/days: on the first module/day participants follow a classical classroom approach, and on the second day they continue with e-learning from home. These are the first nursing informatics courses, and the first long-distance education offering, for nurses in Croatia. PMID:17102315

  17. Modular Open-Source Software for Item Factor Analysis

    PubMed Central

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven

    2015-01-01

    This paper introduces an Item Factor Analysis (IFA) module for OpenMx, a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation and manipulation of models. Modular organization of the source code facilitates the easy addition of item models, item parameter estimation algorithms, optimizers, test scoring algorithms, and fit diagnostics all within an integrated framework. Three short example scripts are presented for fitting item parameters, latent distribution parameters, and a multiple group model. The availability of both IFA and structural equation modeling in the same software is a step toward the unification of these two methodologies. PMID:27065479

  18. Open source data analysis and visualization software for optical engineering

    NASA Astrophysics Data System (ADS)

    Smith, Greg A.; Lewis, Benjamin J.; Palmer, Michael; Kim, Dae Wook; Loeff, Adrian R.; Burge, James H.

    2012-10-01

    SAGUARO is open-source software developed to simplify data assimilation, analysis, and visualization by providing a single framework for disparate data sources, from raw hardware measurements to optical simulation output. Developed with a user-friendly graphical interface in the MATLAB™ environment, SAGUARO is intended to be easy for the end user in search of useful optical information as well as for the developer wanting to add new modules and functionalities. We present here the flexibility of the SAGUARO software and discuss how it can be applied to the wider optical engineering community.

  19. Open Source Software For Patient Data Management In Critical Care.

    PubMed

    Massaut, Jacques; Charretk, Nicolas; Gayraud, Olivia; Van Den Bergh, Rafael; Charles, Adelin; Edema, Nathalie

    2015-01-01

    We have previously developed a Patient Data Management System for Intensive Care based on Open Source Software. The aim of this work was to adapt this software to use in Emergency Departments in low resource environments. The new software includes facilities for utilization of the South African Triage Scale and prediction of mortality based on independent predictive factors derived from data from the Tabarre Emergency Trauma Center in Port au Prince, Haiti.

  20. GISCube, an Open Source Web-based GIS Application

    NASA Astrophysics Data System (ADS)

    Boustani, M.; Mattmann, C. A.; Ramirez, P.

    2014-12-01

    There are many Earth science projects and data systems being developed at the Jet Propulsion Laboratory, California Institute of Technology (JPL) that require the use of Geographic Information Systems (GIS). Three in particular are: (1) the JPL Airborne Snow Observatory (ASO), which measures the amount of water being generated from snow melt in mountains; (2) the Regional Climate Model Evaluation System (RCMES), which compares climate model outputs with remote sensing datasets in the context of model evaluation for the Intergovernmental Panel on Climate Change and the U.S. National Climate Assessment; and (3) the JPL Snow Server, which produces a snow and ice climatology for the Western US and Alaska for the U.S. National Climate Assessment. Each of these three examples, and many other Earth science projects, strongly need GIS and geoprocessing capabilities to process, visualize, manage and store geospatial data. Besides some open source GIS libraries and some software like ArcGIS, there are comparatively few open source, web-based and easy-to-use applications that are capable of doing GIS processing and visualization. To address this, we present GISCube, an open source web-based GIS application that can store, visualize and process GIS and geospatial data. GISCube is powered by Geothon, an open source Python GIS cookbook. Geothon has a variety of geoprocessing tools, such as data conversion, processing, spatial analysis and data management tools. GISCube has the capability of supporting a variety of well-known GIS data formats in both vector and raster formats, and the system is being expanded to support NASA's and scientific data formats such as netCDF and HDF files. In this talk, we demonstrate how Earth science and other projects can benefit by using GISCube and Geothon, describe the system's current goals, and outline our future work in the area.

  1. Crux: rapid open source protein tandem mass spectrometry analysis.

    PubMed

    McIlwain, Sean; Tamura, Kaipo; Kertesz-Farkas, Attila; Grant, Charles E; Diament, Benjamin; Frewen, Barbara; Howbert, J Jeffry; Hoopmann, Michael R; Käll, Lukas; Eng, Jimmy K; MacCoss, Michael J; Noble, William Stafford

    2014-10-01

    Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit ( http://cruxtoolkit.sourceforge.net ) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276

  2. Small-angle Compton Scattering to Determine the Depth of a Radioactive Source in Matter

    SciTech Connect

    Oberer, R. B.; Gunn, C. A.; Chiang, L. G.; Valiga, R. E.; Cantrell, J. A.

    2011-04-01

    A gamma-ray peak in a spectrum is often accompanied by a discontinuity in the Compton continuum at the peak. The Compton continuum results from Compton scattering in the detector. The discontinuity at a peak results from small-angle Compton scattering by the gamma rays in matter situated directly between the gamma-ray source and the detector. The magnitude of this discontinuity with respect to the gamma-ray peak is therefore an indicator of the amount of material or shielding between the gamma-ray source and the detector. This small-angle scattering was used to determine the depth of highly-enriched uranium (HEU) solution standards in a concrete floor mockup. The empirical results of the use of this small-angle scattering discontinuity in a concrete floor experiment will be described. A Monte Carlo calculation of the experiment will also be described. In addition, the depth determined from small-angle scattering was used in conjunction with differential attenuation to more accurately measure the uranium content of the mockup. Following these empirical results, the theory of small-angle scattering will be discussed. The magnitude of the discontinuity compared to the peak count rate is directly related to the depth of the gamma-ray source in matter. This relation can be described by relatively simple mathematical expressions. This is the first instance that we are aware of in which the small-angle Compton scattering has been used to determine the depth of a radioactive source. Furthermore this is the first development of the theoretical expressions for the magnitude of the small-angle scattering discontinuity.
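
    For reference, the standard Compton relation underlying the small-angle argument can be written as below (a textbook formula, not the authors' new expressions for the discontinuity magnitude). For small scattering angles, 1 - cos(theta) is approximately theta^2/2, so the scattered photon loses very little energy and registers just below the full-energy peak, which is why the pile-up of small-angle scatters from intervening material appears as a discontinuity at the peak.

        % Standard Compton scattering relation (textbook form).
        % E : incident photon energy, E' : scattered photon energy,
        % m_e c^2 = 511 keV, \theta : scattering angle.
        E' = \frac{E}{1 + \dfrac{E}{m_e c^2}\left(1 - \cos\theta\right)}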

  3. Harvesting, Integrating and Distributing Large Open Geospatial Datasets Using Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Oliveira, Ricardo; Moreno, Rafael

    2016-06-01

    Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies' transparency and accountability, as well as to improve citizens' awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets available even from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets, one of them being the city parcels information containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used to first collect data from diverse City of Denver Open Data sets, then upload them to a repository in the Cloud, where they were processed using a PostgreSQL installation on the Cloud and Python scripts. Our method was able to extract non-spatial information from a 'not-ready-to-download' source that could then be combined with the initial data set to enhance its potential use.
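
    A hedged sketch of the general pattern (download an open data set, then load it into a cloud-hosted PostgreSQL instance so it can be joined with other layers) is shown below. The URL, connection settings and table name are hypothetical placeholders, and the snippet is not the authors' processing pipeline.

        # Hedged sketch of harvesting an open data set into PostgreSQL; the URL,
        # connection settings and table are hypothetical, not the authors' pipeline.
        import json
        import psycopg2
        import requests

        URL = "https://example.org/open-data/parcels.json"   # hypothetical endpoint
        records = requests.get(URL, timeout=60).json()

        conn = psycopg2.connect(host="localhost", dbname="opendata",
                                user="etl", password="secret")   # assumed settings
        with conn, conn.cursor() as cur:
            cur.execute("CREATE TABLE IF NOT EXISTS parcels_raw (payload JSONB)")
            for rec in records:
                cur.execute("INSERT INTO parcels_raw (payload) VALUES (%s)",
                            (json.dumps(rec),))
        conn.close()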

  4. Going open source: some lessons learned from the development of OpenRecLink.

    PubMed

    Camargo Jr, Kenneth Rochel de; Coeli, Claudia Medina

    2015-02-01

    Record linkage is the process of identifying and merging records across different databases belonging to the same entity. The health sector is one of the pioneering areas of record linkage techniques applications. In 1998 we began the development of a software package, called RecLink, that implemented probabilistic record linkage techniques. In this article we report the development of a new, open-source version of that program, now named OpenRecLink. The aim of this article is to present the main characteristics of the new version and some of the lessons learned during its development. The new version is a total rewrite of the program, based on three goals: (1) to migrate to a free and open source software (FOSS) platform; (2) to implement a multiplatform version; (3) to implement support for internationalization. We describe the tools that we adopted, the process of development and some of the problems encountered. PMID:25760160
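
    For readers unfamiliar with probabilistic record linkage, the hedged sketch below scores a candidate record pair with textbook Fellegi-Sunter agreement weights. The m and u probabilities are invented illustrative numbers, and the code is a generic sketch of the technique rather than the OpenRecLink implementation.

        # Hedged Fellegi-Sunter scoring sketch; illustrative m/u probabilities,
        # not the OpenRecLink implementation.
        import math

        # m = P(field agrees | records truly match)
        # u = P(field agrees | records do not match)   (assumed example values)
        FIELD_PARAMS = {"name": (0.95, 0.02), "birth_date": (0.98, 0.01), "city": (0.90, 0.20)}

        def pair_weight(agreements: dict) -> float:
            """Sum of log2 likelihood ratios over the compared fields."""
            total = 0.0
            for field, agrees in agreements.items():
                m, u = FIELD_PARAMS[field]
                total += math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))
            return total

        # A pair agreeing on name and birth date but not city.
        print(pair_weight({"name": True, "birth_date": True, "city": False}))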

  5. Opening Up to Open Source: Looking at How Moodle Was Adopted in Higher Education

    ERIC Educational Resources Information Center

    Costello, Eamon

    2013-01-01

    The virtual learning environment (VLE) has grown to become a piece of complex infrastructure that is now deemed critical to higher educational provision. This paper looks at Moodle and its adoption in higher education. Moodle's origins, as an open source VLE, are investigated and its growth examined in the context of how higher educational…

  6. A Dozen Years after Open Source's 1998 Birth, It's Time for "OpenTechComm"

    ERIC Educational Resources Information Center

    Still, Brian

    2010-01-01

    2008 marked the 10-year Anniversary of the Open Source movement, which has had a substantial impact on not only software production and adoption, but also on the sharing and distribution of information. Technical communication as a discipline has taken some advantage of the movement or its derivative software, but this article argues not as much…

  7. Open source software engineering for geoscientific modeling applications

    NASA Astrophysics Data System (ADS)

    Bilke, L.; Rink, K.; Fischer, T.; Kolditz, O.

    2012-12-01

    OpenGeoSys (OGS) is a scientific open source project for numerical simulation of thermo-hydro-mechanical-chemical (THMC) processes in porous and fractured media. The OGS software development community is distributed all over the world and people with different backgrounds are contributing code to a complex software system. The following points have to be addressed for successful software development: platform-independent code; a unified build system; a version control system; a collaborative project web site; continuous builds and testing; and providing binaries and documentation for end users. OGS should run on a PC as well as on a computing cluster regardless of the operating system. Therefore the code should not include any platform-specific feature or library. Instead, open source and platform-independent libraries like Qt for the graphical user interface or VTK for visualization algorithms are used. A source code management and version control system is a definite requirement for distributed software development. For this purpose Git is used, which enables developers to work on separate versions (branches) of the software and to merge those versions at some point into the official one. The version control system is integrated into an information and collaboration website based on a wiki system. The wiki is used for collecting information such as tutorials, application examples and case studies. Discussions take place in the OGS mailing list. To improve code stability and to verify code correctness, a continuous build and testing system, based on the Jenkins Continuous Integration Server, has been established. This server is connected to the version control system and, on every code change, compiles (builds) the code on every supported platform (Linux, Windows, MacOS), runs a comprehensive test suite of over 120 benchmarks and verifies the results, and runs software-development-related metrics on the code (like compiler warnings, code complexity

  8. Nowcasting influenza outbreaks using open-source media report.

    SciTech Connect

    Ray, Jaideep; Brownstein, John S.

    2013-02-01

    We construct and verify a statistical method to nowcast influenza activity from a time-series of the frequency of reports concerning influenza-related topics. Such reports are published electronically by both public health organizations as well as newspapers/media sources, and thus can be harvested easily via web crawlers. Since media reports are timely, whereas reports from public health organizations are delayed by at least two weeks, using timely, open-source data to compensate for the lag in "official" reports can be useful. We use morbidity data from networks of sentinel physicians (both the Centers for Disease Control's ILINet and France's Sentinelles network) as the gold standard of influenza-like illness (ILI) activity. The time-series of media reports is obtained from HealthMap (http://healthmap.org). We find that the time-series of media reports shows some correlation (~0.5) with ILI activity; further, this can be leveraged into an autoregressive moving average model with exogenous inputs (ARMAX model) to nowcast ILI activity. We find that the ARMAX models have more predictive skill compared to autoregressive (AR) models fitted to ILI data, i.e., it is possible to exploit the information content in the open-source data. We also find that when the open-source data are non-informative, the ARMAX models reproduce the performance of AR models. The statistical models are tested on data from the 2009 swine-flu outbreak as well as the mild 2011-2012 influenza season in the U.S.A.
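
    A hedged sketch of fitting such a model with the statsmodels package is given below. The series are synthetic stand-ins rather than ILINet or HealthMap data, and the model order is an arbitrary assumption, so the snippet only illustrates the mechanics of adding an exogenous media-report regressor to an ARMA model.

        # Hedged ARMAX-style nowcast sketch with statsmodels; synthetic data and an
        # arbitrary model order, not the paper's fitted models or data.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(0)
        weeks = 150
        media = rng.poisson(20, size=weeks).astype(float)           # weekly report counts
        ili = 1.0 + 0.05 * media + rng.normal(0, 0.2, size=weeks)   # synthetic ILI signal

        # ARMA(2, 1) in the mean with the media series as an exogenous input.
        fit = ARIMA(endog=ili[:-4], exog=media[:-4], order=(2, 0, 1)).fit()
        nowcast = fit.forecast(steps=4, exog=media[-4:])            # "nowcast" the last 4 weeks
        print(nowcast)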

  9. Global threat reduction initiative efforts to address transportation challenges associated with the recovery of disused radioactive sealed sources - 10460

    SciTech Connect

    Whitworth, Julie; Abeyta, Cristy L; Griffin, Justin M; Matzke, James L; Pearson, Michael W; Cuthbertson, Abigail; Rawl, Richard; Singley, Paul

    2010-01-01

    Proper disposition of disused radioactive sources is essential for their safe and secure management and necessary to preclude their use in malicious activities. Without affordable, timely transportation options, disused sealed sources remain in storage at hundreds of sites throughout the country and around the world. While secure storage is a temporary measure, the longer sources remain disused or unwanted, the greater the chances that they will become unsecured or abandoned. The Global Threat Reduction Initiative's Off-Site Source Recovery Project (GTRI/OSRP) recovers thousands of disused and unwanted sealed sources annually as part of GTRI's larger mission to reduce and protect high-risk nuclear and radiological materials located at civilian sites worldwide. Faced with decreasing availability of certified transportation containers to support movement of disused and unwanted neutron- and beta/gamma-emitting radioactive sealed sources, GTRI/OSRP has initiated actions to ensure the continued success of the project in timely recovery and management of sealed radioactive sources. Efforts described in this paper to enhance transportation capabilities include:
    - addition of authorized content to existing and planned Type B containers to support the movement of non-special form and other Type B-quantity sealed sources;
    - procurement of vendor services for the design, development, testing and certification of a new Type B container to support transportation of irradiators, teletherapy heads or sources removed from these devices using remote handling capabilities such as the IAEA portable hot cell facility;
    - expansion of the shielded Type A container inventory for transportation of gamma-emitting sources in activity ranges requiring use of shielding for conformity with transportation requirements;
    - approval of the S300 Type A fissile container for transport of Pu-239 sealed sources internationally;
    - technology transfer of field

  10. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    PubMed

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on the two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste, which form part of decay chains (4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste, and another in which daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source is important for carrying out realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation of the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories. PMID:27337157
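
    For reference, the Bateman solution referred to above can be written, for a chain in which only the first member is initially present and all decay constants are distinct, as shown below. This is the standard textbook form, not the authors' two-component leach flux formulation.

        % Standard Bateman solution for member n of the chain 1 -> 2 -> ... -> n,
        % with only N_1(0) nonzero initially and all decay constants distinct.
        N_n(t) = N_1(0)\,\Bigl(\prod_{i=1}^{n-1}\lambda_i\Bigr)
                 \sum_{i=1}^{n}\frac{e^{-\lambda_i t}}
                 {\prod_{j=1,\, j\neq i}^{n}(\lambda_j-\lambda_i)}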

  12. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at dynamics of water, sediments, water quality, and ecology throughout the whole system from catchment to coast both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task we started a few years ago with the development of a new open source modeling environment DeltaShell that integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models including generic components for the tracking of sediment, water quality, and ecological quantities throughout the hydrological cycle composed of the aforementioned components. The open source approach combined with a modular approach based on open standards, which allow for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  13. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian Elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at Norwegian Meteorological Institute in cooperation with Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead "reader modules" can be written/used to obtain data directly from any original source, including files or through web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, rotation and scaling of vectors and model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search and rescue objects.
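
    A minimal usage sketch, loosely based on the project's public examples, is shown below. The forcing file name is a placeholder, and the reader module and method signatures are assumptions that should be checked against the OpenDrift documentation in the repository linked above.

        # Hedged OpenDrift usage sketch; "currents.nc" is a placeholder file and the
        # calls below are assumptions to verify against the current documentation.
        from datetime import timedelta

        from opendrift.models.oceandrift import OceanDrift
        from opendrift.readers import reader_netCDF_CF_generic

        model = OceanDrift(loglevel=20)                           # basic passive-drift model
        reader = reader_netCDF_CF_generic.Reader("currents.nc")   # hypothetical forcing file
        model.add_reader(reader)

        # Seed 500 elements near an assumed position and run for two days.
        model.seed_elements(lon=4.5, lat=60.0, radius=1000, number=500,
                            time=reader.start_time)
        model.run(duration=timedelta(days=2), outfile="drift.nc")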

  14. RADIOACTIVE WASTE MANAGEMENT IN THE USSR: A REVIEW OF UNCLASSIFIED SOURCES, 1963-1990

    SciTech Connect

    Bradley, D. J.; Schneider, K. J.

    1990-03-01

    year capacity as the first of several modules, was about 30% completed by July 1989. The completion of this plant was subsequently "indefinitely postponed." The initial reprocessing scheme at the Kyshtym site used sodium uranyl acetate precipitation from fuel dissolved in nitric acid solutions. The basic methodology now appears to be based on the conventional PUREX process. Dry reprocessing on a pilot or laboratory scale has been under way in Dimitrovgrad since 1984, and a larger unit is now being built, according to the French CEA. Perhaps significantly, much research is being done on partitioning high-level waste into element fractions. The Soviets appear to have the technology to remove radioactive noble gases released during reprocessing operations; however, there are no indications of its implementation. Millions of curies of liquid low- and intermediate-level wastes have been disposed of by well injection into underground areas where they were supposedly contained by watertight rock strata. Some gaseous wastes were also disposed of by well injection. This practice is not referred to in recent literature and thus may not be widely used today. Rather, it appears that these waste streams are now first treated to reduce volume, and then solidified using bitumen or concrete. These solidified liquid wastes from Soviet nuclear power reactor operations, along with solid wastes, are disposed of in shallow-land burial sites located at most large power reactor stations. In addition, 35 shallow-land burial sites have been alluded to by the Soviets for disposal of industrial, medical, and research low-level wastes as well as ionization sources. Research on tritium-bearing and other gaseous wastes is mentioned, as well as a waste minimization program aimed at reducing the volume of waste streams by 30%. The Soviets have announced that their high-level waste management plan is to 1) store liquid wastes for 3-5 years; 2) incorporate the waste into glass (at a final glass

  15. A new chapter in environmental sensing: The Open-Source Published Environmental Sensing (OPENS) laboratory

    NASA Astrophysics Data System (ADS)

    Selker, J. S.; Roques, C.; Higgins, C. W.; Good, S. P.; Hut, R.; Selker, A.

    2015-12-01

    The confluence of 3-dimensional printing, low-cost solid-state sensors, low-cost, low-power digital controllers (e.g., Arduinos), and open-source publishing (e.g., Github) is poised to transform environmental sensing. The Open-Source Published Environmental Sensing (OPENS) laboratory has launched and is available for all to use. OPENS combines cutting edge technologies and makes them available to the global environmental sensing community. OPENS includes a Maker lab space where people may collaborate in person or virtually via an on-line forum for the publication and discussion of environmental sensing technology (Corvallis, Oregon, USA; please feel free to request a free reservation for space and equipment use). The physical lab houses a test-bed for sensors, as well as a complete classical machine shop, 3-D printers, electronics development benches, and workstations for code development. OPENS will provide a web-based formal publishing framework wherein global students and scientists can publish peer-reviewed (with DOI) novel and evolutionary advancements in environmental sensor systems. This curated and peer-reviewed digital collection will include complete sets of "printable" parts and operating computer code for sensing systems. The physical lab will include all of the machines required to produce these sensing systems. These tools can be addressed in person or virtually, creating a truly global venue for advancement in monitoring earth's environment and agricultural systems. In this talk we will present an example of the design and publication process, using the design and data from the OPENS-Permeameter. The publication includes 3-D printing code, Arduino (or other control/logging platform) operational code, sample data sets, and a full discussion of the design set in the scientific context of previous related devices. Editors for the peer-review process are currently sought - contact John.Selker@Oregonstate.edu or Clement.Roques@Oregonstate.edu.

  16. Source term evaluation for radioactive low-level waste disposal performance assessment

    SciTech Connect

    Cowgill, M.G.; Sullivan, T.M.

    1993-01-01

    Information compiled on the low-level radioactive waste disposed at the three currently operating commercial disposal sites during the period 1987--1989 has been reviewed and processed in order to determine the total activity distribution in terms of waste stream, waste classification and waste form. The review identified deficiencies in the information currently being recorded on shipping manifests, and the development of a uniform manifest is recommended (the NRC is currently developing a rule to establish a uniform manifest). The data from waste disposed during 1989 at one of the sites (Richland, WA) were more detailed than the data available during other years and at other sites, and thus were amenable to a more in-depth treatment. This included determination of the distribution of activity for each radionuclide by waste form, and thus enabled these data to be evaluated in terms of the specific needs for improved modeling of releases from waste packages. From the results, preliminary lists have been prepared of the isotopes which might be the most significant for the development of a source term model.

  17. The Effect of Gamma-ray Detector Energy Resolution on the Ability to Identify Radioactive Sources

    SciTech Connect

    Nelson, K E; Gosnell, T B; Knapp, D A

    2009-03-05

    This report describes the results of an initial study on radiation detector spectral resolution, along with the underlying methodology used. The study was done as part of an ongoing effort in Detection Modeling and Operational Analysis (DMOA) for the DNDO System Architecture Directorate. The study objective was to assess the impact of energy resolution on radionuclide identification capability, measured by the ability to reliably discriminate between spectra associated with 'threats' (defined as fissile materials) and radioactive 'non-threats' that might be present in the normal stream of commerce. Although numerous factors must be considered in deciding which detector technology is appropriate for a specific application, spectral resolution is a critical one for homeland security applications in which a broad range of non-threat sources are present and very low false-alarm rates are required. In this study, we have proposed a metric for quantifying discrimination capability, and have shown how this metric depends on resolution. In future work we will consider other important factors, such as efficiency and volume, and the relative frequency of spectra known to be discrimination challenges in practical applications.

  18. Children's Ideas about Radioactivity and Radiation: sources, modes of travel, uses and dangers.

    ERIC Educational Resources Information Center

    Boyes, Edward; Stanisstreet, Martin

    1994-01-01

    The understanding concerning radioactivity and radiation of pupils ages 11-16 was studied using a closed-form questionnaire with a large cohort of children and interviews with subsets of this group. A majority of children demonstrated confusion about the environmental impacts of radioactivity and radiation. (LZ)

  19. Conceptual Architecture of Building Energy Management Open Source Software (BEMOSS)

    SciTech Connect

    Khamphanchai, Warodom; Saha, Avijit; Rathinavel, Kruthika; Kuzlu, Murat; Pipattanasomporn, Manisa; Rahman, Saifur; Akyol, Bora A.; Haack, Jereme N.

    2014-12-01

    The objective of this paper is to present a conceptual architecture of a Building Energy Management Open Source Software (BEMOSS) platform. The proposed BEMOSS platform is expected to improve sensing and control of equipment in small- and medium-sized buildings, reduce energy consumption and help implement demand response (DR). It aims to offer: scalability, robustness, plug and play, open protocol, interoperability, cost-effectiveness, as well as local and remote monitoring. In this paper, four essential layers of BEMOSS software architecture -- namely User Interface, Application and Data Management, Operating System and Framework, and Connectivity layers -- are presented. A laboratory test bed to demonstrate the functionality of BEMOSS located at the Advanced Research Institute of Virginia Tech is also briefly described.

  20. The Pixhawk Open-Source Computer Vision Framework for MAVs

    NASA Astrophysics Data System (ADS)

    Meier, L.; Tanskanen, P.; Fraundorfer, F.; Pollefeys, M.

    2011-09-01

    Unmanned aerial vehicles (UAV) and micro air vehicles (MAV) are already intensively used in geodetic applications. State of the art autonomous systems are however geared towards the application area in safe and obstacle-free altitudes greater than 30 meters. Applications at lower altitudes still require a human pilot. A new application field will be the reconstruction of structures and buildings, including the facades and roofs, with semi-autonomous MAVs. Ongoing research in the MAV robotics field is focusing on enabling this system class to operate at lower altitudes in proximity to nearby obstacles and humans. PIXHAWK is an open source and open hardware toolkit for this purpose. The quadrotor design is optimized for onboard computer vision and can connect up to four cameras to its onboard computer. The validity of the system design is shown with a fully autonomous capture flight along a building.

  1. Automated MCNP photon source generation for arbitrary configurations of radioactive materials and first-principles calculations of photon detector responses

    SciTech Connect

    Estes, G.P.; Schrandt, R.G.; Kriese, J.T.

    1988-03-01

    A patch to the Los Alamos Monte Carlo code MCNP has been developed that automates the generation of source descriptions for photons from arbitrary mixtures and configurations of radioactive isotopes. Photon branching ratios for decay processes are obtained from national and international data bases and accessed directly from computer files. Code user input is generally confined to readily available information such as density, isotopic weight fractions, atomic numbers, etc. of isotopes and material compositions. The availability of this capability in conjunction with the "generalized source" capability of MCNP Version 3A makes possible the rapid and accurate description of photon sources from complex mixtures and configurations of radioactive materials, resulting in improved radiation transport predictive capabilities. This capability is combined with a first-principles calculation of photon spectrometer response functions for NaI, BGO, and HPGe for Eγ ≲ 1 MeV. 25 refs., 1 fig., 4 tabs.
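
    The patch itself is not reproduced in the abstract; the following fragment is only a hedged sketch of the bookkeeping such a tool automates, combining tabulated decay photon lines, weighted by isotope activity, into a single normalized discrete spectrum of the kind a generalized source description needs. The activities are invented and the line energies and intensities are commonly quoted values, included purely for illustration.

      # Illustrative only: merge decay photon lines from an isotope mixture into
      # one normalized discrete spectrum (energy in MeV, emission probability).
      lines = {  # isotope -> [(energy_MeV, photons_per_decay), ...]; commonly quoted values
          "Co-60":  [(1.1732, 0.9985), (1.3325, 0.9998)],
          "Cs-137": [(0.6617, 0.851)],
      }
      activities = {"Co-60": 2.0e9, "Cs-137": 5.0e9}  # Bq; example mixture, not from the paper

      spectrum = []
      for iso, act in activities.items():
          for energy, photons_per_decay in lines[iso]:
              spectrum.append((energy, act * photons_per_decay))  # photons/s in this line

      total = sum(rate for _, rate in spectrum)
      for energy, rate in sorted(spectrum):
          print(f"{energy:8.4f} MeV   p = {rate / total:.4f}")
      print(f"total emission rate: {total:.3e} photons/s")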

  2. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would service their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components such that each can advance independent of the other

  3. Clarity: an open-source manager for laboratory automation.

    PubMed

    Delaney, Nigel F; Rojas Echenique, José I; Marx, Christopher J

    2013-04-01

    Software to manage automated laboratories, when interfaced with hardware instruments, gives users a way to specify experimental protocols and schedule activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity, a laboratory automation manager that is hardware agnostic, portable, extensible, and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity, demonstrate an example of its implementation for the automated analysis of bacterial growth, and describe how the program can be extended to manage new hardware. Clarity is mature, well documented, actively developed, written in C# for the Common Language Infrastructure, and is free and open-source software. These advantages set Clarity apart from currently available laboratory automation programs. The source code and documentation for Clarity is available at http://code.google.com/p/osla/.

  4. Effects of source rocks, soil features and climate on natural gamma radioactivity in the Crati valley (Calabria, Southern Italy).

    PubMed

    Guagliardi, Ilaria; Rovella, Natalia; Apollaro, Carmine; Bloise, Andrea; De Rosa, Rosanna; Scarciglia, Fabio; Buttafuoco, Gabriele

    2016-05-01

    The study, which represents an innovative scientific strategy to approach the study of natural radioactivity in terms of spatial and temporal variability, was aimed at characterizing the background levels of natural radionuclides in soil and rock in the urban and peri-urban soil of a southern Italy area, and at quantifying their variations due to radionuclide-bearing minerals and soil properties, taking into account the nature and extent of the seasonal influence. Its main novelty is taking into account the effect of climate in controlling natural gamma radioactivity as well as analysing soil radioactivity in terms of soil properties and pedogenetic processes. In different bedrocks and soils, activities of natural radionuclides ((238)U, (232)Th and (40)K) and total radioactivity were measured at 181 locations by means of scintillation γ-ray spectrometry. In addition, selected rock samples were collected and analysed, using a Scanning Electron Microscope (SEM) equipped with an Energy Dispersive Spectrometer (EDS) and X-Ray Powder Diffraction (XRPD), to assess the main sources of radionuclides. The natural-gamma background is intimately related to differing petrologic features of crystalline source rocks and to peculiar pedogenetic features and processes. The radioactivity survey was conducted during two different seasons with marked changes in the main climatic characteristics, namely dry summer and moist winter, to evaluate possible effects of seasonal climatic variations and soil properties on radioactivity measurements. Seasonal variations of radionuclide activities show their peak values in summer. The activities of (238)U, (232)Th and (40)K exhibit a positive correlation with the air temperature and are negatively correlated with precipitation.

  5. Handling of Highly Radioactive Radiation Sources in a Hot Cell Using a Mechanically Driven Cell Crane - 13452

    SciTech Connect

    Klute, Stefan; Huber, Wolfgang-Bruno

    2013-07-01

    In 2010, Siempelkamp Nukleartechnik GmbH was awarded the contract for design and erection of a Hot Cell for handling and storage of highly radioactive radiation sources. This Hot Cell is part of a new hot cell laboratory, constructed for the NHZ (Neues Handhabungszentrum = New Handling Center) of the Nuclear Engineering Seibersdorf GmbH (NES). All radioactive materials arising in Austria are collected in the NHZ, where they are safely conditioned and stored temporarily until their final storage. The main tasks of the NES include, apart from the collection, conditioning and storage of radioactive waste, also the reprocessing and the decontamination of facilities and laboratories originating from 45 years of research and development at the Seibersdorf site as well as the operation of the Hot Cell Laboratory [1]. The new Hot Cell Laboratory inside the NHZ consists of the following room areas: - One hot cell, placed in the center, for remote controlled, radiation protected handling of radioactive materials, including an integrated floor storage for the long-term temporary storage of highly radioactive radiation sources; - An anteroom for the loading and unloading of the hot cell; - One control room for the remote controlling of the hot cell equipment; - One floor storage, placed laterally to the hot cell, for burial, interim storage and removal of fissionable radioactive material in leak-proof packed units in 100 l drums. The specific design activity of the hot cell of 1.85 PBq relating to 1-Me-Radiator including the integrated floor storage influences realization and design of the components used in the cell significantly. (authors)

  6. Effects of source rocks, soil features and climate on natural gamma radioactivity in the Crati valley (Calabria, Southern Italy).

    PubMed

    Guagliardi, Ilaria; Rovella, Natalia; Apollaro, Carmine; Bloise, Andrea; De Rosa, Rosanna; Scarciglia, Fabio; Buttafuoco, Gabriele

    2016-05-01

    The study, which represents an innovative scientific strategy to approach the study of natural radioactivity in terms of spatial and temporal variability, was aimed at characterizing the background levels of natural radionuclides in soil and rock in the urban and peri-urban soil of a southern Italy area, and at quantifying their variations due to radionuclide-bearing minerals and soil properties, taking into account the nature and extent of the seasonal influence. Its main novelty is taking into account the effect of climate in controlling natural gamma radioactivity as well as analysing soil radioactivity in terms of soil properties and pedogenetic processes. In different bedrocks and soils, activities of natural radionuclides ((238)U, (232)Th and (40)K) and total radioactivity were measured at 181 locations by means of scintillation γ-ray spectrometry. In addition, selected rock samples were collected and analysed, using a Scanning Electron Microscope (SEM) equipped with an Energy Dispersive Spectrometer (EDS) and X-Ray Powder Diffraction (XRPD), to assess the main sources of radionuclides. The natural-gamma background is intimately related to differing petrologic features of crystalline source rocks and to peculiar pedogenetic features and processes. The radioactivity survey was conducted during two different seasons with marked changes in the main climatic characteristics, namely dry summer and moist winter, to evaluate possible effects of seasonal climatic variations and soil properties on radioactivity measurements. Seasonal variations of radionuclide activities show their peak values in summer. The activities of (238)U, (232)Th and (40)K exhibit a positive correlation with the air temperature and are negatively correlated with precipitation. PMID:26891362

  7. A Stigmergy Approach for Open Source Software Developer Community Simulation

    SciTech Connect

    Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E; Pullum, Laura L; Treadwell, Jim N

    2009-01-01

    The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we presented a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We used groups of actors who collaborate on OSS projects as our frame of reference and investigated how the choices actors make in contributing their work to the projects determine the global status of the whole OSS project. In our simulation, the forum posts and project codes served as the digital pheromone, and a modified Pierre-Paul Grasse pheromone model is used for computing the developer agents' behavior selection probability.

  8. Leveraging Open Source Technologies to Build Scientific Data Systems

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Mattmann, C. A.; Hart, A.; Hughes, J. S.; Hardman, S.; Law, E.; Kelly, S.

    2011-12-01

    Scientific discovery is largely a collaborative endeavor. From the design and execution of earth and planetary science missions to the evaluation of biomarkers that can identify a particular predisposition to cancer, the scientific community increasingly depends on multi-institutional collaboration as a key enabler of the discovery process. Science data systems, in turn, play a key role in enabling these data-driven collaborative investigations. The utility of these systems can be greatly enhanced by applying many of the same principles governing scientific collaboration to the software development and deployment process. Rather than being built in isolation, scientific data systems must be developed using a collaborative model, both to ensure that they can be run in multi-center deployments and that they will support the full range of varying and evolving needs of the scientific community they target. Open source plays a vital role in enabling this process. By its very nature, open source allows software development to turn software projects into multi-institutional and international data systems by developing the communities around software product lines. At the Jet Propulsion Laboratory (JPL) we have been involved in developing a core software framework called "OODT" to support development of cross-disciplinary science data systems following an open source implementation. This framework has been applied to various areas in earth science including mission science data system development, climate research and data analysis as well as to planetary, lunar, astrophysics and biomedical research. In 2011, OODT became the first top-level project at the Apache Software Foundation (ASF) to be incubated at a NASA center. The experience in incubating and developing the OODT framework has been invaluable in shifting the development of science data systems towards a collaborative model. Rather than developing each system independently, there is substantial collaboration that is

  9. An open-source java platform for automated reaction mapping.

    PubMed

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  10. Open-source, Rapid Reporting of Dementia Evaluations.

    PubMed

    Graves, Rasinio S; Mahnken, Jonathan D; Swerdlow, Russell H; Burns, Jeffrey M; Price, Cathy; Amstein, Brad; Hunt, Suzanne L; Brown, Lexi; Adagarla, Bhargav; Vidoni, Eric D

    2015-01-01

    The National Institutes of Health Alzheimer's Disease Center consortium requires member institutions to build and maintain a longitudinally characterized cohort with a uniform standard data set. Increasingly, centers are employing electronic data capture to acquire data at annual evaluations. In this paper, the University of Kansas Alzheimer's Disease Center reports on an open-source system of electronic data collection and reporting to improve efficiency. This Center capitalizes on the speed, flexibility and accessibility of the system to enhance the evaluation process while rapidly transferring data to the National Alzheimer's Coordinating Center. This framework holds promise for other consortia that regularly use and manage large, standardized datasets. PMID:26779306

  11. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using free and open source platform draws little attention as open source initiatives were focused in secondary or tertiary educations. This study investigates possibilities of ICT-supported learning using open source platform for primary educations. The data of this study is taken…

  12. Poster — Thur Eve — 41: Considerations for Patients with Permanently Implanted Radioactive Sources Requiring Unrelated Surgery

    SciTech Connect

    Basran, P. S; Beckham, WA; Baxter, P

    2014-08-15

    Permanent implantation of sealed radioactive sources is an effective technique for treating cancer. Typically, the radioactive sources are implanted in and near the disease, depositing dose locally over several months. There may be instances where these patients must undergo unrelated surgical procedures when the radioactive material remains active enough to pose risks. This work explores these risks, discusses strategies to mitigate those risks, and describes a case study for a permanent I-125 prostate brachytherapy implant patient who developed colorectal cancer and required surgery 6 months after brachytherapy. The first consideration is identifying the risk from unwarranted radiation to the patient and staff before, during, and after the surgical procedure. The second is identifying the risk the surgical procedure may pose to the efficacy of the brachytherapy implant. Finally, there are considerations for the control of radioactive substances from a regulatory perspective. After these risks are defined, strategies to mitigate those risks are considered. These strategies may include applying the concepts of ALARA, the use of protective equipment and developing a best practice strategy with the operating room team. We summarize this experience with some guidelines: if the surgical procedure is within a short distance (e.g., 5 cm) of the implant; and the surgical intervention may dislodge radioisotopes enough to compromise treatment or introduce radiation safety risks; and the radioisotope has not sufficiently decayed to background levels; and the surgery cannot be postponed, then a detailed analysis of risk is advised.
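
    As a worked illustration of the "sufficiently decayed" criterion in the case study above, the remaining activity follows the half-life law A(t) = A0 · 2^(−t/T½); with the commonly quoted I-125 half-life of about 59.4 days and the roughly six-month delay in the case study, about 12% of the implanted activity remains. A minimal sketch:

      # Worked example of the decay criterion, assuming the commonly quoted
      # I-125 half-life (~59.4 days) and the ~6 month interval from the case study.
      half_life_days = 59.4
      elapsed_days = 183.0

      remaining_fraction = 0.5 ** (elapsed_days / half_life_days)
      print(f"remaining activity: {remaining_fraction:.1%}")  # about 12%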

  13. Understanding How the "Open" of Open Source Software (OSS) Will Improve Global Health Security.

    PubMed

    Hahn, Erin; Blazes, David; Lewis, Sheri

    2016-01-01

    Improving global health security will require bold action in all corners of the world, particularly in developing settings, where poverty often contributes to an increase in emerging infectious diseases. In order to mitigate the impact of emerging pandemic threats, enhanced disease surveillance is needed to improve early detection and rapid response to outbreaks. However, the technology to facilitate this surveillance is often unattainable because of high costs, software and hardware maintenance needs, limited technical competence among public health officials, and internet connectivity challenges experienced in the field. One potential solution is to leverage open source software, a concept that is unfortunately often misunderstood. This article describes the principles and characteristics of open source software and how it may be applied to solve global health security challenges. PMID:26889576

  14. Integrating HCI Specialists into Open Source Software Development Projects

    NASA Astrophysics Data System (ADS)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have also encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  15. Inexpensive Open-Source Data Logging in the Field

    NASA Astrophysics Data System (ADS)

    Wickert, A. D.

    2013-12-01

    I present a general-purpose open-source field-capable data logger, which provides a mechanism to develop dense networks of inexpensive environmental sensors. This data logger was developed as a low-power variant of the Arduino open-source development system, and is named the ALog ("Arduino Logger") BottleLogger (it is slim enough to fit inside a Nalgene water bottle) version 1.0. It features an integrated high-precision real-time clock, SD card slot for high-volume data storage, and integrated power switching. The ALog can interface with sensors via six analog/digital pins, two digital pins, and one digital interrupt pin that can read event-based inputs, such as those from a tipping-bucket rain gauge. We have successfully tested the ALog BottleLogger with ultrasonic rangefinders (for water stage and snow accumulation and melt), temperature sensors, tipping-bucket rain gauges, soil moisture and water potential sensors, resistance-based tools to measure frost heave, and cameras that it triggers based on events. The source code for the ALog, including functions to interface with a range of commercially-available sensors, is provided as an Arduino C++ library with example implementations. All schematics, circuit board layouts, and source code files are open-source and freely available under GNU GPL v3.0 and Creative Commons Attribution-ShareAlike 3.0 Unported licenses. Through this work, we hope to foster a community-driven movement to collect field environmental data on a budget that permits citizen-scientists and researchers from low-income countries to collect the same high-quality data as researchers in wealthy countries. These data can provide information about global change to managers, governments, scientists, and interested citizens worldwide. (Figure caption: Watertight box with ALog BottleLogger data logger on the left and battery pack with 3 D cells on the right. Data can be collected for 3-5 years on one set of batteries.)
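
    A back-of-envelope check of the quoted multi-year battery life, using assumed figures only (a typical alkaline D cell of roughly 12,000 mAh and an average draw of a few hundred microamps for a mostly sleeping logger; these are not ALog specifications):

      # Back-of-envelope battery life; capacity and current are assumptions, not ALog specs.
      capacity_mah = 12000.0     # typical alkaline D cell; cells in series keep the same mAh
      average_current_ma = 0.35  # assumed average draw for a mostly-sleeping logger

      years = (capacity_mah / average_current_ma) / (24 * 365)
      print(f"{years:.1f} years")  # ~3.9 years with these assumptions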

  16. A Basic Positron Emission Tomography System Constructed to Locate a Radioactive Source in a Bi-dimensional Space.

    PubMed

    Montaño-Zetina, Luis Manuel; Villalobos-Mora, Omar

    2016-01-01

    A simple Positron Emission Tomography (PET) prototype has been constructed to fully characterize its basic working principles. The PET prototype was created by coupling plastic scintillator crystals to photomultipliers (PMTs), which are placed at opposing positions to detect the two gamma rays emitted from a radioactive source placed in the geometric center of the PET set-up. The prototype consists of four detectors placed geometrically in a 20 cm diameter circle, with the radioactive source in the center. By moving the radioactive source a few centimeters from the center, the system is able to detect the displacement by measuring the time of flight difference between any two PMTs and, with this information, the system can calculate the virtual position in a graphical interface. In this way, the prototype reproduces the main principles of a PET system. It is capable of determining the real position of the source at intervals of 4 cm along 2 lines of detection in less than 2 min. PMID:26863081
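
    The geometric relation behind the displacement measurement is simple: if the source moves a distance d toward one of two opposing detectors, that gamma's path shortens and the opposite path lengthens by the same amount, so the arrival-time difference is Δt = 2d/c and d = c·Δt/2. A hedged numerical sketch (the figures are illustrative and not taken from the prototype's electronics):

      # Illustrative time-of-flight geometry for two opposing detectors.
      C = 2.998e8  # speed of light, m/s

      def displacement_from_tof(delta_t_seconds):
          """Source displacement (m) along the detector axis from the arrival-time difference."""
          return C * delta_t_seconds / 2.0

      # The 4 cm spacing quoted above corresponds to a ~0.27 ns arrival-time
      # difference, which sets the timing resolution the electronics must resolve.
      print(displacement_from_tof(0.27e-9))  # ~0.040 m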

  17. openPSTD: The open source pseudospectral time-domain method for acoustic propagation

    NASA Astrophysics Data System (ADS)

    Hornikx, Maarten; Krijnen, Thomas; van Harten, Louis

    2016-06-01

    An open source implementation of the Fourier pseudospectral time-domain (PSTD) method for computing the propagation of sound is presented, which is geared towards applications in the built environment. Being a wave-based method, PSTD captures phenomena like diffraction, but maintains efficiency in processing time and memory usage as it allows spatial sampling close to the Nyquist criterion, thus keeping both the required spatial and temporal resolution coarse. In the implementation, the physical geometry is modelled as a composition of rectangular two-dimensional subdomains, which initially restricts the implementation to orthogonal and two-dimensional situations. The strategy of using subdomains divides the problem domain into local subsets, which enables the simulation software to be built according to Object-Oriented Programming best practices and allows room for further computational parallelization. The software is built using the open source components Blender, Numpy and Python, and has been published under an open source license itself as well. An option has been included to accelerate the calculations through a partial implementation of the code on the Graphics Processing Unit (GPU), which increases the throughput by up to fifteen times. The details of the implementation are reported, as well as the accuracy of the code.
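
    The efficiency claim rests on the Fourier pseudospectral evaluation of spatial derivatives: for a band-limited field sampled close to the Nyquist limit, the derivative is computed in the wavenumber domain to machine precision. The snippet below is a generic one-dimensional illustration of that idea, not code from openPSTD:

      # Generic 1D Fourier pseudospectral derivative (not openPSTD code).
      import numpy as np

      n, length = 64, 1.0
      x = np.arange(n) * length / n
      k = 2.0 * np.pi * np.fft.fftfreq(n, d=length / n)     # angular wavenumbers

      p = np.sin(2.0 * np.pi * 3.0 * x)                      # band-limited test field
      dp_dx = np.real(np.fft.ifft(1j * k * np.fft.fft(p)))   # spectral derivative

      exact = 2.0 * np.pi * 3.0 * np.cos(2.0 * np.pi * 3.0 * x)
      print(np.max(np.abs(dp_dx - exact)))                   # ~1e-12, i.e. machine precision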

  18. OpenOrb: Open-source asteroid orbit computation software including statistical ranging

    NASA Astrophysics Data System (ADS)

    Granvik, M.; Virtanen, J.; Oszkiewicz, D.; Muinonen, K.

    2009-01-01

    We are making an open-source asteroid orbit computation software package called OpenOrb publicly available. OpenOrb is built on a well-established Bayesian inversion theory, which means that it is to a large part complementary to orbit-computation packages currently available. In particular, OpenOrb is the first package that contains tools for rigorously estimating the uncertainties resulting from the inverse problem of computing orbital elements using scarce astrometry. In addition to the well-known least-squares method, OpenOrb also contains both Monte-Carlo (MC) and Markov-Chain MC (MCMC; Oszkiewicz et al. [2009]) versions of the statistical ranging method. Ranging allows the user to obtain sampled, non-Gaussian orbital-element probability-density functions and is therefore optimized for cases where the amount of astrometry is scarce or spans a relatively short time interval. Ranging-based methods have successfully been applied to a variety of different problems such as rigorous ephemeris prediction, orbital element distribution studies for transneptunian objects, the computation of invariant collision probabilities between near-Earth objects and the Earth, detection of linkages between astrometric asteroid observations within an apparition as well as between apparitions, and in the rigorous analysis of the impact of orbital arc length and/or astrometric uncertainty on the uncertainty of the resulting orbits. Tools for making ephemeris predictions and for classifying objects based on their orbits are also available in OpenOrb. As an example, we use OpenOrb in the search for candidate retrograde and/or high-inclination objects similar to 2008 KV42 in the known population of transneptunian objects that have an observational time span shorter than 30 days.

  19. Fast, accurate, robust and Open Source Brain Extraction Tool (OSBET)

    NASA Astrophysics Data System (ADS)

    Namias, R.; Donnelly Kehoe, P.; D'Amato, J. P.; Nagel, J.

    2015-12-01

    The removal of non-brain regions in neuroimaging is a critical preprocessing task. Skull-stripping depends on different factors including the noise level in the image, the anatomy of the subject being scanned and the acquisition sequence. For these and other reasons, an ideal brain extraction method should be fast, accurate, user friendly, open-source and knowledge based (to allow for interaction with the algorithm in case the expected outcome is not being obtained), producing stable results and making it possible to automate the process for large datasets. There are already a large number of validated tools to perform this task, but none of them meets all the desired characteristics. In this paper we introduce an open source brain extraction tool (OSBET), composed of four steps that use simple, well-known operations (optimal thresholding, binary morphology, labeling and geometrical analysis) and that aims to assemble all the desired features. We present an experiment comparing OSBET with six other state-of-the-art techniques against a publicly available dataset consisting of 40 T1-weighted 3D scans and their corresponding manually segmented images. OSBET achieved both a short run time and excellent accuracy, obtaining the best Dice coefficient. Further validation should be performed, for instance, in unhealthy populations, to generalize its usage for clinical purposes.
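
    The four named steps map naturally onto standard image-processing primitives. The sketch below is not the OSBET code; it is a hedged illustration of the same recipe (optimal thresholding, binary morphology, labeling, geometrical selection of the largest component) using SciPy and scikit-image on a 3D volume:

      # Hedged sketch of the four-step recipe; this is NOT the OSBET implementation.
      import numpy as np
      from scipy import ndimage
      from skimage.filters import threshold_otsu

      def rough_brain_mask(volume):
          """volume: 3D numpy array from a T1-weighted scan; returns a boolean mask."""
          mask = volume > threshold_otsu(volume)               # 1. optimal thresholding
          mask = ndimage.binary_opening(mask, iterations=2)    # 2. morphology: detach scalp bridges
          mask = ndimage.binary_closing(mask, iterations=2)
          labels, n = ndimage.label(mask)                      # 3. labeling
          if n == 0:
              return mask
          sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
          brain = int(np.argmax(sizes)) + 1                    # 4. geometry: keep largest component
          return ndimage.binary_fill_holes(labels == brain)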

  20. Building integrated business environments: analysing open-source ESB

    NASA Astrophysics Data System (ADS)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy-applications. However, the potential of these technologies remains unknown and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  1. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project. PMID:26668933

  2. Open source projects in software engineering education: a mapping study

    NASA Astrophysics Data System (ADS)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study aims to summarize the literature on how OSP have been used to facilitate students' learning of SE. Method: A systematic mapping study was undertaken by identifying, filtering and classifying primary studies using a predefined strategy. Results: 72 papers were selected and classified. The main results were: (a) most studies focused on comprehensive SE courses, although some dealt with specific areas; (b) the most prevalent approach was the traditional project method; (c) studies' general goals were: learning SE concepts and principles by using OSP, learning open source software or both; (d) most studies tried out ideas in regular courses within the curriculum; (e) in general, students had to work with predefined projects; (f) there was a balance between approaches where instructors had either inside control or no control on the activities performed by students; (g) when learning was assessed, software artefacts, reports and presentations were the main instruments used by teachers, while surveys were widely used for students' self-assessment; (h) most studies were published in the last seven years. Conclusions: The resulting map gives an overview of the existing initiatives in this context and shows gaps where further research can be pursued.

  3. Open Source Quartz Crystal Microbalance with dissipation monitoring

    NASA Astrophysics Data System (ADS)

    Mista, C.; Zalazar, M.; Peñalva, A.; Martina, M.; Reta, J. M.

    2016-04-01

    The dissipation factor, and with it the characterization of the viscoelasticity of deposited films, has become crucial for the study of biomolecular adsorption. Most commercial quartz crystal microbalance (QCM) systems offer this feature, but it has not been incorporated in open source systems. This article describes the design, construction, and simulation of an open source QCM module for measuring the dissipation factor. The module includes two blocks: a switch and an envelope detector. The switch rapidly disrupts the excitation of the crystal and connects the output to the envelope detector, which demodulates the amplitude of the signal. Exponentially damped sinusoidal signals with different time constants were used for simulating viscosity interfaces. The small number of components facilitated a double-faced PCB design with reduced dimensions. The results from simulation show that the system performs well in the range of biomolecular processes; greater relative errors are observed for time constants shorter than 1 µs. In conclusion, a dissipation module has been developed for calculating the dissipation factor using a QCM, which is compact and performs well for use in biomolecular adsorption studies.
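
    For context, a ring-down dissipation measurement of this kind is usually reduced to a number by fitting the decaying envelope A(t) = A0·exp(−t/τ) after the drive is switched off and converting the decay constant with D = 1/(π·f·τ). The snippet below illustrates that post-processing on synthetic data; it is not the authors' firmware, and the frequency and decay constant are invented:

      # Illustrative ring-down post-processing (not the authors' firmware).
      import numpy as np
      from scipy.optimize import curve_fit

      f0 = 5.0e6                      # crystal frequency, Hz (a typical 5 MHz QCM)
      tau_true = 4.0e-6               # decay constant of the synthetic test signal, s
      t = np.linspace(0.0, 20e-6, 2000)
      envelope = np.exp(-t / tau_true) + 0.01 * np.random.randn(t.size)

      model = lambda t, a0, tau: a0 * np.exp(-t / tau)
      (a0_fit, tau_fit), _ = curve_fit(model, t, envelope, p0=(1.0, 1e-6))

      dissipation = 1.0 / (np.pi * f0 * tau_fit)
      print(f"tau = {tau_fit:.2e} s  ->  D = {dissipation:.2e}")  # ~1.6e-2 here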

  4. Open Source Software Reuse in the Airborne Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Khudikyan, S. E.; Hart, A. F.; Hardman, S.; Freeborn, D.; Davoodi, F.; Resneck, G.; Mattmann, C. A.; Crichton, D. J.

    2012-12-01

    Earth science airborne missions play an important role in helping humans understand our climate. A challenge for airborne campaigns in contrast to larger NASA missions is that their relatively modest budgets do not permit the ground-up development of data management tools. These smaller missions generally consist of scientists whose primary focus is on the algorithmic and scientific aspects of the mission, which often leaves data management software and systems to be addressed as an afterthought. The Airborne Cloud Computing Environment (ACCE), developed by the Jet Propulsion Laboratory (JPL) to support Earth Science Airborne Program, is a reusable, multi-mission data system environment for NASA airborne missions. ACCE provides missions with a cloud-enabled platform for managing their data. The platform consists of a comprehensive set of robust data management capabilities that cover everything from data ingestion and archiving, to algorithmic processing, and to data delivery. Missions interact with this system programmatically as well as via browser-based user interfaces. The core components of ACCE are largely based on Apache Object Oriented Data Technology (OODT), an open source information integration framework at the Apache Software Foundation (ASF). Apache OODT is designed around a component-based architecture that allows for selective combination of components to create highly configurable data management systems. The diverse and growing community that currently contributes to Apache OODT fosters on-going growth and maturation of the software. ACCE's key objective is to reduce cost and risks associated with developing data management systems for airborne missions. Software reuse plays a prominent role in mitigating these problems. By providing a reusable platform based on open source software, ACCE enables airborne missions to allocate more resources to their scientific goals, thereby opening the doors to increased scientific discovery.

  5. Comparative Analysis Study of Open Source GIS in Malaysia

    NASA Astrophysics Data System (ADS)

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Khuizham Abd Halim, Mohd

    2014-06-01

    Open source may appear to be a major prospective change, capable of delivering value in various industries and offering a competitive means for developing countries. The main purpose of this research is to discover the degree of adoption of Open Source Software (OSS) connected with Geographic Information System (GIS) applications within Malaysia, and whether low adoption stems from inadequate awareness of open source concepts or from technical deficiencies in the open source tools. The research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate awareness and acceptance levels, based on comparative feedback regarding OSS and commercial GIS. The survey was conducted among three groups of respondents: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed during the analysis to provide a measurable and descriptive signal for the final result. The second stage involved an interview session with a major organization that operates open source web GIS, the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The aim of this preliminary study was to understand the viewpoints of different groups of people on open source, and whether their insufficient awareness of open source concepts may be a significant cause of the low level of adoption of open source options.

  6. Radioactive waste management in the USSR: A review of unclassified sources. Volume 2

    SciTech Connect

    Bradley, D.J.

    1991-03-01

    The Soviet Union does not currently have an overall radioactive waste management program or national laws that define objectives, procedures, and standards, although such a law is being developed, according to the Soviets. Occupational health and safety does not appear to receive major attention as it does in Western nations. In addition, construction practices that would be considered marginal in Western facilities show up in Soviet nuclear power and waste management operations. The issues involved with radioactive waste management and environmental restoration are being investigated at several large Soviet institutes; however, there is little apparent interdisciplinary integration between them, or interaction with the USSR Academy of Sciences. It is expected that a consensus on technical solutions will be achieved, but it may be slow in coming, especially for final disposal of high-level radioactive wastes and environmental restoration of contaminated areas. Meanwhile, many treatment, solidification, and disposal options for radioactive waste management are being investigated by the Soviets.

  7. Radioactive waste management in the USSR: A review of unclassified sources

    SciTech Connect

    Bradley, D.J.

    1991-03-01

    The Soviet Union does not currently have an overall radioactive waste management program or national laws that define objectives, procedures, and standards, although such a law is being developed, according to the Soviets. Occupational health and safety does not appear to receive major attention as it does in Western nations. In addition, construction practices that would be considered marginal in Western facilities show up in Soviet nuclear power and waste management operations. The issues involved with radioactive waste management and environmental restoration are being investigated at several large Soviet institutes; however, there is little apparent interdisciplinary integration between them, or interaction with the USSR Academy of Sciences. It is expected that a consensus on technical solutions will be achieved, but it may be slow in coming, especially for final disposal of high-level radioactive wastes and environmental restoration of contaminated areas. Meanwhile, many treatment, solidification, and disposal options for radioactive waste management are being investigated by the Soviets.

  8. Choosing Open Source ERP Systems: What Reasons Are There For Doing So?

    NASA Astrophysics Data System (ADS)

    Johansson, Björn; Sudzina, Frantisek

    Enterprise resource planning (ERP) systems attract considerable attention, and so does open source software. The question is then whether, and if so when, open source ERP systems will take off. The paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria in Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building up a foundation for the basic question: what are the reasons for an organization to adopt open source ERP systems?

  9. Combining Open-Source Packages for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht; Grieger, Björn; Völk, Stefan

    2015-04-01

    The science planning of the ESA Rosetta mission has presented challenges which were addressed by combining various open-source software packages, such as the SPICE toolkit, the Python language and the Web graphics library three.js. The challenge was to compute certain parameters from a pool of trajectories and (possible) attitudes to describe the behaviour of the spacecraft. To be able to do this declaratively and efficiently, a C library was implemented that allows the SPICE toolkit to be interfaced for geometrical computations from the Python language and to process as much data as possible during one subroutine call. To minimise the lines of code one has to write, special care was taken to ensure that the bindings were idiomatic and thus integrate well into the Python language and ecosystem. When done well, this very much simplifies the structure of the code and facilitates the testing for correctness by automatic test suites and visual inspections. For rapid visualisation and confirmation of correctness of results, the geometries were visualised with the three.js library, a popular Javascript library for displaying three-dimensional graphics in a Web browser. Programmatically, this was achieved by generating data files from SPICE sources that were included into templated HTML and displayed by a browser, thus making them easily accessible to interested parties at large. As feedback came and new ideas were to be explored, the authors benefited greatly from the design of the Python-to-SPICE library which allowed the expression of algorithms to be concise and easier to communicate. In summary, by combining several well-established open-source tools, we were able to put together a flexible computation and visualisation environment that helped communicate and build confidence in planning ideas.
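
    The custom Python-to-SPICE C library described above is not shown in the abstract; as a point of comparison, the same class of geometry query can be expressed through the community spiceypy bindings, as in the hedged sketch below. The kernel file names are placeholders and must be replaced with the actual NAIF kernels for the mission.

      # Comparable geometry query via the community spiceypy bindings (the paper
      # used a custom C bridge instead). Kernel file names are placeholders.
      import spiceypy as spice

      spice.furnsh("naif0012.tls")            # leapseconds kernel (placeholder name)
      spice.furnsh("rosetta_trajectory.bsp")  # spacecraft SPK kernel (placeholder name)

      et = spice.str2et("2014-08-06T00:00:00")  # UTC string -> ephemeris time
      # Light-time-corrected position of the spacecraft relative to the Sun, J2000 frame.
      position, light_time = spice.spkpos("ROSETTA", et, "J2000", "LT+S", "SUN")
      print(position, light_time)

      spice.kclear()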

  10. GRASS GIS: The first Open Source Temporal GIS

    NASA Astrophysics Data System (ADS)

    Gebbert, Sören; Leppelt, Thomas

    2015-04-01

    GRASS GIS is a full featured, general purpose Open Source geographic information system (GIS) with raster, 3D raster and vector processing support[1]. Recently, time was introduced as a new dimension that transformed GRASS GIS into the first Open Source temporal GIS with comprehensive spatio-temporal analysis, processing and visualization capabilities[2]. New spatio-temporal data types were introduced in GRASS GIS version 7, to manage raster, 3D raster and vector time series. These new data types are called space time datasets. They are designed to efficiently handle hundreds of thousands of time stamped raster, 3D raster and vector map layers of any size. Time stamps can be defined as time intervals or time instances in Gregorian calendar time or relative time. Space time datasets are simplifying the processing and analysis of large time series in GRASS GIS, since these new data types are used as input and output parameter in temporal modules. The handling of space time datasets is therefore equal to the handling of raster, 3D raster and vector map layers in GRASS GIS. A new dedicated Python library, the GRASS GIS Temporal Framework, was designed to implement the spatio-temporal data types and their management. The framework provides the functionality to efficiently handle hundreds of thousands of time stamped map layers and their spatio-temporal topological relations. The framework supports reasoning based on the temporal granularity of space time datasets as well as their temporal topology. It was designed in conjunction with the PyGRASS [3] library to support parallel processing of large datasets, that has a long tradition in GRASS GIS [4,5]. We will present a subset of more than 40 temporal modules that were implemented based on the GRASS GIS Temporal Framework, PyGRASS and the GRASS GIS Python scripting library. These modules provide a comprehensive temporal GIS tool set. The functionality range from space time dataset and time stamped map layer management
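
    As a hedged illustration of how such space time datasets are typically created and filled from a script (assuming the standard t.create and t.register module interfaces and an existing GRASS GIS 7 session; the map names are invented):

      # Hedged sketch: build a space time raster dataset from monthly precipitation
      # maps inside a running GRASS GIS 7 session. Map names are invented.
      import grass.script as gs

      gs.run_command("t.create", type="strds", temporaltype="absolute",
                     output="precip_monthly", title="Monthly precipitation",
                     description="Example space time raster dataset")

      gs.run_command("t.register", input="precip_monthly",
                     maps="precip_2014_01,precip_2014_02,precip_2014_03",
                     start="2014-01-01", increment="1 month", flags="i")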

  11. Mössbauer and Raman spectroscopy characterization of concretes used in the conditioning of spent radioactive sources

    NASA Astrophysics Data System (ADS)

    Monroy-Guzman, F.; González-Neri, M.; González-Díaz, R. C.; Ortíz-Arcivar, G.; Corona-Pérez, I. J.; Nava, N.; Cabral-Prieto, A.; Escobar-Alarcón, L.

    2015-06-01

    Spent radioactive sources are considered a type of radioactive waste which must be stored properly. These sources are usually conditioned in concrete that functions as a shield and physical barrier to prevent the potential migration of radionuclides, and the concrete must have suitable properties: mechanical, thermal and irradiation resistance. Concretes used in the conditioning of spent radioactive sources in Mexico were tested, preparing concrete test specimens with Portland cement CPC 30RS EXTRA CEMEX and aggregates, subjecting them to compression strength, γ-ray-irradiation and thermal resistance assays, and subsequently analyzing them by Mössbauer and Raman Spectroscopies as well as by Scanning Electron Microscopy, in order to correlate the radiation and temperature effects on the compressive strengths, the oxidation states of iron and the structural features of the concrete. Iron was found in the concrete as Fe2+ and Fe3+ in the tetrahedral (T) and two octahedral (O1, O2) positions. Radiolysis of water causes the dehydration (200-600 kGy) and rehydration (1000-10000 kGy) of calcium silicate hydrates (C-S-H) and ferric hydrate phases in the concretes, as well as structural distortion around the iron sites. The compressive strength of the concretes is not significantly affected by γ-radiation or heat.

  12. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  13. An Extensible Open-Source Compiler Infrastructure for Testing

    SciTech Connect

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  14. Improving Data Catalogs with Free and Open Source Software

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; Hankin, S.; O'Brien, K.

    2013-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to both crawl, analyze and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). We'll also demonstrate how we are

  15. Flood hazard mapping using open source hydrological tools

    NASA Astrophysics Data System (ADS)

    Tollenaar, Daniel; Wensveen, Lex; Winsemius, Hessel; Schellekens, Jaap

    2014-05-01

    Commonly, flood hazard maps are produced by building detailed hydrological and hydraulic models. These models are forced and parameterized by locally available, high resolution and preferably high quality data. The models use a high spatio-temporal resolution, resulting in large computational effort. Also, many hydraulic packages that solve the 1D (canal) and 2D (overland) shallow water equations are neither freeware nor open source. In this contribution, we evaluate whether simplified open source data and models can be used for a rapid flood hazard assessment and to highlight areas where more detail may be required. The validity of this approach is tested by using four combinations of open-source tools: (1) a global hydrological model (PCR-GLOBWB, Van Beek and Bierkens, 2009) with a static inundation routine (GLOFRIS, Winsemius et al. 2013); (2) a global hydrological model with a dynamic inundation model (Subgrid, Stelling, 2012); (3) a local hydrological model (WFLOW) with a static inundation routine; and (4) a local hydrological model with a dynamic inundation model. The applicability of the tools is assessed on (1) accuracy in reproducing the phenomenon, (2) time for model setup and (3) computational time. The performance is tested in a case study in the Rio Mamoré, one of the tributaries of the Amazon River (230,000 km2). References: Stelling, G.S.: Quadtree flood simulations with sub-grid digital elevation models, Proceedings of the ICE - Water Management, Volume 165, Issue 10, 01 November 2012, pages 567-580 Winsemius, H. C., Van Beek, L. P. H., Jongman, B., Ward, P. J., and Bouwman, A.: A framework for global river flood risk assessments, Hydrol. Earth Syst. Sci. Discuss., 9, 9611-9659, doi:10.5194/hessd-9-9611-2012, 2012 Van Beek, L. P. H. and Bierkens, M. F. P.: The global hydrological model PCR-GLOBWB: conceptualization, parameterization and verification, Dept. of Physical Geography, Utrecht University, Utrecht, available at: http
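
    As a much-simplified illustration of a static inundation estimate (the GLOFRIS and subgrid routines cited above are far more sophisticated), a bathtub-style comparison of a planar water level against a DEM might look like the following sketch; all values are synthetic.

      # "Bathtub" inundation sketch on a synthetic DEM: flood depth is the
      # planar water level minus the terrain elevation, clipped at zero.
      import numpy as np

      dem = np.array([[3.0, 2.5, 2.0],
                      [2.8, 1.9, 1.5],
                      [2.2, 1.4, 0.9]])       # elevations in metres (synthetic)
      water_level = 2.0                        # planar water level in metres

      depth = np.clip(water_level - dem, 0.0, None)     # inundation depth per cell
      flooded_fraction = np.count_nonzero(depth) / depth.size

      print(depth)
      print(f"flooded fraction of cells: {flooded_fraction:.2f}")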

  16. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data Assimilation techniques are essential elements in state-of-the-art development of models and their optimization with data in the field of groundwater, surface water and soil systems. They are essential tools in calibration of complex modelling systems and improvement of model forecasts. OpenDA is a new, generic open-source data assimilation environment for application to a choice of physical process models, applied to case dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007] and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp.485-499. COSTA, a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood
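
    A hedged sketch of the model/observation/algorithm separation that OpenDA formalizes through its interfaces: the assimilation step only sees a narrow state interface, not the model internals. The class and method names below are invented for illustration and are not OpenDA's actual (Java) API.

      # Illustrative model/observation/algorithm split (names invented; this is
      # not OpenDA's API): the analysis step only uses get_state/set_state.
      import numpy as np

      class ToyModel:
          """A trivial time-stepping model with a one-element state vector."""
          def __init__(self, x0):
              self.x = np.array([x0], dtype=float)
          def step(self):
              self.x = 0.95 * self.x + 1.0          # toy dynamics
          def get_state(self):
              return self.x.copy()
          def set_state(self, x):
              self.x = np.asarray(x, dtype=float)

      def nudge(model, observation, gain=0.5):
          """Minimal 'analysis' step: pull the state toward the observation."""
          x = model.get_state()
          model.set_state(x + gain * (observation - x))

      model = ToyModel(x0=0.0)
      for obs in [5.0, 8.0, 12.0]:                  # synthetic observations
          model.step()
          nudge(model, obs)
          print(model.get_state())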

  17. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    SciTech Connect

    Icenhour, A.S.; Tharp, M.L.

    1996-08-01

    The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e. radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been effected since the codes were originally developed.

  18. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support analysis of the HRECG signal in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation demonstrated the system's efficiency.
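
    For orientation, HL7 v2 messages are pipe-delimited fields within carriage-return-separated segments. The sketch below parses a synthetic two-segment message to pull a patient name; it is not the CEHR software described in this record, and real HL7 handling (escapes, repetitions, versions) needs a proper library.

      # Synthetic, minimal HL7 v2-style message; not the CEHR code.
      RAW = ("MSH|^~\\&|DIGICARDIAC|LAB|EHR|HOSP|202401010930||ORU^R01|0001|P|2.5\r"
             "PID|1||12345^^^HOSP||DOE^JOHN")

      def parse_segments(message):
          """Split an HL7 v2 message into {segment_id: [fields]}."""
          segments = {}
          for line in filter(None, message.split("\r")):
              fields = line.split("|")
              segments[fields[0]] = fields
          return segments

      msg = parse_segments(RAW)
      family, given = msg["PID"][5].split("^")      # PID-5: family^given name
      print(family, given)                          # DOE JOHN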

  19. Open Source GIS Connectors to NASA GES DISC Satellite Data

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Pham, Long; Yang, Wenli

    2014-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) houses a suite of high spatiotemporal resolution GIS data including satellite-derived and modeled precipitation, air quality, and land surface parameter data. The data are valuable to various GIS research and applications at regional, continental, and global scales. On the other hand, many GIS users, especially those from the ArcGIS community, have difficulties in obtaining, importing, and using our data due to factors such as the variety of data products, the complexity of satellite remote sensing data, and the data encoding formats. We introduce a simple open source ArcGIS data connector that significantly simplifies the access and use of GES DISC data in ArcGIS.
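
    A minimal sketch of the kind of programmatic access such a connector simplifies: opening a gridded dataset over OPeNDAP with the netCDF4-python library and reading only a subset. The URL and variable name are placeholders, and real GES DISC endpoints generally require Earthdata Login credentials.

      # Remote subset access over OPeNDAP with netCDF4-python (placeholders only).
      from netCDF4 import Dataset

      OPENDAP_URL = "https://example.nasa.gov/opendap/precip_example.nc4"   # placeholder

      def read_subset(url, var_name, lat_slice, lon_slice):
          """Open a remote dataset and pull a lat/lon subset of one variable."""
          with Dataset(url) as ds:
              return ds.variables[var_name][lat_slice, lon_slice]   # subset only

      # data = read_subset(OPENDAP_URL, "precipitation",
      #                    slice(100, 120), slice(200, 240))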

  20. Open-source products for a lighting experiment device.

    PubMed

    Gildea, Kevin M; Milburn, Nelda

    2014-12-01

    The capabilities of open-source software and microcontrollers were used to construct a device for controlled lighting experiments. The device was designed to ascertain whether individuals with certain color vision deficiencies were able to discriminate between the red and white lights in fielded systems on the basis of luminous intensity. The device provided the ability to control the timing and duration of light-emitting diode (LED) and incandescent light stimulus presentations, to present the experimental sequence and verbal instructions automatically, to adjust LED and incandescent luminous intensity, and to display LED and incandescent lights with various spectral emissions. The lighting device could easily be adapted for experiments involving flashing or timed presentations of colored lights, or the components could be expanded to study areas such as threshold light perception and visual alerting systems.

  1. Introducing djatoka: a reuse friendly, open source JPEG image server

    SciTech Connect

    Chute, Ryan M; Van De Sompel, Herbert

    2008-01-01

    The ISO-standardized JPEG 2000 image format has started to attract significant attention. Support for the format is emerging in major consumer applications, and the cultural heritage community seriously considers it a viable format for digital preservation. So far, only commercial image servers with JPEG 2000 support have been available. They come with significant license fees and typically provide the customers with limited extensibility capabilities. Here, we introduce djatoka, an open source JPEG 2000 image server with an attractive basic feature set, and extensibility under control of the community of implementers. We describe djatoka, and point at demonstrations that feature digitized images of marvelous historical manuscripts from the collections of the British Library and the University of Ghent. We also call upon the community to engage in further development of djatoka.

  2. IP address management: augmenting Sandia's capabilities through open source tools.

    SciTech Connect

    Nayar, R. Daniel

    2005-08-01

    Internet Protocol (IP) address management is a growing concern at Sandia National Laboratories (SNL) and in the networking community as a whole. The current state of the available IP addresses indicates that they are nearly exhausted. Currently SNL doesn't have the justification to obtain more IP address space from the Internet Assigned Numbers Authority (IANA); a local entity must therefore manage and allocate IP assignments efficiently. Ongoing efforts at Sandia have taken the form of a multifunctional database application known as the Network Information System (NWIS). NWIS is a database responsible for a multitude of network administrative services, including IP address management. This study explores the feasibility of augmenting NWIS's IP management capabilities utilizing open source tools. Modifications of existing capabilities to better allocate available IP address space are studied.
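
    A minimal sketch of the subnet bookkeeping an IP management database performs, using only the Python standard library; it is not SNL's NWIS, the prefix is the RFC 5737 documentation range, and the hostnames are invented.

      # Minimal subnet allocation bookkeeping (standard library only).
      import ipaddress

      class SubnetPool:
          """Hand out host addresses from a subnet and track what is assigned."""
          def __init__(self, cidr):
              self.network = ipaddress.ip_network(cidr)
              self._free = iter(self.network.hosts())
              self.assigned = {}                    # hostname -> address

          def allocate(self, hostname):
              address = next(self._free)            # StopIteration when exhausted
              self.assigned[hostname] = address
              return address

          def utilization(self):
              usable = max(self.network.num_addresses - 2, 1)
              return len(self.assigned) / usable

      pool = SubnetPool("192.0.2.0/28")
      print(pool.allocate("host-a"))                # 192.0.2.1
      print(pool.allocate("host-b"))                # 192.0.2.2
      print(f"utilization: {pool.utilization():.0%}")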

  3. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  4. Performance testing open source products for the TMT event service

    NASA Astrophysics Data System (ADS)

    Gillies, K.; Bhate, Yogesh

    2014-07-01

    The software system for TMT is a distributed system with many components on many computers. Each component integrates with the overall system using a set of software services. The Event Service is a publish-subscribe message system that allows the distribution of demands and other events. The performance requirements for the Event Service are demanding with a goal of over 60 thousand events/second. This service is critical to the success of the TMT software architecture; therefore, a project was started to survey the open source and commercial market for viable software products. A trade study led to the selection of five products for thorough testing using a specially constructed computer/network configuration and test suite. The best performing product was chosen as the basis of a prototype Event Service implementation. This paper describes the process and performance tests conducted by Persistent Systems that led to the selection of the product for the prototype Event Service.
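
    To illustrate the publish-subscribe pattern being benchmarked (not the TMT Event Service or any of the evaluated products), a toy in-process event bus with a crude throughput check might look like the following; an in-process loop is of course not comparable to a distributed 60 thousand events/second requirement.

      # Toy in-process publish-subscribe bus with a crude throughput check.
      import time
      from collections import defaultdict

      class EventBus:
          def __init__(self):
              self.subscribers = defaultdict(list)          # topic -> callbacks

          def subscribe(self, topic, callback):
              self.subscribers[topic].append(callback)

          def publish(self, topic, event):
              for callback in self.subscribers[topic]:
                  callback(event)

      bus = EventBus()
      received = []
      bus.subscribe("demands", received.append)

      n = 100_000
      start = time.perf_counter()
      for i in range(n):
          bus.publish("demands", {"seq": i})
      elapsed = time.perf_counter() - start
      print(f"{n / elapsed:,.0f} events/s (in-process, no network)")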

  5. An open source mobile platform for psychophysiological self tracking.

    PubMed

    Gaggioli, Andrea; Cipresso, Pietro; Serino, Silvia; Pioggia, Giovanni; Tartarisco, Gennaro; Baldus, Giovanni; Corda, Daniele; Riva, Giuseppe

    2012-01-01

    Self tracking is a recent trend in e-health that refers to the collection, elaboration and visualization of personal health data through ubiquitous computing tools such as mobile devices and wearable sensors. Here, we describe the design of a mobile self-tracking platform that has been specifically designed for clinical and research applications in the field of mental health. The smartphone-based application allows collecting a) self-reported feelings and activities from pre-programmed questionnaires; b) electrocardiographic (ECG) data from a wireless sensor platform worn by the user; c) movement activity information obtained from a tri-axis accelerometer embedded in the wearable platform. Physiological signals are further processed by the application and stored on the smartphone's memory. The mobile data collection platform is free and released under an open source licence to allow wider adoption by the research community (download at: http://sourceforge.net/projects/psychlog/).

  6. Development of parallel DEM for the open source code MFIX

    SciTech Connect

    Gopalakrishnan, Pradeep; Tafti, Danesh

    2013-02-01

    The paper presents the development of a parallel Discrete Element Method (DEM) solver for the open source code, Multiphase Flow with Interphase eXchange (MFIX) based on the domain decomposition method. The performance of the code was evaluated by simulating a bubbling fluidized bed with 2.5 million particles. The DEM solver shows strong scalability up to 256 processors with an efficiency of 81%. Further, to analyze weak scaling, the static height of the fluidized bed was increased to hold 5 and 10 million particles. The results show that global communication cost increases with problem size while the computational cost remains constant. Further, the effects of static bed height on the bubble hydrodynamics and mixing characteristics are analyzed.
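
    The 81% figure follows from the usual strong-scaling arithmetic; in the sketch below the single-processor runtime is an arbitrary illustrative number, and only the formula matters.

      # Strong-scaling arithmetic behind an efficiency figure such as 81% on
      # 256 processors.
      def strong_scaling_efficiency(t_serial, t_parallel, n_procs):
          """Efficiency = (t_serial / t_parallel) / n_procs."""
          return (t_serial / t_parallel) / n_procs

      t1 = 1000.0                      # hypothetical 1-processor runtime
      t256 = t1 / (0.81 * 256)         # runtime consistent with 81% efficiency

      print(f"speedup    : {t1 / t256:.0f}x")
      print(f"efficiency : {strong_scaling_efficiency(t1, t256, 256):.0%}")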

  7. Management of Astronomical Software Projects with Open Source Tools

    NASA Astrophysics Data System (ADS)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: a build and automated-testing system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating systems; version control and change management; an enhanced wiki and issue tracking system for online documentation and reporting; and groupware tools such as a blog, discussions, and a calendar. Starting with the Linc-Nirvana instrument, we looked for a new project and configuration management tool for developing astronomical software. After evaluating various systems of this kind, we are satisfied with the selection we are now using. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA are now using it as well.

  8. Dentocase - open-source education management system in dentistry.

    PubMed

    Peroz, I; Seidel, O; Böning, K; Bösel, C; Schütte, U

    2004-04-01

    Since 2001, an interdisciplinary project on multimedia education in medicine has been sponsored by the Federal Ministry of Education and Research at the Charité. One part of the project is on dentistry. In the light of the results of a survey of dental students, an Internet-based education management system was created using open-source back-end systems. It supports four didactic levels for editing documentation of patient treatments. Each level corresponds to the learning abilities of the students. The patient documentation is organized to simulate the working methods of a physician or dentist. The system was tested for the first time by students in the summer semester of 2003 and has been used since the winter semester of 2003 as part of the curriculum. PMID:15516095

  9. Open Source Dataturbine (OSDT) Android Sensorpod in Environmental Observing Systems

    NASA Astrophysics Data System (ADS)

    Fountain, T. R.; Shin, P.; Tilak, S.; Trinh, T.; Smith, J.; Kram, S.

    2014-12-01

    The OSDT Android SensorPod is a custom-designed mobile computing platform for assembling wireless sensor networks for environmental monitoring applications. Funded by an award from the Gordon and Betty Moore Foundation, the OSDT SensorPod represents a significant technological advance in the application of mobile and cloud computing technologies to near-real-time applications in environmental science, natural resources management, and disaster response and recovery. It provides a modular architecture based on open standards and open-source software that allows system developers to align their projects with industry best practices and technology trends, while avoiding commercial vendor lock-in to expensive proprietary software and hardware systems. The integration of mobile and cloud-computing infrastructure represents a disruptive technology in the field of environmental science, since basic assumptions about technology requirements are now open to revision, e.g., the roles of special purpose data loggers and dedicated site infrastructure. The OSDT Android SensorPod was designed with these considerations in mind, and the resulting system exhibits the following characteristics - it is flexible, efficient and robust. The system was developed and tested in the three science applications: 1) a fresh water limnology deployment in Wisconsin, 2) a near coastal marine science deployment at the UCSD Scripps Pier, and 3) a terrestrial ecological deployment in the mountains of Taiwan. As part of a public education and outreach effort, a Facebook page with daily ocean pH measurements from the UCSD Scripps pier was developed. Wireless sensor networks and the virtualization of data and network services is the future of environmental science infrastructure. The OSDT Android SensorPod was designed and developed to harness these new technology developments for environmental monitoring applications.

  10. JSim, an open-source modeling system for data analysis.

    PubMed

    Butterworth, Erik; Jardine, Bartholomew E; Raymond, Gary M; Neal, Maxwell L; Bassingthwaighte, James B

    2013-01-01

    JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses Mathematical Modeling Language (MML) a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis for research.   For high throughput applications, JSim can be run as a batch job.  JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it

  12. JSim, an open-source modeling system for data analysis

    PubMed Central

    Bassingthwaighte, James B.

    2013-01-01

    JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses Mathematical Modeling Language (MML) a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis for research.   For high throughput applications, JSim can be run as a batch job.  JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it

  13. The Future of ECHO: Evaluating Open Source Possibilities

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.

    2012-12-01

    NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests by non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Sourcing models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team along with the impacts those models are expected to have on the project. We discuss: - Addressing complex deployment or setup issues for potential users - Models of vetting code contributions - Balancing external (public) user requests versus our primary partners - Preparing project code for public release, including navigating licensing issues related to leveraged libraries - Dealing with non-free project dependencies such as commercial databases - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc. - Ongoing support for the released code including increased testing demands, bug fixes, security fixes, and new features.

  14. When Free Isn't Free: The Realities of Running Open Source in School

    ERIC Educational Resources Information Center

    Derringer, Pam

    2009-01-01

    Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…

  15. HOS-ocean: Open-source solver for nonlinear waves in open ocean based on High-Order Spectral method

    NASA Astrophysics Data System (ADS)

    Ducrozet, Guillaume; Bonnefoy, Félicien; Le Touzé, David; Ferrant, Pierre

    2016-06-01

    HOS-ocean is an efficient High-Order Spectral code developed to solve the deterministic propagation of nonlinear wavefields in the open ocean. HOS-ocean is released as open source, developed and distributed under the terms of the GNU General Public License (GPLv3). Along with the source code, documentation in wiki format is available, which makes compilation and execution of the source files straightforward. The code has been shown to be accurate and efficient.

  16. A low-cost, open-source, wireless electrophysiology system.

    PubMed

    Ghomashchi, A; Zheng, Z; Majaj, N; Trumpis, M; Kiorpes, L; Viventi, J

    2014-01-01

    Many experiments in neuroscience require or would benefit tremendously from a wireless neural recording system. However, commercially available wireless systems are expensive, have moderate to high noise and are often not customizable. Academic wireless systems present impressive capabilities, but are not available for other labs to use. To overcome these limitations, we have developed an ultra-low-noise, 8-channel wireless electrophysiological data acquisition system using standard, commercially available components. The system is capable of recording many types of neurological signals, including EEG, ECoG, LFP and unit activity. With a diameter of just 25 mm and height of 9 mm, including a CR2032 Lithium coin cell battery, it is designed to fit into a small recording chamber while minimizing the overall implant height (Fig. 1 and 3). Using widely available parts we were able to keep the material cost of our system under $100. The complete design, including schematic, PCB layout, bill of materials and source code, will be released through an open source license, allowing other labs to modify the design to fit their needs. We have also developed a driver to acquire data using the BCI2000 software system. Feedback from the community will allow us to improve the design and create a more useful neuroscience research tool. PMID:25570656

  17. A Small Radioactive Source in a Large Media - Localization by Multi-detector Measurement. The Case of a Lung Counter

    SciTech Connect

    Alfassi, Z. B.; Pelled, O.; German, U.

    2008-08-14

    Considerable errors can be induced in the determination of radioactive contamination in the lungs if the distribution is not homogeneous, as is assumed in the calibration. Modern lung counter systems use several detectors, and the count-rate ratios of the detectors can be used to localize the radioactive contamination, enabling the use of correction algorithms. This greatly reduces the errors in the determination of the activity. Further reduction of the errors can be obtained by simultaneous analysis of several gamma lines (if several energies are emitted by the radioisotope) and by optimizing the number and location of the detectors. This presentation deals with the case of a point source of natural uranium in human lungs.
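
    A toy one-dimensional illustration of localizing a point source from the count-rate ratio of two detectors, assuming a pure inverse-square response with no attenuation or scattering; a real lung counter relies on measured detector efficiencies and phantom calibrations, so every number below is purely illustrative.

      # Toy 1-D localization from the count-rate ratio of two detectors.
      import numpy as np

      d = 30.0                                     # detector separation, cm

      def count_ratio(x):
          """Detector-1 to detector-2 count-rate ratio for a source at x."""
          r1, r2 = x, d - x                        # source-to-detector distances
          return (r2 / r1) ** 2                    # rate proportional to 1/r^2

      def locate(ratio):
          """Invert the ratio numerically to recover the source position."""
          xs = np.linspace(1.0, d - 1.0, 10_000)
          return xs[np.argmin(np.abs(count_ratio(xs) - ratio))]

      true_x = 10.0
      print(f"recovered: {locate(count_ratio(true_x)):.1f} cm (true {true_x} cm)")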

  18. VSEARCH: a versatile open source tool for metagenomics

    PubMed Central

    Flouri, Tomáš; Nichols, Ben; Quince, Christopher; Mahé, Frédéric

    2016-01-01

    Background VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. Methods When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. Results VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling, while on a par
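
    A minimal sketch of the shared-word heuristic described above: score database sequences by the number of k-mers they share with the query, so that only the best candidates would go on to full dynamic-programming alignment. The sequences and k value are invented, and this is not VSEARCH's implementation.

      # Shared-word (k-mer) candidate ranking sketch.
      from collections import Counter

      def kmers(seq, k=8):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def rank_candidates(query, database, k=8):
          """Return (name, shared_kmer_count) pairs, best candidates first."""
          query_kmers = kmers(query, k)
          scores = Counter({name: len(query_kmers & kmers(seq, k))
                            for name, seq in database.items()})
          return scores.most_common()

      database = {
          "target_close": "ACGTACGTTGCAACGTACGTGGCT",
          "target_far":   "TTTTGGGGCCCCAAAATTTTGGGG",
      }
      query = "ACGTACGTTGCAACGTACGTGGCA"
      print(rank_candidates(query, database))      # target_close ranks first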

  19. Streamtube Fate and Transport Modeling of the Source Term for the Old Radioactive Waste

    SciTech Connect

    Brewer, K.

    2000-11-16

    The modeling described in this report is an extension of previous fate and transport modeling for the Old Radioactive Waste Burial Ground Corrective Measures Study/Feasibility Study. The purpose of this and the previous modeling is to provide quantitative input to the screening of remedial alternatives for the CMS/FS for this site.

  20. Radioactivity in HgCdTe devices: potential source of "snowballs"

    NASA Astrophysics Data System (ADS)

    McCullough, P.

    2009-12-01

    We hypothesize that the "snowballs" observed in HgCdTe infrared detectors are caused by natural radioactivity in the devices themselves. As characterized by Hilbert (2009) in the WFC3 flight IR array (FPA165), "snowballs" are transient events that instantaneously saturate a few pixels and deposit a few hundred thousand electrons over a ~5-pixel (~100-um) diameter region. In 2008, prior to flight of detector FPA165, Hilbert (2009) detected 21 snowballs during thermal vacuum test three (TV3) and inferred a rate of ~1100 ± 200 snowballs per year per cm2 of the HgCdTe detector. Alpha particles emitted by naturally radioactive thorium and/or uranium, at ~1 ppm concentrations within the device, can explain the observed characteristics of the "snowballs." If thorium is present, up to four distinctly observable snowballs should appear at the same location on the pixel array over the course of many years. While the indium in the bump bonds is almost entirely the radioactive isotope In-115, and 12% of the cadmium is naturally radioactive Cd-113, both of those emit only betas, which are too penetrating and not energetic enough to match the observed characteristics of "snowballs." Also, the Cd-113 emission rate is much less than that of the observed snowballs.
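
    A rough order-of-magnitude check of the hypothesis (the device parameters below are assumptions, not values from the record): taking an illustrative 10-micron HgCdTe layer at roughly 7.6 g/cm3 with 1 ppm Th-232 by mass, and counting only parent decays (daughters in secular equilibrium would add further alphas), the rate lands at the same order of magnitude as the reported snowball rate.

      # Order-of-magnitude estimate with assumed layer thickness, density and
      # thorium content; Th-232 half-life 1.405e10 years.
      import math

      SECONDS_PER_YEAR = 3.156e7
      AVOGADRO = 6.022e23

      layer_thickness_cm = 10e-4        # assumed active layer thickness
      density_g_cm3 = 7.6               # approximate HgCdTe density (assumed)
      thorium_mass_fraction = 1e-6      # 1 ppm, as hypothesized above

      half_life_s = 1.405e10 * SECONDS_PER_YEAR
      decay_constant = math.log(2) / half_life_s

      thorium_g_per_cm2 = layer_thickness_cm * density_g_cm3 * thorium_mass_fraction
      atoms_per_cm2 = thorium_g_per_cm2 * AVOGADRO / 232.0
      decays_per_cm2_per_year = atoms_per_cm2 * decay_constant * SECONDS_PER_YEAR

      print(f"~{decays_per_cm2_per_year:.0f} Th-232 decays per cm^2 per year")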

  1. OpenStereo: Open Source, Cross-Platform Software for Structural Geology Analysis

    NASA Astrophysics Data System (ADS)

    Grohmann, C. H.; Campanha, G. A.

    2010-12-01

    Free and open source software (FOSS) is increasingly seen as a synonym of innovation and progress. Freedom to run, copy, distribute, study, change and improve the software (through access to the source code) assures a high level of positive feedback between users and developers, which results in stable, secure and constantly updated systems. Several software packages for structural geology analysis are available to the user, with commercial licenses or that can be downloaded at no cost from the Internet. Some provide basic tools of stereographic projections such as plotting poles, great circles, density contouring, eigenvector analysis, data rotation etc, while others perform more specific tasks, such as paleostress or geotechnical/rock stability analysis. This variety also means a wide range of data formatting for input, Graphical User Interface (GUI) design and graphic export format. The majority of packages are built for MS-Windows and even though there are packages for the UNIX-based MacOS, there aren't native packages for *nix (UNIX, Linux, BSD etc) Operating Systems (OS), forcing the users to run these programs with emulators or virtual machines. Those limitations led us to develop OpenStereo, an open source, cross-platform software for stereographic projections and structural geology. The software is written in Python, a high-level, cross-platform programming language, and the GUI is designed with wxPython, which provides a consistent look regardless of the OS. Numeric operations (like matrix and linear algebra) are performed with the Numpy module and all graphic capabilities are provided by the Matplotlib library, including on-screen plotting and graphic exporting to common desktop formats (emf, eps, ps, pdf, png, svg). Data input is done with simple ASCII text files, with values of dip direction and dip/plunge separated by spaces, tabs or commas. The user can open multiple files at the same time (or the same file more than once), and overlay different elements of
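
    Not OpenStereo's own code, but a compact numpy/matplotlib sketch of the same kind of computation: converting planes given as dip direction/dip to poles and plotting them on a lower-hemisphere equal-area (Schmidt) net of unit radius. The input planes are synthetic.

      # Poles to planes on an equal-area stereonet (illustrative sketch).
      import numpy as np
      import matplotlib.pyplot as plt

      planes = np.array([[120.0, 40.0],        # dip direction, dip (degrees)
                         [130.0, 35.0],
                         [210.0, 60.0]])

      def pole_xy(dip_direction, dip):
          """Equal-area projection of the pole to a plane."""
          trend = np.radians((dip_direction + 180.0) % 360.0)   # pole trend
          plunge = np.radians(90.0 - dip)                       # pole plunge
          r = np.sqrt(2.0) * np.sin((np.pi / 2.0 - plunge) / 2.0)
          return r * np.sin(trend), r * np.cos(trend)           # x east, y north

      x, y = pole_xy(planes[:, 0], planes[:, 1])

      fig, ax = plt.subplots(figsize=(4, 4))
      ax.add_patch(plt.Circle((0.0, 0.0), 1.0, fill=False))     # primitive circle
      ax.plot(x, y, "k^")
      ax.set_aspect("equal")
      ax.set_axis_off()
      plt.show()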

  2. Using "Moodle[TM]" (Open Source Software) with Grades 3-6

    ERIC Educational Resources Information Center

    Menges, Beth

    2009-01-01

    "Moodle[TM]," an acronym for Modular Object-Oriented Dynamic Learning Environment, is an open source software package that teachers can use to produce Internet-based courses and Web sites. "Open source" basically means that it is copyrighted, but it can be used at no cost as long as the user agrees to provide the source to others. "Moodle[TM]" can…

  3. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    PubMed Central

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor intensive reproduction of model development steps can be avoided. The interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  4. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  5. Open-Source Selective Laser Sintering (OpenSLS) of Nylon and Biocompatible Polycaprolactone

    PubMed Central

    Paulsen, Samantha J.; Hwang, Daniel H.; Ta, Anderson H.; Yalacki, David R.; Schmidt, Tim; Miller, Jordan S.

    2016-01-01

    Selective Laser Sintering (SLS) is an additive manufacturing process that uses a laser to fuse powdered starting materials into solid 3D structures. Despite the potential for fabrication of complex, high-resolution structures with SLS using diverse starting materials (including biomaterials), prohibitive costs of commercial SLS systems have hindered the wide adoption of this technology in the scientific community. Here, we developed a low-cost, open-source SLS system (OpenSLS) and demonstrated its capacity to fabricate structures in nylon with sub-millimeter features and overhanging regions. Subsequently, we demonstrated fabrication of polycaprolactone (PCL) into macroporous structures such as a diamond lattice. Widespread interest in using PCL for bone tissue engineering suggests that PCL lattices are relevant model scaffold geometries for engineering bone. SLS of materials with large powder grain size (~500 μm) leads to part surfaces with high roughness, so we further introduced a simple vapor-smoothing technique to reduce the surface roughness of sintered PCL structures which further improves their elastic modulus and yield stress. Vapor-smoothed PCL can also be used for sacrificial templating of perfusable fluidic networks within orthogonal materials such as poly(dimethylsiloxane) silicone. Finally, we demonstrated that human mesenchymal stem cells were able to adhere, survive, and differentiate down an osteogenic lineage on sintered and smoothed PCL surfaces, suggesting that OpenSLS has the potential to produce PCL scaffolds useful for cell studies. OpenSLS provides the scientific community with an accessible platform for the study of laser sintering and the fabrication of complex geometries in diverse materials. PMID:26841023

  6. Free and Open Source Software for land degradation vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Imbrenda, Vito; Calamita, Giuseppe; Coluzzi, Rosa; D'Emilio, Mariagrazia; Lanfredi, Maria Teresa; Perrone, Angela; Ragosta, Maria; Simoniello, Tiziana

    2013-04-01

    Nowadays the role of FOSS software in scientific research is becoming increasingly important. Besides the important issues of reduced licence costs, legality and security, there are many other reasons that make FOSS software attractive. Firstly, opening the code is a guarantee of quality, permitting thousands of developers around the world to check the code and fix bugs rather than relying on vendor claims. FOSS communities are usually enthusiastic about helping other users solve problems and about expanding or customizing software (flexibility). Most important for this study, interoperability allows the user-friendly QGIS to be combined with the powerful GRASS GIS and the rich statistical methods of R, so that remote sensing data can be processed and geo-statistical analysis performed in a single environment. This study focuses on land degradation (i.e., the reduction in the capacity of the land to provide ecosystem goods and services and assure its functions) and in particular on the estimation of vulnerability levels, in order to suggest appropriate policy actions to reduce or halt land degradation impacts, using the above mentioned software. The area investigated is the Basilicata Region (Southern Italy), where large natural areas are mixed with anthropized areas. To identify different levels of vulnerability we adopted the Environmentally Sensitive Areas (ESAs) model, based on the combination of indicators related to soil, climate, vegetation and anthropic stress. Such indicators were estimated by using the following data sources: - Basilicata Region Geoportal to assess soil vulnerability; - DESERTNET2 project to evaluate potential vegetation vulnerability and climate vulnerability; - NDVI-MODIS satellite time series (2000-2010) with 250m resolution, available as 16-day composites from the NASA LP DAAC, to characterize the dynamic component of vegetation; - Agricultural Census data 2010, Corine Land Cover 2006 and morphological information to assess
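
    A sketch of the Environmentally Sensitive Areas index arithmetic on synthetic rasters, assuming the usual formulation in which each quality index is a geometric mean of scored layers and the final index is the geometric mean of the quality indices; this is illustrative only and is not the authors' actual QGIS/GRASS/R workflow.

      # ESA-style index arithmetic on synthetic rasters (scores: 1 best, 2 worst).
      import numpy as np

      def geometric_mean(layers):
          """Element-wise geometric mean of equally shaped arrays."""
          return np.exp(np.mean(np.log(np.stack(layers)), axis=0))

      rng = np.random.default_rng(0)
      score = lambda: rng.uniform(1.0, 2.0, (3, 3))       # synthetic indicator scores

      sqi = geometric_mean([score(), score(), score()])   # soil quality
      cqi = geometric_mean([score(), score()])            # climate quality
      vqi = geometric_mean([score(), score()])            # vegetation quality
      mqi = geometric_mean([score(), score()])            # management / anthropic stress

      esai = geometric_mean([sqi, cqi, vqi, mqi])
      print(np.round(esai, 2))                            # higher = more vulnerable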

  7. Implementation of an OAIS Repository Using Free, Open Source Software

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Gessler, P. E.; Seamon, E.

    2015-12-01

    The Northwest Knowledge Network (NKN) is a regional data repository located at the University of Idaho that focuses on the collection, curation, and distribution of research data. To support our home institution and others in the region, we offer services to researchers at all stages of the data lifecycle—from grant application and data management planning to data distribution and archive. In this role, we recognize the need to work closely with other data management efforts at partner institutions and agencies, as well as with larger aggregation efforts such as our state geospatial data clearinghouses, data.gov, DataONE, and others. In the past, one of our challenges with monolithic, prepackaged data management solutions has been that customization can be difficult to implement and maintain, especially as new versions of the software are released that are incompatible with our local codebase. Our solution is to break the monolith up into its constituent parts, which offers us several advantages. First, any customizations that we make tend to fall into areas that can be accessed through Application Program Interfaces (APIs) that are likely to remain stable over time, so our code stays compatible. Second, as components become obsolete or insufficient to meet new demands that arise, we can replace the individual components with minimal effect on the rest of the infrastructure, causing less disruption to operations. Other advantages include increased system reliability, staggered rollout of new features, enhanced compatibility with legacy systems, reduced dependence on a single software company as a point of failure, and the separation of development into manageable tasks. In this presentation, we describe our application of the Service Oriented Architecture (SOA) design paradigm to assemble a data repository that conforms to the Open Archival Information System (OAIS) Reference Model primarily using a collection of free and open-source software. We detail the design

  8. An Open Source modular platform for hydrological model implementation

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2010-05-01

    An implementation framework for setup and evaluation of spatio-temporal models is developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes, from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation, and are compiled as dynamic-link libraries (dll). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine. The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters etc. ENKI is designed to meet three different levels of involvement in model construction: • Model application: Running and evaluating a given model. Regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation. Uncertainty analysis directed towards input or parameter uncertainty. o Need not: Know the model's composition of subroutines, or the internal variables in the model, or the creation of method modules. • Model analysis: Link together different process methods, including parallel setup of alternative methods for solving the same task. Investigate the effect of different spatial discretization schemes. o Need not
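
    An illustration, in Python rather than ENKI's C++, of the plug-in idea: each routine declares its inputs, states and parameters, and the framework inspects the declaration to wire them together. The toy degree-day snow routine and all names are invented for the sketch.

      # Plug-in pattern sketch: routines declare variables, the framework wires them.
      class Routine:
          inputs, states, parameters = (), (), ()
          def step(self, ctx):
              raise NotImplementedError

      class DegreeDaySnow(Routine):
          """Toy snow routine: melt = ddf * max(temperature, 0)."""
          inputs = ("temperature",)
          states = ("swe",)
          parameters = ("ddf",)
          def step(self, ctx):
              melt = min(ctx["swe"], ctx["ddf"] * max(ctx["temperature"], 0.0))
              ctx["swe"] -= melt
              ctx["runoff"] = ctx.get("runoff", 0.0) + melt

      class Framework:
          def __init__(self, routines):
              self.routines = routines
          def describe(self):
              for r in self.routines:
                  print(type(r).__name__, r.inputs, r.states, r.parameters)
          def run(self, ctx, forcing):
              for step_input in forcing:            # one input record per time step
                  ctx.update(step_input)
                  for r in self.routines:
                      r.step(ctx)
                  print(ctx["swe"], ctx["runoff"])

      framework = Framework([DegreeDaySnow()])
      framework.describe()
      framework.run({"swe": 20.0, "ddf": 3.0},
                    [{"temperature": -2.0}, {"temperature": 4.0}])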

  9. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in Cuda C++, HELIOS uses the GPU’s potential of massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) Various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers. (ii) Comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra. (iii) Benchmarks of TP-profiles & spectra for various elemental abundances. (iv) Benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b. (v) Comparison with secondary eclipse data for HD189733b, XO-1b & Corot-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], which is an open-source project aimed to provide community tools to model exoplanetary atmospheres. ----------------------------- [1] Grimm & Heng 2015, ArXiv, 1503.03806; [2] Heng, Lyons & Tsai, ArXiv, 1506.05501; Heng & Lyons, ArXiv, 1507.01944; [3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4H; [4] exoclime.net

  10. Superconducting open-gradient magnetic separation for the pretreatment of radioactive or mixed waste vitrification feeds. 1997 annual progress report

    SciTech Connect

    Doctor, R.; Nunez, L.; Cicero-Herman, C.A.; Ritter, J.A.; Landsberger, S.

    1997-01-01

    Vitrification has been selected as a final waste form technology in the US for long-term storage of high-level radioactive wastes (HLW). However, a foreseeable problem during vitrification in some waste feed streams lies in the presence of elements (e.g., transition metals) in the HLW that may cause instabilities in the final glass product. The formation of spinel compounds, such as Fe3O4 and FeCrO4, results in glass phase separation and reduces vitrifier lifetime and the durability of the final waste form. A superconducting open gradient magnetic separation (OGMS) system may be suitable for the removal of the deleterious transition elements (e.g. Fe, Co, and Ni) and other elements (lanthanides) from vitrification feed streams due to their ferromagnetic or paramagnetic nature. The OGMS systems are designed to deflect and collect paramagnetic minerals as they interact with a magnetic field gradient. This system has the potential to reduce the volume of HLW for vitrification and ensure a stable product. In order to design efficient OGMS and high-gradient magnetic separation (HGMS) processes, a fundamental understanding of the physical and chemical properties of the waste feed streams is required. Using HLW simulant and radioactive fly ash and sludge samples from the Savannah River Technology Center, Rocky Flats site, and the Hanford reservation, several techniques were used to characterize and predict the separation capability for a superconducting OGMS system.

  11. Source holder collimator for encapsulating radioactive material and collimating the emanations from the material

    DOEpatents

    Laurer, G.R.

    1974-01-22

    This invention provides a transportable device capable of detecting normal levels of a trace element, such as lead, in a doughnut-shaped blood sample by x-ray fluorescence with a minimum of sample preparation in a relatively short analyzing time. In one embodiment, the blood is molded into a doughnut-shaped sample around an annular array of low-energy radioactive material that is at the center of the doughnut-shaped sample but encapsulated in a collimator, the latter shielding a detector that is close to the sample and facing the same so that the detector receives secondary emissions from the sample while the collimator collimates the primary emissions from the radioactive material to direct these emissions toward the sample around 360 deg and away from the detector. (Official Gazette)

  12. In situ gamma spectrometry measurements and Monte Carlo computations for the detection of radioactive sources in scrap metal.

    PubMed

    Clouvas, A; Xanthos, S; Takoudis, G; Potiriadis, C; Silva, J

    2005-02-01

    A very limited number of field experiments have been performed to assess the relative radiation detection sensitivities of commercially available equipment used to detect radioactive sources in recycled metal scrap. Such experiments require the cooperation and commitment of considerable resources on the part of vendors of the radiation detection systems and the cooperation of a steel mill or scrap processing facility. The results will unavoidably be specific to the equipment tested at the time, to the characteristics of the scrap metal involved in the tests, and to the specific configurations of the scrap containers. Given these limitations, the use of computer simulation for this purpose would be a desirable alternative. With this in mind, this study sought to determine whether Monte Carlo simulation of photon flux energy distributions resulting from a radiation source in metal scrap would be realistic. In the present work, experimental and simulated photon flux energy distributions in the outer part of a truck due to the presence of embedded radioactive sources in the scrap metal load are compared. The experimental photon fluxes are deduced by in situ gamma spectrometry measurements with a portable Ge detector and the calculated ones by Monte Carlo simulations with the MCNP code. The good agreement between simulated and measured photon flux energy distributions indicates that the results obtained by the Monte Carlo simulations are realistic.

  13. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  14. Evaluation of Open-Source Hard Real Time Software Packages

    NASA Technical Reports Server (NTRS)

    Mattei, Nicholas S.

    2004-01-01

    Replacing this somewhat costly implementation is the focus of one of the SA group's current research projects. The explosion of open source software in the last ten years has led to the development of a multitude of software solutions which were once only produced by major corporations. The benefits of these open projects include faster release and bug patching cycles as well as inexpensive, if not free, software solutions. The main packages for hard real time solutions under Linux are Real Time Application Interface (RTAI) and two varieties of Real Time Linux (RTL), RTLFree and RTLPro. During my time here at NASA I have been testing various hard real time solutions operating as layers on the Linux Operating System. All testing is being run on an Intel SBC 2590, which is a common embedded hardware platform. The test plan was provided to me by the Software Assurance group at the start of my internship, and my job has been to test the systems by developing and executing the test cases on the hardware. These tests are constructed so that the Software Assurance group can get hard test data for a comparison between the open source and proprietary implementations of hard real time solutions.

  15. MetaTrans: an open-source pipeline for metatranscriptomics

    PubMed Central

    Martinez, Xavier; Pozuelo, Marta; Pascal, Victoria; Campos, David; Gut, Ivo; Gut, Marta; Azpiroz, Fernando; Guarner, Francisco; Manichanh, Chaysavanh

    2016-01-01

    To date, meta-omic approaches use high-throughput sequencing technologies, which produce a huge amount of data, thus challenging modern computers. Here we present MetaTrans, an efficient open-source pipeline to analyze the structure and functions of active microbial communities using the power of multi-threading computers. The pipeline is designed to perform two types of RNA-Seq analyses: taxonomic and gene expression. It performs quality-control assessment, rRNA removal, maps reads against functional databases and also handles differential gene expression analysis. Its efficacy was validated by analyzing data from synthetic mock communities, data from a previous study and data generated from twelve human fecal samples. Compared to an existing web application server, MetaTrans shows more efficiency in terms of runtime (around 2 hours per million of transcripts) and presents adapted tools to compare gene expression levels. It has been tested with a human gut microbiome database but also proposes an option to use a general database in order to analyze other ecosystems. For the installation and use of the pipeline, we provide a detailed guide at the following website (www.metatrans.org). PMID:27211518
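
    The pipeline stages listed above run in a fixed order (quality control, rRNA removal, functional mapping, differential expression) for each sample on multi-core hardware. The sketch below only illustrates that orchestration pattern; the stage names and the per-sample parallelism are schematic assumptions, not the actual MetaTrans scripts, which wrap external bioinformatics tools at each step.

      # Schematic orchestration of a metatranscriptomics pipeline in the spirit
      # of MetaTrans: fixed stage order, one worker per sample. Stage bodies are
      # placeholders; a real pipeline calls external tools at each step.
      from multiprocessing import Pool

      STAGES = [
          "quality_control",       # read trimming and filtering
          "rrna_removal",          # discard ribosomal RNA reads
          "functional_mapping",    # map mRNA reads against functional databases
          "expression_analysis",   # differential gene expression between conditions
      ]

      def run_stage(stage: str, artifact: str) -> str:
          print(f"running {stage} on {artifact}")
          return f"{artifact}.{stage}"       # placeholder output file name

      def process_sample(sample: str) -> str:
          artifact = sample
          for stage in STAGES:
              artifact = run_stage(stage, artifact)
          return artifact

      if __name__ == "__main__":
          samples = ["fecal_01.fastq", "fecal_02.fastq"]
          with Pool(processes=2) as pool:        # samples processed in parallel
              print(pool.map(process_sample, samples))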

  16. Dinosaur: A Refined Open-Source Peptide MS Feature Detector.

    PubMed

    Teleman, Johan; Chawade, Aakash; Sandin, Marianne; Levander, Fredrik; Malmström, Johan

    2016-07-01

    In bottom-up mass spectrometry (MS)-based proteomics, peptide isotopic and chromatographic traces (features) are frequently used for label-free quantification in data-dependent acquisition MS but can also be used for the improved identification of chimeric spectra or sample complexity characterization. Feature detection is difficult because of the high complexity of MS proteomics data from biological samples, which frequently causes features to intermingle. In addition, existing feature detection algorithms commonly suffer from compatibility issues, long computation times, or poor performance on high-resolution data. Because of these limitations, we developed a new tool, Dinosaur, with increased speed and versatility. Dinosaur has the functionality to sample algorithm computations through quality-control plots, which we call a plot trail. From the evaluation of this plot trail, we introduce several algorithmic improvements to further improve the robustness and performance of Dinosaur, with the detection of features for 98% of MS/MS identifications in a benchmark data set, and no other algorithm tested in this study passed 96% feature detection. We finally used Dinosaur to reimplement a published workflow for peptide identification in chimeric spectra, increasing chimeric identification from 26% to 32% over the standard workflow. Dinosaur is operating-system-independent and is freely available as open source on https://github.com/fickludd/dinosaur . PMID:27224449

  17. Open-Source Telemedicine Platform for Wireless Medical Video Communication

    PubMed Central

    Panayides, A.; Eleftheriou, I.; Pantziaris, M.

    2013-01-01

    An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform which will allow for reliable remote diagnosis m-health applications such as emergency incidents, mass population screening, and medical education purposes. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower, QCIF, and CIF resolutions, at different bitrates, and four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm that removes temporal mismatch between original and received videos. Clinical evaluation is based on atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality wireless medical video communications are realized using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while VFD algorithm utilization bridges objective and subjective ratings. PMID:23573082
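
    Objective quality in the evaluation above is reported as PSNR after temporal alignment by the VFD algorithm. The snippet below is a minimal PSNR computation between two already-aligned 8-bit frames, included only to make the metric concrete; the synthetic frames and noise level are assumptions, and the VFD alignment step itself is not implemented here.

      # Minimal PSNR between an original and a received frame (assumed already
      # temporally aligned); synthetic data stands in for ultrasound video.
      import numpy as np

      def psnr(original: np.ndarray, received: np.ndarray, peak: float = 255.0) -> float:
          mse = np.mean((original.astype(np.float64) - received.astype(np.float64)) ** 2)
          if mse == 0:
              return float("inf")                  # identical frames
          return 10.0 * np.log10(peak ** 2 / mse)

      rng = np.random.default_rng(0)
      frame = rng.integers(0, 256, size=(288, 352), dtype=np.uint8)      # CIF-sized frame
      received = np.clip(frame + rng.normal(0.0, 5.0, frame.shape), 0, 255).astype(np.uint8)
      print(f"PSNR: {psnr(frame, received):.2f} dB")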

  18. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags are at risk of misplacement or loss due to mischief or severe windstorms/thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible for wider usability on devices of different sizes: smartphones, tablets, desktops, and laptops.

  19. Open source tools for standardized privacy protection of medical images

    NASA Astrophysics Data System (ADS)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHIs should be completely removed from the images according to the respective privacy regulations, but some basic and alleviated data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
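
    For readers who want to see the shape of such a de-identification step, the sketch below blanks a handful of direct identifiers in a DICOM file and substitutes a project pseudonym. It uses pydicom purely for illustration (the prototype described above is built on the DCMTK C++ toolkit), and the tag list is a small assumed subset of PHI attributes, not the full standardized de-identification profile.

      # Illustrative DICOM de-identification with pydicom (an assumption for
      # demonstration only; the paper's prototype uses the DCMTK C++ toolkit).
      import pydicom

      PHI_TAGS = ["PatientName", "PatientBirthDate", "PatientAddress",
                  "ReferringPhysicianName", "InstitutionName"]   # assumed subset

      def deidentify(path_in: str, path_out: str, pseudo_id: str) -> None:
          ds = pydicom.dcmread(path_in)
          for keyword in PHI_TAGS:
              if keyword in ds:
                  ds.data_element(keyword).value = ""   # blank direct identifiers
          ds.PatientID = pseudo_id                      # replace with a study pseudonym
          ds.save_as(path_out)

      # Example call (file names are hypothetical):
      # deidentify("ct_slice.dcm", "ct_slice_deid.dcm", pseudo_id="STUDY-0001")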

  20. Open-source telemedicine platform for wireless medical video communication.

    PubMed

    Panayides, A; Eleftheriou, I; Pantziaris, M

    2013-01-01

    An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform which will allow for reliable remote diagnosis m-health applications such as emergency incidents, mass population screening, and medical education purposes. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower, QCIF, and CIF resolutions, at different bitrates, and four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm that removes temporal mismatch between original and received videos. Clinical evaluation is based on atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality wireless medical video communications are realized using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while VFD algorithm utilization bridges objective and subjective ratings. PMID:23573082

  1. Dinosaur: A Refined Open-Source Peptide MS Feature Detector.

    PubMed

    Teleman, Johan; Chawade, Aakash; Sandin, Marianne; Levander, Fredrik; Malmström, Johan

    2016-07-01

    In bottom-up mass spectrometry (MS)-based proteomics, peptide isotopic and chromatographic traces (features) are frequently used for label-free quantification in data-dependent acquisition MS but can also be used for the improved identification of chimeric spectra or sample complexity characterization. Feature detection is difficult because of the high complexity of MS proteomics data from biological samples, which frequently causes features to intermingle. In addition, existing feature detection algorithms commonly suffer from compatibility issues, long computation times, or poor performance on high-resolution data. Because of these limitations, we developed a new tool, Dinosaur, with increased speed and versatility. Dinosaur has the functionality to sample algorithm computations through quality-control plots, which we call a plot trail. From the evaluation of this plot trail, we introduce several algorithmic improvements to further improve the robustness and performance of Dinosaur, with the detection of features for 98% of MS/MS identifications in a benchmark data set, and no other algorithm tested in this study passed 96% feature detection. We finally used Dinosaur to reimplement a published workflow for peptide identification in chimeric spectra, increasing chimeric identification from 26% to 32% over the standard workflow. Dinosaur is operating-system-independent and is freely available as open source on https://github.com/fickludd/dinosaur .

  2. Open-Source Software for Modeling of Nanoelectronic Devices

    NASA Technical Reports Server (NTRS)

    Oyafuso, Fabiano; Hua, Hook; Tisdale, Edwin; Hart, Don

    2004-01-01

    The Nanoelectronic Modeling 3-D (NEMO 3-D) computer program has been upgraded to open-source status through elimination of license-restricted components. The present version functions equivalently to the version reported in "Software for Numerical Modeling of Nanoelectronic Devices" (NPO-30520), NASA Tech Briefs, Vol. 27, No. 11 (November 2003), page 37. To recapitulate: NEMO 3-D performs numerical modeling of the electronic transport and structural properties of a semiconductor device that has overall dimensions of the order of tens of nanometers. The underlying mathematical model represents the quantum-mechanical behavior of the device resolved to the atomistic level of granularity. NEMO 3-D solves the applicable quantum matrix equation on a Beowulf-class cluster computer by use of a parallel-processing matrix vector multiplication algorithm coupled to a Lanczos and/or Rayleigh-Ritz algorithm that solves for eigenvalues. A prior upgrade of NEMO 3-D incorporated a capability for a strain treatment, parameterized for bulk material properties of GaAs and InAs, for two tight-binding submodels. NEMO 3-D has been demonstrated in atomistic analyses of effects of disorder in alloys and, in particular, in bulk In(x)Ga(1-x)As and in In(0.6)Ga(0.4)As quantum dots.
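
    The eigenvalue step mentioned above, a Lanczos iteration driven by repeated matrix-vector products, is sketched below for a generic sparse symmetric matrix. The random matrix is only a stand-in for a tight-binding Hamiltonian, and this simple non-restarted iteration omits the reorthogonalization and parallel distribution that a production code such as NEMO 3-D requires.

      # Plain (non-restarted) Lanczos: build a small tridiagonal matrix whose
      # eigenvalues approximate the extreme eigenvalues of a large sparse matrix.
      import numpy as np
      import scipy.sparse as sp

      def lanczos_extremes(A, k: int = 40, seed: int = 0):
          n = A.shape[0]
          rng = np.random.default_rng(seed)
          q = rng.standard_normal(n)
          q /= np.linalg.norm(q)
          q_prev = np.zeros(n)
          alphas, betas, beta = [], [], 0.0
          for _ in range(k):
              w = A @ q - beta * q_prev          # one matrix-vector product per step
              alpha = q @ w
              w -= alpha * q
              beta = np.linalg.norm(w)
              alphas.append(alpha)
              betas.append(beta)
              q_prev, q = q, w / beta
          T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
          return np.linalg.eigvalsh(T)           # Ritz values

      A = sp.random(2000, 2000, density=1e-3, random_state=1)
      A = (A + A.T) * 0.5                        # symmetrize the test matrix
      ritz = lanczos_extremes(A)
      print("smallest/largest Ritz values:", ritz[0], ritz[-1])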

  3. An Open Source Software Tool for Hydrologic Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Park, Dong Kwan; Shin, Mun-Ju; Kim, Young-Oh

    2015-04-01

    With the Intergovernmental Panel on Climate Change (IPCC) regularly publishing Climate Change Assessment Reports containing updated forecasts and scenarios, it is necessary to also periodically perform hydrologic assessment studies on these scenarios. Practical users, including scientists and government officials, need handy tools that run periodically from climate input data (historical observations and climate change scenarios) through rainfall-runoff simulation to assessment. We propose HydroCAT (Hydrologic Climate change Assessment Tool), a flexible software tool designed to simplify and streamline hydrologic climate change assessment studies by incorporating: taking climate input values from general circulation models using the latest climate change scenarios; simulation of downscaled values using statistical downscaling methods; calibration and simulation of multiple well-known lumped conceptual hydrologic models; and assessment of results using statistical methods. The tool is designed as an open source, R-based software package that includes an operating framework to support a wide range of data frameworks, a variety of hydrologic models, and climate change scenarios. The use of the software is demonstrated in a case study of the Geum River basin in the Republic of Korea.

  4. An Open Source Embedding Code for the Condensed Phase

    NASA Astrophysics Data System (ADS)

    Genova, Alessandro; Ceresoli, Davide; Krishtal, Alisa; Andreussi, Oliviero; Distasio, Robert; Pavanello, Michele

    Work from our group as well as others has shown that for many systems, such as molecular aggregates, liquids, and complex layered materials, subsystem Density-Functional Theory (DFT) is capable of immensely reducing the computational cost while providing a better and more intuitive insight into the underlying physics. We developed a massively parallel implementation of subsystem DFT for the condensed phase within the open-source Quantum ESPRESSO software package. In this talk, we will discuss how we: (1) implemented a flexible parallel framework aiming at optimal load balancing; (2) simplified the solution of the electronic structure problem by allowing a fragment-specific sampling of the first Brillouin zone; (3) achieved enormous speedups by solving the electronic structure of each fragment in a unit cell smaller than the supersystem simulation cell, effectively introducing a fragment-specific basis set, with no deterioration of the fully periodic simulation. As of March 14, 2016, the code has been released and is available to the public.

  5. Nektar++: An open-source spectral/hp element framework

    NASA Astrophysics Data System (ADS)

    Cantwell, C. D.; Moxey, D.; Comerford, A.; Bolis, A.; Rocco, G.; Mengaldo, G.; De Grazia, D.; Yakovlev, S.; Lombard, J.-E.; Ekelschot, D.; Jordi, B.; Xu, H.; Mohamied, Y.; Eskilsson, C.; Nelson, B.; Vos, P.; Biotto, C.; Kirby, R. M.; Sherwin, S. J.

    2015-07-01

    Nektar++ is an open-source software framework designed to support the development of high-performance scalable solvers for partial differential equations using the spectral/hp element method. High-order methods are gaining prominence in several engineering and biomedical applications due to their improved accuracy over low-order techniques at reduced computational cost for a given number of degrees of freedom. However, their proliferation is often limited by their complexity, which makes these methods challenging to implement and use. Nektar++ is an initiative to overcome this limitation by encapsulating the mathematical complexities of the underlying method within an efficient C++ framework, making the techniques more accessible to the broader scientific and industrial communities. The software supports a variety of discretisation techniques and implementation strategies, supporting methods research as well as application-focused computation, and the multi-layered structure of the framework allows the user to embrace as much or as little of the complexity as they need. The libraries capture the mathematical constructs of spectral/hp element methods, while the associated collection of pre-written PDE solvers provides out-of-the-box application-level functionality and a template for users who wish to develop solutions for addressing questions in their own scientific domains.

  6. Open-Source Photometric System for Enzymatic Nitrate Quantification.

    PubMed

    Wittbrodt, B T; Squires, D A; Walbeck, J; Campbell, E; Campbell, W H; Pearce, J M

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer will greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, a methodology is presented for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of equipment to perform this method for measurement of water quality, and iii) make the method easier to carry out in the field. The device is able to perform as well as commercial proprietary systems for less than 15% of the cost for materials. This allows for greater access to the technology and the new, safer nitrate testing technique. PMID:26244342
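
    The read-out of such a photometer reduces to two small calculations: an absorbance from the transmitted light intensity (Beer-Lambert law) and a concentration from a calibration curve fitted against nitrate-N standards. The sketch below shows only that arithmetic; the photodiode counts and the calibration slope and intercept are assumed example values, not the calibration of the device described above.

      # Concentration read-out of a photometric nitrate test: absorbance from
      # transmitted light, then inversion of a linear calibration curve.
      # All numbers are illustrative assumptions.
      import math

      def absorbance(i_sample: float, i_blank: float) -> float:
          """Beer-Lambert absorbance A = -log10(I_sample / I_blank)."""
          return -math.log10(i_sample / i_blank)

      def concentration(a: float, slope: float, intercept: float) -> float:
          """Invert the linear calibration A = slope * C + intercept."""
          return (a - intercept) / slope

      SLOPE, INTERCEPT = 0.21, 0.01                   # assumed calibration (absorbance per mg/L)
      a = absorbance(i_sample=412.0, i_blank=980.0)   # assumed raw photodiode counts
      print(f"A = {a:.3f} -> {concentration(a, SLOPE, INTERCEPT):.2f} mg/L nitrate-N")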

  7. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  8. MetaTrans: an open-source pipeline for metatranscriptomics.

    PubMed

    Martinez, Xavier; Pozuelo, Marta; Pascal, Victoria; Campos, David; Gut, Ivo; Gut, Marta; Azpiroz, Fernando; Guarner, Francisco; Manichanh, Chaysavanh

    2016-05-23

    To date, meta-omic approaches use high-throughput sequencing technologies, which produce a huge amount of data, thus challenging modern computers. Here we present MetaTrans, an efficient open-source pipeline to analyze the structure and functions of active microbial communities using the power of multi-threading computers. The pipeline is designed to perform two types of RNA-Seq analyses: taxonomic and gene expression. It performs quality-control assessment, rRNA removal, maps reads against functional databases and also handles differential gene expression analysis. Its efficacy was validated by analyzing data from synthetic mock communities, data from a previous study and data generated from twelve human fecal samples. Compared to an existing web application server, MetaTrans shows more efficiency in terms of runtime (around 2 hours per million of transcripts) and presents adapted tools to compare gene expression levels. It has been tested with a human gut microbiome database but also proposes an option to use a general database in order to analyze other ecosystems. For the installation and use of the pipeline, we provide a detailed guide at the following website (www.metatrans.org).

  9. Application of Open Source Technologies for Oceanographic Data Analysis

    NASA Astrophysics Data System (ADS)

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
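
    The chunking idea described above can be illustrated in a few lines: a geo-referenced array is cut into fixed-size tiles, each stored under a key that encodes dataset, time step and spatial index, so per-tile statistics can be computed independently and reduced in parallel. The tile size, key layout and synthetic field below are assumptions for illustration; in NEXUS the tiles live in Cassandra and the reductions run on Spark.

      # Sketch of tiling a geo-referenced array for horizontally scaled storage
      # and parallel per-tile statistics. Tile size and key format are assumed.
      import numpy as np

      TILE = 64  # assumed tile edge length in grid cells

      def make_tiles(dataset: str, time_index: int, grid: np.ndarray):
          """Yield (key, tile) pairs covering a 2-D lat/lon grid."""
          rows, cols = grid.shape
          for r in range(0, rows, TILE):
              for c in range(0, cols, TILE):
                  key = f"{dataset}:{time_index}:{r // TILE}:{c // TILE}"
                  yield key, grid[r:r + TILE, c:c + TILE]

      # Toy sea-surface-temperature field standing in for a satellite granule.
      sst = 15.0 + 10.0 * np.random.default_rng(0).random((720, 1440))
      tile_means = {key: float(tile.mean()) for key, tile in make_tiles("sst", 0, sst)}
      print(len(tile_means), "tiles; mean of tile means =",
            round(float(np.mean(list(tile_means.values()))), 2))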

  10. Assessing Ecohydrological Impacts of Forest Disturbance using Open Source Software

    NASA Astrophysics Data System (ADS)

    Lovette, J. P.; Chang, T.; Treglia, M.; Gan, T.; Duncan, J.

    2014-12-01

    In the past 30 years, land management protocols, climate change, and land use have radically changed the frequency and magnitude of disturbance regimes. Landscape-scale disturbances can change forest structure, resulting in impacts on adjacent watersheds that may affect water quantity and quality for human and natural resource use. Our project quantifies hydrologic changes from a suite of disturbance events resulting in vegetation cover shifts at watersheds across the continental United States. These disturbance events include: wildfire, insect/disease, deforestation (logging), hurricanes, ice storms, and human land use. Our major question is: Can the effects of disturbance on ecohydrology be generalized across regions, time scales, and spatial scales? Because the analysis uses a workflow of open source tools and publicly available data, this work can be extended and leveraged by other researchers. Spatial data on disturbance include the MODIS Global Disturbance Index (NTSG), Landsat 7 Global Forest Change (Hansen dataset), and the Degree of Human Modification (Theobald dataset). Ecohydrologic response data include USGS NWIS, USFS-LTER climDB/hydroDB, and the CUAHSI HIS.

  11. Open-Source Photometric System for Enzymatic Nitrate Quantification.

    PubMed

    Wittbrodt, B T; Squires, D A; Walbeck, J; Campbell, E; Campbell, W H; Pearce, J M

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer will greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, a methodology is presented for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of equipment to perform this method for measurement of water quality, and iii) make the method easier to carry out in the field. The device is able to perform as well as commercial proprietary systems for less than 15% of the cost for materials. This allows for greater access to the technology and the new, safer nitrate testing technique.

  12. Dinosaur: A Refined Open-Source Peptide MS Feature Detector

    PubMed Central

    2016-01-01

    In bottom-up mass spectrometry (MS)-based proteomics, peptide isotopic and chromatographic traces (features) are frequently used for label-free quantification in data-dependent acquisition MS but can also be used for the improved identification of chimeric spectra or sample complexity characterization. Feature detection is difficult because of the high complexity of MS proteomics data from biological samples, which frequently causes features to intermingle. In addition, existing feature detection algorithms commonly suffer from compatibility issues, long computation times, or poor performance on high-resolution data. Because of these limitations, we developed a new tool, Dinosaur, with increased speed and versatility. Dinosaur has the functionality to sample algorithm computations through quality-control plots, which we call a plot trail. From the evaluation of this plot trail, we introduce several algorithmic improvements to further improve the robustness and performance of Dinosaur, with the detection of features for 98% of MS/MS identifications in a benchmark data set, and no other algorithm tested in this study passed 96% feature detection. We finally used Dinosaur to reimplement a published workflow for peptide identification in chimeric spectra, increasing chimeric identification from 26% to 32% over the standard workflow. Dinosaur is operating-system-independent and is freely available as open source on https://github.com/fickludd/dinosaur. PMID:27224449

  13. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  14. Hydrognomon - open source software for the analysis of hydrological data

    NASA Astrophysics Data System (ADS)

    Kozanis, Stefanos; Christofides, Antonios; Mamassis, Nikos; Efstratiadis, Andreas; Koutsoyiannis, Demetris

    2010-05-01

    Hydrognomon is a software tool for the processing of hydrological data. It is an open source application running on standard Microsoft Windows platforms, and it is part of the openmeteo.org framework. Data are imported through standard text files, spreadsheets or by typing. Standard hydrological data processing techniques include time step aggregation and regularization, interpolation, regression analysis and infilling of missing values, consistency tests, data filtering, graphical and tabular visualisation of time series, etc. It supports several time steps, from the finest minute scales up to decades; specific cases of irregular time steps and offsets are also supported. The program also includes common hydrological applications, such as evapotranspiration modelling, stage-discharge analysis, homogeneity tests, areal integration of point data series, processing of hydrometric data, as well as lumped hydrological modelling with automatic calibration facilities. Here, the emphasis is on the statistical module of Hydrognomon, which provides tools for data exploration, fitting of distribution functions, statistical prediction, Monte-Carlo simulation, determination of confidence limits, analysis of extremes, and construction of ombrian (intensity-duration-frequency) curves. Hydrognomon is available for download from http://www.hydrognomon.org/.
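
    As a small illustration of the time-step aggregation mentioned above (for example, turning a 10-minute rainfall record into daily and monthly totals), the snippet below uses pandas purely for demonstration; Hydrognomon itself is a desktop application, and the synthetic record is an assumption.

      # Time-step aggregation of a synthetic 10-minute rainfall record into
      # daily and monthly totals (pandas used only for illustration).
      import numpy as np
      import pandas as pd

      idx = pd.date_range("2010-01-01", "2010-01-31 23:50", freq="10min")
      rain = pd.Series(np.random.default_rng(0).exponential(0.05, len(idx)), index=idx)

      daily = rain.resample("D").sum()       # regularized daily totals (mm)
      monthly = rain.resample("MS").sum()    # monthly total (mm)
      print(daily.head())
      print("January total:", round(float(monthly.iloc[0]), 1), "mm")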

  15. BRISC-an open source pulmonary nodule image retrieval framework.

    PubMed

    Lam, Michael O; Disney, Tim; Raicu, Daniela S; Furst, Jacob; Channin, David S

    2007-11-01

    We have created a content-based image retrieval framework for computed tomography images of pulmonary nodules. When presented with a nodule image, the system retrieves images of similar nodules from a collection prepared by the Lung Image Database Consortium (LIDC). The system (1) extracts images of individual nodules from the LIDC collection based on LIDC expert annotations, (2) stores the extracted data in a flat XML database, (3) calculates a set of quantitative descriptors for each nodule that provide a high-level characterization of its texture, and (4) uses various measures to determine the similarity of two nodules and perform queries on a selected query nodule. Using our framework, we compared three feature extraction methods: Haralick co-occurrence, Gabor filters, and Markov random fields. Gabor and Markov descriptors perform better at retrieving similar nodules than do Haralick co-occurrence techniques, with best retrieval precisions in excess of 88%. Because the software we have developed and the reference images are both open source and publicly available they may be incorporated into both commercial and academic imaging workstations and extended by others in their research.
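
    Of the three descriptor families compared above, the Haralick co-occurrence features are the most compact to demonstrate. The sketch below computes a small gray-level co-occurrence texture signature for an image patch using scikit-image; this is an illustrative stand-in, not the BRISC implementation, and the random patch merely substitutes for a CT region of interest.

      # Haralick-style co-occurrence descriptors for an image patch, computed
      # with scikit-image as an illustration (not the BRISC code itself).
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      patch = np.random.default_rng(0).integers(0, 64, size=(32, 32), dtype=np.uint8)

      # Co-occurrence matrices at distance 1 pixel for four directions, 64 gray levels.
      glcm = graycomatrix(patch, distances=[1],
                          angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                          levels=64, symmetric=True, normed=True)

      signature = {prop: float(graycoprops(glcm, prop).mean())
                   for prop in ("contrast", "homogeneity", "energy", "correlation")}
      print(signature)   # a compact texture signature usable for similarity queries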

  16. Open-Source Photometric System for Enzymatic Nitrate Quantification

    PubMed Central

    Wittbrodt, B. T.; Squires, D. A.; Walbeck, J.; Campbell, E.; Campbell, W. H.; Pearce, J. M.

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer will greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, a methodology is presented for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of equipment to perform this method for measurement of water quality, and iii) make the method easier to carry out in the field. The device is able to perform as well as commercial proprietary systems for less than 15% of the cost for materials. This allows for greater access to the technology and the new, safer nitrate testing technique. PMID:26244342

  17. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion.

  18. Special population planner 4 : an open source release.

    SciTech Connect

    Kuiper, J.; Metz, W.; Tanzman, E.

    2008-01-01

    Emergencies like Hurricane Katrina and the recent California wildfires underscore the critical need to meet the complex challenge of planning for individuals with special needs and for institutionalized special populations. People with special needs and special populations often have difficulty responding to emergencies or taking protective actions, and emergency responders may be unaware of their existence and situations during a crisis. Special Population Planner (SPP) is an ArcGIS-based emergency planning system released as an open source product. SPP provides for easy production of maps, reports, and analyses to develop and revise emergency response plans. It includes tools to manage a voluntary registry of data for people with special needs, integrated links to plans and documents, tools for response planning and analysis, preformatted reports and maps, and data on locations of special populations, facility and resource characteristics, and contacts. The system can be readily adapted for new settings without programming and is broadly applicable. Full documentation and a demonstration database are included in the release.

  19. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  20. La Cura, An Open Source Cure for Cancer.

    PubMed

    Iaconesi, Salvatore

    2013-09-01

    When I was diagnosed with brain cancer, I opened up my medical records to the Web, asking for help. The response was incredible: a global source of crowd-generated information about how to cure my own cancer. This work explores the many issues involved in handling such a peculiar form of information (including privacy, preserving the complexity of the human being, the reliability of responses, and more) and the outcomes of the overall process. "[Critique] does not aim to make possible metaphysics which becomes, in the end, science; its aim is to look as more as possible beyond and beside at the infinite work of freedom." -M. Foucault, "What is Illuminism" "Maybe today the most important objective is not to understand what we are, but to refuse it. We must imagine and build what we could be, to drop that political double bind which is constituted by the simultaneous individualization and totalitarianization of the structures of modern powers. The conclusion might be that the political, ethical, social and philosophical issue today is not to liberate individuals from the State and its institutions, but to free ourselves from the State and from the individualization which is bound to the State. We must promote new ways for subjectivity through the refusal of that kind of individuality which has been imposed to us for so many centuries." -M. Foucault, "Why Study Power: The Question of the Subject" (1).

  1. A global, open-source database of flood protection standards

    NASA Astrophysics Data System (ADS)

    Scussolini, Paolo; Aerts, Jeroen; Jongman, Brenden; Bouwer, Laurens; Winsemius, Hessel; de Moel, Hans; Ward, Philip

    2016-04-01

    Accurate flood risk estimation is pivotal in that it enables risk-informed policies in disaster risk reduction, as emphasized in the recent Sendai Framework for Disaster Risk Reduction. To improve our understanding of flood risk, models are now capable of providing actionable risk information on the (sub)global scale. Still, the accuracy of their results is greatly limited by the lack of information on the standards of flood protection actually in place, and researchers thus make broad assumptions about the extent of protection. With our work we propose a first global, open-source database of FLOod PROtection Standards, FLOPROS, covering a range of spatial scales. FLOPROS is structured in three layers of information and merges them into one consistent database: 1) the Design layer contains empirical information about the standard of protection presently in place; 2) the Policy layer contains intended protection standards from normative documents; 3) the Model layer uses a validated numerical approach to calculate protection standards for areas not covered in the other layers. The FLOPROS database can be used for more accurate risk assessment exercises across scales. As the database should be continually updated to reflect new interventions, we invite researchers and practitioners to contribute information. Further, we look for partners within the risk community to participate in additional strategies to improve the amount and accuracy of information contained in this first version of FLOPROS.
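
    A natural way to read the three-layer structure is as a priority merge: empirical design standards are used where available, policy standards fill in next, and modelled standards cover the remaining areas. The sketch below shows such a merge for a few made-up regions; the precedence of the first two layers and all numbers are assumptions for illustration, not FLOPROS data.

      # Priority merge of protection-standard layers (return periods in years).
      # Region names and values are illustrative, not FLOPROS entries.
      design = {"region_A": 10000}                                    # empirical standards in place
      policy = {"region_A": 10000, "region_B": 100}                   # intended standards
      model  = {"region_A": 4000, "region_B": 250, "region_C": 10}    # modelled standards

      def merge_layers(*layers):
          """Earlier layers take precedence; later layers only fill gaps."""
          merged = {}
          for layer in layers:
              for region, standard in layer.items():
                  merged.setdefault(region, standard)
          return merged

      print(merge_layers(design, policy, model))
      # -> {'region_A': 10000, 'region_B': 100, 'region_C': 10}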

  2. Tessera: Open source software for accelerated data science

    SciTech Connect

    Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.; LaMothe, Ryan R.

    2014-06-30

    Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches—and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.

  3. Wndchrm – an open source utility for biological image analysis

    PubMed Central

    Shamir, Lior; Orlov, Nikita; Eckley, D Mark; Macura, Tomasz; Johnston, Josiah; Goldberg, Ilya G

    2008-01-01

    Background Biological imaging is an emerging field, covering a wide range of applications in biological and clinical research. However, while machinery for automated experimenting and data acquisition has been developing rapidly in the past years, automated image analysis often introduces a bottleneck in high content screening. Methods Wndchrm is an open source utility for biological image analysis. The software works by first extracting image content descriptors from the raw image, image transforms, and compound image transforms. Then, the most informative features are selected, and the feature vector of each image is used for classification and similarity measurement. Results Wndchrm has been tested using several publicly available biological datasets, and provided results which are favorably comparable to the performance of task-specific algorithms developed for these datasets. The simple user interface allows researchers who are not knowledgeable in computer vision methods and have no background in computer programming to apply image analysis to their data. Conclusion We suggest that wndchrm can be effectively used for a wide range of biological image analysis tasks. Using wndchrm can allow scientists to perform automated biological image analysis while avoiding the costly challenge of implementing computer vision and pattern recognition algorithms. PMID:18611266

  4. Gadgetron: an open source framework for medical image reconstruction.

    PubMed

    Hansen, Michael Schacht; Sørensen, Thomas Sangild

    2013-06-01

    This work presents a new open source framework for medical image reconstruction called the "Gadgetron." The framework implements a flexible system for creating streaming data processing pipelines where data pass through a series of modules or "Gadgets" from raw data to reconstructed images. The data processing pipeline is configured dynamically at run-time based on an extensible markup language configuration description. The framework promotes reuse and sharing of reconstruction modules and new Gadgets can be added to the Gadgetron framework through a plugin-like architecture without recompiling the basic framework infrastructure. Gadgets are typically implemented in C/C++, but the framework includes wrapper Gadgets that allow the user to implement new modules in the Python scripting language for rapid prototyping. In addition to the streaming framework infrastructure, the Gadgetron comes with a set of dedicated toolboxes in shared libraries for medical image reconstruction. This includes generic toolboxes for data-parallel (e.g., GPU-based) execution of compute-intensive components. The basic framework architecture is independent of medical imaging modality, but this article focuses on its application to Cartesian and non-Cartesian parallel magnetic resonance imaging.
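
    The streaming "chain of Gadgets" pattern described above can be sketched in a few lines: each module transforms the data it receives and hands the result to the next module, with the chain assembled at run time. The Python below is for illustration only (Gadgetron Gadgets are C/C++ classes, with Python wrapper Gadgets for prototyping), and the two example modules and the random k-space data are assumptions.

      # Minimal "chain of gadgets" streaming pattern, illustrated in Python.
      import numpy as np

      class Gadget:
          def __init__(self, nxt=None):
              self.next = nxt
          def process(self, data):
              raise NotImplementedError
          def push(self, data):
              out = self.process(data)
              return self.next.push(out) if self.next else out

      class RemoveOversampling(Gadget):
          def process(self, kspace):
              n = kspace.shape[-1]
              return kspace[..., n // 4: 3 * n // 4]     # crop 2x readout oversampling

      class FFTRecon(Gadget):
          def process(self, kspace):
              return np.abs(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(kspace))))

      # The pipeline is assembled at run time, as an XML configuration would do.
      pipeline = RemoveOversampling(FFTRecon())
      kspace = np.random.randn(128, 256) + 1j * np.random.randn(128, 256)
      print("reconstructed image shape:", pipeline.push(kspace).shape)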

  5. Open Source and Open Standard based decision support system: the example of lake Verbano floods management.

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Pozzoni, Maurizio; Graf, Andrea

    2015-04-01

    The Locarno area (Switzerland, Canton Ticino) is exposed to lake floods with a return period of about 7-8 years. The risk is of particular concern because the area is located in a floodplain that has seen, over recent decades, a large increase in settlement and in the value of real estate. Moreover, small differences in lake level may produce a significant increase in flooded area due to the very low average slope of the terrain. While fatalities are generally not registered, several important economic costs are involved, e.g. damage to real estate, interruption of activities, evacuation and relocation, and environmental damage. After the major events registered in 1978, 1993, 2000, 2002 and 2014, the local stakeholders invested time and money in the set-up of an up-to-date decision support system that allows for the reduction of risks. Thanks to impressive technological advances, the visionary concept of the Digital Earth (Gore 1992, 1998) is being realized: geospatial coverages and data from monitoring systems are increasingly available on the Web and, more importantly, in standard formats. As a result, today it is possible to develop innovative decision support systems (Molinari et al. 2013) that mash up several information sources and offer special features for risk scenario evaluation. In agreement with this view, the authors have recently developed a new Web system whose design is based on the Service Oriented Architecture pattern. Open source software (e.g. Geoserver, PostGIS, OpenLayers) has been used throughout the whole system, and geospatial Open Standards (e.g. SOS, WMS, WFS) are the pillars it relies on. SITGAP 2.0, implemented in collaboration with the Civil Protection of Locarno e Vallemaggia, combines a number of data sources such as the Federal Register of Buildings and Dwellings, the Cantonal Register of residents, the Cadastral Surveying, the Cantonal Hydro-meteorological monitoring observations, the Meteoswiss weather forecasts, and

  6. Methods and Data Used to Investigate Polonium-210 as a Source of Excess Gross-Alpha Radioactivity in Ground Water, Churchill County, Nevada

    USGS Publications Warehouse

    Seiler, Ralph L.

    2007-01-01

    Ground water is the major source of drinking water in the Carson River Basin, California and Nevada. Previous studies have shown that uranium and gross-alpha radioactivities in ground water can be greater than U.S. Environmental Protection Agency Maximum Contaminant Levels, particularly in the Carson Desert, Churchill County, Nevada. Studies also have shown that the primary source of the gross-alpha radioactivity and alpha-emitting radionuclides in ground water is the dissolution of uranium-rich granitic rocks and basin-fill sediments that have their origins in the Sierra Nevada. However, ground water sampled from some wells in the Carson Desert had gross-alpha radioactivities greater than could be accounted for by the decay of dissolved uranium. The occurrence of polonium-210 (Po-210) was hypothesized to explain the higher than expected gross-alpha radioactivities. This report documents and describes the study design, field and analytical methods, and data used to determine whether Po-210 is the source of excess gross-alpha radioactivity in ground water underlying the Carson Desert in and around Fallon, Nevada. Specifically, this report presents: 1) gross alpha and uranium radioactivities for 100 wells sampled from June to September 2001; and 2) pH, dissolved oxygen, specific conductance, and Po-210 radioactivity for 25 wells sampled in April and June 2007. Results of quality-control samples for the 2007 dataset are also presented.

  7. EHDViz: clinical dashboard development using open-source technologies

    PubMed Central

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-01-01

    -driven precision medicine. As an open-source visualisation framework capable of integrating health assessment, EHDViz aims to be a valuable toolkit for rapid design, development and implementation of scalable clinical data visualisation dashboards. PMID:27013597

  8. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  9. The open source, object- and process oriented simulation system OpenGeoSys - concepts, development, community

    NASA Astrophysics Data System (ADS)

    Bauer, S.; Li, D.; Beyer, C.; Wang, W.; Bilke, L.; Graupner, B.

    2011-12-01

    Many geoscientific problems, such as underground waste disposal, nuclear waste disposal, CO2 sequestration, and geothermal energy, require a numerical simulation system for the prediction of ongoing processes as well as for risk and safety assessment. The governing processes are thermal heat transfer (T), hydraulic flow in multi-phase systems (H), mechanical deformation (M) and geochemical reactions (C), which interact in a complex way (THMC). The development of suitable simulation systems requires a large amount of effort for code development, verification and applications. OpenGeoSys (OGS) is an open source scientific initiative for the simulation of these THMC processes in porous media. A flexible numerical framework based on the Finite Element Method is provided and applied to the governing process equations. Due to the object- and process-oriented character of the code, functionality enhancement and code coupling with external simulators can be performed reasonably effectively. This structure also allows for distributed development, with developers at different locations contributing to the common code. The code is platform independent, accessible via the internet for development and application, and checked regularly by an automated benchmarking system.

  10. Collaboration using open standards and open source software (examples of DIAS/CEOS Water Portal)

    NASA Astrophysics Data System (ADS)

    Miura, S.; Sekioka, S.; Kuroiwa, K.; Kudo, Y.

    2015-12-01

    The DIAS/CEOS Water Portal is a part of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution to users including, but not limited to, scientists, decision makers and officers such as river administrators. One of the functions of this portal is to enable one-stop search of, and access to, a variety of water-related data archived at multiple data centers located all over the world. This portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on the fly and lets users download the data and view rendered images/plots. Our system mainly relies on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat) and on open standards such as OGC-CSW, OpenSearch and the OPeNDAP protocol to enable the above functions. Details on how it works will be introduced during the presentation. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.

  11. Towards the Implementation of an openEHR-based Open Source EHR Platform (a vision paper).

    PubMed

    Pazos Gutiérrez, Pablo

    2015-01-01

    Healthcare Information Systems are a big business. There is currently an explosion of EHR/EMR products available on the market, and the best tools are expensive. Many developing countries and healthcare providers cannot afford such tools, and for those who can, there is no clear strategy for the evolution, scaling, and cost of these electronic health products. The lack of standards-based implementations leads to the creation of isolated information silos that cannot be exploited (i.e. shared between providers to promote a holistic view of each patient's medical history). This paper presents the main elements behind a standards-based Open Source EHR Platform that is future-proof and can evolve and scale at minimal cost. The proposed EHR architecture is based on openEHR specifications, adding elements that emerged from research and development experiences, leading to a design that can be implemented in any modern technology. Different implementations will be interoperable by design. This platform will be of particular value in contexts of scarce resources, reusing clinical knowledge, a common set of software components, and services.

  12. Induced radioactivity of a GSO scintillator by secondary fragments in carbon ion therapy and its effects on in-beam OpenPET imaging.

    PubMed

    Hirano, Yoshiyuki; Nitta, Munetaka; Nishikido, Fumihiko; Yoshida, Eiji; Inadama, Naoko; Yamaya, Taiga

    2016-07-01

    The accumulation of induced radioactivity within in-beam PET scanner scintillators is of concern for long-term clinical usage in particle therapy. To estimate the effects on OpenPET, which we are developing for in-beam PET based on GSOZ (Zr-doped Gd2SiO5), we measured the induced radioactivity of GSO activated by secondary fragments during water phantom irradiation by a (12)C beam with an energy of 290 MeV u(-1). Radioisotopes of Na, Ce, Eu, Gd, Nd, Pm and Tb, including positron emitters, were observed in the gamma-ray spectra of the activated GSO with a high-purity Ge detector, and their absolute radioactivities were calculated. We used the Monte Carlo simulation platform Geant4, in which the observed radioactivity was assigned to the scintillators of a precisely reproduced OpenPET, and the single and coincidence rates immediately after one treatment and after one year of usage were estimated for the most severe conditions. Comparing the highest coincidence rate originating from the activated scintillators (background) and the expected coincidence rate from an imaging object (signal), we determined the expected signal-to-noise ratio to be more than 7 within 3 min and more than 10 within 1 min from the scan start time. We concluded that the effects of scintillator activation and its accumulation on OpenPET imaging were small and that clinical long-term usage of the OpenPET was feasible.

  16. A flexible open-source toolkit for lava flow simulations

    NASA Astrophysics Data System (ADS)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long-term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source and written in Python, which allows users to adapt the code to their needs, it also allows users to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001), which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows the inclusion of a corrective factor so that the lava can overcome small topographic obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function, or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The
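
    A minimal numpy sketch of the probabilistic steepest-slope idea described above (one stochastic path over a DEM, with a corrective height so the flow can overcome small pits); it is an illustration of the general algorithm, not the toolkit's actual code.

        # Illustrative stochastic steepest-slope routing on a DEM (not the toolkit's code).
        import numpy as np

        def lava_path(dem, start, hc=2.0, max_steps=500, rng=None):
            """Return a list of (row, col) cells visited by one stochastic lava path.
            hc is a corrective height added to the current cell so the flow can
            overcome small obstacles or pits."""
            rng = np.random.default_rng() if rng is None else rng
            nrows, ncols = dem.shape
            path = [start]
            r, c = start
            for _ in range(max_steps):
                neighbors, drops = [], []
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        if dr == 0 and dc == 0:
                            continue
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < nrows and 0 <= cc < ncols:
                            drop = (dem[r, c] + hc) - dem[rr, cc]
                            if drop > 0:
                                neighbors.append((rr, cc))
                                drops.append(drop)
                if not neighbors:
                    break  # flow is trapped
                p = np.asarray(drops) / np.sum(drops)   # steeper drops are more likely
                r, c = neighbors[rng.choice(len(neighbors), p=p)]
                path.append((r, c))
            return path

        if __name__ == "__main__":
            y, x = np.mgrid[0:50, 0:50]
            dem = 100.0 - 0.5 * y + np.random.default_rng(0).normal(0, 0.2, (50, 50))
            print(len(lava_path(dem, (0, 25))), "cells visited")

    Running many such paths from the same vent and accumulating the visited cells yields the kind of probabilistic inundation map the toolkit produces.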

  17. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
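
    As an illustration of the general problem (not OpenCFU's own algorithm), a short OpenCV sketch that counts roughly circular objects in an image with the Hough circle transform; the file name and parameter values are indicative and would need tuning for real colony plates.

        # Generic circular-object counting with OpenCV; not OpenCFU's algorithm.
        import cv2

        def count_circles(image_path, min_radius=5, max_radius=50):
            gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                raise FileNotFoundError(image_path)
            blurred = cv2.medianBlur(gray, 5)          # suppress speckle noise
            circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                                       minDist=2 * min_radius,
                                       param1=100, param2=30,
                                       minRadius=min_radius, maxRadius=max_radius)
            return 0 if circles is None else circles.shape[1]

        if __name__ == "__main__":
            print(count_circles("plate.png"), "colony-like objects detected")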

  18. High-resolution medical imaging system for 3D imaging of radioactive sources with 1-mm FWHM spatial resolution

    NASA Astrophysics Data System (ADS)

    Smither, Robert K.

    2003-06-01

    This paper describes a modification of a new imaging system developed at Argonne National Laboratory that has the potential of achieving a spatial resolution of 1 mm FWHM. The imaging system uses a crystal diffraction lens to focus gamma rays from the radioactive source. The medical imaging application of this system would be to detect small amounts of radioactivity in the human body that would be associated with cancer. The best spatial resolution obtained with the present lens at the time of the presentation made at the Medical Imaging Symposium 2001 was 6.7 mm FWHM for a 1-mm-diameter source. Since then it has been possible to improve the spatial resolution of the lens system to 3 mm FWHM. Experiments with the original lens system have led to a new design for a lens system that could have a spatial resolution of 1 mm FWHM. This is accomplished by, first, reducing the radial dimension of the crystals and, second, replacing the small individual crystals with bent strips of single-crystal material. Experiments are under way to test this approach.

  19. An open-source chemical kinetics network: VULCAN

    NASA Astrophysics Data System (ADS)

    Tsai, Shang-Min; Lyons, James; Heng, Kevin

    2015-12-01

    I will present VULCAN, an open-source 1D chemical kinetics code suited for the temperature and pressure range relevant to observable exoplanet atmospheres. The chemical network is based on a set of reduced rate coefficients for C-H-O systems. Most of the rate coefficients are based on the NIST online database and validated by comparison with thermodynamic equilibrium codes (TEA, STANJAN). The difference between the experimental rates and those from the thermodynamic data is carefully examined and discussed. For the numerical method, a simple, fast, semi-implicit Euler integrator is adopted to solve the stiff chemical reactions, within an operator-splitting scheme for computational efficiency. Several test runs of VULCAN are shown in a hierarchical way: pure H, H+O, H+O+C, including controlled experiments performed with simple analytical temperature-pressure profiles, so that different parameters, such as the stellar irradiation, atmospheric opacities and albedo, can be individually explored to understand how these properties affect the temperature structure and hence the chemical abundances. I will also revisit the "transport-induced quenching" effects and discuss the limitations of this approximation and its impact on observations. Finally, I will discuss the effects of the C/O ratio and compare with published work in the literature. VULCAN is written in Python and is part of the publicly available set of community tools we call the Exoclimes Simulation Platform (ESP; www.exoclime.org). I am a Ph.D. student of Kevin Heng at the University of Bern, Switzerland.
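
    The semi-implicit Euler idea can be summarised in a few lines: linearise the stiff production/loss term about the current state and solve one small linear system per step. The sketch below applies it to a toy two-species network; it illustrates the integrator class VULCAN uses, not VULCAN's actual code.

        # Toy semi-implicit (linearised backward) Euler for a stiff 2-species system:
        # fast interconversion A <-> B.  Not VULCAN's code, just the integrator idea.
        import numpy as np

        K1, K2 = 1.0e3, 1.0   # forward/backward rate constants (stiff: K1 >> K2)

        def f(y):
            a, b = y
            return np.array([-K1 * a + K2 * b,
                              K1 * a - K2 * b])

        def jacobian(y):
            return np.array([[-K1,  K2],
                             [ K1, -K2]])

        def semi_implicit_step(y, dt):
            # (I - dt*J) * dy = dt * f(y);  y_new = y + dy
            lhs = np.eye(len(y)) - dt * jacobian(y)
            dy = np.linalg.solve(lhs, dt * f(y))
            return y + dy

        if __name__ == "__main__":
            y = np.array([1.0, 0.0])
            for _ in range(100):
                y = semi_implicit_step(y, dt=0.1)   # dt far larger than 1/K1, yet stable
            print("steady state:", y, "expected ratio b/a =", K1 / K2)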

  20. Acquire: an open-source comprehensive cancer biobanking system

    PubMed Central

    Dowst, Heidi; Pew, Benjamin; Watkins, Chris; McOwiti, Apollo; Barney, Jonathan; Qu, Shijing; Becnel, Lauren B.

    2015-01-01

    Motivation: The probability of effective treatment of cancer with a targeted therapeutic can be improved for patients with defined genotypes containing actionable mutations. To this end, many human cancer biobanks are integrating more tightly with genomic sequencing facilities and with those creating and maintaining patient-derived xenografts (PDX) and cell lines to provide renewable resources for translational research. Results: To support the complex data management needs and workflows of several such biobanks, we developed Acquire. It is a robust, secure, web-based, database-backed open-source system that supports all major needs of a modern cancer biobank. Its modules allow for i) up-to-the-minute ‘scoreboard’ and graphical reporting of collections; ii) end user roles and permissions; iii) specimen inventory through caTissue Suite; iv) shipping forms for distribution of specimens to pathology, genomic analysis and PDX/cell line creation facilities; v) robust ad hoc querying; vi) molecular and cellular quality control metrics to track specimens’ progress and quality; vii) public researcher requests; viii) resource allocation committee distribution request review and oversight; and ix) linkage to available derivatives of specimens. Availability and Implementation: Acquire implements standard controlled vocabularies, ontologies and objects from the NCI, CDISC and others. Here we describe the functionality of the system, its technological stack and the processes it supports. A test version of Acquire is available at https://tcrbacquire-stg.research.bcm.edu; software is available at https://github.com/BCM-DLDCC/Acquire; and UML models, data and workflow diagrams, behavioral specifications and other documents are available at https://github.com/BCM-DLDCC/Acquire/tree/master/supplementaryMaterials. Contact: becnel@bcm.edu

  1. Looking toward the Future: A Case Study of Open Source Software in the Humanities

    ERIC Educational Resources Information Center

    Quamen, Harvey

    2006-01-01

    In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…

  2. You Can Deliver the Goods Better, Faster Cheaper with Open Source Databases

    ERIC Educational Resources Information Center

    Banerjee, Kyle

    2005-01-01

    Most people don't realize how easy it is to improve library services with open source databases. Despite the fact that they are distributed free of charge, open source databases are powerful and stable enough for critical applications. With a little knowledge of HTML and some basic technical skills, you can quickly create databases to register…

  3. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Software Public Meeting AGENCY: Defense Acquisition Regulations System, Department of Defense (DoD). ACTION... regarding the use of open source software in DoD contracts. DATES: Public Meeting: January 12, 2012, from 10... for the discussions in the meeting. Please cite ``Public Meeting, DFARS--Open Source Software'' in...

  4. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)

  5. An Evaluation of Open Source Learning Management Systems According to Learners Tools

    ERIC Educational Resources Information Center

    Uzunboylu, Huseyin; Ozdamli, Fezile; Ozcinar, Zehra

    2006-01-01

    Learning Management System (LMS) is the main element of internet-based education. In parallel to this, studies in this area are increasing. The aim of this research is to evaluate the current existing Open Source Learning Management Systems in the market. For this, seventy-two Open Source Learning Management Systems have been subjected to a…

  6. The Implications of Incumbent Intellectual Property Strategies for Open Source Software Success and Commercialization

    ERIC Educational Resources Information Center

    Wen, Wen

    2012-01-01

    While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…

  7. Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library

    ERIC Educational Resources Information Center

    Fagan, Jody Condit; Keach, Jennifer A.

    2010-01-01

    When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…

  8. Perceptions of Open Source versus Commercial Software: Is Higher Education Still on the Fence?

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2007-01-01

    This exploratory study investigated the perceptions of technology and academic decision-makers about open source benefits and risks versus commercial software applications. The study also explored reactions to a concept for outsourcing campus-wide deployment and maintenance of open source. Data collected from telephone interviews were analyzed,…

  9. Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software

    ERIC Educational Resources Information Center

    Hemphill, Thomas A.

    2005-01-01

    This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…

  10. Open-Source Learning Management Systems: A Predictive Model for Higher Education

    ERIC Educational Resources Information Center

    van Rooij, S. Williams

    2012-01-01

    The present study investigated the role of pedagogical, technical, and institutional profile factors in an institution of higher education's decision to select an open-source learning management system (LMS). Drawing on the results of previous research that measured patterns of deployment of open-source software (OSS) in US higher education and…

  11. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

    Distance education is becoming more important in the universities and schools. The aim of this research is to evaluate the current existing Open Source Learning Management Systems according to Administration tool and Curriculum Design. For this, seventy two Open Source Learning Management Systems have been subjected to a general evaluation. After…

  12. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    ERIC Educational Resources Information Center

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  13. Implementing an Open Source Learning Management System: A Critical Analysis of Change Strategies

    ERIC Educational Resources Information Center

    Uys, Philip M.

    2010-01-01

    This paper analyses the change and innovation strategies that Charles Sturt University (CSU) used from 2007 to 2009 during the implementation and mainstreaming of an open source learning management system (LMS), Sakai, named locally as "CSU Interact". CSU was in January 2008 the first Australian University to implement an open source learning…

  14. Preparation and characterization of cesium-137 aluminosilicate pellets for radioactive source applications

    SciTech Connect

    Schultz, F.J.; Tompkins, J.A.; Haff, K.W.; Case, F.N.

    1981-07-01

    Twenty-seven fully loaded 137Cs aluminosilicate pellets were fabricated in a hot cell by the vacuum hot pressing of a cesium carbonate/montmorillonite clay mixture at 1500 °C and 570 psig. Four pellets were selected for characterization studies which included calorimetric measurements, metallography, scanning electron microscope and electron backscattering (SEM-BSE), electron microprobe, x-ray diffraction, and cesium ion leachability measurements. Each test pellet contained 437 to 450 curies of 137Cs as determined by calorimetric measurements. Metallographic examinations revealed a two-phase system: a primary, granular, gray matrix phase containing large and small pores and small pore agglomerations, and a secondary fused phase interspersed throughout the gray matrix. SEM-BSE analyses showed that cesium and silicon were uniformly distributed throughout both phases of the pellet. This indicated that the cesium-silicon-clay reaction went to completion. Aluminum homogeneity was unconfirmed due to the high background noise associated with the inherent radioactivity of the test specimens. X-ray diffraction analyses of both radioactive and non-radioactive aluminosilicate pellets confirmed the crystal lattice structure to be pollucite. Cesium ion quasistatic leachability measurements determined the leach rates of fully loaded 137Cs sectioned pollucite pellets to date to be 4.61 to 34.4 x 10^-10 kg m^-2 s^-1, while static leach tests performed on unsectioned fully loaded pellets showed the leach rates of the cesium ion to date to be 2.25 to 3.41 x 10^-12 kg m^-2 s^-1. The cesium ion diffusion coefficients through the pollucite pellet were calculated using Fick's first and second laws of diffusion. The diffusion coefficients calculated for three tracer-level 137Cs aluminosilicate pellets were 1.29 x 10^-16 m^2 s^-1, 6.88 x 10^-17 m^2 s^-1, and 1.35 x 10^-17 m^2 s^-1, respectively.

  15. OpenSpace: An Open-Source Framework for Data Visualization and Contextualization

    NASA Astrophysics Data System (ADS)

    Bock, A.; Pembroke, A. D.; Mays, M. L.; Emmart, C. B.; Ynnerman, A.

    2015-12-01

    We present an open-source software development effort called OpenSpace that is tailored for the dissemination of space-related data visualization. In the current stages of the project, we have focused on the public dissemination of space missions (Rosetta and New Horizons) as well as the support of space weather forecasting. The presented work will focus on the latter of these foci and elaborate on the efforts that have gone into developing a system that allows the user to assess the accuracy and validity of ENLIL ensemble simulations. It becomes possible to compare the results of ENLIL CME simulations with STEREO and SOHO images using an optical flow algorithm. This allows the user to compare velocities in the volumetric rendering of ENLIL data with the movement of CMEs through the fields of view of various instruments onboard the spacecraft. By giving the user access to these comparisons, new information about the time evolution of CMEs through the interplanetary medium becomes available. Additionally, contextualizing this information in a three-dimensional rendering scene allows both the analyst and the public to explore the data. This dissemination is further improved by the ability to connect multiple instances of the software and thus reach a broader audience. In a second step, we plan to combine the two foci of the project to enable the visualization of the SWAP instrument onboard New Horizons in context with a far-reaching ENLIL simulation, thus providing additional information about the solar wind dynamics of the outer solar system. The initial work regarding this plan will be presented.
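
    A minimal sketch of the optical-flow comparison mentioned above, using OpenCV's Farnebäck dense optical flow on two consecutive frames; the file names and parameter values are placeholders, and this only illustrates the technique, not OpenSpace's implementation.

        # Dense optical flow between two consecutive frames (illustrative; file names
        # are placeholders, not actual STEREO/SOHO products).
        import cv2
        import numpy as np

        prev = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
        curr = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)
        if prev is None or curr is None:
            raise FileNotFoundError("input frames not found")

        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            pyr_scale=0.5, levels=3, winsize=15,
                                            iterations=3, poly_n=5, poly_sigma=1.2,
                                            flags=0)
        speed = np.hypot(flow[..., 0], flow[..., 1])   # pixel displacement per frame
        print("median apparent motion: %.2f px/frame" % np.median(speed))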

  16. Application of polar orbiter products in weather forecasting using open source tools and open standards

    NASA Astrophysics Data System (ADS)

    Plieger, Maarten; de Vreede, Ernst

    2015-04-01

    EUMETSAT disseminates data from a number of polar satellites. At KNMI these data are not fully used for operational weather forecasting, mainly because of the irregular coverage and a lack of tools for handling these different types of data and products. For weather forecasting there is a lot of interest in the application of products from these polar orbiters. One of the key aspects is the high resolution of these products, which can complement the information provided by numerical weather forecasts. Another advantage over geostationary satellites is the high coverage at higher latitudes and the lack of parallax. Products like the VIIRS day-night band offer many possibilities for this application. This presentation will describe a project that aims to make available a number of products from polar satellites to the forecasting operation. The goal of the project is to enable easy and timely access to polar orbiter products and to enable combined presentations of satellite imagery with model data. The system will be able to generate RGB composites ("false colour images") for operational use. The system will be built using open source components and open standards. Pytroll components are used for data handling, reprojection and derived product generation. For interactive presentation of imagery the browser-based ADAGUC WMS viewer component is used. Image generation is done by ADAGUC server components, which provide OGC WMS services. Polar satellite products are stored as true color RGBA data in the NetCDF file format; the satellite swaths are stored as regular grids with their own custom geographical projection. The ADAGUC WMS system is able to reproject, render and combine these data interactively in a web browser. Results and lessons learned will be presented at the conference.

  17. Estimation of the radioactive source dispersion from Fukushima nuclear power plant accident.

    PubMed

    Schöppner, Michael; Plastino, Wolfango; Povinec, Pavel; Nikkinen, Mika; Ruggieri, Federico; Bella, Francesco

    2013-11-01

    Following the Fukushima nuclear power plant accident, detections of (133)Xe were made at various locations. Using the results of these remote measurements, the Fukushima (133)Xe source term has been reconstructed and compared with previously reconstructed (137)Cs and (131)I source terms. The reconstruction is accomplished by applying atmospheric transport modeling and an adapted least-square-error method. The obtained results are in agreement with previous estimations of the Fukushima radionuclide source, and also serve as a proof of principle for source term reconstruction based on atmospheric transport modeling.
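
    The least-square-error reconstruction can be illustrated in a few lines: if a transport model gives a source-receptor sensitivity matrix M (concentration at each detection per unit emission in each time interval), the time-resolved source term q is recovered from the observations y by non-negative least squares. The matrix and observations below are synthetic placeholders, not the study's data.

        # Toy source-term reconstruction: y ~= M @ q with q >= 0 (synthetic numbers).
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)
        n_obs, n_intervals = 30, 6
        M = rng.uniform(0.0, 1.0, size=(n_obs, n_intervals))   # transport sensitivities
        q_true = np.array([0.0, 5.0, 20.0, 8.0, 2.0, 0.0])     # "true" emissions
        y = M @ q_true + rng.normal(0.0, 0.5, size=n_obs)      # noisy detections

        q_est, residual = nnls(M, y)
        print("estimated source term:", np.round(q_est, 2))
        print("residual norm:", round(residual, 3))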

  18. Chromium Renderserver: Scalable and Open Source Remote Rendering Infrastructure

    SciTech Connect

    Paul, Brian; Ahern, Sean; Bethel, E. Wes; Brugger, Eric; Cook,Rich; Daniel, Jamison; Lewis, Ken; Owen, Jens; Southard, Dale

    2007-12-01

    Chromium Renderserver (CRRS) is software infrastructure that provides the ability for one or more users to run and view image output from unmodified, interactive OpenGL and X11 applications on a remote, parallel computational platform equipped with graphics hardware accelerators via industry-standard Layer 7 network protocols and client viewers. The new contributions of this work include a solution to the problem of synchronizing X11 and OpenGL command streams, remote delivery of parallel hardware-accelerated rendering, and a performance analysis of several different optimizations that are generally applicable to a variety of rendering architectures. CRRS is fully operational, Open Source software.

  20. OpenMebius: an open source software for isotopically nonstationary 13C-based metabolic flux analysis.

    PubMed

    Kajihata, Shuichi; Furusawa, Chikara; Matsuda, Fumio; Shimizu, Hiroshi

    2014-01-01

    The in vivo measurement of metabolic flux by (13)C-based metabolic flux analysis ((13)C-MFA) provides valuable information regarding cell physiology. Bioinformatics tools have been developed to estimate metabolic flux distributions from the results of tracer isotopic labeling experiments using a (13)C-labeled carbon source. Metabolic flux is determined by nonlinear fitting of a metabolic model to the isotopic labeling enrichment of intracellular metabolites measured by mass spectrometry. Whereas (13)C-MFA is conventionally performed under isotopically constant conditions, isotopically nonstationary (13)C metabolic flux analysis (INST-(13)C-MFA) has recently been developed for flux analysis of cells with photosynthetic activity and cells at a quasi-steady metabolic state (e.g., primary cells or microorganisms under stationary phase). Here, the development of a novel open source software for INST-(13)C-MFA on the Windows platform is reported. OpenMebius (Open source software for Metabolic flux analysis) provides the function of autogenerating metabolic models for simulating isotopic labeling enrichment from a user-defined configuration worksheet. Analysis using simulated data demonstrated the applicability of OpenMebius for INST-(13)C-MFA. Confidence intervals determined by INST-(13)C-MFA were less than those determined by conventional methods, indicating the potential of INST-(13)C-MFA for precise metabolic flux analysis. OpenMebius is the open source software for the general application of INST-(13)C-MFA.
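
    A minimal sketch of the fitting idea behind INST-(13)C-MFA: a toy one-pool model in which the labeling enrichment relaxes toward the inlet enrichment at a rate set by the flux-to-pool-size ratio v/V, with the flux estimated by nonlinear least squares against simulated measurements. This illustrates the general approach only, not OpenMebius itself (a Windows application); pool size, time points and noise level are invented.

        # Toy isotopically nonstationary labeling model: one metabolite pool fed by a
        # fully labeled substrate; estimate the flux v by nonlinear least squares.
        import numpy as np
        from scipy.optimize import least_squares

        V = 2.0          # pool size (umol), assumed known
        X_IN = 1.0       # labeling enrichment of the inlet (fully 13C-labeled)

        def enrichment(t, v):
            # analytical solution of dx/dt = (v/V) * (X_IN - x), x(0) = 0
            return X_IN * (1.0 - np.exp(-(v / V) * t))

        # synthetic "measured" enrichments for a true flux of 0.8 umol/min
        t_meas = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        x_meas = enrichment(t_meas, 0.8) + np.random.default_rng(0).normal(0, 0.01, t_meas.size)

        fit = least_squares(lambda p: enrichment(t_meas, p[0]) - x_meas,
                            x0=[0.1], bounds=(0, np.inf))
        print("estimated flux: %.3f umol/min" % fit.x[0])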

  1. Ancient Secrets of Open-Source Geoscience Software Management (Invited)

    NASA Astrophysics Data System (ADS)

    Zender, C. S.

    2009-12-01

    Geoscience research often involves complex models and data analysis designed to test increasingly sophisticated theories. Re-use and improvement of existing models and tools can be more efficient than their re-invention, and this re-use can accelerate knowledge generation and discovery. Open Source Software (OSS) is designed and intended to be re-used, extended, and improved. Hence Earth and Space Science Models (ESSMs) intended for community use are commonly distributed with OSS or OSS-like licenses. Why is it that, despite their permissive licenses, only a relatively small fraction of ESSMs receive community adoption, improvement, and extension? One reason is that developing community geoscience software remains a difficult and perilous exercise for the practicing researcher. This presentation will intercompare the rationale and results of different software management approaches taken in my dozen years as a developer and maintainer of, and participant in, four distinct ESSMs with 10 to 10,000 users. The primary lesson learned is that geoscience research is similar to the wider OSS universe in that most participants are motivated by the desire for greater professional recognition and attribution best summarized as "mindshare". ESSM adoption often hinges on whether the tension between users and developers for mindshare manifests as cooperation or competition. ESSM project management, therefore, should promote (but not require) recognition of all contributors. More practical model management practices include mailing lists, highly visible documentation, consistent APIs, regression tests, and periodic releases to improve features and fix bugs and builds. However, most ESSMs originate as working incarnations of short-term (~three year) research projects and, as such, lack permanent institutional support. Adhering to best software practices to transition these ESSMs from personal to community models often requires sacrificing research time. Recently, funding agencies

  2. THOR: an open-source exo-GCM

    NASA Astrophysics Data System (ADS)

    Grosheintz, Luc; Mendonça, João; Käppeli, Roger; Lukas Grimm, Simon; Mishra, Siddhartha; Heng, Kevin

    2015-12-01

    implicit GCM. By ESS3, I hope to present results for the advection equation. THOR is part of the Exoclimes Simulation Platform (ESP), a set of open-source community codes for simulating and understanding the atmospheres of exoplanets. The ESP also includes tools for radiative transfer and retrieval (HELIOS), an opacity calculator (HELIOS-K), and a chemical kinetics solver (VULCAN). We expect to publicly release an initial version of THOR in 2016 on www.exoclime.org.

  3. Open Source assimilation tool for distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Richard, Julien; Giangola-Murzyn, Agathe; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2013-04-01

    An advanced GIS data assimilation interface is a prerequisite for a distributed hydrological model that is both transportable from catchment to catchment and easily adaptable to data resolution. This tool handles the cartographic data as well as the linked information data. In the case of the Multi-Hydro-Version2 model (A. Giangola-Murzyn et al. 2012), several types of information are distributed on a regular grid. The grid cell size has to be chosen by the user and each cell has to be filled with information. In order to be as realistic as possible, the Multi-Hydro model takes several kinds of data into account. For that, the assimilation tool (MH-AssimTool) has to be able to import all these different kinds of information. The required flexibility of the studied area and grid size means that the GIS interface must be both easy to handle and practical. The solution of a main window for geographical visualisation and hierarchical menus coupled with checkboxes was chosen. For example, geographical information such as the topography or the land use can be visualized in the main window. For the other data, such as the soil conductivity, the geology or the initial moisture, the information is requested through several pop-up windows. Once the needed information is imported, MH-AssimTool prepares the data automatically. For the topography data conversion, if the resolution is too low, an interpolation is done during the processing. As a result, all the converted data are at a suitable resolution for the modelling. Like Multi-Hydro, MH-AssimTool is open source. It is coded in Visual Basic coupled with a GIS library. The interface is built in such a way that it can be used by a non-specialist. We will illustrate the efficiency of the tool with some case studies of peri-urban catchments of widely different sizes and characteristics. We will also explain some parts of the coding of the interface.
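
    A small Python sketch (illustrative only; MH-AssimTool itself is written in Visual Basic) of the kind of regridding step described above: scattered point observations are interpolated onto the model's regular grid with scipy. The coordinates, values and cell size are invented.

        # Interpolating scattered point data onto a regular model grid (illustrative).
        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(0)
        points = rng.uniform(0.0, 1000.0, size=(200, 2))               # x, y of observations (m)
        values = 50.0 + 0.01 * points[:, 0] + rng.normal(0, 1, 200)    # e.g. elevation samples

        cell = 25.0                                   # grid cell size chosen by the user
        xi = np.arange(0.0, 1000.0 + cell, cell)
        yi = np.arange(0.0, 1000.0 + cell, cell)
        gx, gy = np.meshgrid(xi, yi)

        grid = griddata(points, values, (gx, gy), method="linear")
        print("gridded field shape:", grid.shape)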

  4. Open science: Investigating precipitation cycles in dynamically downscaled data using openly available radar data and open source software

    NASA Astrophysics Data System (ADS)

    Collis, Scott; helmus, Jonathan; Kotamarthi, Rao; Wang, Jiali; Feng, Yan; Ghate, Virendra

    2016-04-01

    In order to assess infrastructure resilience to climate change in urban centers, climate model output is needed at the spatial resolutions required for urban planning. At present this is most commonly achieved using either empirical or dynamic downscaling. The utility of these downscaling methods for assessments depends on having estimates of the biases in the models' estimates of climate variables and their extremes, surface temperature and precipitation for example, developed using historical data sets. Since precipitation is a multi-scale stochastic process, direct comparison with observations is challenging, and even modern data sets work at scales too coarse to capture extreme events. Gauge data require a direct hit by a storm to see the highest rain rates, often leading to an underestimation of the 1-100 year rainfall. This is exacerbated by phenomena such as training that can cause very high gradients in accumulation. This presentation details a long-term (multi-year) study of precipitation derived from open data from the NOAA Next-Generation Radar (NEXRAD) network. Two locations are studied: Portland, Maine, the location of a pilot study conducted by the US Department of Homeland Security on regional resilience to climate change, and the Southern Great Plains of Oklahoma, home to the Department of Energy's ARM program. Both are located within 40 km of a NEXRAD radar, allowing retrievals of rainfall rates on the order of one kilometer using the Python-ARM Radar Toolkit (Py-ART). Both the diurnal and the seasonal cycle of precipitation are studied and compared to WRF dynamically downscaled precipitation rates. This project makes heavy use of open source community tools such as Project Jupyter and the scientific Python ecosystem to manage and process tens of TB of data on midrange cluster infrastructure. Both the meteorological aspects and the data infrastructure and architecture will be discussed.
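
    A short pandas sketch of the diurnal-cycle calculation described above, applied to a synthetic hourly series of rain rates; the variable names and data are placeholders, not actual NEXRAD retrievals.

        # Diurnal cycle of rain rate from an hourly time series (synthetic example).
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        times = pd.date_range("2014-01-01", "2015-12-31 23:00", freq="H")
        # synthetic rain rate with an afternoon peak plus noise (mm/h)
        hour = times.hour.to_numpy()
        rain = np.clip(rng.exponential(0.3, times.size) +
                       0.5 * np.exp(-((hour - 16) ** 2) / 8.0), 0, None)

        series = pd.Series(rain, index=times, name="rain_rate_mm_h")
        diurnal = series.groupby(series.index.hour).mean()   # mean rate per hour of day
        print(diurnal.round(2))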

  5. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    NASA Astrophysics Data System (ADS)

    Zhang, S. Y.; Shen, G. H.; Sun, Y.; Zhou, D. Z.; Zhang, X. X.; Li, J. W.; Huang, C.; Zhang, X. G.; Dong, Y. J.; Zhang, W. J.; Zhang, B. Q.; Shi, C. Y.

    2016-05-01

    Space-based electron detectors are commonly tested using radioactive β-sources, which emit a continuous spectrum without spectral lines. Therefore, the tests are often considered only qualitative. This paper introduces a method which results in more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference 90Sr/90Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.
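
    The inversion step can be illustrated with a small synthetic example: if the simulated instrument response matrix R maps an incident spectrum f to measured channel counts m = R f, a regularised least-squares solve recovers an estimate of f from m. The response matrix below is a toy placeholder, not the ODPD's Geant4-derived response.

        # Toy spectrum unfolding with a simulated response matrix (synthetic numbers).
        import numpy as np

        rng = np.random.default_rng(2)
        n_chan, n_bins = 40, 20

        # toy response: each incident energy bin smears into nearby measured channels
        R = np.zeros((n_chan, n_bins))
        for j in range(n_bins):
            center = 2 * j
            for i in range(n_chan):
                R[i, j] = np.exp(-0.5 * ((i - center) / 2.0) ** 2)

        f_true = np.exp(-np.arange(n_bins) / 6.0)        # "true" incident spectrum
        m = R @ f_true + rng.normal(0.0, 0.02, n_chan)   # noisy measured spectrum

        alpha = 1e-2                                     # Tikhonov regularisation weight
        f_est = np.linalg.solve(R.T @ R + alpha * np.eye(n_bins), R.T @ m)
        print("max relative error: %.2f" % np.max(np.abs(f_est - f_true) / f_true.max()))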

  6. Source term identification of environmental radioactive Pu/U particles by their characterization with non-destructive spectrochemical analytical techniques

    NASA Astrophysics Data System (ADS)

    Eriksson, M.; Osán, J.; Jernström, J.; Wegrzynek, D.; Simon, R.; Chinea-Cano, E.; Markowicz, A.; Bamford, S.; Tamborini, G.; Török, S.; Falkenberg, G.; Alsecz, A.; Dahlgaard, H.; Wobrauschek, P.; Streli, C.; Zoeger, N.; Betti, M.

    2005-04-01

    Six radioactive particles from the Thule area (NW Greenland) were investigated by gamma-ray and L X-ray spectrometry based on radioactive disintegration, scanning electron microscopy coupled with energy-dispersive and wavelength-dispersive X-ray spectrometers, and synchrotron-radiation-based techniques such as microscopic X-ray fluorescence, microscopic X-ray absorption near-edge structure (μ-XANES), and combined X-ray absorption and fluorescence microtomography. Additionally, one particle from Mururoa atoll was examined by microtomography. From the results obtained, it was found that U and Pu were mixed in the particles. The U/Pu intensity ratios in the Thule particles varied between 0.05 and 0.36. The results from the microtomography showed that the U/Pu ratio was not homogeneously distributed. The 241Am/(238+239+240)Pu activity ratios varied between 0.13 and 0.17, indicating that the particles originate from different source terms. The oxidation states of U and Pu as determined by μ-XANES showed that U(IV) is the preponderant species and that, for Pu, two types of particles could be evidenced. One set had about 90% Pu(IV), while in the other the ratio Pu(IV)/Pu(VI) was about one third.

  7. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR software provide a better basis on ranking scores than the other open-source EMR software packages.
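
    A compact numpy sketch of the TOPSIS ranking step used in such studies (AHP would supply the criterion weights); the decision matrix, weights and criteria below are invented for illustration, not the paper's evaluation data.

        # TOPSIS ranking of alternatives (illustrative data; weights would come from AHP).
        import numpy as np

        def topsis(matrix, weights, benefit):
            """matrix: alternatives x criteria; benefit[j] True if higher is better."""
            norm = matrix / np.linalg.norm(matrix, axis=0)        # vector normalisation
            v = norm * weights                                     # weighted normalised matrix
            ideal_best = np.where(benefit, v.max(axis=0), v.min(axis=0))
            ideal_worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_best = np.linalg.norm(v - ideal_best, axis=1)
            d_worst = np.linalg.norm(v - ideal_worst, axis=1)
            return d_worst / (d_best + d_worst)                    # closeness coefficient

        scores = topsis(
            matrix=np.array([[7.0, 9.0, 6.0],       # e.g. usability, features, support
                             [8.0, 6.0, 7.0],
                             [5.0, 7.0, 9.0]]),
            weights=np.array([0.5, 0.3, 0.2]),      # e.g. AHP-derived weights
            benefit=np.array([True, True, True]),
        )
        print("closeness coefficients:", np.round(scores, 3))

    Alternatives are then ranked by descending closeness coefficient.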

  9. What's mine is yours-open source as a new paradigm for sustainable healthcare education.

    PubMed

    Ellaway, Rachel; Martin, Ross D

    2008-01-01

    Free and open access to information, and increasingly to digital content and tools, is one of the defining characteristics of the Internet, and as such it presents a challenge to traditional models of development and provision of educational materials and activities. Open source is a particular way of giving access to materials and processes in that the source material is available alongside the finished artifact, thereby allowing subsequent adaptation and redevelopment by anyone wishing to undertake the work. Open source is now being developed as a concept that can be applied in settings outside software development (Kelty 2005), and it is increasingly being linked to moral and ethical agendas about the nature of society itself (Lessig 2005). The open source movement also raises issues regarding authority, challenging the role of the expert voice. The imperative of open source and associated economic and social factors all point to an opportunity-rich area for both reflection and development. This paper explores the open source phenomenon and considers ways in which open source principles and ideas can benefit and extend the provision of a wide range of healthcare education services and activities.

  10. Radioactivity determination of sealed pure beta-sources by surface dose measurements and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Choi, Chang Heon; Jung, Seongmoon; Choi, Kanghyuk; Son, Kwang-Jae; Lee, Jun Sig; Ye, Sung-Joon

    2016-04-01

    This study aims to determine the activity of a sealed pure beta-source by measuring the surface dose rate using an extrapolation chamber. A conversion factor (cGy s^-1 Bq^-1), which was defined as the ratio of surface dose rate to activity, can be calculated by Monte Carlo simulations of the extrapolation chamber measurement. To validate this hypothesis the certified activities of two standard pure beta-sources of Sr/Y-90 and Si/P-32 were compared with those determined by this method. In addition, a sealed test source of Sr/Y-90 was manufactured by the HANARO reactor group of KAERI (Korea Atomic Energy Research Institute) and used to further validate this method. The measured surface dose rates of the Sr/Y-90 and Si/P-32 standard sources were 4.615×10^-5 cGy s^-1 and 2.259×10^-5 cGy s^-1, respectively. The calculated conversion factors of the two sources were 1.213×10^-8 cGy s^-1 Bq^-1 and 1.071×10^-8 cGy s^-1 Bq^-1, respectively. Therefore, the activity of the standard Sr/Y-90 source was determined to be 3.995 kBq, which was 2.0% less than the certified value (4.077 kBq). For Si/P-32 the determined activity was 2.102 kBq, which was 6.6% larger than the certified activity (1.971 kBq). The activity of the Sr/Y-90 test source was determined to be 4.166 kBq, while the apparent activity reported by KAERI was 5.803 kBq. This large difference might be due to evaporation and diffusion of the source liquid during preparation and uncertainty in the amount of the weighed aliquot of source liquid. The overall uncertainty involved in this method was determined to be 7.3%. We demonstrated that the activity of a sealed pure beta-source could be conveniently determined by the complementary combination of measuring the surface dose rate and Monte Carlo simulations.

  11. Investigation of aerial dispersion of radioactive dust from an open-pit uranium mine by passive vinyl collectors.

    PubMed

    Pettersson, H B; Koperski, J

    1991-05-01

    Detailed investigations of the aerial dispersion of radioactive dust from the biggest open-pit U mining and milling operation in Australia were carried out. Spatial distributions of the long-lived radionuclides of 238U series and their origin, i.e., mining and milling operations vs. natural background radiation, have been studied. Horizontal flux, dry deposition, and ground resuspension of the radionuclides were investigated along a 50-km transect in the direction of the prevailing monsoonal winds in the region. The study was performed by means of unconventional "sticky vinyl" passive dust collectors, occasionally supported by high-volume air filter samplers. The data from the flux measurements show an inverse square to inverse cubic dependence, and the dry deposition exhibits an inverse square dependence, of radionuclide load vs. distance. The pit has been the predominant contributor of long-lived U series radionuclides to the environment within the radius of several kilometers from the operations. An aerial dispersion computer code (LUCIFER), based on a Gaussian plume model, was developed for the project. Experimental data were used as the code input data. Good agreement between the measured data and the normalized computed results was obtained.
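
    For illustration, a minimal Gaussian plume calculation of the kind underlying a dispersion code such as LUCIFER; the dispersion coefficients, emission rate and release height below are placeholder values, not those used in the study.

        # Ground-level concentration from a continuous point source, Gaussian plume
        # (illustrative parameter values only).
        import numpy as np

        def plume_concentration(x, y, q=1.0, u=3.0, h=10.0, a=0.08, b=0.06):
            """x: downwind distance (m), y: crosswind offset (m), q: emission rate (g/s),
            u: wind speed (m/s), h: effective release height (m).
            sigma_y = a*x, sigma_z = b*x is a crude stability parameterisation."""
            sy, sz = a * x, b * x
            return (q / (2.0 * np.pi * u * sy * sz)
                    * np.exp(-y**2 / (2.0 * sy**2))
                    * 2.0 * np.exp(-h**2 / (2.0 * sz**2)))   # ground reflection at z = 0

        if __name__ == "__main__":
            for x in (100.0, 1000.0, 10000.0):
                print("x = %6.0f m  C = %.3e g/m^3" % (x, plume_concentration(x, 0.0)))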

  13. An Open-Source Approach for Catchment's Physiographic Characterization

    NASA Astrophysics Data System (ADS)

    Di Leo, M.; Di Stefano, M.

    2013-12-01

    A water catchment's hydrologic response is intimately linked to its morphological shape, which is a signature on the landscape of the particular climate conditions that generated the hydrographic basin over time. Furthermore, geomorphologic structures influence hydrologic regimes and land cover (vegetation). For these reasons, a basin's characterization is a fundamental element in hydrological studies. Physiographic descriptors have been extracted manually for a long time, but currently Geographic Information System (GIS) tools ease this task by offering hydrologists a powerful instrument to save time and improve the accuracy of results. Here we present a program combining the flexibility of the Python programming language with the reliability of GRASS GIS, which automatically performs the catchment's physiographic characterization. GRASS (Geographic Resources Analysis Support System) is a Free and Open Source GIS that today can look back on 30 years of successful development in geospatial data management and analysis, image processing, graphics and map production, spatial modeling and visualization. The recent development of new hydrologic tools, coupled with the tremendous boost in the existing flow routing algorithms, reduced the computational time and made GRASS a complete toolset for hydrological analysis, even for large datasets. The tool presented here is a module called r.basin, following GRASS' traditional nomenclature, where the "r" stands for "raster"; it is available for GRASS version 6.x and, more recently, for GRASS 7. As input it uses a Digital Elevation Model and the coordinates of the outlet; powered by the recently developed r.stream.* hydrological tools, it performs the flow calculation, delimits the basin's boundaries and extracts the drainage network, returning the flow direction and accumulation, distance-to-outlet and hillslope length maps. Based on those maps, it calculates hydrologically meaningful shape factors and
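
    To illustrate the raster flow computation such tools are built on, here is a tiny numpy sketch of D8 flow direction (each cell drains to its steepest downslope neighbour); it stands in for the r.stream.*/flow-routing machinery and is not the r.basin code itself.

        # Minimal D8 flow-direction grid: index (0-7) of the steepest downslope
        # neighbour for each interior cell, -1 where the cell is a pit (illustrative).
        import numpy as np

        OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]

        def d8_flow_direction(dem, cellsize=1.0):
            nrows, ncols = dem.shape
            fdir = np.full((nrows, ncols), -1, dtype=int)
            for r in range(1, nrows - 1):
                for c in range(1, ncols - 1):
                    best_slope, best_k = 0.0, -1
                    for k, (dr, dc) in enumerate(OFFSETS):
                        dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
                        slope = (dem[r, c] - dem[r + dr, c + dc]) / dist
                        if slope > best_slope:
                            best_slope, best_k = slope, k
                    fdir[r, c] = best_k
            return fdir

        if __name__ == "__main__":
            y, x = np.mgrid[0:20, 0:20]
            dem = 100.0 - 0.3 * x - 0.1 * y            # plane sloping east and south
            print(d8_flow_direction(dem)[5:8, 5:8])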

  15. KNIME for Open-Source Bioimage Analysis: A Tutorial.

    PubMed

    Dietz, Christian; Berthold, Michael R

    2016-01-01

    The open analytics platform KNIME is a modular environment that enables easy visual assembly and interactive execution of workflows. KNIME is already widely used in various areas of research, for instance in cheminformatics or classical data analysis. In this tutorial the KNIME Image Processing Extension is introduced, which adds the capabilities to process and analyse huge amounts of images. In combination with other KNIME extensions, KNIME Image Processing opens up new possibilities for inter-domain analysis of image data in an understandable and reproducible way.

  16. Sketching Up New Geographies: Open Sourcing and Curriculum Development

    ERIC Educational Resources Information Center

    Boyd, William; Ellis, David

    2013-01-01

    The functionality of web 2.0 technologies has caused academics to rethink their development of teaching and learning methods and approaches. The editable, open access nature of web 2.0 encourages the innovative collaboration of ideas, the creation of equitable visual and tactile learning environments, and opportunity for academics to develop…

  17. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data with a simple method such as HTTP directory listing for long-term preservation, while DARTS tries to provide rich web applications for ease of access with modern web technologies based on open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. NASA World Wind Java SDK is used for its development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs on web browsers. Flow is a tool to simulate the Field-Of-View of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS, and the license is the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool to integrate DARTS services.
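
    As a rough illustration of the OpenLayers/MapServer workflow described above, the sketch below issues a standard WMS 1.1.1 GetMap request with Python's requests library. The endpoint URL and layer name are hypothetical placeholders, not the actual DARTS/KADIAS service:

      # Minimal WMS GetMap request; endpoint and layer name are made up.
      import requests

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "kaguya_dem",        # hypothetical layer name
          "STYLES": "",
          "SRS": "EPSG:4326",
          "BBOX": "-180,-90,180,90",
          "WIDTH": "1024",
          "HEIGHT": "512",
          "FORMAT": "image/png",
      }
      r = requests.get("https://example.org/wms", params=params, timeout=30)
      with open("moon_map.png", "wb") as f:
          f.write(r.content)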

  18. Innovative methodology for intercomparison of radionuclide calibrators using short half-life in situ prepared radioactive sources

    SciTech Connect

    Oliveira, P. A.; Santos, J. A. M.

    2014-07-15

    Purpose: An original radionuclide calibrator method for activity determination is presented. The method could be used for intercomparison surveys for short half-life radioactive sources used in Nuclear Medicine, such as {sup 99m}Tc or most positron emission tomography radiopharmaceuticals. Methods: By evaluation of the resulting net optical density (netOD) using a standardized scanning method of irradiated Gafchromic XRQA2 film, a comparison of the netOD measurement with a previously determined calibration curve can be made and the difference between the tested radionuclide calibrator and a radionuclide calibrator used as reference device can be calculated. To estimate the total expected measurement uncertainties, a careful analysis of the methodology, for the case of {sup 99m}Tc, was performed: reproducibility determination, scanning conditions, and possible fadeout effects. Since every factor of the activity measurement procedure can influence the final result, the method also evaluates correct syringe positioning inside the radionuclide calibrator. Results: As an alternative to using a calibrated source sent to the surveyed site, which requires a relatively long half-life of the nuclide, or sending a portable calibrated radionuclide calibrator, the proposed method uses a source prepared in situ. An indirect activity determination is achieved by the irradiation of a radiochromic film using {sup 99m}Tc under strictly controlled conditions, and cumulated activity calculation from the initial activity and total irradiation time. The irradiated Gafchromic film and the irradiator, without the source, can then be sent to a National Metrology Institute for evaluation of the results. Conclusions: The methodology described in this paper shows good potential for accurate (3%) radionuclide calibrator intercomparison studies for {sup 99m}Tc between Nuclear Medicine centers without source transfer, and it can easily be adapted to other short half-life radionuclides.
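
    A small worked sketch of the two quantities the method relies on: the net optical density of the scanned film and the cumulated activity of a decaying {sup 99m}Tc source over the irradiation time. The formulas are the standard radiochromic-film and decay relations; the numerical inputs are placeholders, not data from the study:

      # netOD from scanner pixel values and cumulated activity of a decaying source.
      import math

      def net_od(pv_unexposed, pv_exposed):
          # Standard radiochromic-film definition: log10 ratio of transmitted signal.
          return math.log10(pv_unexposed / pv_exposed)

      def cumulated_activity(a0_mbq, t_irr_h, half_life_h=6.01):  # 99mTc half-life
          lam = math.log(2.0) / half_life_h
          # Integral of A0*exp(-lambda*t) from 0 to t_irr, in MBq*h.
          return a0_mbq * (1.0 - math.exp(-lam * t_irr_h)) / lam

      print(net_od(52000, 31000))            # illustrative pixel values
      print(cumulated_activity(500.0, 2.0))  # 500 MBq irradiating for 2 h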

  19. RADIOACTIVE BATTERY

    DOEpatents

    Birden, J.H.; Jordan, K.C.

    1959-11-17

    A radioactive battery which includes a capsule containing the active material and a thermopile associated therewith is presented. The capsule is both a shield to stop the radiations and thereby make the battery safe to use, and an energy converter. The intense radioactive decay taking place inside is converted to useful heat at the capsule surface. The heat is conducted to the hot thermojunctions of a thermopile. The cold junctions of the thermopile are thermally insulated from the heat source, so that a temperature difference occurs between the hot and cold junctions, causing an electrical current of a constant magnitude to flow.
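
    For orientation, a back-of-the-envelope sketch of the conversion chain the patent describes: decay heat at the capsule surface drives a temperature difference across a thermopile, whose open-circuit voltage follows the Seebeck relation. All numbers below are illustrative, not taken from the patent:

      # Decay heat and thermopile output, order-of-magnitude only.
      def decay_heat_watts(activity_ci, energy_mev_per_decay):
          decays_per_s = activity_ci * 3.7e10            # 1 Ci = 3.7e10 Bq
          return decays_per_s * energy_mev_per_decay * 1.602e-13  # MeV -> J

      def thermopile_voltage(n_junctions, seebeck_v_per_k, delta_t_k):
          # Series thermopile: voltage scales with junction count and delta T.
          return n_junctions * seebeck_v_per_k * delta_t_k

      p = decay_heat_watts(100.0, 0.1)           # hypothetical source
      v = thermopile_voltage(50, 40e-6, 30.0)    # 50 junctions, 40 uV/K, 30 K
      print(p, v)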

  20. Sticks AND Carrots: Encouraging Open Science at its source

    PubMed Central

    Leonelli, Sabina; Spichtinger, Daniel; Prainsack, Barbara

    2015-01-01

    The Open Science (OS) movement has been seen as an important facilitator for public participation in science. This has been underpinned by the assumption that widespread and free access to research outputs leads to (i) better and more efficient science, (ii) economic growth, in particular for small and medium-sized enterprises wishing to capitalise on research findings, and (iii) increased transparency of knowledge production and its outcomes. The latter in particular could function as a catalyst for public participation and engagement. Whether OS is likely to help realise these benefits, however, will depend on the emergence of systemic incentives for scientists to utilise OS in a meaningful manner. While some areas, such as the environmental sciences, have a long tradition of an open ethos, citizen inclusion and global collaboration, such activities need to be more systematically supported and promoted by funders and learned societies in order to improve scientific research and public participation. PMID:26435842

  1. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing the cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. PMID:26763293
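
    As a sketch of how such a motion platform might be scripted for the "shaking and mixing" use case, the snippet below emits RepRap-style G-code for a back-and-forth path. The assumption that the platform is G-code driven follows from its 3-D-printer heritage and is not stated in the abstract:

      # Generate a simple shake cycle as G-code lines (assumed RepRap-style dialect).
      def shake_gcode(amplitude_mm=10.0, cycles=20, feed_mm_per_min=1200):
          lines = ["G21 ; millimetres", "G90 ; absolute positioning", "G28 X Y ; home"]
          for _ in range(cycles):
              lines.append(f"G1 X{amplitude_mm:.1f} F{feed_mm_per_min}")
              lines.append(f"G1 X0.0 F{feed_mm_per_min}")
          return "\n".join(lines)

      with open("shake.gcode", "w") as f:
          f.write(shake_gcode())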

  2. Human terrain exploitation suite: applying visual analytics to open source information.

    NASA Astrophysics Data System (ADS)

    Hanratty, Timothy; Richardson, John; Mittrick, Mark; Dumer, John; Heilman, Eric; Roy, Heather; Kase, Sue

    2014-05-01

    This paper presents the concept development and demonstration of the Human Terrain Exploitation Suite (HTES) under development at the U.S. Army Research Laboratory's Tactical Information Fusion Branch. The HTES is an amalgamation of four complementary visual analytic capabilities that target the exploitation of open source information. Open source information, specifically news feeds, blogs and other social media, provides a unique opportunity to collect and examine salient topics and trends. Analysis of open source information provides valuable insights into determining opinions, values, cultural nuances and other socio-political aspects within a military area of interest. The early results of the HTES field study indicate that the tools greatly increased the analysts' ability to exploit open source information, but improvement through greater cross-tool integration and correlation of their results is necessary for further advances.

  3. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing the cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further.

  4. Ontology-Based Data Integration of Open Source Electronic Medical Record and Data Capture Systems

    ERIC Educational Resources Information Center

    Guidry, Alicia F.

    2013-01-01

    In low-resource settings, the prioritization of clinical care funding is often determined by immediate health priorities. As a result, investment directed towards the development of standards for clinical data representation and exchange are rare and accordingly, data management systems are often redundant. Open-source systems such as OpenMRS and…

  5. Development and evaluation of a lightweight sensor system for emission sampling from open area sources

    EPA Science Inventory

    A new sensor system for mobile and aerial emission sampling was developed for open area sources, such as open burning. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, and black carbon, samplers for particulate matter with ...

  6. Limitations of Phased Array Beamforming in Open Rotor Noise Source Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Csaba; Envia, Edmane; Podboy, Gary G.

    2013-01-01

    Phased array beamforming results of the F31/A31 historical baseline counter-rotating open rotor blade set were investigated for measurement data taken on the NASA Counter-Rotating Open Rotor Propulsion Rig in the 9- by 15-Foot Low-Speed Wind Tunnel of NASA Glenn Research Center as well as data produced using the LINPROP open rotor tone noise code. The planar microphone array was positioned broadside and parallel to the axis of the open rotor, roughly 2.3 rotor diameters away. The results provide insight as to why the apparent noise sources of the blade passing frequency tones and interaction tones appear at their nominal Mach radii instead of at the actual noise sources, even if those locations are not on the blades. Contour maps corresponding to the sound fields produced by the radiating sound waves, taken from the simulations, are used to illustrate how the interaction patterns of circumferential spinning modes of rotating coherent noise sources interact with the phased array, often giving misleading results, as the apparent sources do not always show where the actual noise sources are located. This suggests that a more sophisticated source model would be required to accurately locate the sources of each tone. The results of this study also have implications with regard to the shielding of open rotor sources by airframe empennages.
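
    To make the beamforming step concrete, here is a minimal single-frequency delay-and-sum sketch: complex microphone pressures are back-propagated to a grid of candidate source points and summed. The array geometry, frequency and data are synthetic placeholders; this is the textbook algorithm, not the processing actually applied to the rig data:

      # Conventional (delay-and-sum) beamforming at one frequency.
      import numpy as np

      c = 343.0                      # speed of sound, m/s
      f = 2000.0                     # analysis frequency, Hz
      k = 2 * np.pi * f / c
      mics = np.random.uniform(-1, 1, size=(48, 3))       # planar array in z = 0
      mics[:, 2] = 0.0
      p = np.random.randn(48) + 1j * np.random.randn(48)  # measured complex pressures

      # Scan grid on a plane 2.3 m from the array.
      xs = np.linspace(-2, 2, 81)
      ys = np.linspace(-2, 2, 81)
      bmap = np.zeros((ys.size, xs.size))
      for iy, y in enumerate(ys):
          for ix, x in enumerate(xs):
              r = np.linalg.norm(mics - np.array([x, y, 2.3]), axis=1)
              steer = np.exp(1j * k * r) * r   # undo propagation phase and spreading
              bmap[iy, ix] = np.abs(np.vdot(steer, p))**2 / np.vdot(steer, steer).real

      print(np.unravel_index(bmap.argmax(), bmap.shape))   # apparent source location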

  7. Edaq530: A Transparent, Open-End and Open-Source Measurement Solution in Natural Science Education

    ERIC Educational Resources Information Center

    Kopasz, Katalin; Makra, Peter; Gingl, Zoltan

    2011-01-01

    We present Edaq530, a low-cost, compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open-source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In…

  8. Source term estimation and the isotopic ratio of radioactive material released from the WIPP repository in New Mexico, USA.

    PubMed

    Thakur, P

    2016-01-01

    After almost 15 years of operations, the Waste Isolation Pilot Plant (WIPP) had one of its waste drums breach underground as a result of a runaway chemical reaction in the waste it contained. This incident occurred on February 14, 2014. Moderate levels of radioactivity were released into the underground air. A small portion of the contaminated underground air also escaped to the surface through the ventilation system and was detected approximately 1 km away from the facility. According to the source term estimation, the actual amount of radioactivity released from the WIPP site was less than 1.5 mCi. The highest activity detected on the surface was 115.2 μBq/m(3) for (241)Am and 10.2 μBq/m(3) for (239+240)Pu at a sampling station located 91 m away from the underground air exhaust point, and 81.4 μBq/m(3) of (241)Am and 5.8 μBq/m(3) of (239+240)Pu at a monitoring station located approximately 1 km northwest of the WIPP facility. The dominant radionuclides released were americium and plutonium, in a ratio that matches the content of the breached drum. Air monitoring across the WIPP site intensified following the first reports of radiation detection underground to determine the extent of impact to WIPP personnel, the public, and the environment. In this paper, the early-stage monitoring data collected by an independent monitoring program conducted by the Carlsbad Environmental Monitoring & Research Center (CEMRC) and by an oversight monitoring program conducted by WIPP's management and operating contractor, the Nuclear Waste Partnership (NWP) LLC, were utilized to estimate the actual amount of radioactivity released from the WIPP underground. The Am and Pu isotope ratios were measured and used to support the hypothesis that the release came from one drum identified as having breached that represents a specific waste stream with this radionuclide ratio in its inventory. This failed drum underwent a heat and gas producing reaction that overpowered its vent and
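
    A quick arithmetic check of the Am/Pu activity ratios quoted above at the two sampling locations (the drum-inventory ratio itself is not given in the abstract, so only the airborne ratios are computed here):

      # Am-241 / Pu-239+240 activity ratios from the reported concentrations (uBq/m3).
      on_site = 115.2 / 10.2    # station 91 m from the exhaust point -> ~11.3
      off_site = 81.4 / 5.8     # station ~1 km northwest of the facility -> ~14.0
      print(round(on_site, 1), round(off_site, 1))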

  9. Source term estimation and the isotopic ratio of radioactive material released from the WIPP repository in New Mexico, USA.

    PubMed

    Thakur, P

    2016-01-01

    After almost 15 years of operations, the Waste Isolation Pilot Plant (WIPP) had one of its waste drums breach underground as a result of a runaway chemical reaction in the waste it contained. This incident occurred on February 14, 2014. Moderate levels of radioactivity were released into the underground air. A small portion of the contaminated underground air also escaped to the surface through the ventilation system and was detected approximately 1 km away from the facility. According to the source term estimation, the actual amount of radioactivity released from the WIPP site was less than 1.5 mCi. The highest activity detected on the surface was 115.2 μBq/m(3) for (241)Am and 10.2 μBq/m(3) for (239+240)Pu at a sampling station located 91 m away from the underground air exhaust point, and 81.4 μBq/m(3) of (241)Am and 5.8 μBq/m(3) of (239+240)Pu at a monitoring station located approximately 1 km northwest of the WIPP facility. The dominant radionuclides released were americium and plutonium, in a ratio that matches the content of the breached drum. Air monitoring across the WIPP site intensified following the first reports of radiation detection underground to determine the extent of impact to WIPP personnel, the public, and the environment. In this paper, the early-stage monitoring data collected by an independent monitoring program conducted by the Carlsbad Environmental Monitoring & Research Center (CEMRC) and by an oversight monitoring program conducted by WIPP's management and operating contractor, the Nuclear Waste Partnership (NWP) LLC, were utilized to estimate the actual amount of radioactivity released from the WIPP underground. The Am and Pu isotope ratios were measured and used to support the hypothesis that the release came from one drum identified as having breached that represents a specific waste stream with this radionuclide ratio in its inventory. This failed drum underwent a heat and gas producing reaction that overpowered its vent and

  10. Analytical source term optimization for radioactive releases with approximate knowledge of nuclide ratios

    NASA Astrophysics Data System (ADS)

    Hofman, Radek; Seibert, Petra; Kovalets, Ivan; Andronopoulos, Spyros

    2015-04-01

    We are concerned with source term retrieval in the case of an accident in a nuclear power plant with off-site consequences. The goal is to optimize atmospheric dispersion model inputs using inverse modeling of gamma dose rate measurements (instantaneous or time-integrated). These are the most abundant type of measurements provided by various radiation monitoring networks across Europe and are available continuously in near-real time. Usually, the source term of an accidental release comprises a mixture of nuclides. Unfortunately, gamma dose rate measurements do not provide direct information on the source term composition; however, physical properties of the respective nuclides (deposition properties, decay half-life) can yield some insight. In the method presented, we assume that nuclide ratios are known at least approximately, e.g. from nuclide-specific observations or from the reactor inventory and assumptions on the accident type. The source term can be in multiple phases, each being characterized by constant nuclide ratios. The method is an extension of a well-established source term inversion approach based on the optimization of an objective function (minimization of a cost function). This function has two quadratic terms: the mismatch between model and measurements, weighted by an observation error covariance matrix, and the deviation of the solution from a first guess, weighted by the first-guess error covariance matrix. For simplicity, both error covariance matrices are approximated as diagonal. Analytical minimization of the cost function leads to a linear system of equations. Possible negative parts of the solution are iteratively removed by means of first-guess error variance reduction. Nuclide ratios enter the problem in the form of additional linear equations, where the deviations from prescribed ratios are weighted by factors; the corresponding error variance allows us to control how strongly we want to impose the prescribed ratios. This introduces some freedom into the
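
    A compact numerical sketch of the minimization described above: the cost function has a measurement-mismatch term and a first-guess deviation term, both with diagonal error covariances, and its analytical minimum solves a linear system. The matrices below are random placeholders, and the handling of negative components and of the nuclide-ratio constraints is omitted:

      # Analytical minimum of J(x) = (y-Hx)^T R^-1 (y-Hx) + (x-xb)^T B^-1 (x-xb).
      import numpy as np

      rng = np.random.default_rng(0)
      n_obs, n_src = 40, 12
      H = rng.random((n_obs, n_src))          # source-receptor sensitivity matrix
      y = rng.random(n_obs)                   # gamma dose rate observations
      xb = np.full(n_src, 0.1)                # first guess of the release rates
      R_inv = np.diag(np.full(n_obs, 1.0 / 0.05**2))   # diagonal obs-error covariance
      B_inv = np.diag(np.full(n_src, 1.0 / 1.0**2))    # diagonal first-guess covariance

      A = H.T @ R_inv @ H + B_inv
      b = H.T @ R_inv @ y + B_inv @ xb
      x_opt = np.linalg.solve(A, b)           # analytical minimizer of the cost function
      print(x_opt)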

  11. Open-Source web-based Geographical Information System for health exposure assessment.

    PubMed

    Evans, Barry; Sabel, Clive E

    2012-01-01

    This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that, using solely open source software, it was possible to build a customisable web-based GIS application that provides the functions necessary to convey health and environmental data to experts and non-experts alike, without requiring proprietary software.

  12. Anatomy of BioJS, an open source community for the life sciences

    PubMed Central

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-01-01

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects. DOI: http://dx.doi.org/10.7554/eLife.07009.001 PMID:26153621

  13. Anatomy of BioJS, an open source community for the life sciences.

    PubMed

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-01-01

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects. PMID:26153621

  14. Anatomy of BioJS, an open source community for the life sciences.

    PubMed

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  15. Burnup estimation of fuel sourcing radioactive material based on monitored Cs and Pu isotopic activity ratios in Fukushima N. P. S. accident

    SciTech Connect

    Yamamoto, T.; Suzuki, M.; Ando, Y.

    2012-07-01

    After the severe core damage at the Fukushima Dai-Ichi Nuclear Power Station, radioactive material leaked from the reactor buildings. As part of the monitoring of radioactivity on the site, measurements of radioactivity in soils at three fixed points have been performed for {sup 134}Cs and {sup 137}Cs with gamma-ray spectrometry and for {sup 238}Pu, {sup 239}Pu, and {sup 240}Pu with {alpha}-ray spectrometry. Correlations of the radioactivity ratios of {sup 134}Cs to {sup 137}Cs, and of {sup 238}Pu to the sum of {sup 239}Pu and {sup 240}Pu, with fuel burnup were studied by using theoretical burnup calculations and measurements of isotopic inventories, and compared with the Cs and Pu radioactivity ratios in the soils. The comparison indicated that the burnup of the fuel sourcing the radioactivity was from 18 to 38 GWd/t, which corresponded to the fuel with the highest power and, therefore, the highest decay heat in the operating high-burnup-fueled BWR cores. (authors)
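
    A minimal sketch of the final inference step: given a burnup-versus-ratio curve from theoretical burnup calculations, the measured {sup 134}Cs/{sup 137}Cs activity ratio is mapped to a burnup estimate by interpolation. The tabulated values below are invented placeholders, not the calculations used in the paper:

      # Interpolate fuel burnup from a 134Cs/137Cs activity ratio.
      import numpy as np

      burnup_gwd_t = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # placeholder grid
      cs_ratio     = np.array([0.4, 0.7, 1.0, 1.2, 1.4])        # placeholder, rises with burnup
      measured = 0.9
      print(np.interp(measured, cs_ratio, burnup_gwd_t))         # ~26.7 GWd/t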

  16. A Comparison of Simple Algorithms for Gamma-ray Spectrometers in Radioactive Source Search Applications

    SciTech Connect

    Jarman, Kenneth D.; Runkle, Robert C.; Anderson, Kevin K.; Pfund, David M.

    2008-03-01

    Large variation in time-dependent ambient gamma-ray radiation challenges the search for radiation sources. A common strategy to reduce the effects of background variation is to raise detection thresholds, but at the price of reduced detection sensitivity. We present simple algorithms that both reduce background variation and maintain trip-wire detection sensitivity with gamma-ray spectrometry. The best-performing algorithms focus on the spectral shape over several energy bins using Spectral Comparison Ratios and dynamically predict background with the Kalman Filter.
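
    One simple variant of the spectral-comparison-ratio idea is sketched below: counts in several energy bins are expressed as ratios to a reference bin, and the departure of the current spectrum's ratios from background-derived ratios is summed into a single test statistic. The bins, variance model and numbers are illustrative, not those of the cited algorithms:

      # Spectral comparison ratios against a background estimate.
      import numpy as np

      def scr_statistic(counts, bg_counts, ref_bin=0):
          counts = np.asarray(counts, dtype=float)
          bg = np.asarray(bg_counts, dtype=float)
          r_now = counts / counts[ref_bin]
          r_bg = bg / bg[ref_bin]
          # Crude per-ratio variance from Poisson counting statistics.
          var = r_bg**2 * (1.0 / np.maximum(counts, 1) + 1.0 / np.maximum(bg, 1))
          return np.sum((r_now - r_bg) ** 2 / var)

      background = [5000, 1200, 800, 300, 150]
      spectrum   = [5100, 1250, 1500, 310, 160]    # excess in bin 2
      print(scr_statistic(spectrum, background))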

  17. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  18. Open source information on the U.S. infrastructure

    NASA Astrophysics Data System (ADS)

    Freiwald, David A.

    1995-05-01

    Terrorism is expected to increase on a global scale, with the US also becoming more of a target. Since there has not been a war in the lower 48 states of the continental US since about the turn of the century, the US has been quite open and lax about publishing information on its infrastructure, namely details on the locations of power lines, gas and oil pipelines, etc. -- information not publicly available in Europe. Examples are given, along with comments on the potential implications. Finally, brief remarks are given on some ways to address the situation.

  19. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    PubMed

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources compared to analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility under an airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
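
    A stripped-down sketch of detection from time-interval data: for a Poisson process, the times between counts are exponential, so a sequential log-likelihood ratio between a background rate and an elevated rate can be updated with every new interval. This is a generic SPRT/CUSUM-style illustration, not the specific Bayesian formulation of the paper, and the rates and threshold are placeholders:

      # Sequential likelihood-ratio test on inter-arrival times (exponential model).
      import math, random

      def llr_increment(dt, rate_bg, rate_src):
          # log[ f(dt; rate_src) / f(dt; rate_bg) ] for exponential densities.
          return math.log(rate_src / rate_bg) - (rate_src - rate_bg) * dt

      rate_bg, rate_src, threshold = 5.0, 8.0, 4.6   # counts/s; threshold ~ ln(100)
      llr = 0.0
      random.seed(1)
      for _ in range(2000):
          dt = random.expovariate(rate_src)          # simulate an elevated count rate
          llr = max(0.0, llr + llr_increment(dt, rate_bg, rate_src))  # CUSUM-like reset
          if llr > threshold:
              print("alarm")
              break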

  20. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    PubMed

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources compared to analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility under an airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity. PMID:25627949

  1. Open source drug discovery--a new paradigm of collaborative research in tuberculosis drug development.

    PubMed

    Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K

    2011-09-01

    It is being realized that the traditional closed-door and market-driven approaches to drug discovery may not be the best-suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternate paradigm of the drug discovery process. The current model, constrained by limitations on collaboration and on the sharing of resources with confidentiality, hampers the opportunities for bringing in expertise from diverse fields. These limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project, initiated by the Council of Scientific and Industrial Research, India, has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, taking up multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives towards the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner by the participation of multiple companies, with majority funding from Open Source Drug Discovery. This will ensure the availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what LINUX and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery.

  2. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist during various stages of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and should improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  3. Solar neutrino experiments and a test for neutrino oscillations with radioactive sources

    SciTech Connect

    Cleveland, B.T.; Davis, R. Jr.; Rowley, J.K.

    1980-01-01

    The results of the Brookhaven solar neutrino experiment are given and compared to the most recent standard solar model calculations. The observations are about a factor of 4 below theoretical expectations. In view of the uncertainties involved in the theoretical models of the sun, the discrepancy is not considered to be evidence for neutrino oscillations. The status of the development of a gallium solar neutrino detector is described. Radiochemical neutrino detectors can be used to search for {nu}{sub e} oscillations by using megacurie sources of monoenergetic neutrinos like {sup 65}Zn. A quantitative evaluation of possible experiments using the Brookhaven chlorine solar neutrino detector and a gallium detector is given. 6 figures, 3 tables.

  4. 2E1 Ar(17+) decay and conventional radioactive sources to determine efficiency of semiconductor detectors.

    PubMed

    Lamour, Emily; Prigent, Christophe; Eberhardt, Benjamin; Rozet, Jean Pierre; Vernhet, Dominique

    2009-02-01

    Although reliable models may predict the detection efficiency of semiconductor detectors, measurements are needed to check the parameters supplied by the manufacturers, namely, the thicknesses of dead layer, beryllium window, and crystal active area. The efficiency of three silicon detectors has been precisely investigated in their entire photon energy range of detection. In the zero to a few keV range, we developed a new method based on the detection of the 2E1 decay of the metastable Ar(17+) 2s-->1s transition. Very good theoretical knowledge of the energetic distribution of the 2E1 decay mode enables precise characterization of the absorbing layers in front of the detectors. In the high-energy range (>10 keV), the detector crystal thickness plays a major role in the detection efficiency and has been determined using a (241)Am source.
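
    The standard thin-window efficiency model that such measurements constrain can be written as transmission through the window and dead layer times absorption in the active crystal; a schematic version is below. The attenuation coefficients and thicknesses are placeholders, and in practice the energy-dependent coefficients come from tabulations such as XCOM:

      # Schematic full-energy efficiency of a windowed semiconductor detector.
      import math

      def efficiency(mu_be, t_be, mu_si, t_dead, t_active):
          # mu in 1/cm, thicknesses in cm; intrinsic efficiency, geometry factor omitted.
          transmission = math.exp(-mu_be * t_be - mu_si * t_dead)
          absorption = 1.0 - math.exp(-mu_si * t_active)
          return transmission * absorption

      # Placeholder values for a low-energy photon.
      print(efficiency(mu_be=5.0, t_be=25e-4, mu_si=30.0, t_dead=0.1e-4, t_active=0.3))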

  5. [Efficiencies of contamination source for flooring and some materials used in unencapsulated radioactivity handling facilities].

    PubMed

    Yoshida, M; Yoshizawa, M; Minami, K

    1990-09-01

    The efficiencies of a contamination source, as defined in ISO Report 7506-1, were experimentally determined for materials such as flooring, polyethylene, smear-tested filter paper and stainless steel plate. Five nuclides (147Pm, 60Co, 137Cs, 204Tl and 90Sr-Y) were used to study the beta-ray energy dependence of the efficiency, and 241Am was used as an alpha-ray emitter. The charge-up effect in the measurement by a window-less 2 pi proportional counter was evaluated to obtain reliable surface emission rates. The measured efficiencies for non-permeable materials, except for two cases, are more than 0.5 even for 147Pm. The ISO recommendations were shown to be conservative enough on the basis of the present results.

  6. Asymptotic Giant Branch stars as a source of short-lived radioactive nuclei in the solar nebula

    NASA Technical Reports Server (NTRS)

    Wasserburg, G. J.; Busso, M.; Gallino, R.; Raiteri, C. M.

    1994-01-01

    We carried out a theoretical evaluation of the contribution of Asymptotic Giant Branch (AGB) stars to some short-lived (10(exp 6) less than or equal to Tau-bar less than or equal to 2 x 10(exp 7) yr) isotopes in the Interstellar Medium (ISM) and in the early solar system using stellar model calculations for thermally pulsing evolutionary phases of low-mass stars. The yields of s-process nuclei in the convective He-shell for different neutron exposures tau(sub 0) were obtained, and AGB stars were shown to produce several radioactive nuclei (especially Pd-107, Pb-205, Fe-60, Zr-93, Tc-99, Cs-135, and Hf-182) in different amounts. Assuming either contamination of the solar nebula from a single AGB star or models for continuous injection and mixing from many stars into the ISM, we calculate the ratios of radioactive to stable nuclei at the epoch of the Sun's formation. The dilution factor between the AGB ejecta and the early solar system matter is obtained by matching the observed Pd-107/Pd-108 and depends on the value of tau(sub 0). It is found that small masses M(sub He) of He-shell material (10(exp -4)-10(exp -7) solar mass) enriched in s-process nuclei are sufficient to contaminate 1 solar mass of the ISM to produce the Pd-107 found in the early solar system. Predictions are made for all of the other radioactive isotopes. The optimal model to explain several observed radioactive species at different states of the proto-solar nebula involves a single AGB star with a low neutron exposure (tau(sub 0) = 0.03 mbarn(sup -1)) which contaminated the cloud with a dilution factor of M(sub He)/solar mass approximately 1.5 x 10(exp -4). This will also contribute newly synthesized stable s-process nuclei in the amount of approximately 10(exp -4) of their abundances already present in the proto-solar cloud. Variations in the degree of homogenization (approximately 30%) of the injected material may account for some of the small general isotopic anomalies found in meteorites. It is

  7. Investigation of a corrugated channel flow with an open source PIV software

    NASA Astrophysics Data System (ADS)

    Sivas, Deniz; Bahadır Olcay, A.; Ahn, Hojin

    2016-03-01

    In this study, the corrugated channel flow was investigated using open-source particle image velocimetry (PIV) software. The open-source software, called OpenPIV, was first verified using images from an earlier experimental study of vortex ring formation. The corrugated channel flow images were taken with a 200 W LED light source and a high-speed camera, and those images were analysed with the spatial and temporal tools of OpenPIV. Laminar, transitional and turbulent flow regimes were identified when the Reynolds number was below 1100, between 1100 and 2000, and higher than 2000, respectively. The velocity vectors were found to be about 20% lower than in the previous study. The flow inside the grooves was also investigated with OpenPIV, and flow characteristics in the grooves were captured when the interrogation window size was reduced. The visualization of the flow was presented for different Reynolds numbers with relative scale values. As a result of this study, OpenPIV was found to be a promising open-source tool for PIV analysis.
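
    The core PIV operation performed on each interrogation window pair is a cross-correlation whose peak location gives the particle displacement; a bare-bones FFT-based version is sketched below. This is the generic algorithm, not OpenPIV's internal implementation, and the scaling values are placeholders:

      # Displacement of one interrogation window pair via FFT cross-correlation.
      import numpy as np

      def window_displacement(win_a, win_b):
          a = win_a - win_a.mean()
          b = win_b - win_b.mean()
          corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
          corr = np.fft.fftshift(corr)
          iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
          return iy - a.shape[0] // 2, ix - a.shape[1] // 2   # pixel shift (dy, dx)

      rng = np.random.default_rng(0)
      frame_a = rng.random((32, 32))
      frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))  # known shift
      dy, dx = window_displacement(frame_a, frame_b)
      scale_m_per_px, dt_s = 1e-4, 1e-3
      print(dy, dx, dx * scale_m_per_px / dt_s)               # -> 3, -2, u-velocity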

  8. 10 CFR 39.43 - Inspection, maintenance, and opening of a source or source holder.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... holder. 39.43 Section 39.43 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY... holder. (a) Each licensee shall visually check source holders, logging tools, and source handling tools... holders, logging tools, injection tools, source handling tools, storage containers, transport...

  9. Utilization of open source electronic health record around the world: A systematic review

    PubMed Central

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

    Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries all over the world. Using free text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in several stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve the national healthcare level, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high cost and inflexibility associated with proprietary health information systems. PMID:24672566

  10. GIS-Based Noise Simulation Open Source Software: N-GNOIS

    NASA Astrophysics Data System (ADS)

    Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh

    2015-12-01

    Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate the noise scenario due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These have been addressed in the software through attenuation modules for the atmosphere, vegetation and barriers. N-GNOIS is a user friendly, platform-independent and Open Geospatial Consortium (OGC) compliant software. It has been developed using open source technology (QGIS) and an open source language (Python). N-GNOIS has unique features like the cumulative impact of point and mobile sources, building structures and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on any type of road. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and management of the noise component in environmental impact assessment (EIA) studies.
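
    A toy version of the kind of point-source propagation calculation such a tool chains together: sound power level minus geometric divergence, minus simple atmospheric, vegetation and barrier attenuation terms. The coefficients are illustrative placeholders, not the attenuation modules implemented in N-GNOIS:

      # Receiver sound pressure level from a point source with simple attenuations.
      import math

      def noise_level_db(lw_db, distance_m, alpha_atm_db_per_m=0.005,
                         a_vegetation_db=0.0, a_barrier_db=0.0):
          a_div = 20.0 * math.log10(distance_m) + 11.0     # spherical spreading
          a_atm = alpha_atm_db_per_m * distance_m
          return lw_db - a_div - a_atm - a_vegetation_db - a_barrier_db

      print(noise_level_db(100.0, 50.0))                     # open field
      print(noise_level_db(100.0, 50.0, a_barrier_db=10.0))  # behind a barrier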

  11. AGB stars as a source of short-lived radioactive nuclei in the solar nebula

    NASA Technical Reports Server (NTRS)

    Wasserburg, G. J.; Gallino, R.; Busso, M.; Raiteri, C. M.

    1993-01-01

    The purpose is to estimate the possible contribution of some short-lived nuclei to the early solar nebula from asymptotic giant branch (AGB) sources. Low mass (1 to 3 solar masses) AGB stars appear to provide a site for synthesis of the main s-process component for solar system material, with an exponential distribution of neutron irradiations varying as exp(-tau/tau(sub 0)) (where tau is the time integrated neutron flux with a mean neutron exposure tau(sub 0)) for solar abundances with tau(sub 0) = 0.28 mb(sup -1). Previous workers estimated the synthesis of key short-lived nuclei which might be produced in AGB stars. While these calculations exhibit the basic characteristics of nuclei production by neutron exposure, there is a need for a self-consistent calculation that follows AGB evolution and takes into account the net production from a star and dilution with the cloud medium. Many of the general approaches and the conclusions arrived at were presented earlier by Cameron. The production of nuclei for a star of 1.5 solar masses during the thermal pulsing of the AGB phase was evaluated. Calculations were done for a series of thermal pulses with tau(sub 0) = 0.12 and 0.28 mb(sup -1). These pulses involve s-process nucleosynthesis in the burning shell at the base of the He zone followed by the ignition of the H burning shell at the top of the He zone. After about 10-15 cycles the abundances of the various nuclei in the He zone become constant. Computations of the abundances of all nuclei in the He zone were made following Gallino. The mass of the solar nebula was considered to consist of some initial material of approximately solar composition plus some contributions from AGB stars. The ratios of the masses required from the AGB He burning zone to the ISM necessary to produce the observed value of Pd-107/Pd-108 in the early solar system were calculated, and this dilution factor was applied to all other relevant nuclei.

  12. An open-source, mobile-friendly search engine for public medical knowledge.

    PubMed

    Samwald, Matthias; Hanbury, Allan

    2014-01-01

    The World Wide Web has become an important source of information for medical practitioners. To complement the capabilities of currently available web search engines we developed FindMeEvidence, an open-source, mobile-friendly medical search engine. In a preliminary evaluation, the quality of results from FindMeEvidence proved to be competitive with those from TRIP Database, an established, closed-source search engine for evidence-based medicine.

  13. Building an open-source robotic stereotaxic instrument.

    PubMed

    Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O

    2013-10-29

    This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp edge craniotomies, skull thinning, and lowering electrodes or cannula. In order to expedite the writing of g-coding for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and G-Coding (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/Step, geared to 0.346°/Step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
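
    Using the resolution and speed limits quoted above (2.88 μm per step, 500 μm/s maximum cutting speed), a small sketch of how a target displacement could be turned into step counts and a G-code move; the axis naming and feed-rate handling are assumptions, not the authors' scripts:

      # Convert a stereotaxic move in micrometres into steps and a G-code line.
      UM_PER_STEP = 2.88          # resolution quoted in the protocol
      MAX_CUT_UM_PER_S = 500.0    # recommended maximum cutting speed

      def move_command(axis, distance_um, speed_um_per_s):
          speed = min(speed_um_per_s, MAX_CUT_UM_PER_S)
          steps = round(distance_um / UM_PER_STEP)
          feed_mm_per_min = speed * 60.0 / 1000.0
          return steps, f"G1 {axis}{distance_um / 1000.0:.3f} F{feed_mm_per_min:.1f}"

      print(move_command("Z", -2500.0, 400.0))   # lower an electrode 2.5 mm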

  14. Building An Open-source Robotic Stereotaxic Instrument

    PubMed Central

    Coffey, Kevin R.; Barker, David J.; Ma, Sisi; West, Mark O.

    2013-01-01

    This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp edge craniotomies, skull thinning, and lowering electrodes or cannula. In order to expedite the writing of g-coding for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and G-Coding (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/Step, geared to 0.346°/Step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2. PMID:24192514

  15. Building an open-source robotic stereotaxic instrument.

    PubMed

    Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O

    2013-01-01

    This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp edge craniotomies, skull thinning, and lowering electrodes or cannula. In order to expedite the writing of g-coding for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and G-Coding (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/Step, geared to 0.346°/Step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2. PMID:24192514

  16. Design, construction, and use of a shipping case for radioactive sources used in the calibration of portal monitors in the radiation portal monitoring project

    SciTech Connect

    Lepel, Elwood A.; Hensley, Walter K.

    2009-12-01

    Pacific Northwest National Laboratory is working with US Customs and Border Protection to assist in the installation of radiation portal monitors. We need to provide radioactive sources – both gamma- and neutron-emitting – to ports of entry where the monitors are being installed. The monitors must be calibrated to verify proper operation and detection sensitivity. We designed a portable source-shipping case using numerical modeling to predict the neutron dose rate at the case’s surface. The shipping case including radioactive sources meets the DOT requirements for “limited quantity.” Over 300 shipments, domestic and international, were made in FY2008 using this type of shipping case.

  17. Characterization and Source Term Assessments of Radioactive Particles from Marshall Islands Using Non-Destructive Analytical Techniques

    SciTech Connect

    Jernstrom, J; Eriksson, M; Simon, R; Tamborini, G; Bildstein, O; Carlos-Marquez, R; Kehl, S R; Betti, M; Hamilton, T

    2005-06-11

    A considerable fraction of radioactivity entering the environment from different nuclear events is associated with particles. The impact of these events can only be fully assessed where there is some knowledge about the mobility of particle-bound radionuclides entering the environment. The behavior of particulate radionuclides is dependent on several factors, including the physical, chemical and redox state of the environment, the characteristics of the particles (e.g., the chemical composition, crystallinity and particle size) and the oxidative state of radionuclides contained in the particles. Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized using non-destructive analytical and microanalytical methods. By determining the activity of the {sup 239,240}Pu and {sup 241}Am isotopes from their gamma peaks, structural information related to the Pu matrix was obtained, and the source term was revealed. Composition and elemental distribution in the particles were studied with synchrotron radiation based micro X-ray fluorescence (SR-{mu}-XRF) spectrometry. A scanning electron microscope equipped with an energy dispersive X-ray detector (SEM-EDX) and a secondary ion mass spectrometer (SIMS) were used to examine particle surfaces. Based on the elemental composition, the particles were divided into two groups: particles with a plain Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and more heterogeneously distributed. All of the particles were identified as fragments of initial weapons material. Since the particles contain plutonium with a low {sup 240}Pu/{sup 239}Pu atomic ratio, {approx}2-6%, which corresponds to weapons-grade plutonium, the source term was identified as one of the safety tests conducted in the history of Runit Island.
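
    The link between alpha-spectrometric activities and the quoted {sup 240}Pu/{sup 239}Pu atom ratio is the usual activity-to-atom conversion through the half-lives (24,110 y for {sup 239}Pu, 6,561 y for {sup 240}Pu); a short sketch with a placeholder activity ratio:

      # Convert a 240Pu/239Pu activity ratio into an atom ratio.
      T_HALF_PU239_Y = 24110.0
      T_HALF_PU240_Y = 6561.0

      def atom_ratio_240_239(activity_ratio_240_239):
          # A = lambda*N and lambda = ln2/T1/2, so N240/N239 = (A240/A239)*(T240/T239).
          return activity_ratio_240_239 * (T_HALF_PU240_Y / T_HALF_PU239_Y)

      print(atom_ratio_240_239(0.15))   # placeholder activity ratio -> ~0.041 (~4%)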

  18. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased the interest in tools, procedures and methods to handle spatially-related information. Free Open Source Software projects devoted to geospatial data handling are gaining considerable success, as the use of interoperable formats and protocols allows the user to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software scene to the specific problem. In particular, the Free Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way for several institutions to interact. When it comes to planetary sciences, geospatial Free Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are being distributed along with the source code, and the interaction between user and developer is often very close, creating a continuum between these two figures. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free Open Source GIS, open GIS formats and network protocols make it possible to extend existing tools and methods developed for Earth-based problems to the study of solar system bodies. A day in the working life of a researcher using Free Open Source Software for geospatial will be presented, as well as benefits and
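
    As an example of the interoperability mentioned above, GDAL's Python bindings can open a planetary raster (a PDS driver is part of GDAL) and hand the data to the rest of a geospatial pipeline; the file name below is a hypothetical placeholder:

      # Read a planetary raster with GDAL's Python bindings.
      from osgeo import gdal

      ds = gdal.Open("example_pds_product.img")     # hypothetical PDS file
      band = ds.GetRasterBand(1)
      elevation = band.ReadAsArray()                # numpy array of pixel values
      print(ds.RasterXSize, ds.RasterYSize, elevation.mean())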

  19. OpenSourcePACS: an extensible infrastructure for medical image management.

    PubMed

    Bui, Alex A T; Morioka, Craig; Dionisio, John David N; Johnson, David B; Sinha, Usha; Ardekani, Siamak; Taira, Ricky K; Aberle, Denise R; El-Saden, Suzie; Kangarloo, Hooshang

    2007-01-01

    The development of comprehensive picture archive and communication systems (PACS) has mainly been limited to proprietary developments by vendors, though a number of freely available software projects have addressed specific image management tasks. The openSourcePACS project aims to provide an open source, common foundation upon which not only can a basic PACS be readily implemented, but to also support the evolution of new PACS functionality through the development of novel imaging applications and services. openSourcePACS consists of four main software modules: 1) image order entry, which enables the ordering and tracking of structured image requisitions; 2) an agent-based image server framework that coordinates distributed image services including routing, image processing, and querying beyond present Digital Imaging and Communications in Medicine (DICOM) capabilities; 3) an image viewer, supporting standard display and image manipulation tools, DICOM presentation states, and structured reporting; and 4) reporting and result dissemination, supplying web-based widgets for creating integrated reports. All components are implemented using Java to encourage cross-platform deployment. To demonstrate the usage of openSourcePACS, a preliminary application supporting primary care/specialist communication was developed and is described herein. Ultimately, the goal of openSourcePACS is to promote the wide-scale development and usage of PACS and imaging applications within academic and research communities.

  20. GALE: a generic open source extensible adaptation engine

    NASA Astrophysics Data System (ADS)

    De Bra, Paul; Knutov, Evgeny; Smits, David; Stash, Natalia; Ramos, Vinicius F. C.

    2013-06-01

    This paper motivates and describes GALE, the Generic Adaptation Language and Engine that came out of the GRAPPLE EU FP7 project. The main focus of the paper is the extensible nature of GALE. The purpose of this description is to illustrate how a single core adaptation engine can be used for different types of adaptation, applied to different types of information items and documents. We illustrate the adaptive functionality on some examples of hypermedia documents. In April 2012, David Smits defended the world's first adaptive PhD thesis on this topic. The thesis, available for download and direct adaptive access at http://gale.win.tue.nl/thesis, shows that a single source of information can serve different audiences and at the same time also allows more freedom of navigation than is possible in any paper or static hypermedia document. The same can be done for course texts, hyperfiction, encyclopedia, museum, or other cultural heritage websites, etc. We explain how to add functionality to GALE if desired, to adapt the system's behavior to whatever the application requires. This stresses our main objective: to provide a technological base for adaptive (hypermedia) system researchers on which they can build extensions for the specific research they have in mind.