Sample records for utilizing open source

  1. The Efficient Utilization of Open Source Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baty, Samuel R.

    These slides address the efficient utilization of open source information, a vast body of material drawn from many sources. Not only does the quantity of open source information pose a problem, but its quality can also hinder analysis. Two case studies, Iran and North Korea, illustrate how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information carries no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: by examining supporting factors (flow of scientists, products and waste from mines, government budgets, etc.) or direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows a more complete picture to be formed. Overlapping sources allow more precise bounds on times, weights, temperatures, yields, or other quantities of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual; it requires the wide range of skill sets held by a team of people.
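    The point about overlapping sources tightening bounds can be illustrated with a minimal sketch (not from the slides; the intervals are invented): each independent source contributes a (low, high) estimate, and their intersection bounds the quantity of interest.

```python
def intersect_bounds(intervals):
    """Combine independent (low, high) estimates by intersection:
    the true value must lie inside every source's interval."""
    lo = max(i[0] for i in intervals)
    hi = min(i[1] for i in intervals)
    if lo > hi:
        raise ValueError("sources are mutually inconsistent")
    return lo, hi

# Three hypothetical independent open-source estimates of a quantity of interest
sources = [(0.5, 6.0), (2.0, 10.0), (1.0, 4.5)]
print(intersect_bounds(sources))  # (2.0, 4.5)
```

    A source whose interval fails to overlap the others signals an inconsistency worth independent verification, which is the slides' central point.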

  2. New Open-Source Version of FLORIS Released | News | NREL

    Science.gov Websites

    New Open-Source Version of FLORIS Released (January 26, 2018). National Renewable Energy Laboratory (NREL) researchers recently released an updated open-source version of FLORIS that has been simplified and documented. Because of the living, open-source nature of the newly updated utility, NREL ...

  3. Open-Source RTOS Space Qualification: An RTEMS Case Study

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2017-01-01

    NASA space-qualification of reusable off-the-shelf real-time operating systems (RTOSs) remains elusive due to several factors, notably: (1) the diverse nature of the RTOSs utilized across NASA; (2) the absence of a single NASA space-qualification criterion, along with a lack of verification and validation (V&V) analysis and test beds; and (3) differing RTOS heritages, specifically open-source RTOSs versus closed, vendor-provided RTOSs. As a leader in simulation test beds, the NASA IV&V Program is poised to help jump-start and lead the space-qualification effort for the open-source Real-Time Executive for Multiprocessor Systems (RTEMS) RTOS. RTEMS, as a case study, can serve as an example of how to qualify all RTOSs, particularly the reusable, non-commercial (open-source) ones that are gaining usage and popularity across NASA. Qualification will improve the overall safety and mission assurance of RTOSs for agency-wide usage. NASA's involvement in space-qualification of an open-source RTOS such as RTEMS will drive the RTOS industry toward a more qualified and mature open-source RTOS product.

  4. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  5. Utilization of open source electronic health record around the world: A systematic review.

    PubMed

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

    Many projects to develop Electronic Health Record (EHR) systems have been carried out in many countries. The current study reviews the published data on the utilization of open source EHR systems in countries around the world. Using free-text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in a series of stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. They create opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology offers a solution to the high costs and inflexibility associated with proprietary health information systems.

  6. Open-path spectroscopic methane detection using a broadband monolithic distributed feedback-quantum cascade laser array.

    PubMed

    Michel, Anna P M; Kapit, Jason; Witinski, Mark F; Blanchard, Romain

    2017-04-10

    Methane is a powerful greenhouse gas that has both natural and anthropogenic sources. The ability to measure methane using an integrated path length approach such as an open/long-path length sensor would be beneficial in several environments for examining anthropogenic and natural sources, including tundra landscapes, rivers, lakes, landfills, estuaries, fracking sites, pipelines, and agricultural sites. Here a broadband monolithic distributed feedback-quantum cascade laser array was utilized as the source for an open-path methane sensor. Two telescopes were utilized for the launch (laser source) and receiver (detector) in a bistatic configuration for methane sensing across a 50 m path length. Direct-absorption spectroscopy was utilized with intrapulse tuning. Ambient methane levels were detectable, and an instrument precision of 70 ppb with 100 s averaging and 90 ppb with 10 s averaging was achieved. The sensor system was designed to work "off the grid" and utilizes batteries that are rechargeable with solar panels and wind turbines.
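    The reported precision figures (90 ppb at 10 s averaging vs. 70 ppb at 100 s) reflect the general principle that averaging suppresses uncorrelated noise. A toy illustration with synthetic white noise, not the instrument's data:

```python
import random
import statistics

random.seed(0)
# Synthetic 1 Hz methane readings: ~1900 ppb ambient plus Gaussian white noise
raw = [1900.0 + random.gauss(0, 90) for _ in range(1000)]

# Averaging non-overlapping 10-sample blocks mimics a 10x longer integration time
block = [statistics.mean(raw[i:i + 10]) for i in range(0, len(raw), 10)]

print(statistics.stdev(raw))    # scatter of the raw readings (~90 ppb)
print(statistics.stdev(block))  # reduced scatter, roughly 90/sqrt(10) for white noise
```

    Real instruments fall short of the ideal 1/sqrt(N) scaling once drift dominates, which is consistent with the 70 ppb figure at 100 s being better than 90 ppb but not 10x better.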

  7. Utilization of open source electronic health record around the world: A systematic review

    PubMed Central

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahamdi, Maryam

    2014-01-01

    Many projects to develop Electronic Health Record (EHR) systems have been carried out in many countries. The current study reviews the published data on the utilization of open source EHR systems in countries around the world. Using free-text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in a series of stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. They create opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology offers a solution to the high costs and inflexibility associated with proprietary health information systems. PMID:24672566

  8. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  9. Exploring the Role of Value Networks for Software Innovation

    NASA Astrophysics Data System (ADS)

    Morgan, Lorraine; Conboy, Kieran

    This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms: one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has much in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative, and hence more valuable, information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to its use. However, one shortfall of agile development in particular is the narrow focus on a single customer representative. In response, we argue that current thinking regarding innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to develop a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.

  10. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  11. Openness, Web 2.0 Technology, and Open Science

    ERIC Educational Resources Information Center

    Peters, Michael A.

    2010-01-01

    Open science is a term that is being used in the literature to designate a form of science based on open source models or that utilizes principles of open access, open archiving and open publishing to promote scientific communication. Open science increasingly also refers to open governance and more democratized engagement and control of science…

  12. The Use of Open Source Software in the Global Land Ice Measurements From Space (GLIMS) Project, and the Relevance to Institutional Cooperation

    Treesearch

    Christopher W. Helm

    2006-01-01

    GLIMS is a NASA funded project that utilizes Open-Source Software to achieve its goal of creating a globally complete inventory of glaciers. The participation of many international institutions and the development of on-line mapping applications to provide access to glacial data have both been enhanced by Open-Source GIS capabilities and play a crucial role in the...

  13. SWMM5 Application Programming Interface and PySWMM: A ...

    EPA Pesticide Factsheets

    In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interface wrapper to SWMM5, with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ... The purpose of this work is to increase the utility of the SWMM DLL by creating a Toolkit API for accessing its functionality. The utility of the Toolkit is further enhanced with a wrapper that allows access from the Python scripting language. This work is being pursued as part of an Open Source development strategy and is being performed by volunteer software developers.
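    The wrapper pattern described above, a Python layer over a native simulation engine that is stepped to completion, can be sketched as follows. This is illustrative only: the engine below is a pure-Python stub, and its method names are hypothetical, not the actual SWMM5 or PySWMM API.

```python
class StubEngine:
    """Stand-in for a native simulation engine such as the SWMM5 DLL
    (method names hypothetical, loosely modeled on a C toolkit API)."""
    def __init__(self, duration_s, step_s=300.0):
        self.t = 0.0
        self.duration = duration_s
        self.step_s = step_s

    def engine_open(self):
        self.t = 0.0

    def engine_step(self):
        # Advance one routing step; return seconds of simulation remaining
        self.t += self.step_s
        return self.duration - self.t

    def engine_close(self):
        pass


class Simulation:
    """Pythonic wrapper: a context manager that opens/closes the engine
    and iterates over routing steps, the pattern a toolkit wrapper enables."""
    def __init__(self, engine):
        self.engine = engine

    def __enter__(self):
        self.engine.engine_open()
        return self

    def __exit__(self, *exc):
        self.engine.engine_close()

    def __iter__(self):
        return self

    def __next__(self):
        if self.engine.engine_step() <= 0:
            raise StopIteration
        return self.engine.t


times = []
with Simulation(StubEngine(duration_s=3600)) as sim:
    for t in sim:
        times.append(t)
print(len(times))  # number of completed 5-minute steps before the end time
```

    The value of exposing the engine through such an API is exactly this: callers can observe and intervene at every step instead of running the model as an opaque batch job.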

  14. APPARATUS FOR PRODUCING IONS OF VAPORIZABLE MATERIALS

    DOEpatents

    Starr, C.

    1957-11-19

    This patent relates to electronic discharge devices used as ion sources, and in particular describes an ion source for application in a calutron. The source utilizes two cathodes disposed at opposite ends of a longitudinal opening in an arc block fed with vaporized material. A magnetic field is provided parallel to the length of the arc block opening. The electrons from the cathodes are directed through slits in collimating electrodes into the arc block parallel to the magnetic field and cause an arc discharge to occur between the cathodes, as the arc block and collimating electrodes are at a positive potential with respect to the cathode. The ions are withdrawn by suitable electrodes disposed opposite the arc block opening. When such an ion source is used in a calutron, an arc discharge of increased length may be utilized, thereby increasing the efficiency and economy of operation.

  15. Vector-Based Data Services for NASA Earth Science

    NASA Astrophysics Data System (ADS)

    Rodriguez, J.; Roberts, J. T.; Ruvane, K.; Cechini, M. F.; Thompson, C. K.; Boller, R. A.; Baynes, K.

    2016-12-01

    Vector data sources offer opportunities for mapping and visualizing science data in a way that allows more customizable rendering and deeper data analysis than traditional raster images, and popular formats like GeoJSON and Mapbox Vector Tiles allow diverse types of geospatial data to be served in a high-performance, easily consumed package. Vector data is especially suited to highly dynamic mapping applications and visualization of complex datasets, while growing levels of support for vector formats and features in open-source mapping clients have made utilizing them easier and more powerful than ever. NASA's Global Imagery Browse Services (GIBS) is working to make NASA data more easily and conveniently accessible by serving vector datasets via GeoJSON, Mapbox Vector Tiles, and raster images. This presentation will review these output formats and the services, including WFS, WMS, and WMTS, that can be used to access the data, as well as some ways in which vector sources can be utilized in popular open-source mapping clients like OpenLayers. Lessons learned from GIBS' recent move toward serving vector data will be discussed, as well as how to use GIBS open source software to create, configure, and serve vector data sources using MapServer and the GIBS OnEarth Apache module.
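    GeoJSON, one of the formats mentioned, is plain JSON and straightforward to consume in any language, which is much of its appeal for the "deeper data analysis" the abstract describes. A minimal sketch computing a bounding box over Point features (the FeatureCollection here is invented for illustration):

```python
import json

geojson = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature", "properties": {"name": "A"},
     "geometry": {"type": "Point", "coordinates": [-120.5, 38.2]}},
    {"type": "Feature", "properties": {"name": "B"},
     "geometry": {"type": "Point", "coordinates": [-118.1, 36.7]}}
  ]
}""")

def bbox(fc):
    """Return (min_lon, min_lat, max_lon, max_lat) over Point features,
    following GeoJSON's longitude-first coordinate order."""
    lons = [f["geometry"]["coordinates"][0] for f in fc["features"]]
    lats = [f["geometry"]["coordinates"][1] for f in fc["features"]]
    return min(lons), min(lats), max(lons), max(lats)

print(bbox(geojson))  # (-120.5, 36.7, -118.1, 38.2)
```

    A raster tile offers no comparable access to the underlying features; with vector data, properties and geometry stay queryable all the way to the client.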

  16. Bayesian Model Development for Analysis of Open Source Information to Support the Assessment of Nuclear Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.

    2013-07-15

    Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
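    The kind of evidence integration these models perform can be illustrated, at toy scale, by a single Bayesian update (all probabilities below are invented for illustration; the PNNL models are far larger and interrelated):

```python
# Invented prior and likelihoods for a single hypothesis/indicator pair
p_program = 0.10                # prior belief that a program exists
p_ind_given_program = 0.80      # chance of seeing the indicator if it does
p_ind_given_no_program = 0.20   # false-positive rate from innocuous activity

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H | E) = P(E | H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

p = posterior(p_program, p_ind_given_program, p_ind_given_no_program)
print(round(p, 3))  # 0.308: belief rises from 0.10 after observing the indicator
```

    A full Bayesian network chains many such updates across interdependent variables, which is what lets the interrelated sub-models combine different expertise and data sources coherently.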

  17. Developing a GIS for CO2 analysis using lightweight, open source components

    NASA Astrophysics Data System (ADS)

    Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.

    2012-12-01

    There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote-sensing carbon dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components to form a GIS. Utilizing widely popular open source components, including the schema-less database MongoDB, Leaflet interactive maps, the Highcharts JavaScript graphing library, and Python Bottle web services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated into a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to offer a rapid way to provide GIS features early in the development of an application and a path toward utilizing more capable GIS technology in the future.

  18. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.

  19. compomics-utilities: an open-source Java library for computational proteomics.

    PubMed

    Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart

    2011-03-08

    The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. Even though the purpose of these tools can vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows developers to focus on the novel aspects of their tools rather than on the basic functions, which can contribute substantially to faster development and better tools for proteomics.
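    compomics-utilities itself is a Java library; as an illustration of one basic "support structure" such libraries provide, here is a pure-Python sketch of peptide monoisotopic mass calculation from residue masses (only a subset of amino acids shown; not the compomics-utilities API):

```python
# Monoisotopic residue masses in daltons for a few amino acids;
# one water is added per peptide for the free N- and C-termini.
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203,
           "V": 99.06841, "K": 128.09496}
WATER = 18.01056

def monoisotopic_mass(peptide):
    """Sum the residue masses of a peptide string and add one water."""
    return sum(RESIDUE[aa] for aa in peptide) + WATER

print(round(monoisotopic_mass("GASV"), 4))  # 332.1696
```

    Sequence handling of this sort is exactly the kind of shared plumbing the abstract argues should live in a common library rather than be reimplemented per tool.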

  20. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government-owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of... referent determined either from theory or SME opinion. 4. CGM Overview The CGM is a government-owned, open source, data-driven multi-agent social... HSCB, validation, social network analysis ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  1. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  2. Open-source colorimeter.

    PubMed

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to the design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed, and its performance is compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD measurement by two orders of magnitude, making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors for those most in need of it: laboratories in the under-developed and developing world.
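    The colorimetric COD method underlying the instrument rests on Beer-Lambert absorbance read against a linear calibration curve. A self-contained sketch with invented calibration points (not the paper's data):

```python
import math

def absorbance(i0, i):
    """Beer-Lambert absorbance A = log10(I0 / I) from reference (blank)
    and sample light intensities."""
    return math.log10(i0 / i)

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented calibration standards: absorbance vs. COD concentration (mg/L)
cod_std = [0.0, 250.0, 500.0, 750.0]
abs_std = [0.00, 0.10, 0.20, 0.30]
slope, intercept = fit_line(abs_std, cod_std)

# Hypothetical sensor counts: blank vs. digested sample
unknown = absorbance(1000.0, 708.0)
print(round(slope * unknown + intercept))  # estimated COD of the sample, mg/L
```

    The commercial and open-source instruments differ in optics and cost, not in this underlying arithmetic, which is why low-cost sensors can reproduce the method.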

  3. Open-Source Colorimeter

    PubMed Central

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to the design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed, and its performance is compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD measurement by two orders of magnitude, making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors for those most in need of it: laboratories in the under-developed and developing world. PMID:23604032

  4. Open-source meteor detection software for low-cost single-board computers

    NASA Astrophysics Data System (ADS)

    Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.

    2016-01-01

    This work aims to overcome the current price threshold of meteor stations which can sometimes deter meteor enthusiasts from owning one. In recent years small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.
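    The core trigger in such detection software is typically frame differencing against a background estimate. A deliberately tiny sketch on 1-D "frames" (illustrative only, not the authors' algorithm):

```python
def detect(frames, threshold=50):
    """Flag frame indices whose maximum pixel deviation from a slowly
    updated background exceeds the threshold: a crude transient trigger."""
    hits = []
    background = list(frames[0])
    for idx, frame in enumerate(frames[1:], start=1):
        if max(abs(p - b) for p, b in zip(frame, background)) > threshold:
            hits.append(idx)
        # Exponential moving average keeps the background tracking the sky
        background = [0.9 * b + 0.1 * p for b, p in zip(background, frame)]
    return hits

# Toy 4-pixel frames: a bright streak appears only in frame 2
frames = [[10, 10, 10, 10],
          [10, 12, 11, 10],
          [10, 200, 180, 10],
          [10, 11, 10, 10]]
print(detect(frames))  # [2]
```

    Keeping the per-frame work this simple (differences, a threshold, a running average) is what makes real-time detection feasible on low-cost single-board computers.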

  5. Biomass utilization modeling on the Bitterroot National Forest

    Treesearch

    Robin P. Silverstein; Dan Loeffler; J. Greg Jones; Dave E. Calkin; Hans R. Zuuring; Martin Twer

    2006-01-01

    Utilization of small-sized wood (biomass) from forests as a potential source of renewable energy is an increasingly important aspect of fuels management on public lands as an alternative to traditional disposal methods (open burning). The potential for biomass utilization to enhance the economics of treating hazardous forest fuels was examined on the Bitterroot...

  6. Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy.

    PubMed

    Görlitz, Frederik; Kelly, Douglas J; Warren, Sean C; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J; Stuhmeier, Frank; Neil, Mark A A; Tate, Edward W; Dunsby, Christopher; French, Paul M W

    2017-01-18

    We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set.
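    The FLIM-FRET readout described above ultimately reduces to comparing donor fluorescence lifetimes with and without an interacting acceptor. A one-function sketch of that relationship (lifetime values invented for illustration):

```python
def fret_efficiency(tau_donor, tau_donor_acceptor):
    """FRET efficiency from donor fluorescence lifetimes (same units):
    E = 1 - tau_DA / tau_D, where tau_D is the unquenched donor lifetime
    and tau_DA the donor lifetime in the presence of the acceptor."""
    return 1.0 - tau_donor_acceptor / tau_donor

# Illustrative lifetimes: 2.5 ns unquenched donor, 1.5 ns when FRET occurs
print(fret_efficiency(2.5, 1.5))  # efficiency of 0.4 indicates interaction
```

    Because the lifetime ratio is intensity-independent, this readout is robust across the wells of a plate, which is what makes automated multiwell FLIM screening practical.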

  7. Efficient radiologic reading environment by using an open-source macro program as connection software.

    PubMed

    Lee, Young Han

    2012-01-01

    The objectives are (1) to introduce an easy open-source macro program as connection software and (2) to illustrate its practical usage in the radiologic reading environment by simulating the radiologic reading process. The simulation is a set of radiologic reading processes performing practical tasks in the radiologic reading room. The principal processes are: (1) viewing radiologic images on the Picture Archiving and Communicating System (PACS), (2) connecting to the HIS/EMR (Hospital Information System/Electronic Medical Record) system, (3) creating an automatic radiologic reporting system, and (4) recording and recalling information on interesting cases. This simulation environment was designed using an open-source macro program as connection software. The simulation performed well on a Windows-based PACS workstation, and radiologists practiced its steps comfortably in the macro-powered radiologic environment. The macro program could automate several cumbersome manual steps in the radiologic reading process, and it successfully acts as connection software for the PACS software, EMR/HIS, spreadsheets, and various input devices in the radiologic reading environment. A user-friendly, efficient radiologic reading environment can be established by utilizing an open-source macro program as connection software.

  8. Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy

    PubMed Central

    Warren, Sean C.; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A.; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J.; Stuhmeier, Frank; Neil, Mark A. A.; Tate, Edward W.; Dunsby, Christopher; French, Paul M. W.

    2017-01-01

    We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set. PMID:28190060

  9. Open source electronic health records and chronic disease management.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-02-01

    To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status for homeless patients. The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management, hypertension control, and tuberculosis vaccination rates were supported by the use of these open source systems. The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary chronic disease care among populations served by CHC.

  10. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    PubMed Central

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore the mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. Interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties, and their interactions. Three use cases illustrate the customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age groups and pathological states. PMID:26444849

  11. Open Knee: Open Source Modeling and Simulation in Knee Biomechanics.

    PubMed

    Erdemir, Ahmet

    2016-02-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical functions of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor-intensive reproduction of model development steps can be avoided. Interested parties can immediately utilize readily available models for scientific discovery and clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes the detailed anatomical representation of the joint's major tissue structures and their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators is also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next-generation knee models is noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age groups and pathological states. Thieme Medical Publishers, 333 Seventh Avenue, New York, NY 10001, USA.

  12. VIVO Open Source Software: Connecting Facilities to Promote Discovery and Further Research.

    NASA Astrophysics Data System (ADS)

    Gross, M. B.; Rowan, L. R.; Mayernik, M. S.; Daniels, M. D.; Stott, D.; Allison, J.; Maull, K. E.; Krafft, D. B.; Khan, H.

    2016-12-01

    EarthCollab (http://earthcube.org/group/earthcollab), a National Science Foundation (NSF) EarthCube Building Block project, has adapted an open source semantic web application, VIVO, for use within the earth science domain. EarthCollab is a partnership between UNAVCO, an NSF facility supporting research through geodetic services; the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR); and Cornell University, where VIVO was created to highlight the scholarly output of university researchers. Two public sites have been released: Connect UNAVCO (connect.unavco.org) and Arctic Data Connects (vivo.eol.ucar.edu). The core VIVO software and ontology have been extended to better handle concepts central to UNAVCO's and EOL's work, such as principal investigators for continuous GPS/GNSS stations at UNAVCO and keywords describing cruise datasets at EOL. The sites increase the discoverability of large and diverse data archives by linking data with people, research, and field projects. Disambiguation is a major challenge when using VIVO and open data, where "anyone can say anything about anything." Concepts and controlled vocabularies help to build consistent and easily searchable connections within VIVO. We use aspects of subject heading services such as FAST and LOC, as well as AGU and GSA fields of research and subject areas, to reveal connections, especially with VIVO instances at other institutions. VIVO works effectively with persistent identifiers, and the projects strive to utilize publication and data DOIs, ORCIDs for people, and ISNI and GRID identifiers for organizations. ORCID, an open source project, is very useful for disambiguation and, unlike other identifier systems for people developed by publishers, makes public data available via an API. VIVO utilizes Solr and Freemarker, which are open source search engine and templating technologies, respectively. Additionally, a handful of popular open source libraries and applications are used in the project, such as D3.js, jQuery, Leaflet, and Elasticsearch. Our implementation of these open source projects within VIVO is available for adaptation by other institutions using VIVO via GitHub (git.io/vG9AJ).

  13. The use of open source electronic health records within the federal safety net.

    PubMed

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source was restrained due to a negative connotation regarding this type of software. In addition, a number of participants stated that there is a necessary level of technical acumen needed within the FQHC to make an open source EHR effective. An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness.

  14. Open source electronic health records and chronic disease management

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To study and report on the use of open source electronic health records (EHR) to assist with chronic care management within safety net medical settings, such as community health centers (CHC). Methods and Materials The study was conducted by NORC at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to CHC that currently use an open source EHR. Results Two of the sites chosen by NORC were actively using an open source EHR to assist in the redesign of their care delivery system to support more effective chronic disease management. This included incorporating the chronic care model into a CHC and using the EHR to help facilitate its elements, such as care teams for patients, in addition to maintaining health records on indigent populations, such as tuberculosis status for homeless patients. Discussion The ability to modify the open-source EHR to adapt to the CHC environment and leverage the ecosystem of providers and users to assist in this process provided significant advantages in chronic care management. Improvements in diabetes management, hypertension control, and tuberculosis vaccination rates were supported by the use of these open source systems. Conclusions The flexibility and adaptability of open source EHR demonstrated its utility and viability in the provision of necessary chronic disease care among populations served by CHC. PMID:23813566

  15. SLURM: Simple Linux Utility for Resource Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M; Grondona, M

    2002-12-19

    Simple Linux Utility for Resource Management (SLURM) is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for Linux clusters of thousands of nodes. Components include machine status, partition management, job management, scheduling and stream copy modules. This paper presents an overview of the SLURM architecture and functionality.
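
    The job management component described above is driven by user-submitted batch scripts. As a hedged illustration (the `#SBATCH` directives below are standard SLURM options, but the job name and resource values are hypothetical), a minimal batch script might look like:

```shell
#!/bin/bash
# Minimal SLURM batch script (illustrative sketch; job name and
# resource requests are hypothetical values, not from the paper).
#SBATCH --job-name=demo          # name shown in the squeue job listing
#SBATCH --nodes=2                # number of nodes to allocate
#SBATCH --ntasks-per-node=4      # tasks (processes) launched per node
#SBATCH --time=00:10:00          # wall-clock limit (HH:MM:SS)

srun hostname                    # run one copy of hostname per task
```

    Such a script is submitted with `sbatch`, monitored with `squeue`, and cancelled with `scancel`; `sinfo` reports the machine and partition status mentioned in the abstract.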

  16. SLURM: Simple Linux Utility for Resource Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M; Grondona, M

    2003-04-22

    Simple Linux Utility for Resource Management (SLURM) is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for Linux clusters of thousands of nodes. Components include machine status, partition management, job management, scheduling, and stream copy modules. This paper presents an overview of the SLURM architecture and functionality.

  17. Source-sink-storage relationships of conifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luxmoore, R.J.; Oren, R.; Sheriff, D.W.

    1995-07-01

    Irradiance, air temperature, saturation vapor pressure deficit, and soil temperature vary in association with Earth's daily rotation, inducing significant hourly changes in the rates of plant physiological processes. These processes include carbon fixation in photosynthesis, sucrose translocation, and carbon utilization in growth, storage, and respiration. The sensitivity of these physiological processes to environmental factors such as temperature, soil water availability, and nutrient supply reveals differences that must be viewed as an interactive whole in order to comprehend whole-plant responses to the environment. Integrative frameworks for relationships between plant physiological processes are needed to provide syntheses of plant growth and development. Source-sink-storage relationships, addressed in this chapter, provide one framework for synthesis of whole-plant responses to external environmental variables. To address this issue, some examples of carbon assimilation and utilization responses of five conifer species to environmental factors from a range of field environments are first summarized. Next, the interactions between sources, sinks, and storages of carbon are examined at the leaf and tree scales, and finally, the review evaluates the proposition that processes involved with carbon utilization (sink activity) are more sensitive to the supply of water and nutrients (particularly nitrogen) than are the processes of carbon gain (source activity) and carbon storage. The terms "sink" and "source" refer to carbon utilization and carbon gain, respectively. The relative roles of stored carbon reserves and of current photosynthate in meeting sink demand are addressed. Discussions focus on source-sink-storage relationships within the diurnal, wetting-drying, and annual cycles of conifer growth and development, and some discussion of life cycle aspects is also presented.

  18. The use of open source electronic health records within the federal safety net

    PubMed Central

    Goldwater, Jason C; Kwon, Nancy J; Nathanson, Ashley; Muckle, Alison E; Brown, Alexa; Cornejo, Kerri

    2014-01-01

    Objective To conduct a federally funded study that examines the acquisition, implementation and operation of open source electronic health records (EHR) within safety net medical settings, such as federally qualified health centers (FQHC). Methods and materials The study was conducted by the National Opinion Research Center (NORC) at the University of Chicago from April to September 2010. The NORC team undertook a comprehensive environmental scan, including a literature review, a dozen key informant interviews using a semistructured protocol, and a series of site visits to West Virginia, California and Arizona FQHC that were currently using an open source EHR. Results Five of the six sites that were chosen as part of the study found a number of advantages in the use of their open source EHR system, such as utilizing a large community of users and developers to modify their EHR to fit the needs of their provider and patient communities, and lower acquisition and implementation costs as compared to a commercial system. Discussion Despite these advantages, many of the informants and site visit participants felt that widespread dissemination and use of open source was restrained due to a negative connotation regarding this type of software. In addition, a number of participants stated that there is a necessary level of technical acumen needed within the FQHC to make an open source EHR effective. Conclusions An open source EHR provides advantages for FQHC that have limited resources to acquire and implement an EHR, but additional study is needed to evaluate its overall effectiveness. PMID:23744787

  19. OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia

    PubMed Central

    2012-01-01

    Background The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data and its automatic processing. Results The following related ontologies have been developed for OpenTox: a) Toxicological ontology – listing the toxicological endpoints; b) Organs system and Effects ontology – addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology – representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology – representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink–ToxCast assays ontology and f) OpenToxipedia community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, seamless integration of new algorithms and scientifically sound validation routines, and provide a flexible framework, which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists).
    Availability The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl; the ToxML-OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/. PMID:22541598

  20. OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia.

    PubMed

    Tcheremenskaia, Olga; Benigni, Romualdo; Nikolova, Ivelina; Jeliazkova, Nina; Escher, Sylvia E; Batke, Monika; Baier, Thomas; Poroikov, Vladimir; Lagunin, Alexey; Rautenberg, Micha; Hardy, Barry

    2012-04-24

    The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data and its automatic processing. The following related ontologies have been developed for OpenTox: a) Toxicological ontology - listing the toxicological endpoints; b) Organs system and Effects ontology - addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology - representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology - representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink-ToxCast assays ontology and f) OpenToxipedia community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, seamless integration of new algorithms and scientifically sound validation routines, and provide a flexible framework, which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists).
    The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl; the ToxML-OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/.

  1. scikit-image: image processing in Python.

    PubMed

    van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org.
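
    As a minimal sketch of the well-documented API described above (a hedged example of ours, not drawn from the paper itself), the library's edge-detection utilities can be exercised in a few lines:

```python
import numpy as np
from skimage import filters

# Small synthetic image: a bright square on a dark background.
image = np.zeros((8, 8), dtype=float)
image[2:6, 2:6] = 1.0

# The Sobel filter responds only where intensity changes,
# i.e. along the edges of the square.
edges = filters.sobel(image)

print(edges.shape)   # same shape as the input: (8, 8)
print(edges[0, 0])   # 0.0 -- flat background has no edge response
```

    The same pattern (a NumPy array in, a NumPy array out) applies across the library's modules, which is what makes scikit-image easy to combine with the wider scientific Python stack.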

  2. Construction and Commissioning of A 248 m-long Beamline with X-ray Undulator Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suzuki, Yoshio; Uesugi, Kentaro; Takimoto, Naoki

    2004-05-12

    A medium-length beamline with an undulator source, BL20XU at SPring-8, was constructed and opened for public use. The distance from the source point to the end of the beamline is 248 m. By utilizing the long beam transport path, the beamline offers advantages for experiments that require high spatial coherence in the hard X-ray region.

  3. SwarmSight: Measuring the Temporal Progression of Animal Group Activity Levels from Natural Scene and Laboratory Videos

    PubMed Central

    Birgiolas, Justas; Jernigan, Christopher M.; Smith, Brian H.; Crook, Sharon M.

    2016-01-01

    We describe SwarmSight (available at: https://github.com/justasb/SwarmSight), a novel, open-source, Microsoft Windows software tool for quantitative assessment of the temporal progression of animal group activity levels from recorded videos. The tool utilizes a background subtraction machine vision algorithm and provides an activity metric that can be used to quantitatively assess and compare animal group behavior. Here we demonstrate the tool's utility by analyzing defensive bee behavior as modulated by alarm pheromones, wild bird feeding onset and interruption, and cockroach nest-finding activity. While more sophisticated commercial software packages are available, SwarmSight provides a low-cost, open-source, and easy-to-use alternative that is suitable for a wide range of users, including minimally trained research technicians and behavioral science undergraduate students in classroom laboratory settings. PMID:27130170
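
    The frame-differencing idea behind such an activity metric can be sketched in a few lines. This is an illustrative simplification under our own assumptions, not SwarmSight's actual implementation; the function name and threshold value are hypothetical:

```python
import numpy as np

def activity_metric(frames, threshold=10):
    """Per-frame activity: number of pixels whose absolute change
    from the previous frame exceeds `threshold`.

    Illustrative sketch of frame-differencing background subtraction;
    not SwarmSight's actual code."""
    activity = []
    prev = frames[0].astype(np.int16)  # widen dtype so differences don't wrap
    for frame in frames[1:]:
        cur = frame.astype(np.int16)
        diff = np.abs(cur - prev)                  # subtract previous frame
        activity.append(int((diff > threshold).sum()))
        prev = cur                                 # current frame becomes background
    return activity

# Two 4x4 grayscale "frames": a single pixel brightens by 50 levels.
f0 = np.zeros((4, 4), dtype=np.uint8)
f1 = f0.copy()
f1[1, 1] = 50
print(activity_metric([f0, f1]))  # [1]
```

    Summing or thresholding such per-frame counts over time yields the kind of temporal activity curve the tool reports for a whole video.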

  4. Sensitive detection of chemical agents and toxic industrial chemicals using active open-path FTIRs

    NASA Astrophysics Data System (ADS)

    Walter, William T.

    2004-03-01

    Active open-path FTIR sensors provide more sensitive detection of chemical agents than passive FTIRs, such as the M21 RSCAAL and JSLSCAD, and at the same time identify and quantify toxic industrial chemicals (TIC). Passive FTIRs are bistatic sensors relying on infrared sources of opportunity. Utilization of earth-based sources of opportunity limits the source temperatures available for passive chemical-agent FTIR sensors to 300 K. Active FTIR chemical-agent sensors utilize silicon carbide sources, which can be operated at 1500 K. The higher source temperature provides more than an 80-fold increase in the infrared radiant flux emitted per unit area in the 7 to 14 micron spectral fingerprint region. Minimum detection limits are better than 5 μg/m³ for GA, GB, GD, GF and VX. Active FTIR sensors can (1) assist first responders and emergency response teams in their assessment of and reaction to a terrorist threat, (2) provide information on the identification of the TIC present and their concentrations and (3) contribute to the understanding and prevention of debilitating disorders analogous to the Gulf War Syndrome for military and civilian personnel.

  5. An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.

    PubMed

    Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K

    2014-01-01

    Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is daunting for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools along with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets. It has been developed in Java and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software.

  6. Generalizable open source urban water portfolio simulation framework demonstrated using a multi-objective risk-based planning benchmark problem.

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.

    2017-12-01

    The growing access to and reduced cost of computing power in recent years have promoted rapid development and application of multi-objective water supply portfolio planning. As this trend continues, there is a pressing need for flexible risk-based simulation frameworks and improved algorithm benchmarking for emerging classes of water supply planning and management problems. This work contributes the Water Utilities Management and Planning (WUMP) model: a generalizable and open source simulation framework designed to capture how water utilities can minimize operational and financial risks by regionally coordinating planning and management choices, i.e. making more efficient and coordinated use of restrictions, water transfers and financial hedging combined with possible construction of new infrastructure. We introduce the WUMP simulation framework as part of a new multi-objective benchmark problem for the planning and management of regionally integrated water utility companies. In this problem, a group of fictitious water utilities seek to balance the use of these reliability-driven actions (e.g., restrictions, water transfers and infrastructure pathways) against their inherent financial risks. Several traits make this an ideal benchmark problem, namely the presence of (1) strong non-linearities and discontinuities in the Pareto front caused by the step-wise nature of the decision-making formulation and by the abrupt addition of storage through infrastructure construction, (2) noise due to the stochastic nature of the streamflows and water demands, and (3) non-separability resulting from the cooperative formulation of the problem, in which decisions made by one stakeholder may substantially impact others. Both the open source WUMP simulation framework and its demonstration in a challenging benchmarking example hold value for promoting broader advances in urban water supply portfolio planning for regions confronting change.

  7. SLURM: Simple Linux Utility for Resource Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M; Dunlap, C; Garlick, J

    2002-07-08

    Simple Linux Utility for Resource Management (SLURM) is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for Linux clusters of thousands of nodes. Components include machine status, partition management, job management, scheduling and stream copy modules. The design also includes a scalable, general-purpose communication infrastructure. This paper presents an overview of the SLURM architecture and functionality.

  8. The importance of using open source technologies and common standards for interoperability within eHealth: Perspectives from the Millennium Villages Project

    PubMed Central

    Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt

    2013-01-01

    The purpose of this paper is to demonstrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems, illustrated through case studies where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051

  9. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called the Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  10. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.

  11. scikit-image: image processing in Python

    PubMed Central

    Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921
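    As a brief illustration of the kind of utility the library provides (this snippet is ours, not from the paper), Otsu thresholding of a bundled sample image takes only a few lines; it assumes scikit-image is installed:

```python
# Minimal sketch of the scikit-image API: segment a sample image with Otsu's method.
from skimage import data, filters

image = data.camera()                   # bundled 512x512 grayscale test image
thresh = filters.threshold_otsu(image)  # global threshold chosen by Otsu's method
binary = image > thresh                 # boolean foreground/background mask

print(image.shape, float(thresh))
```

    The same pattern (load or supply an array, then apply a function from a submodule such as `filters`, `measure`, or `morphology`) covers most of the library's algorithms.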

  12. Energy

    DTIC Science & Technology

    2003-01-01

    and universal service. In 1978 the Public Utility Regulatory Policy Act (PURPA) was enacted to permit non-utilities to enter the electric power...and renewable resources as alternate sources for electricity.[87] PURPA opened the door to a new paradigm – power didn’t have to come from large...industry and provided the basis for the current structure of the entire power industry. EPACT and PURPA have freed, in an economic sense, most power

  13. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness which brings forth marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review on the utilization of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics research. The authors explain how they tapped upon and made use of an open source repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially so if the application needs interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to further innovate, as less time is spent in the iterative process.

  14. "Do-It-Yourself" reliable pH-stat device by using open-source software, inexpensive hardware and available laboratory equipment

    PubMed Central

    Kragic, Rastislav; Kostic, Mirjana

    2018-01-01

    In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source “OpenPhControl” software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is given in terms of pump control and application of a larger syringe. In addition to the basic functions of pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device’s utility when applied to cellulose fiber oxidation with 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. for TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that open-source scientific tools are available to everyone, that researchers can construct and adjust the device according to their needs, and that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glassware), was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work where the use of pH stat is necessary. PMID:29509793
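    The feedback loop such a pH-stat implements can be sketched in a few lines: read the probe, compare against the setpoint, and dose base in proportion to the error. The sketch below is ours, not OpenPhControl code; the pump interface, gain, and simulated chemistry are illustrative stand-ins.

```python
# Hedged sketch of a pH-stat feedback loop: measure pH, compare to a setpoint,
# dose reagent proportionally to the error. Not the actual OpenPhControl API.

SETPOINT = 10.0      # target pH (TEMPO-mediated oxidation runs alkaline)
GAIN = 0.5           # proportional gain: mL of base dosed per unit of pH error

def control_step(ph_reading, dose_pump):
    """One iteration of a proportional pH-stat loop; returns the dosed volume."""
    error = SETPOINT - ph_reading
    if error > 0:                    # pH has drifted acidic: add base
        volume = GAIN * error        # dose proportional to the error
        dose_pump(volume)
        return volume
    return 0.0                       # at or above setpoint: no dosing

# Toy simulation: assume each mL of base raises the simulated pH by 0.2 units.
ph = 9.2
dosed = []
for _ in range(20):
    v = control_step(ph, dosed.append)
    ph += 0.2 * v
print(round(ph, 2))   # the simulated pH converges toward the setpoint
```

    In the real device the `dose_pump` callback would drive the peristaltic or syringe pump and the reading would come from the pH meter; proportional dosing is just the simplest of several strategies a pH stat may use.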

  15. "Do-It-Yourself" reliable pH-stat device by using open-source software, inexpensive hardware and available laboratory equipment.

    PubMed

    Milanovic, Jovana Z; Milanovic, Predrag; Kragic, Rastislav; Kostic, Mirjana

    2018-01-01

    In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source "OpenPhControl" software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is given in terms of pump control and application of a larger syringe. In addition to the basic functions of pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device's utility when applied to cellulose fiber oxidation with 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. for TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that open-source scientific tools are available to everyone, that researchers can construct and adjust the device according to their needs, and that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glassware), was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work where the use of pH stat is necessary.

  16. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory or I/O between virtual machines created and managed by the different solutions, (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.

  17. A Scalable, Open Source Platform for Data Processing, Archiving and Dissemination

    DTIC Science & Technology

    2016-01-01

    Object Oriented Data Technology (OODT) big data toolkit developed by NASA and the Work-flow INstance Generation and Selection (WINGS) scientific work...to several challenging big data problems and demonstrated the utility of OODT-WINGS in addressing them. Specific demonstrated analyses address i...source software, Apache, Object Oriented Data Technology, OODT, semantic work-flows, WINGS, big data, work-flow management

  18. An Open Avionics and Software Architecture to Support Future NASA Exploration Missions

    NASA Technical Reports Server (NTRS)

    Schlesinger, Adam

    2017-01-01

    The presentation describes an avionics and software architecture that has been developed through NASA's Advanced Exploration Systems (AES) division. The architecture is open-source, highly reliable with fault tolerance, and utilizes standard capabilities and interfaces, which are scalable and customizable to support future exploration missions. Specific focus areas of discussion include command and data handling, software, human interfaces, communication and wireless systems, and systems engineering and integration.

  19. An Integrated SNP Mining and Utilization (ISMU) Pipeline for Next Generation Sequencing Data

    PubMed Central

    Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M.; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A. V. S. K.; Varshney, Rajeev K.

    2014-01-01

    Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and expertise, which is a daunting task for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools along with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization via the development of genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at a fast speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise, enabling them to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets. It has been developed in Java and is available at http://hpc.icrisat.cgiar.org/ISMU as standalone free software. PMID:25003610

  20. Relationship Between Climate Change Impact, Migration and Socioeconomic Development

    NASA Astrophysics Data System (ADS)

    Sann Oo, Kyaw

    2016-06-01

    Geospatial data are available in raster and vector formats, and some are provided as open data. The techniques and tools to handle these data are likewise available as open source. Although free of charge, the knowledge needed to utilize these data is limited to those educated in the specific field. In developing countries, the data and technology should be promoted so that they can be applied, at no cost, in the fields that need them. Before use, open data must be verified against local knowledge so that they become usable information for local people and a valuable data resource. Developing countries whose economies are based on agriculture require more precise weather data and information on weather variation under climate change impacts for their socioeconomic development. This study found that rural-to-urban migration occurs in agriculture-based developing countries such as Myanmar when the agricultural economy is affected by unpredictable climate change impacts. Sharing knowledge from open data resources with local people who lack formal training in the field is one possible remedy, supporting the development of the country's agricultural economy. Moreover, the study will explore ways to reduce rural-to-urban migration.

  1. A Cloud-based, Open-Source, Command-and-Control Software Paradigm for Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Melton, R.; Thomas, J.

    With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of software systems utilized to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been constructed with "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging, alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower costs to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled command-and-control software architecture that provides several unique capabilities to move the current legacy SSA software paradigm to an open-source model that effectively enables pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This characteristic enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of both ground and on-orbit assets simple and straightforward.

  2. Open source EMR software: profiling, insights and hands-on analysis.

    PubMed

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of available open source applications online is growing and they are gaining greater prominence. This repertoire of open source options is of a great value for any future-planner interested in adopting an electronic medical/health record system, whether selecting an existent application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existent and active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice. 
Roughly, those aspects could be distilled into a basic taxonomy, making the literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but rewarding insights are nonetheless starting to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field still lag far behind the highly acknowledged open source products of other domains in, for example, operating-system market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited due to high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators, such as AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both the CUDA and OpenCL languages within the source code package. Seismic wave simulations are thus now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.

  4. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
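    For illustration, a minimal pure-Python sketch of one of the tool's three models, Holt-Winters additive smoothing, is shown below; the smoothing parameters and the toy test-volume series are our own assumptions, not values or code from the paper.

```python
# Hedged sketch of Holt-Winters additive smoothing for demand forecasting.
# Level, trend, and seasonal components are updated once per observation.

def holt_winters_additive(series, m, alpha=0.5, beta=0.3, gamma=0.2, horizon=4):
    """Forecast `horizon` future points of a series with seasonal period m."""
    # Initialize level, trend, and seasonals from the first two seasons.
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    season = [x - level for x in series[:m]]

    for t in range(m, len(series)):
        x = series[t]
        last_level = level
        level = alpha * (x - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (x - level) + (1 - gamma) * season[t % m]

    n = len(series)
    return [level + h * trend + season[(n + h - 1) % m]
            for h in range(1, horizon + 1)]

# Toy weekly test-volume series with a period-4 seasonal pattern and upward drift.
volumes = [100, 120, 140, 110, 104, 124, 144, 114, 108, 128, 148, 118]
forecast = holt_winters_additive(volumes, m=4)
print([round(f) for f in forecast])
```

    The forecast tracks both the drift and the repeating pattern in the toy series; a production tool would additionally fit the smoothing parameters to the data rather than fix them by hand.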

  5. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  6. ENERGY MARKETS: Concerted Actions Needed by FERC to Confront Challenges That Impede Effective Oversight

    DTIC Science & Technology

    2002-06-01

    Act PURPA Public Utilities Regulatory Policies Act QF qualifying facility RTO regional transmission organization Page 1 GAO-02-656 Energy Markets June...alternative sources of power and energy efficiency. The Public Utility Regulatory Policies Act of 1978 (PURPA) was enacted, in part, to augment electric...requirements.5 More significantly, by opening wholesale power markets to nonutility producers of electricity, PURPA laid the groundwork for increased competition

  7. Utilizing Robot Operating System (ROS) in Robot Vision and Control

    DTIC Science & Technology

    2015-09-01

    ROS, originally designed by Willow Garage and currently maintained by the Open Source Robotics Foundation, is a powerful tool because it utilizes object...Visualization The Rviz package, developed by Willow Garage, comes standard with ROS and is a powerful visualization tool that allows users to visualize

  8. PMIX_Ring patch for SLURM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moody, A. T.

    2014-04-20

    This code adds an implementation of PMIX_Ring to the existing PMI2 library in the SLURM open source software package (Simple Linux Utility for Resource Management). PMIX_Ring executes a particular communication pattern that is used to bootstrap connections between MPI processes in a parallel job.
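    Conceptually, in this ring pattern each rank contributes one value (for example, its network address) and receives only its left and right neighbors' contributions, avoiding an all-to-all exchange. The plain-Python simulation below is illustrative only; it is not the PMIx or SLURM API.

```python
# Sketch of a ring exchange like the one PMIX_Ring performs, simulated serially.
# Each rank ends up knowing only its immediate neighbors' contributions, which is
# enough to bootstrap point-to-point connections in a parallel job.

def ring_exchange(contributions):
    """Return (rank, left_value, right_value) for each rank in the ring."""
    n = len(contributions)
    results = []
    for rank in range(n):
        left = contributions[(rank - 1) % n]   # wraps around at rank 0
        right = contributions[(rank + 1) % n]  # wraps around at rank n-1
        results.append((rank, left, right))
    return results

addresses = ["node0:5000", "node1:5000", "node2:5000", "node3:5000"]
for rank, left, right in ring_exchange(addresses):
    print(f"rank {rank}: left={left} right={right}")
```

    In the real library the neighbor values arrive via messages between processes; the serial loop here only models the resulting data distribution.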

  9. Windows Memory Forensic Data Visualization

    DTIC Science & Technology

    2014-06-12

    clustering characteristics (Bastian, et al, 2009). The software is written in Java and utilizes the OpenGL library for rendering graphical content...Toolkit 2nd ed. Burlington MA: Syngress. D3noob. (2013, February 8). Using a MYSQL database as a source of data. Message posted to http

  10. Towards Cross-Organizational Innovative Business Process Interoperability Services

    NASA Astrophysics Data System (ADS)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  11. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. 
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  12. An open source high-performance solution to extract surface water drainage networks from diverse terrain conditions

    USGS Publications Warehouse

    Stanislawski, Larry V.; Survila, Kornelijus; Wendel, Jeffrey; Liu, Yan; Buttenfield, Barbara P.

    2018-01-01

    This paper describes a workflow for automating the extraction of elevation-derived stream lines using open source tools with parallel computing support and testing the effectiveness of procedures in various terrain conditions within the conterminous United States. Drainage networks are extracted from the US Geological Survey 1/3 arc-second 3D Elevation Program elevation data having a nominal cell size of 10 m. This research demonstrates the utility of open source tools with parallel computing support for extracting connected drainage network patterns and handling depressions in 30 subbasins distributed across humid, dry, and transitional climate regions and in terrain conditions exhibiting a range of slopes. Special attention is given to low-slope terrain, where network connectivity is preserved by generating synthetic stream channels through lake and waterbody polygons. Conflation analysis compares the extracted streams with a 1:24,000-scale National Hydrography Dataset flowline network and shows that similarities are greatest for second- and higher-order tributaries.
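    The core raster step that such workflows parallelize can be sketched compactly. The toy function below is an illustration of the general D8 technique, not the USGS code: each cell drains to its steepest downslope neighbor, and flow accumulation counts the upstream contributing cells from which stream lines are then thresholded.

```python
import numpy as np

def d8_flow_accumulation(dem):
    """D8 sketch: route each cell's flow to its steepest downslope neighbor,
    accumulating contributing-cell counts from high to low elevation."""
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=int)          # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]    # process from high to low
    for idx in order:
        r, c = divmod(idx, cols)
        best, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr or dc) and 0 <= rr < rows and 0 <= cc < cols:
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best:
                        best, target = drop, (rr, cc)
        if target is not None:                  # pass accumulated flow downslope
            acc[target] += acc[r, c]
    return acc

dem = np.array([[3.0, 2.0],
                [2.0, 1.0]])
print(d8_flow_accumulation(dem))   # all flow collects in the lowest cell
```

    Real workflows add depression handling and, as the paper notes for low-slope terrain, synthetic channels through waterbodies; the parallelism comes from tiling the DEM across subbasins.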

  13. Digital beacon receiver for ionospheric TEC measurement developed with GNU Radio

    NASA Astrophysics Data System (ADS)

    Yamamoto, M.

    2008-11-01

    A simple digital receiver named GNU Radio Beacon Receiver (GRBR) was developed for the satellite-ground beacon experiment to measure the ionospheric total electron content (TEC). The open-source software toolkit for software-defined radio, GNU Radio, is utilized to realize the basic functions of the receiver and perform fast signal processing. The software is written in Python for a Linux PC. The open-source hardware called the Universal Software Radio Peripheral (USRP), which best matches GNU Radio, is used as a front end to acquire the satellite beacon signals at 150 and 400 MHz. The first experiment was successful, as results from GRBR showed very good agreement with those from the co-located analog beacon receiver. Detailed design information and software code are available at http://www.rish.kyoto-u.ac.jp/digitalbeacon/.
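    The TEC retrieval rests on the dispersive phase relation between the two coherent beacon frequencies. The sketch below uses the textbook first-order ionospheric term to show how a scaled 150/400 MHz phase difference cancels the geometric range and leaves a residual proportional to TEC; the constants and toy numbers are illustrative and are not taken from the GRBR code.

```python
# Dual-frequency differential-phase sketch (textbook first-order model).
C = 299792458.0          # speed of light, m/s
K = 40.3                 # first-order ionospheric coefficient, m^3/s^2

def beacon_phase(f_hz, geometric_range_m, tec):
    """Received carrier phase in cycles: geometric term minus ionospheric advance."""
    return f_hz * geometric_range_m / C - K * tec / (C * f_hz)

def relative_tec(phi1, phi2, f1, f2):
    """Invert the scaled phase difference phi1 - (f1/f2)*phi2 for TEC.
    The geometric range cancels exactly in that combination."""
    dphi = phi1 - (f1 / f2) * phi2
    return -dphi * C * f1 / (K * (1.0 - (f1 / f2) ** 2))

f1, f2 = 150e6, 400e6    # the coherent beacon pair used by GRBR
tec_true = 2.0e17        # electrons/m^2 (20 TEC units), illustrative
r = 1.2e6                # slant range in metres, illustrative
phi1 = beacon_phase(f1, r, tec_true)
phi2 = beacon_phase(f2, r, tec_true)
print(relative_tec(phi1, phi2, f1, f2) / 1e16)  # recovered TEC in TECU
```

    In practice only relative TEC is obtained this way; absolute calibration requires additional assumptions or multi-station fitting.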

  14. An open data mining framework for the analysis of medical images: application on obstructive nephropathy microscopy images.

    PubMed

    Doukas, Charalampos; Goudas, Theodosis; Fischer, Simon; Mierswa, Ingo; Chatziioannou, Aristotle; Maglogiannis, Ilias

    2010-01-01

    This paper presents an open image-mining framework that provides access to tools and methods for the characterization of medical images. Several image processing and feature extraction operators have been implemented and exposed through Web Services. RapidMiner, an open source data mining system, has been utilized for applying classification operators and creating the essential processing workflows. The proposed framework has been applied for the detection of salient objects in Obstructive Nephropathy microscopy images. Initial classification results are quite promising, demonstrating the feasibility of automated characterization of kidney biopsy images.

  15. Vector-Based Ground Surface and Object Representation Using Cameras

    DTIC Science & Technology

    2009-12-01

    representations and it is a digital data structure used for the representation of a ground surface in geographical information systems (GIS). Figure...Vision API library, and the OpenCV library. Also, the Posix thread library was utilized to quickly capture the source images from cameras. Both

  16. Supporting Building Portfolio Investment and Policy Decision Making through an Integrated Building Utility Data Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena

    The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we will describe a building portfolio-level data analytics tool for operational optimization, business investment and policy assessment using 15-minute to monthly interval utility data. The analytics tool is developed on top of the U.S. Department of Energy’s Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers and other developers, to support initiatives in reducing building energy consumption.
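    As a rough illustration of the kind of interval-data handling described here (the series, field names, and numbers are invented, not SEED's schema), pandas makes the basic quality-control and rollup steps short:

```python
import pandas as pd
import numpy as np

# One month of 15-minute meter reads for a single hypothetical building.
idx = pd.date_range("2024-01-01", periods=96 * 31, freq="15min")
kwh = pd.Series(np.full(len(idx), 1.5), index=idx)   # flat 6 kW load
kwh.iloc[10:14] = np.nan                             # a one-hour meter gap

gap_intervals = int(kwh.isna().sum())                # QC: count missing reads
monthly = kwh.resample("MS").sum(min_count=1)        # monthly kWh rollup
print(gap_intervals, float(monthly.iloc[0]))
```

    A time-series database serves the same role at scale: the 15-minute stream stays in the parallel store, while benchmarking queries consume aggregates like the monthly rollup above.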

  17. Open Source Virtual Worlds and Low Cost Sensors for Physical Rehab of Patients with Chronic Diseases

    NASA Astrophysics Data System (ADS)

    Romero, Salvador J.; Fernandez-Luque, Luis; Sevillano, José L.; Vognild, Lars

    For patients with chronic diseases, exercise is a key part of rehabilitation and helps them deal better with their illness. Some patients do rehabilitation at home with telemedicine systems; however, keeping to their exercise program is challenging and many abandon the rehabilitation. We postulate that information technologies for socializing and serious games can encourage patients to keep doing physical exercise and rehabilitation. In this paper we present Virtual Valley, a low cost telemedicine system for home exercising, based on open source virtual worlds and utilizing popular low cost motion controllers (e.g. the Wii Remote) and medical sensors. Virtual Valley allows patients to socialize, learn, and play group-based serious games while exercising.

  18. What Can OpenEI Do For You?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-12-10

    Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.

  19. What Can OpenEI Do For You?

    ScienceCinema

    None

    2018-02-06

    Open Energy Information (OpenEI) is an open source web platform—similar to the one used by Wikipedia—developed by the US Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to make the large amounts of energy-related data and information more easily searched, accessed, and used both by people and automated machine processes. Built utilizing the standards and practices of the Linked Open Data community, the OpenEI platform is much more robust and powerful than typical web sites and databases. As an open platform, all users can search, edit, add, and access data in OpenEI for free. The user community contributes the content and ensures its accuracy and relevance; as the community expands, so does the content's comprehensiveness and quality. The data are structured and tagged with descriptors to enable cross-linking among related data sets, advanced search functionality, and consistent, usable formatting. Data input protocols and quality standards help ensure the content is structured and described properly and derived from a credible source. Although DOE/NREL is developing OpenEI and seeding it with initial data, it is designed to become a true community model with millions of users, a large core of active contributors, and numerous sponsors.

  20. A Water-Service Challenge

    ERIC Educational Resources Information Center

    Roman, Harry T.

    2011-01-01

    It is important to let students see the value of mathematics in design--and how mathematics lends perspective to problem solving. In this article, the author describes a water-service challenge which enables students to design a water utility system that uses surface runoff into an open reservoir as the potable water source. This challenge…

  1. Homology Modeling and Molecular Docking for the Science Curriculum

    ERIC Educational Resources Information Center

    McDougal, Owen M.; Cornia, Nic; Sambasivarao, S. V.; Remm, Andrew; Mallory, Chris; Oxford, Julia Thom; Maupin, C. Mark; Andersen, Tim

    2014-01-01

    DockoMatic 2.0 is a powerful open source software program (downloadable from sourceforge.net) that allows users to utilize a readily accessible computational tool to explore biomolecules and their interactions. This manuscript describes a practical tutorial for use in the undergraduate curriculum that introduces students to macromolecular…

  2. Integrating an Awareness of Selfhood and Society into Virtual Learning

    ERIC Educational Resources Information Center

    Stricker, Andrew, Ed.; Calongne, Cynthia, Ed.; Truman, Barbara, Ed.; Arenas, Fil, Ed.

    2017-01-01

    Recent technological advances have opened new platforms for learning and teaching. By utilizing virtual spaces, more educational opportunities are created for students who cannot attend a physical classroom environment. "Integrating an Awareness of Selfhood and Society into Virtual Learning" is a pivotal reference source that discusses…

  3. The use of open data from social media for the creation of 3D georeferenced modeling

    NASA Astrophysics Data System (ADS)

    Themistocleous, Kyriacos

    2016-08-01

    There is a great deal of open source video on the internet that is posted by users on social media sites. With the release of low-cost unmanned aerial vehicles (UAVs), many hobbyists are uploading videos from different locations, especially in remote areas. Using open source data available on the internet, this study utilized structure from motion (SfM) as a range imaging technique to estimate three-dimensional landscape features from two-dimensional image sequences extracted from video, applying image distortion correction and geo-referencing. This type of documentation may be necessary for cultural heritage sites that are inaccessible or difficult to document directly, where video from UAVs can be obtained. The resulting 3D models can be viewed in Google Earth and used to create orthoimages, drawings, and digital terrain models for cultural heritage and archaeological purposes in remote or inaccessible areas.
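    A small, self-contained sketch of the frame-selection step that precedes SfM. The sampling rule is a generic heuristic, not the paper's method: pick evenly spaced frames so consecutive views keep overlap while still providing camera baseline.

```python
def sfm_frame_indices(n_frames, fps, step_seconds):
    """Indices of frames sampled every step_seconds from an n_frames video,
    a common pre-processing step before feeding frames to an SfM pipeline."""
    step = max(1, round(fps * step_seconds))
    return list(range(0, n_frames, step))

# A 30 fps clip of 300 frames sampled every 2 s yields one frame per 60 frames.
print(sfm_frame_indices(300, 30, 2.0))  # [0, 60, 120, 180, 240]
```

    With OpenCV, the selected indices would then be read out of the video file and passed, together with lens-distortion correction, to the SfM reconstruction.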

  4. Global Health Innovation Technology Models.

    PubMed

    Harding, Kimberly

    2016-01-01

    Chronic technology and business process disparities between High Income, Low Middle Income and Low Income (HIC, LMIC, LIC) research collaborators directly prevent the growth of sustainable Global Health innovation for infectious and rare diseases. There is a need for an Open Source-Open Science Architecture Framework to bridge this divide. We are proposing such a framework for consideration by the Global Health community, by utilizing a hybrid approach of integrating agnostic Open Source technology and healthcare interoperability standards and Total Quality Management principles. We will validate this architecture framework through our programme called Project Orchid. Project Orchid is a conceptual Clinical Intelligence Exchange and Virtual Innovation platform utilizing this approach to support clinical innovation efforts for multi-national collaboration that can be locally sustainable for LIC and LMIC research cohorts. The goal is to enable LIC and LMIC research organizations to accelerate their clinical trial process maturity in the field of drug discovery, population health innovation initiatives and public domain knowledge networks. When sponsored, this concept will be tested by 12 confirmed clinical research and public health organizations in six countries. The potential impact of this platform is reduced drug discovery and public health innovation lag time and improved clinical trial interventions, due to reliable clinical intelligence and bio-surveillance across all phases of the clinical innovation process.

  5. Global Health Innovation Technology Models

    PubMed Central

    Harding, Kimberly

    2016-01-01

    Chronic technology and business process disparities between High Income, Low Middle Income and Low Income (HIC, LMIC, LIC) research collaborators directly prevent the growth of sustainable Global Health innovation for infectious and rare diseases. There is a need for an Open Source-Open Science Architecture Framework to bridge this divide. We are proposing such a framework for consideration by the Global Health community, by utilizing a hybrid approach of integrating agnostic Open Source technology and healthcare interoperability standards and Total Quality Management principles. We will validate this architecture framework through our programme called Project Orchid. Project Orchid is a conceptual Clinical Intelligence Exchange and Virtual Innovation platform utilizing this approach to support clinical innovation efforts for multi-national collaboration that can be locally sustainable for LIC and LMIC research cohorts. The goal is to enable LIC and LMIC research organizations to accelerate their clinical trial process maturity in the field of drug discovery, population health innovation initiatives and public domain knowledge networks. When sponsored, this concept will be tested by 12 confirmed clinical research and public health organizations in six countries. The potential impact of this platform is reduced drug discovery and public health innovation lag time and improved clinical trial interventions, due to reliable clinical intelligence and bio-surveillance across all phases of the clinical innovation process.

  6. Global ISR: Toward a Comprehensive Defense Against Unauthorized Code Execution

    DTIC Science & Technology

    2010-10-01

    implementation using two of the most popular open-source servers: the Apache web server, and the MySQL database server. For Apache, we measure the effect that...utility ab. Fig. 3: the MySQL test-insert benchmark measures...various SQL operations. The figure draws total execution time as reported by the benchmark utility. Finally, we benchmarked a MySQL database server using

  7. Land use in the northern Coachella Valley

    NASA Technical Reports Server (NTRS)

    Bale, J. B.; Bowden, L. W.

    1973-01-01

    Satellite imagery has proved to have great utility for monitoring land use change and as a data source for regional planning. In California, open space desert resources are under severe pressure to serve as a source for recreational gratification to individuals living in the heavily populated southern coastal plain. Concern for these sensitive arid environments has been expressed by both federal and state agencies. The northern half of the Coachella Valley has historically served as a focal point for weekend recreational activity and second homes. Since demand in this area has remained high, land use change from rural to urban residential has been occurring continuously since 1968. This area of rapid change is an ideal site to illustrate the utility of satellite imagery as a data source for planning information, and has served as the areal focus of this investigation.

  8. PRESTO-Tango as an open-source resource for interrogation of the druggable human GPCRome.

    PubMed

    Kroeze, Wesley K; Sassano, Maria F; Huang, Xi-Ping; Lansu, Katherine; McCorvy, John D; Giguère, Patrick M; Sciaky, Noah; Roth, Bryan L

    2015-05-01

    G protein-coupled receptors (GPCRs) are essential mediators of cellular signaling and are important targets of drug action. Of the approximately 350 nonolfactory human GPCRs, more than 100 are still considered to be 'orphans' because their endogenous ligands remain unknown. Here, we describe a unique open-source resource that allows interrogation of the druggable human GPCRome via a G protein-independent β-arrestin-recruitment assay. We validate this unique platform at more than 120 nonorphan human GPCR targets, demonstrate its utility for discovering new ligands for orphan human GPCRs and describe a method (parallel receptorome expression and screening via transcriptional output, with transcriptional activation following arrestin translocation (PRESTO-Tango)) for the simultaneous and parallel interrogation of the entire human nonolfactory GPCRome.

  9. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
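    The pandas-centric style that gadfly encourages can be illustrated with plain pandas (the column names below are invented for illustration, not gadfly's actual schema): once particle fields sit in a DataFrame, selections and mass-weighted reductions are one-liners.

```python
import pandas as pd
import numpy as np

# A toy "snapshot" of 1000 particles with positions, masses, and densities.
rng = np.random.default_rng(0)
particles = pd.DataFrame({
    "x": rng.uniform(0, 1, 1000),
    "y": rng.uniform(0, 1, 1000),
    "z": rng.uniform(0, 1, 1000),
    "mass": np.full(1000, 1e-4),
    "density": rng.lognormal(0, 1, 1000),
})

# Mass-weighted centre of mass and a density cut, pandas-style.
com = (particles[["x", "y", "z"]].mul(particles["mass"], axis=0).sum()
       / particles["mass"].sum())
dense = particles[particles["density"] > particles["density"].median()]
print(com.round(2), len(dense))
```

    The point of wrapping simulation output this way is exactly what the abstract claims: indexing, grouping, and batch reductions come from pandas rather than from hand-rolled loops over HDF5 arrays.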

  10. Embracing the Open-Source Movement for the Management of Spatial Data: A Case Study of African Trypanosomiasis in Kenya

    PubMed Central

    Langley, Shaun A.; Messina, Joseph P.

    2011-01-01

    The past decade has seen an explosion in the availability of spatial data not only for researchers, but the public alike. As the quantity of data increases, the ability to effectively navigate and understand the data becomes more challenging. Here we detail a conceptual model for a spatially explicit database management system that addresses the issues raised with the growing data management problem. We demonstrate utility with a case study in disease ecology: to develop a multi-scale predictive model of African Trypanosomiasis in Kenya. International collaborations and varying technical expertise necessitate a modular open-source software solution. Finally, we address three recurring problems with data management: scalability, reliability, and security. PMID:21686072

  11. Embracing the Open-Source Movement for the Management of Spatial Data: A Case Study of African Trypanosomiasis in Kenya.

    PubMed

    Langley, Shaun A; Messina, Joseph P

    2011-01-01

    The past decade has seen an explosion in the availability of spatial data not only for researchers, but the public alike. As the quantity of data increases, the ability to effectively navigate and understand the data becomes more challenging. Here we detail a conceptual model for a spatially explicit database management system that addresses the issues raised with the growing data management problem. We demonstrate utility with a case study in disease ecology: to develop a multi-scale predictive model of African Trypanosomiasis in Kenya. International collaborations and varying technical expertise necessitate a modular open-source software solution. Finally, we address three recurring problems with data management: scalability, reliability, and security.

  12. Utilizing social media for informal ocean conservation and education: The BioOceanography Project

    NASA Astrophysics Data System (ADS)

    Payette, J.

    2016-02-01

    Science communication through the use of social media is a rapidly evolving and growing pursuit in academic and scientific circles. Online tools and social media are being used not only in scientific communication but also in scientific publication, education, and outreach. Standards and usage of social media, as well as other online tools for communication, networking, outreach, and publication, are still in development. Caution and a conservative attitude towards these novel "Science 2.0" tools are understandable because of their rapidly changing nature and the lack of professional standards for using them. However, there are some key benefits and unique ways social media, online systems, and other Open or Open Source technologies, software, and "Science 2.0" tools can be utilized for academic purposes such as education and outreach. Diverse efforts for ocean conservation and education will continue to utilize social media for a variety of purposes. The BioOceanography project is an informal communication, education, outreach, and conservation initiative created to enhance knowledge related to Oceanography and Marine Science with an unbiased yet conservation-minded approach and in an Open Source format. The BioOceanography project is ongoing and still evolving, but has already contributed to ocean education and conservation communication in key ways through a concerted web presence since 2013, including the curated Twitter account @_Oceanography and the BioOceanography blog-style website. Social media tools like those used in this project, if used properly, can be highly effective and valuable for encouraging students, networking with researchers, and educating the general public in Oceanography.

  13. ERMes: Open Source Simplicity for Your E-Resource Management

    ERIC Educational Resources Information Center

    Doering, William; Chilton, Galadriel

    2009-01-01

    ERMes, the latest version of electronic resource management system (ERM), is a relational database; content in different tables connects to, and works with, content in other tables. ERMes requires Access 2007 (Windows) or Access 2008 (Mac) to operate as the database utilizes functionality not available in previous versions of Microsoft Access. The…

  14. Integrating an Educational Game in Moodle LMS

    ERIC Educational Resources Information Center

    Minovic, Miroslav; Milovanovic, Milos; Minovic, Jelena; Starcevic, Dusan

    2012-01-01

    The authors present a learning platform based on a computer game. Learning games combine two industries: education and entertainment, which is often called "Edutainment." The game is realized as a strategic game (similar to Risk[TM]), implemented as a module for Moodle CMS, utilizing Java Applet technology. Moodle is an open-source course…

  15. FormScanner: Open-Source Solution for Grading Multiple-Choice Exams

    ERIC Educational Resources Information Center

    Young, Chadwick; Lo, Glenn; Young, Kaisa; Borsetta, Alberto

    2016-01-01

    The multiple-choice exam remains a staple for many introductory physics courses. In the past, people have graded these by hand or even with flaming needles. Today, one usually grades the exams with a form scanner that utilizes optical mark recognition (OMR). Several companies provide these scanners and particular forms, such as the eponymous…

  16. Feeding Experimentation Device (FED): A flexible open-source device for measuring feeding behavior.

    PubMed

    Nguyen, Katrina P; O'Neal, Timothy J; Bolonduro, Olurotimi A; White, Elecia; Kravitz, Alexxai V

    2016-07-15

    Measuring food intake in rodents is a conceptually simple yet labor-intensive and temporally-imprecise task. Most commonly, food is weighed manually, with an interval of hours or days between measurements. Commercial feeding monitors are excellent, but are costly and require specialized caging and equipment. We have developed the Feeding Experimentation Device (FED): a low-cost, open-source, home cage-compatible feeding system. FED utilizes an Arduino microcontroller and open-source software and hardware. FED dispenses a single food pellet into a food well where it is monitored by an infrared beam. When the mouse removes the pellet, FED logs the timestamp to a secure digital (SD) card and dispenses a new pellet into the well. Post-hoc analyses of pellet retrieval timestamps reveal high-resolution details about feeding behavior. FED is capable of accurately measuring food intake, identifying discrete trends during light and dark-cycle feeding. Additionally, we show the utility of FED for measuring increases in feeding resulting from optogenetic stimulation of agouti-related peptide neurons in the arcuate nucleus of the hypothalamus. With a cost of ∼$350 per device, FED is >10× cheaper than commercially available feeding systems. FED is also self-contained, battery powered, and designed to be placed in standard colony rack cages, allowing for monitoring of true home cage feeding behavior. Moreover, FED is highly adaptable and can be synchronized with emerging techniques in neuroscience, such as optogenetics, as we demonstrate here. FED allows for accurate, precise monitoring of feeding behavior in a home cage setting. Published by Elsevier B.V.
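    The control loop the abstract describes (dispense a pellet, watch the IR beam, log a timestamp, re-dispense) is easy to model. The Python sketch below is a simplified simulation of that logic, not the device's actual Arduino firmware; all names are invented.

```python
class FedSim:
    """Simplified model of FED's firmware loop: dispense a pellet, wait for
    the IR beam to detect removal, log the timestamp, dispense again."""

    def __init__(self):
        self.timestamps = []   # retrieval times logged to the "SD card"
        self.well_full = False

    def dispense(self):
        self.well_full = True  # motor turns until a pellet reaches the well

    def beam_event(self, t):
        """Called when the IR beam reports the well empty (pellet removed)."""
        if self.well_full:
            self.timestamps.append(t)
            self.well_full = False
            self.dispense()    # immediately re-bait the well

fed = FedSim()
fed.dispense()
for t in [12.0, 30.5, 31.1]:   # simulated retrieval times (seconds)
    fed.beam_event(t)
print(fed.timestamps)           # [12.0, 30.5, 31.1]
```

    Post-hoc analysis then works purely on the timestamp list: inter-pellet intervals reveal meal structure and light/dark-cycle feeding trends without any manual weighing.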

  17. Deterministic Design Optimization of Structures in OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework were compared and reported.
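    For readers unfamiliar with the style of problem such frameworks route to gradient-based drivers, here is a deliberately tiny stand-in written with plain SciPy, not OpenMDAO's API: minimize a member's weight over its cross-sectional area subject to a stress limit. All numbers are illustrative.

```python
from scipy.optimize import minimize

# Toy structural sizing problem: shrink cross-section A to minimize weight,
# while keeping axial stress F/A below the allowable limit.
F = 1.0e5            # axial load, N
L = 2.0              # member length, m
rho = 2700.0         # material density, kg/m^3
sigma_allow = 250e6  # allowable stress, Pa

weight = lambda A: rho * L * A[0]                  # objective: member mass
stress_margin = lambda A: sigma_allow - F / A[0]   # must stay >= 0

res = minimize(weight, x0=[1e-3],
               constraints=[{"type": "ineq", "fun": stress_margin}],
               bounds=[(1e-6, None)])
A_opt = res.x[0]
print(A_opt, F / A_opt)   # optimum sits at the stress bound: A = F/sigma_allow
```

    A framework like OpenMDAO adds value over this bare call by wiring many such components and drivers together and managing the data flow between disciplines.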

  18. Methods for utilizing maximum power from a solar array

    NASA Technical Reports Server (NTRS)

    Decker, D. K.

    1972-01-01

    A preliminary study of maximum power utilization methods was performed for an outer planet spacecraft using an ion thruster propulsion system and a solar array as the primary energy source. The problems which arise from operating the array at or near the maximum power point of its I-V characteristic are discussed. Two closed loop system configurations which use extremum regulators to track the array's maximum power point are presented. Three open loop systems are presented that either: (1) measure the maximum power of each array section and compute the total array power, (2) utilize a reference array to predict the characteristics of the solar array, or (3) utilize impedance measurements to predict the maximum power utilization. The advantages and disadvantages of each system are discussed and recommendations for further development are made.
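    The closed-loop "extremum regulator" idea can be sketched as perturb-and-observe hill climbing on a model power-voltage curve. The array model below is a toy for illustration, not the report's:

```python
def array_power(v, v_oc=100.0, i_sc=10.0):
    """Toy solar-array I-V model: current falls off sharply near the
    open-circuit voltage v_oc, giving a single power maximum."""
    if v <= 0 or v >= v_oc:
        return 0.0
    i = i_sc * (1.0 - (v / v_oc) ** 8)
    return v * i

def track_mpp(v=10.0, dv=1.0, steps=200):
    """Perturb-and-observe: step the operating voltage; if power drops,
    reverse the perturbation. Settles into oscillation around the MPP."""
    p = array_power(v)
    for _ in range(steps):
        v_new = v + dv
        p_new = array_power(v_new)
        if p_new < p:
            dv = -dv
        v, p = v_new, p_new
    return v, p

v_mpp, p_mpp = track_mpp()
print(round(v_mpp, 1), round(p_mpp, 1))
```

    The residual oscillation around the maximum is the classic trade-off of this scheme: a smaller perturbation step reduces the steady-state ripple but slows tracking when illumination changes.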

  19. Current interruption in inductive storage systems with inertial current source

    NASA Astrophysics Data System (ADS)

    Vitkovitsky, I. M.; Conte, D.; Ford, R. D.; Lupton, W. H.

    1980-03-01

    Utilization of inertial-current-source inductive storage with high power output requires a switch with a short opening time. This switch must operate as a circuit breaker, i.e., be capable of carrying the current for a time period characteristic of inertial systems, such as homopolar generators. For reasonable efficiency, its opening time must be fast to minimize the energy dissipated in downstream fuse stages required for any additional pulse compression. A switch that satisfies these criteria, as well as other requirements such as the high-voltage operation associated with high power output, is an explosively driven switch consisting of a large number of gaps arranged in series. The performance of this switch in limiting and/or interrupting currents produced by large generators has been studied. Single switch modules were designed and tested for limiting the commutating current output of a 1 MW, 60 Hz generator and 500 kJ capacitor banks. Current limiting and commutation were evaluated, using these sources, for currents ranging up to 0.4 MA. The explosive opening of the switch was found to provide an effective first stage for further pulse compression. It opens in tens of microseconds, commutates current at high efficiency (approximately 90%), and recovers very rapidly over a wide range of operating conditions.

  20. OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.

    2013-12-01

    This presentation introduces OnEarth, a server-side software package originally developed at the Jet Propulsion Laboratory (JPL) that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC) compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points in transitioning OnEarth to a suitable open source paradigm, including repository and licensing agreement decision points, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
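The indexed tile pyramid at the heart of MRF rests on a simple mapping from geographic coordinates to tile indices at each pyramid level; a sketch for a WGS84 grid (the grid layout below is the common geodetic tiling scheme, not MRF's actual file layout):

```python
def tile_index(lon, lat, level):
    """Map a lon/lat to (col, row) in a WGS84 tile grid with
    2**(level+1) columns and 2**level rows (row 0 at the north edge).
    This mirrors the indexed-pyramid idea behind MRF, not its format."""
    cols = 2 ** (level + 1)
    rows = 2 ** level
    col = int((lon + 180.0) / 360.0 * cols)
    row = int((90.0 - lat) / 180.0 * rows)
    # clamp points on the east/south boundary into the last tile
    return min(col, cols - 1), min(row, rows - 1)
```

A WMTS request names a layer, level, row, and column; with an index like this the server can seek directly to the stored tile, which is what keeps latency independent of overall image size.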

  1. Computational studies for a multiple-frequency electron cyclotron resonance ion source (abstract)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alton, G.D.

    1996-03-01

    The number density of electrons, the energy (electron temperature), and the energy distribution are three of the fundamental properties which govern the performance of electron cyclotron resonance (ECR) ion sources in terms of their capability to produce high-charge-state ions. The maximum electron energy is affected by several processes, including the ability of the plasma to absorb power. In principle, the performance of an ECR ion source can be enhanced by increasing the physical size of the ECR zone in relation to the total plasma volume. The ECR zones can be increased either in the spatial or the frequency domain in any ECR ion source based on B-minimum plasma confinement principles. The former technique requires the design of a carefully tailored magnetic field geometry so that the central region of the plasma volume is a large, uniformly distributed plasma volume which surrounds the axis of symmetry, as proposed in Ref. . Present art forms of the ECR source utilize single-frequency microwave power supplies to maintain the plasma discharge; because the magnetic field distribution continually changes in this source design, the ECR zones are relegated to thin "surfaces" which surround the axis of symmetry. As a consequence of the small ECR zone in relation to the total plasma volume, the probability for stochastic heating of the electrons is quite low, thereby compromising the source performance. This handicap can be overcome by use of broadband, multiple-frequency microwave power, as evidenced by the enhanced performance of the CAPRICE and AECR ion sources when two-frequency microwave power was utilized. We have used particle-in-cell codes to simulate the magnetic field distributions in these sources and to demonstrate the advantages of using multiple, discrete frequencies over single frequencies to power conventional ECR ion sources. (Abstract Truncated)

  2. Efficiency analysis of semi-open sorption heat pump systems

    DOE PAGES

    Gluesenkamp, Kyle R.; Chugh, Devesh; Abdelaziz, Omar; ...

    2016-08-10

    Sorption systems traditionally fall into two categories: closed (heat pumps and chillers) and open (dehumidification). Recent work has explored the possibility of semi-open systems, which can perform heat pumping or chilling while utilizing ambient humidity as the working fluid of the cycle, and are still capable of being driven by solar, waste, or combustion heat sources. The efficiencies of closed and open systems are well characterized and can typically be determined from four temperatures. In this work, the performance potential of semi-open systems is explored by adapting expressions for the efficiency of closed and open systems to the novel semi-open systems. A key new parameter is introduced, which involves five temperatures, since both the ambient dry bulb and ambient dew point are used. This additional temperature is necessary to capture the open absorber performance in terms of both the absorption of humidity and sensible heat transfer with the surrounding air.
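For the closed-cycle baseline that the adapted expressions start from, the ideal efficiency really is fixed by temperatures alone; below is a sketch of the textbook reversible bound for a heat-driven chiller (this is the standard Carnot result with illustrative temperatures, not the paper's five-temperature parameter):

```python
def ideal_driven_cooling_cop(t_gen, t_sink, t_evap):
    """Reversible COP bound for a heat-driven (absorption-type) chiller:
    a Carnot engine between t_gen and t_sink driving a Carnot
    refrigerator between t_evap and t_sink. Temperatures in kelvin."""
    carnot_engine = 1.0 - t_sink / t_gen
    carnot_fridge = t_evap / (t_sink - t_evap)
    return carnot_engine * carnot_fridge

# e.g. a 90 C heat source, 35 C ambient sink, 5 C chilled stream
cop = ideal_driven_cooling_cop(363.0, 308.0, 278.0)
```

The semi-open analysis adds the ambient dew point as a fifth temperature precisely because humidity absorption in the open absorber has no analogue in this four-temperature picture.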

  3. Efficiency analysis of semi-open sorption heat pump systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gluesenkamp, Kyle R.; Chugh, Devesh; Abdelaziz, Omar

    Sorption systems traditionally fall into two categories: closed (heat pumps and chillers) and open (dehumidification). Recent work has explored the possibility of semi-open systems, which can perform heat pumping or chilling while utilizing ambient humidity as the working fluid of the cycle, and are still capable of being driven by solar, waste, or combustion heat sources. The efficiencies of closed and open systems are well characterized and can typically be determined from four temperatures. In this work, the performance potential of semi-open systems is explored by adapting expressions for the efficiency of closed and open systems to the novel semi-open systems. A key new parameter is introduced, which involves five temperatures, since both the ambient dry bulb and ambient dew point are used. This additional temperature is necessary to capture the open absorber performance in terms of both the absorption of humidity and sensible heat transfer with the surrounding air.

  4. New developments in cogeneration: opening remarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuster, C.N.

    1982-06-01

    Cogeneration is defined as "total energy," that is, the multiple use of a single source of energy. The dual utilization of radiation in an ancient bath in Pompeii is perhaps the earliest such use. Because of PURPA in 1978, the development of small power production facilities and cogeneration is encouraged. A map shows the projected cogeneration facilities across the country in 1995.

  5. An Open Source Agenda for Research Linking Text and Image Content Features.

    ERIC Educational Resources Information Center

    Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi

    2001-01-01

    Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…

  6. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  7. Student’s Entrepreneur Model Development in Creative Industry through Utilization of Web Development Software and Educational Game

    NASA Astrophysics Data System (ADS)

    Hasan, B.; Hasbullah, H.; Elvyanti, S.; Purnama, W.

    2018-02-01

    The creative industry is the utilization of individuals' creativity, skill, and talent to create wealth and jobs by generating and exploiting creative power. In the field of design, the utilization of information technology can spur the creative industry; the development of creative industrial design will accommodate much creative energy, letting individuals pour out their ideas and creativity without limitation. Open source software is a trend in information technology that has developed since the 1990s; examples of applications developed with the open source approach are the Apache web server, the Linux and Android operating systems, and the MySQL database. This entrepreneurship-based community service activity aims to: 1) profile UPI students' knowledge of entrepreneurship in creative-industry software businesses using web development software and educational games; 2) create a model for fostering entrepreneurship in creative-industry software by leveraging web development and educational games; and 3) conduct training and guidance for UPI students who want to develop a business in the software branch of the creative industries. The entrepreneurship-based PKM activity was attended by about 35 DPTE FPTK UPI students with high entrepreneurial interest and competence in information technology. The outcome generated from the entrepreneurship PKM is the emergence of student entrepreneurs interested in the software creative industry who are able to open up business opportunities for themselves and others. Another outcome of this entrepreneurship PKM activity is the publication of articles in national or international indexed journals.

  8. Open source machine-learning algorithms for the prediction of optimal cancer drug therapies.

    PubMed

    Huang, Cai; Mezencev, Roman; McDonald, John F; Vannberg, Fredrik

    2017-01-01

    Precision medicine is a rapidly growing area of modern medical science, and open source machine-learning codes promise to be a critical component for the successful development of standardized and automated analysis of patient data. One important goal of precision cancer medicine is the accurate prediction of optimal drug therapies from the genomic profiles of individual patient tumors. We introduce here an open source software platform that employs a highly versatile support vector machine (SVM) algorithm combined with a standard recursive feature elimination (RFE) approach to predict personalized drug responses from gene expression profiles. Drug-specific models were built using gene expression and drug response data from the National Cancer Institute panel of 60 human cancer cell lines (NCI-60). The models are highly accurate in predicting the drug responsiveness of a variety of cancer cell lines, including those comprising the recent NCI-DREAM Challenge. We demonstrate that predictive accuracy is optimized when the learning dataset utilizes all probe-set expression values from a diversity of cancer cell types without pre-filtering for genes generally considered to be "drivers" of cancer onset/progression. Application of our models to publicly available ovarian cancer (OC) patient gene expression datasets generated predictions consistent with observed responses previously reported in the literature. By making our algorithm "open source", we hope to facilitate its testing in a variety of cancer types and contexts, leading to community-driven improvements and refinements in subsequent applications.
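The SVM-with-RFE pipeline described is available off the shelf in scikit-learn; a minimal sketch on synthetic data (the feature counts and the data itself are invented for illustration, and the real platform works on NCI-60 expression profiles):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.feature_selection import RFE

# Synthetic "expression" matrix: 60 samples x 20 probe sets, where only
# features 0 and 3 actually drive the simulated drug response.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))
y = 2.0 * X[:, 0] - X[:, 3] + rng.normal(scale=0.1, size=60)

# Recursive feature elimination wrapped around a linear-kernel SVM:
# repeatedly fit, drop the weakest feature, refit.
selector = RFE(SVR(kernel="linear"), n_features_to_select=5, step=1)
selector.fit(X, y)
selected = np.where(selector.support_)[0]
```

RFE requires an estimator exposing feature weights (here the linear SVM's `coef_`), which is why the kernel choice matters for this wrapper.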

  9. Open Source Seismic Software in NOAA's Next Generation Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Hellman, S. B.; Baker, B. I.; Hagerty, M. T.; Leifer, J. M.; Lisowski, S.; Thies, D. A.; Donnelly, B. K.; Griffith, F. P.

    2014-12-01

    The Tsunami Information Technology Modernization (TIM) is a project spearheaded by the National Oceanic and Atmospheric Administration (NOAA) to update the United States' Tsunami Warning System software currently employed at the Pacific Tsunami Warning Center (Eva Beach, Hawaii) and the National Tsunami Warning Center (Palmer, Alaska). This entirely open source software project will integrate various seismic processing utilities with the National Weather Service Weather Forecast Office's core software, AWIPS2. For the real-time and near real-time seismic processing aspect of this project, NOAA has elected to integrate the open source portions of GFZ's SeisComP 3 (SC3) processing system into AWIPS2. To provide for better tsunami threat assessments, we are developing open source tools for magnitude estimation (e.g., moment magnitude, energy magnitude, surface wave magnitude), detection of slow earthquakes with the Theta discriminant, moment tensor inversions (e.g., W-phase and teleseismic body waves), finite fault inversions, and array processing. With our reliance on common data formats such as QuakeML and seismic-community-standard messaging systems, all new facilities introduced into AWIPS2 and SC3 will be available as stand-alone tools or could be easily integrated into other real-time seismic monitoring systems such as Earthworm, Antelope, etc. Additionally, we have developed a template-based design paradigm so that a developer or scientist can efficiently create upgrades, replacements, and/or new metrics for the seismic data processing with only a cursory knowledge of the underlying SC3.

  10. OpenSeesPy: Python library for the OpenSees finite element framework

    NASA Astrophysics Data System (ADS)

    Zhu, Minjie; McKenna, Frank; Scott, Michael H.

    2018-01-01

    OpenSees, an open source finite element software framework, has been used broadly in the earthquake engineering community for simulating the seismic response of structural and geotechnical systems. The framework allows users to perform finite element analysis with a scripting language and for developers to create both serial and parallel finite element computer applications as interpreters. For the last 15 years, Tcl has been the primary scripting language to which the model building and analysis modules of OpenSees are linked. To provide users with different scripting language options, particularly Python, the OpenSees interpreter interface was refactored to provide multi-interpreter capabilities. This refactoring, resulting in the creation of OpenSeesPy as a Python module, is accomplished through an abstract interface for interpreter calls with concrete implementations for different scripting languages. Through this approach, users are able to develop applications that utilize the unique features of several scripting languages while taking advantage of advanced finite element analysis models and algorithms.
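The multi-interpreter refactoring described here, an abstract interface for interpreter calls with a concrete implementation per scripting language, is a standard pattern; the sketch below uses invented class and method names, not OpenSees internals:

```python
from abc import ABC, abstractmethod

class DomainInterpreter(ABC):
    """Abstract interface the analysis core talks to, independent of
    which scripting language sits in front of it."""
    @abstractmethod
    def run_command(self, command: str) -> str: ...

class TclStyleInterpreter(DomainInterpreter):
    def run_command(self, command: str) -> str:
        return f"tcl: {command}"   # stand-in for dispatch into a Tcl shell

class PythonStyleInterpreter(DomainInterpreter):
    def run_command(self, command: str) -> str:
        return f"py: {command}"    # stand-in for dispatch into Python

def build_model(interp: DomainInterpreter) -> str:
    # The core issues identical calls regardless of the front-end language.
    return interp.run_command("node 1 0.0 0.0")
```

Because `build_model` depends only on the abstract interface, a new scripting front end (as OpenSeesPy was for Python) requires a new concrete class but no change to the analysis core.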

  11. Open-source three-dimensional printing of biodegradable polymer scaffolds for tissue engineering.

    PubMed

    Trachtenberg, Jordan E; Mountziaris, Paschalia M; Miller, Jordan S; Wettergreen, Matthew; Kasper, Fred K; Mikos, Antonios G

    2014-12-01

    The fabrication of scaffolds for tissue engineering requires elements of customization depending on the application and is often limited due to the flexibility of the processing technique. This investigation seeks to address this obstacle by utilizing an open-source three-dimensional printing (3DP) system that allows vast customizability and facilitates reproduction of experiments. The effects of processing parameters on printed poly(ε-caprolactone) scaffolds with uniform and gradient pore architectures have been characterized with respect to fiber and pore morphology and mechanical properties. The results demonstrate the ability to tailor the fiber diameter, pore size, and porosity through modification of pressure, printing speed, and programmed fiber spacing. A model was also used to predict the compressive mechanical properties of uniform and gradient scaffolds, and it was found that modulus and yield strength declined with increasing porosity. The use of open-source 3DP technologies for printing tissue-engineering scaffolds provides a flexible system that can be readily modified at a low cost and is supported by community documentation. In this manner, the 3DP system is more accessible to the scientific community, which further facilitates the translation of these technologies toward successful tissue-engineering strategies.

  12. Utilization of Open Source Technology to Create Cost-Effective Microscope Camera Systems for Teaching.

    PubMed

    Konduru, Anil Reddy; Yelikar, Balasaheb R; Sathyashree, K V; Kumar, Ankur

    2018-01-01

    Open source technologies and mobile innovations have radically changed the way people interact with technology. These innovations and advancements have been used across various disciplines and have already had a significant impact. Microscopy, with its focus on visually appealing contrasting colors for better appreciation of morphology, forms the core of disciplines such as pathology, microbiology, and anatomy. Here, learning happens with the aid of multi-head microscopes and digital camera systems for teaching larger groups and for organizing interactive sessions for students or faculty of other departments. The cost of original equipment manufacturer (OEM) camera systems is a limiting factor in bringing this useful technology to all locations. To avoid this, we have used low-cost technologies like the Raspberry Pi, Mobile High-Definition Link (MHL), and 3D-printed adapters to create portable camera systems. Adopting these open source technologies enabled us to connect any binocular or trinocular microscope to a projector or HD television at a fraction of the cost of OEM camera systems, with comparable quality. These systems, in addition to being cost-effective, provide the added advantage of portability, offering much-needed flexibility across teaching locations.

  13. Sirepo for Synchrotron Radiation Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagler, Robert; Moeller, Paul; Rakitin, Maksim

    Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5-compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is the Synchrotron Radiation Workshop (SRW). SRW computes synchrotron radiation from relativistic electrons in arbitrary magnetic fields and propagates the radiation wavefronts through optical beamlines. SRW is open source and is primarily supported by Dr. Oleg Chubar of NSLS-II at Brookhaven National Laboratory.
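Werkzeug's WSGI compliance, mentioned above, means the server stack ultimately talks to a plain Python callable with a fixed two-argument signature; a stdlib-only sketch of such an application (the JSON body is illustrative, not a Sirepo endpoint):

```python
def app(environ, start_response):
    """Minimal WSGI application: receives the request environment,
    announces status and headers via start_response, returns the body
    as an iterable of bytes."""
    body = b'{"status": "ok"}'
    start_response("200 OK", [("Content-Type", "application/json"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

Any WSGI server (Werkzeug's development server, or Nginx fronting a WSGI container) can host this callable unchanged, which is the interoperability the standard buys.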

  14. Governance Structures for Open Innovation: A Preliminary Framework

    NASA Astrophysics Data System (ADS)

    Feller, Joseph; Finnegan, Patrick; Hayes, Jeremy; O'Reilly, Philip

    This research-in-progress paper presents a preliminary framework of four open innovation governance structures. The study seeks to describe four distinct ways in which firms utilize hierarchical relationships, organizational intermediaries, and the market system to supply and acquire intellectual property and/or innovation capabilities from sources external to the firm. This paper reports on phase one of the study, which involved an analysis of six open innovation exemplars based on public data. This phase of the study reveals that governance structures for open innovation can be categorized based on whether they (1) are mediated or direct or (2) seek to acquire intellectual property or innovation capability. We analyze the differences in four governance structures along seven dimensions, and reveal the importance of knowledge dispersion and uncertainty to the use of open innovation hierarchies, brokerages, and markets. The paper concludes by examining the implications of the findings and outlining the next phase of the study.

  15. Genetic Basis of Variations in Nitrogen Source Utilization in Four Wine Commercial Yeast Strains

    PubMed Central

    Gutiérrez, Alicia; Beltran, Gemma; Warringer, Jonas; Guillamón, Jose M.

    2013-01-01

    The capacity of wine yeast to utilize the nitrogen available in grape must directly correlates with the fermentation and growth rates of all wine yeast fermentation stages and is thus of critical importance for wine production. Here we precisely quantified the ability of low-complexity nitrogen compounds to support fast, efficient, and rapidly initiated growth of four commercially important wine strains. Nitrogen substrate abundance in grape must failed to correlate with the rate or the efficiency of nitrogen source utilization, but predicted lag phase length well. Thus, human domestication of yeast for grape must growth has had, at most, a marginal impact on wine yeast growth rates and efficiencies, but may have left a surprising imprint on the time required to adjust metabolism from non-growth to growth. Wine yeast nitrogen source utilization deviated from that of lab strains, and also varied between wine strains. Each wine yeast lineage harbored nitrogen source utilization defects that were private to that strain. By a massive hemizygote analysis, we traced the genetic basis of the most glaring of these defects, the near inability of the PDM wine strain to utilize methionine, to mutations in its ARO8, ADE5,7 and VBA3 alleles. We also identified candidate causative mutations in these genes. The methionine defect of PDM is potentially very interesting, as the strain can, in some circumstances, overproduce foul-tasting H2S, a trait which likely stems from insufficient methionine catabolization. The poor adaptation of wine yeast to the grape must nitrogen environment, and the presence of defects in each lineage, open up wine strain optimization through biotechnological endeavors. PMID:23826223

  16. Comparison of cooperative and non-cooperative adaptive optics reference performance for propagation with thermal blooming effects

    NASA Astrophysics Data System (ADS)

    Edwards, Brian E.; Nitkowski, Arthur; Lawrence, Ryan; Horton, Kasey; Higgs, Charles

    2004-10-01

    Atmospheric turbulence and laser-induced thermal blooming effects can degrade the beam quality of a high-energy laser (HEL) weapon and ultimately limit the amount of energy deliverable to a target. Lincoln Laboratory has built a thermal blooming laboratory capable of emulating atmospheric thermal blooming and turbulence effects for tactical HEL systems. The HEL weapon emulation hardware includes an adaptive optics beam delivery system, which utilizes a Shack-Hartmann wavefront sensor and a 349-actuator deformable mirror. For this experiment, the laboratory was configured to emulate an engagement scenario consisting of a sea-skimming target approaching directly toward the HEL weapon at a range of 10 km. The weapon utilizes a 1.5 m aperture and radiates at a 1.62 micron wavelength. An adaptive optics reference beam was provided either as a point source located at the target (cooperative) or as a projected point source reflected from the target (non-cooperative). Performance of the adaptive optics system was then compared between reference sources. Results show that, for operating conditions with a thermal blooming distortion number of 75 and weak turbulence (Rytov of 0.02 and D/r0 of 3), cooperative beacon AO correction experiences phase compensation instability, resulting in lower performance than a simple open-loop condition. The non-cooperative beacon resulted in slightly better performance than the open-loop condition.

  17. Modern Teaching Methods in Physics with the Aid of Original Computer Codes and Graphical Representations

    ERIC Educational Resources Information Center

    Ivanov, Anisoara; Neacsu, Andrei

    2011-01-01

    This study describes the possibility and advantages of utilizing simple computer codes to complement the teaching techniques for high school physics. The authors have begun working on a collection of open source programs which allow students to compare the results and graphics from classroom exercises with the correct solutions and further more to…

  18. Open source tools for standardized privacy protection of medical images

    NASA Astrophysics Data System (ADS)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHIs should be completely removed from the images according to the respective privacy regulations, but some basic and alleviated data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
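The replace-rather-than-strip policy described above can be sketched on a plain dictionary standing in for a DICOM dataset; the tag lists and pseudonym scheme below are illustrative assumptions, not the DCMTK API or the DICOM de-identification profile itself:

```python
import hashlib

PHI_TAGS = {"PatientName", "PatientID", "PatientBirthDate"}
KEEP_TAGS = {"Modality", "PixelSpacing", "PatientSex"}  # kept for diagnosis

def deidentify(dataset, secret="site-key"):
    """Replace PHI attributes with deterministic pseudonyms, keep a small
    allow-list of diagnostically relevant attributes, drop the rest."""
    out = {}
    for tag, value in dataset.items():
        if tag in PHI_TAGS:
            # same input + same secret -> same pseudonym, so one patient
            # stays consistently indexed; re-identification needs the key
            digest = hashlib.sha256((secret + str(value)).encode()).hexdigest()
            out[tag] = "ANON-" + digest[:8]
        elif tag in KEEP_TAGS:
            out[tag] = value
        # any other tag is silently dropped
    return out
```

Keeping the PHI tags present but pseudonymized, rather than deleting them, is what preserves patient indexing and accurate image interpretation in the shared environment.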

  19. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk misplacement or disappearance due to mischief or severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program is also compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.

  20. An open source business model for malaria.

    PubMed

    Årdal, Christine; Røttingen, John-Arne

    2015-01-01

    Greater investment is required in developing new drugs and vaccines against malaria in order to eradicate the disease. These precious funds must be carefully managed to achieve the greatest impact. We evaluate existing efforts to discover and develop new drugs and vaccines for malaria to determine how malaria R&D can best benefit from an enhanced open source approach and how such a business model might operate. We assessed research articles, patents, and clinical trials, and conducted a small survey among malaria researchers. Our results demonstrate that the public and philanthropic sectors are financing and performing the majority of malaria drug/vaccine discovery and development, but are then restricting access through patents, 'closed' publications, and hidden-away physical specimens. This makes little sense, since it is also the public and philanthropic sectors that purchase the drugs and vaccines. We recommend that a more "open source" approach be taken by making the entire value chain more efficient through greater transparency, which may lead to more extensive collaborations. This can, for example, be achieved by empowering an existing organization like the Medicines for Malaria Venture (MMV) to act as a clearing house for malaria-related data. The malaria researchers we surveyed indicated that they would utilize such registry data to increase collaboration. Finally, we question the utility of publicly or philanthropically funded patents for malaria medicines, where little to no profit is available. Malaria R&D benefits from a publicly and philanthropically funded architecture, which starts with academic research institutions and product development partnerships, proceeds to commercialization assistance through UNITAID, and ends with procurement through mechanisms like The Global Fund to Fight AIDS, Tuberculosis and Malaria and the U.S. President's Malaria Initiative. We believe that a fresh look should be taken at the cost/benefit of patents, particularly those related to new malaria medicines, and that alternative incentives, like WHO prequalification, should be considered.

  1. Scalable cloud without dedicated storage

    NASA Astrophysics Data System (ADS)

    Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.

    2015-05-01

    We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without separate dedicated storage; the dedicated storage is replaced by distributed software storage, and all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources and improves fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with relatively low initial and maintenance costs. The solution is built on the basis of open source components such as OpenStack and Ceph.

  2. Nmrglue: an open source Python package for the analysis of multidimensional NMR data.

    PubMed

    Helmus, Jonathan J; Jaroniec, Christopher P

    2013-04-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license.
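    Covariance NMR, one of the non-Fourier approaches mentioned above, amounts to taking the matrix square root of FᵀF for a data matrix F. A minimal numpy sketch of that step (illustrative only; nmrglue's actual API and data handling differ, and the random matrix here stands in for real spectra):

```python
import numpy as np

def covariance_spectrum(data):
    """Covariance NMR spectrum C = (F^T F)^(1/2) of a 2D data matrix F."""
    c2 = data.T @ data                 # symmetric, positive semi-definite
    w, v = np.linalg.eigh(c2)          # eigendecomposition of F^T F
    w = np.clip(w, 0.0, None)          # clamp tiny negative eigenvalues
    return (v * np.sqrt(w)) @ v.T      # symmetric matrix square root

# stand-in for a processed data set (t1 increments x direct-dimension points)
rng = np.random.default_rng(0)
f = rng.standard_normal((64, 128))
c = covariance_spectrum(f)
```

With real data, F would hold a 2D data set after Fourier transform along the direct dimension; the eigendecomposition route is the standard way to form a symmetric matrix square root.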

  3. Nmrglue: An Open Source Python Package for the Analysis of Multidimensional NMR Data

    PubMed Central

    Helmus, Jonathan J.; Jaroniec, Christopher P.

    2013-01-01

    Nmrglue, an open source Python package for working with multidimensional NMR data, is described. When used in combination with other Python scientific libraries, nmrglue provides a highly flexible and robust environment for spectral processing, analysis and visualization and includes a number of common utilities such as linear prediction, peak picking and lineshape fitting. The package also enables existing NMR software programs to be readily tied together, currently facilitating the reading, writing and conversion of data stored in Bruker, Agilent/Varian, NMRPipe, Sparky, SIMPSON, and Rowland NMR Toolkit file formats. In addition to standard applications, the versatility offered by nmrglue makes the package particularly suitable for tasks that include manipulating raw spectrometer data files, automated quantitative analysis of multidimensional NMR spectra with irregular lineshapes such as those frequently encountered in the context of biomacromolecular solid-state NMR, and rapid implementation and development of unconventional data processing methods such as covariance NMR and other non-Fourier approaches. Detailed documentation, install files and source code for nmrglue are freely available at http://nmrglue.com. The source code can be redistributed and modified under the New BSD license. PMID:23456039

  4. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application composed of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control checkpoints with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols.
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.

  5. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application composed of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control checkpoints with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols.
As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  6. A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows

    NASA Astrophysics Data System (ADS)

    Babin, B. L.; Hu, L.

    2008-12-01

    Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies presents many challenges to these observing systems as open source tools for interoperability grow, since the current open source tools often require the installation of additional software. In order to make data available in common standard formats, "home grown" software has been developed; one example is software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).

  7. Low-Cost energy contraption design using playground seesaw

    NASA Astrophysics Data System (ADS)

    Banlawe, I. A. P.; Acosta, N. J. E. L.

    2017-05-01

    The study was conducted at Western Philippines University, San Juan, Aborlan, Palawan. It used the mechanical motion of a playground seesaw as a means to produce electrical energy. The study aimed to design a low-cost prototype energy contraption using a playground seesaw built from locally available and recycled materials, to measure the voltage, current and power outputs produced in different situations, and to estimate the cost of the prototype. Using the principles of pneumatics, two hand air pumps were mounted at the two ends of the seesaw; the up-and-down motion of the seesaw produces compressed air that is used to rotate a DC motor and generate electrical energy, which can be utilized to power basic or low-power appliances. Two trials of testing were conducted; each trial tested different pressure levels of the air tank and different openings of the on-off valve (full open and half open) when the compressed air was released. Results showed that all pressure levels at full open produced significantly higher voltage than at half open, while the mean values of current and power produced at all pressure levels at full and half open showed negligible variation. These results signify that the energy contraption using a playground seesaw is a viable alternative source of electrical energy in playgrounds, parks and other places, and can be used as an auxiliary or back-up source of electricity.

  8. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Ellis; Derek Gaston; Benoit Forget

    In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
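    The Functional Expansion Tallies mentioned above represent a tallied distribution as coefficients of orthogonal polynomials, which can then be evaluated anywhere on a finite element mesh. A self-contained sketch using Legendre polynomials (a simplification; the quadratic shape function and trapezoid quadrature below stand in for an actual Monte Carlo tally):

```python
def legendre(n, x):
    """P_n(x) via the Bonnet recurrence (k+1)P_{k+1} = (2k+1)xP_k - kP_{k-1}."""
    p_prev, p = 1.0, x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

def fet_moments(shape, order, samples=2001):
    """Moments a_n = (2n+1)/2 * integral of shape(x)*P_n(x) over [-1, 1];
    trapezoid quadrature stands in for the Monte Carlo tally."""
    h = 2.0 / (samples - 1)
    xs = [-1.0 + i * h for i in range(samples)]
    moments = []
    for n in range(order + 1):
        vals = [shape(x) * legendre(n, x) for x in xs]
        integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
        moments.append((2 * n + 1) / 2.0 * integral)
    return moments

def reconstruct(moments, x):
    """Evaluate the truncated expansion sum_n a_n P_n(x) anywhere in [-1, 1]."""
    return sum(a * legendre(n, x) for n, a in enumerate(moments))

# quadratic stand-in for an axial pin-power shape
shape = lambda x: 1.0 - 0.5 * x * x
m = fet_moments(shape, order=4)
```

Because the quadratic test shape is spanned by P0 and P2, the order-4 expansion reproduces it essentially exactly; real tallied distributions converge as the expansion order grows.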

  9. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    NASA Astrophysics Data System (ADS)

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models have led to increased interest in open source grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra-high voltage), we dissolve the common distinction between transmission and distribution grids in energy system models and utilize a single, integrated model instead. An open data set, primarily for Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by performing exemplary power flow optimizations: based on a marginal-cost-driven power plant dispatch subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in ongoing research.
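    The marginal-cost-driven dispatch described above can be illustrated with a toy merit-order calculation: plants are sorted by marginal cost and dispatched until demand is met. This is a deliberate simplification (the eGo model additionally enforces grid constraints via power flow), and the plant portfolio below is hypothetical:

```python
def merit_order_dispatch(plants, demand_mw):
    """plants: (name, capacity_MW, marginal_cost) tuples.
    Dispatch the cheapest capacity first until demand is covered."""
    dispatch = {}
    remaining = demand_mw
    for name, capacity, cost in sorted(plants, key=lambda p: p[2]):
        if remaining <= 0:
            break
        taken = min(capacity, remaining)
        dispatch[name] = taken
        remaining -= taken
    if remaining > 0:
        raise ValueError("demand exceeds available capacity")
    return dispatch

# illustrative (hypothetical) plant portfolio, costs in EUR/MWh
plants = [("lignite", 800, 30.0), ("wind", 500, 0.0), ("gas", 600, 60.0)]
result = merit_order_dispatch(plants, demand_mw=1000)
```

Here wind runs at full capacity, lignite covers the rest, and the gas plant is never dispatched; in a grid-constrained model, line congestion could force more expensive plants online instead.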

  10. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    NASA Technical Reports Server (NTRS)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and National Space Biomedical Research Institute (NSBRI) funded researchers by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off-the-shelf (COTS) video camera hardware. While the system's accuracy is lower than that of laboratory setups, it provides a means to produce quantitative comparison motion capture kinematic data. Additionally, data such as the exercise volume required for small spaces such as the Orion capsule can be determined.
METHODS: OpenCV is an open source computer vision library that provides the ability to perform multi-camera three-dimensional reconstruction. Utilizing OpenCV, via the Python programming language, a set of tools has been developed to perform motion capture in confined spaces using commercial cameras. Four Sony video cameras were intrinsically calibrated prior to flight. Intrinsic calibration provides a set of camera-specific parameters to remove geometric distortion of the lens and sensor. A set of high-contrast markers was placed on the exercising subject (safety also necessitated that they be soft in case they became detached during parabolic flight); small yarn balls were used. Extrinsic calibration, the determination of camera location and orientation parameters, is performed using fixed landmark markers shared by the camera scenes. Additionally, a wand calibration, sweeping a wand through all camera scenes simultaneously, was also performed. Techniques have been developed to perform intrinsic calibration, extrinsic calibration, isolation of the markers in the scene, calculation of marker 2D centroids, and 3D reconstruction from multiple cameras. These methods have been tested in the laboratory in side-by-side comparison with a traditional motion capture system and also on a parabolic flight.
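    The multi-camera 3D reconstruction step can be sketched with the Direct Linear Transform (DLT), which recovers a 3D point from its 2D projections in two calibrated cameras. This is a plain numpy illustration of the geometry, not the OpenCV-based tool described above (in OpenCV itself, cv2.triangulatePoints plays this role); the camera matrices and point here are made up:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Direct Linear Transform: each view contributes two rows of A X = 0,
    and the homogeneous 3D point X is the null-space vector of A."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # de-homogenize

# two hypothetical calibrated cameras: identity view, and a 1 m baseline in x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
# normalized image coordinates of the 3D point (0.2, -0.1, 4.0) in each view
point = triangulate(P1, P2, (0.05, -0.025), (-0.2, -0.025))
```

With noisy marker centroids from more than two cameras, the same least-squares formulation simply gains two rows of A per extra view.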

  11. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    PubMed

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated by carbohydrate utilization of lactobacilli, and visual inspection revealed 94% of regressions were deemed excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
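    The reported parameters, maximum specific growth rate and doubling time, are commonly estimated by fitting a line to ln(OD) over a sliding window and keeping the steepest slope. A minimal pure-Python sketch of that idea (an assumption about the general approach, not the paper's actual code; the OD readings are synthetic):

```python
import math

def max_specific_growth_rate(times, ods, window=4):
    """Slide a window over ln(OD) vs time and keep the steepest
    least-squares slope (the maximum specific growth rate, mu_max)."""
    log_od = [math.log(od) for od in ods]
    best = 0.0
    for i in range(len(times) - window + 1):
        t = times[i:i + window]
        y = log_od[i:i + window]
        t_bar = sum(t) / window
        y_bar = sum(y) / window
        slope = (sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y))
                 / sum((ti - t_bar) ** 2 for ti in t))
        best = max(best, slope)
    return best

# synthetic OD600 readings for exponential growth at mu = 0.5 per hour
times = [0.5 * i for i in range(10)]            # hours
ods = [0.05 * math.exp(0.5 * t) for t in times]
mu = max_specific_growth_rate(times, ods)
doubling_time = math.log(2) / mu                # hours
```

Real microplate data would additionally need blank subtraction and lag/stationary-phase handling, which is where a dynamic regression tool earns its keep.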

  12. Competition-Based Learning: A Model for the Integration of Competitions with Project-Based Learning Using Open Source LMS

    ERIC Educational Resources Information Center

    Issa, Ghassan; Hussain, Shakir M.; Al-Bahadili, Hussein

    2014-01-01

    In an effort to enhance the learning process in higher education, a new model for Competition-Based Learning (CBL) is presented. The new model utilizes two well-known learning models, namely, the Project-Based Learning (PBL) and competitions. The new model is also applied in a networked environment with emphasis on collective learning as well as…

  13. Quantitative computed tomography (QCT) as a radiology reporting tool by using optical character recognition (OCR) and macro program.

    PubMed

    Lee, Young Han; Song, Ho-Taek; Suh, Jin-Suck

    2012-12-01

    The objectives are (1) to introduce a new concept for a quantitative computed tomography (QCT) reporting system using optical character recognition (OCR) and a macro program, and (2) to illustrate practical usage of the QCT reporting system in the radiology reading environment. The reporting system was created as a development tool using open-source OCR software and an open-source macro program. The main module was designed to perform OCR on QCT images during the radiology reading process. The principal processes are as follows: (1) save a QCT report as a graphic file, (2) recognize the characters in the image as text, (3) extract the T-scores from the text, (4) perform error correction, (5) reformat the values into the QCT radiology reporting template, and (6) paste the report into the electronic medical record (EMR) or picture archiving and communication system (PACS). The accuracy of OCR was tested on randomly selected QCTs. The reporting system successfully performed OCR of QCT reports, and a diagnosis of normal, osteopenia, or osteoporosis was also determined. Error correction of OCR is done with an AutoHotkey-coded module. The T-scores of the femoral neck and lumbar vertebrae were extracted with accuracies of 100% and 95.4%, respectively. A convenient QCT reporting system can be established by utilizing open-source OCR software and an open-source macro program. This method can be easily adapted for other QCT applications and PACS/EMR systems.
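    Step (3), extracting T-scores from OCR'd text, together with the diagnostic classification, can be sketched with a regular expression and the WHO T-score criteria. The report layout matched here is hypothetical; a real OCR'd report would need more robust matching and the error correction described above:

```python
import re

def extract_t_scores(ocr_text):
    """Pull site/T-score pairs out of OCR'd report text.
    The 'site T-score: value' layout assumed here is hypothetical."""
    pattern = re.compile(r"(Femoral neck|L1-L4)\s*T-score[:\s]*(-?\d+\.\d+)")
    return {site: float(value) for site, value in pattern.findall(ocr_text)}

def classify(t_score):
    """WHO criteria: T >= -1 normal; -2.5 < T < -1 osteopenia; T <= -2.5 osteoporosis."""
    if t_score >= -1.0:
        return "normal"
    if t_score > -2.5:
        return "osteopenia"
    return "osteoporosis"

ocr_text = "Femoral neck T-score: -1.8\nL1-L4 T-score: -2.7"
scores = extract_t_scores(ocr_text)
```

The extracted dictionary can then be reformatted into the reporting template and pasted into the EMR/PACS by the macro layer.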

  14. Streamlined sign-out of capillary protein electrophoresis using middleware and an open-source macro application.

    PubMed

    Mathur, Gagan; Haugen, Thomas H; Davis, Scott L; Krasowski, Matthew D

    2014-01-01

    Interfacing of clinical laboratory instruments with the laboratory information system (LIS) via "middleware" software is increasingly common. Our clinical laboratory implemented capillary electrophoresis using a Sebia® Capillarys-2™ (Norcross, GA, USA) instrument for serum and urine protein electrophoresis. Using Data Innovations Instrument Manager, an interface was established with the LIS (Cerner) that allowed for bi-directional transmission of numeric data. However, the text of the interpretive pathology report was not properly transferred. To reduce manual effort and the possibility of error in text data transfer, we developed scripts in AutoHotkey, a free, open-source macro-creation and automation software utility. Scripts were written to create macros that automated mouse and key strokes. The scripts retrieve the specimen accession number, capture user input text, and insert the text interpretation in the correct patient record in the desired format. The scripts accurately and precisely transfer narrative interpretation into the LIS. Combined with bar-code reading by the electrophoresis instrument, the scripts transfer data efficiently to the correct patient record. In addition, the AutoHotkey scripts automated repetitive key strokes required for manual entry into the LIS, making protein electrophoresis sign-out easier to learn and faster to use by the pathology residents. Scripts allow for either preliminary verification by residents or final sign-out by the attending pathologist. Using the open-source AutoHotkey software, we successfully improved the transfer of text data between capillary electrophoresis software and the LIS. The use of open-source software tools should not be overlooked as tools to improve interfacing of laboratory instruments.

  15. Technical Note: FreeCT_ICD: An Open Source Implementation of a Model-Based Iterative Reconstruction Method using Coordinate Descent Optimization for CT Imaging Investigations.

    PubMed

    Hoffman, John M; Noo, Frédéric; Young, Stefano; Hsieh, Scott S; McNitt-Gray, Michael

    2018-06-01

    To facilitate investigations into the impacts of acquisition and reconstruction parameters on quantitative imaging, radiomics and CAD using CT imaging, we previously released an open source implementation of a conventional weighted filtered backprojection reconstruction called FreeCT_wFBP. Our purpose was to extend that work by providing an open-source implementation of a model-based iterative reconstruction method using coordinate descent optimization, called FreeCT_ICD. Model-based iterative reconstruction offers the potential for substantial radiation dose reduction, but can impose substantial computational processing and storage requirements. FreeCT_ICD is an open source implementation of a model-based iterative reconstruction method that provides a reasonable tradeoff between these requirements. This was accomplished by adapting a previously proposed method that allows the system matrix to be stored with a reasonable memory requirement. The method amounts to describing the attenuation coefficient using rotating slices that follow the helical geometry. In the initially proposed version, the rotating slices are themselves described using blobs. We have replaced this description by a unique model that relies on tri-linear interpolation together with the principles of Joseph's method. This model offers an improvement in memory requirement while still allowing highly accurate reconstruction for conventional CT geometries. The system matrix is stored column-wise and combined with an iterative coordinate descent (ICD) optimization. The result is FreeCT_ICD, which is a reconstruction program developed on the Linux platform using C++ libraries and the open source GNU GPL v2.0 license. The software is capable of reconstructing raw projection data of helical CT scans.
In this work, the software has been described and evaluated by reconstructing datasets exported from a clinical scanner which consisted of an ACR accreditation phantom dataset and a clinical pediatric thoracic scan. For the ACR phantom, image quality was comparable to clinical reconstructions as well as reconstructions using open-source FreeCT_wFBP software. The pediatric thoracic scan also yielded acceptable results. In addition, we did not observe any deleterious impact in image quality associated with the utilization of rotating slices. These evaluations also demonstrated reasonable tradeoffs in storage requirements and computational demands. FreeCT_ICD is an open-source implementation of a model-based iterative reconstruction method that extends the capabilities of previously released open source reconstruction software and provides the ability to perform vendor-independent reconstructions of clinically acquired raw projection data. This implementation represents a reasonable tradeoff between storage and computational requirements and has demonstrated acceptable image quality in both simulated and clinical image datasets. This article is protected by copyright. All rights reserved.
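    The iterative coordinate descent (ICD) strategy at the heart of FreeCT_ICD updates one image coefficient at a time, each update exactly minimizing the objective along that coordinate. A toy numpy sketch on an unregularized least-squares problem (the real code minimizes a regularized objective over a stored CT system matrix; the small random system here is purely illustrative):

```python
import numpy as np

def icd_least_squares(A, b, sweeps=200):
    """Cyclic coordinate descent on ||Ax - b||^2: each update exactly
    minimizes along one coordinate, and the residual is corrected in
    place, which is what makes per-coefficient updates cheap."""
    n = A.shape[1]
    x = np.zeros(n)
    r = b.astype(float).copy()         # residual r = b - A @ x
    col_norms = (A * A).sum(axis=0)    # ||a_j||^2, precomputed once
    for _ in range(sweeps):
        for j in range(n):
            step = A[:, j] @ r / col_norms[j]
            x[j] += step
            r -= step * A[:, j]
    return x

# small synthetic system standing in for the CT forward model
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 8))
x_true = rng.standard_normal(8)
x = icd_least_squares(A, A @ x_true)
```

Storing the system matrix column-wise, as the abstract notes, is what makes the per-column access pattern of this update efficient at CT scale.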

  16. The Application of the Open Pharmacological Concepts Triple Store (Open PHACTS) to Support Drug Discovery Research

    PubMed Central

    Ratnam, Joseline; Zdrazil, Barbara; Digles, Daniela; Cuadrado-Rodriguez, Emiliano; Neefs, Jean-Marc; Tipney, Hannah; Siebes, Ronald; Waagmeester, Andra; Bradley, Glyn; Chau, Chau Han; Richter, Lars; Brea, Jose; Evelo, Chris T.; Jacoby, Edgar; Senger, Stefan; Loza, Maria Isabel; Ecker, Gerhard F.; Chichester, Christine

    2014-01-01

    Integration of open access, curated, high-quality information from multiple disciplines in the Life and Biomedical Sciences provides a holistic understanding of the domain. Additionally, the effective linking of diverse data sources can unearth hidden relationships and guide potential research strategies. However, given the lack of consistency between descriptors and identifiers used in different resources and the absence of a simple mechanism to link them, gathering and combining relevant, comprehensive information from diverse databases remains a challenge. The Open Pharmacological Concepts Triple Store (Open PHACTS) is an Innovative Medicines Initiative project that uses semantic web technology approaches to enable scientists to easily access and process data from multiple sources to solve real-world drug discovery problems. The project draws together sources of publicly-available pharmacological, physicochemical and biomolecular data, represents it in a stable infrastructure and provides well-defined information exploration and retrieval methods. Here, we highlight the utility of this platform in conjunction with workflow tools to solve pharmacological research questions that require interoperability between target, compound, and pathway data. Use cases presented herein cover (1) the comprehensive identification of chemical matter for a dopamine receptor drug discovery program; (2) the identification of compounds active against all targets in the Epidermal growth factor receptor (ErbB) signaling pathway that have a relevance to disease; and (3) the evaluation of established targets in the Vitamin D metabolism pathway to aid novel Vitamin D analogue design. The example workflows presented illustrate how the Open PHACTS Discovery Platform can be used to exploit existing knowledge and generate new hypotheses in the process of drug discovery. PMID:25522365

  17. World of intelligence defense object detection-machine learning (artificial intelligence)

    NASA Astrophysics Data System (ADS)

    Gupta, Anitya; Kumar, Akhilesh; Bhushan, Vinayak

    2018-04-01

    This paper proposes a Fast Region-based Convolutional Network method (Fast R-CNN) for object detection. Fast R-CNN builds on previous work to efficiently classify object proposals using deep convolutional networks. Compared to previous work, Fast R-CNN employs several innovations to improve training and testing speed while also increasing detection accuracy. Fast R-CNN trains the deep VGG16 network 9x faster than R-CNN, is 213x faster at test time, and achieves a higher mAP on PASCAL VOC 2012. Compared to SPPnet, Fast R-CNN trains VGG16 3x faster, tests 10x faster, and is more accurate. Fast R-CNN is implemented in Python and C++ (using Caffe) and is available under the open-source MIT License.

  18. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in establishing shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrate it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats.
KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
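One of the processes described above, aggregating identifiers from different databases that denote the same biomedical concept, can be sketched with a union-find structure (a generic illustration, not KaBOB's code; the identifiers happen to name TP53 but the cross-references are supplied here as examples):

```python
# Hypothetical sketch: merge database identifiers asserted to denote the
# same biomedical concept, so queries can address the concept rather than
# any one source record.

def find(parent, x):
    while parent.setdefault(x, x) != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

def union(parent, a, b):
    parent[find(parent, a)] = find(parent, b)

parent = {}
# Cross-references asserting "these two records denote the same gene".
for a, b in [("UniProt:P04637", "NCBIGene:7157"),
             ("NCBIGene:7157", "HGNC:11998")]:
    union(parent, a, b)

assert find(parent, "UniProt:P04637") == find(parent, "HGNC:11998")
```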

  19. Simrank: Rapid and sensitive general-purpose k-mer search tool

    PubMed Central

    2011-01-01

    Background Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project http://nihroadmap.nih.gov/hmp. Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Results Here we present a stand-alone utility, Simrank, which allows users to rapidly identify the database strings most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-language datasets found Simrank 10X to 928X faster, depending on the dataset. Conclusions Simrank provides molecular ecologists with a high-throughput, open source choice for comparing large sequence sets to find similarity. PMID:21524302
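The k-mer matching idea behind tools like Simrank can be sketched as follows (a minimal illustration of the approach, not Simrank's implementation; the sequences and k value are invented):

```python
# Minimal sketch: score database sequences by how many k-mers they share
# with the query, ranking the most similar first.

def kmers(seq, k=7):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def rank_by_shared_kmers(query, database, k=7):
    q = kmers(query, k)
    scored = [(len(q & kmers(s, k)), name) for name, s in database.items()]
    return sorted(scored, reverse=True)
```

Set intersection on precomputed k-mer sets is what makes this kind of search fast relative to alignment: similarity is estimated without ever aligning the sequences.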

  20. Energy-efficient regenerative liquid desiccant drying process

    DOEpatents

    Ko, Suk M.; Grodzka, Philomena G.; McCormick, Paul O.

    1980-01-01

    This invention relates to the use of desiccants in conjunction with an open loop drying cycle and a closed loop drying cycle to reclaim the energy expended in vaporizing moisture in harvested crops. In the closed loop cycle, the drying air is brought into contact with a desiccant after it exits the crop drying bin. Water vapor in the moist air is absorbed by the desiccant, thus reducing the relative humidity of the air. The air is then heated by the used desiccant and returned to the crop bin. During the open loop drying cycle the used desiccant is heated (either fossil or solar energy heat sources may be used) and regenerated at high temperature, driving water vapor from the desiccant. This water vapor is condensed and used to preheat the dilute (wet) desiccant before heat is added from the external source (fossil or solar). The latent heat of vaporization of the moisture removed from the desiccant is reclaimed in this manner. The sensible heat of the regenerated desiccant is utilized in the open loop drying cycle. Also, closed cycle operation implies that no net energy is expended in heating drying air.

  1. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.

    PubMed

    Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred

    2013-12-01

    The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collection of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA)-an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.

  2. Entrepreneurial model based technology creative industries sector software through the use of free open source software for Universitas Pendidikan Indonesia students

    NASA Astrophysics Data System (ADS)

    Hasan, B.; Hasbullah; Purnama, W.; Hery, A.

    2016-04-01

    Development of the software creative industries by using Free Open Source Software (FOSS) is expected to be one solution for fostering new entrepreneurs among students who can open job opportunities and contribute to economic development in Indonesia. This study aims to create an entrepreneurial coaching model for the creative industries based on FOSS, as well as to provide understanding of, and foster, software-based creative-industry entrepreneurship among students of Universitas Pendidikan Indonesia. The activity begins with identifying the software business or technology to be developed, followed by training and mentoring, an apprenticeship process with industrial partners, creation of business plans, and monitoring and evaluation. The activity involves 30 UPI students who are motivated toward self-employment and competent in information technology. The expected result and outcome of these activities is the emergence of a number of new student entrepreneurs in the software industry, covering commerce (e-commerce), education/learning (e-learning/LMS), and games.

  3. Open Babel: An open chemical toolbox

    PubMed Central

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
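The hub-and-spoke design that makes a converter like Open Babel tractable can be sketched as follows (a hedged illustration of the architectural idea, not Open Babel's API; the two "formats" and the list-of-atoms internal model are toy inventions):

```python
# Illustrative sketch: each format supplies a reader into a common internal
# model and a writer out of it, so N formats need 2N codecs instead of
# N*N pairwise converters.

READERS, WRITERS = {}, {}

def register(fmt, reader, writer):
    READERS[fmt], WRITERS[fmt] = reader, writer

def convert(data, src, dst):
    internal = READERS[src](data)       # parse into the common model
    return WRITERS[dst](internal)       # serialize from the common model

# Two invented toy formats over a list-of-atoms internal model.
register("csv", lambda s: s.split(","), lambda atoms: ",".join(atoms))
register("lines", lambda s: s.splitlines(), lambda atoms: "\n".join(atoms))

assert convert("C,C,O", "csv", "lines") == "C\nC\nO"
```

Adding a new format means registering one reader/writer pair; every existing format immediately becomes interconvertible with it.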

  4. Modular Open System Architecture for Reducing Contamination Risk in the Space and Missile Defense Supply Chain

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine

    2015-01-01

    To combat contamination of physical assets and provide reliable data to decision makers in the space and missile defense community, a modular open system architecture for the creation of contamination models and standards is proposed. Predictive tools for quantifying the effects of contamination can be calibrated from NASA data on long-term orbiting assets. This data can then be extrapolated to missile defense predictive models. By utilizing a modular open system architecture, sensitive data can be decoupled and protected while benefiting from open source data of calibrated models. This system architecture will include modules that allow the designer to trade the effects of baseline performance against lifecycle degradation due to contamination while modeling the lifecycle costs of alternative designs. In this way, each member of the supply chain becomes an informed and active participant in managing contamination risk early in the system lifecycle.

  5. CSNS computing environment Based on OpenStack

    NASA Astrophysics Data System (ADS)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage, and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized.

  6. Water cycle algorithm: A detailed standard code

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Eskandar, Hadi; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon

    Inspired by the observation of the water cycle process and the movements of rivers and streams toward the sea, a population-based metaheuristic algorithm, the water cycle algorithm (WCA), has recently been proposed. Lately, an increasing number of WCA applications have appeared and the WCA has been utilized in different optimization fields. This paper provides detailed open source code for the WCA, whose performance and efficiency have been demonstrated for solving optimization problems. The WCA has an interesting and simple concept and this paper aims to use its source code to provide a step-by-step explanation of the process it follows.
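A heavily simplified, hedged sketch of the WCA's core ideas (streams flowing toward the best solution, with an evaporation/raining step; this condenses the full algorithm, which also distinguishes rivers from streams, and all parameter values are illustrative):

```python
# Simplified WCA-style sketch: candidate "streams" flow toward the best
# solution found so far (the "sea"); a stream that reaches the sea is
# re-seeded at a random location ("raining") to keep exploring.

import random

def wca_sketch(cost, dim=2, n_streams=20, iters=200, C=2.0, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_streams)]
    sea = min(pop, key=cost)[:]
    history = [cost(sea)]
    for _ in range(iters):
        for s in pop:
            # Stream update: X += rand * C * (X_sea - X).
            for d in range(dim):
                s[d] += rng.random() * C * (sea[d] - s[d])
            if cost(s) < cost(sea):
                sea = s[:]
            # Evaporation/raining for streams that have reached the sea.
            if sum((a - b) ** 2 for a, b in zip(s, sea)) < 1e-8:
                s[:] = [rng.uniform(-5, 5) for _ in range(dim)]
        history.append(cost(sea))
    return sea, history

sphere = lambda x: sum(v * v for v in x)
best, history = wca_sketch(sphere)
```

On the sphere test function the best cost recorded in `history` is non-increasing by construction and shrinks steadily as the streams contract toward the sea.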

  7. A Roadmap for using Agile Development in a Traditional System

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas

    2006-01-01

    I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on the Eclipse Rich Client Platform (open source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for the Mars Science Laboratory; b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.

  8. Summary and evaluation of the Strategic Defense Initiative Space Power Architecture Study

    NASA Technical Reports Server (NTRS)

    Edenburn, M. (Editor); Smith, J. M. (Editor)

    1989-01-01

    The Space Power Architecture Study (SPAS) identified and evaluated power subsystem options for multimegawatt electric (MMWE) space based weapons and surveillance platforms for Strategic Defense Initiative (SDI) applications. Steady state requirements of less than 1 MMWE are adequately covered by the SP-100 nuclear space power program and hence were not addressed in the SPAS. Four steady state power systems less than 1 MMWE were investigated with little difference between them on a mass basis. The majority of the burst power systems utilized H(2) from the weapons and were either closed (no effluent), open (effluent release) or steady state with storage (no effluent). Closed systems used nuclear or combustion heat sources with thermionic, Rankine, turboalternator, fuel cell and battery conversion devices. Open systems included nuclear or combustion heat sources using turboalternator, magnetohydrodynamic, fuel cell or battery power conversion devices. The steady state systems with storage used the SP-100 or Star-M reactors as energy sources and flywheels, fuel cells or batteries to store energy for burst applications. As with other studies, the open systems are by far the lightest, most compact and simplest (most reliable) systems. However, unlike other studies, the SPAS examined potential platform operational problems caused by effluents or vibration.

  9. Sirepo - Warp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagler, Robert; Moeller, Paul

    Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is Warp. Warp is a particle-in-cell (PIC) code designed to simulate high-intensity charged particle beams and plasmas in both the electrostatic and electromagnetic regimes, with a wide variety of integrated physics models and diagnostics. At present, Sirepo supports a small subset of Warp's capabilities.  Warp is open source and is part of the Berkeley Lab Accelerator Simulation Toolkit.
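Since Flask and Werkzeug both speak WSGI, the server side of a stack like Sirepo's ultimately reduces to a WSGI callable. A minimal, hypothetical example (the `/status` endpoint and its payload are invented, not part of Sirepo):

```python
# Minimal WSGI application: the interface contract (environ in, bytes out
# via start_response) that Flask/Werkzeug-based servers implement and that
# a front end such as Nginx can proxy to.

import json

def app(environ, start_response):
    # Route a single JSON API endpoint, as a Sirepo-like server might.
    if environ.get("PATH_INFO") == "/status":
        body = json.dumps({"state": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

Because the contract is just a callable, the app can be exercised directly with a fake `environ`, which is how WSGI frameworks are unit-tested without a running server.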

  10. Detecting a signal in the noise: monitoring the global spread of novel psychoactive substances using media and other open-source information†

    PubMed Central

    Young, Matthew M; Dubeau, Chad; Corazza, Ornella

    2015-01-01

    Objective To determine the feasibility and utility of using media reports and other open-source information collected by the Global Public Health Intelligence Network (GPHIN), an event-based surveillance system operated by the Public Health Agency of Canada, to rapidly detect clusters of adverse drug events associated with ‘novel psychoactive substances’ (NPS) at the international level. Methods and Results Researchers searched English media reports collected by the GPHIN between 1997 and 2013 for references to synthetic cannabinoids. They screened the resulting reports for relevance and content (i.e., reports of morbidity and arrest), then plotted the results and compared them with other available indicators (e.g., US poison control center exposures). The pattern of results from the analysis of GPHIN reports resembled the pattern seen in the other indicators. Conclusions The results of this study indicate that using media and other open-source information can help monitor the presence, usage, local policy, law enforcement responses, and spread of NPS in a rapid, effective way. Further, modifying GPHIN to actively track NPS would be relatively inexpensive to implement and would be highly complementary to current national and international monitoring efforts. © 2015 The Authors. Human Psychopharmacology: Clinical and Experimental published by John Wiley & Sons, Ltd. PMID:26216568
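The screening step described above can be sketched as a simple keyword tally over dated reports (a toy illustration of the approach, not GPHIN's system; the reports and keyword list are invented):

```python
# Toy sketch: scan open-source media reports for NPS-related keywords and
# tally matching reports by year, so clusters stand out in a time series.

from collections import Counter

KEYWORDS = ("spice", "k2", "synthetic cannabinoid")

def yearly_signal(reports):
    """reports: iterable of (year, text) pairs.
    Returns a Counter of matching reports per year."""
    hits = Counter()
    for year, text in reports:
        if any(k in text.lower() for k in KEYWORDS):
            hits[year] += 1
    return hits

reports = [(2010, "Spice-related ER visits rise"),
           (2010, "Unrelated weather story"),
           (2011, "Police seize synthetic cannabinoid products")]
assert yearly_signal(reports) == Counter({2010: 1, 2011: 1})
```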

  11. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    NASA Technical Reports Server (NTRS)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
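The 2-body propagation capability mentioned above reduces to integrating r'' = -mu*r/|r|^3. A hedged sketch in normalized units (an illustration of the numerics, not the SCENIC/ODTBX code; step size and initial conditions are chosen for demonstration):

```python
# Sketch: velocity-Verlet integration of the two-body equation of motion
# in 2-D, normalized so that mu = 1.

import math

def accel(r, mu=1.0):
    d = math.hypot(r[0], r[1]) ** 3
    return (-mu * r[0] / d, -mu * r[1] / d)

def propagate(r, v, t, dt=1e-3, mu=1.0):
    a = accel(r, mu)
    for _ in range(int(t / dt)):
        r = (r[0] + v[0] * dt + 0.5 * a[0] * dt * dt,
             r[1] + v[1] * dt + 0.5 * a[1] * dt * dt)
        a_new = accel(r, mu)
        v = (v[0] + 0.5 * (a[0] + a_new[0]) * dt,
             v[1] + 0.5 * (a[1] + a_new[1]) * dt)
        a = a_new
    return r, v

# Circular orbit at radius 1 with mu = 1: the period is 2*pi, so after one
# period the satellite should return (nearly) to its starting point.
r, v = propagate((1.0, 0.0), (0.0, 1.0), 2 * math.pi)
```

Velocity Verlet is symplectic, so the orbit radius stays bounded over many periods rather than drifting the way naive Euler integration does.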

  12. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
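The pipeline pattern the platform provides can be sketched generically (an illustration only, not Jenkins-CI or CellProfiler code; the stage names and data are invented):

```python
# Illustrative sketch: stages run in order with results recorded per stage,
# and downstream stages are skipped after the first failure, as in a CI
# pipeline.

def run_pipeline(stages, ctx):
    results = []
    for name, step in stages:
        try:
            ctx = step(ctx)
            results.append((name, "SUCCESS"))
        except Exception:
            results.append((name, "FAILURE"))
            break   # remaining stages are not run
    return results, ctx

stages = [
    ("segment", lambda imgs: [i + ":cells" for i in imgs]),
    ("measure", lambda imgs: {i: len(i) for i in imgs}),
]
results, out = run_pipeline(stages, ["well_A01.tif"])
```

Centralizing this orchestration (plus logging, retries, and artifact storage) is what a CI server contributes beyond running the individual analysis tools.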

  13. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  14. Re-utilization of Industrial CO2 for Algae Production Using a Phase Change Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Brian

    This is the final report of a 36-month Phase II cooperative agreement. Under this project, Touchstone Research Laboratory (Touchstone) investigated the merits of incorporating a Phase Change Material (PCM) into an open-pond algae production system that can capture and re-use the CO2 from a coal-fired flue gas source located in Wooster, OH. The primary objective of the project was to design, construct, and operate a series of open algae ponds that accept a slipstream of flue gas from a coal-fired source and convert a significant portion of the CO2 to liquid biofuels, electricity, and specialty products, while demonstrating the merits of the PCM technology. Construction of the pilot facility and shakedown of the facility in Wooster, OH, was completed during the first two years, and the focus of the last year was on operations and the cultivation of algae. During this Phase II effort a large-scale algae concentration unit from OpenAlgae was installed and utilized to continuously harvest algae from indoor raceways. An Algae Lysing Unit and Oil Recovery Unit were also received and installed. Initial parameters for lysing Nannochloropsis were tested. Conditions were established that showed the lysing operation was effective at killing the algae cells. Continuous harvesting activities yielded over 200 kg algae dry weight for Ponds 1, 2 and 4. Studies were conducted to determine the effect of anaerobic digestion effluent as a nutrient source and the resulting lipid productivity of the algae. Lipid content and total fatty acids were unaffected by culture system and nutrient source, indicating that open raceway ponds fed diluted anaerobic digestion effluent can obtain similar lipid productivities to open raceway ponds using commercial nutrients. Data were also collected with respect to the performance of the PCM material on the pilot-scale raceway ponds. Parameters such as evaporative water loss, temperature differences, and growth/productivity were tracked.
    The pond with the PCM material was consistently 2 to 5°C warmer than the control pond. This difference did not seem to increase significantly over time. During phase transitions for the PCM, the magnitude of the difference between the daily minimum and maximum temperatures decreased, resulting in smaller daily temperature fluctuations. A thin layer of PCM material reduced overall water loss by 74% and consistently provided algae densities that were 80% greater than the control pond.

  15. A Descriptive Study of the Utilization of Behavioral Health Resources in the Fort Hood Catchment Area

    DTIC Science & Technology

    2008-07-15

    …therapy (CBT), eye movement desensitization and reprocessing (EMDR), and medications, particularly selective serotonin reuptake inhibitors (SSRI)…Test. CAGE is an acronym created by taking the first letter of the words Cut Down, Annoyed, Guilty, and Eye Opener, which are words embedded in the…

  16. Development of a 3D WebGIS System for Retrieving and Visualizing CityGML Data Based on their Geometric and Semantic Characteristics by Using Free and Open Source Technology

    NASA Astrophysics Data System (ADS)

    Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    CityGML is considered an optimal standard for representing 3D city models. However, international experience has shown that visualization of such models is quite difficult to implement on the web, due to the large size of the data and the complexity of CityGML. As a result, in the context of this paper, a 3D WebGIS application is developed in order to successfully retrieve and visualize CityGML data in accordance with their respective geometric and semantic characteristics. Furthermore, the available web technologies and the architecture of WebGIS systems are investigated, as provided by international experience, in order to be utilized in the most appropriate way for the purposes of this paper. Specifically, a PostgreSQL/PostGIS database is used, in compliance with the 3DCityDB schema. At the server tier, Apache HTTP Server and GeoServer are utilized, while PHP is used as the server-side programming language. At the client tier, which implements the interface of the application, the following technologies were used: jQuery, AJAX, JavaScript, HTML5, WebGL and Ol3-Cesium. Finally, it is worth mentioning that the application's primary objectives are a user-friendly interface and fully open source development.

  17. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    NASA Astrophysics Data System (ADS)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application that was developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application made use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damages to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of a web map and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or the "Geo-SAFER Mindanao" Program.

  18. Finding novel relationships with integrated gene-gene association network analysis of Synechocystis sp. PCC 6803 using species-independent text-mining.

    PubMed

    Kreula, Sanna M; Kaewphan, Suwisa; Ginter, Filip; Jones, Patrik R

    2018-01-01

    The increasing move towards open access full-text scientific literature enhances our ability to utilize advanced text-mining methods to construct information-rich networks that no human will be able to grasp simply from 'reading the literature'. The utility of text-mining for well-studied species is obvious, though the utility for less studied species, or those with no prior track-record at all, is not clear. Here we present a concept for how advanced text-mining can be used to create information-rich networks even for less well studied species and apply it to generate an open-access gene-gene association network resource for Synechocystis sp. PCC 6803, a representative model organism for cyanobacteria and first case-study for the methodology. By merging the text-mining network with networks generated from species-specific experimental data, network integration was used to enhance the accuracy of predicting novel interactions that are biologically relevant. A rule-based algorithm (filter) was constructed in order to automate the search for novel candidate genes with a high degree of likely association to known target genes by (1) ignoring established relationships from the existing literature, as they are already 'known', and (2) demanding multiple independent evidences for every novel and potentially relevant relationship. Using selected case studies, we demonstrate the utility of the network resource and filter to (i) discover novel candidate associations between different genes or proteins in the network, and (ii) rapidly evaluate the potential role of any one particular gene or protein. The full network is provided as an open-source resource.
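The two filter rules described above, ignore relationships already in the literature and demand multiple independent evidences, can be sketched directly (a toy illustration, not the paper's algorithm or data; the locus-tag-style gene names and evidence sources are invented):

```python
# Hedged sketch of the rule-based filter: keep a candidate gene-gene
# association only if (1) it is absent from the known literature AND
# (2) it is supported by at least two independent evidence sources.

def novel_candidates(candidates, known, min_evidence=2):
    """candidates: {frozenset({gene_a, gene_b}): set of evidence sources}."""
    return {pair for pair, sources in candidates.items()
            if pair not in known and len(sources) >= min_evidence}

known = {frozenset({"slr1834", "slr0009"})}          # already published
candidates = {
    frozenset({"slr1834", "slr0009"}): {"text", "coexpr"},   # known: drop
    frozenset({"sll1234", "slr9999"}): {"text", "coexpr"},   # novel: keep
    frozenset({"sll0001", "slr0002"}): {"text"},             # weak: drop
}
assert novel_candidates(candidates, known) == {frozenset({"sll1234", "slr9999"})}
```

Using `frozenset` pairs makes the association direction-free, so (a, b) and (b, a) are the same relationship.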

  19. HitWalker2: visual analytics for precision medicine and beyond.

    PubMed

    Bottomly, Daniel; McWeeney, Shannon K; Wilmot, Beth

    2016-04-15

    The lack of visualization frameworks to guide interpretation and facilitate discovery is a potential bottleneck for precision medicine, systems genetics and other studies. To address this we have developed an interactive, reproducible, web-based prioritization approach that builds on our earlier work. HitWalker2 is highly flexible and can utilize many data types and prioritization methods based upon available data and desired questions, allowing it to be used in a diverse range of studies such as cancer, infectious disease and psychiatric disorders. Source code is freely available at https://github.com/biodev/HitWalker2 and implemented using Python/Django, Neo4j and JavaScript (D3.js and jQuery). We support major open source browsers (e.g. Firefox and Chromium/Chrome). Contact: wilmotb@ohsu.edu. Supplementary data are available at Bioinformatics online. Additional information/instructions are available at https://github.com/biodev/HitWalker2/wiki. © The Author 2015. Published by Oxford University Press.
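Network-based prioritization of the kind HitWalker2 performs is often built on random walk with restart. A generic sketch of that scoring idea (an illustration only, not HitWalker2's actual method or code; the graph and seed set are invented):

```python
# Generic sketch: random walk with restart on a small undirected graph.
# Probability mass repeatedly diffuses along edges and teleports back to
# the seed nodes; nodes are then ranked by their steady-state score, so
# nodes close to the seeds rank highest.

def rwr(adj, seeds, restart=0.3, iters=100):
    nodes = sorted(adj)
    p = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    for _ in range(iters):
        q = {n: 0.0 for n in nodes}
        for n in nodes:
            share = (1 - restart) * p[n] / len(adj[n])
            for m in adj[n]:
                q[m] += share          # diffuse along edges
        for s in seeds:
            q[s] += restart / len(seeds)   # teleport back to the seeds
        p = q
    return sorted(nodes, key=p.get, reverse=True)

# Path graph A - B - C - D, seeded at A: rank should decay with distance.
adj = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
ranking = rwr(adj, seeds={"A"})
```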

  20. Feasibility study of solar energy in residential electricity generation

    NASA Astrophysics Data System (ADS)

    Solanki, Divyangsinh G.

    With the increasing demand for energy and concerns about the global environment, along with steady progress in the field of renewable energy technologies, new opportunities are opening up for the efficient utilization of renewable energy sources. Solar energy is undoubtedly the most clean, inexhaustible and abundant source of renewable energy. Photovoltaic (PV) technology is one of the most efficient means to utilize solar power. The focus of this study was to establish the economics of a residential photovoltaic system for a typical home in south Texas. The PV system serves the needs of a typical mid-size home inhabited by a typical family. Assumptions are made for the typical daily energy consumption, and the necessary equipment such as solar arrays, batteries, inverter, etc. is sized and evaluated optimally so as to reduce the life cycle cost (LCC) of the system. Calculations are done taking into consideration the economic parameters concerned with the system.
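A life cycle cost comparison of the kind the study performs can be sketched as initial cost plus the present value of recurring costs (the structure is standard engineering economics, but every figure below is an invented placeholder, not the study's data):

```python
# Hedged sketch: life cycle cost (LCC) as up-front cost plus discounted
# recurring costs over the system life.

def present_value(annual_cost, rate, years):
    """Discount a constant annual cost back to today."""
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

def life_cycle_cost(initial, annual_om, rate=0.05, years=25):
    return initial + present_value(annual_om, rate, years)

# Invented comparison: a PV system with high up-front cost and low O&M
# versus grid purchases with no up-front cost and a higher annual bill.
pv_system = life_cycle_cost(initial=18000, annual_om=200)
grid_only = life_cycle_cost(initial=0, annual_om=1500)
```

Minimizing LCC rather than the purchase price is what lets a high-capital, low-operating-cost option like PV win the comparison.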

  1. Retail wheeling - users, utilities and power producers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubacki, J. Jr.

    1996-12-31

    Information is outlined on the retail wheeling of electric power. Topics discussed include: SEL mission; average cost per kWh; retail pilot programs; retail wheeling activity; key tasks for industrials; power marketer quote; retail wheeling strategic planning; metered customer load profile; proposed ISO regions; conjunctive billing; interconnection areas; FERC order 888; open access same time information systems; transmission interconnections; suppliers of energy and capacity; self-generation; FERC Form 714; rebundling unbundled services; key variables: load factor; energy and capacity; metering today; competitive industry configuration; power cost reduction: strategic planning; real-time pricing; prime sources of leverage; likelihood of switching utilities; and Strategic Energy Ltd.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.

    VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: •A reference platform for researchers to quickly develop control applications for transactive energy. •A reference platform with flexible data store support for energy analytics applications, whether in academia or in commercial enterprise. •A platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines. •An accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy's Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.

  3. An open source, 3D printed preclinical MRI phantom for repeated measures of contrast agents and reference standards.

    PubMed

    Cox, B L; Ludwig, K D; Adamson, E B; Eliceiri, K W; Fain, S B

    2018-03-01

    In medical imaging, clinicians, researchers and technicians have begun to use 3D printing to create specialized phantoms to replace commercial ones due to their customizable and iterative nature. Presented here is the design of a 3D printed open source, reusable magnetic resonance imaging (MRI) phantom, capable of flood-filling, with removable samples for measurements of contrast agent solutions and reference standards, and for use in evaluating acquisition techniques and image reconstruction performance. The phantom was designed using SolidWorks, a computer-aided design software package. The phantom consists of custom and off-the-shelf parts and incorporates an air hole and Luer Lock system to aid in flood filling, a marker for orientation of samples in the filled mode, and bolt and tube holes for assembly. The cost of construction for all materials is under $90. All design files are open-source and available for download. To demonstrate utility, B0 field mapping was performed using a series of gadolinium concentrations in both the unfilled and flood-filled modes. An excellent linear agreement (R2 > 0.998) was observed between measured relaxation rates (R1/R2) and gadolinium concentration. The phantom provides a reliable setup to test data acquisition and reconstruction methods and verify physical alignment in alternative nuclei MRI techniques (e.g. carbon-13 and fluorine-19 MRI). A cost-effective, open-source MRI phantom design for repeated quantitative measurement of contrast agents and reference standards in preclinical research is presented. Specifically, the work is an example of how the emerging technology of 3D printing improves flexibility and access for custom phantom design.
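The linear agreement the abstract reports follows the standard relaxivity relation R1 = R1,0 + r1·C. A minimal least-squares check of that relation can be sketched as follows; the concentration and rate values are invented for illustration and are not the paper's data.

```python
# Least-squares fit of relaxation rate vs. contrast-agent concentration,
# R1 = R1_0 + r1 * C. Data points are invented for illustration only.

def linear_fit(xs, ys):
    """Return slope, intercept, and R^2 of an ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [0.0, 0.5, 1.0, 2.0, 4.0]       # mM gadolinium (hypothetical)
r1   = [0.4, 2.1, 4.0, 7.9, 15.8]      # measured R1 in 1/s (hypothetical)

relaxivity, baseline, r_squared = linear_fit(conc, r1)
print(f"r1 = {relaxivity:.3f} /s/mM, R^2 = {r_squared:.4f}")
```

The fitted slope is the relaxivity r1 and the intercept is the rate of the undoped solution; an R^2 above 0.998, as in the paper, indicates the phantom's samples behave linearly over the tested range.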

  4. Development of an Analysis and Design Optimization Framework for Marine Propellers

    NASA Astrophysics Data System (ADS)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool to determine the main geometric characteristics of the propeller, while also providing the designer with the capability to optimize the shape of the blade sections based on specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for the three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting-line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-omega SST turbulence model. Case studies of benchmark high-speed propulsors are utilized to validate the proposed framework for propeller operation in open-water conditions as well as in a ship's wake.

  5. 40 CFR Table 3 to Subpart Wwww of... - Organic HAP Emissions Limits for Existing Open Molding Sources, New Open Molding Sources Emitting...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and... CATEGORIES National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites... Existing Open Molding Sources, New Open Molding Sources Emitting Less Than 100 TPY of HAP, and New and...

  6. FormScanner: Open-Source Solution for Grading Multiple-Choice Exams

    NASA Astrophysics Data System (ADS)

    Young, Chadwick; Lo, Glenn; Young, Kaisa; Borsetta, Alberto

    2016-01-01

    The multiple-choice exam remains a staple for many introductory physics courses. In the past, people have graded these by hand or even with flaming needles. Today, one usually grades the exams with a form scanner that utilizes optical mark recognition (OMR). Several companies provide these scanners and particular forms, such as the eponymous "Scantron." OMR scanners combine hardware and software—a scanner and OMR program—to read and grade student-filled forms.
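At its core, OMR grading reduces to deciding which bubble on each question is darkest and comparing that choice to an answer key. A toy sketch of that logic follows; FormScanner itself is a Java application and far more involved, and the darkness values and key below are invented.

```python
# Toy optical-mark-recognition grading: for each question, pick the
# choice whose bubble region is darkest, then score against a key.
# Darkness values and the answer key are invented for illustration.

def detect_answer(darkness_by_choice, threshold=0.5):
    """Return the choice with the darkest mark, or None if nothing is filled."""
    choice, darkness = max(darkness_by_choice.items(), key=lambda kv: kv[1])
    return choice if darkness >= threshold else None

answer_key = {1: "B", 2: "D", 3: "A"}
scanned = {                       # mean darkness per bubble, 0..1
    1: {"A": 0.08, "B": 0.91, "C": 0.10, "D": 0.07},
    2: {"A": 0.05, "B": 0.06, "C": 0.09, "D": 0.88},
    3: {"A": 0.12, "B": 0.85, "C": 0.04, "D": 0.06},  # wrong answer marked
}

score = sum(
    detect_answer(bubbles) == answer_key[q] for q, bubbles in scanned.items()
)
print(f"score: {score}/{len(answer_key)}")
```

The threshold guards against scoring an unanswered question; real OMR software additionally locates the form's registration marks before sampling each bubble region.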

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  8. Bright broadband coherent fiber sources emitting strongly blue-shifted resonant dispersive wave pulses

    PubMed Central

    Tu, Haohua; Lægsgaard, Jesper; Zhang, Rui; Tong, Shi; Liu, Yuan; Boppart, Stephen A.

    2013-01-01

    We predict and realize the targeted wavelength conversion from the 1550-nm band of a fs Er:fiber laser to an isolated band inside 370-850 nm, corresponding to a blue-shift of 700-1180 nm. The conversion utilizes resonant dispersive wave generation in widely available optical fibers with good efficiency (~7%). The converted band has a large pulse energy (~1 nJ), high spectral brightness (~1 mW/nm), and broad Gaussian-like spectrum compressible to clean transform-limited ~17 fs pulses. The corresponding coherent fiber sources open up portable applications of optical parametric oscillators and dual-output synchronized ultrafast lasers. PMID:24104233

  9. Evaluation of the Acoustic Measurement Capability of the NASA Langley V/STOL Wind Tunnel Open Test Section with Acoustically Absorbent Ceiling and Floor Treatments

    NASA Technical Reports Server (NTRS)

    Theobald, M. A.

    1978-01-01

    The single source location used for helicopter model studies was utilized in a study to determine the distances and directions upstream of the model at which accurate measurements of the direct acoustic field could be obtained. The method used was to measure the decrease of sound pressure levels with distance from a noise source and thereby determine the hall radius as a function of frequency and direction. Test arrangements and procedures are described. Graphs show the normalized sound pressure level versus distance curves for the glass fiber floor treatment and for the foam floor treatment.

  10. ObsPy: Establishing and maintaining an open-source community package

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Barsch, R.

    2017-12-01

    Python's ecosystem has evolved into one of the most powerful and productive research environments across disciplines. ObsPy (https://obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge into that ecosystem for seismology. It does so by offering: read and write support for essentially every commonly used data format in seismology; integrated access to the largest data centers, web services, and real-time data streams; a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations, geodetic functions, and data visualization. ObsPy has been in constant unfunded development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. By now around 70 people have directly contributed code to ObsPy, and we aim to make it a self-sustaining community project. This contribution focuses on several meta aspects of open-source software in science, in particular how we have experienced them. During the panel we would like to discuss obvious questions such as long-term sustainability with very limited to no funding, insufficient computer science training in many sciences, and gaining hard scientific credit for software development, as well as the following questions: How best to deal with the fact that a lot of scientific software is very specialized, and thus solves a complex problem but can only ever reach a limited pool of developers and users by virtue of being so specialized? The "many eyes on the code" approach to developing and improving open-source software therefore only applies in a limited fashion. An initial publication for a significant new scientific software package is fairly straightforward, but how does one on-board and motivate potential new contributors when they can no longer be lured by a potential co-authorship?
When is spending significant time and effort on reusable scientific open-source development a reasonable choice for young researchers? Purpose-tailored code for a single application, sufficient for a scientific publication, requires significantly less effort than generalising and engineering that code well enough for it to be used by others.

  11. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D

    PubMed Central

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron

    2017-01-01

    Abstract Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. PMID:28814063

  12. The Development of an Open Hardware and Software System Onboard Unmanned Aerial Vehicles to Monitor Concentrated Solar Power Plants

    PubMed Central

    Mesas-Carrascosa, Francisco Javier; Verdú Santano, Daniel; Pérez Porras, Fernando; Meroño-Larriva, José Emilio; García-Ferrer, Alfonso

    2017-01-01

    Concentrated solar power (CSP) plants are increasingly gaining interest as a source of renewable energy. These plants face several technical problems and the inspection of components such as absorber tubes in parabolic trough concentrators (PTC), which are widely deployed, is necessary to guarantee plant efficiency. This article presents a system for real-time industrial inspection of CSP plants using low-cost, open-source components in conjunction with a thermographic sensor and an unmanned aerial vehicle (UAV). The system, available in open-source hardware and software, is designed to be employed independently of the type of device used for inspection (laptop, smartphone, tablet or smartglasses) and its operating system. Several UAV flight missions were programmed as follows: flight altitudes at 20, 40, 60, 80, 100 and 120 m above ground level; and three cruising speeds: 5, 7 and 10 m/s. These settings were chosen and analyzed in order to optimize inspection time. The results indicate that it is possible to perform inspections by a UAV in real time at CSP plants as a means of detecting anomalous absorber tubes and improving the effectiveness of methodologies currently being utilized. Moreover, aside from thermographic sensors, this contribution can be applied to other sensors and can be used in a broad range of applications where real-time georeferenced data visualization is necessary. PMID:28594353

  13. The Development of an Open Hardware and Software System Onboard Unmanned Aerial Vehicles to Monitor Concentrated Solar Power Plants.

    PubMed

    Mesas-Carrascosa, Francisco Javier; Verdú Santano, Daniel; Pérez Porras, Fernando; Meroño-Larriva, José Emilio; García-Ferrer, Alfonso

    2017-06-08

    Concentrated solar power (CSP) plants are increasingly gaining interest as a source of renewable energy. These plants face several technical problems and the inspection of components such as absorber tubes in parabolic trough concentrators (PTC), which are widely deployed, is necessary to guarantee plant efficiency. This article presents a system for real-time industrial inspection of CSP plants using low-cost, open-source components in conjunction with a thermographic sensor and an unmanned aerial vehicle (UAV). The system, available in open-source hardware and software, is designed to be employed independently of the type of device used for inspection (laptop, smartphone, tablet or smartglasses) and its operating system. Several UAV flight missions were programmed as follows: flight altitudes at 20, 40, 60, 80, 100 and 120 m above ground level; and three cruising speeds: 5, 7 and 10 m/s. These settings were chosen and analyzed in order to optimize inspection time. The results indicate that it is possible to perform inspections by a UAV in real time at CSP plants as a means of detecting anomalous absorber tubes and improving the effectiveness of methodologies currently being utilized. Moreover, aside from thermographic sensors, this contribution can be applied to other sensors and can be used in a broad range of applications where real-time georeferenced data visualization is necessary.

  14. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    PubMed

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.

  15. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited number of open-source software packages in China, "Red Flag Linux" stands out most strikingly, commanding a 30 percent share of the Chinese software market. Unlike the spontaneity of the open-source movement in North America, open-source software development in…

  16. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  17. Finding novel relationships with integrated gene-gene association network analysis of Synechocystis sp. PCC 6803 using species-independent text-mining

    PubMed Central

    Kreula, Sanna M.; Kaewphan, Suwisa; Ginter, Filip

    2018-01-01

    The increasing move towards open access full-text scientific literature enhances our ability to utilize advanced text-mining methods to construct information-rich networks that no human will be able to grasp simply from ‘reading the literature’. The utility of text-mining for well-studied species is obvious, though the utility for less studied species, or those with no prior track record at all, is not clear. Here we present a concept for how advanced text-mining can be used to create information-rich networks even for less well studied species and apply it to generate an open-access gene-gene association network resource for Synechocystis sp. PCC 6803, a representative model organism for cyanobacteria and a first case study for the methodology. By merging the text-mining network with networks generated from species-specific experimental data, network integration was used to enhance the accuracy of predicting novel interactions that are biologically relevant. A rule-based algorithm (filter) was constructed in order to automate the search for novel candidate genes with a high degree of likely association to known target genes by (1) ignoring established relationships from the existing literature, as they are already ‘known’, and (2) demanding multiple independent pieces of evidence for every novel and potentially relevant relationship. Using selected case studies, we demonstrate the utility of the network resource and filter to (i) discover novel candidate associations between different genes or proteins in the network, and (ii) rapidly evaluate the potential role of any one particular gene or protein. The full network is provided as an open-source resource. PMID:29844966
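The two-rule filter the abstract describes (drop edges that are already in the literature, keep only edges backed by multiple independent evidence sources) can be sketched generically. The gene identifiers and evidence labels below are hypothetical, not entries from the Synechocystis resource.

```python
# Sketch of the rule-based candidate filter: keep a gene-gene association
# only if (1) it is not already 'known' from the literature and (2) it is
# supported by at least two independent evidence sources.
# Gene names and evidence labels are hypothetical.

def filter_candidates(evidence, known_pairs, min_sources=2):
    novel = {}
    for pair, sources in evidence.items():
        if pair in known_pairs:
            continue                               # rule 1: already 'known'
        if len(set(sources)) >= min_sources:       # rule 2: independent support
            novel[pair] = sorted(set(sources))
    return novel

evidence = {
    ("slr0001", "sll0002"): ["coexpression", "text-mining"],
    ("slr0001", "sll0003"): ["text-mining"],       # only one source: dropped
    ("slr0004", "sll0005"): ["coexpression", "text-mining", "interaction"],
}
known = {("slr0004", "sll0005")}                   # established in literature

print(filter_candidates(evidence, known))
```

The filter thus surfaces only relationships that are both novel and multiply supported, which is what makes the surviving candidates worth manual evaluation.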

  18. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    ERIC Educational Resources Information Center

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  19. SLURM: Simple Linux Utility for Resource Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M; Dunlap, C; Garlick, J

    2002-04-24

    Simple Linux Utility for Resource Management (SLURM) is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for Linux clusters of thousands of nodes. Components include machine status, partition management, job management, and scheduling modules. The design also includes a scalable, general-purpose communication infrastructure. Development will take place in four phases: Phase I results in a solid infrastructure; Phase II produces a functional but limited interactive job initiation capability without use of the interconnect/switch; Phase III provides switch support and documentation; Phase IV provides job status, fault-tolerance, and job queuing and control through Livermore's Distributed Production Control System (DPCS), a meta-batch and resource management system.

  20. Assessing the Implications of Allowing Transgender Personnel to Serve Openly

    DTIC Science & Technology

    2016-01-01

    their utilization estimates. Data from several sources (e.g., Sonier et al., 2013; Gould, 2012) imply an approximate average one-to-one ratio of... et al., 2014). Our analysis focused on the policies of the four countries—Australia, Canada, Israel, and the United Kingdom—with the most well... hormone levels stabilize (Hembree et al., 2009; Elders et al., 2014). To avoid this cost, DoD would need to either permit more flexible monitoring

  1. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that allows users to take full advantage of the new technology by working at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  2. Cyber Defence in the Armed Forces of the Czech Republic

    DTIC Science & Technology

    2010-11-01

    undesirable action backward discovery. This solution is based on special tools using the NetFlow protocol. Active network elements or specialized hardware... probes attached to the backbone network using a tap can be the sources of NetFlow data. The principal advantage of the NetFlow protocol is the fact that it... provides primary data in the open form, which can be easily utilized in subsequent operations. The FlowMon Probe 4000 is mostly used NetFlow

  3. An Open Source Tool for Game Theoretic Health Data De-Identification.

    PubMed

    Prasser, Fabian; Gaupp, James; Wan, Zhiyu; Xia, Weiyi; Vorobeychik, Yevgeniy; Kantarcioglu, Murat; Kuhn, Klaus; Malin, Brad

    2017-01-01

    Biomedical data continues to grow in quantity and quality, creating new opportunities for research and data-driven applications. To realize these activities at scale, data must be shared beyond its initial point of collection. To maintain privacy, healthcare organizations often de-identify data, but they assume worst-case adversaries, inducing high levels of data corruption. Recently, game theory has been proposed to account for the incentives of data publishers and recipients (who attempt to re-identify patients), but this perspective has been more hypothetical than practical. In this paper, we report on a new game theoretic data publication strategy and its integration into the open source software ARX. We evaluate our implementation with an analysis on the relationship between data transformation, utility, and efficiency for over 30,000 demographic records drawn from the U.S. Census Bureau. The results indicate that our implementation is scalable and can be combined with various data privacy risk and quality measures.
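The publisher-adversary reasoning behind game-theoretic publication can be illustrated as a toy payoff calculation: the adversary attacks a record only when expected gain exceeds the attack cost, and the publisher picks the generalization level that maximizes retained data utility minus the expected re-identification penalty. All payoffs and probabilities below are invented; this is not ARX's actual model.

```python
# Toy Stackelberg-style sketch of game-theoretic data publication:
# the publisher chooses a generalization level first; the rational
# adversary then re-identifies only if it pays off.
# All payoffs and probabilities are invented for illustration.

def adversary_attacks(p_success, gain, cost):
    """A rational adversary attacks only when the expected gain beats the cost."""
    return p_success * gain > cost

def publisher_payoff(level, levels):
    utility, p_success = levels[level]
    attacked = adversary_attacks(p_success, gain=300.0, cost=50.0)
    penalty = 100.0 if attacked else 0.0          # loss per successful re-id
    return utility - p_success * penalty

# generalization level -> (data utility retained, adversary success prob.)
levels = {"none": (100.0, 0.9), "partial": (80.0, 0.2), "full": (40.0, 0.01)}

best = max(levels, key=lambda lv: publisher_payoff(lv, levels))
print(best)
```

Note the key insight of the game-theoretic approach: the best publication strategy here is not the worst-case one ("full" generalization) but an intermediate level, because a rational adversary's incentives are priced in.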

  4. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    PubMed

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).
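The csv-driven workflow described, in which experiments are defined by editing parameter files rather than code, can be sketched with the standard library. The column names and values below are hypothetical and are not the toolbox's actual parameter schema (which is MATLAB/PTB-3 based).

```python
# Sketch of a csv-driven experiment definition in the spirit of the CFS
# toolbox workflow. Column names and values are hypothetical, not the
# toolbox's actual schema.
import csv
import io

param_csv = """trial,target_eye,target_contrast,mask_hz,duration_s
1,left,0.05,10,5
2,right,0.10,10,5
"""

trials = [
    {**row,
     "target_contrast": float(row["target_contrast"]),
     "mask_hz": int(row["mask_hz"]),
     "duration_s": float(row["duration_s"])}
    for row in csv.DictReader(io.StringIO(param_csv))
]

for t in trials:
    # a real experiment would flash the mask to one eye at mask_hz while
    # showing the static target to the other eye for duration_s seconds
    print(t["trial"], t["target_eye"], t["target_contrast"])
```

Keeping the parameters in a flat file is what lets a researcher with little programming experience vary stimulus types, locations, and noise characteristics without touching the experiment code.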

  5. Dynamic online surveys and experiments with the free open-source software dynQuest.

    PubMed

    Rademacher, Jens D M; Lippke, Sonia

    2007-08-01

    With computers and the World Wide Web widely available, collecting data through Web browsers is an attractive method utilized by the social sciences. In this article, conducting PC- and Web-based trials with the software package dynQuest is described. The software manages dynamic questionnaire-based trials over the Internet or on single computers, possibly as randomized control trials (RCT), if two or more groups are involved. The choice of follow-up questions can depend on previous responses, as needed for matched interventions. Data are collected in a simple text-based database that can be imported easily into other programs for postprocessing and statistical analysis. The software consists of platform-independent scripts written in the programming language PERL that use the common gateway interface between Web browser and server for submission of data through HTML forms. Advantages of dynQuest are parsimony, simplicity in use and installation, transparency, and reliability. The program is available as open-source freeware from the authors.
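The response-dependent branching that dynQuest implements (the choice of follow-up questions depends on previous responses) can be sketched as a transition table; dynQuest itself is PERL/CGI, and the question IDs and branch rules below are hypothetical.

```python
# Sketch of response-dependent questionnaire branching, in the style the
# abstract describes. Question IDs and branching rules are hypothetical.

QUESTIONS = {
    "q1": ("Do you exercise regularly?", {"yes": "q2", "no": "q3"}),
    "q2": ("How many sessions per week?", {"*": "end"}),
    "q3": ("What prevents you from exercising?", {"*": "end"}),
}

def next_question(current, answer):
    """Pick the follow-up question based on the previous response."""
    branches = QUESTIONS[current][1]
    return branches.get(answer, branches.get("*"))  # "*" is the default branch

path = ["q1"]
for answer in ["no", "time"]:          # simulated respondent answers
    nxt = next_question(path[-1], answer)
    if nxt == "end":
        break
    path.append(nxt)

print(path)
```

This is the structure needed for matched interventions: two respondents answering differently traverse different question paths, while their answers land in the same flat response log for later analysis.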

  6. The Digital Slide Archive: A Software Platform for Management, Integration, and Analysis of Histology for Cancer Research.

    PubMed

    Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D

    2017-11-01

    Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive and high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators with the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 AACR . ©2017 American Association for Cancer Research.

  7. National Utility Rate Database: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ong, S.; McKeel, R.

    2012-08-01

    When modeling solar energy technologies and other distributed energy systems, access to high-quality, expansive electricity rate data is essential. The National Renewable Energy Laboratory (NREL) developed a utility rate platform for entering, storing, updating, and accessing a large collection of utility rates from around the United States. This utility rate platform lives on the Open Energy Information (OpenEI) website, OpenEI.org, allowing the data to be accessed programmatically from a web browser using an application programming interface (API). The semantic-based utility rate platform currently holds records of 1,885 utility rates and covers over 85% of the electricity consumption in the United States.
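Programmatic access through such an API amounts to building a query URL with the desired filters. The endpoint and parameter names below are assumptions for illustration only and should be checked against OpenEI's current API documentation before use.

```python
# Build a query URL for a utility-rate REST API in the style of OpenEI's
# interface. The endpoint and parameter names are assumptions for
# illustration; consult the live API documentation before relying on them.
from urllib.parse import urlencode

BASE = "https://api.openei.org/utility_rates"   # assumed endpoint

def rate_query(api_key, **filters):
    """Assemble a JSON-format rate query with the caller's filters."""
    params = {"version": "latest", "format": "json", "api_key": api_key}
    params.update(filters)
    return f"{BASE}?{urlencode(params)}"

url = rate_query("DEMO_KEY", ratesforutility="Example Electric Co")
print(url)
```

A modeling tool would fetch this URL and parse the JSON rate structure (tiers, demand charges, time-of-use periods) into its cost model.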

  8. The successes and challenges of open-source biopharmaceutical innovation.

    PubMed

    Allarakhia, Minna

    2014-05-01

    Increasingly, open-source-based alliances seek to provide broad access to data, research-based tools, preclinical samples and downstream compounds. The challenge is how to create value from open-source biopharmaceutical innovation. This value creation may occur via transparency and usage of data across the biopharmaceutical value chain as stakeholders move dynamically between open source and open innovation. In this article, several examples are used to trace the evolution of biopharmaceutical open-source initiatives. The article specifically discusses the technological challenges associated with the integration and standardization of big data; the human capacity development challenges associated with skill development around big data usage; and the data-material access challenge associated with data and material access and usage rights, particularly as the boundary between open source and open innovation becomes more fluid. It is the author's opinion that assessing when and how value creation will occur through open-source biopharmaceutical innovation is paramount. The key is to determine the metrics of value creation and the necessary technological, educational and legal frameworks to support the downstream outcomes of today's big data-based open-source initiatives. Continued focus on early-stage value creation alone is not advisable; instead, stakeholders should transform open-source initiatives into open-source discovery, crowdsourcing and open product development partnerships on the same platform.

  9. Teaching Introductory Astronomy "Open and Out" & Looking Forward to the 2017 Solar Eclipse

    NASA Astrophysics Data System (ADS)

    Chu, I.-Wen Mike; Cronkhite, Jeff

    2016-01-01

    We present a new effort in teaching introductory astronomy that addresses the specific challenges facing small colleges, including limited resources, changing generational behavior, and new technological trends. The approach adopts open-source solutions in the development of learning materials, aiming for standardization and wide-scale applicability. In addition, we incorporate events and resources outside the classroom into the learning. Examples of this development include laboratory exercises based on the planetarium software Stellarium and remediation exercises using Khan Academy instructional videos. As the eventual goal is to move toward greater autonomy, the cycles of improvement necessarily require student feedback in an entirely different instructional style based on egalitarian dialogues. We highlight a laboratory exercise on Earth-Moon distance estimation using parallax during the upcoming 2017 solar eclipse to illustrate the "open and out" philosophy. Achievements, limitations, and some diagnostics of the current effort are also presented.
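    The parallax estimate highlighted above can be sketched numerically with the small-angle approximation; the baseline and angular shift below are invented illustrative values, not measurements from the course.

```python
import math

def parallax_distance_km(baseline_km: float, parallax_deg: float) -> float:
    """Small-angle parallax: distance ~ baseline / parallax (in radians).
    Two observers a known baseline apart measure the Moon's apparent
    angular shift against a distant background (here, the eclipsed Sun)."""
    return baseline_km / math.radians(parallax_deg)

# Invented example: observers ~3000 km apart see a ~0.447 degree shift,
# which recovers a distance near the Moon's actual ~384,400 km.
d = parallax_distance_km(3000.0, 0.447)
print(f"Estimated Earth-Moon distance: {d:,.0f} km")
```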

  10. Vacuum vapor deposition

    NASA Technical Reports Server (NTRS)

    Poorman, Richard M. (Inventor); Weeks, Jack L. (Inventor)

    1995-01-01

    A method and apparatus are described for vapor deposition of a thin metallic film utilizing an ionized gas arc directed onto a source material spaced from a substrate to be coated in a substantial vacuum while providing a pressure differential between the source and the substrate so that, as a portion of the source is vaporized, the vapors are carried to the substrate. The apparatus includes a modified tungsten arc welding torch having a hollow electrode through which a gas, preferably inert, flows and an arc is struck between the electrode and the source. The torch, source, and substrate are confined within a chamber within which a vacuum is drawn. When the arc is struck, a portion of the source is vaporized and the vapors flow rapidly toward the substrate. A reflecting shield is positioned about the torch above the electrode and the source to ensure that the arc is struck between the electrode and the source at startup. The electrode and the source may be confined within a vapor guide housing having a duct opening toward the substrate for directing the vapors onto the substrate.

  11. Emission reductions from woody biomass waste for energy as an alternative to open burning.

    PubMed

    Springsteen, Bruce; Christofk, Tom; Eubanks, Steve; Mason, Tad; Clavin, Chris; Storey, Brett

    2011-01-01

    Woody biomass waste is generated throughout California from forest management, hazardous fuel reduction, and agricultural operations. Open pile burning in the vicinity of generation is frequently the only economic disposal option. A framework is developed to quantify air emissions reductions for projects that alternatively utilize biomass waste as fuel for energy production. A demonstration project was conducted involving the grinding and 97-km one-way transport of 6096 bone-dry metric tons (BDT) of mixed conifer forest slash in the Sierra Nevada foothills for use as fuel in a biomass power cogeneration facility. Compared with the traditional open pile burning method of disposal for the forest harvest slash, utilization of the slash for fuel reduced particulate matter (PM) emissions by 98% (6 kg PM/BDT biomass), nitrogen oxides (NOx) by 54% (1.6 kg NOx/BDT), nonmethane volatile organics (NMOCs) by 99% (4.7 kg NMOCs/BDT), carbon monoxide (CO) by 97% (58 kg CO/BDT), and carbon dioxide equivalents (CO2e) by 17% (0.38 t CO2e/BDT). Emission contributions from biomass processing and transport operations are negligible. CO2e benefits are dependent on the emission characteristics of the displaced marginal electricity supply. Monetization of emissions reductions will assist with fuel sourcing activities and the conduct of biomass energy projects.
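    As a back-of-the-envelope check, the per-BDT reduction factors reported above can be scaled by the 6096 BDT of slash handled in the demonstration project. This is a reader's arithmetic sketch using the abstract's figures, not a calculation from the paper itself.

```python
# Scale the reported per-BDT emission reductions to the 6096 BDT
# demonstration project; totals are simple products.
BDT = 6096  # bone-dry metric tons of forest slash utilized as fuel

reduction_per_bdt = {   # kg avoided per BDT, except CO2e in tonnes
    "PM": 6.0,
    "NOx": 1.6,
    "NMOC": 4.7,
    "CO": 58.0,
    "CO2e_t": 0.38,
}

totals = {pollutant: factor * BDT for pollutant, factor in reduction_per_bdt.items()}
for pollutant, amount in totals.items():
    unit = "t" if pollutant == "CO2e_t" else "kg"
    print(f"{pollutant}: {amount:,.0f} {unit} avoided")
```

    For example, the PM factor alone implies roughly 36.6 metric tons of particulate emissions avoided by this single project.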

  12. Open for Business

    ERIC Educational Resources Information Center

    Voyles, Bennett

    2007-01-01

    People know about the Sakai Project (open source course management system); they may even know about Kuali (open source financials). So, what is the next wave in open source software? This article discusses business intelligence (BI) systems. Though open source BI may still be only a rumor in most campus IT departments, some brave early adopters…

  13. ADMS State of the Industry and Gap Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agalgaonkar, Yashodhan P.; Marinovici, Maria C.; Vadari, Subramanian V.

    2016-03-31

    An advanced distribution management system (ADMS) is a platform for optimized distribution system operational management. The platform comprises distribution management system (DMS) applications, supervisory control and data acquisition (SCADA), an outage management system (OMS), and a distributed energy resource management system (DERMS). One of the primary objectives of this work is to study and analyze several ADMS component and auxiliary systems. All of the important component and auxiliary systems (SCADA, GISs, DMSs, AMRs/AMIs, OMSs, and DERMS) are discussed in this report: their current-generation technologies are analyzed, and their integration (or evolution) with an ADMS technology is discussed. An ADMS technology state-of-the-art and gap analysis is also presented. Two technical gaps are observed. The integration challenge between the component operational systems is the single largest challenge for ADMS design and deployment. Another significant challenge concerns essential ADMS applications such as fault location, isolation, and service restoration (FLISR) and volt-var optimization (VVO): there are relatively few ADMS application developers because the ADMS software platform is not open source. A third critical gap, while not technical in nature compared with the two above, is still important to consider. The data models currently residing in utility GIS systems are incomplete, inaccurate, or both. These data are essential for planning and operations because they are typically one of the primary sources from which power system models are created. To achieve the full potential of ADMS, the ability to execute an accurate power flow solution is an important prerequisite. These critical gaps are hindering wider utility adoption of ADMS technology. The development of an open architecture platform can eliminate many of these barriers and also aid seamless integration of distribution utility legacy systems with an ADMS.

  14. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. 
The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
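    The SUV normalization step mentioned in the Methods above can be sketched with the standard body-weight formula (SUVbw). This is a generic illustration of the calculation, not code from the study's toolkit, and the numeric inputs are invented.

```python
def suv_bw(activity_conc_bq_ml: float,
           injected_dose_bq: float,
           body_weight_kg: float,
           minutes_since_injection: float,
           half_life_min: float = 109.77) -> float:
    """Body-weight SUV: tissue activity concentration divided by the
    decay-corrected injected dose per gram of body weight.
    Half-life defaults to F-18; unit tissue density (1 g/mL) is assumed."""
    # Decay-correct the injected dose to the scan time.
    dose_at_scan = injected_dose_bq * 2.0 ** (-minutes_since_injection / half_life_min)
    return activity_conc_bq_ml * (body_weight_kg * 1000.0) / dose_at_scan

# Invented example: 10 kBq/mL uptake, 370 MBq injected, 70 kg patient,
# scanned 60 minutes after injection.
suv = suv_bw(10_000.0, 370e6, 70.0, minutes_since_injection=60.0)
print(f"SUVbw = {suv:.2f}")
```

    In practice the activity concentration, injection time, dose, and patient weight would all be read from the PET DICOM headers rather than hard-coded.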

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Assessing the impact of energy efficiency technologies at a district or city scale is of great interest to local governments, real estate developers, utility companies, and policymakers. This paper describes a flexible framework that can be used to create and run district and city scale building energy simulations. The framework is built around the new OpenStudio City Database (CityDB). Building footprints, building height, building type, and other data can be imported from public records or other sources. Missing data can be inferred or assigned from a statistical sampling of other datasets. Once all required data is available, OpenStudio Measures are used to create starting point energy models and to model energy efficiency measures for each building. Together this framework allows a user to pose several scenarios such as 'what if 30% of the commercial retail buildings added rooftop solar' or 'what if all elementary schools converted to ground source heat pumps' and then visualize the impacts at a district or city scale. This paper focuses on modeling existing building stock using public records. However, the framework is capable of supporting the evaluation of new construction, district systems, and the use of proprietary data sources.

  16. Utilizing public health clinics for service-learning rotations in dental hygiene: a four-year retrospective study.

    PubMed

    Aston-Brown, Roberta E; Branson, Bonnie; Gadbury-Amyot, Cynthia C; Bray, Kimberly Krust

    2009-03-01

    National reports outlining disparities in oral health care in the United States have focused attention on ways to encourage health care providers to become more involved in the public health arena. Utilization of service-learning in professional health education programs is one method being explored. The purpose of this study was to conduct a retrospective review of a service-learning rotation within a dental hygiene public health course. The study utilized data sources generated by students as part of a course evaluation. These sources included student journals (qualitative/quantitative) and Likert-scaled (quantitative) and open-ended (qualitative) student satisfaction survey items. Mixed methodology data analysis techniques were used to analyze and triangulate data in order to form conclusions related to the effectiveness of service-learning as a teaching strategy in dental hygiene. This investigation suggests that service-learning is an effective learning strategy for increasing student awareness of underserved populations, cultural diversity, and ethical patient care. The study also suggests that service-learning helped students to determine their level of interest in public health as a career choice by giving them a real-world experience in public health patient care.

  17. The Impact of Thoracoscopic Surgery on Payment and Health Care Utilization After Lung Resection.

    PubMed

    Watson, Thomas J; Qiu, Jiejing

    2016-04-01

    Lung resection by video-assisted thoracoscopic surgery (VATS) is associated with multiple clinical benefits compared with resection by thoracotomy (OPEN). Less is known about reimbursements, costs, and resource use with each approach. This study used a commercial insurance claims database to examine differences between VATS and OPEN lung resections in payment, health care utilization, and estimated days off work for health care visits. All adult inpatient discharges for patients undergoing VATS or OPEN lung resection in 2010 were identified from the Truven MarketScan Database (Ann Arbor, MI). A total of 2,611 patients underwent lobectomy (VATS, 270; OPEN, 669) or wedge resection (VATS, 1,332; OPEN, 340). After adjustment, OPEN lobectomies had a longer length of stay (mean difference, 1.79 days) and higher payments to hospitals (mean difference, $3,497) and physicians (mean difference, $433) compared with VATS. Similar findings were noted after wedge resections. OPEN lobectomies had 1.28 times and 1.14 times more health care utilization days within 90 days and 365 days, respectively, after the operation compared with VATS, translating into increased expenditures of $3,260 at 90 days and $822 at 365 days for OPEN procedures. No significant differences in utilization were noted between OPEN and VATS wedge resections, except for fewer outpatient visits within 90 days in the OPEN group. Compared with an OPEN approach, lobectomy and wedge resection by VATS were associated with lower hospital and physician payments. In addition, lobectomy by VATS was associated with less health care utilization in the early postoperative period and during the first year after the operation. These payment and utilization reductions are important in an era of value-based purchasing in health care. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Development of fire test methods for airplane interior materials

    NASA Technical Reports Server (NTRS)

    Tustin, E. A.

    1978-01-01

    Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely-controlled large scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplused 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large scale and laboratory scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke and gaseous toxicant evolution must be considered.

  19. Alternative financing sources. ECRI. Emergency Care Research Institute.

    PubMed

    1987-01-01

    A number of new capital sources have been developed and used by health care institutions unable to finance high-tech projects with equity or conventional tax-exempt debt instruments; these include REITs, MLPs, per-use rentals, venture capital, and banks as brokers. However, there are no magic capital acquisition solutions. Institutions with good credit will continue to find a number of doors open to them; poorer credit risks will have fewer options, and those available will carry greater risk, allow for less provider control over projects, and limit potential return on investment to some extent. It is essential to examine carefully the drawbacks inherent in each type of alternative financing source. Venture capital in particular requires specific analysis because of the wide variety of possible arrangements that exist. If you cannot find either traditional or alternative sources of funding for a proposed project, you should reexamine the project and its underlying utilization projections and reimbursement assumptions.

  20. Altered [99mTc]Tc-MDP biodistribution from neutron activation sourced 99Mo.

    PubMed

    Demeter, Sandor; Szweda, Roman; Patterson, Judy; Grigoryan, Marine

    2018-01-01

    Given potential worldwide shortages of fission-sourced 99Mo/99mTc medical isotopes, there is increasing interest in alternate production strategies. A neutron-activated 99Mo source was utilized in a single-center phase III open-label study comparing 99mTc, as 99mTc methylene diphosphonate ([99mTc]Tc-MDP), obtained from solvent generator separation of neutron-activation-produced 99Mo, versus nuclear-reactor-produced 99Mo (e.g., fission sourced) in oncology patients for whom an [99mTc]Tc-MDP bone scan would normally have been indicated. Despite the investigational [99mTc]Tc-MDP passing all standard, and above standard of care, quality assurance tests, which would normally be sufficient to allow human administration, there was altered biodistribution which could lead to erroneous clinical interpretation. The cause of the altered biodistribution remains unknown and requires further research.

  1. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructures (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed on desktops, laptops, or computing clusters to effectively leverage national and regional scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 a project-based learning course, 'Applied Cyberinfrastructure Concepts' (ISTA 420/520) at the University of Arizona, focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m-2 yr-1) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course's 25 students, with varying levels of computational skill and no prior domain background in the geosciences, collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS and SAGA and the Makeflow and Work Queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code.
The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over Spring 2015. All of the code, documentation and workflow description are currently available on GitHub and a public data portal is in development. We present a case study of how students reacted to the challenge of a real science problem, their interactions with end-users, what went right, and what could be done better in the future.
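    Solar insolation is named above as one of the broadly applicable inputs to EEMT. A minimal sketch of daily top-of-atmosphere insolation, using a standard textbook formula rather than the course's actual GDAL/GRASS workflow, looks like:

```python
import math

SOLAR_CONSTANT = 1367.0  # W m^-2

def toa_daily_insolation(lat_deg: float, day_of_year: int) -> float:
    """Daily extraterrestrial insolation on a horizontal surface,
    in MJ m^-2 day^-1. Atmosphere and terrain shading are ignored."""
    phi = math.radians(lat_deg)
    # Solar declination (Cooper's approximation).
    delta = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    # Sunset hour angle; clamp handles polar day/night.
    cos_ws = max(-1.0, min(1.0, -math.tan(phi) * math.tan(delta)))
    ws = math.acos(cos_ws)
    # Earth-sun distance (eccentricity) correction.
    e0 = 1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365)
    h0 = (24 * 3600 / math.pi) * SOLAR_CONSTANT * e0 * (
        math.cos(phi) * math.cos(delta) * math.sin(ws)
        + ws * math.sin(phi) * math.sin(delta))
    return h0 / 1e6  # J -> MJ

# Tucson, AZ (~32.2 N) at the June solstice (day 172).
print(f"{toa_daily_insolation(32.2, 172):.1f} MJ m^-2 day^-1")
```

    The course workflow would evaluate terms like this per DEM cell, with topographic corrections, across the full raster rather than for a single point.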

  2. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  3. Development of a Model for the Representation of Nanotechnology-Specific Terminology

    PubMed Central

    Bailey, LeeAnn O.; Kennedy, Christopher H.; Fritts, Martin J.; Hartel, Francis W.

    2006-01-01

    Nanotechnology is an important, rapidly-evolving, multidisciplinary field [1]. The tremendous growth in this area necessitates the establishment of a common, open-source terminology to support the diverse biomedical applications of nanotechnology. Currently, the consensus process to define and categorize conceptual entities pertaining to nanotechnology is in a rudimentary stage. We have constructed a nanotechnology-specific conceptual hierarchy that can be utilized by end users to retrieve accurate, controlled terminology regarding emerging nanotechnology and corresponding clinical applications. PMID:17238469

  4. A method of exploration of the atmosphere of Titan. [hot air balloon heated by solar radiation or planetary thermal flux

    NASA Technical Reports Server (NTRS)

    Blamont, J.

    1978-01-01

    A hot-air balloon, with the air heated by natural sources, is described. Buoyancy is accomplished by either solar heating or by utilizing the IR thermal flux of the planet to heat the gas in the balloon. Altitude control is provided by a valve which is opened and closed by a barometer. The balloon is made of an organic material which has to absorb radiant energy and to emit as little as possible.

  5. Healthy Harlem: empowering health consumers through social networking, tailoring and web 2.0 technologies.

    PubMed

    Khan, Sharib A; McFarlane, Delano J; Li, Jianhua; Ancker, Jessica S; Hutchinson, Carly; Cohall, Alwyn; Kukafka, Rita

    2007-10-11

    Consumer health informatics has emerged as a strategy to inform and empower patients for self management of their health. The emergence of and explosion in use of user-generated online media (e.g., blogs) has created new opportunities to inform and educate people about healthy living. Under a prevention research project, we are developing a website that utilizes social content collaboration mediums in conjunction with open-source technologies to create a community-driven resource that provides users with tailored health information.

  6. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  7. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  8. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  9. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists that can disseminate messages after event evaluation in real time. This research formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data under a popular usage platform. Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms utilizing the sensor web and spatial information, achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, nowadays, the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized, real-time, accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.
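    One way the OGC interfaces mentioned above are commonly exercised is through the Sensor Observation Service (SOS) key-value-pair binding. The sketch below assembles a GetObservation request URL; the host and offering identifiers are invented placeholders, not endpoints from the SENSDI project.

```python
from urllib.parse import urlencode

def sos_get_observation(host: str, offering: str, observed_property: str) -> str:
    """Build an illustrative SOS 2.0 KVP GetObservation request URL."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return f"https://{host}/sos?" + urlencode(params)

# Placeholder host and identifiers for illustration only.
url = sos_get_observation("sensors.example.org",
                          "urn:city:offering:air",
                          "urn:city:property:pm25")
print(url)
```

    A smart-city client would issue this request and parse the returned observations before handing them to the SDI layer for mapping and analysis.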

  10. Observability of ionospheric space-time structure with ISR: A simulation study

    NASA Astrophysics Data System (ADS)

    Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.

    2017-02-01

    The sources of error from electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator include statistical uncertainty, which arises due to nature of the measurement mechanism and the inherent space-time ambiguity from the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA class ISR systems.

  11. An IoT-Based Solution for Monitoring a Fleet of Educational Buildings Focusing on Energy Efficiency.

    PubMed

    Amaxilatis, Dimitrios; Akrivopoulos, Orestis; Mylonas, Georgios; Chatzigiannakis, Ioannis

    2017-10-10

    Raising awareness among young people and changing their behaviour and habits concerning energy usage is key to achieving sustained energy savings. Additionally, young people are very sensitive to environmental protection, so raising awareness among children is much easier than with any other group of citizens. This work examines ways to create an innovative Information & Communication Technologies (ICT) ecosystem (including web-based, mobile, social and sensing elements) tailored specifically for school environments, taking into account both the users (faculty, staff, students, parents) and the school buildings, thus motivating and supporting young citizens' behavioural change to achieve greater energy efficiency. A mixture of open-source IoT hardware and proprietary platforms at the infrastructure level is currently being utilized to monitor a fleet of 18 educational buildings across 3 countries, comprising over 700 IoT monitoring points. Presented here are the system's high-level architecture and several aspects of its implementation, related to the application domain of educational building monitoring and energy efficiency. The system is built on open-source technologies and services so that it can provide an open IT infrastructure with support for different commercial hardware/sensor vendors as well as open-source solutions. It can be used to develop and offer new app-based solutions, either for educational purposes or for managing the energy efficiency of the buildings. The system is replicable and adaptable to settings different from the scenarios envisioned here (e.g., different climate zones) and to different IT infrastructures, and it can easily be extended to integrate with other systems. Its overall performance is evaluated in a real-world environment in terms of scalability, responsiveness and simplicity.
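    The fleet-level monitoring described above can be sketched at its simplest as an aggregation over per-sensor readings; the building names and values below are invented for illustration.

    ```python
    from collections import defaultdict
    from statistics import mean

    # Invented (building, timestamp, watts) readings from IoT monitoring points.
    readings = [
        ("school_a", "2017-10-01T10:00", 1200.0),
        ("school_a", "2017-10-01T10:05", 1150.0),
        ("school_b", "2017-10-01T10:00", 800.0),
        ("school_b", "2017-10-01T10:05", 860.0),
    ]

    def average_power_by_building(rows):
        """Group (building, timestamp, watts) rows and average the watts."""
        grouped = defaultdict(list)
        for building, _ts, watts in rows:
            grouped[building].append(watts)
        return {b: mean(v) for b, v in grouped.items()}

    print(average_power_by_building(readings))
    # → {'school_a': 1175.0, 'school_b': 830.0}
    ```

    A real deployment would layer per-room breakdowns, outlier detection, and time-windowed queries on top of this kind of grouping, but the per-building summary is the unit the energy-efficiency apps consume.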

  12. Science in the Wild: Adventure Citizen Science in the Arctic and Himalaya

    NASA Astrophysics Data System (ADS)

    Horodyskyj, U. N.; Rufat-Latre, J.; Reimuller, J. D.; Rowe, P.; Pothier, B.; Thapa, A.

    2016-12-01

    Science in the Wild provides educational hands-on adventure science expeditions for the everyday person, blending athletics and academics in remote regions of the planet. Participants receive training on field data collection techniques so that they can help scientists in the field while on expedition with them. At SITW, we also involve our participants in analyzing and interpreting the data, thus teaching them about data quality and sources of error and uncertainty. SITW teaches citizens the art of science storytelling, aims to make science more open and transparent, and utilizes open-source software and hardware in its projects. Open science serves both the research community and the greater public. For the former, it makes science reproducible, transparent and more impactful by mobilizing multidisciplinary and international collaborative research efforts. For the latter, it minimizes mistrust in the sciences by allowing the public a `behind-the-scenes' look into how scientific research is conducted, raw and unfiltered. We present results from a citizen-science expedition to Baffin Island (Canadian Arctic), whose team skied from the Penny Ice Cap down the 25-mile length of Coronation Glacier and back to the small Arctic town of Qikiqtarjuaq, sampling snow for dust and black carbon concentrations along the route. From a May/June 2016 citizen-science expedition to Nepal (Himalaya), we present results comparing 2014/16 depth and lake-floor compositional data from supraglacial lakes on Ngozumpa Glacier, collected using open-source surface and underwater robotics. The Sherpa-Scientist Initiative, a program aimed at empowering locals in data collection and interpretation, trained half a dozen Sherpas during this expedition and demonstrates the value of local engagement. In future expeditions to the region, efforts will be made to scale up the number of trainees and expand our spatial reach in the Himalaya.

  13. An IoT-Based Solution for Monitoring a Fleet of Educational Buildings Focusing on Energy Efficiency

    PubMed Central

    Akrivopoulos, Orestis

    2017-01-01

    Raising awareness among young people and changing their behaviour and habits concerning energy usage is key to achieving sustained energy savings. Additionally, young people are very sensitive to environmental protection, so raising awareness among children is much easier than with any other group of citizens. This work examines ways to create an innovative Information & Communication Technologies (ICT) ecosystem (including web-based, mobile, social and sensing elements) tailored specifically for school environments, taking into account both the users (faculty, staff, students, parents) and the school buildings, thus motivating and supporting young citizens’ behavioural change to achieve greater energy efficiency. A mixture of open-source IoT hardware and proprietary platforms at the infrastructure level is currently being utilized to monitor a fleet of 18 educational buildings across 3 countries, comprising over 700 IoT monitoring points. Presented here are the system’s high-level architecture and several aspects of its implementation, related to the application domain of educational building monitoring and energy efficiency. The system is built on open-source technologies and services so that it can provide an open IT infrastructure with support for different commercial hardware/sensor vendors as well as open-source solutions. It can be used to develop and offer new app-based solutions, either for educational purposes or for managing the energy efficiency of the buildings. The system is replicable and adaptable to settings different from the scenarios envisioned here (e.g., different climate zones) and to different IT infrastructures, and it can easily be extended to integrate with other systems. Its overall performance is evaluated in a real-world environment in terms of scalability, responsiveness and simplicity. PMID:28994719

  14. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management.

    PubMed

    Neinstein, Aaron; Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-03-01

    Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application ("app"), Blip, to visualize the data. Tidepool's software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool's open source, cloud model for health data interoperability is applicable to other healthcare use cases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
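    The device-agnostic ingestion idea can be sketched as a set of per-vendor adapters mapping onto one common schema; the vendors, field names, and records below are invented for illustration and are not the actual Tidepool data model.

    ```python
    # Each adapter translates one vendor's raw export into a common
    # glucose-reading schema, so apps never touch vendor formats directly.
    def from_vendor_a(rec):
        return {"time": rec["ts"], "mg_dl": rec["glucose"]}

    def from_vendor_b(rec):
        # Vendor B reports mmol/L; convert to mg/dL (1 mmol/L ≈ 18.016 mg/dL).
        return {"time": rec["timestamp"], "mg_dl": round(rec["mmol"] * 18.016, 1)}

    ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

    def normalize(device, record):
        """Route a raw device record through the right adapter."""
        return ADAPTERS[device](record)

    print(normalize("vendor_a", {"ts": "2016-03-01T08:00", "glucose": 110}))
    print(normalize("vendor_b", {"timestamp": "2016-03-01T08:05", "mmol": 6.1}))
    ```

    This is the decoupling the abstract describes: an app written against the common schema works with any device for which an adapter exists, and adding a device means adding one adapter, not one app per device.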

  15. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management

    PubMed Central

    Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-01-01

    Objective Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. Materials and Methods An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Results Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application (“app”), Blip, to visualize the data. Tidepool’s software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. Discussion By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. Conclusion The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool’s open source, cloud model for health data interoperability is applicable to other healthcare use cases. PMID:26338218

  16. Open-Source Data and the Study of Homicide.

    PubMed

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to serve as a valid and reliable data source for testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period, and the data sets were compared to determine whether open sources could recreate the population of homicides and the variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data, and for every variable measured, the open sources captured as much, or more, of the information presented in the official data. Variables not available in official data, but potentially useful for testing theory, were also identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.
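    The comparison logic of the case study can be sketched as a per-variable coverage check; the variables and counts below are invented for illustration, not the study's data.

    ```python
    # Non-missing value counts per variable, from two hypothetical sources.
    official = {"age": 40, "weapon": 38, "motive": 25}
    open_src = {"age": 40, "weapon": 39, "motive": 31, "gang_related": 22}

    # Variables where open sources capture as much or more information.
    matched_or_better = [v for v in official if open_src.get(v, 0) >= official[v]]

    # Variables available only from open sources.
    extra_variables = sorted(set(open_src) - set(official))

    print(matched_or_better)
    print(extra_variables)
    ```

    The two outputs correspond to the article's two findings: coverage parity on shared variables, plus additional theory-relevant variables unique to open sources.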

  17. Thin client (web browser)-based collaboration for medical imaging and web-enabled data.

    PubMed

    Le, Tuong Huu; Malhi, Nadeem

    2002-01-01

    Utilizing thin-client software and open-source server technology, a collaborative architecture was implemented that allows sharing of Digital Imaging and Communications in Medicine (DICOM) and non-DICOM images with real-time markup. Using the web browser as a thin client integrated with standards-based components such as DHTML (dynamic hypertext markup language), JavaScript, and Java, collaboration was achieved through a web server/proxy server combination utilizing Java Servlets and JavaServer Pages. A typical collaborative session involved a driver, who directed the navigation of the other collaborators (the passengers) and provided collaborative markups of medical and nonmedical images. The majority of processing was performed on the server side, allowing the client to remain thin and more accessible.

  18. EO/IR scene generation open source initiative for real-time hardware-in-the-loop and all-digital simulation

    NASA Astrophysics Data System (ADS)

    Morris, Joseph W.; Lowry, Mac; Boren, Brett; Towers, James B.; Trimble, Darian E.; Bunfield, Dennis H.

    2011-06-01

    The US Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Redstone Test Center (RTC) have formed the Scene Generation Development Center (SGDC) to support the Department of Defense (DoD) open-source EO/IR scene generation initiative for real-time hardware-in-the-loop and all-digital simulation. Various branches of the DoD have invested significant resources in the development of advanced scene and target signature generation codes. The SGDC goal is to maintain unlimited government rights and controlled access to government open-source scene generation and signature codes. In addition, the SGDC provides development support to a multi-service community of test and evaluation (T&E) users, developers, and integrators in a collaborative environment. The SGDC has leveraged the DoD Defense Information Systems Agency (DISA) ProjectForge (https://Project.Forge.mil), which provides a collaborative development and distribution environment for the DoD community. The SGDC will develop and maintain several codes for tactical and strategic simulation, such as the Joint Signature Image Generator (JSIG), the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC), and Office of the Secretary of Defense (OSD) Test and Evaluation Science and Technology (T&E/S&T) thermal modeling and atmospherics packages such as EOView, CHARM, and STAR. Other utility packages include ContinuumCore for real-time messaging and data management and IGStudio for run-time visualization and scenario generation.

  19. A 868MHz-based wireless sensor network for ground truthing of soil moisture for a hyperspectral remote sensing campaign - design and preliminary results

    NASA Astrophysics Data System (ADS)

    Näthe, Paul; Becker, Rolf

    2014-05-01

    Soil moisture and plant-available water are important environmental parameters that affect plant growth and crop yield, and they are therefore significant for vegetation monitoring and precision agriculture. Ground-based soil moisture measurements are, however, necessary to validate airborne hyperspectral retrievals of soil moisture, plant canopy temperature, soil temperature and soil roughness in a corresponding hyperspectral imaging campaign, part of the INTERREG IV A project SMART INSPECTORS. Commercially available sensors for matric potential, plant-available water and volumetric water content are utilized for automated measurements with smart sensor nodes developed on the basis of open-source 868 MHz radio modules, featuring a full-scale microcontroller unit that allows autonomous battery-powered operation of the sensor nodes in the field. The data generated by each sensor node are transferred wirelessly with an open-source protocol to a central node, the so-called "gateway". The gateway collects, interprets and buffers the sensor readings and, eventually, pushes the data time series to a server-based database. The entire data-processing chain, from the sensor reading to the final storage of time series on a server, is realized with open-source hardware and software in such a way that the recorded data can be accessed from anywhere through the internet. We will present how this open-source wireless sensor network is developed and specified for the application of ground truthing, and point out the system's perspectives and potential with respect to usability and applicability for vegetation monitoring and precision agriculture. Regarding the corresponding hyperspectral imaging campaign, results from ground measurements will be discussed in terms of their contribution to the remote sensing system, and the significance of the wireless sensor network for the application of ground truthing will be assessed.
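    The gateway's collect-buffer-push role described above can be sketched as follows, with an invented packet format standing in for the radio protocol and a plain list standing in for the server-side database.

    ```python
    import json

    buffer, database = [], []

    def on_packet(raw):
        """Parse one 'node_id;unix_time;volumetric_wc' packet and buffer it."""
        node_id, ts, vwc = raw.split(";")
        buffer.append({"node": node_id, "t": int(ts), "vwc": float(vwc)})

    def flush():
        """Push all buffered readings to the database and clear the buffer."""
        database.extend(buffer)
        buffer.clear()

    # Simulated radio traffic from two sensor nodes.
    for pkt in ["n1;1400000000;0.23", "n2;1400000010;0.31"]:
        on_packet(pkt)
    flush()
    print(json.dumps(database))
    ```

    Buffering between receive and flush is what lets a gateway ride out intermittent uplink connectivity in the field without dropping readings.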

  20. National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents

    NASA Astrophysics Data System (ADS)

    Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.

    2014-12-01

    The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from more than 60 data providers in all 50 states. Free and open source software is used in this federated system, where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and application discovery and providing the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library-style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and the USGIN site), and user-created tools and scripts. The user-friendly, map-centric web application supports finding, visualizing, mapping, and acquiring data by topic, location, time, provider, or keywords. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g., wells, geologic units). In addition, the interface provides the information necessary for users to access the GIS data from third-party software applications such as Google Earth, uDig, and ArcGIS. A redistributable, free and open source software package called GINstack (the USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS, or any larger data system, through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions, among others.

  1. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  2. QSAR DataBank - an approach for the digital organization and archiving of QSAR model information

    PubMed Central

    2014-01-01

    Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships or Quantitative Structure-Property Relationships produce around one thousand scientific publications annually, and the materials and results are mainly communicated in printed media. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and the large bodies of associated numerical chemical data. They do not support secondary information extraction or reuse, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and the relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive-modeling problems, with examples in the field of predictive toxicology, and the format can be applied to a wide variety of other endpoints. The work is accompanied by an open-source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716
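    The archival idea can be sketched as bundling model metadata and data tables into one self-describing archive; the file layout below is invented for illustration and is NOT the actual QsarDB schema.

    ```python
    import io
    import json
    import zipfile

    def make_archive(model_name, descriptors, activities):
        """Bundle model metadata and data tables into one zip archive."""
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w") as zf:
            zf.writestr("model.json", json.dumps({"name": model_name}))
            zf.writestr("descriptors.json", json.dumps(descriptors))
            zf.writestr("activities.json", json.dumps(activities))
        return buf.getvalue()

    # Invented toy data: one compound "c1" with two descriptor values.
    archive = make_archive("logP-mlr", {"c1": [0.1, 0.4]}, {"c1": [1.2]})

    # A consumer can reopen the archive and enumerate its contents.
    names = zipfile.ZipFile(io.BytesIO(archive)).namelist()
    print(names)
    ```

    The point of such a container is exactly what the abstract argues: a machine-readable bundle supports secondary extraction and reuse in a way printed tables cannot.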

  3. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new; progress, however, has been slower than many had expected. The purpose here is to summarise the Free/Libre Open Source Software (FLOSS) paradigm: what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, the first comprehensive description of the open source ecosystem, set out in three long essays; direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  4. Getting started with Open-Hardware: Development and Control of Microfluidic Devices

    PubMed Central

    da Costa, Eric Tavares; Mora, Maria F.; Willis, Peter A.; do Lago, Claudimir L.; Jiao, Hong; Garcia, Carlos D.

    2014-01-01

    Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article is focused on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microprocessor (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved performing an on-chip acid-base titration, followed by conductivity detection with an open-source low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale. PMID:24823494
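    The valve synchronization described above can be sketched as a rotating open/close pattern that displaces a fixed volume per stroke. The pattern and the hardware-free replay below are illustrative assumptions; only the 1.6 µL/stroke figure comes from the abstract.

    ```python
    # One injection stroke across three PDMS valves: True = valve open.
    # The rotating pattern displaces a fixed volume per cycle.
    STROKE_PATTERN = [
        (True,  False, False),
        (True,  True,  False),
        (False, True,  True),
        (False, False, True),
    ]

    def run_strokes(n_strokes, ul_per_stroke=1.6):
        """Replay the pattern n times; return the state log and pumped volume."""
        log = []
        for _ in range(n_strokes):
            log.extend(STROKE_PATTERN)  # on hardware: set control pins, then wait
        return log, n_strokes * ul_per_stroke

    states, volume = run_strokes(3)
    print(f"pumped ≈ {volume:.1f} µL over {len(states)} valve states")
    ```

    On the actual module, each tuple would be written to the microcontroller's output pins with a dwell time between states; keeping the sequence in one table is what makes synchronizing injection and mixing straightforward.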

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, B.D.; Apel, W.A.; Walton, M.R.

    Conceptually, biofilters are vapor-phase bioreactors that rely on microorganisms in the bed medium to oxidize contaminants in off-gases flowing through the bed to less hazardous compounds. In the most studied and utilized systems, reduced compounds such as fuel hydrocarbons are enzymatically oxidized to compounds such as carbon dioxide and water. In these reactions, the microorganisms in the bed oxidize the contaminant and transfer the electrons to oxygen, which is the terminal electron acceptor in the process. In essence, the contaminant is the carbon and energy source for the microorganisms in the bed medium, and through this catabolic process oxygen is reduced to water. An example of this oxidation process can be seen during the degradation of benzene and similar aromatic compounds. Aromatics are initially attacked by a dioxygenase enzyme, which oxidizes the compounds to a labile dihydrodiol that is spontaneously converted to a catechol. The dihydroxylated aromatic ring is then opened by oxidative "ortho" or "meta" cleavage, yielding cis,cis-muconic acid or 2-hydroxy-cis,cis-muconic semialdehyde, respectively. These organic compounds are further oxidized to carbon dioxide or are assimilated as cellular material. This paper describes the conversion of carbon tetrachloride using methanol as the primary carbon and energy source.

  6. Utilization of organic matter by invertebrates along an estuarine gradient in an intermittently open estuary

    NASA Astrophysics Data System (ADS)

    Lautenschlager, Agnes D.; Matthews, Ty G.; Quinn, Gerry P.

    2014-08-01

    In intermittently open estuaries, the sources of organic matter sustaining benthic invertebrates are likely to vary seasonally, particularly between periods of connection and disconnection with the ocean and higher and lower freshwater flows. This study investigated the contribution of allochthonous and autochthonous primary production to the diet of representative invertebrate species using stable isotope analysis (SIA) during the austral summer and winter (2008, 2009) in an intermittently open estuary on the south-eastern coast of Australia. As the study was conducted towards the end of a prolonged period of drought, a reduced influence of freshwater/terrestrial organic matter was expected. Sampling was conducted along an estuarine gradient, including upper, middle and lower reaches and showed that the majority of assimilated organic matter was derived from autochthonous estuarine food sources. Additionally, there was an input of allochthonous organic matter, which varied along the length of the estuary, indicated by distinct longitudinal trends in carbon and nitrogen stable isotope signatures along the estuarine gradient. Marine seaweed contributed to invertebrate diets in the lower reaches of the estuary, while freshwater/terrestrial organic matter had increased influence in the upper reaches. Suspension-feeding invertebrates derived large parts of their diet from freshwater/terrestrial material, despite flows being greatly reduced in comparison with non-drought years.

  7. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
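    A sketch of the Web Map Service request underlying such a tool: composing a WMS 1.1.1 GetMap URL for a forecast layer. The endpoint and layer name are invented placeholders, not the Mission Support System's own.

    ```python
    from urllib.parse import urlencode

    # Hypothetical WMS endpoint serving forecast fields.
    WMS_ENDPOINT = "https://example.org/wms"

    def wms_get_map_url(layer, bbox, time, width=800, height=600):
        """Build a GetMap request for one forecast layer and valid time."""
        params = {
            "service": "WMS",
            "version": "1.1.1",
            "request": "GetMap",
            "layers": layer,
            "srs": "EPSG:4326",
            "bbox": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
            "time": time,
            "width": width,
            "height": height,
            "format": "image/png",
        }
        return WMS_ENDPOINT + "?" + urlencode(params)

    url = wms_get_map_url("temp_850hPa", (-30, 30, 40, 70), "2013-04-10T12:00:00Z")
    print(url)
    ```

    Building forecast products as standard GetMap layers is what lets any WMS-capable client, not just the campaign tool, consume the same imagery.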

  8. Open Source Software Development

    DTIC Science & Technology

    2011-01-01

    Software, 2002, 149(1), 3-17. 3. DiBona, C., Cooper, D., and Stone, M. (Eds.), Open Sources 2.0, 2005, O'Reilly Media, Sebastopol, CA. Also see C. DiBona, S. Ockman, and M. Stone (Eds.), Open Sources: Voices from the Open Source Revolution, 1999, O'Reilly Media, Sebastopol, CA. 4. Ducheneaut, N

  9. Air pollution effects due to deregulation of the electric industry

    NASA Astrophysics Data System (ADS)

    Davoodi, Khojasteh Riaz

    The Energy Policy Act of 1992 introduced the concept of open access into the electric utility industry, allowing privately-owned utilities to transmit power produced by non-utility generators and independent power producers (IPPs). In April 1996, the Federal Energy Regulatory Commission (FERC) laid down the final rules (Orders No. 888 and No. 889), which required utilities to open their transmission lines to any power producer and to charge them no more than what they pay for the use of their own lines. These rules set the stage for the retail sale of electricity to industrial, commercial and residential utility customers; non-utility generators (NUGs); and power marketers. These statutory, regulatory and administrative changes create two opposing forces within the electric utility industry. The first is competition among utility companies, which places greater emphasis on controlling electric power generation costs and affects the choice of generation/fuel mix and demand-side management (DSM) activities. The second, converse force is that utilities are major contributors to the air pollution burden in the United States, and environmental concerns are pushing them to reduce emissions of air pollutants by using more environmentally friendly fuels and implementing energy-saving programs. This study evaluates the impact of deregulation within the investor-owned electric utilities and how this deregulation affects air quality, by investigating the trends in demand-side management programs and generation/fuel mix. A survey of investor-owned utilities and independent power producers was conducted, and its results were analyzed by analysis of variance and regression analysis to determine the impact on air pollution. An Air Quality Impact model was also developed in this study. The model consists of six modules: (1) demand-side management and the consumption of (2) coal, (3) gas, (4) renewable, (5) oil and (6) nuclear sources through the year 2005. Each module was analyzed separately and its result transferred into the Air Quality Impact model, which assesses the changes in electricity generation within each module due to deregulation; these changes can then be correlated with the emission of air pollutants in the United States.

  10. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  11. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reducing costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  12. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  13. Choosing Open Source ERP Systems: What Reasons Are There For Doing So?

    NASA Astrophysics Data System (ADS)

    Johansson, Björn; Sudzina, Frantisek

Enterprise resource planning (ERP) systems attract high attention, as does open source software. The question is then whether, and if so when, open source ERP systems will take off. The paper describes the status of open source ERP systems. Based on a literature review of ERP system selection criteria in Web of Science articles, it discusses reported reasons for choosing open source or proprietary ERP systems. Last but not least, the article presents some conclusions that could act as input for future research. The paper aims at building a foundation for the basic question: what are the reasons for an organization to adopt open source ERP systems?

  14. Developing open-source codes for electromagnetic geophysics using industry support

    NASA Astrophysics Data System (ADS)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  15. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. The GNU utilities make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and verify that parallel and serial simulations produce identical results. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof and later modified using the Open Multi-Processing (OpenMP) library on an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.
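
    The reported speedup of 2.3 on an 8-core system can be put in context with Amdahl's law; the sketch below infers a parallel fraction that the abstract does not state, so it is illustrative only:

```python
# Amdahl's law: speedup S = 1 / ((1 - p) + p / n) for a parallel
# fraction p of the runtime executed on n cores.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

def parallel_fraction(speedup, n):
    # Invert Amdahl's law to estimate the fraction of runtime parallelized.
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n)

p = parallel_fraction(2.3, 8)          # roughly two thirds of the runtime
print(round(p, 2))                     # ~0.65
print(round(amdahl_speedup(p, 8), 2))  # recovers the observed 2.3x
```

    Under this reading, about 65% of the serial runtime would have been covered by the OpenMP-modified routines.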

  16. X-ray backscatter radiography with lower open fraction coded masks

    NASA Astrophysics Data System (ADS)

    Muñoz, André A. M.; Vella, Anna; Healy, Matthew J. F.; Lane, David W.; Jupp, Ian; Lockley, David

    2017-09-01

Single-sided radiographic imaging would find great utility in medical, aerospace and security applications. While coded apertures can be used to form such an image from backscattered X-rays, they suffer from near-field limitations that introduce noise. Several theoretical studies have indicated that for an extended source the image's signal-to-noise ratio may be optimised by using a low open fraction (<0.5) mask. However, few experimental results have been published for such low open fraction patterns, and details of their formulation are often unavailable or ambiguous. In this paper we address this process for two types of low open fraction mask, the dilute URA and the Singer set array. For the dilute URA, the procedure for producing multiple 2D array patterns from given 1D binary sequences (Barker codes) is explained. Their point spread functions are calculated and their imaging properties are critically reviewed. These results are then compared to those from the Singer set, and experimental exposures are presented for both types of pattern; their prospects for near-field imaging are discussed.
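
    For context, the favourable imaging properties of the Barker codes mentioned above stem from their autocorrelation: the aperiodic autocorrelation of a Barker sequence has a peak equal to the code length and sidelobes of magnitude at most 1, which translates into a sharp point spread function. A minimal check (Barker-13 is a standard sequence, not taken from the paper):

```python
def autocorrelation(seq):
    """Aperiodic autocorrelation of a +/-1 sequence at all non-negative lags."""
    n = len(seq)
    return [sum(seq[i] * seq[i + k] for i in range(n - k)) for k in range(n)]

barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
acf = autocorrelation(barker13)
print(acf[0])                        # peak equals the code length, 13
print(max(abs(v) for v in acf[1:]))  # all sidelobes have magnitude <= 1
```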

  17. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  18. Implementing Open Source Platform for Education Quality Enhancement in Primary Education: Indonesia Experience

    ERIC Educational Resources Information Center

    Kisworo, Marsudi Wahyu

    2016-01-01

    Information and Communication Technology (ICT)-supported learning using free and open source platform draws little attention as open source initiatives were focused in secondary or tertiary educations. This study investigates possibilities of ICT-supported learning using open source platform for primary educations. The data of this study is taken…

  19. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  20. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  1. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan

A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  2. Urban Flow and Pollutant Dispersion Simulation with Multi-scale coupling of Meteorological Model with Computational Fluid Dynamic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yushi; Poh, Hee Joo

    2014-11-01

Computational Fluid Dynamics analysis has become increasingly important in modern urban planning in order to create highly livable cities. This paper presents a multi-scale modeling methodology which couples the Weather Research and Forecasting (WRF) Model with the open source CFD simulation tool OpenFOAM. This coupling enables the simulation of wind flow and pollutant dispersion in urban built-up areas with a high resolution mesh. In this methodology the meso-scale model WRF provides the boundary condition for the micro-scale CFD model OpenFOAM. The advantage is that realistic weather conditions are taken into account in the CFD simulation and the complexity of the building layout can be handled with ease by the meshing utility of OpenFOAM. The result is validated against the Joint Urban 2003 Tracer Field Tests in Oklahoma City, and there is reasonably good agreement between the CFD simulation and field observation. The coupling of WRF-OpenFOAM provides urban planners with a reliable environmental modeling tool for actual urban built-up areas, and it can be further extended with consideration of future weather conditions for scenario studies on climate change impact.

  3. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.

  4. Changes in sub-soil river water quality upon its open storage-a case study.

    PubMed

    Mohanty, A K; Satpathy, K K; Prasad, M V R

    2017-08-01

A study was carried out to investigate the changes in the physicochemical and biological properties of sub-soil river water upon its storage in a man-made reservoir. Palar sub-soil and reservoir water samples were collected fortnightly for a period of 5 years (2010-2014). The open reservoir is used as a reliable raw water source for condenser cooling systems and for the demineralizing (DM) plant input of the Fast Breeder Test Reactor (FBTR), Madras Atomic Power Station (MAPS), and other laboratories at Kalpakkam, southeast coast of India. Relatively high nutrient concentrations were observed in the Palar sub-soil water, and a significant reduction in average concentration (μmol l-1) of phosphate (Palar 1.92; open reservoir 1.54) and nitrate (Palar 9.78; open reservoir 5.67) was observed from Palar to open reservoir. Substantial increases in pH (Palar 8.05; open reservoir 8.45), dissolved oxygen (mg l-1) (Palar 6.07; open reservoir 8.47), and chlorophyll-a (mg m-3) (Palar 1.66; open reservoir 8.43) values were noticed from the Palar sub-soil water to the open reservoir water. It is concluded that sub-soil water with higher nutrient concentrations, when stored openly and exposed to the sun, results in growth of plants, plankton, and macrophytes, which led to substantial deterioration in water quality from its utility point of view as a condenser cooling medium and raw water input for the DM plant.
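
    The nutrient reductions quoted above can be restated as percentage changes; a quick arithmetic check from the reported averages:

```python
def percent_change(before, after):
    """Relative change in percent from 'before' to 'after'."""
    return 100.0 * (after - before) / before

# Average concentrations (umol/l) quoted in the abstract.
print(round(percent_change(1.92, 1.54), 1))  # phosphate: about -19.8%
print(round(percent_change(9.78, 5.67), 1))  # nitrate: about -42.0%
```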

  5. The 2017 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973

  6. The 2017 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather

    2017-01-01

    The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year's theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest.

  7. NWChem: A comprehensive and scalable open-source solution for large scale molecular simulations

    NASA Astrophysics Data System (ADS)

    Valiev, M.; Bylaska, E. J.; Govind, N.; Kowalski, K.; Straatsma, T. P.; Van Dam, H. J. J.; Wang, D.; Nieplocha, J.; Apra, E.; Windus, T. L.; de Jong, W. A.

    2010-09-01

    The latest release of NWChem delivers an open-source computational chemistry package with extensive capabilities for large scale simulations of chemical and biological systems. Utilizing a common computational framework, diverse theoretical descriptions can be used to provide the best solution for a given scientific problem. Scalable parallel implementations and modular software design enable efficient utilization of current computational architectures. This paper provides an overview of NWChem focusing primarily on the core theoretical modules provided by the code and their parallel performance. Program summaryProgram title: NWChem Catalogue identifier: AEGI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Open Source Educational Community License No. of lines in distributed program, including test data, etc.: 11 709 543 No. of bytes in distributed program, including test data, etc.: 680 696 106 Distribution format: tar.gz Programming language: Fortran 77, C Computer: all Linux based workstations and parallel supercomputers, Windows and Apple machines Operating system: Linux, OS X, Windows Has the code been vectorised or parallelized?: Code is parallelized Classification: 2.1, 2.2, 3, 7.3, 7.7, 16.1, 16.2, 16.3, 16.10, 16.13 Nature of problem: Large-scale atomistic simulations of chemical and biological systems require efficient and reliable methods for ground and excited solutions of many-electron Hamiltonian, analysis of the potential energy surface, and dynamics. Solution method: Ground and excited solutions of many-electron Hamiltonian are obtained utilizing density-functional theory, many-body perturbation approach, and coupled cluster expansion. These solutions or a combination thereof with classical descriptions are then used to analyze potential energy surface and perform dynamical simulations. 
Additional comments: Full documentation is provided in the distribution file. This includes an INSTALL file giving details of how to build the package. A set of test runs is provided in the examples directory. The distribution file for this program is over 90 Mbytes and therefore is not delivered directly when download or Email is requested. Instead a html file giving details of how the program can be obtained is sent. Running time: Running time depends on the size of the chemical system, the complexity of the method, the number of CPUs and the computational task. It ranges from several seconds for serial DFT energy calculations on a few atoms to several hours for parallel coupled cluster energy calculations on tens of atoms or ab-initio molecular dynamics simulations on hundreds of atoms.

  8. System for stabilizing cable phase delay utilizing a coaxial cable under pressure

    NASA Technical Reports Server (NTRS)

    Clements, P. A. (Inventor)

    1974-01-01

    Stabilizing the phase delay of signals passing through a pressurizable coaxial cable is disclosed. Signals from an appropriate source at a selected frequency, e.g., 100 MHz, are sent through the controlled cable from a first cable end to a second cable end which, electrically, is open or heavily mismatched at 100 MHz, thereby reflecting 100 MHz signals back to the first cable end. Thereat, the phase difference between the reflected-back signals and the signals from the source is detected by a phase detector. The output of the latter is used to control the flow of gas to or from the cable, thereby controlling the cable pressure, which in turn affects the cable phase delay.

  9. Affordable Open-Source Data Loggers for Distributed Measurements of Sap-Flux, Stem Growth, Relative Humidity, Temperature, and Soil Water Content

    NASA Astrophysics Data System (ADS)

    Anderson, T.; Jencso, K. G.; Hoylman, Z. H.; Hu, J.

    2015-12-01

    Characterizing the mechanisms that lead to differences in forest ecosystem productivity across complex terrain remains a challenge. This difficulty can be partially attributed to the cost of installing networks of proprietary data loggers that monitor differences in the biophysical factors contributing to tree growth. Here, we describe the development and initial application of a network of open source data loggers. These data loggers are based on the Arduino platform, but were refined into a custom printed circuit board (PCB). This reduced the cost and complexity of the data loggers, which made them cheap to reproduce and reliable enough to withstand the harsh environmental conditions experienced in Ecohydrology studies. We demonstrate the utility of these loggers for high frequency, spatially-distributed measurements of sap-flux, stem growth, relative humidity, temperature, and soil water content across 36 landscape positions in the Lubrecht Experimental Forest, MT, USA. This new data logging technology made it possible to develop a spatially distributed monitoring network within the constraints of our research budget and may provide new insights into factors affecting forest productivity across complex terrain.

  10. ADFNE: Open source software for discrete fracture network engineering, two and three dimensional applications

    NASA Astrophysics Data System (ADS)

    Fadakar Alghalandis, Younes

    2017-05-01

A rapidly growing topic, discrete fracture network engineering (DFNE) has already attracted many talents from diverse disciplines in academia and industry around the world to challenge difficult problems related to mining, geothermal, civil, oil and gas, water and many other projects. Although there are a few commercial software packages capable of providing some useful functionality fundamental to DFNE, their cost, closed-code (black box) distribution, and hence limited programmability and tractability encouraged us to respond to this rising demand with a new solution. This paper introduces a comprehensive open source software package for stochastic modeling of fracture networks in two and three dimensions in discrete formulation. Functionalities included are geometric modeling (e.g., complex polygonal fracture faces, and utilizing directional statistics), simulations, characterizations (e.g., intersection, clustering and connectivity analyses) and applications (e.g., fluid flow). The package is written entirely in the Matlab scripting language. Significant efforts have been made to bring maximum flexibility to the functions in order to solve problems in both two and three dimensions in an easy and unified way that is suitable for beginner, advanced and experienced users.
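
    The core of a 2D stochastic fracture-network generator of the kind described can be sketched as follows. ADFNE itself is written in Matlab; this Python sketch, with hypothetical parameter names, only illustrates the general idea of sampling fracture centers, orientations and lengths, and testing segment intersections:

```python
import math, random

def generate_dfn(n, size=10.0, mean_len=1.0, seed=0):
    """Sample n fractures as 2D line segments: uniform centers in a
    size x size domain, uniform orientations, exponential lengths."""
    rng = random.Random(seed)
    fractures = []
    for _ in range(n):
        cx, cy = rng.uniform(0, size), rng.uniform(0, size)
        theta = rng.uniform(0, math.pi)
        half = 0.5 * rng.expovariate(1.0 / mean_len)
        dx, dy = half * math.cos(theta), half * math.sin(theta)
        fractures.append(((cx - dx, cy - dy), (cx + dx, cy + dy)))
    return fractures

def segments_intersect(s1, s2):
    """Standard orientation test for proper segment intersection."""
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    (p1, p2), (p3, p4) = s1, s2
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

# Intersection counting is the starting point for the clustering and
# connectivity analyses mentioned in the abstract.
net = generate_dfn(200)
pairs = sum(segments_intersect(a, b)
            for i, a in enumerate(net) for b in net[i+1:])
print(len(net), pairs)
```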

  11. Analysis of greenhouse gas emissions from 10 biogas plants within the agricultural sector.

    PubMed

    Liebetrau, J; Reinelt, T; Clemens, J; Hafermann, C; Friehe, J; Weiland, P

    2013-01-01

With the increasing number of biogas plants in Germany, the necessity for an exact determination of the actual effect on greenhouse gas emissions related to the energy production gains importance. Hitherto, life cycle assessments have been based on estimations of the emissions of biogas plants. The lack of actual emission evaluations has been addressed within a project from which selected results are presented here. The data presented here were obtained during a survey in which 10 biogas plants were analysed within two measurement periods each. The major methane emission sources identified were the open storage of digestates, ranging from 0.22 to 11.2% of the methane utilized, and the exhaust of the co-generation units, ranging from 0.40 to 3.28%. Relevant ammonia emissions were detected from the open digestate storage. The main source of nitrous oxide emissions was the co-generation unit. Regarding the potential of measures to reduce emissions, it is highly recommended to focus on the digestate storage and the exhaust of the co-generation units.

  12. 2MASS Catalog Server Kit Version 2.1

    NASA Astrophysics Data System (ADS)

    Yamauchi, C.

    2013-10-01

The 2MASS Catalog Server Kit is open source software for easily constructing a high performance search server for important astronomical catalogs. This software utilizes the open source RDBMS PostgreSQL, therefore any user can set up the database on their local computer by following the step-by-step installation guide. The kit provides highly optimized stored functions for positional searches similar to SDSS SkyServer. Together with these, the powerful SQL environment of PostgreSQL will meet various user demands. We released 2MASS Catalog Server Kit version 2.1 in 2012 May, which supports the latest WISE All-Sky catalog (563,921,584 rows) and 9 major all-sky catalogs. Local databases are often indispensable for observatories with unstable or narrow-band networks or for severe use, such as retrieving large numbers of records within a small period of time. This software is best suited for such purposes, and the increased number of supported catalogs and the improvements in version 2.1 cover a wider range of applications, including advanced calibration systems and scientific studies using complicated SQL queries. Official page: http://www.ir.isas.jaxa.jp/~cyamauch/2masskit/
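
    The positional-search idea behind such stored functions can be illustrated with a plain angular-separation test. The kit's actual implementation uses optimized PostgreSQL stored functions; the helper names below are hypothetical and serve only to show the concept:

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees between two sky positions given
    in degrees (haversine form, numerically stable at small angles)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    dra, ddec = ra2 - ra1, dec2 - dec1
    a = math.sin(ddec/2)**2 + math.cos(dec1)*math.cos(dec2)*math.sin(dra/2)**2
    return math.degrees(2 * math.asin(math.sqrt(a)))

def cone_search(catalog, ra, dec, radius_deg):
    """Return catalog rows (ra, dec) within radius_deg of (ra, dec)."""
    return [row for row in catalog
            if angular_separation(ra, dec, row[0], row[1]) <= radius_deg]

catalog = [(10.0, 20.0), (10.1, 20.05), (50.0, -30.0)]
print(cone_search(catalog, 10.0, 20.0, 0.2))
```

    A production server replaces the linear scan with a spatial index inside the database, which is what the kit's stored functions provide.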

  13. Modeling deformation processes of salt caverns for gas storage due to fluctuating operation pressures

    NASA Astrophysics Data System (ADS)

    Böttcher, N.; Nagel, T.; Goerke, U.; Khaledi, K.; Lins, Y.; König, D.; Schanz, T.; Köhn, D.; Attia, S.; Rabbel, W.; Bauer, S.; Kolditz, O.

    2013-12-01

In the course of the Energy Transition in Germany, the focus of the country's energy sources is shifting from fossil to renewable and sustainable energy carriers. Since renewable energy sources, such as wind and solar power, are subject to annual, seasonal, and diurnal fluctuations, the development and extension of energy storage capacities is a priority in German R&D programs. Common methods of energy storage are the utilization of subsurface caverns as reservoirs for natural or artificial fuel gases, such as hydrogen or methane, or for the storage of compressed air. The construction of caverns in salt rock is inexpensive in comparison to solid rock formations due to the possibility of solution mining. Another advantage of evaporite as a host material is the self-healing capacity of salt rock. Gas caverns are capable of short-term energy storage (hours to days), so the operating pressures inside the caverns fluctuate periodically with a high number of cycles. This work investigates the influence of fluctuating operation pressures on the stability of the host rock of gas storage caverns utilizing numerical models. Therefore, we developed a coupled Thermo-Hydro-Mechanical (THM) model based on the finite element method utilizing the open-source software platform OpenGeoSys. Our simulations include the thermodynamic behaviour of the gas during the loading/unloading of the cavern. This provides information on the transient pressure and temperature distribution on the cavern boundary to calculate the deformation of its geometry. Non-linear material models are used for the mechanical analysis, which describe the creep and self-healing behavior of the salt rock under fluctuating loading pressures. In order to identify the necessary material parameters, we perform experimental studies on the mechanical behaviour of salt rock under varying pressure and temperature conditions.
Based on the numerical results, we further derive concepts for monitoring THM quantities in the vicinity of the cavern. These programs will allow detecting changes of the host rock properties during the construction and operation of the storage facility. The developed model will be used by public authorities for land use planning issues.
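
    The thermodynamic behaviour of the gas during loading/unloading can be approximated, in the simplest textbook case, by ideal-gas adiabatic compression. This relation is generic physics, not the constitutive model used in OpenGeoSys, and the pressure values are illustrative:

```python
def adiabatic_temperature(T1, p1, p2, gamma=1.4):
    """Temperature after adiabatic compression/expansion of an ideal gas
    from pressure p1 to p2, starting at absolute temperature T1 (K)."""
    return T1 * (p2 / p1) ** ((gamma - 1.0) / gamma)

# Charging a cavern from 6 MPa to 12 MPa starting at 310 K (illustrative):
T2 = adiabatic_temperature(310.0, 6e6, 12e6)
print(round(T2, 1))  # gas heats up by several tens of kelvin
```

    This kind of pressure-driven temperature swing at the cavern boundary is what feeds the transient thermal load in the THM model.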

  14. The 2015 Bioinformatics Open Source Conference (BOSC 2015).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Lapp, Hilmar; Chapman, Brad; Davey, Rob; Fields, Christopher; Hokamp, Karsten; Munoz-Torres, Monica

    2016-02-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science;" "Standards and Interoperability;" "Open Science and Reproducibility;" "Translational Bioinformatics;" "Visualization;" and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

  15. Open Source, Openness, and Higher Education

    ERIC Educational Resources Information Center

    Wiley, David

    2006-01-01

    In this article David Wiley provides an overview of how the general expansion of open source software has affected the world of education in particular. In doing so, Wiley not only addresses the development of open source software applications for teachers and administrators, he also discusses how the fundamental philosophy of the open source…

  16. Analysis of vehicular traffic flow in the major areas of Kuala Lumpur utilizing open-traffic

    NASA Astrophysics Data System (ADS)

    Manogaran, Saargunawathy; Ali, Muhammad; Yusof, Kamaludin Mohamad; Suhaili, Ramdhan

    2017-09-01

Vehicular traffic congestion occurs when a large number of drivers crowd onto a road and the traffic flow does not run smoothly. Traffic congestion causes chaos on the road and interrupts the daily activities of users. Time consumed on the road has many negative effects on productivity, social behavior, the environment, and the economy. Congestion worsens and leads to havoc during emergencies such as floods, accidents, and road maintenance, when the behavior of traffic flow is unpredictable and uncontrollable. Real-time and historical traffic data are critical inputs for most traffic flow analysis applications. Researchers attempt to predict traffic using simulations, as no exact model of traffic flow exists due to its high complexity. Open Traffic is an open source platform for traffic data analysis linked to OpenStreetMap (OSM). This research is aimed at studying and understanding the Open Traffic platform. The real-time traffic flow pattern in the Kuala Lumpur area was successfully extracted and analyzed using Open Traffic. It was observed that congestion occurs on every major road in Kuala Lumpur, most of it due to offices and economic and commercial centers during rush hours. On some roads congestion occurs at night due to tourism activities.

  17. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  18. Extracting and connecting chemical structures from text sources using chemicalize.org.

    PubMed

    Southan, Christopher; Stracz, Andras

    2013-04-23

    Exploring bioactive chemistry requires navigating between structures and data from a variety of text-based sources. While PubChem currently includes approximately 16 million document-extracted structures (15 million from patents) the extent of public inter-document and document-to-database links is still well below any estimated total, especially for journal articles. A major expansion in access to text-entombed chemistry is enabled by chemicalize.org. This on-line resource can process IUPAC names, SMILES, InChI strings, CAS numbers and drug names from pasted text, PDFs or URLs to generate structures, calculate properties and launch searches. Here, we explore its utility for answering questions related to chemical structures in documents and where these overlap with database records. These aspects are illustrated using a common theme of Dipeptidyl Peptidase 4 (DPPIV) inhibitors. Full-text open URL sources facilitated the download of over 1400 structures from a DPPIV patent and the alignment of specific examples with IC50 data. Uploading the SMILES to PubChem revealed extensive linking to patents and papers, including prior submissions from chemicalize.org as submitting source. A DPPIV medicinal chemistry paper was completely extracted and structures were aligned to the activity results table, as well as linked to other documents via PubChem. In both cases, key structures with data were partitioned from common chemistry by dividing them into individual new PDFs for conversion. Over 500 structures were also extracted from a batch of PubMed abstracts related to DPPIV inhibition. The drug structures could be stepped through each text occurrence and included some converted MeSH-only IUPAC names not linked in PubChem. Performing set intersections proved effective for detecting compounds-in-common between documents and merged extractions. 
This work demonstrates the utility of chemicalize.org for the exploration of chemical structure connectivity between documents and databases, including structure searches in PubChem, InChIKey searches in Google and the chemicalize.org archive. It has the flexibility to extract text from any internal, external or Web source. It synergizes with other open tools and the application is undergoing continued development. It should thus facilitate progress in medicinal chemistry, chemical biology and other bioactive chemistry domains.
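The set-intersection step described above reduces to ordinary set operations once each document's extraction is keyed by a canonical structure identifier. A minimal sketch, with hypothetical InChIKey-style identifiers standing in for real extraction output:

```python
# Compounds-in-common between two document extractions, keyed by
# (hypothetical) structure identifiers such as InChIKeys.
patent_keys = {"KEY-A", "KEY-B", "KEY-C"}
paper_keys = {"KEY-B", "KEY-C", "KEY-D"}

shared = patent_keys & paper_keys            # in both documents
unique_to_patent = patent_keys - paper_keys  # novel to the patent
```

The same pattern extends to merged extractions from many documents by intersecting or unioning their key sets.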

  19. BpWrapper: BioPerl-based sequence and tree utilities for rapid prototyping of bioinformatics pipelines.

    PubMed

    Hernández, Yözen; Bernstein, Rocky; Pagan, Pedro; Vargas, Levy; McCaig, William; Ramrattan, Girish; Akther, Saymon; Larracuente, Amanda; Di, Lia; Vieira, Filipe G; Qiu, Wei-Gang

    2018-03-02

Automated bioinformatics workflows are more robust, easier to maintain, and their results more reproducible when built with command-line utilities than with custom-coded scripts. Command-line utilities further benefit bioinformatics developers by relieving them of the need to learn the use of, or to interact directly with, biological software libraries. There is, however, a lack of command-line utilities that leverage popular Open Source biological software toolkits such as BioPerl ( http://bioperl.org ) to make many of the well-designed, robust, and routinely used biological classes available to a wider base of end users. Designed as standard utilities for UNIX-family operating systems, BpWrapper makes the functionality of some of the most popular BioPerl modules readily accessible on the command line to novice as well as experienced bioinformatics practitioners. The initial release of BpWrapper includes four utilities with concise command-line user interfaces, bioseq, bioaln, biotree, and biopop, specialized for manipulation of molecular sequences, sequence alignments, phylogenetic trees, and DNA polymorphisms, respectively. Over a hundred methods are currently available as command-line options and new methods are easily incorporated. Performance of BpWrapper utilities lags that of precompiled utilities but is equivalent to that of other BioPerl-based utilities. BpWrapper has been tested on BioPerl Release 1.6, Perl versions 5.10.1 to 5.25.10, and operating systems including Apple macOS, Microsoft Windows, and GNU/Linux. Release code is available from the Comprehensive Perl Archive Network (CPAN) at https://metacpan.org/pod/Bio::BPWrapper . Source code is available on GitHub at https://github.com/bioperl/p5-bpwrapper . BpWrapper improves on existing sequence utilities by following the design principles of Unix text utilities, such as a concise user interface, extensive command-line options, and standard input/output for serialized operations.
Further, dozens of novel methods for manipulation of sequences, alignments, and phylogenetic trees, unavailable in existing utilities (e.g., EMBOSS, Newick Utilities, and FAST), are provided. Bioinformaticians should find BpWrapper useful for rapid prototyping of workflows on the command-line without creating custom scripts for comparative genomics and other bioinformatics applications.
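The Unix-filter design principle the abstract emphasizes (standard input in, serialized records out) can be illustrated with a toy filter. This is not BpWrapper code; it is a minimal stand-in showing the stdin-to-stdout pattern, computing per-record sequence lengths from FASTA text.

```python
# Toy Unix-style filter: read FASTA records, emit (name, length) pairs.
# Not part of BpWrapper; illustrates the stdin/stdout filter pattern only.
def fasta_lengths(lines):
    name, length, out = None, 0, []
    for line in lines:
        line = line.strip()
        if line.startswith(">"):
            if name is not None:
                out.append((name, length))
            name, length = line[1:], 0
        elif line:
            length += len(line)
    if name is not None:
        out.append((name, length))
    return out

# As a filter over standard input:
#   import sys
#   for name, n in fasta_lengths(sys.stdin):
#       print(f"{name}\t{n}")
```

Because it reads from an iterable of lines, the same function works on a file, a pipe, or a list in tests.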

  20. On the possibility of galactic cosmic ray-induced radiolysis-powered life in subsurface environments in the Universe.

    PubMed

    Atri, Dimitra

    2016-10-01

    Photosynthesis is a mechanism developed by terrestrial life to utilize the energy from photons of solar origin for biological use. Subsurface regions are isolated from the photosphere, and consequently are incapable of utilizing this energy. This opens up the opportunity for life to evolve alternative mechanisms for harvesting available energy. Bacterium Candidatus Desulforudis audaxviator, found 2.8 km deep in a South African mine, harvests energy from radiolysis, induced by particles emitted from radioactive U, Th and K present in surrounding rock. Another radiation source in the subsurface environments is secondary particles generated by galactic cosmic rays (GCRs). Using Monte Carlo simulations, it is shown that it is a steady source of energy comparable to that produced by radioactive substances, and the possibility of a slow metabolizing life flourishing on it cannot be ruled out. Two mechanisms are proposed through which GCR-induced secondary particles can be utilized for biological use in subsurface environments: (i) GCRs injecting energy in the environment through particle-induced radiolysis and (ii) organic synthesis from GCR secondaries interacting with the medium. Laboratory experiments to test these hypotheses are also proposed. Implications of these mechanisms on finding life in the Solar System and elsewhere in the Universe are discussed. © 2016 The Author(s).

  1. On the possibility of galactic cosmic ray-induced radiolysis-powered life in subsurface environments in the Universe

    PubMed Central

    2016-01-01

    Photosynthesis is a mechanism developed by terrestrial life to utilize the energy from photons of solar origin for biological use. Subsurface regions are isolated from the photosphere, and consequently are incapable of utilizing this energy. This opens up the opportunity for life to evolve alternative mechanisms for harvesting available energy. Bacterium Candidatus Desulforudis audaxviator, found 2.8 km deep in a South African mine, harvests energy from radiolysis, induced by particles emitted from radioactive U, Th and K present in surrounding rock. Another radiation source in the subsurface environments is secondary particles generated by galactic cosmic rays (GCRs). Using Monte Carlo simulations, it is shown that it is a steady source of energy comparable to that produced by radioactive substances, and the possibility of a slow metabolizing life flourishing on it cannot be ruled out. Two mechanisms are proposed through which GCR-induced secondary particles can be utilized for biological use in subsurface environments: (i) GCRs injecting energy in the environment through particle-induced radiolysis and (ii) organic synthesis from GCR secondaries interacting with the medium. Laboratory experiments to test these hypotheses are also proposed. Implications of these mechanisms on finding life in the Solar System and elsewhere in the Universe are discussed. PMID:27707907

  2. Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    PubMed Central

    2011-01-01

    Background The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to development of many useful resources freely available to the chemistry community. PMID:21999342

  3. Identification of Methyl Halide-Utilizing Genes in the Methyl Bromide-Utilizing Bacterial Strain IMB-1 Suggests a High Degree of Conservation of Methyl Halide-Specific Genes in Gram-Negative Bacteria

    USGS Publications Warehouse

    Woodall, C.A.; Warner, K.L.; Oremland, R.S.; Murrell, J.C.; McDonald, I.R.

    2001-01-01

    Strain IMB-1, an aerobic methylotrophic member of the alpha subgroup of the Proteobacteria, can grow with methyl bromide as a sole carbon and energy source. A single cmu gene cluster was identified in IMB-1 that contained six open reading frames: cmuC, cmuA, orf146, paaE, hutI, and partial metF. CmuA from IMB-1 has high sequence homology to the methyltransferase CmuA from Methylobacterium chloromethanicum and Hyphomicrobium chloromethanicum and contains a C-terminal corrinoid-binding motif and an N-terminal methyl-transferase motif. However, cmuB, identified in M. chloromethanicum and H. chloromethanicum, was not detected in IMB-1.

  4. Open Genetic Code: on open source in the life sciences.

    PubMed

    Deibel, Eric

    2014-01-01

The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach to genetic engineering. The first section discusses the greater flexibility with regard to patenting and its relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life, understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  5. XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.

    PubMed

Ching, Daniel J; Gürsoy, Doğa

    2017-03-01

    The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
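The notion of a simulated phantom can be sketched as rasterizing a geometric feature of known attenuation onto a grid. This is a hedged illustration of the idea only, not the XDesign API; the grid size, center, and radius are invented.

```python
# Illustrative sketch (not XDesign): rasterize a circular feature of
# uniform attenuation onto an n-by-n grid of zeros.
def circle_phantom(n, cx, cy, r, value=1.0):
    grid = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            # Pixel belongs to the feature if inside the circle.
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                grid[y][x] = value
    return grid

phantom = circle_phantom(8, 4, 4, 2)
```

A real phantom generator composes many such features with varied shapes and attenuation values, then simulates acquisition over the result.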

  6. Simple Linux Utility for Resource Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M.

    2009-09-09

SLURM is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for large and small computer clusters. As a cluster resource manager, SLURM has three key functions. First, it allocates exclusive and/or non-exclusive access to resources (compute nodes) to users for some duration of time so they can perform work. Second, it provides a framework for starting, executing, and monitoring work (normally a parallel job) on the set of allocated nodes. Finally, it arbitrates conflicting requests for resources by managing a queue of pending work.
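The third function above, arbitrating conflicting requests via a pending-work queue, can be sketched with a priority heap. This toy model is not SLURM code; job names, priorities, and the first-fit policy are invented for illustration.

```python
import heapq

# Toy model of a resource manager's pending queue (not SLURM itself):
# jobs wait in a min-heap by priority and start when enough nodes are free.
class JobQueue:
    def __init__(self, total_nodes):
        self.free_nodes = total_nodes
        self.pending = []  # heap of (priority, job_name, nodes_needed)

    def submit(self, priority, job, nodes_needed):
        heapq.heappush(self.pending, (priority, job, nodes_needed))

    def schedule(self):
        """Start highest-priority jobs while their node requests fit."""
        started = []
        while self.pending and self.pending[0][2] <= self.free_nodes:
            _, job, nodes = heapq.heappop(self.pending)
            self.free_nodes -= nodes
            started.append(job)
        return started

q = JobQueue(total_nodes=4)
q.submit(1, "sim-a", 2)   # priority 1 (runs first), needs 2 nodes
q.submit(2, "sim-b", 4)   # priority 2, needs 4 nodes
started = q.schedule()    # "sim-a" starts; "sim-b" waits for free nodes
```

Real schedulers layer backfill, fair-share, and preemption policies on top of this basic arbitration.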

  7. An innovative use of instant messaging technology to support a library's single-service point.

    PubMed

    Horne, Andrea S; Ragon, Bart; Wilson, Daniel T

    2012-01-01

    A library service model that provides reference and instructional services by summoning reference librarians from a single service point is described. The system utilizes Libraryh3lp, an open-source, multioperator instant messaging system. The selection and refinement of this solution and technical challenges encountered are explored, as is the design of public services around this technology, usage of the system, and best practices. This service model, while a major cultural and procedural change at first, is now a routine aspect of customer service for this library.

  8. An assessment of transient hydraulics phenomena and its characterization

    NASA Technical Reports Server (NTRS)

    Mortimer, R. W.

    1974-01-01

A systematic search of the open literature was performed with the purpose of identifying the causes, effects, and characterization (modelling and solution techniques) of transient hydraulics phenomena. The governing partial differential equations found to be used most often in the literature are presented. Detailed survey sheets are shown which contain the type of hydraulics problem, the cause, the modelling, the solution technique utilized, and the experimental verification used for each paper. References and source documents are listed, and a discussion of the purpose and accomplishments of the study is presented.

  9. XDesign: An open-source software package for designing X-ray imaging phantoms and experiments

    DOE PAGES

Ching, Daniel J.; Gürsoy, Doğa

    2017-02-21

    Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  10. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  11. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  12. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  13. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    ERIC Educational Resources Information Center

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  14. Mapping, Monitoring, and Modeling Geomorphic Processes to Identify Sources of Anthropogenic Sediment Pollution in West Maui, Hawai'i

    NASA Astrophysics Data System (ADS)

    Cerovski-Darriau, C.; Stock, J. D.; Winans, W. R.

    2016-12-01

    Episodic storm runoff in West Maui (Hawai'i) brings plumes of terrestrially-sourced fine sediment to the nearshore ocean environment, degrading coral reef ecosystems. The sediment pollution sources were largely unknown, though suspected to be due to modern human disturbance of the landscape, and initially assumed to be from visibly obvious exposed soil on agricultural fields and unimproved roads. To determine the sediment sources and estimate a sediment budget for the West Maui watersheds, we mapped the geomorphic processes in the field and from DEMs and orthoimagery, monitored erosion rates in the field, and modeled the sediment flux using the mapped processes and corresponding rates. We found the primary source of fine sands, silts and clays to be previously unidentified fill terraces along the stream bed. These terraces, formed during legacy agricultural activity, are the banks along 40-70% of the streams where the channels intersect human-modified landscapes. Monitoring over the last year shows that a few storms erode the fill terraces 10-20 mm annually, contributing up to 100s of tonnes of sediment per catchment. Compared to the average long-term, geologic erosion rate of 0.03 mm/yr, these fill terraces alone increase the suspended sediment flux to the coral reefs by 50-90%. Stakeholders can use our resulting geomorphic process map and sediment budget to inform the location and type of mitigation effort needed to limit terrestrial sediment pollution. We compare our mapping, monitoring, and modeling (M3) approach to NOAA's OpenNSPECT model. OpenNSPECT uses empirical hydrologic and soil erosion models paired with land cover data to compare the spatially distributed sediment yield from different land-use scenarios. 
We determine the relative effectiveness of calculating a baseline watershed sediment yield from each approach, and the utility of calibrating OpenNSPECT with M3 results to better forecast future sediment yields from land-use or climate change scenarios.
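The abstract's terrace contribution can be checked with back-of-the-envelope arithmetic: mass per year = eroding bank length × bank height × annual retreat × bulk density. All values below are assumptions for illustration (only the 10-20 mm/yr retreat rate comes from the abstract); plausible inputs land in the tens-to-hundreds of tonnes per catchment, consistent with the reported budget.

```python
# Assumed values (illustrative only, except the retreat rate).
bank_length_m = 2000.0        # eroding fill-terrace length in one catchment
bank_height_m = 1.5           # exposed terrace bank height
retreat_m = 0.015             # 15 mm/yr, within the reported 10-20 mm range
bulk_density_t_per_m3 = 1.4   # dry bulk density of terrace fill

volume_m3 = bank_length_m * bank_height_m * retreat_m
mass_t = volume_m3 * bulk_density_t_per_m3   # tonnes of sediment per year
```

With these inputs the terrace supplies roughly 63 t/yr; doubling the bank length or retreat rate pushes the figure into the hundreds of tonnes.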

  15. The riddle of Tasmanian languages

    PubMed Central

    Bowern, Claire

    2012-01-01

    Recent work which combines methods from linguistics and evolutionary biology has been fruitful in discovering the history of major language families because of similarities in evolutionary processes. Such work opens up new possibilities for language research on previously unsolvable problems, especially in areas where information from other sources may be lacking. I use phylogenetic methods to investigate Tasmanian languages. Existing materials are so fragmentary that scholars have been unable to discover how many languages are represented in the sources. Using a clustering algorithm which identifies admixture, source materials representing more than one language are identified. Using the Neighbor-Net algorithm, 12 languages are identified in five clusters. Bayesian phylogenetic methods reveal that the families are not demonstrably related; an important result, given the importance of Tasmanian Aborigines for information about how societies have responded to population collapse in prehistory. This work provides insight into the societies of prehistoric Tasmania and illustrates a new utility of phylogenetics in reconstructing linguistic history. PMID:23015621

  16. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. 
Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542
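The first processing step listed above, SUV normalization, is commonly computed by body weight as tissue activity concentration divided by injected dose per unit weight. A minimal sketch of that formula (input values are invented; a real pipeline reads dose, weight, and decay-correction fields from the DICOM headers):

```python
# Body-weight SUV sketch: SUV = C_tissue / (injected_dose / body_weight),
# treating tissue density as ~1 g/mL. Decay correction is omitted here.
def suv_bw(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

# Hypothetical example: 5 kBq/mL uptake, 370 MBq injected, 70 kg patient.
example_suv = suv_bw(5000.0, 370e6, 70000.0)
```

Segmentation-based measurements such as SUVmax and SUVmean are then taken over the SUV-normalized voxels inside each region.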

  17. The 2015 Bioinformatics Open Source Conference (BOSC 2015)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J. A.; Lapp, Hilmar

    2016-01-01

    The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included “Data Science;” “Standards and Interoperability;” “Open Science and Reproducibility;” “Translational Bioinformatics;” “Visualization;” and “Bioinformatics Open Source Project Updates”. In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled “Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community,” that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule. PMID:26914653

  18. Meta-image navigation augmenters for unmanned aircraft systems (MINA for UAS)

    NASA Astrophysics Data System (ADS)

Çelik, Koray; Somani, Arun K.; Schnaufer, Bernard; Hwang, Patrick Y.; McGraw, Gary A.; Nadke, Jeremy

    2013-05-01

    GPS is a critical sensor for Unmanned Aircraft Systems (UASs) due to its accuracy, global coverage and small hardware footprint, but is subject to denial due to signal blockage or RF interference. When GPS is unavailable, position, velocity and attitude (PVA) performance from other inertial and air data sensors is not sufficient, especially for small UASs. Recently, image-based navigation algorithms have been developed to address GPS outages for UASs, since most of these platforms already include a camera as standard equipage. Performing absolute navigation with real-time aerial images requires georeferenced data, either images or landmarks, as a reference. Georeferenced imagery is readily available today, but requires a large amount of storage, whereas collections of discrete landmarks are compact but must be generated by pre-processing. An alternative, compact source of georeferenced data having large coverage area is open source vector maps from which meta-objects can be extracted for matching against real-time acquired imagery. We have developed a novel, automated approach called MINA (Meta Image Navigation Augmenters), which is a synergy of machine-vision and machine-learning algorithms for map aided navigation. As opposed to existing image map matching algorithms, MINA utilizes publicly available open-source geo-referenced vector map data, such as OpenStreetMap, in conjunction with real-time optical imagery from an on-board, monocular camera to augment the UAS navigation computer when GPS is not available. The MINA approach has been experimentally validated with both actual flight data and flight simulation data and results are presented in the paper.

  19. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
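The core trick behind the JFNK method mentioned above is that a Krylov solver only needs Jacobian-vector products, and those can be approximated by a finite difference of the residual, so the Jacobian is never assembled. A minimal sketch of that approximation (the test function and step size are illustrative, not MOOSE code):

```python
# Jacobian-free matrix-vector product: J(u) @ v ≈ (F(u + eps*v) - F(u)) / eps.
# This is the kernel a Krylov method calls inside a JFNK Newton iteration.
def jfnk_matvec(F, u, v, eps=1e-7):
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(Fp, Fu)]

# Check against a system with a known diagonal Jacobian diag(2, 3).
F = lambda u: [2.0 * u[0], 3.0 * u[1]]
jv = jfnk_matvec(F, [1.0, 1.0], [1.0, 1.0])
```

In practice the step `eps` is scaled to the norms of `u` and `v` to balance truncation and rounding error.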

  20. Automated Selection of Hotspots (ASH): enhanced automated segmentation and adaptive step finding for Ki67 hotspot detection in adrenal cortical cancer.

    PubMed

    Lu, Hao; Papathomas, Thomas G; van Zessen, David; Palli, Ivo; de Krijger, Ronald R; van der Spek, Peter J; Dinjens, Winand N M; Stubbs, Andrew P

    2014-11-25

In the prognosis and therapeutics of adrenal cortical carcinoma (ACC), selection of the most proliferatively active areas (hotspots) within a slide and objective quantification of the immunohistochemical Ki67 Labelling Index (LI) are of critical importance. In addition to intratumoral heterogeneity in proliferative rate, i.e., levels of Ki67 expression within a given ACC, lack of uniformity and reproducibility in the method of quantification of Ki67 LI may confound an accurate assessment. We have implemented an open source toolset, Automated Selection of Hotspots (ASH), for automated hotspot detection and quantification of Ki67 LI. ASH utilizes the NanoZoomer Digital Pathology Image (NDPI) splitter to convert the NDPI-format digital slide scanned from the Hamamatsu instrument into a conventional tiff or jpeg image for automated segmentation and an adaptive step-finding hotspot detection algorithm. Quantitative hotspot ranking is provided by functionality from the open source application ImmunoRatio as part of the ASH protocol. The output is a ranked set of hotspots with concomitant quantitative values based on whole-slide ranking. This open source automated detection and quantitative ranking of hotspots supports histopathologists in selecting the 'hottest' hotspot areas in adrenocortical carcinoma. To give the wider community easy access to ASH, we implemented a Galaxy virtual machine (VM) of ASH, which is available from http://bioinformatics.erasmusmc.nl/wiki/Automated_Selection_of_Hotspots . The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/13000_2014_216.
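The hotspot-ranking step described above can be pictured with a generic sliding-window sketch (an illustrative simplification, not ASH's actual segmentation or adaptive step-finding code): tiles of the slide are scored by Ki67-positive cell counts, and the densest mutually non-overlapping windows are returned in rank order.

```python
# Generic hotspot-ranking sketch: slide a window over per-tile
# Ki67-positive counts and rank windows by total count (density).

def rank_hotspots(grid, win=2, top_k=3):
    """Return up to top_k (score, row, col) windows, best first,
    keeping only mutually non-overlapping windows."""
    rows, cols = len(grid), len(grid[0])
    scored = []
    for r in range(rows - win + 1):
        for c in range(cols - win + 1):
            score = sum(grid[r + i][c + j]
                        for i in range(win) for j in range(win))
            scored.append((score, r, c))
    scored.sort(reverse=True)
    picked = []
    for score, r, c in scored:
        # Greedily keep the highest-scoring non-overlapping windows.
        if all(abs(r - pr) >= win or abs(c - pc) >= win
               for _, pr, pc in picked):
            picked.append((score, r, c))
        if len(picked) == top_k:
            break
    return picked

# Hypothetical per-tile positive-cell counts for a tiny slide region.
counts = [
    [1, 2, 1, 0],
    [2, 9, 8, 1],
    [1, 8, 7, 0],
    [0, 1, 0, 0],
]
print(rank_hotspots(counts, win=2, top_k=1))  # [(32, 1, 1)]
```

The real pipeline additionally segments tumor tissue first and adapts its search step size, but the ranked-window output mirrors ASH's "ranked set of hotspots with concomitant quantitative values."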

  1. The 2016 Bioinformatics Open Source Conference (BOSC).

    PubMed

    Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.

  2. Beyond Open Source: According to Jim Hirsch, Open Technology, Not Open Source, Is the Wave of the Future

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

This article presents an interview with Jim Hirsch, an associate superintendent for technology at Plano Independent School District in Plano, Texas. Hirsch serves as a liaison for the open technologies committee of the Consortium for School Networking. In this interview, he shares his opinion on the significance of open source in K-12.

  3. 21. INTERIOR OF UTILITY ROOM SHOWING OPEN REAR DOOR AT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. INTERIOR OF UTILITY ROOM SHOWING OPEN REAR DOOR AT PHOTO CENTER, PAIRED NARROW 1-LIGHT OVER 1-LIGHT, DOUBLE-HUNG, WOOD-FRAMED WINDOWS AT PHOTO LEFT. OPEN DOOR AT PHOTO RIGHT LEADS TO BATHROOM. VIEW TO SOUTHWEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA

  4. EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...

    EPA Pesticide Factsheets

A detailed literature search was performed to collect and collate available data reporting emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxin and polychlorinated dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made, including: 1) biomass open burning sources typically emitted lower levels of VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly where polymers were concerned; 2) biomass open burning sources typically emitted lower levels of SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis; burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning, and PAH emissions were highest when combustion of polymers was taking place; and 3) based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures…

  5. Optical and spectroscopic studies on tannery wastes as a possible source of organic semiconductors

    NASA Astrophysics Data System (ADS)

    Nashy, El-Shahat H. A.; Al-Ashkar, Emad; Abdel Moez, A.

    2012-02-01

The tanning industry produces a large quantity of solid wastes which contain hide proteins in the form of protein shavings containing chromium salts. The chromium wastes are the main concern from an environmental standpoint, because chrome wastes pose a significant disposal problem. The present work is devoted to investigating the possibility of utilizing these wastes as a source of organic semiconductors, as an alternative to the conventional disposal methods. The chemical characterization of these wastes was determined. In addition, Horizontal Attenuated Total Reflection (HATR) FT-IR spectroscopic analysis and optical parameters were also carried out for the chromated samples. The study showed that the chromated samples had suitable absorbance and transmittance in the wavelength range 500-850 nm. The presence of chromium salt in the collagen samples increases the absorbance, which improves the optical properties of the studied samples and decreases the optical energy gap. The obtained optical energy gap suggests that the environmentally hazardous chrome shavings wastes can be utilized as a possible source of natural organic semiconductors with direct and indirect energy gaps. This work opens the door to using some hazardous wastes in the manufacture of electronic devices such as IR detectors and solar cells, and also as solar cell windows.

  6. Open-Source 3D-Printable Optics Equipment

    PubMed Central

    Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.

    2013-01-01

Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost, public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs in an open-source computer-aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104

  7. Open-source 3D-printable optics equipment.

    PubMed

    Zhang, Chenlong; Anzalone, Nicholas C; Faria, Rodrigo P; Pearce, Joshua M

    2013-01-01

Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost, public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs in an open-source computer-aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated for the control of optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted, customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods.

  8. Creation of a simple natural language processing tool to support an imaging utilization quality dashboard.

    PubMed

    Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo

    2017-05-01

    Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. 
The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
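The abstract describes a user-configurable, rule-based classifier, but its five modifiable parameters are not enumerated; the term and negation lists below are therefore purely hypothetical stand-ins illustrating the general approach such a tool takes (flag a report positive when a finding term appears without a nearby negation cue):

```python
# Illustrative rule-based report classifier (hypothetical configuration;
# not the published tool's actual parameters or code).

import re

FINDING_TERMS = ["thrombus", "thrombosis", "embolism", "dvt"]
NEGATION_CUES = ["no evidence of", "negative for", "without", "no acute"]

def classify_report(text):
    """Return True (VTE-positive) if a finding term appears un-negated
    within a sentence; otherwise return False (VTE-negative)."""
    for sentence in re.split(r"[.\n]", text.lower()):
        for term in FINDING_TERMS:
            if re.search(r"\b" + re.escape(term) + r"\b", sentence):
                if not any(cue in sentence for cue in NEGATION_CUES):
                    return True
    return False

print(classify_report("Acute thrombus in the right popliteal vein."))  # True
print(classify_report("No evidence of deep venous thrombosis."))       # False
```

With classifications like these in hand, diagnostic yield per provider is simply positives divided by total studies ordered, which is the quantity the dashboard pairs with utilization rate.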

  9. Measuring Methane from Cars, Ships, Airplanes, Helicopters and Drones Using High-Speed Open-Path Technology

    NASA Astrophysics Data System (ADS)

    Burba, George; Anderson, Tyler; Biraud, Sebastien; Caulton, Dana; von Fischer, Joe; Gioli, Beniamino; Hanson, Chad; Ham, Jay; Kohnert, Katrin; Larmanou, Eric; Levy, Peter; Polidori, Andrea; Pikelnaya, Olga; Sachs, Torsten; Serafimovich, Andrei; Zaldei, Alessandro; Zondlo, Mark; Zulueta, Rommel

    2017-04-01

    Methane plays a critical role in the radiation balance, chemistry of the atmosphere, and air quality. The major anthropogenic sources of methane include oil and gas development sites, natural gas distribution networks, landfill emissions, and agricultural production. The majority of oil and gas and urban methane emission occurs via variable-rate point sources or diffused spots in topographically challenging terrains (e.g., street tunnels, elevated locations at water treatment plants, vents, etc.). Locating and measuring such methane emissions is challenging when using traditional micrometeorological techniques, and requires development of novel approaches. Landfill methane emissions traditionally assessed at monthly or longer time intervals are subject to large uncertainties because of the snapshot nature of the measurements and the barometric pumping phenomenon. The majority of agricultural and natural methane production occurs in areas with little infrastructure or easily available grid power (e.g., rice fields, arctic and boreal wetlands, tropical mangroves, etc.). A lightweight, high-speed, high-resolution, open-path technology was recently developed for eddy covariance measurements of methane flux, with power consumption 30-150 times below other available technologies. It was designed to run on solar panels or a small generator and be placed in the middle of the methane-producing ecosystem without a need for grid power. Lately, this instrumentation has been utilized increasingly more frequently outside of the traditional use on stationary flux towers. These novel approaches include measurements from various moving platforms, such as cars, aircraft, and ships. Projects included mapping of concentrations and vertical profiles, leak detection and quantification, mobile emission detection from natural gas-powered cars, soil methane flux surveys, etc. 
This presentation will describe the latest state of the key projects utilizing the novel lightweight low-power high-resolution open-path technology, and will highlight several novel approaches where such instrumentation was used in mobile deployments in urban, agricultural and natural environments by academic institutions, regulatory agencies and industry.

  10. Mobile Measurements of Methane Using High-Speed Open-Path Technology

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Anderson, T.; Ediger, K.; von Fischer, J.; Gioli, B.; Ham, J. M.; Hupp, J. R.; Kohnert, K.; Levy, P. E.; Polidori, A.; Pikelnaya, O.; Price, E.; Sachs, T.; Serafimovich, A.; Zondlo, M. A.; Zulueta, R. C.

    2016-12-01

    Methane plays a critical role in the radiation balance, chemistry of the atmosphere, and air quality. The major anthropogenic sources of CH4 include oil and gas development sites, natural gas distribution networks, landfill emissions, and agricultural production. The majority of oil and gas and urban CH4 emission occurs via variable-rate point sources or diffused spots in topographically challenging terrains (e.g., street tunnels, elevated locations at water treatment plants, vents, etc.). Locating and measuring such CH4 emissions is challenging when using traditional micrometeorological techniques, and requires development of novel approaches. Landfill CH4 emissions traditionally assessed at monthly or longer time intervals are subject to large uncertainties because of the snapshot nature of the measurements and the barometric pumping phenomenon. The majority of agricultural and natural CH4 production occurs in areas with little infrastructure or easily available grid power (e.g., rice fields, arctic and boreal wetlands, tropical mangroves, etc.). A lightweight, high-speed, high-resolution, open-path technology was recently developed for eddy covariance measurements of CH4 flux, with power consumption 30-150 times below other available technologies. It was designed to run on solar panels or a small generator and be placed in the middle of the methane-producing ecosystem without a need for grid power. Lately, this instrumentation has been utilized increasingly more frequently outside of the traditional use on stationary flux towers. These novel approaches include measurements from various moving platforms, such as cars, aircraft, and ships. Projects included mapping of concentrations and vertical profiles, leak detection and quantification, mobile emission detection from natural gas-powered cars, soil CH4 flux surveys, etc. 
This presentation will describe key projects utilizing the novel lightweight low-power high-resolution open-path technology, and will highlight several novel approaches where such instrumentation was used in mobile deployments in urban, agricultural and natural environments by academic institutions, regulatory agencies and industry.

  11. An Open Source Business Model for Malaria

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2015-01-01

Greater investment in developing new drugs and vaccines against malaria is required in order to eradicate the disease. These precious funds must be carefully managed to achieve the greatest impact. We evaluate existing efforts to discover and develop new drugs and vaccines for malaria, to determine how malaria R&D can best benefit from an enhanced open source approach and how such a business model may operate. We assessed research articles, patents, and clinical trials, and conducted a smaller survey among malaria researchers. Our results demonstrate that the public and philanthropic sectors are financing and performing the majority of malaria drug/vaccine discovery and development, but are then restricting access through patents, ‘closed’ publications, and physical specimens hidden away. This makes little sense, since it is also the public and philanthropic sector that purchases the drugs and vaccines. We recommend that a more “open source” approach be taken, making the entire value chain more efficient through greater transparency, which may lead to more extensive collaborations. This can, for example, be achieved by empowering an existing organization like the Medicines for Malaria Venture (MMV) to act as a clearing house for malaria-related data. The malaria researchers that we surveyed indicated that they would utilize such registry data to increase collaboration. Finally, we question the utility of publicly or philanthropically funded patents for malaria medicines, where little to no profit is available. Malaria R&D benefits from a publicly and philanthropically funded architecture, which starts with academic research institutions and product development partnerships, continues with commercialization assistance through UNITAID, and ends with procurement through mechanisms like The Global Fund to Fight AIDS, Tuberculosis and Malaria and the U.S. President’s Malaria Initiative.
We believe that a fresh look should be taken at the cost/benefit of patents, particularly those related to new malaria medicines, and that alternative incentives, like WHO prequalification, should be considered. PMID:25658590

  12. Update on mandibular condylar fracture management.

    PubMed

    Weiss, Joshua P; Sawhney, Raja

    2016-08-01

    Fractures of the mandibular condyle have provided a lasting source of controversy in the field of facial trauma. Concerns regarding facial nerve injury as well as reasonable functional outcomes with closed management led to a reluctance to treat with an open operative intervention. This article reviews how incorporating new technologies and surgical methods have changed the treatment paradigm. Multiple large studies and meta-analyses continue to demonstrate superior outcomes for condylar fractures when managed surgically. Innovations, including endoscopic techniques, three-dimensional miniplates, and angled drills provide increased options in the treatment of condylar fractures. The literature on pediatric condylar fractures is limited and continues to favor a more conservative approach. There continues to be mounting evidence in radiographic, quality of life, and functional outcome studies to support open reduction with internal fixation for the treatment of condylar fractures in patients with malocclusion, significant displacement, or dislocation of the temporomandibular joint. The utilization of three-dimensional trapezoidal miniplates has shown improved outcomes and theoretically enhanced biomechanical properties when compared with traditional fixation with single or double miniplates. Endoscopic-assisted techniques can decrease surgical morbidity, but are technically challenging, require skilled assistants, and utilize specialized equipment.

  13. Use of Electronic Health Records in sub-Saharan Africa: Progress and challenges

    PubMed Central

    Akanbi, Maxwell O.; Ocheke, Amaka N.; Agaba, Patricia A.; Daniyam, Comfort A.; Agaba, Emmanuel I.; Okeke, Edith N.; Ukoli, Christiana O.

    2012-01-01

Background The Electronic Health Record (EHR) is a key component of medical informatics that is increasingly being utilized in industrialized nations to improve healthcare. There is limited information on the use of EHR in sub-Saharan Africa. This paper reviews the availability of EHRs in sub-Saharan Africa. Methods Searches were performed on PubMed and Google Scholar databases using the terms ‘Electronic Health Records OR Electronic Medical Records OR e-Health and Africa’. References from identified publications were reviewed. The inclusion criterion was documented use of EHR in Africa. Results The search yielded 147 publications, of which 21 papers from 15 sub-Saharan African countries documented the use of EHR in Africa and were reviewed. About 91% reported use of Open Source healthcare software, with OpenMRS being the most widely used. Most reports were from HIV-related health centers. Barriers to adoption of EHRs include the high cost of procurement and maintenance, poor network infrastructure, and lack of comfort among health workers with electronic medical records. Conclusion There has been an increase in the use of EHRs in sub-Saharan Africa, largely driven by utilization by HIV treatment programs. Penetration is, however, still very low. PMID:25243111

  14. Aerostat-Lofted Instrument Platform and Sampling Method for Determination of Emissions from Open Area Sources

    EPA Science Inventory

    Sampling emissions from open area sources, particularly sources of open burning, is difficult due to fast dilution of emissions and safety concerns for personnel. Representative emission samples can be difficult to obtain with flaming and explosive sources since personnel safety ...

  15. Total recall. [Refinancing of debt by utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemptstead, J.

    1994-02-15

The dramatic rally in the US Treasury markets in 1992 and 1993 offered utility treasurers a unique opportunity to radically restructure their outstanding debt profiles by redeeming and refunding callable and refundable bonds. Since January 1991, utility companies have issued over $100 billion of nonconvertible debt securities; 53 percent of these companies indicated "refinancing fixed income securities" as the primary use of proceeds. After approximately 18 months of heavy refunding activity, utility treasurers have nearly exhausted the supply of currently callable debt and are now looking at alternative methods of reducing their embedded cost of debt and increasing cash flows. The two most common methods are to repurchase the highest-cost noncallable and/or currently nonrefundable bonds through "open-market repurchases" and "tender offers." A third, less popular and less used method is "defeasance." This article describes the advantages, disadvantages, and economic effects of these three types of financing.

  16. The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software

    PubMed Central

    Ackerman, Michael J.; Yoo, Terry S.

    2003-01-01

    From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278

  17. The 2016 Bioinformatics Open Source Conference (BOSC)

    PubMed Central

    Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather

    2016-01-01

    Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083

  18. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.

  19. Getting started with open-hardware: development and control of microfluidic devices.

    PubMed

    da Costa, Eric Tavares; Mora, Maria F; Willis, Peter A; do Lago, Claudimir L; Jiao, Hong; Garcia, Carlos D

    2014-08-01

Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article is focused on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microcontroller board (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved by performing an on-chip acid-base titration, followed by conductivity detection with an open-source low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
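Valve synchronization for injection devices like the one described above is typically implemented as a repeating actuation sequence. The three-valve peristaltic pattern below is a common textbook pattern sketched in Python, not the paper's actual firmware, and the per-stroke volume simply echoes the abstract's reported 1.6 μL/stroke figure:

```python
# Sketch of synchronized pneumatic valve actuation (hypothetical
# sequence). For PDMS membrane valves: 1 = control line pressurized
# (valve closed), 0 = vented (valve open), for valves V1, V2, V3.

PERISTALTIC_SEQUENCE = [
    (1, 0, 0),
    (1, 1, 0),
    (0, 1, 1),
    (0, 0, 1),
]

def pump_plan(strokes, ul_per_stroke=1.6):
    """Flatten the valve states for `strokes` full cycles and estimate
    the dispensed volume from the reported 1.6 uL/stroke figure."""
    states = [s for _ in range(strokes) for s in PERISTALTIC_SEQUENCE]
    return states, strokes * ul_per_stroke

states, volume = pump_plan(3)
print(len(states), round(volume, 2))  # 12 4.8
```

On the real hardware, each tuple would be written out to the solenoid drivers by the microcontroller; here the sequence is only generated, to show how a fixed actuation pattern maps to a predictable delivered volume.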

  20. A modular, open-source, slide-scanning microscope for diagnostic applications in resource-constrained settings

    PubMed Central

    Lu, Qiang; Liu, Guanghui; Xiao, Chuanli; Hu, Chuanzhen; Zhang, Shiwu; Xu, Ronald X.; Chu, Kaiqin; Xu, Qianming

    2018-01-01

    In this paper we report the development of a cost-effective, modular, open source, and fully automated slide-scanning microscope, composed entirely of easily available off-the-shelf parts, and capable of bright field and fluorescence modes. The automated X-Y stage is composed of two low-cost micrometer stages coupled to stepper motors operated in open-loop mode. The microscope is composed of a low-cost CMOS sensor and low-cost board lenses placed in a 4f configuration. The system has approximately 1 micron resolution, limited by the f/# of available board lenses. The microscope is compact, measuring just 25×25×30 cm, and has an absolute positioning accuracy of ±1 μm in the X and Y directions. A Z-stage enables autofocusing and imaging over large fields of view even on non-planar samples, and custom software enables automatic determination of sample boundaries and image mosaicking. We demonstrate the utility of our device through imaging of fluorescent- and transmission-dye stained blood and fecal smears containing human and animal parasites, as well as several prepared tissue samples. These results demonstrate image quality comparable to high-end commercial microscopes at a cost of less than US$400 for a bright-field system, with an extra US$100 needed for the fluorescence module. PMID:29543835

  1. Preferences and Utilities for Health States after Treatment of Olfactory Groove Meningioma: Endoscopic versus Open.

    PubMed

    Yao, Christopher M; Kahane, Alyssa; Monteiro, Eric; Gentili, Fred; Zadeh, Gelareh; de Almeida, John R

    2017-08-01

    Objectives  The purpose of this study is to report health utility scores for patients with olfactory groove meningiomas (OGM) treated with either the standard transcranial approach or the expanded endonasal endoscopic approach. Design  The time trade-off technique was used to derive health utility scores. Setting  Healthy individuals without skull base tumors were surveyed. Main Outcome Measures  Participants reviewed and rated scenarios describing treatment (endoscopic, open, stereotactic radiation, watchful waiting), remission, recurrence, and complications associated with the management of OGMs. Results  There were 51 participants. The endoscopic approach was associated with higher utility scores compared with an open craniotomy approach (0.88 vs. 0.74; p < 0.001) and watchful waiting (0.88 vs. 0.74; p = 0.002). If recurrence occurred, revision endoscopic resection continued to have a higher utility score compared with revision open craniotomy (0.68; p = 0.008). On multivariate analysis, older individuals were more likely to opt for watchful waiting (p = 0.001), whereas participants from higher income brackets were more likely to rate stereotactic radiosurgery with higher utility scores (p = 0.017). Conclusion  The endoscopic approach was associated with higher utility scores than craniotomy for primary and revision cases. The present utilities can be used for future cost-utility analyses.
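    The time trade-off technique used here values a health state as the ratio of years in full health a respondent would accept to years lived in the state. A minimal sketch (the 8.8/10 numbers are illustrative, chosen to reproduce the reported 0.88 score):

```python
def tto_utility(years_full_health: float, years_in_state: float) -> float:
    """Time trade-off: utility = x / t, where the respondent is
    indifferent between x years in full health and t years in the state."""
    if years_in_state <= 0:
        raise ValueError("years_in_state must be positive")
    return years_full_health / years_in_state

# A respondent indifferent between 8.8 healthy years and 10 years after
# endoscopic resection values that state at 0.88.
print(tto_utility(8.8, 10.0))
```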

  2. OpenMx: An Open Source Extended Structural Equation Modeling Framework

    ERIC Educational Resources Information Center

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-01-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the "R" statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are…

  3. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards are gaining significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, i.e., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, i.e., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways.
An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide, and 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with eleven sub-categories in total, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be flanked by proofs of professional career paths and achievements, which require a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  4. Automated Gait Analysis Through Hues and Areas (AGATHA): a method to characterize the spatiotemporal pattern of rat gait

    PubMed Central

    Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.

    2016-01-01

    While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674
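    The frame-rate effect described above follows directly from sampling: an event time is only known to the nearest captured frame, so the worst-case timing error is half the frame interval. A small sketch of this reasoning (not AGATHA's own code):

```python
def quantize_event_time(t_s: float, fps: float) -> float:
    """Snap a true event time (seconds) to the nearest captured frame."""
    return round(t_s * fps) / fps

def max_timing_error_ms(fps: float) -> float:
    """Worst-case timing error: half the frame interval, in ms."""
    return 1000.0 / (2.0 * fps)

# At 1000 fps a paw-strike time is resolved to within 0.5 ms;
# at 125 fps, to within 4 ms.
print(max_timing_error_ms(1000), max_timing_error_ms(125))
```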

  5. Automated Gait Analysis Through Hues and Areas (AGATHA): A Method to Characterize the Spatiotemporal Pattern of Rat Gait.

    PubMed

    Kloefkorn, Heidi E; Pettengill, Travis R; Turner, Sara M F; Streeter, Kristi A; Gonzalez-Rothi, Elisa J; Fuller, David D; Allen, Kyle D

    2017-03-01

    While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns.

  6. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

    While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
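    To illustrate the style of function the toolbox collects, here is a sketch of one widely published Sandia relation, the module temperature model T_m = E·exp(a + b·WS) + T_a. This is not code from PV_LIB itself, and the coefficients shown are typical published open-rack values, used here as assumptions:

```python
import math

def sandia_module_temp(poa_irradiance, wind_speed, temp_air,
                       a=-3.56, b=-0.075):
    """Sandia module temperature model: T_m = E * exp(a + b*WS) + T_a.

    poa_irradiance : plane-of-array irradiance, W/m^2
    wind_speed     : wind speed, m/s
    temp_air       : ambient temperature, deg C
    a, b           : empirical coefficients (typical open-rack values,
                     assumed here for illustration)
    """
    return poa_irradiance * math.exp(a + b * wind_speed) + temp_air

# 800 W/m^2, 5 m/s wind, 20 C ambient -> module in the mid-30s C
print(round(sandia_module_temp(800.0, 5.0, 20.0), 1))
```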

  7. Biotechnological production of gluconic acid: future implications.

    PubMed

    Singh, Om V; Kumar, Raj

    2007-06-01

    Gluconic acid (GA) is a multifunctional carbonic acid regarded as a bulk chemical in the food, feed, beverage, textile, pharmaceutical, and construction industries. The favored production process is submerged fermentation by Aspergillus niger utilizing glucose as the major carbohydrate source, which achieves a product yield of 98%. However, use of GA and its derivatives is currently restricted because of high prices: about US$ 1.20-8.50/kg. Advancements in biotechnology such as screening of microorganisms, immobilization techniques, and modifications of the fermentation process for continuous fermentation, including genetic engineering programmes, could lead to cost-effective production of GA. Among alternative carbohydrate sources, sugarcane molasses, grape must (showing the highest GA yield, 95.8%), and banana must may assist in reducing the overall cost of GA production. These methodologies would open new markets and increase applications of GA.

  8. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  9. InChI in the wild: an assessment of InChIKey searching in Google

    PubMed Central

    2013-01-01

    While chemical databases can be queried using the InChI string and InChIKey (IK), the latter was designed for open-web searching. It is becoming increasingly effective for this since more sources enhance crawling of their websites by the Googlebot and consequent IK indexing. Searchers who use Google as an adjunct to database access may be less familiar with the advantages of using the IK as explored in this review. As an example, the IK for atorvastatin retrieves ~200 low-redundancy links from a Google search in 0.3 seconds. These include most major databases, with a very low false-positive rate. Results encompass less familiar but potentially useful sources and can be extended to isomer capture by using just the skeleton layer of the IK. Google Advanced Search can be used to filter large result sets. Image searching with the IK is also effective and complementary to open-web queries. Results can be particularly useful for less-common structures, as exemplified by a major metabolite of atorvastatin giving only three hits. Testing also demonstrated document-to-document and document-to-database joins via structure matching. The necessary generation of an IK from chemical names can be accomplished using open tools and resources for patents, papers, abstracts or other text sources. Active global sharing of local IK-linked information can be accomplished via surfacing in open laboratory notebooks, blogs, Twitter, figshare and other routes. While information-rich chemistry (e.g. approved drugs) can exhibit swamping and redundancy effects, the much smaller IK result sets for link-poor structures become a transformative first-pass option. IK indexing has therefore turned Google into a de facto open global chemical information hub by merging links to most significant sources, including over 50 million PubChem and ChemSpider records.
The simplicity, specificity and speed of matching make it a useful option for biologists or others less familiar with chemical searching. However, compared to rigorously maintained major databases, users need to be circumspect about the consistency of Google results and provenance of retrieved links. In addition, community engagement may be necessary to ameliorate possible future degradation of utility. PMID:23399051
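    The "skeleton layer" trick mentioned above relies on the InChIKey's fixed 27-character format: a 14-character connectivity (skeleton) block, a 10-character block covering stereochemistry and other layers, and a final protonation character. A sketch of extracting the skeleton block for an isomer-capture search (the key shown is a made-up placeholder, not a real compound):

```python
import re

# Standard InChIKey shape: 14 letters, hyphen, 10 letters, hyphen, 1 letter.
IK_PATTERN = re.compile(r"^[A-Z]{14}-[A-Z]{10}-[A-Z]$")

def skeleton_block(inchikey: str) -> str:
    """Return the first (connectivity) block of a standard InChIKey,
    suitable for isomer-capture web searches."""
    if not IK_PATTERN.match(inchikey):
        raise ValueError(f"not a standard InChIKey: {inchikey!r}")
    return inchikey.split("-")[0]

# Hypothetical key used purely for illustration:
print(skeleton_block("ABCDEFGHIJKLMN-OPQRSTUVWX-N"))
```

    Searching the web for just the 14-character block retrieves stereoisomers and other variants that share the same skeleton.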

  10. Reengineering Workflow for Curation of DICOM Datasets.

    PubMed

    Bennett, William; Smith, Kirk; Jarosz, Quasar; Nolan, Tracy; Bosch, Walter

    2018-06-15

    Reusable, publicly available data is a pillar of open science and rapid advancement of cancer imaging research. Sharing data from completed research studies not only saves the research dollars required to collect data, but also helps ensure that studies are both replicable and reproducible. The Cancer Imaging Archive (TCIA) is a global shared repository for imaging data related to cancer. Ensuring the consistency, scientific utility, and anonymity of data stored in TCIA is of utmost importance. As the rate of submission to TCIA has been increasing, both in volume and in complexity of the DICOM objects stored, the curation of collections has become a bottleneck in the acquisition of data. In order to increase the rate of curation of image sets, improve the quality of curation, and better track the provenance of changes made to submitted DICOM image sets, a custom set of tools was developed, using novel methods for the analysis of DICOM data sets. These tools are written in the programming language perl, use the open-source database PostgreSQL, make use of the perl DICOM routines in the open-source package Posda, and incorporate DICOM diagnostic tools from other open-source packages, such as dicom3tools. These tools are referred to as the "Posda Tools." The Posda Tools are open source and available via git at https://github.com/UAMS-DBMI/PosdaTools.
In this paper, we briefly describe the Posda Tools and discuss the novel methods employed by these tools to facilitate rapid analysis of DICOM data, including the following: (1) use of a database schema which is more permissive, and differently normalized, than traditional DICOM databases; (2) automatic integrity checks performed on a bulk basis; (3) revisions applied to DICOM datasets on a bulk basis, either through a web-based interface or via command-line executable perl scripts; (4) tracking of all such edits in a revision tracker, with the ability to roll them back; (5) a UI to inspect the results of such edits and verify that they are what was intended; (6) identification of DICOM Studies, Series, and SOP instances using "nicknames" which are persistent and have well-defined scope, making expression of reported DICOM errors easier to manage; and (7) rapid identification of potential duplicate DICOM datasets by pixel data; this can be used, e.g., to identify submission subjects which may relate to the same individual, without identifying the individual.
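    Item (7), duplicate detection by pixel data, can be approximated by hashing only the raw pixel bytes rather than whole files, so that differing headers cannot hide a duplicate image. A minimal sketch (in Python rather than the Posda Tools' perl; in practice the pixel bytes would come from each file's PixelData element, e.g. via pydicom):

```python
import hashlib
from collections import defaultdict

def pixel_fingerprint(pixel_bytes: bytes) -> str:
    """Hash only the pixel data, ignoring headers/metadata."""
    return hashlib.sha256(pixel_bytes).hexdigest()

def find_duplicates(datasets):
    """Group dataset names whose pixel data are byte-identical.

    datasets: mapping of name -> raw pixel bytes.
    Returns groups with more than one member.
    """
    groups = defaultdict(list)
    for name, px in datasets.items():
        groups[pixel_fingerprint(px)].append(name)
    return [sorted(g) for g in groups.values() if len(g) > 1]

# Illustrative data: two submissions share identical pixels.
sample = {"subj1/img1": b"\x00\x01\x02",
          "subj2/img9": b"\x00\x01\x02",
          "subj3/img4": b"\xff\xfe"}
print(find_duplicates(sample))
```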

  11. Application of uniaxial confining-core clamp with hydrous pyrolysis in petrophysical and geochemical studies of source rocks at various thermal maturities

    USGS Publications Warehouse

    Lewan, Michael D.; Birdwell, Justin E.; Baez, Luis; Beeney, Ken; Sonnenberg, Steve

    2013-01-01

    Understanding changes in petrophysical and geochemical parameters during source rock thermal maturation is a critical component in evaluating source-rock petroleum accumulations. Natural core data are preferred, but obtaining cores that represent the same facies of a source rock at different thermal maturities is seldom possible. An alternative approach is to induce thermal maturity changes by laboratory pyrolysis on aliquots of a source-rock sample of a given facies of interest. Hydrous pyrolysis is an effective way to induce thermal maturity on source-rock cores and provide expelled oils that are similar in composition to natural crude oils. However, net-volume increases during bitumen and oil generation result in expanded cores due to opening of bedding-plane partings. Although meaningful geochemical measurements on expanded, recovered cores are possible, such cores are not suitable for measuring petrophysical properties relevant to natural subsurface cores. This problem created during hydrous pyrolysis is alleviated by using a stainless steel uniaxial confinement clamp on rock cores cut perpendicular to bedding fabric. The clamp prevents expansion just as overburden does during natural petroleum formation in the subsurface. As a result, intact cores can be recovered at various thermal maturities for the measurement of petrophysical properties as well as for geochemical analyses. This approach has been applied to 1.7-inch diameter cores taken perpendicular to the bedding fabric of a 2.3- to 2.4-inch thick slab of Mahogany oil shale from the Eocene Green River Formation. Cores were subjected to hydrous pyrolysis at 360 °C for 72 h, which represents near maximum oil generation. One core was heated unconfined and the other was heated in the uniaxial confinement clamp. The unconfined core developed open tensile fractures parallel to the bedding fabric that resulted in a 38% vertical expansion of the core.
These open fractures did not occur in the confined core, but short, discontinuous vertical fractures on the core periphery occurred as a result of lateral expansion.

  12. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses Monte-Carlo ray-tracing methodology.

  13. When Free Isn't Free: The Realities of Running Open Source in School

    ERIC Educational Resources Information Center

    Derringer, Pam

    2009-01-01

    Despite the last few years' growth in awareness of open-source software in schools and the potential savings it represents, its widespread adoption is still hampered. Randy Orwin, technology director of the Bainbridge Island School District in Washington State and a strong open-source advocate, cautions that installing an open-source…

  14. Evaluation of a Framework to Implement Electronic Health Record Systems Based on the openEHR Standard

    NASA Astrophysics Data System (ADS)

    Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.

    2016-04-01

    The production of clinical information about each patient is constantly increasing, and it is noteworthy that the information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate, and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality, and safety of medical care delivery. However, in developing countries the utilization of health information technology is insufficient and lacks standards, among other problems. In the present work we evaluate the framework EHRGen, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools provided for final users, that is, without the intervention of computer experts. It makes it easier to adopt the openEHR ideas and provides an open source basis with a set of services, although some limitations in its current state work against interoperability and usability. However, despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, so it should be supported by academic and administrative institutions.

  15. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
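    The kind of translation OMPC automates can be seen in a hand-written example. What OMPC actually emits will differ; this only illustrates the MATLAB-to-NumPy correspondence, including the 1-based/0-based index shift:

```python
# MATLAB source being translated:
#   A = zeros(3);
#   A(2,2) = 5;
#   s = sum(A(:));
import numpy as np

A = np.zeros((3, 3))   # zeros(3) builds a 3x3 matrix in MATLAB
A[1, 1] = 5            # MATLAB indices are 1-based; NumPy's are 0-based
s = A.sum()            # sum(A(:)) sums the flattened matrix
print(s)
```

    OMPC performs this kind of syntax adaptation automatically and emulates MATLAB semantics (such as 1-based indexing) at runtime rather than rewriting indices by hand.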

  16. Open Source Vision

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Increasingly, colleges and universities are turning to open source as a way to meet their technology infrastructure and application needs. Open source has changed life for visionary CIOs and their campus communities nationwide. The author discusses what these technologists see as the benefits--and the considerations.

  17. Quality Test of Flexible Flat Cable (FFC) with Short Open Test Using an Ohm's Law Approach through Embedded Fuzzy Logic Based on an Open Source Arduino Data Logger

    NASA Astrophysics Data System (ADS)

    Rohmanu, Ajar; Everhard, Yan

    2017-04-01

    Technological development, especially in the field of electronics, is very fast. One development in electronics hardware is the Flexible Flat Cable (FFC), which serves as the connection between the main board and other hardware parts. Production of Flexible Flat Cable (FFC) goes through a process of testing and measuring FFC quality. Currently, testing and measurement are still done manually, with an operator observing a Light Emitting Diode (LED), which causes many problems. This study develops a computational quality test for Flexible Flat Cable (FFC) utilizing an open-source embedded system. The method is measurement with the Short Open Test, using a 4-wire (Kelvin) Ohm's Law approach and fuzzy logic as the decision maker for measurement results, based on an open-source Arduino data logger. The system uses an INA219 current sensor to read the voltage value, from which the resistance value of the Flexible Flat Cable (FFC) is obtained. To validate the system, black-box testing was performed, along with accuracy and precision testing using the standard deviation method. Testing the system with three sample models yielded standard deviations of 1.921 for the first model, 4.567 for the second, and 6.300 for the third, with Standard Error of the Mean (SEM) values of 0.304, 0.736, and 0.996, respectively. The average tolerances of the measured resistance values were -3.50% for the first model, 4.45% for the second, and 5.18% for the third, within the standard resistance tolerance, and productivity improved to 118.33%. These results are expected to improve quality and productivity in the Flexible Flat Cable (FFC) testing process.
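    The measurement chain described, force a known current, sense the voltage across only the FFC conductor, divide per Ohm's law, then summarize repeated readings with standard deviation and SEM, can be sketched as follows (the readings are illustrative values, not data from the paper):

```python
import math
import statistics

def four_wire_resistance(v_sense: float, i_force: float) -> float:
    """Kelvin (4-wire) measurement: R = V / I. The sense pair carries
    essentially no current, so lead resistance drops out of the result."""
    return v_sense / i_force

def standard_error_of_mean(samples) -> float:
    """SEM = sample standard deviation / sqrt(n)."""
    return statistics.stdev(samples) / math.sqrt(len(samples))

# Five repeated readings of one FFC conductor at 10 mA (illustrative):
readings = [four_wire_resistance(v, 0.010)
            for v in (0.0101, 0.0100, 0.0102, 0.0099, 0.0100)]
print(round(statistics.mean(readings), 3),
      round(standard_error_of_mean(readings), 4))
```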

  18. 76 FR 34634 - Federal Acquisition Regulation; Prioritizing Sources of Supplies and Services for Use by the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... contracts before commercial sources in the open market. The proposed rule amends FAR 8.002 as follows: The... requirements for supplies and services from commercial sources in the open market. The proposed FAR 8.004 would... subpart 8.6). (b) Commercial sources (including educational and non-profit institutions) in the open...

  19. A feasibility study to determine if there is a market for automatic meter-reading devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hilberg, G.R.

    1996-08-01

    For many utilities the cost of manually reading meters is increasing due to personnel expenses and equipment costs. The current system of manual meters provides little ability for the utility to reduce costs. To reduce meter reading costs the utility must automate the manual system and reduce personnel expenses. A water utility in San Diego county was studied to calculate the cost of reading individual water meters. This would allow for the selective replacement of "high-cost" meters to quickly reduce meter-reading costs while limiting the necessary capital investments. As the "high-cost" meters are selectively replaced, a utility with a significant difference in individual meter reading costs could save three to five dollars per meter per year. This study showed that the "high-cost" meters were six times more expensive to read than the average meter. Additionally, AMR systems increase the information available to consumers and to the utility on usage patterns and problems. The challenge was to cost-effectively identify the "high-cost" meters. The costs to collect these data were less than $500.

  20. GIS Tool for Real-time Decision Making and Analysis of Multidisciplinary Cryosphere Datasets.

    NASA Astrophysics Data System (ADS)

    Roberts, S. D.; Moore, J. A.

    2004-12-01

    In support of the Western Arctic Shelf-Basin Interaction (SBI) Project, a web-based interactive mapping server was installed on the USCGC Healy's on-board science computer network during its 2004 spring (HLY-04-02) and summer (HLY-04-03) cruises in the Chukchi and Beaufort Seas. SBI is a National Science Foundation sponsored multi-year, multidisciplinary project studying biological productivity in the region. The mapping server was developed by the UCAR Joint Office of Science Support (JOSS) using open-source GIS tools (University of Minnesota MapServer and USGS MapSurfer). Additional open-source tools such as GMT and MB-System were also utilized. The key layers in this system are the current ship track, station locations, multibeam bottom bathymetry, IBCAO bathymetry, DMSP satellite imagery, NOAA AVHRR sea surface temperature, and all past SBI Project ship tracks and station locations. The ship track and multibeam layers are updated in real time, and the satellite layers are updated daily during clear weather. In addition to the current high-resolution multibeam bathymetry data, a composite high-resolution bathymetry layer was created using multibeam data from past cruises in the SBI region. The server provides click-and-drag zooms, pan, feature query, distance measurement, and lat/lon/depth queries on a polar-projection map of the Arctic Ocean. The main use of the system on the ship was cruise track and station position planning by the scientists, utilizing all available historical data and high-resolution bathymetry. It was also the main source of information for all scientists on board as to the cruise progress and plans, and it permitted on-board scientists to integrate historical cruise information for comparative purposes. A mirror web site was set up on land, and the current ship track and station information was copied to this site once a day via a satellite link so that people interested in SBI research could follow the cruise progress.

  1. Enabling laboratory EUV research with a compact exposure tool

    NASA Astrophysics Data System (ADS)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small-batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and for evaluation of pre- and post-exposure processing. The tool follows the Talbot lithography approach: it utilizes a partially coherent discharge-produced plasma (DPP) source and reduces the remaining critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between the components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron-beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that only a 1:1 copy of the mask structure can be fully resolved with amplitude masks, whereas the utilized phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
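The Talbot self-imaging behind the grating can be estimated from the classic paraxial relation z_T = 2d²/λ; a minimal sketch, with an illustrative grating period and the 13.5 nm EUV wavelength (the achromatic Talbot regime used with broadband sources forms a stationary pattern beyond this distance, which this sketch does not model):

```python
def talbot_length(period_m: float, wavelength_m: float) -> float:
    """Classic (paraxial) Talbot self-imaging distance: z_T = 2 * d^2 / lambda."""
    return 2.0 * period_m ** 2 / wavelength_m

# Illustrative values: 100 nm grating period, 13.5 nm EUV wavelength.
z_t = talbot_length(100e-9, 13.5e-9)
print(f"Talbot length: {z_t * 1e6:.2f} um")  # Talbot length: 1.48 um
```

The quadratic dependence on the period is why sub-100 nm gratings put the Talbot planes only micrometers behind the mask, which drives the tight wafer-grating positioning requirements described above.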

  2. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    PubMed

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  3. Real-time Adaptive EEG Source Separation using Online Recursive Independent Component Analysis

    PubMed Central

    Hsu, Sheng-Hsiou; Mullen, Tim; Jung, Tzyy-Ping; Cauwenberghs, Gert

    2016-01-01

    Independent Component Analysis (ICA) has been widely applied to electroencephalographic (EEG) biosignal processing and brain-computer interfaces. The practical use of ICA, however, is limited by its computational complexity, data requirements for convergence, and assumption of data stationarity, especially for high-density data. Here we study and validate an optimized online recursive ICA algorithm (ORICA) with online recursive least squares (RLS) whitening for blind source separation of high-density EEG data, which offers instantaneous incremental convergence upon presentation of new data. Empirical results of this study demonstrate the algorithm's: (a) suitability for accurate and efficient source identification in high-density (64-channel) realistically-simulated EEG data; (b) capability to detect and adapt to non-stationarity in 64-ch simulated EEG data; and (c) utility for rapidly extracting principal brain and artifact sources in real 61-channel EEG data recorded by a dry and wearable EEG system in a cognitive experiment. ORICA was implemented as functions in BCILAB and EEGLAB and was integrated in an open-source Real-time EEG Source-mapping Toolbox (REST), supporting applications in ICA-based online artifact rejection, feature extraction for real-time biosignal monitoring in clinical environments, and adaptable classifications in brain-computer interfaces. PMID:26685257

  4. Integrated Practice Improvement Solutions-Practical Steps to Operating Room Management.

    PubMed

    Chernov, Mikhail; Pullockaran, Janet; Vick, Angela; Leyvi, Galina; Delphin, Ellise

    2016-10-01

Perioperative productivity is a vital concern for surgeons, anesthesiologists, and administrators, as the OR is a major source of hospital elective admissions and revenue. Based on elements of existing Practice Improvement Methodologies (PIMs), "Integrated Practice Improvement Solutions" (IPIS) is a practical and simple solution incorporating aspects of multiple management approaches into a single open source framework to increase OR efficiency and productivity through better utilization of existing resources. OR efficiency was measured both before and after IPIS implementation using the total number of cases versus room utilization, OR/anesthesia revenue, and staff overtime (OT) costs. Other parameters of efficiency, such as first-case on-time starts and turnover time (TOT), were measured in parallel. IPIS implementation resulted in a 10.7% average increase in the number of surgical procedures performed, and OR and anesthesia revenue increases of 18.5% and 6.9%, respectively, with a simultaneous decrease in TOT (15%) and in OT for anesthesia staff (26%). The number of perioperative adverse events was stable during the two-year study period, which involved a total of 20,378 patients. IPIS, an effective and flexible practice improvement model, was designed to quickly, significantly, and sustainably improve OR efficiency by better utilization of existing resources. The success of its implementation correlates directly with the involvement and acceptance of the entire OR team and hospital administration.

  5. The choice of primary energy source including PV installation for providing electric energy to a public utility building - a case study

    NASA Astrophysics Data System (ADS)

    Radomski, Bartosz; Ćwiek, Barbara; Mróz, Tomasz M.

    2017-11-01

The paper presents a multicriteria decision aid analysis of the choice of a PV installation providing electric energy to a public utility building. From the energy management point of view, electricity obtained from solar radiation has become a crucial renewable energy source. The application of PV installations may prove a profitable solution from the energy, economic, and ecological points of view for both existing and newly erected buildings. The featured variants of PV installations have been assessed by multicriteria analysis based on the ANP (Analytic Network Process) method. Technical, economic, energy, and environmental criteria have been identified as the main decision criteria. The defined set of decision criteria has an open character and can be modified in the dialog between the decision-maker and the expert, in the present case an expert in planning the development of energy supply systems. The proposed approach has been used to evaluate three variants of PV installation acceptable for an existing educational building located in Poznań, Poland: the building of the Faculty of Chemical Technology, Poznań University of Technology. Multicriteria analysis based on the ANP method and the Super Decisions calculation software has proven to be an effective tool for energy planning, leading to the indication of the recommended variant of PV installation in existing and newly erected public buildings. The achieved results show the prospects and possibilities of rational renewable energy usage as a complex solution for public utility buildings.
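ANP builds on the pairwise-comparison machinery of AHP. As a hedged illustration of how a priority vector can be derived from one comparison matrix (the matrix values below are invented, not taken from the study, and the geometric-mean method shown is one of several accepted extraction methods, not necessarily the one Super Decisions uses):

```python
def priority_weights(matrix):
    """Priority vector from a pairwise-comparison matrix via the geometric-mean method."""
    n = len(matrix)
    gms = []
    for row in matrix:
        prod = 1.0
        for v in row:
            prod *= v
        gms.append(prod ** (1.0 / n))  # geometric mean of each row
    total = sum(gms)
    return [g / total for g in gms]   # normalize so weights sum to 1

# Hypothetical 3x3 comparison of criteria (e.g. economic vs. energy vs. environmental):
m = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
w = priority_weights(m)
print([round(x, 3) for x in w])  # [0.637, 0.258, 0.105]
```

The full ANP replaces this single matrix with a supermatrix capturing inter-criteria dependencies, but each local priority vector is obtained from a comparison matrix in essentially this way.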

  6. Open-source telemedicine platform for wireless medical video communication.

    PubMed

    Panayides, A; Eleftheriou, I; Pantziaris, M

    2013-01-01

An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform allowing reliable remote diagnosis in m-health applications such as emergency incidents, mass population screening, and medical education. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower QCIF and CIF resolutions, at different bitrates and four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm, which removes temporal mismatch between the original and received videos. Clinical evaluation is based on an atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality is achieved for wireless medical video communications using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while utilization of the VFD algorithm bridges objective and subjective ratings.
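The PSNR ratings mentioned above follow the standard definition 10·log10(MAX²/MSE) over corresponding pixels of the original and received frames; a minimal sketch on toy pixel values (real evaluation operates frame-by-frame on full video after VFD alignment):

```python
import math

def psnr(original, received, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, received)) / len(original)
    if mse == 0:
        return float("inf")  # identical signals
    return 10.0 * math.log10(max_val ** 2 / mse)

orig = [100, 120, 130, 140]
recv = [101, 119, 131, 139]  # every pixel off by 1, so MSE = 1
print(round(psnr(orig, recv), 2))  # 48.13
```

The VFD calibration matters precisely because PSNR compares frames pairwise: a one-frame temporal offset would pair mismatched frames and grossly understate quality.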

  7. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as coronal mass ejections and high-speed solar wind streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools for space physics post-processing. Building on the work of the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets, along with automated tools for data comparison against NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any further tools the scientific community deems necessary. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  8. ERDDAP: Reducing Data Friction with an Open Source Data Platform

    NASA Astrophysics Data System (ADS)

    O'Brien, K.

    2017-12-01

Data friction is not just an issue facing interdisciplinary research. Often, significant data friction exists even within disciplines: differing formats, limited metadata, and non-existent machine-to-machine data access all make successful interdisciplinary cooperation that much harder. Reducing data friction within disciplines is therefore a crucial first step toward better overall collaboration. ERDDAP, an open source data platform developed at NOAA's Southwest Fisheries Science Center, is well poised to improve data usability and understanding and to reduce data friction in both single- and multi-disciplinary research. By virtue of its ability to integrate data of varying formats and to provide RESTful user access to data and metadata, use of ERDDAP has grown substantially throughout the ocean data community. ERDDAP also supports standards such as the DAP data protocol, the Climate and Forecast (CF) metadata conventions, and the BagIt standard for data archival. In this presentation, we will discuss the advantages of using ERDDAP as a data platform. We will also show specific use cases where utilizing ERDDAP has reduced friction within a single discipline (physical oceanography) and improved interdisciplinary collaboration as well.
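ERDDAP's RESTful access works by encoding the dataset, the requested variables, and any constraints directly in the request URL; a sketch of the tabledap URL pattern (the server and dataset names are hypothetical, and a production client would percent-encode the constraint operators before issuing the request):

```python
def erddap_tabledap_url(base, dataset_id, variables, constraints=(), filetype="csv"):
    """Build an ERDDAP tabledap query URL following the documented RESTful pattern."""
    query = ",".join(variables)          # comma-separated variable list
    for c in constraints:                # each constraint appended with '&'
        query += "&" + c
    return f"{base}/tabledap/{dataset_id}.{filetype}?{query}"

# Hypothetical server and dataset ID:
url = erddap_tabledap_url(
    "https://example.org/erddap",
    "myDataset",
    ["time", "latitude", "longitude", "sea_water_temperature"],
    ["time>=2017-01-01T00:00:00Z"],
)
print(url)
```

Swapping the `.csv` suffix for `.nc`, `.json`, or `.htmlTable` changes only the response format, which is much of how ERDDAP reduces format friction for its users.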

  9. PharmacoGx: an R package for analysis of large pharmacogenomic datasets.

    PubMed

    Smirnov, Petr; Safikhani, Zhaleh; El-Hachem, Nehme; Wang, Dong; She, Adrian; Olsen, Catharina; Freeman, Mark; Selby, Heather; Gendoo, Deena M A; Grossmann, Patrick; Beck, Andrew H; Aerts, Hugo J W L; Lupien, Mathieu; Goldenberg, Anna; Haibe-Kains, Benjamin

    2016-04-15

Pharmacogenomics holds great promise for the development of biomarkers of drug response and the design of new therapeutic options, which are key challenges in precision medicine. However, such data are scattered and lack standards for efficient access and analysis, consequently preventing the realization of the full potential of pharmacogenomics. To address these issues, we implemented PharmacoGx, an easy-to-use, open source package for integrative analysis of multiple pharmacogenomic datasets. We demonstrate the utility of our package in comparing large drug sensitivity datasets, such as the Genomics of Drug Sensitivity in Cancer and the Cancer Cell Line Encyclopedia. Moreover, we show how to use our package to easily perform Connectivity Map analysis. With increasing availability of drug-related data, our package will open new avenues of research for meta-analysis of pharmacogenomic data. PharmacoGx is implemented in R and can be easily installed on any system. The package is available from CRAN and its source code is available from GitHub. Contact: bhaibeka@uhnresearch.ca or benjamin.haibe.kains@utoronto.ca. Supplementary data are available at Bioinformatics online.
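Comparing large drug-sensitivity datasets largely reduces to rank-based concordance of sensitivity measures between studies. A dependency-free sketch of Spearman correlation on invented sensitivity values (PharmacoGx itself is an R package; this fragment illustrates only the statistic, not the package API):

```python
def ranks(values):
    """Average 1-based ranks, with ties assigned the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical sensitivity values for the same cell lines in two datasets:
study_a = [0.10, 0.35, 0.50, 0.80, 0.90]
study_b = [0.15, 0.30, 0.55, 0.70, 0.95]
print(round(spearman(study_a, study_b), 3))  # 1.0 (perfectly concordant ranking)
```

Rank-based measures are the usual choice here because drug-sensitivity summaries from different labs are on different effective scales, so only the ordering of cell lines is directly comparable.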

  10. Open-Source Telemedicine Platform for Wireless Medical Video Communication

    PubMed Central

    Panayides, A.; Eleftheriou, I.; Pantziaris, M.

    2013-01-01

An m-health system for real-time wireless communication of medical video based on open-source software is presented. The objective is to deliver a low-cost telemedicine platform allowing reliable remote diagnosis in m-health applications such as emergency incidents, mass population screening, and medical education. The performance of the proposed system is demonstrated using five atherosclerotic plaque ultrasound videos. The videos are encoded at the clinically acquired resolution, in addition to lower QCIF and CIF resolutions, at different bitrates and four different encoding structures. Commercially available wireless local area network (WLAN) and 3.5G high-speed packet access (HSPA) wireless channels are used to validate the developed platform. Objective video quality assessment is based on PSNR ratings, following calibration using the variable frame delay (VFD) algorithm, which removes temporal mismatch between the original and received videos. Clinical evaluation is based on an atherosclerotic plaque ultrasound video assessment protocol. Experimental results show that adequate diagnostic quality is achieved for wireless medical video communications using the designed telemedicine platform. HSPA cellular networks provide for ultrasound video transmission at the acquired resolution, while utilization of the VFD algorithm bridges objective and subjective ratings. PMID:23573082

  11. The development of experimental techniques for the study of helicopter rotor noise

    NASA Technical Reports Server (NTRS)

    Widnall, S. E.; Harris, W. L.; Lee, Y. C. A.; Drees, H. M.

    1974-01-01

The features of existing wind tunnels involved in noise studies are discussed. The acoustic characteristics of the MIT low-noise open-jet wind tunnel are obtained by employing two calibration techniques: one is to measure the decay of sound pressure with distance in the far field; the other is to use a calibrated speaker as a sound source. The sound pressure level versus frequency was obtained in the wind tunnel chamber and compared with the corresponding calibrated values. Fiberglas board-block units were installed on the chamber interior. The free field was increased significantly after this treatment, and the chamber cut-off frequency was reduced to 160 Hz from the originally designed 250 Hz. The flow field characteristics of the rotor-tunnel configuration were studied using flow visualization techniques. The influence of the open-jet shear layer on sound transmission was studied using an Aeolian tone as the sound source. A dynamometer system was designed to measure the steady and low harmonics of the rotor thrust. A theoretical Mach number scaling formula was developed to scale the rotational noise and blade slap noise data of model rotors to full-scale helicopter rotors.

  12. Early Detection of Apathetic Phenotypes in Huntington's Disease Knock-in Mice Using Open Source Tools.

    PubMed

    Minnig, Shawn; Bragg, Robert M; Tiwana, Hardeep S; Solem, Wes T; Hovander, William S; Vik, Eva-Mari S; Hamilton, Madeline; Legg, Samuel R W; Shuttleworth, Dominic D; Coffey, Sydney R; Cantle, Jeffrey P; Carroll, Jeffrey B

    2018-02-02

Apathy is one of the most prevalent and progressive psychiatric symptoms in Huntington's disease (HD) patients. However, preclinical work in HD mouse models tends to focus on molecular and motor, rather than affective, phenotypes. Measuring behavior in mice often produces noisy data and requires large cohorts to detect phenotypic rescue with appropriate power. The operant equipment necessary for measuring affective phenotypes is typically expensive, proprietary to commercial entities, and bulky, which can render adequately sized mouse cohorts cost-prohibitive. Thus, we describe here a home-built, open-source alternative to commercial hardware that is reliable, scalable, and reproducible. Using off-the-shelf hardware, we adapted and built several rodent operant buckets (ROBuckets) to test Htt Q111/+ mice for attention deficits in fixed ratio (FR) and progressive ratio (PR) tasks. We find that, despite normal reward attainment in the FR task, Htt Q111/+ mice exhibit reduced PR performance at 9-11 months of age, suggesting motivational deficits. We replicated this finding in two independent cohorts, demonstrating the reliability and utility of both the apathetic phenotype and the ROBuckets for preclinical HD studies.
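In a progressive ratio task the response requirement grows with each earned reward, and motivation is commonly summarized by the "breakpoint": the largest ratio the animal completes. A toy sketch (the linear schedule below is illustrative only; the actual ROBucket schedule may differ):

```python
def pr_requirements(n, step=4):
    """Hypothetical linear progressive-ratio schedule: 1, 5, 9, 13, ... presses."""
    return [1 + step * i for i in range(n)]

def breakpoint_ratio(presses_available, requirements):
    """Largest response requirement completed before responding is exhausted."""
    completed = 0
    remaining = presses_available
    for req in requirements:
        if remaining < req:
            break          # animal quits before finishing this ratio
        remaining -= req
        completed = req
    return completed

reqs = pr_requirements(10)         # [1, 5, 9, 13, 17, ...]
print(breakpoint_ratio(30, reqs))  # 13 (completes 1+5+9+13 = 28 presses)
```

A motivational deficit like the one reported for Htt Q111/+ mice shows up as a lower breakpoint despite intact FR performance, since the FR task never raises the requirement.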

  13. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

R is an open source (GPL) language and environment for spatial analysis, statistical computing, and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible. It plays an important role in spatial analysis in the open source environment. Implementing spatial analysis in the open source environment, which we call open source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the open source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the open source GIS environment, evaluating the spatial correlation of land price and estimating it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other open source software cooperate in a WebGIS environment, demonstrating the advantages of using R for spatial analysis in an open source GIS environment. In the end, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages or design normative ones, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
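Kriging requires fitting a variogram model to the sampled land prices before interpolating. As a far simpler stand-in that still conveys the spatial-interpolation step, inverse-distance weighting can be sketched in a few lines (sample coordinates and prices are invented; this is not the Kriging estimator used in the project):

```python
def idw(points, target, power=2.0):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) samples."""
    num = den = 0.0
    tx, ty = target
    for x, y, v in points:
        d2 = (x - tx) ** 2 + (y - ty) ** 2
        if d2 == 0.0:
            return v  # target coincides with a sample point
        w = d2 ** (-power / 2.0)  # weight = 1 / distance^power
        num += w * v
        den += w
    return num / den

# Hypothetical land-price samples (x, y, price):
samples = [(0, 0, 100.0), (2, 0, 100.0), (0, 2, 100.0), (2, 2, 100.0)]
print(idw(samples, (1, 1)))  # 100.0: all equidistant samples agree
```

Unlike IDW, Kriging weights the samples using the fitted variogram, so it honors the spatial correlation structure the project evaluates rather than raw distance alone.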

  14. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

Cloud computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source cloud platform intended to: a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  15. Ruthenium-Catalyzed Synthesis of Dialkoxymethane Ethers Utilizing Carbon Dioxide and Molecular Hydrogen.

    PubMed

    Thenert, Katharina; Beydoun, Kassem; Wiesenthal, Jan; Leitner, Walter; Klankermayer, Jürgen

    2016-09-26

The synthesis of dimethoxymethane (DMM) by a multistep reaction of methanol with carbon dioxide and molecular hydrogen is reported. Using the molecular catalyst [Ru(triphos)(tmm)] in combination with the Lewis acid Al(OTf)3 resulted in a versatile catalytic system for the synthesis of various dialkoxymethane ethers. This new catalytic reaction provides the first synthetic example for the selective conversion of carbon dioxide and hydrogen into a formaldehyde oxidation level, thus opening access to new molecular structures using this important C1 source.

  16. Calibrated Blade-Element/Momentum Theory Aerodynamic Model of the MARIN Stock Wind Turbine: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goupee, A.; Kimball, R.; de Ridder, E. J.

    2015-04-02

In this paper, a calibrated blade-element/momentum theory aerodynamic model of the MARIN stock wind turbine is developed and documented. The model is created using open-source software and calibrated to closely emulate experimental data obtained by the DeepCwind Consortium using a genetic algorithm optimization routine. The provided model will be useful for those interested in validating floating wind turbine numerical simulators that rely on experiments utilizing the MARIN stock wind turbine, for example, the International Energy Agency Wind Task 30's Offshore Code Comparison Collaboration Continued, with Correlation project.
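The calibration loop pairs a forward model with a genetic algorithm that searches for parameters minimizing the model-experiment mismatch. A toy sketch with a one-parameter "aerodynamic" model and synthetic measurements (the real routine tunes blade-element/momentum coefficients against DeepCwind test data; everything below is invented for illustration):

```python
import random

def simulate(c, winds):
    """Toy forward model: thrust proportional to wind speed squared."""
    return [c * w * w for w in winds]

def fitness(c, winds, measured):
    """Negative sum-of-squares mismatch (higher is better)."""
    return -sum((s - m) ** 2 for s, m in zip(simulate(c, winds), measured))

def calibrate(winds, measured, pop_size=20, gens=40, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 2.0) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda c: fitness(c, winds, measured), reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection: keep the top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            children.append(0.5 * (a + b) + rng.gauss(0.0, 0.05))  # crossover + mutation
        pop = parents + children
    return max(pop, key=lambda c: fitness(c, winds, measured))

winds = [5.0, 8.0, 11.0]
measured = simulate(0.75, winds)   # synthetic "experiment" generated with c = 0.75
best = calibrate(winds, measured)
print(round(best, 2))  # close to 0.75
```

The real problem is multi-parameter and the objective is built from measured rotor loads, but the select-recombine-mutate structure is the same.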

  17. WE-D-9A-06: Open Source Monitor Calibration and Quality Control Software for Enterprise Display Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevins, N; Vanderhoek, M; Lang, S

    2014-06-15

Purpose: Medical display monitor calibration and quality control present challenges to medical physicists. The purpose of this work is to demonstrate and share experiences with an open source package that allows for both initial monitor setup and routine performance evaluation. Methods: A software package, pacsDisplay, has been developed over the last decade to aid in the calibration of all monitors within the radiology group in our health system. The software is used to calibrate monitors to follow the DICOM Grayscale Standard Display Function (GSDF) via lookup tables installed on the workstation. Additional functionality facilitates periodic evaluations of both primary and secondary medical monitors to ensure satisfactory performance. This software is installed on all radiology workstations, and can also be run as a stand-alone tool from a USB disk. Recently, a database has been developed to store and centralize the monitor performance data and to provide long-term trends for compliance with internal standards and various accrediting organizations. Results: Implementation and utilization of pacsDisplay has resulted in improved monitor performance across the health system. Monitor testing is now performed at regular intervals and the software is being used across multiple imaging modalities. Monitor performance characteristics such as maximum and minimum luminance, ambient luminance and illuminance, color tracking, and GSDF conformity are loaded into a centralized database for system performance comparisons. Compliance reports for organizations such as MQSA, ACR, and TJC are generated automatically and stored in the same database. Conclusion: An open source software solution has simplified and improved the standardization of displays within our health system. This work serves as an example method for calibrating and testing monitors within an enterprise health system.
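GSDF calibration maps each just-noticeable-difference (JND) index j in [1, 1023] to a target luminance via the rational-polynomial fit published in DICOM PS3.14; a sketch of the forward function that underlies the lookup tables such tools build (coefficients as given in the standard; this is an illustration, not pacsDisplay's implementation):

```python
import math

# DICOM PS3.14 GSDF coefficients for log10(L) as a rational polynomial in ln(j).
_A, _B, _C = -1.3011877, -2.5840191e-2, 8.0242636e-2
_D, _E, _F = -1.0320229e-1, 1.3646699e-1, 2.8745620e-2
_G, _H, _K = -2.5468404e-2, -3.1978977e-3, 1.2992634e-4
_M = 1.3635334e-3

def gsdf_luminance(j: float) -> float:
    """Target luminance (cd/m^2) for JND index j in [1, 1023]."""
    x = math.log(j)
    num = _A + _C * x + _E * x**2 + _G * x**3 + _M * x**4
    den = 1.0 + _B * x + _D * x**2 + _F * x**3 + _H * x**4 + _K * x**5
    return 10.0 ** (num / den)

print(round(gsdf_luminance(1), 4))  # 0.05 cd/m^2 at the bottom of the JND range
```

Calibration then inverts this curve over the monitor's measured luminance range so that equal steps in digital driving level produce equal numbers of JNDs, which is what the periodic GSDF-conformity checks verify.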

  18. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    PubMed

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm² and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm²). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
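The custom CD3+% metric rests on simple thresholding and positive-pixel counting. A minimal sketch on an invented intensity patch (the published algorithms run inside ImageJ on whole-slide images with calibrated thresholds; the threshold and values below are illustrative):

```python
def positive_pixel_fraction(image, threshold):
    """Fraction of pixels at or above a stain-intensity threshold."""
    flat = [p for row in image for p in row]
    positive = sum(1 for p in flat if p >= threshold)
    return positive / len(flat)

# Hypothetical 4x4 grayscale patch of CD3 stain intensity (0-255):
patch = [
    [12, 40, 200, 210],
    [15, 35, 190, 220],
    [10, 30,  25,  20],
    [14, 45,  33,  18],
]
print(positive_pixel_fraction(patch, 128))  # 0.25 -> 4 of 16 pixels are positive
```

The companion cells/mm² metric goes one step further, grouping positive pixels into local maxima so that contiguous stained regions are counted as single cells rather than raw pixel area.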

  19. ClinicalTrials.gov as a data source for semi-automated point-of-care trial eligibility screening.

    PubMed

    Pfiffner, Pascal B; Oh, JiWon; Miller, Timothy A; Mandl, Kenneth D

    2014-01-01

Implementing semi-automated processes to efficiently match patients to clinical trials at the point of care requires both detailed patient data and authoritative information about open studies. We evaluate the utility of the ClinicalTrials.gov registry as a data source for semi-automated trial eligibility screening. Eligibility criteria and metadata for 437 trials open for recruitment in four different clinical domains were identified in ClinicalTrials.gov. Trials were evaluated for up-to-date recruitment status, and eligibility criteria were evaluated for obstacles to automated interpretation. Finally, phone or email outreach to coordinators at a subset of the trials was made to assess the accuracy of contact details and recruitment status. 24% (104 of 437) of trials declaring an open recruitment status list a study completion date in the past, indicating out-of-date records. Substantial barriers to automated eligibility interpretation of free-form text are present in 81% to 94% of all trials. We were unable to contact coordinators at 31% (45 of 146) of the trials in the subset, either by phone or by email. Only 53% (74 of 146) would confirm that they were still recruiting patients. Because ClinicalTrials.gov has entries on most US and many international trials, the registry could be repurposed as a comprehensive trial-matching data source. Semi-automated point-of-care recruitment would be facilitated by matching the registry's eligibility criteria against clinical data from electronic health records, but the current entries fall short. Ultimately, improved techniques in natural language processing will facilitate semi-automated complex matching. As immediate next steps, we recommend augmenting ClinicalTrials.gov data entry forms to capture key eligibility criteria in a simple, structured format.
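The out-of-date-record check reduces to comparing each trial's declared recruitment status against its listed completion date. A sketch with invented records (the field names and dictionary layout are illustrative, not the actual ClinicalTrials.gov schema):

```python
from datetime import date

def stale_records(trials, today):
    """IDs of records still marked 'Recruiting' whose completion date has passed."""
    return [t["nct_id"] for t in trials
            if t["status"] == "Recruiting" and t["completion_date"] < today]

# Hypothetical registry entries:
trials = [
    {"nct_id": "NCT00000001", "status": "Recruiting", "completion_date": date(2013, 6, 1)},
    {"nct_id": "NCT00000002", "status": "Recruiting", "completion_date": date(2015, 1, 1)},
    {"nct_id": "NCT00000003", "status": "Completed",  "completion_date": date(2012, 3, 1)},
]
print(stale_records(trials, date(2014, 1, 1)))  # ['NCT00000001']
```

The metadata check is the easy half; interpreting free-form eligibility criteria is where the 81% to 94% barrier figures above come from, and why the authors call for structured criteria capture.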

  20. Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia

    NASA Astrophysics Data System (ADS)

    Swasti Kanthi, Nurin; Hery Purwanto, Taufik

    2016-11-01

Geospatial information is an important thing in this era, because location information is needed to know the condition of a region. In 2015 the Indonesian government released detailed village-level mapping and its parent maps, with Indonesian state regulatory standards set forth in the Norms, Standards, Procedures and Criteria for Village Mapping (NSPK). Over time, Web and mobile GIS have developed into a wide range of applications, yet the combination of detailed mapping and Web GIS is still rarely used and not used optimally. OpenStreetMap (OSM) is a Web GIS that can also be utilized as a mobile GIS, provides information down to the level of individual buildings, and can be used for village mapping. Village mapping using OSM was conducted with a remote sensing and Geographical Information Systems (GIS) approach, interpreting the remote sensing imagery available in OSM. The study analyzed how far OSM can support village mapping; this was done by entering house number data, administrative boundaries, public facilities, and land use into OSM, with reference data and Village Plan imagery. The resulting village maps in OSM serve as reference maps and were analyzed for conformance with the NSPK for detailed mapping; the Rukun Warga (RW) level is part of the village mapping. The use of OSM greatly assists detailed mapping of a region, with imagery as the data source and open access, but the data source still requires maintenance and updating to preserve the validity of the data.

  1. [The use of open source software in graphic anatomic reconstructions and in biomechanic simulations].

    PubMed

    Ciobanu, O

    2009-01-01

The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of currently available open source software for 3D reconstruction and biomechanical simulation. The use of open source software may improve the return on investments in imaging and CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.

  2. 12. INTERIOR OF KITCHEN/UTILITY AREA SHOWING OPEN FOURPANEL WOOD DOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. INTERIOR OF KITCHEN/UTILITY AREA SHOWING OPEN FOUR-PANEL WOOD DOOR TO SOUTH BEDROOM AT PHOTO CENTER RIGHT, OPEN DOORWAY TO LIVING ROOM AT PHOTO CENTER LEFT, AND BUILT-IN CABINETS AND CEILING VENT BETWEEN THE DOORS AND AROUND THE STOVE/RANGE POSITION. VIEW TO NORTHEAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA

  3. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699

  4. Rapid development of medical imaging tools with open-source libraries.

    PubMed

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  5. Cyberscience and the Knowledge-Based Economy. Open Access and Trade Publishing: From Contradiction to Compatibility with Non-Exclusive Copyright Licensing

    ERIC Educational Resources Information Center

    Armbruster, Chris

    2008-01-01

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge…

  6. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  7. Synchrotron x-ray diffraction studies of the structural properties of electrode materials in operating battery cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurston, T.R.; Jisrawi, N.M.; Mukerjee, S.

    Hard x rays from a synchrotron source were utilized in diffraction experiments which probed the bulk of electrode materials while they were operating in situ in battery cells. Two technologically relevant electrode materials were examined: an AB2-type anode in a nickel-metal-hydride cell and a LiMn2O4 cathode in a Li-ion "rocking chair" cell. Structural features such as lattice expansions and contractions, phase transitions, and the formation of multiple phases were easily observed as either hydrogen or lithium was electrochemically intercalated in and out of the electrode materials. The relevance of this technique for future studies of battery electrode materials is discussed. © 1996 American Institute of Physics.

  8. BIRS - Bioterrorism Information Retrieval System.

    PubMed

    Tewari, Ashish Kumar; Rashi; Wadhwa, Gulshan; Sharma, Sanjeev Kumar; Jain, Chakresh Kumar

    2013-01-01

    Bioterrorism is the intended use of pathogenic strains of microbes to spread terror in a population. There is a definite need to promote research on the development of vaccines, therapeutics and diagnostic methods as part of preparedness for any future bioterror attack. BIRS is an open-access database of collective information on the organisms related to bioterrorism. The database architecture utilizes current open-source technology, viz. PHP ver. 5.3.19, MySQL and an IIS server under the Windows platform. The database stores information on the literature, generic information and unique pathways of about 10 microorganisms involved in bioterrorism. It may serve as a collective repository to accelerate the drug discovery and vaccine design process against such bioterrorist agents (microbes). The available data have been validated against various online resources and literature mining in order to provide the user with a comprehensive information system. The database is freely available at http://www.bioterrorism.biowaves.org.

  9. LAMMPS strong scaling performance optimization on Blue Gene/Q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffman, Paul; Jiang, Wei; Romero, Nichols A.

    2014-11-12

    LAMMPS ("Large-scale Atomic/Molecular Massively Parallel Simulator") is an open-source molecular dynamics package from Sandia National Laboratories. Significant performance improvements in strong-scaling and time-to-solution for this application on IBM's Blue Gene/Q have been achieved through computational optimizations of the OpenMP versions of the short-range Lennard-Jones term of the CHARMM force field and the long-range Coulombic interaction implemented with the PPPM (particle-particle-particle mesh) algorithm, enhanced by runtime parameter settings controlling thread utilization. Additionally, MPI communication performance improvements were made to the PPPM calculation by re-engineering the parallel 3D FFT to use MPICH collectives instead of point-to-point. Performance testing was done using an 8.4-million atom simulation scaling up to 16 racks on the Mira system at Argonne Leadership Computing Facility (ALCF). Speedups resulting from this effort were in some cases over 2x.

  10. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet an organization's requirements is a difficult aspect of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
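    As a rough illustration of the ranking half of the method above, here is a pure-Python TOPSIS sketch (in the integrated approach, the AHP step would supply the criterion weights). The decision matrix, weights, and package count are invented for illustration, not the study's data.

```python
# Illustrative TOPSIS ranking sketch (hypothetical scores, not the study's data).
import math

def topsis(matrix, weights):
    """Rank alternatives (rows) against benefit criteria (columns)."""
    m = len(weights)
    # 1. Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[row[j] / norms[j] * weights[j] for j in range(m)] for row in matrix]
    # 2. Ideal best and worst points (all criteria treated as benefits here).
    best = [max(col) for col in zip(*v)]
    worst = [min(col) for col in zip(*v)]
    # 3. Closeness coefficient: nearer the best and farther from the worst is better.
    return [math.dist(r, worst) / (math.dist(r, best) + math.dist(r, worst))
            for r in v]

# Three hypothetical EMR packages scored on three criteria (higher is better),
# with assumed AHP-derived weights summing to 1.
scores = topsis([[7, 9, 8], [8, 7, 6], [5, 6, 9]], [0.5, 0.3, 0.2])
ranked = sorted(range(len(scores)), key=lambda i: -scores[i])
```

    The closeness coefficient lies in (0, 1), so the alternatives can be ranked directly by score.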

  11. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  12. The validity of open-source data when assessing jail suicides.

    PubMed

    Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff

    2018-05-09

    The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data is restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, allowing researchers a more nuanced understanding of the situational characteristics of the event. This research supports the argument for including open-source data in jail suicide research, as it illustrates how open-source data can provide additional information not found in official data. In sum, this research is vital for suicide prevention, which may depend directly on the ability to manipulate environmental factors.

  13. Performance evaluation of multi-channel wireless mesh networks with embedded systems.

    PubMed

    Lam, Jun Huy; Lee, Sang-Gon; Tan, Whye Kit

    2012-01-01

    Many commercial wireless mesh network (WMN) products are available in the marketplace with their own proprietary standards, but interoperability among the different vendors is not possible. Open source communities have their own WMN implementations in accordance with the IEEE 802.11s draft standard: the Linux open80211s project and the FreeBSD WMN implementation. While some studies have focused on test beds of WMNs based on the open80211s project, none are based on FreeBSD. In this paper, we built an embedded system using the FreeBSD WMN implementation that utilizes two channels and evaluated its performance. This implementation allows legacy systems to connect to the WMN independent of the type of platform and distributes the load between the two non-overlapping channels. One channel is used for the backhaul connection and the other is used to connect stations to the wireless mesh network. By using power-efficient 802.11 technology, this device can also be used as a gateway for a wireless sensor network (WSN).

  14. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  15. Open Source 2010: Reflections on 2007

    ERIC Educational Resources Information Center

    Wheeler, Brad

    2007-01-01

    Colleges and universities and commercial firms have demonstrated great progress in realizing the vision proffered for "Open Source 2007," and 2010 will mark even greater progress. Although much work remains in refining open source for higher education applications, the signals are now clear: the collaborative development of software can provide…

  16. Development and Use of an Open-Source, User-Friendly Package to Simulate Voltammetry Experiments

    ERIC Educational Resources Information Center

    Wang, Shuo; Wang, Jing; Gao, Yanjing

    2017-01-01

    An open-source electrochemistry simulation package has been developed that simulates the electrode processes of four reaction mechanisms and two typical electroanalysis techniques: cyclic voltammetry and chronoamperometry. Unlike other open-source simulation software, this package balances the features with ease of learning and implementation and…

  17. Creating Open Source Conversation

    ERIC Educational Resources Information Center

    Sheehan, Kate

    2009-01-01

    Darien Library, where the author serves as head of knowledge and learning services, launched a new website on September 1, 2008. The website is built with Drupal, an open source content management system (CMS). In this article, the author describes how she and her colleagues overhauled the library's website to provide an open source content…

  18. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. although it was originally designed for programming competitions, Mooshak has also…

  19. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...

  20. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)
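    Lotka's Law, in its classical form, states that the number of authors making n contributions falls off as 1/n^a with a ≈ 2. A small sketch of the expected distribution follows; the exponent here is the textbook value, illustrative only, not the exponent the study fits to open source commit metadata.

```python
# Sketch of Lotka's inverse-power law: authors with n contributions are
# proportional to 1/n**a (classically a ≈ 2). Truncated at n_max for a
# finite, normalizable illustration.

def lotka_distribution(a=2.0, n_max=10):
    """Expected fraction of authors with exactly n contributions, n = 1..n_max."""
    raw = {n: 1.0 / n ** a for n in range(1, n_max + 1)}
    total = sum(raw.values())
    return {n: v / total for n, v in raw.items()}

dist = lotka_distribution()
# Under a = 2, single-contribution authors dominate (about 65% of this
# truncated population), and each doubling of n quarters the share.
```

    This concentration of output in a few prolific contributors is the pattern the study reports for open source developers as well.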

  1. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.

    PubMed

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-04-21

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years after proving clinically more effective than conventional DBS at controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
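    The control principle described above (theta-band LFP power gating stimulation) can be caricatured in a few lines. The sampling rate, threshold, band estimator, and signals below are all hypothetical; the actual system ran on an Arduino with analog front-end hardware, not on this naive DFT.

```python
# Minimal sketch of a closed-loop gate: estimate theta-band (~4-8 Hz) power
# in one LFP window and stimulate only when it crosses a threshold.
# All parameter values are invented for illustration.
import math

def band_power(samples, fs, f_lo=4.0, f_hi=8.0):
    """Crude band power via a direct DFT over one window (no windowing/overlap)."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def controller(samples, fs=200.0, threshold=5.0):
    """Return True when theta power exceeds threshold, i.e. 'stimulate now'."""
    return band_power(samples, fs) > threshold

# Synthetic check: a 6 Hz 'theta' burst should trigger; a flat trace should not.
fs, n = 200.0, 200
theta_burst = [math.sin(2 * math.pi * 6 * i / fs) for i in range(n)]
rest = [0.0] * n
```

    In the real system the equivalent decision runs continuously on streamed LFP samples, with the stimulation pulse train as the controller output.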

  2. Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat

    PubMed Central

    Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart

    2015-01-01

    Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback has gained attention in recent years after proving clinically more effective than conventional DBS at controlling pathological symptoms. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation. PMID:25897892

  3. Report on annual utility oil buyers conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stambler, K.

    1994-08-01

    What was discussed at this year's Utility Oil Buyers' Conference is summarized. This year's conference was held in Boston, and there were over 200 attendees representing over 130 different companies from the utility, oil trading, consulting, and inspection industries. Attendees at this year's conference came from as far away as Italy and Argentina. The mood at this year's conference was somber, as each sector is feeling the effects of the decline in residual fuel oil demand due to natural gas displacement, non-utility generation, and an economy that is still lethargic. The topics covered ranged from "Life after Order 636" to "Does Residual Fuel Oil Have a Future in a Utility Steam Boiler?" to "The Future of Shipping after OPA of 1990." In addition, there were several topics that have been or will be covered at this conference.

  4. Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.

    2016-12-01

    The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which includes several new features such as an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
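    For readers unfamiliar with the underlying kernel, a plain serial k-means sketch follows. The accelerated, SIMD/manycore implementation described above is far more involved; the data and cluster count here are invented to show only the assignment/update iteration being scaled.

```python
# Serial k-means sketch: alternate assignment (nearest center) and update
# (cluster mean) steps. Illustrative data, not a geospatiotemporal data set.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: nearest center by squared Euclidean distance.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Update step: move each center to its cluster mean.
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = tuple(sum(x) / len(cl) for x in zip(*cl))
    return centers

# Two well-separated blobs collapse onto their means.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
centers = sorted(kmeans(pts, 2))
```

    The assignment step dominates the cost and is embarrassingly parallel over points, which is what makes the wide-SIMD and manycore optimizations above pay off.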

  5. Chemical Discrimination in Turbulent Gas Mixtures with MOX Sensors Validated by Gas Chromatography-Mass Spectrometry

    PubMed Central

    Fonollosa, Jordi; Rodríguez-Luján, Irene; Trincavelli, Marco; Vergara, Alexander; Huerta, Ramón

    2014-01-01

    Chemical detection systems based on chemo-resistive sensors usually include a gas chamber to control the sample air flow and to minimize turbulence. However, such a kind of experimental setup does not reproduce the gas concentration fluctuations observed in natural environments and destroys the spatio-temporal information contained in gas plumes. Aiming at reproducing more realistic environments, we utilize a wind tunnel with two independent gas sources that get naturally mixed along a turbulent flow. For the first time, chemo-resistive gas sensors are exposed to dynamic gas mixtures generated with several concentration levels at the sources. Moreover, the ground truth of gas concentrations at the sensor location was estimated by means of gas chromatography-mass spectrometry. We used a support vector machine as a tool to show that chemo-resistive transduction can be utilized to reliably identify chemical components in dynamic turbulent mixtures, as long as sufficient gas concentration coverage is used. We show that in open sampling systems, training the classifiers only on high concentrations of gases produces less effective classification and that it is important to calibrate the classification method with data at low gas concentrations to achieve optimal performance. PMID:25325339
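    The calibration point in the abstract's last sentences can be miniaturized: a classifier trained only on high gas concentrations misreads low-concentration samples. The paper used a support vector machine; this stand-in uses 1-nearest-neighbour on invented 1-D sensor responses purely to illustrate the coverage issue.

```python
# Coverage illustration (hypothetical numbers, not the study's data or model):
# gas A saturates the sensor strongly at high concentration but weakly at low,
# while gas B responds moderately at both.

def knn1_predict(train, x):
    """train: list of (feature, label); return label of the nearest sample."""
    return min(train, key=lambda s: abs(s[0] - x))[1]

high_only = [(10.0, "A"), (5.0, "B")]                 # high concentrations only
full_range = [(10.0, "A"), (1.0, "A"), (5.0, "B"), (2.0, "B")]  # low + high

low_A = 1.2  # a low-concentration sample of gas A
```

    Trained on `high_only`, the classifier labels `low_A` as gas B; adding low-concentration training data (`full_range`) recovers the correct label, matching the abstract's conclusion that low-concentration calibration data are needed for optimal performance.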

  6. Impact of the social networking applications for health information management for patients and physicians.

    PubMed

    Sahama, Tony; Liang, Jian; Iannella, Renato

    2012-01-01

    Most social network users hold more than one social network account and utilize them in different ways depending on the digital context. For example, friendly chat on Facebook, professional discussion on LinkedIn, and health information exchange on PatientsLikeMe. Thus many web users need to manage many disparate profiles across many distributed online sources. Maintaining these profiles is cumbersome, time consuming, inefficient, and leads to lost opportunity. In this paper we propose a framework for multiple profile management of online social networks and showcase a demonstrator utilising an open source platform. The result of the research enables a user to create and manage an integrated profile and share/synchronise their profiles with their social networks. A number of use cases were created to capture the functional requirements and describe the interactions between users and the online services. An innovative application of this project is in public health informatics. We utilize the prototype to examine how the framework can benefit patients and physicians. The framework can greatly enhance health information management for patients and more importantly offer a more comprehensive personal health overview of patients to physicians.

  7. Chemical discrimination in turbulent gas mixtures with MOX sensors validated by gas chromatography-mass spectrometry.

    PubMed

    Fonollosa, Jordi; Rodríguez-Luján, Irene; Trincavelli, Marco; Vergara, Alexander; Huerta, Ramón

    2014-10-16

    Chemical detection systems based on chemo-resistive sensors usually include a gas chamber to control the sample air flow and to minimize turbulence. However, such a kind of experimental setup does not reproduce the gas concentration fluctuations observed in natural environments and destroys the spatio-temporal information contained in gas plumes. Aiming at reproducing more realistic environments, we utilize a wind tunnel with two independent gas sources that get naturally mixed along a turbulent flow. For the first time, chemo-resistive gas sensors are exposed to dynamic gas mixtures generated with several concentration levels at the sources. Moreover, the ground truth of gas concentrations at the sensor location was estimated by means of gas chromatography-mass spectrometry. We used a support vector machine as a tool to show that chemo-resistive transduction can be utilized to reliably identify chemical components in dynamic turbulent mixtures, as long as sufficient gas concentration coverage is used. We show that in open sampling systems, training the classifiers only on high concentrations of gases produces less effective classification and that it is important to calibrate the classification method with data at low gas concentrations to achieve optimal performance.

  8. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  9. Optical and spectroscopic studies on tannery wastes as a possible source of organic semiconductors.

    PubMed

    Nashy, El-Shahat H A; Al-Ashkar, Emad; Moez, A Abdel

    2012-02-01

    The tanning industry produces a large quantity of solid waste containing hide proteins in the form of protein shavings laden with chromium salts. The chromium wastes are the main concern from an environmental standpoint, because chrome wastes pose a significant disposal problem. The present work investigates the possibility of utilizing these wastes as a source of organic semiconductors, as an alternative to conventional disposal methods. The chemical characterization of these wastes was determined. In addition, Horizontal Attenuated Total Reflection (HATR) FT-IR spectroscopic analysis and optical parameters were also measured for chromated samples. The study showed that the chromated samples had suitable absorbance and transmittance in the wavelength range 500-850 nm. The presence of chromium salt in the collagen samples increases the absorbance, which improves the optical properties of the studied samples and decreases the optical energy gap. The obtained optical energy gap suggests that the environmentally hazardous chrome shavings can be utilized as a possible source of natural organic semiconductors with direct and indirect energy gaps. This work opens the door to using some hazardous wastes in the manufacture of electronic devices such as IR detectors and solar cells, and also as solar cell windows. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Describing environmental public health data: implementing a descriptive metadata standard on the environmental public health tracking network.

    PubMed

    Patridge, Jeff; Namulanda, Gonza

    2008-01-01

    The Environmental Public Health Tracking (EPHT) Network provides an opportunity to bring together diverse environmental and health effects data by integrating local, state, and national databases of environmental hazards, environmental exposures, and health effects. To help users locate data on the EPHT Network, the network will utilize descriptive metadata that provide critical information as to the purpose, location, content, and source of these data. Since 2003, the Centers for Disease Control and Prevention's EPHT Metadata Subgroup has been working to initiate the creation and use of descriptive metadata. Efforts undertaken by the group include the adoption of a metadata standard, creation of an EPHT-specific metadata profile, development of an open-source metadata creation tool, and promotion of the creation of descriptive metadata by changing the perception of metadata in the public health culture.

  11. Lexington Children's Museum final report on EnergyQuest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    EnergyQuest is a museum-wide exhibit that familiarizes children and their families with energy sources, uses, and issues and with the impact of those issues on their lives. It was developed and built by Lexington Children's Museum with support from the US Department of Energy, Kentucky Utilities, and the Kentucky Coal Marketing and Export Council. EnergyQuest featured six hands-on exhibit stations in each of six museum galleries. Collectively, the exhibits examine the sources, uses and conservation of energy. Each EnergyQuest exhibit reflects the content of its gallery setting. During the first year after opening EnergyQuest, a series of 48 public educational programs on energy was conducted at the Museum as part of the Museum's ongoing schedule of demonstrations, performances, workshops and classes. In addition, teacher training was conducted.

  12. Audio CAPTCHA for SIP-Based VoIP

    NASA Astrophysics Data System (ADS)

    Soupionis, Yannis; Tountas, George; Gritzalis, Dimitris

    Voice over IP (VoIP) introduces new ways of communication while utilizing existing data networks to provide inexpensive voice communications worldwide, a promising alternative to traditional PSTN telephony. SPam over Internet Telephony (SPIT) is one potential source of future annoyance in VoIP. A common way to launch a SPIT attack is to use an automated procedure (bot) that generates calls and plays audio advertisements. In this paper, our goal is to design an appropriate CAPTCHA to fight such bots. We focus on audio CAPTCHA, as the audio format is better suited to VoIP environments, and implement it in a SIP-based VoIP environment. Furthermore, we suggest and evaluate the specific attributes an audio CAPTCHA should incorporate in order to be effective, and test it against an open-source bot implementation.
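The challenge/verification side of such an audio CAPTCHA can be sketched as follows. In a SIP deployment the digits would be rendered to audio (e.g., concatenated voice clips with background noise) and the caller would answer via DTMF; all of that is omitted here, and the function names are hypothetical:

```python
import hashlib
import random

def make_audio_captcha_challenge(n_digits=4, seed=None):
    """Generate the digit sequence for an audio CAPTCHA challenge.

    Only the challenge/verification logic is modeled; rendering the
    digits to audio and collecting the DTMF response are out of scope
    for this illustrative sketch.
    """
    rng = random.Random(seed)
    digits = "".join(str(rng.randint(0, 9)) for _ in range(n_digits))
    # Store only a hash server-side so a bot that inspects stored
    # state cannot simply read back the expected answer.
    token = hashlib.sha256(digits.encode()).hexdigest()
    return digits, token

def verify(response, token):
    """Check a caller's keyed-in response against the stored token."""
    return hashlib.sha256(response.encode()).hexdigest() == token

digits, token = make_audio_captcha_challenge(seed=42)
print(len(digits), verify(digits, token))  # → 4 True
```

The effectiveness attributes the paper evaluates (noise, distortion, vocabulary size) would all live in the audio-rendering step this sketch deliberately leaves out.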

  13. Open source drug discovery--a new paradigm of collaborative research in tuberculosis drug development.

    PubMed

    Bhardwaj, Anshu; Scaria, Vinod; Raghava, Gajendra Pal Singh; Lynn, Andrew Michael; Chandra, Nagasuma; Banerjee, Sulagna; Raghunandanan, Muthukurussi V; Pandey, Vikas; Taneja, Bhupesh; Yadav, Jyoti; Dash, Debasis; Bhattacharya, Jaijit; Misra, Amit; Kumar, Anil; Ramachandran, Srinivasan; Thomas, Zakir; Brahmachari, Samir K

    2011-09-01

    It is being realized that traditional closed-door and market-driven approaches to drug discovery may not be the best-suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternate paradigm for the drug discovery process. The current model, constrained by limitations on collaboration and on sharing of resources under confidentiality, hampers the opportunities for bringing in expertise from diverse fields, and these limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project, initiated by the Council of Scientific and Industrial Research, India, has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches, and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives for the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner by multiple participating companies, with majority funding from Open Source Drug Discovery. This will ensure availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Carbon spectrum utilization by an indigenous strain of Pseudomonas aeruginosa NCIM 5514: Production, characterization and surface active properties of biosurfactant.

    PubMed

    Varjani, Sunita J; Upasani, Vivek N

    2016-12-01

    The present research work was undertaken to study carbon spectrum utilization and structural characterization of the biosurfactant produced by the indigenous Pseudomonas aeruginosa NCIM 5514, which showed a unique ability to utilize a large number of carbon sources effectively for biosurfactant production, with glucose being the best carbon substrate. In Bushnell-Haas medium supplemented with glucose (1%, w/v), this isolate produced 3.178±0.071 g/l biosurfactant in 96 h. The biosurfactant showed surface tension and emulsification activity values ranging from 29.14±0.05 to 62.29±0.13 mN/m and from 88.50±1.96 to 15.40±0.91%, respectively. Toluene showed the highest emulsification activity, followed by kerosene; kerosene, however, exhibited emulsion stability for 30 days. The biosurfactant was characterized as a mixture of di-rhamnolipid (Rha-Rha-C10-C14:1) and mono-rhamnolipid (Rha-C8-C10) by FTIR, ESI-MS and LC-MS techniques. The high biosurfactant yield suggests that the isolate could find utility in various industries. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Enhancing data utilization through adoption of cloud-based data architectures (Invited Paper 211869)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    A traditional approach to data distribution and utilization of open government data involves continuously moving those data from a central government location to each potential user, who would then utilize them on their local computer systems. An alternate approach would be to bring those users to the open government data, where users would also have access to computing and analytics capabilities that would support data utilization. NOAA's Big Data Project is exploring such an alternate approach through an experimental collaboration with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium. As part of this ongoing experiment, NOAA is providing open data of interest, which are freely hosted by the Big Data Project Collaborators, who provide a variety of cloud-based services and capabilities to enable utilization by data users. By the terms of the agreement, the Collaborators may charge for those value-added services and processing capacities to recover the costs of freely hosting the data and to generate profits if so desired. Initial results have shown sustained increases in data utilization, from 2 to over 100 times the access levels previously observed under the traditional approach. Significantly increased utilization speed compared with the traditional approach has also been reported by NOAA data users who have volunteered their experiences on these cloud-based systems. The potential for implementing and sustaining the alternate cloud-based approach as part of a change in operational data utilization strategies will be discussed.

  16. NPTFit: A Code Package for Non-Poissonian Template Fitting

    NASA Astrophysics Data System (ADS)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R.

    2017-06-01

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ~GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits by allowing searches for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, and thus likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
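As context for what the NPTF generalizes: a plain Poissonian template fit with a single template has a closed-form maximum-likelihood normalization. The sketch below shows only that baseline on simulated data; it does not use the NPTFit API or the non-Poissonian machinery itself:

```python
import numpy as np

def poisson_template_norm(counts, template):
    """Maximum-likelihood normalization A of a single spatial template.

    With expected counts A * t_p per pixel, the Poisson log-likelihood
    sum_p [k_p * ln(A * t_p) - A * t_p] is maximized at
    A = sum(k) / sum(t).  This is the plain Poissonian template fit
    that the NPTF generalizes, not the non-Poissonian method.
    """
    counts = np.asarray(counts, dtype=float)
    template = np.asarray(template, dtype=float)
    return counts.sum() / template.sum()

rng = np.random.default_rng(0)
template = rng.uniform(0.5, 2.0, size=10_000)  # toy spatial template
counts = rng.poisson(3.0 * template)           # simulated map, A_true = 3
est = poisson_template_norm(counts, template)
print(f"estimated normalization: {est:.3f}")   # close to 3.0
```

The NPTF replaces the per-pixel Poisson likelihood with one marginalized over an unresolved point-source population, which is what makes dedicated machinery like NPTFit necessary.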

  17. Sifting Through It All: Characterizing Melanoma Patients' Utilization of the Internet as an Information Source.

    PubMed

    Hamilton, Sarah Nicole; Scali, Elena P; Yu, Irene; Gusnowski, Eva; Ingledew, Paris-Ann

    2015-09-01

    This study describes how melanoma patients used the Internet as a melanoma information source and how it impacted their clinical encounter and treatment decision. From 2010 to 2013, melanoma patients were invited to complete a 23-question paper survey with open- and closed-ended questions. Thirty-one of the 62 patients approached completed the survey. The majority (90 %) of respondents used the Internet as a melanoma information source. Most (90 %) had used the search engine Google. The most commonly searched topics were melanoma treatment (96 %), screening (64 %), and prevention (64 %). While most respondents (85 %) found the Internet was a useful melanoma information source, over half (54 %) found melanoma websites at least somewhat difficult to understand. Many (78 %) believed it increased their understanding of their diagnosis, 71 % thought it influenced their treatment decision, and 59 % felt it impacted their specialist consultation. This study informs health care professionals that many melanoma patients search the Internet for information regarding their diagnosis and that it may impact their disease understanding and treatment decisions.

  18. State-of-the-practice and lessons learned on implementing open data and open source policies.

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...

  19. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US-registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  20. All-source Information Management and Integration for Improved Collective Intelligence Production

    DTIC Science & Technology

    2011-06-01

    Intelligence (ELINT) • Open Source Intelligence (OSINT) • Technical Intelligence (TECHINT). These intelligence disciplines produce... intelligence, measurement and signature intelligence, signals intelligence, and open-source data, in the production of intelligence. All-source intelligence... (All-Source Information Integration and Management) R&D Project 3 All-Source Intelligence

  1. Isotopic constraints on global atmospheric methane sources and sinks: a critical assessment of recent findings and new data

    NASA Astrophysics Data System (ADS)

    Schwietzke, S.; Sherwood, O.; Michel, S. E.; Bruhwiler, L.; Dlugokencky, E. J.; Tans, P. P.

    2017-12-01

    Methane isotopic data have increasingly been used in recent studies to help constrain global atmospheric methane sources and sinks. The added scientific contributions to this field include (i) careful comparisons and merging of atmospheric isotope measurement datasets to increase spatial coverage, (ii) in-depth analyses of observed isotopic spatial gradients and seasonal patterns, and (iii) improved datasets of isotopic source signatures. Different interpretations have been made regarding the utility of the isotopic data on the diagnosis of methane sources and sinks. Some studies have found isotopic evidence of a largely microbial source causing the renewed growth in global atmospheric methane since 2007, and underestimated global fossil fuel methane emissions compared to most previous studies. However, other studies have challenged these conclusions by pointing out substantial spatial variability in isotopic source signatures as well as open questions in atmospheric sinks and biomass burning trends. This presentation will review and contrast the main arguments and evidence for the different conclusions. The analysis will distinguish among the different research objectives including (i) global methane budget source attribution in steady-state, (ii) source attribution of recent global methane trends, and (iii) identifying specific methane sources in individual plumes during field campaigns. Additional comparisons of model experiments with atmospheric measurements and updates on isotopic source signature data will complement the analysis.

  2. 20. INTERIOR OF UTILITY ROOM SHOWING OPEN DOORWAY TO KITCHEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. INTERIOR OF UTILITY ROOM SHOWING OPEN DOORWAY TO KITCHEN AT PHOTO LEFT, JUNCTION BOXES AT UPPER PHOTO CENTER, AND PLUMBING FOR WASHER AT PHOTO RIGHT. VIEW TO EAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA

  3. A videoconferencing tool acting as a home-based healthcare monitoring robot for elderly patients.

    PubMed

    Mapundu, Zamikhaya; Simonnet, Thierry; van der Walt, J S

    2012-01-01

    Currently, healthcare costs associated with aging at home can be prohibitive if individuals require continual or periodic supervision and assistance because of Alzheimer's disease. Open-source and videoconferencing tools are attracting increasingly significant organizations; one way to reduce medical care costs is to shorten the patient's hospitalization and reinforce home healthcare support by medical professionals together with family caregivers. Videoconferencing has been available for some time, and this technology is now a leading way to reduce healthcare costs, making medical care more available and convenient for both doctors and patients. This article describes how a videoconferencing tool can be utilized to improve communication practices for patient monitoring using a Robot Companion. A SWOT analysis, presented in summary form, was used to evaluate the user's point of view.

  4. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
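The core measurement idea, segmenting each frame and tracking spheroid area over time, can be sketched on synthetic data. TASI's actual spatiotemporal segmentation and morpho-phenotype features are far richer than this illustrative threshold:

```python
import numpy as np

def spheroid_area(frame, threshold=0.5):
    """Pixel area of the segmented spheroid (simple intensity threshold).

    TASI's real pipeline performs much more robust spatiotemporal
    segmentation; this only illustrates the area-over-time measurement.
    """
    return int((frame >= threshold).sum())

def synthetic_frame(size, radius):
    """Synthetic image: a bright disc (the 'spheroid') on a dark field."""
    yy, xx = np.mgrid[:size, :size]
    r2 = (yy - size // 2) ** 2 + (xx - size // 2) ** 2
    return np.where(r2 <= radius ** 2, 1.0, 0.0)

# A growing spheroid imaged at three time points: area should increase.
frames = [synthetic_frame(64, r) for r in (5, 8, 12)]
areas = [spheroid_area(f) for f in frames]
print(areas)  # → [81, 197, 441]
```

Fitting a growth model to such an area-versus-time series is the kind of mathematical modeling of spheroid dynamics the abstract describes.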

  5. The interactive electrode localization utility: software for automatic sorting and labeling of intracranial subdural electrodes

    PubMed Central

    Tang, Wei; Peled, Noam; Vallejo, Deborah I.; Borzello, Mia; Dougherty, Darin D.; Eskandar, Emad N.; Widge, Alik S.; Cash, Sydney S.; Stufflebeam, Steven M.

    2018-01-01

    Purpose Existing methods for sorting, labeling, registering, and across-subject localization of electrodes in intracranial encephalography (iEEG) may involve laborious work requiring manual inspection of radiological images. Methods We describe a new open-source software package, the interactive electrode localization utility, which presents a full pipeline for the registration, localization, and labeling of iEEG electrodes from CT and MR images. In addition, we describe a method to automatically sort and label electrodes from subdural grids of known geometry. Results We validated our software against manual inspection methods in twelve subjects undergoing iEEG for medically intractable epilepsy. Our algorithm for sorting and labeling performed correct identification on 96% of the electrodes. Conclusions The sorting and labeling methods we describe offer nearly perfect performance, and the software package we have distributed may simplify the process of registering, sorting, labeling, and localizing subdural iEEG grid electrodes, which is currently done by manual inspection. PMID:27915398

  6. PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.

    PubMed

    Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G

    2018-02-06

    For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
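As an example of the kind of model such software fits, the textbook two-state equilibrium unfolding curve can be written directly. The parameter values below are illustrative assumptions, not taken from PyFolding or from any real protein:

```python
import math

R = 0.0019872   # gas constant, kcal / (mol K)
T = 298.15      # temperature, K

def fraction_unfolded(denaturant, dG_h2o, m_value):
    """Two-state equilibrium unfolding model:
    dG(D) = dG_H2O - m * [D],  K = exp(-dG / RT),  f_U = K / (1 + K).
    """
    dG = dG_h2o - m_value * denaturant
    K = math.exp(-dG / (R * T))
    return K / (1.0 + K)

# Assumed parameters: stability 5 kcal/mol, m-value 1.8 kcal/mol/M.
dG_h2o, m_value = 5.0, 1.8
midpoint = dG_h2o / m_value  # denaturant concentration where f_U = 0.5
curve = [fraction_unfolded(d / 10.0, dG_h2o, m_value) for d in range(0, 81)]
print(f"midpoint = {midpoint:.2f} M, "
      f"f_U(midpoint) = {fraction_unfolded(midpoint, dG_h2o, m_value):.2f}")
# → midpoint = 2.78 M, f_U(midpoint) = 0.50
```

Fitting dG_H2O and the m-value to measured signal-versus-denaturant data, and chaining such models into global or Ising-type fits, is where a framework like PyFolding adds value over generic curve-fitting software.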

  7. MolProbity: More and better reference data for improved all-atom structure validation.

    PubMed

    Williams, Christopher J; Headd, Jeffrey J; Moriarty, Nigel W; Prisant, Michael G; Videau, Lizbeth L; Deis, Lindsay N; Verma, Vishal; Keedy, Daniel A; Hintze, Bradley J; Chen, Vincent B; Jain, Swati; Lewis, Steven M; Arendall, W Bryan; Snoeyink, Jack; Adams, Paul D; Lovell, Simon C; Richardson, Jane S; Richardson, David C

    2018-01-01

    This paper describes the current update on macromolecular model validation services that are provided at the MolProbity website, emphasizing changes and additions since the previous review in 2010. There have been many infrastructure improvements, including rewrite of previous Java utilities to now use existing or newly written Python utilities in the open-source CCTBX portion of the Phenix software system. This improves long-term maintainability and enhances the thorough integration of MolProbity-style validation within Phenix. There is now a complete MolProbity mirror site at http://molprobity.manchester.ac.uk. GitHub serves our open-source code, reference datasets, and the resulting multi-dimensional distributions that define most validation criteria. Coordinate output after Asn/Gln/His "flip" correction is now more idealized, since the post-refinement step has apparently often been skipped in the past. Two distinct sets of heavy-atom-to-hydrogen distances and accompanying van der Waals radii have been researched and improved in accuracy, one for the electron-cloud-center positions suitable for X-ray crystallography and one for nuclear positions. New validations include messages at input about problem-causing format irregularities, updates of Ramachandran and rotamer criteria from the million quality-filtered residues in a new reference dataset, the CaBLAM Cα-CO virtual-angle analysis of backbone and secondary structure for cryoEM or low-resolution X-ray, and flagging of the very rare cis-nonProline and twisted peptides which have recently been greatly overused. Due to wide application of MolProbity validation and corrections by the research community, in Phenix, and at the worldwide Protein Data Bank, newly deposited structures have continued to improve greatly as measured by MolProbity's unique all-atom clashscore. © 2017 The Protein Society.

  8. Utilizing Public Access Data and Open Source Statistical Programs to Teach Climate Science to Interdisciplinary Undergraduate Students

    NASA Astrophysics Data System (ADS)

    Collins, L.

    2014-12-01

    Students in the Environmental Studies major at the University of Southern California fulfill their curriculum requirements by taking a broad range of courses in the social and natural sciences. Climate change is often taught in 1-2 lectures in these courses with limited examination of this complex topic. Several upper division elective courses focus on the science, policy, and social impacts of climate change. In an upper division course focused on the scientific tools used to determine paleoclimate and predict future climate, I have developed a project where students download, manipulate, and analyze data from the National Climatic Data Center. Students are required to download 100 or more years of daily temperature records and use the statistical program R to analyze that data, calculating daily, monthly, and yearly temperature averages along with changes in the number of extreme hot or cold days (≥90°F and ≤30°F, respectively). In parallel, they examine population growth, city expansion, and changes in transportation, looking for correlations between the social data and trends observed in the temperature data. Students examine trends over time to determine correlations with the urban heat island effect. This project exposes students to "real" data, giving them the tools necessary to critically analyze scientific studies without being experts in the field. Utilizing the existing, public, online databases provides almost unlimited, free data. Open source statistical programs provide a cost-free platform for examining the data, although some in-class time is required to help students navigate initial data importation and analysis. Results presented will highlight data compiled over three years of course projects.
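The course analysis is done in R; an equivalent illustrative sketch of the yearly averaging and extreme-day counts (≥90°F hot, ≤30°F cold) looks like this in Python, with the sample records invented for demonstration:

```python
from collections import defaultdict

def summarize_temps(records, hot=90.0, cold=30.0):
    """Yearly mean temperature plus counts of extreme hot/cold days.

    `records` is an iterable of (ISO date string, temperature in °F)
    pairs, mimicking a daily station series downloaded from the
    National Climatic Data Center; the course itself uses R.
    """
    by_year = defaultdict(list)
    for date, temp in records:
        by_year[date[:4]].append(temp)   # group by the year prefix
    return {
        year: {
            "mean": sum(temps) / len(temps),
            "hot_days": sum(t >= hot for t in temps),
            "cold_days": sum(t <= cold for t in temps),
        }
        for year, temps in sorted(by_year.items())
    }

# Invented sample records for demonstration only.
records = [
    ("1950-07-01", 88.0), ("1950-07-02", 91.0), ("1950-01-15", 28.0),
    ("2010-07-01", 95.0), ("2010-07-02", 97.0), ("2010-01-15", 33.0),
]
summary = summarize_temps(records)
print(summary["2010"])  # → {'mean': 75.0, 'hot_days': 2, 'cold_days': 0}
```

With a century of real daily records, comparing these per-year summaries against population and land-use data is what lets students probe urban heat island trends.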

  9. An i2b2-based, generalizable, open source, self-scaling chronic disease registry

    PubMed Central

    Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D

    2013-01-01

    Objective Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Informatics for Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Materials and methods Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. Results The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. Discussion We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. Conclusions The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases. PMID:22733975

  10. Open Collaboration: A Problem Solving Strategy That Is Redefining NASA's Innovative Spirit

    NASA Technical Reports Server (NTRS)

    Rando, Cynthia M.; Fogarty, Jennifer A.; Richard, Elizabeth E.; Davis, Jeffrey R.

    2011-01-01

    In 2010, NASA's Space Life Sciences Directorate announced the successful results from pilot experiments with open innovation methodologies, specifically the utilization of internet-based external crowdsourcing platforms to solve challenging problems in human health and performance related to the future of spaceflight. The follow-up to this success was an internal crowdsourcing pilot program entitled NASA@work, which was supported by the InnoCentive@work software platform. The objective of the NASA@work pilot was to connect the collective knowledge of individuals from all areas within the NASA organization via a private web-based environment. The platform provided a venue for NASA Challenge Owners, those looking for solutions or new ideas, to pose challenges to internal solvers, those within NASA with the skill and desire to create solutions. The pilot was launched in 57 days, a record for InnoCentive and NASA, and ran for three months with a total of 20 challenges posted Agency-wide. The NASA@work pilot attracted over 6000 participants throughout NASA, with a total of 183 contributing solvers for the 20 challenges posted. At the time of the pilot's closure, solvers had provided viable solutions and ideas for 17 of the 20 posted challenges. The solver community provided feedback on the pilot, describing it as a barrier-breaking activity, conveying that there was satisfaction associated with helping co-workers, that it was "fun" to think about problems outside normal work boundaries, and that it was nice to learn what challenges others were facing across the agency. The results and the feedback from the solver community have demonstrated the power and utility of an internal collaboration tool such as NASA@work.

  11. An i2b2-based, generalizable, open source, self-scaling chronic disease registry.

    PubMed

    Natter, Marc D; Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D

    2013-01-01

    Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Informatics for Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases.

  12. Cost-utility analysis of minimally invasive versus open multilevel hemilaminectomy for lumbar stenosis.

    PubMed

    Parker, Scott L; Adogwa, Owoicho; Davis, Brandon J; Fulchiero, Erin; Aaronson, Oran; Cheng, Joseph; Devin, Clinton J; McGirt, Matthew J

    2013-02-01

    Two-year cost-utility study comparing minimally invasive (MIS) versus open multilevel hemilaminectomy in patients with degenerative lumbar spinal stenosis. The objective of the study was to determine whether MIS versus open multilevel hemilaminectomy for degenerative lumbar spinal stenosis is a cost-effective advancement in lumbar decompression surgery. MIS multilevel hemilaminectomy for degenerative lumbar spinal stenosis allows for effective treatment of back and leg pain while theoretically minimizing blood loss, tissue injury, and postoperative recovery time. No studies have evaluated comprehensive healthcare costs associated with multilevel hemilaminectomy procedures, nor assessed cost-effectiveness of MIS versus open multilevel hemilaminectomy. Fifty-four consecutive patients with lumbar stenosis undergoing multilevel hemilaminectomy through an MIS paramedian tubular approach (n=27) versus midline open approach (n=27) were included. Total back-related medical resource utilization, missed work, and health state values [quality adjusted life years (QALYs), calculated from EuroQol-5D with US valuation] were assessed after 2-year follow-up. Two-year resource use was multiplied by unit costs based on Medicare national allowable payment amounts (direct cost) and work-day losses were multiplied by the self-reported gross-of-tax wage rate (indirect cost). Difference in mean total cost per QALY gained for MIS versus open hemilaminectomy was assessed as the incremental cost-effectiveness ratio (ICER: [COST(MIS) - COST(OPEN)] / [QALY(MIS) - QALY(OPEN)]). MIS versus open cohorts were similar at baseline. MIS and open hemilaminectomy were associated with an equivalent cumulative gain of 0.72 QALYs 2 years after surgery. Mean direct medical costs, indirect societal costs, and total 2-year cost ($23,109 vs. $25,420; P=0.21) were similar between MIS and open hemilaminectomy.
    The MIS approach was associated with total costs and utility similar to those of the traditional open approach over 2 years, while providing equivalent improvement in QALYs, making it a cost-equivalent technology. In our experience, MIS multilevel hemilaminectomy is cost-equivalent to the open procedure for patients with lumbar stenosis-associated radicular pain.
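
The ICER arithmetic reported above can be sketched in a few lines; this is an illustrative helper (not the authors' code) showing how the ratio is computed and why it is undefined when the QALY gains are equal, as in this study:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: (delta cost) / (delta QALY).

    Returns None when the QALY gains are equal; in that case the ratio is
    undefined and the comparison reduces to cost alone (cost equivalence)."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_qaly == 0:
        return None
    return d_cost / d_qaly

# Figures reported above: equal 0.72 QALY gain, $23,109 vs. $25,420 total cost.
print(icer(23109, 25420, 0.72, 0.72))  # None -> decided on cost alone
```

With equal QALY gains the denominator vanishes, which is exactly why the study reports cost equivalence rather than an ICER value.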

  13. Open Resonator for Summation of Powers in Sub-Terahertz and Terahertz Frequencies

    NASA Astrophysics Data System (ADS)

    Kuz'michev, I. K.; Yeryomka, V. D.; May, A. V.; Troshchilo, A. S.

    2017-03-01

    Purpose: Study of the excitation features of the first higher axial asymmetric oscillation modes in an open resonator connected into a waveguide transmission line. Design/methodology/approach: To determine the efficiency of higher-mode excitation in the resonator by the higher wave of a rectangular waveguide, the antenna surface utilization coefficient is used. The coefficient of reflection from the open resonator is determined by the known method of summing the partial coefficients of reflection from the resonant system. Findings: The excitation efficiency of the first higher axial asymmetric TEM10q oscillations in an open resonator connected into the waveguide transmission line, using the TE20 wave, is considered. The analysis accounts for the vector nature of the electromagnetic field. It is shown that for certain sizes of the exciting coupler the excitation efficiency of the working oscillation equals 0.867. In addition, this resonant system has a single frequency response within a wide band of frequencies, so it can be applied for summation of the powers of individual oscillation sources. Since this resonant system allows the matching functions for field and coupling to be separated, any prescribed coupling of sources with the resonant volume can be provided. For this purpose, one-dimensional diffraction gratings (E-polarization) are used. Conclusions: Under matched excitation of axially asymmetric oscillation modes, the resonant system provides angular and frequency spectrum selection, which is of great practical importance for power summation. By placing one-dimensional diffraction gratings (E-polarization) in the apertures of the coupling elements, the active elements can be matched with the resonant volume.

  14. The electric industry's gyrations are giving some telecommunications experts that old familiar feeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graniere, R.J.

    1993-06-15

    A timeline of the past 20 years would characterize an American telecommunications policy revolution dominated by alternating periods of market structure and access. It also would reveal that this cycle is not a casual phenomenon but the result of procompetitive regulatory and judicial decisions that spawn equal and open access issues whose resolution is, in turn, a source of additional market structure issues. Passage of the Energy Policy Act of 1992 has started a similar cogenerative process in the electricity industry. How can electric utility executives and regulators use the lessons of the telecommunications industry to deal with emerging transmission issues in the electricity industry? They can begin by realizing that multiple forms of mandatory transmission access may be new to electric utilities, but they are second nature to telephone local exchange companies (LECs). For example, LECs have been providing local access services to equipment manufacturers and long-distance companies for over a decade. These firms also are deploying local access services for enhanced and information-services providers under the rubric of open network architecture (ONA). This full range of access services might soon be commonplace in the electricity industry, too, as exempt wholesale generators (EWGs) enter the wholesale power markets.

  15. iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations

    NASA Astrophysics Data System (ADS)

    Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.

    The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to developing models using a formal mathematical description that uniquely specifies the physical behavior of a component or of the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is developed with open specifications, any tool that implements these requirements can be utilized, giving users the freedom to choose an Integrated Development Environment (IDE) of their choice. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use, and improving the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open source license to enable wider distribution and to let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.

  16. The Utility of the Extended Images in Ambient Seismic Wavefield Migration

    NASA Astrophysics Data System (ADS)

    Girard, A. J.; Shragge, J. C.

    2015-12-01

    Active-source 3D seismic migration and migration velocity analysis (MVA) are robust and widely used methods for imaging Earth structure. One class of migration methods uses extended images constructed by incorporating spatial and/or temporal wavefield correlation lags into the imaging conditions. These extended images allow users to assess directly whether images focus better with different parameters, which leads to MVA techniques based on the tenets of adjoint-state theory. Under certain conditions (e.g., geographical, cultural, or financial), however, active-source methods can prove impractical. Utilizing ambient seismic energy that naturally propagates through the Earth is an alternative method currently used in the scientific community. Thus, an open question is whether extended images are similarly useful for ambient seismic migration processing and for verifying subsurface velocity models, and whether one can similarly apply adjoint-state methods to perform ambient migration velocity analysis (AMVA). Herein, we conduct a number of numerical experiments that construct extended images from ambient seismic recordings. We demonstrate that, as in active-source methods, the migrated extended image domain is sensitive to velocity: in synthetic ambient imaging tests, the extended images respond to errors deliberately introduced into the velocity model. To determine the extent of this sensitivity, we utilize acoustic wave-equation propagation and cross-correlation-based migration methods to image weak body-wave signals present in the recordings. Importantly, we have also observed scenarios where non-zero correlation lags show signal while zero-lags show none. This may be a valuable missing piece for ambient migration techniques that have yielded largely inconclusive results, and might be an important piece of information for performing AMVA from ambient seismic recordings.
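
The cross-correlation at the heart of ambient-wavefield methods can be illustrated with a toy two-receiver example (the geometry, sample rate, and names here are illustrative, not taken from the study): the lag of the correlation peak recovers the differential travel time of energy shared by the two receivers.

```python
import numpy as np

def xcorr_peak_lag(rec_a, rec_b, fs):
    """Lag (seconds) of the peak of the full cross-correlation of two
    equal-length recordings; estimates the differential travel time of
    coherent energy recorded at both receivers."""
    c = np.correlate(rec_a, rec_b, mode="full")
    lags = np.arange(-(len(rec_b) - 1), len(rec_a))
    return lags[np.argmax(np.abs(c))] / fs

# Toy example: one ambient noise source arriving at two receivers with
# travel times of 30 and 50 samples at fs = 100 Hz.
rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)
rec_a = np.concatenate([np.zeros(30), noise, np.zeros(20)])
rec_b = np.concatenate([np.zeros(50), noise])
print(xcorr_peak_lag(rec_a, rec_b, fs=100.0))  # -0.2 s differential travel time
```

A velocity-model error shifts where migration places this correlated energy, which is exactly the sensitivity the extended images expose.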

  17. Unified EDGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2007-06-18

    UEDGE is an interactive suite of physics packages using the Python or BASIS scripting systems. The plasma is described by time-dependent 2D plasma fluid equations that include equations for density, velocity, ion temperature, electron temperature, electrostatic potential, and gas density in the edge region of a magnetic fusion energy confinement device. Slab, cylindrical, and toroidal geometries are allowed, and closed and open magnetic field-line regions are included. Classical transport is assumed along magnetic field lines, and anomalous transport is assumed across field lines. Multi-charge-state impurities can be included with the corresponding line-radiation energy loss. Although UEDGE is written in Fortran, for efficient execution and analysis of results it utilizes either the Python or BASIS scripting shells. Python is easily available for many platforms (http://www.Python.org/). The features and availability of BASIS are described in "Basis Manual Set" by P.F. Dubois, Z.C. Motteler, et al., Lawrence Livermore National Laboratory report UCRL-MA-118541, June 2002, and http://basis.llnl.gov. BASIS has been reviewed and released by LLNL for unlimited distribution. The Python version utilizes PYBASIS scripts developed by D.P. Grote, LLNL. The Python version also uses MPPL code and a MAC Perl script, available from the public-domain BASIS source above. The Forthon version of UEDGE uses the same source files, but utilizes Forthon to produce a Python-compatible source. Forthon has been developed by D.P. Grote at LBL (see http://hifweb.lbl.gov/Forthon/ and Grote et al. in the references below), and it is freely available. The graphics can be performed by any package importable to Python, such as PYGIST.

  18. Getting Open Source Software into Schools: Strategies and Challenges

    ERIC Educational Resources Information Center

    Hepburn, Gary; Buley, Jan

    2006-01-01

    In this article Gary Hepburn and Jan Buley outline different approaches to implementing open source software (OSS) in schools; they also address the challenges that open source advocates should anticipate as they try to convince educational leaders to adopt OSS. With regard to OSS implementation, they note that schools have a flexible range of…

  19. Open Source Library Management Systems: A Multidimensional Evaluation

    ERIC Educational Resources Information Center

    Balnaves, Edmund

    2008-01-01

    Open source library management systems have improved steadily in the last five years. They now present a credible option for small to medium libraries and library networks. An approach to their evaluation is proposed that takes account of three additional dimensions that only open source can offer: the developer and support community, the source…

  20. Open Source as Appropriate Technology for Global Education

    ERIC Educational Resources Information Center

    Carmichael, Patrick; Honour, Leslie

    2002-01-01

    Economic arguments for the adoption of "open source" software in business have been widely discussed. In this paper we draw on personal experience in the UK, South Africa and Southeast Asia to forward compelling reasons why open source software should be considered as an appropriate and affordable alternative to the currently prevailing…

  1. Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software

    ERIC Educational Resources Information Center

    Hemphill, Thomas A.

    2005-01-01

    This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…

  2. Open Source Communities in Technical Writing: Local Exigence, Global Extensibility

    ERIC Educational Resources Information Center

    Conner, Trey; Gresham, Morgan; McCracken, Jill

    2011-01-01

    By offering open-source software (OSS)-based networks as an affordable technology alternative, we partnered with a nonprofit community organization. In this article, we narrate the client-based experiences of this partnership, highlighting the ways in which OSS and open-source culture (OSC) transformed our students' and our own expectations of…

  3. Personal Electronic Devices and the ISR Data Explosion: The Impact of Cyber Cameras on the Intelligence Community

    DTIC Science & Technology

    2015-06-01

    Texas Tech Security Group, "Automated Open Source Intelligence (OSINT) Using APIs," RaiderSec, Sunday 30 December 2012, http://raidersec.blogspot.com/2012/12/automated-open-source

  4. Open-Source Unionism: New Workers, New Strategies

    ERIC Educational Resources Information Center

    Schmid, Julie M.

    2004-01-01

    In "Open-Source Unionism: Beyond Exclusive Collective Bargaining," published in fall 2002 in the journal Working USA, labor scholars Richard B. Freeman and Joel Rogers use the term "open-source unionism" to describe a form of unionization that uses Web technology to organize in hard-to-unionize workplaces. Rather than depend on the traditional…

  5. Perceptions of Open Source versus Commercial Software: Is Higher Education Still on the Fence?

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2007-01-01

    This exploratory study investigated the perceptions of technology and academic decision-makers about open source benefits and risks versus commercial software applications. The study also explored reactions to a concept for outsourcing campus-wide deployment and maintenance of open source. Data collected from telephone interviews were analyzed,…

  6. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  7. Open-Source Learning Management Systems: A Predictive Model for Higher Education

    ERIC Educational Resources Information Center

    van Rooij, S. Williams

    2012-01-01

    The present study investigated the role of pedagogical, technical, and institutional profile factors in an institution of higher education's decision to select an open-source learning management system (LMS). Drawing on the results of previous research that measured patterns of deployment of open-source software (OSS) in US higher education and…

  8. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  9. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  10. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also striving to provide rich web applications for ease of access, built on open source software and modern web technologies. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); as the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data; the main purpose of this application is public outreach, and the NASA World Wind Java SDK is used for its development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS under the BSD 3-Clause License. The SPICE Toolkit, which is essential to compile Flow, is also open source software, developed by NASA/JPL, whose website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
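
The OpenLayers/MapServer plumbing mentioned above follows the OGC WMS standard, in which map images are fetched with a GetMap request. A minimal sketch of building such a request URL (the endpoint and layer name are made up for illustration; this is not DARTS's actual API):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build a standard OGC WMS 1.3.0 GetMap request URL.

    Note: in WMS 1.3.0 with EPSG:4326 the BBOX axis order is
    lat_min, lon_min, lat_max, lon_max."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("http://example.org/wms", ["moon_dem"],
                     (-90, -180, 90, 180), 512, 256)
```

Any WMS client (OpenLayers included) ultimately issues requests of this shape against a server such as MapServer.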

  11. Embracing Open Source for NASA's Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin

    2017-01-01

    The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when it results from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as to the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate, and analyze them. This talk focuses on the challenges of open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies, running the gamut from tracking issues and properly documenting build processes to engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions, and our contributions back to the community. Finally, we introduce the most recent OSS contributions from the NASA Earth Science program and promote these projects for wider community review and adoption.

  12. Open source Modeling and optimization tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state's planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  13. Limitations of Phased Array Beamforming in Open Rotor Noise Source Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Csaba; Envia, Edmane; Podboy, Gary G.

    2013-01-01

    Phased array beamforming results of the F31/A31 historical baseline counter-rotating open rotor blade set were investigated for measurement data taken on the NASA Counter-Rotating Open Rotor Propulsion Rig in the 9- by 15-Foot Low-Speed Wind Tunnel of NASA Glenn Research Center, as well as for data produced using the LINPROP open rotor tone noise code. The planar microphone array was positioned broadside and parallel to the axis of the open rotor, roughly 2.3 rotor diameters away. The results provide insight as to why the apparent noise sources of the blade-passing-frequency tones and interaction tones appear at their nominal Mach radii instead of at the actual noise sources, even if those locations are not on the blades. Contour maps of the sound fields produced by the radiating sound waves, taken from the simulations, are used to illustrate how the spinning circumferential modes generated by rotating coherent noise sources interact with the phased array, often giving misleading results, as the apparent sources do not always show where the actual noise sources are located. This suggests that a more sophisticated source model would be required to accurately locate the sources of each tone. The results of this study also have implications with regard to the shielding of open rotor sources by airframe empennages.
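
The baseline technique whose limitations the study probes is conventional delay-and-sum beamforming. A minimal sketch for a generic microphone array follows; the geometry, sample rate, and array layout here are illustrative assumptions, not the processing chain used on the NASA rig:

```python
import numpy as np

def delay_and_sum(signals, mic_pos, grid, fs, c=343.0):
    """Time-domain delay-and-sum beamformer.

    signals: (n_mics, n_samples) array of synchronized recordings.
    mic_pos, grid: (n_mics, 3) and (n_points, 3) Cartesian positions in m.
    Returns the mean-square beamformer output at each grid point; peaks
    mark *apparent* (not necessarily actual) source locations."""
    power = np.empty(len(grid))
    for i, g in enumerate(grid):
        dists = np.linalg.norm(mic_pos - g, axis=1)
        # Relative propagation delays from this candidate point, in samples.
        shifts = np.round((dists - dists.min()) / c * fs).astype(int)
        # Advance each channel by its relative delay, then average coherently.
        aligned = [np.roll(sig, -s) for sig, s in zip(signals, shifts)]
        power[i] = np.mean(np.mean(aligned, axis=0) ** 2)
    return power
```

Because the steering model assumes fixed point sources, rotating coherent sources of the kind discussed above can still produce a well-focused but misplaced peak.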

  14. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    Direct geo-referencing systems use remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can make measurements directly on the images. In order to calculate position properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinates, and camera position. However, it is very expensive, and users cannot use the result immediately because the position information is not embedded into the image. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate position with the open source software OpenCV. In the end, we use the open source panorama browser Panini and integrate all of these into the open source GIS software Quantum GIS. In this way a complete data-collection and data-processing system can be constructed.

  15. Prediction of aerodynamic tonal noise from open rotors

    NASA Astrophysics Data System (ADS)

    Sharma, Anupam; Chen, Hsuan-nien

    2013-08-01

    A numerical approach for predicting tonal aerodynamic noise from "open rotors" is presented. "Open rotor" refers to an engine architecture with a pair of counter-rotating propellers. Typical noise spectra from an open rotor consist of dominant tones, which arise due to both the steady loading/thickness and the aerodynamic interaction between the two blade rows. The proposed prediction approach utilizes Reynolds-Averaged Navier-Stokes (RANS) Computational Fluid Dynamics (CFD) simulations to obtain a near-field description of the noise sources. The near-to-far-field propagation is then carried out by solving the Ffowcs Williams-Hawkings equation. Since the interest of this paper is limited to tone noise, a linearized, frequency-domain approach is adopted to solve the wake/vortex-blade interaction problem. This paper focuses primarily on the speed scaling of the aerodynamic tonal noise from open rotors. Even though there is no theoretical mode cut-off due to the absence of a nacelle in open rotors, the far-field noise is a strong function of the azimuthal mode order. While the steady loading/thickness noise has circumferential modes of high order, due to the relatively large number of blades (≈10-12), the interaction noise typically has modes of small orders. The high mode orders have very low radiation efficiency and exhibit very strong scaling with Mach number, while the low mode orders show a relatively weaker scaling. The prediction approach is able to capture the speed scaling (observed in experiment) of the overall aerodynamic noise very well.

  16. Development of an Open Source, Air-Deployable Weather Station

    NASA Astrophysics Data System (ADS)

    Krejci, A.; Lopez Alcala, J. M.; Nelke, M.; Wagner, J.; Udell, C.; Higgins, C. W.; Selker, J. S.

    2017-12-01

    We created a packaged weather station intended to be deployed in the air on tethered systems. The device incorporates lightweight sensors and parts and runs for up to 24 hours on lithium polymer batteries, allowing the entire package to be supported by a thin fiber. Because the fiber does not provide a stable platform, attitude data (pitch and roll) are determined, in addition to the typical weather parameters (e.g., temperature, pressure, humidity, wind speed, and wind direction), using an embedded inertial motion unit. All designs, including electronics, CAD drawings, and descriptions of assembly, are open source and can be found on the OPEnS lab website at http://www.open-sensing.org/lowcost-weather-station/. The Openly Published Environmental Sensing Lab (OPEnS: Open-Sensing.org) expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. New OPEnS labs are now being established in India, France, Switzerland, the Netherlands, and Ghana.

  17. Software for Real-Time Analysis of Subsonic Test Shot Accuracy

    DTIC Science & Technology

    2014-03-01

    used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming…video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to…DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains

  18. What an open source clinical trial community can learn from hackers

    PubMed Central

    Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico

    2014-01-01

    Summary Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248

  19. A Transparent Translation from Legacy System Model into Common Information Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen

    Advances in smart grid technology are pushing utilities toward better monitoring, control, and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information-exchange and data-sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems are not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper develops an approach and an open-source tool to translate legacy system models into CIM format. The developed tool is tested on a commercial distribution management system, and simulation results have proved its effectiveness.

  20. US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access

    NASA Astrophysics Data System (ADS)

    Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.

    2012-04-01

    The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and OGC CSW 2.0.2. Currently, data services are being deployed for the US Dept. of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvesting from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse-and-search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.

  1. 20. INTERIOR OF SIDEENTRY UTILITY ROOM SHOWING OPEN 1 LIGHT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. INTERIOR OF SIDE-ENTRY UTILITY ROOM SHOWING OPEN 1 LIGHT SIDE-EXIT DOOR AT PHOTO LEFT AND 1-LIGHT OVER 1 LIGHT SASH WINDOW INTO PANTRY AT PHOTO RIGHT. VIEW TO SOUTHWEST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA

  2. 11. INTERIOR OF KITCHEN/UTILITY AREA SHOWING OPEN DOORWAY TO LIVING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. INTERIOR OF KITCHEN/UTILITY AREA SHOWING OPEN DOORWAY TO LIVING ROOM, AND BUILT-IN CABINETS AROUND SINK AND 3-LIGHT OVER 3-LIGHT, DOUBLE-HUNG, WOOD-FRAME WINDOW. VIEW TO NORTHWEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA

  3. An Evaluation of Open Source Learning Management Systems According to Administration Tools and Curriculum Design

    ERIC Educational Resources Information Center

    Ozdamli, Fezile

    2007-01-01

    Distance education is becoming more important in universities and schools. The aim of this research is to evaluate currently existing Open Source Learning Management Systems according to administration tools and curriculum design. For this, seventy-two Open Source Learning Management Systems were subjected to a general evaluation. After…

  4. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    ERIC Educational Resources Information Center

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  5. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    ERIC Educational Resources Information Center

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  6. Open Source Meets Virtual Reality--An Instructor's Journey Unearths New Opportunities for Learning, Community, and Academia

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.

    2015-01-01

    Opening with the history, recent advances, and emerging ways to use avatar-based virtual reality, an instructor who has used virtual environments since 2007 shares how these environments bring more options to community building, teaching, and education. With the open-source movement, where the source code for virtual environments was made…

  7. The Implications of Incumbent Intellectual Property Strategies for Open Source Software Success and Commercialization

    ERIC Educational Resources Information Center

    Wen, Wen

    2012-01-01

    While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…

  8. Migrations of the Mind: The Emergence of Open Source Education

    ERIC Educational Resources Information Center

    Glassman, Michael; Bartholomew, Mitchell; Jones, Travis

    2011-01-01

    The authors describe an Open Source approach to education. They define Open Source Education (OSE) as a teaching and learning framework where the use and presentation of information is non-hierarchical, malleable, and subject to the needs and contributions of students as they become "co-owners" of the course. The course transforms itself into an…

  9. Prepare for Impact

    ERIC Educational Resources Information Center

    Waters, John K.

    2010-01-01

    Open source software is poised to make a profound impact on K-12 education. For years industry experts have been predicting the widespread adoption of open source tools by K-12 school districts. They're about to be proved right. The impact may not yet have been profound, but it's fair to say that some open source systems and non-proprietary…

  10. 7 Questions to Ask Open Source Vendors

    ERIC Educational Resources Information Center

    Raths, David

    2012-01-01

    With their budgets under increasing pressure, many campus IT directors are considering open source projects for the first time. On the face of it, the savings can be significant. Commercial emergency-planning software can cost upward of six figures, for example, whereas the open source Kuali Ready might run as little as $15,000 per year when…

  11. Cognitive Readiness Assessment and Reporting: An Open Source Mobile Framework for Operational Decision Support and Performance Improvement

    ERIC Educational Resources Information Center

    Heric, Matthew; Carter, Jenn

    2011-01-01

    Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…

  12. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth, high-latency communications links to test how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The tested open source IPsec software did not meet all of the requirements, and software changes were suggested to meet them.
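    One way to reason about such low-bandwidth, high-latency links is the bandwidth-delay product, which bounds how much data can be in flight at once and therefore how IPsec and TCP buffering behave. A back-of-envelope sketch with illustrative numbers (not figures from the thesis):

```python
def bandwidth_delay_product(rate_bps, one_way_delay_s):
    """Bytes that can be in flight on a link: link rate x round-trip time."""
    rtt = 2 * one_way_delay_s
    return rate_bps * rtt / 8  # convert bits to bytes

# Hypothetical GEO relay link: 1 Mbit/s with ~270 ms one-way latency
bdp = bandwidth_delay_product(1_000_000, 0.270)
print(f"{bdp:.0f} bytes in flight")  # 67500 bytes in flight
```

If socket or IPsec replay-window buffers are smaller than this product, the link cannot be kept full, which is exactly the kind of constraint such a testbed exposes.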

  13. Upon the Shoulders of Giants: Open-Source Hardware and Software in Analytical Chemistry.

    PubMed

    Dryden, Michael D M; Fobel, Ryan; Fobel, Christian; Wheeler, Aaron R

    2017-04-18

    Isaac Newton famously observed that "if I have seen further it is by standing on the shoulders of giants." We propose that this sentiment is a powerful motivation for the "open-source" movement in scientific research, in which creators provide everything needed to replicate a given project online, as well as providing explicit permission for users to use, improve, and share it with others. Here, we write to introduce analytical chemists who are new to the open-source movement to best practices and concepts in this area and to survey the state of open-source research in analytical chemistry. We conclude by considering two examples of open-source projects from our own research group, with the hope that a description of the process, motivations, and results will provide a convincing argument about the benefits that this movement brings to both creators and users.

  14. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing its cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. © 2016 Society for Laboratory Automation and Screening.

  15. 40 CFR 74.44 - Reduced utilization for combustion sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Reduced utilization for combustion....44 Reduced utilization for combustion sources. (a) Calculation of utilization—(1) Annual utilization... reported in accordance with subpart F of this part for combustion sources. “Allowances transferred to all...

  16. 40 CFR 74.44 - Reduced utilization for combustion sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Reduced utilization for combustion....44 Reduced utilization for combustion sources. (a) Calculation of utilization—(1) Annual utilization... reported in accordance with subpart F of this part for combustion sources. “Allowances transferred to all...

  17. 40 CFR 74.44 - Reduced utilization for combustion sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Reduced utilization for combustion....44 Reduced utilization for combustion sources. (a) Calculation of utilization—(1) Annual utilization... reported in accordance with subpart F of this part for combustion sources. “Allowances transferred to all...

  18. 40 CFR 74.44 - Reduced utilization for combustion sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Reduced utilization for combustion....44 Reduced utilization for combustion sources. (a) Calculation of utilization—(1) Annual utilization... reported in accordance with subpart F of this part for combustion sources. “Allowances transferred to all...

  19. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  20. The Privacy and Security Implications of Open Data in Healthcare.

    PubMed

    Kobayashi, Shinji; Kane, Thomas B; Paton, Chris

    2018-04-22

    The International Medical Informatics Association (IMIA) Open Source Working Group (OSWG) initiated a group discussion to discuss current privacy and security issues in the open data movement in the healthcare domain from the perspective of the OSWG membership. Working group members independently reviewed the recent academic and grey literature and sampled a number of current large-scale open data projects to inform the working group discussion. This paper presents an overview of open data repositories and a series of short case reports to highlight relevant issues present in the recent literature concerning the adoption of open approaches to sharing healthcare datasets. Important themes that emerged included data standardisation, the inter-connected nature of the open source and open data movements, and how publishing open data can impact on the ethics, security, and privacy of informatics projects. The open data and open source movements in healthcare share many common philosophies and approaches, including developing international collaborations across multiple organisations and domains of expertise. Both movements aim to reduce the costs of advancing scientific research and improving healthcare provision for people around the world by adopting open intellectual property licence agreements and codes of practice. Implications of the increased adoption of open data in healthcare include the need to balance the security and privacy challenges of opening data sources with the potential benefits of open data for improving research and healthcare delivery. Georg Thieme Verlag KG Stuttgart.

  1. Healthcare Supported by Data Mule Networks in Remote Communities of the Amazon Region

    PubMed Central

    Coutinho, Mauro Margalho; Efrat, Alon; Richa, Andrea

    2014-01-01

    This paper investigates the feasibility of using boats as data mule nodes, carrying medical ultrasound videos from remote and isolated communities in the Amazon region in Brazil, to the main city of that area. The videos will be used by physicians to perform remote analysis and follow-up routine of prenatal examinations of pregnant women. Two open source simulators (the ONE and NS-2) were used to evaluate the results obtained utilizing a CoDPON (continuous displacement plan oriented network). The simulations took into account the connection times between the network nodes (boats) and the number of nodes on each boat route. PMID:27433519

  2. Validation of thermal effects of LED package by using Elmer finite element simulation method

    NASA Astrophysics Data System (ADS)

    Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap

    2017-02-01

    The overall performance of a light-emitting diode (LED) package is critically affected by heat. In this study, the open-source software Elmer FEM was utilized for thermal analysis of the LED package. To perform a complete simulation study, Salome and ParaView were used as the pre- and post-processors. The thermal behaviour of the LED package was evaluated with this software, and the result was validated against commercially licensed software based on previous work. The percentage difference between the two simulation results is less than 5%, which is tolerable and comparable.
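    The junction-temperature estimate that a FEM model refines can be sketched with a lumped series thermal-resistance model. The resistance values below are illustrative assumptions, not data from the study:

```python
def junction_temperature(power_w, ambient_c, resistances_k_per_w):
    """Lumped thermal model: Tj = Ta + P * (sum of series thermal resistances)."""
    return ambient_c + power_w * sum(resistances_k_per_w)

# Hypothetical 1 W LED: die-attach, slug, board, and heatsink resistances in K/W
tj = junction_temperature(1.0, 25.0, [8.0, 4.0, 10.0, 15.0])
print(f"junction temperature: {tj:.1f} C")  # 62.0 C
```

A FEM solution such as Elmer's replaces these lumped resistances with a full 3-D temperature field, but the lumped estimate is a quick sanity check on the simulated result.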

  3. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server side for the storage of microarray datasets collected from various resources. The client side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.

  4. Simulation of partially coherent light propagation using parallel computing devices

    NASA Astrophysics Data System (ADS)

    Magalhães, Tiago C.; Rebordão, José M.

    2017-08-01

    Light acquires or loses coherence, and coherence is one of the few optical observables. Spectra can be derived from coherence functions, and understanding any interferometric experiment also relies upon coherence functions. Beyond the two limiting cases (full coherence or incoherence), the coherence of light is always partial, and it changes with propagation. We have implemented a code to compute the propagation of partially coherent light from the source plane to the observation plane using parallel computing devices (PCDs). In this paper, we restrict propagation to free space only. To this end, we used the Open Computing Language (OpenCL) and the open-source toolkit PyOpenCL, which gives access to OpenCL parallel computation through Python. To test our code, we chose two coherence source models: an incoherent source and a Gaussian Schell-model source. In the former case, we considered two different source shapes: circular and rectangular. The results were compared to the theoretical values. Our implemented code allows one to choose between the PyOpenCL implementation and a standard one, i.e., using the CPU only. To test the computation time for each implementation (PyOpenCL and standard), we used several computer systems with different CPUs and GPUs. We used powers of two for the dimensions of the cross-spectral density matrix (e.g., 32^4, 64^4), and a significant speed increase is observed in the PyOpenCL implementation when compared to the standard one. This can be an important tool for studying new source models.
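    For reference, the source-plane cross-spectral density of a 1-D Gaussian Schell-model source (one of the two source models tested) can be written down directly. The sketch below uses plain Python rather than the paper's PyOpenCL implementation, with arbitrary beam parameters:

```python
import math

def gsm_csd(x1, x2, sigma_s=1.0, delta=0.5):
    """Cross-spectral density of a 1-D Gaussian Schell-model source:
    W(x1, x2) = sqrt(S(x1) S(x2)) * mu(x1 - x2),
    with Gaussian spectral density S (width sigma_s) and Gaussian
    degree of coherence mu (coherence width delta)."""
    s1 = math.exp(-x1**2 / (2 * sigma_s**2))
    s2 = math.exp(-x2**2 / (2 * sigma_s**2))
    mu = math.exp(-(x1 - x2)**2 / (2 * delta**2))  # degree of coherence
    return math.sqrt(s1 * s2) * mu

# On the diagonal, W(x, x) reduces to the spectral density S(x)
print(gsm_csd(0.3, 0.3))
# Coherence decays with point separation
print(gsm_csd(0.0, 1.0) < gsm_csd(0.0, 0.0))
```

Sampling W(x1, x2) on an N-point grid yields the N^2-entry (or, in two transverse dimensions, N^4-entry) matrix whose propagation the paper parallelizes.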

  5. An Open Source Simulation System

    NASA Technical Reports Server (NTRS)

    Slack, Thomas

    2005-01-01

    This document reports an investigation into the current state of the art of open source real-time programming practices: what technologies are available, how easy it is to obtain, configure, and use them, and some performance measures taken on the different systems. A matrix of vendors and their products is included as part of this investigation, but this is not an exhaustive list, and represents only a snapshot in time in a field that is changing rapidly. Specifically, three approaches were investigated: 1. Completely open source on generic hardware, downloaded from the net. 2. Open source packaged by a vendor and provided as a free evaluation copy. 3. Proprietary hardware with pre-loaded, source-available proprietary software provided by the vendor for our evaluation.

  6. A clinic compatible, open source electrophysiology system.

    PubMed

    Hermiz, John; Rogers, Nick; Kaestner, Erik; Ganji, Mehran; Cleary, Dan; Snider, Joseph; Barba, David; Dayeh, Shadi; Halgren, Eric; Gilja, Vikash

    2016-08-01

    Open source electrophysiology (ephys) recording systems have several advantages over commercial systems, such as customization and affordability, enabling more researchers to conduct ephys experiments. Notable open source ephys systems include Open Ephys, NeuroRighter, and more recently Willow, all of which have high channel counts (64+), scalability, and advanced software to develop on top of. However, little work has been done to build an open source ephys system that is clinic compatible, particularly in the operating room, where acute human electrocorticography (ECoG) research is performed. We developed an affordable (<$10,000), open system for research purposes that features power isolation for patient safety, compact and water-resistant enclosures, and 256 recording channels sampled at up to 20 ksamples/s with 16-bit resolution. The system was validated by recording ECoG with a high-density, thin-film device during an acute, awake craniotomy study at the UC San Diego Thornton Hospital operating room.
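    The quoted acquisition figures imply a substantial raw data rate, which can be checked with a one-line calculation (a sketch derived from the stated specifications, not taken from the system's software):

```python
def raw_data_rate(channels, sample_rate_hz, bits_per_sample):
    """Raw acquisition rate in bytes per second, with no compression."""
    return channels * sample_rate_hz * bits_per_sample // 8

# 256 channels x 20 ksamples/s x 16 bits, per the system description
rate = raw_data_rate(256, 20_000, 16)
print(rate, "bytes/s")                    # 10240000 bytes/s
print(rate * 3600 / 1e9, "GB per hour")   # 36.864 GB per hour
```

A sustained rate of roughly 10 MB/s is well within what commodity storage handles, which helps explain how such a system stays under the stated cost ceiling.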

  7. Freeing Worldview's development process: Open source everything!

    NASA Astrophysics Data System (ADS)

    Gunnoe, T.

    2016-12-01

    Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.

  8. Matching Livestock Production Systems and Environment

    NASA Astrophysics Data System (ADS)

    Becchetti, T.; Stackhouse, J.; Snell, L.; Lile, D.; George, H.; Harper, J. M.; Larson, S.; Mashiri, F.; Doran, M.; Barry, S.

    2015-12-01

    Livestock production systems vary greatly over the world. Producers try to match the resources they have with the demands of production; these vary by species, class of animal, number of animals, production goals, and more. Using California's diversity in production systems as an example, we explored how livestock producers best utilize the forage and feed found in different ecosystems and available in different parts of the state. Livestock grazing, the predominant land use in California and in much of the world, makes efficient use of the natural vegetation produced without additional water (irrigation) and with minimal inputs such as fertilizer, while often supporting a variety of conservation objectives including vegetation management, fire-fuels management, and habitat and open-space conservation. The numerous by-products of other sectors of California's agriculture and food industries, such as brewer's grain, cottonseed, and almond hulls, are utilized as feed sources for livestock. These by-products are not only an important feed source, especially in drought years, but are also diverted from the waste stream when utilized by livestock. The concept of matching available resources to livestock needs is often overlooked throughout the world, and production systems are often oversimplified in projects conducting a life-cycle analysis or developing carbon footprints for livestock production systems. This paper provides details on the various production systems found in California, the ecosystems they have adapted to, and how producers use science and ecological knowledge to match the biological requirements of the livestock and conservation objectives to feed and forage resources.

  9. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes (including hydrological processes), biological processes, and human activities. Modelling such systems requires an interdisciplinary approach, coupling models that come from different disciplines and are developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing one to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities, and the connectivity of landscape objects, and iii) run and explore simulations in many ways: using the OpenFLUID user interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open source toolchain including the Eclipse editor, the GCC compiler, and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers.
OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, which is a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site : http://www.openfluid-project.org

  10. Interim Open Source Software (OSS) Policy

    EPA Pesticide Factsheets

    This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.

  11. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high- and low-nutrient-yield areas; web map engines; and data visualization and data management techniques.

  12. Growth and Photovoltaic Properties of High-Quality GaAs Nanowires Prepared by the Two-Source CVD Method.

    PubMed

    Wang, Ying; Yang, Zaixing; Wu, Xiaofeng; Han, Ning; Liu, Hanyu; Wang, Shuobo; Li, Jun; Tse, WaiMan; Yip, SenPo; Chen, Yunfa; Ho, Johnny C

    2016-12-01

    Growing high-quality, low-cost GaAs nanowires (NWs) and fabricating high-performance NW solar cells by facile means are important steps toward cost-effective next-generation photovoltaics. In this work, highly crystalline, dense, and long GaAs NWs are successfully synthesized on non-crystalline SiO2 substrates using a two-source configuration in a simple solid-source chemical vapor deposition method. The high V/III ratio and precursor concentration enabled by this two-source configuration significantly benefit NW growth and suppress crystal defect formation compared with the conventional one-source system. Since fewer NW crystal defects result in fewer electrons being trapped by surface oxides, the p-type conductivity is greatly enhanced, as revealed by electrical characterization of the fabricated NW devices. Furthermore, both individual single NWs and the high-density NW parallel arrays achieved by contact printing can be effectively fabricated into Schottky barrier solar cells simply by employing asymmetric Ni-Al contacts, with an open-circuit voltage of ~0.3 V. All these results indicate the technological promise of these high-quality two-source-grown GaAs NWs, especially for the realization of facile Schottky solar cells utilizing the asymmetric Ni-Al contact.

  13. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts are, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  14. Open Source Software Development Experiences on the Students' Resumes: Do They Count?--Insights from the Employers' Perspectives

    ERIC Educational Resources Information Center

    Long, Ju

    2009-01-01

    Open Source Software (OSS) is a major force in today's Information Technology (IT) landscape. Companies are increasingly using OSS in mission-critical applications. The transparency of the OSS technology itself with openly available source codes makes it ideal for students to participate in the OSS project development. OSS can provide unique…

  15. Open Source Initiative Powers Real-Time Data Streams

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Dryden Flight Research Center, Creare Inc. developed a data collection tool called the Ring Buffered Network Bus. The technology has now been released under an open source license and is hosted by the Open Source DataTurbine Initiative. DataTurbine allows anyone to stream live data from sensors, labs, cameras, ocean buoys, cell phones, and more.

  16. Xtreme Learning Control: Examples of the Open Source Movement's Impact on Our Educational Practice in a University Setting.

    ERIC Educational Resources Information Center

    Dunlap, Joanna C.; Wilson, Brent G.; Young, David L.

    This paper describes how Open Source philosophy, a movement that has developed in opposition to the proprietary software industry, has influenced educational practice in the pursuit of scholarly freedom and authentic learning activities for students and educators. This paper provides a brief overview of the Open Source movement, and describes…

  17. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  18. Assessing the Impact of Security Behavior on the Awareness of Open-Source Intelligence: A Quantitative Study of IT Knowledge Workers

    ERIC Educational Resources Information Center

    Daniels, Daniel B., III

    2014-01-01

    There is a lack of literature linking end-user behavior to the availability of open-source intelligence (OSINT). Most OSINT literature has been focused on the use and assessment of open-source intelligence, not the proliferation of personally or organizationally identifiable information (PII/OII). Additionally, information security studies have…

  19. Looking toward the Future: A Case Study of Open Source Software in the Humanities

    ERIC Educational Resources Information Center

    Quamen, Harvey

    2006-01-01

    In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…

  20. Preparing a scientific manuscript in Linux: Today's possibilities and limitations.

    PubMed

    Tchantchaleishvili, Vakhtang; Schmitto, Jan D

    2011-10-22

    An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow them to prepare a submission-ready scientific manuscript without the need to use proprietary software. Preparation and editing of scientific manuscripts is indeed possible using Linux and open source software. This letter to the editor describes the key steps for preparing a publication-ready scientific manuscript in a Linux-based operating system and discusses the necessary software components. This manuscript was itself created using Linux and open source programs for Linux.

  1. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    NASA Technical Reports Server (NTRS)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  2. Design and Deployment of a General Purpose, Open Source LoRa to Wi-Fi Hub and Data Logger

    NASA Astrophysics Data System (ADS)

    DeBell, T. C.; Udell, C.; Kwon, M.; Selker, J. S.; Lopez Alcala, J. M.

    2017-12-01

    Methods and technologies facilitating internet connectivity and near-real-time status updates for in situ environmental sensor data are of increasing interest in Earth science. However, Open Source, do-it-yourself technologies that enable plug-and-play functionality for web-connected sensors and devices remain largely inaccessible to typical researchers in our community. The Openly Published Environmental Sensing Lab at Oregon State University (OPEnS Lab) constructed an Open Source 900 MHz Long Range Radio (LoRa) receiver hub with an SD card data logger, Ethernet and Wi-Fi shield, and 3D-printed enclosure that dynamically uploads transmissions from multiple wirelessly connected environmental sensing devices. Data transmissions may be received from devices up to 20 km away. The hub time-stamps all transmissions, saves them to the SD card, and uploads them to a Google Drive spreadsheet that can be accessed in near-real-time by researchers and geovisualization applications (such as ArcGIS) for access, visualization, and analysis. This research expands the possibilities of scientific observation of our Earth, transforming the technology, methods, and culture by combining open-source development and cutting-edge technology. This poster details our methods and evaluates the application of 3D printing, the Arduino Integrated Development Environment (IDE), Adafruit's open-hardware Feather development boards, and the WIZNET5500 Ethernet shield in designing this open-source, general-purpose LoRa to Wi-Fi data logger.
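    The hub's data path (receive a packet, time-stamp it, append it to a local log, queue it for cloud upload) can be sketched as below. This is a minimal illustration, not the OPEnS Lab firmware: the packet fields, function name, and queue are hypothetical, and an in-memory text buffer stands in for the SD card.

```python
import csv
import io
import time

upload_queue = []  # rows awaiting a deferred push to the cloud spreadsheet

def handle_packet(packet, log_file, now=None):
    # Time-stamp the transmission, log it locally, and queue it for upload.
    row = {"timestamp": now if now is not None else time.time(), **packet}
    writer = csv.DictWriter(log_file, fieldnames=row.keys())
    writer.writerow(row)      # local log (stands in for the SD card)
    upload_queue.append(row)  # deferred push to the spreadsheet
    return row

log = io.StringIO()  # in-memory stand-in for an SD card file
row = handle_packet({"node": "soil-3", "temp_c": 21.4}, log, now=1500000000)
print(row["node"], len(upload_queue))
```

    Decoupling logging from upload, as above, lets the hub keep recording even when the Wi-Fi link is down.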

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olama, Mohammed M; Allgood, Glenn O; Kuruganti, Phani Teja

    Electric utilities have a primary responsibility to protect the lives and safety of their workers when they are working on low-, medium-, and high-voltage power lines and distribution circuits. With the anticipated widespread deployment of smart grids, a secure and highly reliable means of maintaining isolation of customer-owned distributed generation (DG) from the affected distribution circuits during maintenance is necessary to provide a fully de-energized work area, ensure utility personnel safety, and prevent hazards that can lead to accidents such as accidental electrocution from unanticipated power sources. Some circuits are serviced while energized (live-line work) while others are de-energized for maintenance. For servicing de-energized circuits and equipment, lock-out tag-out (LOTO) programs provide a verifiable procedure for ensuring that circuit breakers are locked in the off state and tagged to indicate that status to operational personnel, so that the lines will be checked for voltage to verify they are de-energized. The de-energized area is isolated from any energized sources, which traditionally are the substations. This procedure works well when all power sources and their interconnections are known; armed with this knowledge, utility personnel can determine the appropriate circuits to de-energize for isolating the target line or equipment. However, with customer-owned DG tied into the grid, the risk of inadvertently re-energizing a circuit increases because circuit connections may not be adequately documented and are not under the direct control of the local utility. Thus, the active device may not be properly de-energized or isolated from the work area. Further, a remote means of de-energizing and locking out energized devices offers greatly reduced safety risk to utility personnel compared with manual operations.
    In this paper, we present a remotely controllable LOTO system that allows individual workers to determine the configuration and status of electrical system circuits and permits them to lock out customer-owned DG devices for safety purposes using a highly secure and ultra-reliable radio signal. The system consists of: (1) individual personal lockout devices, (2) a lockout communications and logic module at the circuit breakers located at all DG devices, and (3) a database and configuration control process located at the utility operations center. The lockout system is a close permissive, i.e., loss of control power or communications will cause the circuit breaker to open. Once the DG device is tripped open, a visual means will provide confirmation of a loss of voltage and current that verifies the disconnected status of the DG. Further, utility personnel will be able to place their own lock electronically on the system to ensure lockout functionality. The proposed LOTO system provides enhanced worker safety and protection against unintentionally energized lines when DG is present. The main approaches and challenges encountered in designing the proposed region-wide LOTO system are discussed in this paper. These approaches include: (1) evaluating the reliability of the proposed approach under N-modular redundancy with voter/spares configurations and (2) conducting a system-level risk assessment study using the failure modes and effects analysis (FMEA) technique to identify and rank failure modes by probability of occurrence, probability of detection, and severity of consequences. This ranking allows a cost-benefit analysis to be conducted such that dollars and effort are applied to the failures that provide the greatest incremental gains in system capability (resilience, survivability, security, reliability, availability, etc.) per dollar spent, whether capital, operations, or investment.
    Several simulation scenarios and their results are presented to demonstrate the viability of these approaches.
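    The FMEA ranking the abstract describes is conventionally computed as a risk priority number, RPN = severity × occurrence × detection, with each factor scored on a 1-10 scale. The sketch below shows that ranking step on hypothetical failure modes and scores (not values from the paper):

```python
# Illustrative FMEA ranking. Each factor is scored 1-10; higher RPN means
# higher priority for mitigation spending.
failure_modes = [
    {"mode": "loss of radio link",       "severity": 9,  "occurrence": 4, "detection": 2},
    {"mode": "breaker fails to open",    "severity": 10, "occurrence": 2, "detection": 3},
    {"mode": "stale configuration data", "severity": 7,  "occurrence": 5, "detection": 6},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Rank highest-risk first so effort goes to the largest incremental gains.
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
for fm in ranked:
    print(f'{fm["mode"]}: RPN={fm["rpn"]}')
```

    With these illustrative scores, the frequently occurring and hard-to-detect failure outranks the rarer but more severe one, which is exactly the kind of prioritization the cost-benefit analysis relies on.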

  4. Improving Data Catalogs with Free and Open Source Software

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; Hankin, S.; O'Brien, K.

    2013-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to both crawl, analyze and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). 
    We'll also demonstrate how we are using free services such as Google Charts to create an easily identifiable visual metaphor which describes the quality of data catalogs. Using this rubric, in conjunction with the ncISO metadata quality rubric, will allow data providers to identify non-compliance issues in their data catalogs, thereby improving data availability to their users and to data discovery systems.
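    The crawl-analyze-rebuild idea behind the catalog cleaning tool can be sketched in miniature: walk a catalog, inspect which access services each dataset declares, and keep only the datasets offering the full, uniform service set. The XML below is a simplified, hypothetical THREDDS-style catalog (real THREDDS catalogs use the Unidata namespace and separate service definitions), and the required-service set is illustrative:

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical THREDDS-style catalog for illustration only.
CATALOG = """
<catalog>
  <dataset name="sst_monthly" urlPath="sst/monthly">
    <access serviceName="odap"/><access serviceName="wms"/>
  </dataset>
  <dataset name="winds_raw" urlPath="winds/raw">
    <access serviceName="http"/>
  </dataset>
</catalog>
"""

REQUIRED = {"odap", "wms"}  # services a "clean" dataset must offer

def clean_datasets(xml_text):
    root = ET.fromstring(xml_text)
    keep = []
    for ds in root.iter("dataset"):
        services = {a.get("serviceName") for a in ds.findall("access")}
        if REQUIRED <= services:  # dataset offers the uniform service set
            keep.append(ds.get("name"))
    return keep

print(clean_datasets(CATALOG))
```

    A real crawler would follow catalogRef links recursively and probe the services themselves, but the filtering decision per dataset has this shape.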

  5. 18. INTERIOR OF KITCHEN SHOWING 1950s VINTAGE CABINETRY, SINK, AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. INTERIOR OF KITCHEN SHOWING 1950s VINTAGE CABINETRY, SINK, AND COUNTER-TOP. OPEN DOOR AT PHOTO LEFT LEADS TO UTILITY ROOM. OPEN DOOR VISIBLE IN UTILITY ROOM LEADS TO THE BATHROOM. VIEW TO WEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA

  6. 18. INTERIOR OF KITCHEN SHOWING OPEN DOOR TO UTILITY ROOM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. INTERIOR OF KITCHEN SHOWING OPEN DOOR TO UTILITY ROOM AND BUILT-IN CABINETRY AROUND SINK, ON EITHER SIDE OF 1-LIGHT OVER 1-LIGHT, DOUBLE-HUNG WINDOW, AND ABOVE MAJOR APPLIANCE AREA. VIEW TO WEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA

  7. Single-Step Laser-Assisted Graphene Oxide Reduction and Nonlinear Optical Properties Exploration via CW Laser Excitation

    NASA Astrophysics Data System (ADS)

    Ghasemi, Fatemeh; Razi, Sepehr; Madanipour, Khosro

    2018-02-01

    The synthesis of reduced graphene oxide using pulsed laser irradiation is experimentally investigated. For this purpose, various irradiation conditions were selected and the chemical features of the different products were explored using ultraviolet-visible, Fourier transform infrared, and Raman spectroscopy techniques. Moreover, the nonlinear optical properties of the synthesized products were assessed using open- and closed-aperture Z-scan techniques, in which a continuous-wave laser emitting at 532 nm wavelength was utilized as the excitation source. The results clearly revealed that the degree of graphene oxide reduction depends not only on the irradiation dose (energy of the laser beam × exposure time) but also on the light source wavelength. Furthermore, a strong dependence of the nonlinear optical properties of the products on the degree of de-oxygenation was observed. The experimental results are discussed in detail.
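    For a thin sample, the closed-aperture Z-scan trace is commonly modeled with the standard Sheik-Bahae expression T(x) = 1 + 4·Δφ·x / ((x² + 9)(x² + 1)), where x = z/z₀ and Δφ is the on-axis nonlinear phase shift. The sketch below evaluates that textbook formula with an illustrative Δφ (not a value from this paper):

```python
# Standard thin-sample closed-aperture Z-scan model (Sheik-Bahae form).
def transmittance(x, dphi):
    # x = z / z0 (normalized position), dphi = on-axis nonlinear phase shift
    return 1.0 + 4.0 * dphi * x / ((x**2 + 9.0) * (x**2 + 1.0))

dphi = 0.5  # illustrative positive phase shift (positive nonlinear index)
xs = [i / 10.0 for i in range(-50, 51)]
ts = [transmittance(x, dphi) for x in xs]

# A positive nonlinearity gives the characteristic valley-then-peak trace:
print(min(ts) < 1.0 < max(ts))
```

    The valley-to-peak asymmetry around x = 0 is what lets the sign and magnitude of the nonlinear refractive index be read off the measured trace.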

  8. Helioviewer.org: Enhanced Solar & Heliospheric Data Visualization

    NASA Astrophysics Data System (ADS)

    Stys, J. E.; Ireland, J.; Hughitt, V. K.; Mueller, D.

    2013-12-01

    Helioviewer.org enables the simultaneous exploration of multiple heterogeneous solar data sets. In the latest iteration of this open-source web application, Hinode XRT and Yohkoh SXT join SDO, SOHO, STEREO, and PROBA2 as supported data sources. A newly enhanced user interface expands the utility of Helioviewer.org by adding annotations backed by data from the Heliophysics Events Knowledgebase (HEK). Helioviewer.org can now overlay solar feature and event data via interactive marker pins, extended regions, data labels, and information panels. An interactive timeline provides enhanced browsing and visualization of image data set coverage and solar events. The addition of a size-of-the-Earth indicator gives a sense of the scale of solar and heliospheric features for education and public outreach purposes. Tight integration with the Virtual Solar Observatory and the SDO AIA cutout service enables solar physicists to seamlessly import science data into their SSW/IDL or SunPy/Python data analysis environments.

  9. A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Yan, Wende

    2014-01-01

    Mathematical models are useful in various fields of science and engineering. However, it is a challenge to make a model utilize the open and growing set of functions (e.g., for model inversion) on the R platform, because doing so normally requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that aims to convert a model developed in any computer language into an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.
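    The PEST-style template concept the coupler builds on is simple: parameter markers in a model's input file are replaced with trial values, so an external algorithm can drive the model through its input files instead of its source code. The sketch below uses a hypothetical `#name#` marker syntax and made-up parameter names (PEST's actual template format differs):

```python
import re

# Hypothetical model input template; "#name#" marks a tunable parameter.
TEMPLATE = "infiltration = #k_soil#\nrouting_coeff = #c_route#\n"

def fill_template(template, params):
    # Replace every #name# marker with the matching trial parameter value.
    return re.sub(r"#(\w+)#", lambda m: str(params[m.group(1)]), template)

trial = {"k_soil": 0.35, "c_route": 1.8}
print(fill_template(TEMPLATE, trial))
```

    A calibration loop would write the filled text to the model's input file, run the model, read its outputs, and let the R-side algorithm propose the next trial point.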

  10. A cost/utility analysis of open reduction and internal fixation versus cast immobilization for acute nondisplaced mid-waist scaphoid fractures.

    PubMed

    Davis, Erika N; Chung, Kevin C; Kotsis, Sandra V; Lau, Frank H; Vijan, Sandeep

    2006-04-01

    Open reduction and internal fixation and cast immobilization are both acceptable treatment options for nondisplaced waist fractures of the scaphoid. The authors conducted a cost/utility analysis to weigh open reduction and internal fixation against cast immobilization in the treatment of acute nondisplaced mid-waist scaphoid fractures. The authors used a decision-analytic model to calculate the outcomes and costs of open reduction and internal fixation and cast immobilization, assuming the societal perspective. Utilities were assessed from 50 randomly selected medical students using the time trade-off method. Outcome probabilities taken from the literature were factored into the calculation of quality-adjusted life-years associated with each treatment. The authors estimated medical costs using Medicare reimbursement rates, and costs of lost productivity were estimated by average wages obtained from the U.S. Bureau of Labor Statistics. Open reduction and internal fixation offers greater quality-adjusted life-years compared with casting, with an increase ranging from 0.21 quality-adjusted life-years for the 25- to 34-year age group to 0.04 quality-adjusted life-years for the ≥65-year age group. Open reduction and internal fixation is less costly than casting ($7940 versus $13,851 per patient) because of a longer period of lost productivity with casting. Open reduction and internal fixation is therefore the dominant strategy. When considering only direct costs, the incremental cost/utility ratio for open reduction and internal fixation ranges from $5438 per quality-adjusted life-year for the 25- to 34-year age group to $11,420 for the 55- to 64-year age group, and $29,850 for the ≥65-year age group. Compared with casting, open reduction and internal fixation is cost saving from the societal perspective ($5911 less per patient). When considering only direct costs, open reduction and internal fixation is cost-effective relative to other widely accepted interventions.
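    The "dominant strategy" conclusion follows directly from the arithmetic of incremental cost-utility analysis. Using the abstract's societal-perspective figures for the 25- to 34-year age group:

```python
# Incremental cost/utility comparison of ORIF vs casting, using the
# abstract's societal-perspective numbers for the 25-34 age group.
cost_orif, cost_cast = 7940.0, 13851.0
qaly_gain = 0.21  # extra QALYs from ORIF relative to casting

incremental_cost = cost_orif - cost_cast  # negative: ORIF saves money
icur = incremental_cost / qaly_gain       # dollars per QALY gained

# "Dominant" means cheaper AND more effective -- no trade-off to price.
dominant = incremental_cost < 0 and qaly_gain > 0
print(round(incremental_cost), dominant)
```

    The $5911 saving per patient matches the abstract's societal-perspective figure; a negative incremental cost with a positive QALY gain is precisely what "dominant" means in this framework.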

  11. 77 FR 26476 - Standards of Performance for Greenhouse Gas Emissions for New Stationary Sources: Electric...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ... Performance for Greenhouse Gas Emissions for New Stationary Sources: Electric Utility Generating Units AGENCY... Greenhouse Gas Emissions for New Stationary Sources: Electric Utility Generating Units.'' The EPA is making... for Greenhouse Gas Emissions for New Stationary Sources: Electric Utility Generating Units, and...

  12. Turbulent Bubbly Flow in a Vertical Pipe Computed By an Eddy-Resolving Reynolds Stress Model

    DTIC Science & Technology

    2014-09-19

    the numerical code OpenFOAM®. 1 Introduction Turbulent bubbly flows are encountered in many industrially relevant applications, such as chemical in...performed using the OpenFOAM-2.2.2 computational code utilizing a cell-center-based finite volume method on an unstructured numerical grid. The...the mean Courant number is always below 0.4. The utilized turbulence models were implemented into the so-called twoPhaseEulerFoam solver in OpenFOAM, to
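    The Courant criterion mentioned in the snippet is the standard CFL check Co = u·Δt/Δx, evaluated per cell; keeping its mean (or maximum) below a threshold like 0.4 bounds how far information travels per time step. The sketch below applies that check with illustrative velocities and grid spacing (not the study's values):

```python
# CFL (Courant) number check: Co = u * dt / dx per cell.
def courant(u, dt, dx):
    return u * dt / dx

dt, dx = 1e-4, 1e-3           # time step [s], cell size [m] (illustrative)
velocities = [0.5, 1.2, 3.5]  # sample per-cell velocity magnitudes [m/s]
co = [courant(u, dt, dx) for u in velocities]

# Transient solvers commonly shrink dt until the Courant limit holds:
print(max(co) <= 0.4)
```

    If the check fails, an adaptive time-stepping loop would reduce dt proportionally and re-evaluate.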

  13. Open Source Tools for Assessment of Global Water Availability, Demands, and Scarcity

    NASA Astrophysics Data System (ADS)

    Li, X.; Vernon, C. R.; Hejazi, M. I.; Link, R. P.; Liu, Y.; Feng, L.; Huang, Z.; Liu, L.

    2017-12-01

    Water availability and water demands are essential factors for estimating water scarcity conditions. To reproduce historical observations and to quantify future changes in water availability and water demand, two open source tools have been developed by the JGCRI (Joint Global Change Research Institute): Xanthos and GCAM-STWD. Xanthos is a gridded global hydrologic model designed to quantify and analyze water availability in 235 river basins. Xanthos uses runoff generation and river routing modules to simulate both historical and future estimates of total runoff and streamflows on a monthly time step at a spatial resolution of 0.5 degrees. GCAM-STWD is a spatiotemporal water disaggregation model used with the Global Change Assessment Model (GCAM) to spatially downscale global water demands for the major end-use sectors (irrigation, domestic, electricity generation, mining, and manufacturing) from the region scale to the 0.5-degree scale. GCAM-STWD then temporally downscales the gridded annual global water demands to monthly results. These two tools, written in Python, can be integrated to assess global, regional, or basin-scale water scarcity or water stress. Both tools are extensible to ensure flexibility and to promote contributions from researchers who utilize GCAM and study global water use and supply.
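    Temporal downscaling of the kind GCAM-STWD performs can be illustrated as splitting an annual demand across 12 months with a normalized weight profile while conserving the annual total. The profile and demand value below are hypothetical (e.g., irrigation peaking in summer), not GCAM-STWD's actual weighting scheme:

```python
# Illustrative annual-to-monthly downscaling for one grid cell and sector.
annual_demand = 1200.0  # e.g., million m^3 per year (hypothetical)
profile = [2, 2, 4, 8, 12, 16, 18, 16, 10, 6, 4, 2]  # relative monthly weights

total_w = sum(profile)
monthly = [annual_demand * w / total_w for w in profile]

# Conservation check: monthly values must re-aggregate to the annual total.
print(abs(sum(monthly) - annual_demand) < 1e-9)
```

    The conservation check is the essential property of any such disaggregation: downscaling redistributes demand in time but must not create or destroy it.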

  14. Advancing the Power and Utility of Server-Side Aggregation

    NASA Technical Reports Server (NTRS)

    Fulker, Dave; Gallagher, James

    2016-01-01

    During the upcoming Summer 2016 meeting of the ESIP Federation (July 19-22), OPeNDAP will hold a Developers and Users Workshop. While a broad set of topics will be covered, a key focus is capitalizing on recent EOSDIS-sponsored advances in Hyrax, OPeNDAP's own software for server-side realization of the DAP2 and DAP4 protocols. These Hyrax advances are as important to data users as to data providers, and the workshop will include hands-on experiences of value to both. Specifically, a balanced set of presentations and hands-on tutorials will address advances in (1) server installation, (2) server configuration, (3) Hyrax aggregation capabilities, (4) support for data access from clients that are HTTP-based, JSON-based, or OGC-compliant (especially WCS and WMS), (5) support for DAP4, (6) use and extension of server-side computational capabilities, and (7) several performance-affecting matters. Topics 2 through 7 will be relevant to data consumers, data providers and, notably, due to the open-source nature of all OPeNDAP software, to developers wishing to extend Hyrax, to build compatible clients and servers, and/or to employ Hyrax as middleware that enables interoperability across a variety of end-user and source-data contexts. A session for contributed talks will elaborate the topics listed above and embrace additional ones.

  15. Pulse Sequence Programming in a Dynamic Visual Environment: SequenceTree

    PubMed Central

    Magland, Jeremy F.; Li, Cheng; Langham, Michael C.; Wehrli, Felix W.

    2015-01-01

    Purpose: To describe SequenceTree (ST), an open-source, integrated software environment for implementing MRI pulse sequences and exporting them to actual MRI scanners. The software is a user-friendly alternative to vendor-supplied pulse sequence design and editing tools and is suited for non-programmers and programmers alike. Methods: The integrated user interface was programmed using the Qt4/C++ toolkit. As parameters and code are modified, the pulse sequence diagram is automatically updated within the user interface. Several aspects of pulse programming are handled automatically, allowing users to focus on higher-level aspects of sequence design. Sequences can be simulated using a built-in Bloch equation solver and then exported for use on a Siemens MRI scanner; ideally, other types of scanners will be supported in the future. Results: The software has been used for eight years in the authors' laboratory and elsewhere and has been utilized in more than fifty peer-reviewed publications in areas such as cardiovascular imaging, solid state and non-proton NMR, MR elastography, and high resolution structural imaging. Conclusion: ST is an innovative, open-source, visual pulse sequence environment for MRI combining simplicity with flexibility, and is ideal for both advanced users and those with limited programming experience. PMID:25754837
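    The physics that a built-in Bloch equation solver integrates can be shown in its simplest closed-form case: free relaxation toward equilibrium with no RF, where longitudinal magnetization recovers with T1 and transverse magnetization decays with T2. This is a minimal sketch of that textbook behavior, not SequenceTree's solver; the relaxation times are illustrative tissue-like values:

```python
import math

def relax(mz0, mxy0, t, t1=1.0, t2=0.1, m0=1.0):
    # Closed-form Bloch relaxation (no RF, on-resonance):
    #   Mz(t)  = M0 + (Mz(0) - M0) * exp(-t/T1)   (longitudinal recovery)
    #   Mxy(t) = Mxy(0) * exp(-t/T2)              (transverse decay)
    mz = m0 + (mz0 - m0) * math.exp(-t / t1)
    mxy = mxy0 * math.exp(-t / t2)
    return mz, mxy

# After an ideal 90-degree pulse (Mz=0, Mxy=1), magnetization recovers:
mz, mxy = relax(mz0=0.0, mxy0=1.0, t=5.0)
print(round(mz, 3), mxy < 1e-6)
```

    A full solver generalizes this by numerically integrating the Bloch equations through each RF pulse and gradient event of the sequence.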

  16. Thermal radiation heat transfer in participating media by finite volume discretization using collimated beam incidence

    NASA Astrophysics Data System (ADS)

    Harijishnu, R.; Jayakumar, J. S.

    2017-09-01

    The main objective of this paper is to study the heat transfer rate of thermal radiation in participating media. For that purpose, a collimated beam was generated and passed through a two-dimensional slab model of flint glass with a refractive index of 2. Both the polar and azimuthal angles were varied to generate such a beam. The temperature of the slab and Snell's law were validated against the Radiation Transfer Equation (RTE) in OpenFOAM (Open Field Operation and Manipulation), a CFD package that is a major computational tool in industry and research applications; the source code was modified so that the radiation heat transfer equation is added to the case, and different radiation heat transfer models are utilized. This work concentrates on numerical strategies involving both transparent and participating media. Since the Radiation Transfer Equation (RTE) is difficult to solve, the purpose of this paper is to use the existing solver buoyantSimpleFoam to solve the radiation model in the participating media, compiling the modified source code to obtain the heat transfer rate inside the slab while varying the intensity of radiation. The Finite Volume Method (FVM) is applied to solve the Radiation Transfer Equation (RTE) governing the above physical phenomena.
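    The simplest limiting case of the RTE in a participating medium, with absorption but no emission or scattering along the ray, is Beer-Lambert attenuation: I(s) = I₀·exp(-κ·s). The sketch below evaluates that one-dimensional decay with illustrative values (not the paper's slab parameters), as a sanity reference for what the full finite-volume RTE solution must reduce to in this limit:

```python
import math

# Beer-Lambert attenuation along a ray: absorption only, no emission/scattering.
def intensity(i0, kappa, s):
    # i0: incident intensity, kappa: absorption coefficient [1/m], s: depth [m]
    return i0 * math.exp(-kappa * s)

i0, kappa = 100.0, 2.0        # illustrative values
depths = [0.0, 0.5, 1.0]      # sample path lengths [m]
profile = [intensity(i0, kappa, s) for s in depths]
print([round(v, 2) for v in profile])
```

    The full RTE adds emission and in-/out-scattering source terms to this decay, which is what makes it expensive enough to need a finite-volume discretization.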

  17. Herschel Observations of Protostellar and Young Stellar Objects in Nearby Molecular Clouds: The DIGIT Open Time Key Project

    NASA Astrophysics Data System (ADS)

    Green, Joel D.; DIGIT OTKP Team

    2010-01-01

    The DIGIT (Dust, Ice, and Gas In Time) Open Time Key Project utilizes the PACS spectrometer (57-210 um) onboard the Herschel Space Observatory to study the colder regions of young stellar objects and protostellar cores, complementary to recent observations from Spitzer and ground-based observatories. DIGIT focuses on 30 embedded sources and 64 disk sources, and includes supporting photometry from PACS and SPIRE, as well as spectroscopy from HIFI, selected from nearby molecular clouds. For the embedded sources, PACS spectroscopy will allow us to address the origin of [CI] and high-J CO lines observed with ISO-LWS. Our observations are sensitive to the presence of cold crystalline water ice, diopside, and carbonates. Additionally, PACS scans are 5x5 maps of the embedded sources and their outflows. Observations of more evolved disk sources will sample low and intermediate mass objects as well as a variety of spectral types from A to M. Many of these sources are extremely rich in mid-IR crystalline dust features, enabling us to test whether similar features can be detected at larger radii, via colder dust emission at longer wavelengths. If processed grains are present only in the inner disk (in the case of full disks) or from the emitting wall surface which marks the outer edge of the gap (in the case of transitional disks), there must be short timescales for dust processing; if processed grains are detected in the outer disk, radial transport must be rapid and efficient. Weak bands of forsterite and clino- and ortho-enstatite in the 60-75 um range provide information about the conditions under which these materials were formed. For the Science Demonstration Phase we are observing an embedded protostar (DK Cha) and a Herbig Ae/Be star (HD 100546), exemplars of the kind of science that DIGIT will achieve over the full program.

  18. Simulation of Mechanical Processes in Gas Storage Caverns for Short-Term Energy Storage

    NASA Astrophysics Data System (ADS)

    Böttcher, Norbert; Nagel, Thomas; Kolditz, Olaf

    2015-04-01

    In recent years, Germany's energy management has begun to shift from fossil fuels to renewable and sustainable energy carriers. Renewable energy sources such as solar and wind power are subject to fluctuations, so the development and extension of energy storage capacities is a priority in German R&D programs. This work is part of the ANGUS+ project, funded by the Federal Ministry of Education and Research, which investigates the influence of subsurface energy storage on the underground. The utilization of subsurface salt caverns as long-term storage reservoirs for fossil fuels is a common method, since the construction of caverns in salt rock by solution mining is inexpensive in comparison to construction in solid rock formations. Another advantage of evaporite as a host material is the self-healing behaviour of salt rock, so the cavity can be assumed to be impermeable. In the framework of short-term energy storage (hours to days), caverns can be used as gas storage reservoirs for natural or artificial fuel gases, such as hydrogen, methane, or compressed air, where the operating pressures inside the caverns will fluctuate more frequently. This work investigates the influence of operating pressures changing at high frequencies on the stability of the host rock of gas storage caverns, utilizing numerical models. To this end, we developed a coupled thermo-hydro-mechanical (THM) model based on the finite element method, utilizing the open-source software platform OpenGeoSys. The salt behaviour is described by well-known constitutive material models which are capable of predicting creep, self-healing, and dilatancy processes. Our simulations include the thermodynamic behaviour of the gas storage process, temperature development and distribution at the cavern boundary, the deformation of the cavern geometry, and the prediction of the dilatancy zone.
    Based on the numerical results, optimal operation modes can be found for individual caverns, so the risk of host rock damage can be minimized. Furthermore, the model can be used to design efficient monitoring programs to detect possible variations of the host rock due to construction and operation of the storage facility. The developed model will be used by public authorities for land use planning issues.
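    A widely used constitutive form for steady-state salt creep of the kind such THM models employ is the Norton power law, strain rate = A·σⁿ·exp(-Q/(R·T)). The sketch below evaluates it with illustrative parameters (not the paper's calibration) to show the strong stress and temperature sensitivity that makes rapid pressure cycling a stability concern:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def creep_rate(sigma_mpa, temp_k, a=1e-6, n=5.0, q=54e3):
    # Norton power-law creep: A * sigma^n * exp(-Q / (R*T)).
    # a, n, q are illustrative salt-like values, not a fitted data set.
    return a * sigma_mpa**n * math.exp(-q / (R * temp_k))

slow = creep_rate(10.0, 310.0)
fast = creep_rate(20.0, 310.0)
print(round(fast / slow))  # doubling stress scales the rate by 2**n = 32
```

    With n = 5, a doubling of deviatoric stress accelerates creep 32-fold, which is why fluctuating cavern pressures feed directly into deformation and dilatancy predictions.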

  19. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    PubMed

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is cross-platform, open-source software freely available at http://opencfu.sourceforge.net.
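    The core counting task, once an image has been thresholded, reduces to labeling connected components. The sketch below is not OpenCFU's algorithm (which adds shape filtering and robustness to touching colonies); it is a minimal flood-fill blob counter on a toy binary grid:

```python
from collections import deque

def count_blobs(grid):
    # Count 4-connected components of 1-cells in a binary grid.
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1          # new blob found; flood-fill it
                seen[r][c] = True
                q = deque([(r, c)])
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

image = [[0, 1, 1, 0, 0],
         [0, 1, 0, 0, 1],
         [0, 0, 0, 1, 1]]
print(count_blobs(image))  # two separate blobs
```

    Real colony counters additionally split merged blobs and reject non-circular shapes, which is where tools like OpenCFU earn their keep.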

  20. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  1. Open-Source Intelligence in the Czech Military: Knowledge System and Process Design

    DTIC Science & Technology

    2002-06-01

    in Open-Source Intelligence OSINT, as one of the intelligence disciplines, bears some of the general problems of the intelligence "business" OSINT...ADAPTING KNOWLEDGE MANAGEMENT THEORY TO THE CZECH MILITARY INTELLIGENCE Knowledge work is the core business of military intelligence. As...NAVAL POSTGRADUATE SCHOOL Monterey, California THESIS Approved for public release; distribution is unlimited OPEN-SOURCE INTELLIGENCE IN THE

  2. Writing in the Disciplines versus Corporate Workplaces: On the Importance of Conflicting Disciplinary Discourses in the Open Source Movement and the Value of Intellectual Property

    ERIC Educational Resources Information Center

    Ballentine, Brian D.

    2009-01-01

    Writing programs and more specifically, Writing in the Disciplines (WID) initiatives have begun to embrace the use of and the ideology inherent to, open source software. The Conference on College Composition and Communication has passed a resolution stating that whenever feasible educators and their institutions consider open source applications.…

  3. Anatomy of BioJS, an open source community for the life sciences.

    PubMed

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  4. Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library

    ERIC Educational Resources Information Center

    Fagan, Jody Condit; Keach, Jennifer A.

    2010-01-01

    When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…

  5. A Framework for the Systematic Collection of Open Source Intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pouchard, Line Catherine; Trien, Joseph P; Dobson, Jonathan D

    2009-01-01

    Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.

  6. Laparoscopic colon resection trends in utilization and rate of conversion to open procedure: a national database review of academic medical centers.

    PubMed

    Simorov, Anton; Shaligram, Abhijit; Shostrom, Valerie; Boilesen, Eugene; Thompson, Jon; Oleynikov, Dmitry

    2012-09-01

    This study aims to examine trends of utilization and rates of conversion to open procedure for patients undergoing laparoscopic colon resections (LCR). This study is a national database review of academic medical centers and a retrospective analysis utilizing the University HealthSystem Consortium administrative database, an alliance of more than 300 academic and affiliate hospitals. A total of 85,712 patients underwent colon resections between October 2008 and December 2011. LCR was attempted in 36,228 patients (42.2%), with 5751 patients (15.8%) requiring conversion to an open procedure. There was a trend toward increasing utilization of LCR from 37.5% in 2008 to 44.1% in 2011. Attempted laparoscopic transverse colectomy had the highest rate of conversion (20.8%), followed by left (20.7%), right (15.6%), and sigmoid (14.3%) colon resections. The rate of utilization was highest in the Mid-Atlantic region (50.5%) and in medium- to large-sized hospitals (47.0%-49.0%). Multivariate logistic regression has shown that increasing age [odds ratio (OR) = 4.8, 95% confidence interval (CI) = 3.6-6.4], male sex (OR = 1.2, 95% CI = 1.1-1.3), open as compared with laparoscopic approach (OR = 2.6, 95% CI = 2.3-3.1), and greater severity of illness category (OR = 27.1, 95% CI = 23.0-31.9) were all associated with increased mortality and morbidity and prolonged length of hospital stay. There is a trend of increasing utilization of LCR, with acceptable conversion rates, across hospitals in the United States in recent years. When feasible, attempted LCR had better outcomes than open colectomy in the immediate perioperative period.
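
    For readers unfamiliar with the reported statistics: an odds ratio from a multivariate logistic regression is the exponential of the model coefficient, and its Wald 95% confidence interval comes from exponentiating the coefficient plus or minus 1.96 standard errors. A small sketch; the coefficient and standard error below are invented, not taken from the study.

```python
# Relating a logistic-regression coefficient to an odds ratio and its
# Wald 95% confidence interval.  The numbers are illustrative only.
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper) for a coefficient beta with standard error se."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# A hypothetical coefficient of 0.96 with standard error 0.08:
or_, lo, hi = odds_ratio_ci(0.96, 0.08)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    Because the interval is built on the log scale, it is symmetric around the coefficient, not around the odds ratio itself.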

  7. The open-source neutral-mass spectrometer on Atmosphere Explorer-C, -D, and -E.

    NASA Technical Reports Server (NTRS)

    Nier, A. O.; Potter, W. E.; Hickman, D. R.; Mauersberger, K.

    1973-01-01

    The open-source mass spectrometer will be used to obtain the number densities of the neutral atmospheric gases in the mass range 1 to 48 amu at the satellite location. The ion source has been designed to allow gas particles to enter the ionizing region with the minimum practicable number of prior collisions with surfaces. This design minimizes the loss of atomic oxygen and other reactive species due to reactions with the walls of the ion source. The principal features of the open-source spectrometer and the laboratory calibration system are discussed.

  8. RADIOLOGICAL SEALED SOURCE LIBRARY: A NUCLEAR FORENSICS TOOL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canaday, Jodi; Chamberlain, David; Finck, Martha

    If a terrorist were to obtain and possibly detonate a device that contained radiological material, radiological forensic analysis of the material and source capsule could provide law enforcement with valuable clues about the origin of the radiological material; this information could then provide further leads on where the material and sealed source was obtained, and the loss of control point. This information could potentially be utilized for attribution and prosecution. Analyses of nuclear forensic signatures for radiological materials are generally understood to include isotopic ratios, trace element concentrations, the time since irradiation or purification, and morphology. Radiological forensic signatures for sealed sources provide additional information that leverages the physical design and chemical composition of the source capsule and containers, and physical markings indicative of an owner or manufacturer. Argonne National Laboratory (Argonne), in collaboration with Idaho National Laboratory (INL), has been working since 2003 to understand signatures that could be used to identify specific source manufacturers. These signatures include the materials from which the capsule is constructed, dimensions, weld details, elemental composition, and isotopic abundances of the radioactive material. These signatures have been compiled in a library known as the Argonne/INL Radiological Sealed Source Library. Data collected for the library have included open-source information from vendor catalogs and web pages; discussions with source manufacturers and touring of production facilities (both protected through non-disclosure agreements); technical publications; and government registries such as the U.S. Nuclear Regulatory Commission’s Sealed Source and Device Registry.

  9. A Clinician-Centered Evaluation of the Usability of AHLTA and Automated Clinical Practice Guidelines at TAMC

    DTIC Science & Technology

    2011-03-31

    evidence-based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology, and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open source, vendor-agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open source, open standard KMR and Clinical Decision Support Engine can enable organizations to share domain

  10. Preparing a scientific manuscript in Linux: Today's possibilities and limitations

    PubMed Central

    2011-01-01

    Background An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need to use proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246

  11. Open source bioimage informatics for cell biology.

    PubMed

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, some of the key general attributes that make an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery.

  12. Implementation, reliability, and feasibility test of an Open-Source PACS.

    PubMed

    Valeri, Gianluca; Zuccaccia, Matteo; Badaloni, Andrea; Ciriaci, Damiano; La Riccia, Luigi; Mazzoni, Giovanni; Maggi, Stefania; Giovagnoni, Andrea

    2015-12-01

    The aim was to implement a hardware and software system able to perform the major functions of an Open-Source PACS, and to analyze it in a simulated real-world environment. A small home network was implemented, and the Open-Source operating system Ubuntu 11.10 was installed on a laptop together with the Dcm4chee suite and the software devices needed. The Open-Source PACS implemented is compatible with Linux OS, Microsoft OS, and Mac OS X; furthermore, it was used with Android and iOS, the operating systems of portable devices (smartphones, tablets). An OSS PACS is useful for making tutorials and workshops on post-processing techniques for educational and training purposes.

  13. Optimization in the utility maximization framework for conservation planning: a comparison of solution procedures in a study of multifunctional agriculture

    PubMed Central

    Stoms, David M.; Davis, Frank W.

    2014-01-01

    Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management. PMID:25538868
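
    The gap the authors quantify, between an exact optimizer and a greedy heuristic, can be reproduced on a toy budget-constrained selection problem (a 0/1 knapsack). The site names, utilities, costs, and budget below are invented for illustration; the study itself applied an open source integer programming approach to real criteria, and the toy gap here is exaggerated relative to the reported gains of up to 12%.

```python
# Greedy benefit/cost selection vs. exhaustive search on a toy
# budget-constrained utility maximization (0/1 knapsack).  All data
# are made up; the optimality gap here is exaggerated for clarity.
from itertools import combinations

utilities = {"A": 10, "B": 6, "C": 6}   # conservation benefit per site
costs     = {"A": 10, "B": 5, "C": 6}   # acquisition cost per site
budget = 10

def greedy(utilities, costs, budget):
    """Pick sites in decreasing benefit/cost order until the budget runs out."""
    total, spent = 0, 0
    for site in sorted(utilities, key=lambda s: utilities[s] / costs[s], reverse=True):
        if spent + costs[site] <= budget:
            total += utilities[site]
            spent += costs[site]
    return total

def optimal(utilities, costs, budget):
    """Exhaustively evaluate every feasible subset (fine for tiny instances)."""
    sites = list(utilities)
    return max(sum(utilities[s] for s in subset)
               for r in range(len(sites) + 1)
               for subset in combinations(sites, r)
               if sum(costs[s] for s in subset) <= budget)

print(greedy(utilities, costs, budget), optimal(utilities, costs, budget))  # 6 10
```

    Real instances replace the exhaustive search with an integer programming solver; the greedy heuristic stays the same but can, as here, commit early to a high-ratio site and strand the remaining budget.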

  14. Optimization in the utility maximization framework for conservation planning: a comparison of solution procedures in a study of multifunctional agriculture

    USGS Publications Warehouse

    Kreitler, Jason R.; Stoms, David M.; Davis, Frank W.

    2014-01-01

    Quantitative methods of spatial conservation prioritization have traditionally been applied to issues in conservation biology and reserve design, though their use in other types of natural resource management is growing. The utility maximization problem is one form of a covering problem where multiple criteria can represent the expected social benefits of conservation action. This approach allows flexibility with a problem formulation that is more general than typical reserve design problems, though the solution methods are very similar. However, few studies have addressed optimization in utility maximization problems for conservation planning, and the effect of solution procedure is largely unquantified. Therefore, this study mapped five criteria describing elements of multifunctional agriculture to determine a hypothetical conservation resource allocation plan for agricultural land conservation in the Central Valley of CA, USA. We compared solution procedures within the utility maximization framework to determine the difference between an open source integer programming approach and a greedy heuristic, and find gains from optimization of up to 12%. We also model land availability for conservation action as a stochastic process and determine the decline in total utility compared to the globally optimal set using both solution algorithms. Our results are comparable to other studies illustrating the benefits of optimization for different conservation planning problems, and highlight the importance of maximizing the effectiveness of limited funding for conservation and natural resource management.

  15. Reuse of coal mining wastes in civil engineering. Part 2: Utilization of minestone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skarzynska, K.M.

    1995-11-01

    The oldest method of minestone utilization is reclamation of spoil heaps by adapting them to the landscape by afforestation or agricultural management. The best method is, however, complete removal of the wastes. Hence, for many years research has been carried out to find new ways of minestone utilization to minimize disposal cost and harmful environmental effects. Earth structures offer the best possibilities of minestone utilization. Investigations conducted in recent years in Germany, the United Kingdom, France, Belgium, the Netherlands and also in Poland have led to the use of many tonnes of wastes in the construction of road and railroad banks, river embankments, dykes and dams, filling of land depressions and open pits, as well as for sea wharfs and land reclamation. This paper presents descriptions of minestone applications to hydraulic, harbor and road engineering as well as to mine backfilling and restoration of derelict land. Effective management of minestone is still the principal problem with respect to safety, economics and environmental protection. Hence, the propagation of minestone utilization of known sources and the search for new methods of its management are essential. Two sections in this review have been devoted to the prevention of spontaneous heating and combustion of minestone and to the impact of minestone structures on the environment and its protection.

  16. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence that arise at various stages of the data lifecycle have been identified. Identifying these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and should improve understanding of software dependency risks for scientific data and how they can be reduced during the data lifecycle.

  17. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to build on a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break DA down into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models, OpenMI, is capable of all these tasks. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources.
The interface is flexible enough that models can interact even if they are coded in different languages, represent processes from different domains, or have different spatial and temporal resolutions. An open source framework that bridges OpenMI and OpenDA is presented. The framework provides a generic and easy means for any OpenMI-compliant model to assimilate observation measurements. An example test case will be presented using MikeSHE, an OpenMI-compliant, fully coupled, integrated hydrological model that can accurately simulate the feedback dynamics of overland flow, the unsaturated zone and the saturated zone.
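
    The error-minimization principle behind DA can be reduced to its simplest form: a scalar Kalman update that blends a model forecast with an observation according to their variances. This is a conceptual sketch only, not OpenDA's API (OpenDA wraps full ensemble and variational schemes around real models), and the soil-moisture numbers are invented.

```python
# Scalar Kalman update: the core of data assimilation, blending a model
# forecast with an observation, each weighted by its uncertainty.
# Conceptual sketch only; the variable values are invented.

def kalman_update(forecast, forecast_var, obs, obs_var):
    """Return the analysis state and its (reduced) variance."""
    gain = forecast_var / (forecast_var + obs_var)   # Kalman gain in [0, 1]
    analysis = forecast + gain * (obs - forecast)    # correct by the innovation
    analysis_var = (1.0 - gain) * forecast_var       # uncertainty shrinks
    return analysis, analysis_var

# Model forecasts soil moisture 0.30 (variance 0.04); a satellite
# retrieval reports 0.20 (variance 0.01).  The analysis lands nearer
# the more certain observation, with lower variance than either input.
state, var = kalman_update(0.30, 0.04, 0.20, 0.01)
print(state, var)  # analysis ~= 0.22, variance ~= 0.008
```

    Full assimilation schemes repeat this forecast-update cycle at every time step, with vectors and covariance matrices in place of the scalars here.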

  18. Health Information-Seeking Patterns of the General Public and Indications for Disease Surveillance: Register-Based Study Using Lyme Disease.

    PubMed

    Pesälä, Samuli; Virtanen, Mikko J; Sane, Jussi; Mustonen, Pekka; Kaila, Minna; Helve, Otto

    2017-11-06

    People using the Internet to find information on health issues, such as specific diseases, usually start their search from a general search engine, for example, Google. Internet searches such as these may yield results and data of questionable quality and reliability. Health Library is a free-of-charge medical portal on the Internet providing medical information for the general public. Physician's Databases, an Internet evidence-based medicine source, provides medical information for health care professionals (HCPs) to support their clinical practice. Both databases are available throughout Finland, but the latter is used only by health professionals and pharmacies. Little is known about how the general public seeks medical information from medical sources on the Internet, how this behavior differs from HCPs' queries, and what causes possible differences in behavior. The aim of our study was to evaluate how the general public's and HCPs' information-seeking trends from Internet medical databases differ seasonally and temporally. In addition, we aimed to evaluate whether the general public's information-seeking trends could be utilized for disease surveillance and whether media coverage could affect these seeking trends. Lyme disease, serving as a well-defined disease model with distinct seasonal variation, was chosen as a case study. Two Internet medical databases, Health Library and Physician's Databases, were used. We compared the general public's article openings on Lyme disease from Health Library to HCPs' article openings on Lyme disease from Physician's Databases seasonally across Finland from 2011 to 2015. Additionally, media publications related to Lyme disease were searched from the largest and most popular media websites in Finland. Both databases, Health Library and Physician's Databases, show visually similar patterns in temporal variations of article openings on Lyme disease in Finland from 2011 to 2015. 
However, Health Library openings show not only an increasing trend over time but also greater fluctuations, especially during peak opening seasons. Outside these seasons, publications in the media coincide with Health Library article openings only occasionally. Lyme disease-related information-seeking behaviors between the general public and HCPs from Internet medical portals share similar temporal variations, which is consistent with the trend seen in epidemiological data. Therefore, the general public's article openings could be used as a supplementary source of information for disease surveillance. The fluctuations in article openings appeared stronger among the general public, suggesting that factors such as media coverage affect the information-seeking behaviors of the public versus professionals. However, media coverage may also have an influence on HCPs. Not every publication was associated with an increase in openings, but the higher a publication's media coverage, the higher the general public's access to Health Library. ©Samuli Pesälä, Mikko J Virtanen, Jussi Sane, Pekka Mustonen, Minna Kaila, Otto Helve. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.11.2017.

  19. Health Information–Seeking Patterns of the General Public and Indications for Disease Surveillance: Register-Based Study Using Lyme Disease

    PubMed Central

    Virtanen, Mikko J; Sane, Jussi; Mustonen, Pekka; Kaila, Minna; Helve, Otto

    2017-01-01

    Background People using the Internet to find information on health issues, such as specific diseases, usually start their search from a general search engine, for example, Google. Internet searches such as these may yield results and data of questionable quality and reliability. Health Library is a free-of-charge medical portal on the Internet providing medical information for the general public. Physician’s Databases, an Internet evidence-based medicine source, provides medical information for health care professionals (HCPs) to support their clinical practice. Both databases are available throughout Finland, but the latter is used only by health professionals and pharmacies. Little is known about how the general public seeks medical information from medical sources on the Internet, how this behavior differs from HCPs’ queries, and what causes possible differences in behavior. Objective The aim of our study was to evaluate how the general public’s and HCPs’ information-seeking trends from Internet medical databases differ seasonally and temporally. In addition, we aimed to evaluate whether the general public’s information-seeking trends could be utilized for disease surveillance and whether media coverage could affect these seeking trends. Methods Lyme disease, serving as a well-defined disease model with distinct seasonal variation, was chosen as a case study. Two Internet medical databases, Health Library and Physician’s Databases, were used. We compared the general public’s article openings on Lyme disease from Health Library to HCPs’ article openings on Lyme disease from Physician’s Databases seasonally across Finland from 2011 to 2015. Additionally, media publications related to Lyme disease were searched from the largest and most popular media websites in Finland. Results Both databases, Health Library and Physician’s Databases, show visually similar patterns in temporal variations of article openings on Lyme disease in Finland from 2011 to 2015. 
However, Health Library openings show not only an increasing trend over time but also greater fluctuations, especially during peak opening seasons. Outside these seasons, publications in the media coincide with Health Library article openings only occasionally. Conclusions Lyme disease–related information-seeking behaviors between the general public and HCPs from Internet medical portals share similar temporal variations, which is consistent with the trend seen in epidemiological data. Therefore, the general public’s article openings could be used as a supplementary source of information for disease surveillance. The fluctuations in article openings appeared stronger among the general public, suggesting that factors such as media coverage affect the information-seeking behaviors of the public versus professionals. However, media coverage may also have an influence on HCPs. Not every publication was associated with an increase in openings, but the higher a publication’s media coverage, the higher the general public’s access to Health Library. PMID:29109071

  20. Basic principles to consider when opening a nurse practitioner-owned practice in Texas.

    PubMed

    Watson, Michael

    2015-12-01

    Advanced Practice Registered Nurse (APRN)-owned clinics in Texas are becoming more common, and because of the success of these early clinics, more APRNs are considering opening their own practices; but Texas remains one of the most restrictive states for APRN practice, and many questions remain. What are the regulations about physician delegation? Will you get reimbursed by insurance companies, and at what rates? Can you be a primary care provider (PCP)? Changes enacted after the adoption of Senate Bill 406 improved the opportunities for APRNs in Texas, yet several requirements must be met, and early consultation with a lawyer and accountant can facilitate the initial business setup. The Prescriptive Authority Agreement simplified the delegation requirements and allows the APRN increased flexibility in obtaining and consulting with a delegating physician. Becoming credentialed as a PCP with private insurance companies is often complicated; however, utilizing the Council for Affordable Quality Healthcare's Universal Provider Datasource for initial credentialing can facilitate this. Although this article does not discuss the financial implications of opening a practice, it does cover many aspects, including legislative and regulatory requirements for practice, the credentialing process and its challenges, business structure, and tax implications. ©2015 American Association of Nurse Practitioners.
