Sample records for source portal software

  1. Evaluating Open Source Portals

    ERIC Educational Resources Information Center

    Goh, Dion; Luyt, Brendan; Chua, Alton; Yee, See-Yong; Poh, Kia-Ngoh; Ng, How-Yeu

    2008-01-01

    Portals have become indispensable for organizations of all types trying to establish themselves on the Web. Unfortunately, there have only been a few evaluative studies of portal software and even fewer of open source portal software. This study aims to add to the available literature in this important area by proposing and testing a checklist for…

  2. Towards easing the configuration and new team member accommodation for open source software based portals

    NASA Astrophysics Data System (ADS)

    Fu, L.; West, P.; Zednik, S.; Fox, P. A.

    2013-12-01

    For simple portals such as vocabulary-based services, which contain small amounts of data and require only hyper-textual representation, it is often overkill to adopt the whole software stack of database, middleware and front end, or to use a general Web development framework as the starting point of development. Directly combining open source software is a much more favorable approach. However, our experience with the Coastal and Marine Spatial Planning Vocabulary (CMSPV) service portal shows that there are still issues, such as system configuration and accommodating new team members, that need to be handled carefully. In this contribution, we share our experience in the context of the CMSPV portal and focus on the tools and mechanisms we developed to ease the configuration job and the incorporation of new project members. We discuss the configuration issues that arise when we don't have complete control over how the software in use is configured and need to follow existing configuration styles that may not be well documented, especially when multiple pieces of such software need to work together as a combined system. The CMSPV portal is built on two pieces of open source software that are still under rapid development: a Fuseki data server and an Epimorphics Linked Data API (ELDA) front end. Both lack mature documentation and tutorials. We developed comparison and labeling tools to ease the problem of system configuration. Another problem that slowed down the project is that project members came and went during development, so new members needed to start with a partially configured system and incomplete documentation left by earlier members. We developed documentation/tutorial maintenance mechanisms, based on our comparison and labeling tools, to make it easier for new members to be incorporated into the project. These tools and mechanisms have also benefited other projects that reused software components from the CMSPV system.
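
    The comparison and labeling tools themselves are not shown in the abstract; the following is a minimal illustrative sketch of a configuration-comparison tool in their spirit, assuming simple key = value configuration files (the filenames and format are hypothetical, not the project's actual ones).

```python
# Minimal sketch of a configuration-comparison tool in the spirit of the
# CMSPV tooling described above. Assumes simple "key = value" files;
# filenames and format are illustrative, not the project's actual ones.
def load_config(path):
    """Parse a key = value config file, ignoring blanks and comments."""
    entries = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            entries[key.strip()] = value.strip()
    return entries

def diff_configs(old_path, new_path):
    """Label every key as added, removed, or changed."""
    old, new = load_config(old_path), load_config(new_path)
    for key in sorted(old.keys() | new.keys()):
        if key not in old:
            print(f"ADDED     {key} = {new[key]}")
        elif key not in new:
            print(f"REMOVED   {key}")
        elif old[key] != new[key]:
            print(f"CHANGED   {key}: {old[key]} -> {new[key]}")

if __name__ == "__main__":
    diff_configs("fuseki-old.cfg", "fuseki-new.cfg")  # hypothetical files
```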

  3. An Invitation to Collaborate: The SPIRIT Open Source Health Care Portal

    PubMed Central

    Bray, Brian; Molin, Joseph Dal

    2001-01-01

    The SPIRIT portal is a web site resulting from a joint project of the European Commission 5th Framework Research Programme for Information Society Technologies, Minoru Development (France), Conecta srl (Italy), and Sistema Information Systems (Italy). The portal indexes and disseminates free software, serves as a meeting point for health care informatics researchers, and provides collaboration services to health care innovators. This poster session describes the services of the portal and invites researchers to join a worldwide collaborative community developing evidence-based health care solutions.

  4. Collaboration using open standards and open source software (examples of DIAS/CEOS Water Portal)

    NASA Astrophysics Data System (ADS)

    Miura, S.; Sekioka, S.; Kuroiwa, K.; Kudo, Y.

    2015-12-01

    The DIAS/CEOS Water Portal is part of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution to users including, but not limited to, scientists, decision makers and officers such as river administrators. One of the functions of this portal is to enable one-stop search of, and access to, various water-related data archived at multiple data centers located all over the world. This portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on the fly and lets users download the data and view rendered images/plots. Our system relies mainly on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat) and open standards such as OGC CSW, OpenSearch and the OPeNDAP protocol to enable the above functions. Details on how it works will be introduced during the presentation. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.
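
    As an illustration of the brokered search interface described above, the following sketch issues an OpenSearch-style query; the endpoint path and parameter names are assumptions for illustration, not the portal's documented interface.

```python
# Sketch of the kind of OpenSearch-style request a brokering portal makes
# on behalf of a user. The endpoint path and parameter names below are
# illustrative assumptions; consult the portal's docs for the real interface.
import requests

params = {
    "si": 1,                    # start index (assumed parameter name)
    "ct": 20,                   # records per page (assumed parameter name)
    "st": "precipitation",      # free-text search terms
}
response = requests.get("http://waterportal.ceos.org/opensearch",
                        params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])      # Atom/XML feed of matching datasets
```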

  5. BioPortal: An Open-Source Community-Based Ontology Repository

    NASA Astrophysics Data System (ADS)

    Noy, N.; NCBO Team

    2011-12-01

    Advances in computing power and new computational techniques have changed the way researchers approach science. In many fields, one of the most fruitful approaches has been to use semantically aware software to break down the barriers among disparate domains, systems, data sources, and technologies. Such software facilitates data aggregation, improves search, and ultimately allows the detection of new associations that were previously not detectable. Achieving these analyses requires software systems that take advantage of the semantics and that can intelligently negotiate domains and knowledge sources, identifying commonality across systems that use different and conflicting vocabularies, while understanding apparent differences that may be concealed by the use of superficially similar terms. An ontology, a semantically rich vocabulary for a domain of interest, is the cornerstone of software for bridging systems, domains, and resources. However, as ontologies become the foundation of all semantic technologies in e-science, we must develop an infrastructure for sharing ontologies, finding and evaluating them, integrating and mapping among them, and using ontologies in applications that help scientists process their data. BioPortal [1] is an open-source, online, community-based ontology repository that has been used as a critical component of semantic infrastructure in several domains, including biomedicine and bio-geochemical data. BioPortal uses social approaches in the Web 2.0 style to bring structure and order to the collection of biomedical ontologies. It enables users to provide and discuss a wide array of knowledge components, from submitting the ontologies themselves, to commenting on and discussing classes in the ontologies, to reviewing ontologies in the context of their own ontology-based projects, to creating mappings between overlapping ontologies and discussing and critiquing the mappings. Critically, it provides web-service access to all its content, enabling its integration in semantically enriched applications. [1] Noy, N.F., Shah, N.H., et al., BioPortal: ontologies and integrated data resources at the click of a mouse. Nucleic Acids Res, 2009. 37(Web Server issue): p. W170-3.
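
    The web-service access mentioned above can be exercised as sketched below; the endpoint and apikey parameter reflect BioPortal's publicly documented REST API, but the details should be treated as assumptions to verify against the current documentation.

```python
# Sketch of web-service access to BioPortal content via its REST API.
# Endpoint and parameters follow BioPortal's public documentation, but
# treat them as assumptions and check the current docs before relying on them.
import requests

API_KEY = "your-bioportal-api-key"  # from a (free) BioPortal account

resp = requests.get(
    "https://data.bioontology.org/search",
    params={"q": "carbon flux", "apikey": API_KEY},
    timeout=30,
)
resp.raise_for_status()
for match in resp.json().get("collection", [])[:5]:
    # Each match is an ontology class; print its label and identifier.
    print(match.get("prefLabel"), "->", match.get("@id"))
```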

  6. Software Products - Naval Oceanography Portal

    Science.gov Websites

    Software products for astronomy from the Naval Oceanography Portal: the Standards of Fundamental Astronomy (SOFA) libraries of the International Astronomical Union, available as Fortran, C, or Python source code for use on a computer or programmable calculator. Current version: 3.1.

  7. SPOCS: software for predicting and visualizing orthology/paralogy relationships among genomes.

    PubMed

    Curtis, Darren S; Phillips, Aaron R; Callister, Stephen J; Conlan, Sean; McCue, Lee Ann

    2013-10-15

    At the rate that prokaryotic genomes can now be generated, comparative genomics studies require a flexible method for quickly and accurately predicting orthologs among the rapidly changing set of genomes available. SPOCS implements a graph-based ortholog prediction method to generate a simple tab-delimited table of orthologs and, in addition, HTML files that provide a visualization of the predicted ortholog/paralog relationships on which gene/protein expression metadata may be overlaid. A SPOCS web application is freely available at http://cbb.pnnl.gov/portal/tools/spocs.html. Source code for Linux systems is also freely available under an open source license at http://cbb.pnnl.gov/portal/software/spocs.html; the Boost C++ libraries and BLAST are required.
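
    SPOCS's exact graph-based method is not reproduced in the abstract; the sketch below illustrates the general idea of ortholog grouping via reciprocal best hits, with best_hits as an assumed precomputed input (not SPOCS's actual data structures).

```python
# Generic illustration of graph-based ortholog grouping (not SPOCS's exact
# algorithm): reciprocal best BLAST hits become edges, and connected
# components become candidate ortholog groups.
import networkx as nx

def ortholog_groups(best_hits, genomes):
    """best_hits[(genome_a, genome_b)][protein] = best hit in genome_b;
    returns a list of sets of (genome, protein) nodes."""
    g = nx.Graph()
    for a in genomes:
        for b in genomes:
            if a >= b:
                continue  # each unordered genome pair once
            ab, ba = best_hits[(a, b)], best_hits[(b, a)]
            for prot, hit in ab.items():
                if ba.get(hit) == prot:      # reciprocal best hit
                    g.add_edge((a, prot), (b, hit))
    return list(nx.connected_components(g))
```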

  8. A Business Case Study of Open Source Software

    DTIC Science & Technology

    2001-07-01

    Linux distributors listed include LinuxPPC (www.linuxppc.com), MandrakeSoft Linux-Mandrake (www.linux-mandrake.com/en/), the CLE Project (cle.linux.org.tw/CLE/e_index.shtml), Red Hat, Coyote Linux (www2.vortech.net/coyte/coyte.htm), MNIS (www.mnis.fr), Data-Portal (www.data-portal.com), Mr O's Linux Emporium (www.ouin.com) and DLX Linux (www.wu...). Figure 11: Worldwide New Linux Shipments (Client and Server), shipments in millions by year, 1998-1999 (source: IDC, 2000).

  9. Web portal for dynamic creation and publication of teaching materials in multiple formats from a single source representation

    NASA Astrophysics Data System (ADS)

    Roganov, E. A.; Roganova, N. A.; Aleksandrov, A. I.; Ukolova, A. V.

    2017-01-01

    We implement a web portal that dynamically creates documents in more than 30 different formats, including HTML, PDF and DOCX, from a single original source. It is built using free software: Markdown (markup language), Pandoc (document converter), MathJax (a library to display mathematical notation in web browsers) and the Ruby on Rails framework. The portal enables the creation of documents with high-quality visualization of mathematical formulas, is compatible with mobile devices, and allows one to search documents by text or formula fragments. Moreover, it gives professors the ability to develop educational materials with the latest technology, without the assistance of qualified technicians, thus improving the quality of the whole educational process.
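
    The conversion step at the heart of the portal can be approximated with a direct Pandoc invocation, as sketched below; the source filename is hypothetical, and PDF output additionally requires a LaTeX installation.

```python
# Sketch of single-source publishing with Pandoc, as the portal does
# server-side: one Markdown source rendered to several formats.
# Requires the pandoc executable; PDF output also needs a LaTeX engine.
import subprocess

SOURCE = "lecture.md"  # hypothetical single-source teaching material

for target in ("lecture.html", "lecture.pdf", "lecture.docx"):
    cmd = ["pandoc", SOURCE, "-o", target]
    if target.endswith(".html"):
        cmd.insert(1, "--mathjax")  # render formulas with MathJax in HTML
    subprocess.run(cmd, check=True)
```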

  10. Continued Funding for Prime Development

    DTIC Science & Technology

    2012-04-18

    The PrIMe Portal is based on the Drupal open-source software. During the past year we upgraded it to version 6. There are currently over 350... Primekinetics.org: Drupal data warehouse, WebDAV access layer, search/request handling, role validation/authorization, and authentication modules... implemented in the PHP language with the help of CMF Drupal 6. The standard modules of the Drupal core set are developed by third parties and obtained from the

  11. Software licensing policy for the Open Source Application Development Portal (OSADP).

    DOT National Transportation Integrated Search

    1998-07-01

    The purpose of the Commercial Vehicle Information Systems and Networks Model Deployment Initiative (CVISN MDI) is to demonstrate the technical and institutional feasibility, costs, and benefits of the primary Intelligent Transportation Systems (ITS) ...

  12. Increase in Efficiency of Use of Pedestrian Radiation Portal Monitors

    NASA Astrophysics Data System (ADS)

    Solovev, D. B.; Merkusheva, A. E.

    2017-11-01

    Most international airports in the world use radiation portal monitors (RPMs) for primary radiation control. In operation, the operators of pedestrian radiation portal monitors (in the Russian Federation, a special subdivision of customs officials) face certain problems in locating the ionizing radiation source that caused a monitor's alarm signal. At standard (factory) settings, radiation portal monitors must detect the illegal movement of radioisotopes by persons passing through a controlled zone whose steady radiation registers on the gamma or neutron channel. The problem is that the number of persons who have undergone treatment or medical diagnostics involving radiopharmaceuticals has recently increased considerably; such persons themselves represent an ionizing radiation source. The operator of the radiation portal monitor must therefore determine very quickly whether a person is a violator (illegally carrying undeclared radioisotopes) or simply a clinical patient who has undergone treatment or diagnostics with radiopharmaceuticals. The article surveys the radioisotopes most often used for medical purposes and proposes new software, developed by the authors, that allows the operator of the radiation portal monitor to locate a person carrying an ionizing radiation source whose activity is similar to the radiation from radiopharmaceuticals.

  13. Gateway Portal

    DTIC Science & Technology

    2004-03-01

    using standard Internet technologies with no additional client software required. Furthermore, using a portable... Wilkerson, Computational and Information Sciences Directorate, ARL. Approved for public release.

  14. Portals Reference Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grant, Ryan E.; Barrett, Brian W.; Pedretti, Kevin

    The Portals reference implementation is based on the Portals 4.x API, published by Sandia National Laboratories as a freely available public document. It is designed to be an implementation of the Portals networking application programming interface and is used by several upper-layer protocols such as SHMEM, GASNet and MPI. It is implemented over existing networks, specifically Ethernet and InfiniBand. This implementation provides Portals networking functionality and serves as a software emulation of Portals-compliant networking hardware. It can be used to develop software against the Portals API prior to the debut of Portals networking hardware, such as Bull's BXI interconnect, as well as a substitute for Portals hardware on development platforms that do not have Portals-compliant hardware. The reference implementation provides new capabilities beyond those of a typical network, namely the ability to have messages matched in hardware in a way compatible with upper-layer software such as MPI or SHMEM. It also offers methods of offloading network operations via triggered operations, which can be used to create offloaded collective operations. Specific details on the Portals API can be found at http://portals4.org.
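
    The triggered-operations concept mentioned above can be sketched language-neutrally as follows; this is a conceptual Python illustration of the counter-threshold semantics, not the real Portals 4 C API.

```python
# Conceptual sketch (Python, not the real C API) of Portals-style triggered
# operations: an operation is queued against an event counter and fires
# automatically once the counter reaches its threshold, enabling offloaded
# collectives without host involvement.
class Counter:
    def __init__(self):
        self.value = 0
        self.triggers = []          # (threshold, operation) pairs

    def attach(self, threshold, operation):
        self.triggers.append((threshold, operation))

    def increment(self):
        self.value += 1
        for threshold, op in list(self.triggers):
            if self.value >= threshold:
                op()                # e.g. a queued put/get in real Portals
                self.triggers.remove((threshold, op))

ct = Counter()
ct.attach(2, lambda: print("both children arrived; forward partial sum"))
ct.increment()   # first child's contribution
ct.increment()   # second child's contribution -> trigger fires
```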

  15. Reverse Engineering and Software Products Reuse to Teach Collaborative Web Portals: A Case Study with Final-Year Computer Science Students

    ERIC Educational Resources Information Center

    Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio

    2010-01-01

    The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…

  16. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    NASA Technical Reports Server (NTRS)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; hide

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and digital elevation model (DEM) data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
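
    The published abstract does not give the combination algorithm; the following is a generic sketch, under an assumed exponential height scaling, of how sparse GPS zenith delays might be gridded and modulated by a DEM (not JPL's actual novel algorithm).

```python
# Illustrative recombination step (not JPL's actual algorithm): interpolate
# sparse GPS zenith delays to a grid, then modulate by terrain height with
# an assumed exponential scale height. Points outside the GPS network's
# convex hull come back as NaN from linear interpolation.
import numpy as np
from scipy.interpolate import griddata

def delay_map(gps_xy, gps_delay, grid_x, grid_y, dem, scale_height_m=2000.0):
    """gps_xy: (N, 2) station coordinates; gps_delay: (N,) zenith delays (m);
    dem: terrain height (m) on the same grid as grid_x/grid_y."""
    sea_level = griddata(gps_xy, gps_delay, (grid_x, grid_y), method="linear")
    return sea_level * np.exp(-dem / scale_height_m)
```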

  17. A Checkup with Open Source Software Revitalizes an Early Electronic Resource Portal

    ERIC Educational Resources Information Center

    Spitzer, Stephan; Brown, Stephen

    2007-01-01

    The Uniformed Services University of the Health Sciences, located on the National Naval Medical Center's campus in Bethesda, Maryland, is a medical education and research facility for the nation's military and public health community. In order to support its approximately 7,500 globally distributed users, the university's James A. Zimble Learning…

  18. Self-service for software development projects and HPC activities

    NASA Astrophysics Data System (ADS)

    Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.

    2014-05-01

    This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user-facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as SourceForge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.
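
    Automation of such administrative actions typically reduces to scripted calls against each tool's REST API; the sketch below creates a Jira project that way, with the URL, credentials and payload fields as illustrative assumptions that depend on the Jira version.

```python
# Sketch of the kind of automation such a self-service portal performs:
# creating a Jira project through Jira's REST API instead of a manual admin
# action. URL, credentials and payload fields are illustrative; the exact
# fields accepted depend on the Jira version in use.
import requests

payload = {
    "key": "MYPROJ",
    "name": "My Software Project",
    "projectTypeKey": "software",
    "lead": "jdoe",
}
resp = requests.post(
    "https://its.example.org/jira/rest/api/2/project",  # hypothetical host
    json=payload,
    auth=("admin", "secret"),
    timeout=30,
)
resp.raise_for_status()
print("created:", resp.json().get("key"))
```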

  19. The Advent of Portals.

    ERIC Educational Resources Information Center

    Jackson, Mary E.

    2002-01-01

    Explains portals as tools that gather a variety of electronic information resources, including local library resources, into a single Web page. Highlights include cross-database searching; integration with university portals and course management software; the ARL (Association of Research Libraries) Scholars Portal Initiative; and selected vendors…

  20. Requirements model for an e-Health awareness portal

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart and foundation of the software engineering process. Poor-quality requirements inevitably lead to poor-quality software solutions, and poor requirements modeling is tantamount to designing a poor-quality product. Quality-assured requirements development therefore goes hand in hand with usable products, giving a software product the quality it demands. In the light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with close attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help fulfill the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  21. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information system software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
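
    A client interacts with such Web processing services through standard OGC requests; the sketch below issues a WPS GetCapabilities call, with the service URL given as an assumption that may have changed.

```python
# Sketch of talking to an OGC Web Processing Service like the one the Geo
# Data Portal exposes. GetCapabilities is the standard discovery request;
# the service URL below is an assumption and may have moved.
import requests

resp = requests.get(
    "https://cida.usgs.gov/gdp/process/WebProcessingService",
    params={"service": "WPS", "version": "1.0.0",
            "request": "GetCapabilities"},
    timeout=60,
)
resp.raise_for_status()
print(resp.text[:400])  # XML listing the available processes
```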

  22. Information Technology. DOD Needs to Strengthen Management of Its Statutorily Mandated Software and System Process Improvement Efforts

    DTIC Science & Technology

    2009-09-01

    ASD(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer. CMMI: Capability Maturity Model Integration. ...a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), the SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI

  23. Best Practices for Building Web Data Portals

    NASA Astrophysics Data System (ADS)

    Anderson, R. A.; Drew, L.

    2013-12-01

    With a data archive of more than 1.5 petabytes and a key role as the NASA Distributed Active Archive Center (DAAC) for synthetic aperture radar (SAR) data, the Alaska Satellite Facility (ASF) has an imperative to develop effective Web data portals. As part of continuous enhancement and expansion of its website, ASF recently created two data portals for distribution of SAR data: one for the archiving and distribution of NASA's MEaSUREs Wetlands project and one for newly digitally processed data from NASA's 1978 Seasat satellite. These case studies informed ASF's development of the following set of best practices for developing Web data portals. 1) Maintain well-organized, quality data. This is fundamental. If data are poorly organized or contain errors, credibility is lost and the data will not be used. 2) Match data to likely data uses. 3) Identify audiences in as much detail as possible. ASF DAAC's Seasat and Wetlands portals target three groups of users: a) scientists already familiar with ASF DAAC's SAR archive and our data download tool, Vertex; b) scientists not familiar with SAR or ASF, but who can use the data for their research of oceans, sea ice, volcanoes, land deformation and other Earth sciences; c) audiences wishing to learn more about SAR and its use in Earth sciences. 4) Identify the heaviest data uses and the terms scientists search for online when trying to find data for those uses. 5) Create search engine optimized (SEO) Web content that corresponds to those searches. Search engines do not yet search raw data, so Web data portals must include content that ties the data to its likely uses. 6) Create Web designs that best serve data users (user-centered design), not reflect how the organization views itself or its data. Usability testing was conducted for the ASF DAAC Wetlands portal to improve the user experience. 7) Use SEO tips and techniques. The ASF DAAC Seasat portal used numerous SEO techniques, including social media, blogging technology, SEO-rich content and more. As a result, it was on the first page of numerous related Google search results within 24 hours of the portal launch. 8) Build in-browser data analysis tools showing scientists how the data can be used in their research. The ASF DAAC Wetlands portal demonstrates that allowing the user to examine the data quickly and graphically online readily enables users to perceive the value of the data and how to use it. 9) Use responsive Web design (RWD) so content and tools can be accessed from a wide range of devices. Wetlands and Seasat can be accessed from smartphones, tablets and desktops. 10) Use Web frameworks to enable rapid building of new portals using consistent design patterns. Seasat and Wetlands both use Django and Twitter Bootstrap. 11) Use load-balanced servers if high demand for the data is anticipated. Using load-balanced servers for the Seasat and Wetlands portals allows ASF to simply add hardware as needed to support increased capacity. 12) Use open-source software when possible. Seasat and Wetlands portal development costs were reduced, and functionality was increased, with the use of open-source software. 13) Use third-party virtual servers (e.g., Amazon EC2 and S3 services) where applicable. 14) Track visitors using analytic tools. 15) Continually improve design.
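
    Practice 8 (in-browser analysis tools) implies a small server-side API that portal pages can plot from; the sketch below uses Flask as a stand-in for the Django stack named above, with a fabricated placeholder payload purely to show the shape of such an endpoint.

```python
# Minimal sketch of practice 8 (in-browser data analysis): a tiny JSON API
# a portal page could plot client-side. Flask stands in for the Django
# stack the ASF portals actually use; the payload values are placeholders
# that only illustrate the response shape.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/wetlands/timeseries")
def timeseries():
    # In a real portal this would query the data archive.
    return jsonify({
        "site": "example-site",
        "dates": ["2013-01-01", "2013-02-01", "2013-03-01"],
        "inundated_fraction": [0.42, 0.55, 0.61],
    })

if __name__ == "__main__":
    app.run()
```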

  24. Mapping and Modeling Web Portal to Advance Global Monitoring and Climate Research

    NASA Astrophysics Data System (ADS)

    Chang, G.; Malhotra, S.; Bui, B.; Sadaqathulla, S.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Rodriguez, L.; Law, E.

    2011-12-01

    Today, the principal investigators of NASA Earth Science missions develop their own software to manipulate, visualize, and analyze the data collected from Earth, space, and airborne observation instruments. There is very little, if any, collaboration among these principal investigators due to the lack of collaborative tools that would allow these scientists to share data and results. At NASA's Jet Propulsion Laboratory (JPL), under the Lunar Mapping and Modeling Project (LMMP), we have built a web portal that exposes a set of common services to users to allow search, visualization, subsetting, and download of lunar science data. Users also have access to a set of tools that visualize, analyze and annotate the data. These services are developed according to industry standards for data access and manipulation, such as REST and Open Geospatial Consortium (OGC) web services. As a result, users can access the datasets through custom-written applications or off-the-shelf applications such as Google Earth. Even though it is currently used to store and process lunar data, this web portal infrastructure has been designed to support other solar system bodies such as asteroids and planets, including Earth. The infrastructure uses a combination of custom, commercial, and open-source software as well as off-the-shelf hardware and pay-by-use cloud computing services. The use of standardized web service interfaces facilitates platform- and application-independent access to the services and data. For instance, we have software clients for the LMMP portal that provide a rich browsing and analysis experience from a variety of platforms, including iOS and Android mobile platforms and large-screen multi-touch displays with 3-D terrain viewing functions. The service-oriented architecture and design principles utilized in the implementation of the portal make it reusable and scalable, and it could naturally be extended to include a collaborative environment that enables scientists and principal investigators to share their research and analysis seamlessly. In addition, this extension will allow users to easily share their tools and data, and to enrich their mapping and analysis experiences. In this talk, we will describe the advanced data management and portal technologies used to power this collaborative environment. We will further illustrate how this environment can enable, enhance and advance global monitoring and climate research.

  25. HELI-DEM portal for geo-processing services

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia

    2014-05-01

    HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of the Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013, whose major aim is to create a unified digital terrain model that includes the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy, and the Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frames, resolutions and accuracies: the DHM at 25 m resolution from Swisstopo, the DTM at 20 m resolution from Lombardy Region, the DTM at 5 m resolution from Piedmont Region, and the DTM LiDAR PST-A at about 1 m resolution, which covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unique Italian-Swiss geoid with an accuracy of a few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, the prototype of a transnational positioning service; and the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some analysis and processing capabilities through the Internet. With this talk, the authors present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM output of the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, demonstrates open source technologies combined to provide access to geospatial functionality for a wide, non-GIS-expert public. In fact, the system is entirely developed using only open standards and Free and Open Source Software (FOSS), both on the server side (services) and on the client side (interface). In addition to self-developed code, the system relies mainly on the software GRASS 7 [1], ZOO-Project [2], GeoServer [3] and OpenLayers [4] and the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features like profiling, contour extraction, watershed delineation and analysis, derivatives calculation, data extraction and coordinate conversion, but it is evolving, and it is planned to extend it to a series of environmental models that the IST developed in the past, such as dam-break simulation, landslide run-out estimation and floods due to landslide impact in artificial basins. [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed., 406 pp, Springer, New York, 2008. [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open WPS Platform. Proceedings of the 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS), Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 August 2010. [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali [GeoServer, the open source server for interoperable management of geospatial data]. Atti 15a Conferenza Nazionale ASITA, Reggia di Colorno, 15-18 November 2011. [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843. [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006. [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010. [7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.
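
    Geoprocessing features such as contour extraction are wrapped by the portal as WPS services backed by GRASS; the sketch below shows the underlying GRASS 7 call through its Python scripting API, assuming it runs inside a GRASS session and using an illustrative map name.

```python
# Sketch of the kind of geoprocessing the HELI-DEM portal wraps as WPS
# services: contour extraction with GRASS 7's Python scripting API.
# Must run inside a GRASS session; the raster name is illustrative.
import grass.script as gs

gs.run_command(
    "r.contour",
    input="helidem_dtm",    # the cross-border DTM raster (assumed name)
    output="contours_50m",
    step=50,                # contour interval in metres
)
# Inspect the resulting vector map.
print(gs.read_command("v.info", map="contours_50m"))
```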

  26. Next Gen One Portal Usability Evaluation

    NASA Technical Reports Server (NTRS)

    Cross, E. V., III; Perera, J. S.; Hanson, A. M.; English, K.; Vu, L.; Amonette, W.

    2018-01-01

    Each exercise device on the International Space Station (ISS) has a unique, customized software interface with its own layout, hierarchy and operational principles, requiring significant crew training. Furthermore, the software programs are not adaptable and provide no real-time feedback or motivation to enhance the exercise experience and/or prevent injuries. Additionally, the graphical user interfaces (GUIs) of these systems present information through multiple layers, resulting in difficulty navigating to the desired screens and functions. These limitations of current exercise device GUIs lead to increased crew time spent initiating and loading exercises, performing them, logging data and exiting the system. To address these limitations, a Next Generation One Portal (NextGen One Portal) Crew Countermeasure System (CMS) was developed, which utilizes the latest industry guidelines in GUI design to provide an intuitive, easy-to-use approach (i.e., 80% of the functionality gained within 5-10 minutes of initial use, with no or limited formal training required). This is accomplished by providing a consistent interface using common software, reducing crew training and increasing efficiency and user satisfaction while also reducing development and maintenance costs. Results from the usability evaluations showed the NextGen One Portal UI having greater efficiency, learnability, memorability, usability and overall user experience than the current Advanced Resistive Exercise Device (ARED) UI used by astronauts on the ISS. Specifically, the design of the One Portal UI as an app interface, similar to those found in the Apple and Google app stores, assisted many of the participants in grasping the concepts of the interface with minimal training. Although the NextGen One Portal UI was shown to be an overall better interface, observations by the test facilitators noted that specific exercise tasks appeared to have a significant impact on its efficiency. Future updates to the NextGen One Portal UI will address these inefficiencies.

  27. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release it by Summer 2009. PMID:21983545
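
    The workflow structure described above can be modeled as a directed graph of analysis steps carrying artifacts and annotations; the following sketch uses illustrative field names, not G-EX's actual schema.

```python
# Minimal sketch of the workflow structure the Learn Module exposes: a
# directed graph of analysis steps, each holding learning artifacts and
# user annotations. Field names are illustrative, not G-EX's actual schema.
import networkx as nx

workflow = nx.DiGraph()
workflow.add_node("clean_data", artifacts=["screencast.mp4"], notes=[])
workflow.add_node("map_rates", artifacts=["tutorial.pdf"], notes=[])
workflow.add_edge("clean_data", "map_rates")  # analysis order

# Annotate a step, as a user would on the whiteboard interface.
workflow.nodes["map_rates"]["notes"].append("normalize by population first")

# Walk the workflow in analysis order, listing attached artifacts.
for step in nx.topological_sort(workflow):
    print(step, workflow.nodes[step]["artifacts"])
```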

  28. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release it by Summer 2009.

  29. Future-saving audiovisual content for Data Science: Preservation of geoinformatics video heritage with the TIB|AV-Portal

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Plank, Margret; Ziedorn, Frauke

    2015-04-01

    In data-driven research, access to, citation of, and preservation of the full triad of journal article, research data and research software has started to become good scientific practice. To foster the adoption of this practice, the significance of software tools that enable scientists to harness auxiliary audiovisual content in their research work has to be acknowledged. The advent of ubiquitous computer-based audiovisual recording and of corresponding Web 2.0 hosting platforms like YouTube, SlideShare and GitHub has created new ecosystems of contextual information related to scientific software and data, which continue to grow both in size and in variety of content. The current Web 2.0 platforms lack capabilities for long-term archiving and scientific citation, such as persistent identifiers that allow referencing specific intervals of the overall content. The audiovisual content currently shared by scientists ranges from commented how-to demonstrations of software handling, installation and data processing to aggregated visual analytics of the evolution of software projects over time. Such content is a crucial addition to the scientific message, as it ensures that software-based data-processing workflows can be assessed, understood and reused in the future. In the context of data-driven research, such content needs to be accessible through effective search capabilities, enabling the content to be retrieved and ensuring that the content producers receive credit for their efforts within the scientific community. Improved multimedia archiving and retrieval services for scientific audiovisual content which meet these requirements are currently being implemented by the scientific library community. This paper exemplifies the existing challenges, requirements, benefits and potential of the preservation, accessibility and citability of such audiovisual content for the open source communities, based on the new audiovisual web service TIB|AV-Portal of the German National Library of Science and Technology. The web-based portal allows for extended search capabilities based on enhanced metadata derived by automated video analysis. By combining state-of-the-art multimedia retrieval techniques such as speech, text and image recognition with semantic analysis, content-based access to videos at the segment level is provided. Further, by using the open standard Media Fragment Identifier (MFID), a citable Digital Object Identifier is displayed for each video segment. In addition to the continuously growing footprint of contemporary content, the importance of vintage audiovisual information needs to be considered: this paper showcases the successful application of the TIB|AV-Portal in the preservation and provision of a newly discovered version of a GRASS GIS promotional video produced by the US Army Corps of Engineers Construction Engineering Research Laboratory (US-CERL) in 1987. The video provides insight into the constraints of the very early days of the GRASS GIS project, the oldest active Free and Open Source Software (FOSS) GIS project, which has been active for over thirty years. GRASS itself has turned into a collaborative scientific platform, a repository of scientific peer-reviewed code and an algorithm/knowledge hub for future generations of scientists [1]. This is a reference case for future preservation activities regarding semantic-enhanced Web 2.0 content from geospatial software projects within academia and beyond.
References: [1] Chemin, Y., Petras V., Petrasova, A., Landa, M., Gebbert, S., Zambelli, P., Neteler, M., Löwe, P.: GRASS GIS: a peer-reviewed scientific platform and future research Repository, Geophysical Research Abstracts, Vol. 17, EGU2015-8314-1, 2015 (submitted)

  30. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity-modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented in a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) read the MLC file and the PDIP from the TPS; (ii) calculate the fraction of beam-on time for which each point in the IMRT beam is shielded by MLC leaves; (iii) interpolate correction factors from look-up tables; (iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; and (v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler, and its operation was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
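
    Steps (ii)-(iv) of the tool amount to a per-pixel look-up-table correction; the following numerical sketch shows that computation with illustrative array shapes and LUT values (the published tool is C#; this sketch is Python).

```python
# Numerical sketch of the correction described above (steps ii-iv):
# per-pixel MLC-shielded fraction -> correction factor from a look-up
# table -> corrected predicted EPID image. Shapes and LUT values are
# illustrative, not the published tool's calibration data.
import numpy as np

def corrected_pdip(pdip, shielded_fraction, lut_fraction, lut_factor):
    """pdip, shielded_fraction: 2-D arrays; lut_*: 1-D look-up table."""
    factors = np.interp(shielded_fraction, lut_fraction, lut_factor)
    return pdip * factors

pdip = np.ones((4, 4))                 # predicted EPID image (placeholder)
frac = np.random.rand(4, 4)            # fraction of beam-on time under MLC
lut_x = np.array([0.0, 0.5, 1.0])      # shielded fraction
lut_y = np.array([1.0, 0.9, 0.8])      # assumed EPID response correction
print(corrected_pdip(pdip, frac, lut_x, lut_y))
```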

  31. ExPASy: SIB bioinformatics resource portal.

    PubMed

    Artimo, Panu; Jonnalagedda, Manohar; Arnold, Konstantin; Baratin, Delphine; Csardi, Gabor; de Castro, Edouard; Duvaud, Séverine; Flegel, Volker; Fortier, Arnaud; Gasteiger, Elisabeth; Grosdidier, Aurélien; Hernandez, Céline; Ioannidis, Vassilios; Kuznetsov, Dmitry; Liechti, Robin; Moretti, Sébastien; Mostaguir, Khaled; Redaschi, Nicole; Rossier, Grégoire; Xenarios, Ioannis; Stockinger, Heinz

    2012-07-01

    ExPASy (http://www.expasy.org) has a worldwide reputation as one of the main bioinformatics resources for proteomics. It has now evolved into an extensible and integrative portal accessing many scientific resources, databases and software tools in different areas of life sciences. Scientists can now seamlessly access a wide range of resources in many different domains, such as proteomics, genomics, phylogeny/evolution, systems biology, population genetics and transcriptomics. The individual resources (databases, web-based and downloadable software tools) are hosted in a 'decentralized' way by different groups of the SIB Swiss Institute of Bioinformatics and partner institutions. Specifically, a single web portal provides a common entry point to a wide range of resources developed and operated by different SIB groups and external institutions. The portal features a search function across 'selected' resources. Additionally, the availability and usage of resources are monitored. The portal is aimed at both expert users and people who are not familiar with a specific domain in life sciences. The new web interface provides, in particular, visual guidance for newcomers to ExPASy.

  32. European Space Software Repository ESSR

    NASA Astrophysics Data System (ADS)

    Livschitz, Jakob; Blommestijn, Robert

    2016-08-01

    The paper and presentation describe the status of the ESSR (European Space Software Repository); see [1]. They cover the development phases, outline the web portal functionality and explain the process steps behind it. Not only the front end but also the back end is discussed. The ESSR web portal went live internally at ESA on May 15th, 2015, and worldwide on September 19th, 2015. The ESSR is currently in operation.

  33. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    PubMed

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of high-throughput sequencing, including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  34. HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis

    PubMed Central

    David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of high-throughput sequencing, including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057

  35. The Gulf of Mexico Coastal Ocean Observing System: A Gulf Science Portal

    NASA Astrophysics Data System (ADS)

    Howard, M.; Gayanilo, F.; Kobara, S.; Jochens, A. E.

    2013-12-01

    The Gulf of Mexico Coastal Ocean Observing System's (GCOOS) regional science portal (gcoos.org) was designed to aggregate data and model output from distributed providers and to offer these, and derived products, through a single access point in standardized ways to a diverse set of users. The portal evolved under the NOAA-led U.S. Integrated Ocean Observing System (IOOS) program, where automated, largely unattended machine-to-machine interoperability has always been a guiding tenet for system design. The web portal has a business unit where membership lists, news items, and reference materials are kept; a data portal where near-real-time and historical data are held and served; and a products portal where data are fused into products tailored for specific or general stakeholder groups. The staff includes a system architect who built and maintains the data portal, a GIS expert who built and maintains the current product portal, the executive director who marshals resources to keep news items fresh, and a data manager who manages most of this. The business portal is built using WordPress, which was selected because it appeared to be the easiest content management system for non-web programmers to add content to, maintain and enhance. The data portal is custom built and uses a database, PHP, and web services implementing the Open Geospatial Consortium's standards-based Sensor Observation Service (SOS) with Observations and Measurements (O&M) encodings. We employ a standards-based vocabulary, which we helped develop and which is registered at the Marine Metadata Interoperability Ontology Registry and Repository (http://mmisw.org). The registry is currently maintained by one of the authors. Products appearing in the products portal are primarily constructed using ESRI software by a Ph.D.-level geographer. Some products were built with other software, generally by graduate students, over the years. We have been sensitive to the private sector when deciding which products to produce. While science users want numbers, users of all types mainly want maps. We have tried to develop flexible capabilities to present products for a variety of output devices, from desktop screens to smartphones. Software maintenance is a continuing issue, and new initiatives from NOAA add to the workload but improve the system. We will discuss how our data management system has evolved within the backdrop of rapidly changing technologies and diverse community requirements.
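
    A standards-based data request to such a portal takes the form of an OGC SOS GetObservation call, as sketched below; the endpoint URL, offering and observed property are placeholders, not GCOOS's actual identifiers.

```python
# Sketch of a standards-based request to a portal like GCOOS's: an OGC
# Sensor Observation Service GetObservation call. The endpoint, offering
# and observed-property values are placeholders for real identifiers.
import requests

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "urn:ioos:station:example:buoy42",   # placeholder
    "observedProperty": "sea_water_temperature",
    "responseFormat": 'text/xml;subtype="om/1.0.0"',  # O&M encoding
}
resp = requests.get("https://data.example.org/sos", params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:400])  # O&M-encoded observations
```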

  36. ATLAS Live: Collaborative Information Streams

    NASA Astrophysics Data System (ADS)

    Goldfarb, Steven; ATLAS Collaboration

    2011-12-01

    I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, including text scrolling, transition effects, and inter- and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.
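
    The input scripts described above reduce to fetching a remote feed and turning entries into signage items; the sketch below does this for a generic RSS feed, with the URL illustrative and feedparser an assumed dependency.

```python
# Sketch of the kind of input script ATLAS Live uses: pull a remote feed
# (Indico, CDS, or any URL) and reduce it to text items for a signage
# stream. The feed URL is illustrative; feedparser is an assumed dependency.
import feedparser

feed = feedparser.parse("https://example.cern.ch/news/rss")
for entry in feed.entries[:5]:
    # Each title/summary pair becomes one scrolling-text item on screen.
    print(entry.title, "-", entry.get("summary", "")[:80])
```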

  37. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination

    PubMed Central

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-01-01

    Motivation: Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two interoperable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation: Opal and Mica are two standalone but interoperable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features: Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When the two are used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability: Opal and Mica are open-source and freely available at www.obiba.org under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122
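
    Federated queries of this kind are ordinary authenticated web-service calls; the sketch below lists the datasources of an Opal server over REST, with the endpoint path and credentials as assumptions to check against the OBiBa documentation.

```python
# Sketch of programmatic access to an Opal server's web services; Mica
# portals issue similar calls when federating summary statistics. The
# endpoint path and credentials are assumptions; see the OBiBa docs.
import requests

resp = requests.get(
    "https://opal.example.org/ws/datasources",  # assumed REST path
    headers={"Accept": "application/json"},
    auth=("administrator", "password"),
    timeout=30,
)
resp.raise_for_status()
# Assumes a JSON array of datasource descriptions.
for ds in resp.json():
    print(ds.get("name"))
```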

  18. NewProt - a protein engineering portal.

    PubMed

    Schwarte, Andreas; Genz, Maika; Skalden, Lilly; Nobili, Alberto; Vickers, Clare; Melse, Okke; Kuipers, Remko; Joosten, Henk-Jan; Stourac, Jan; Bendl, Jaroslav; Black, Jon; Haase, Peter; Baakman, Coos; Damborsky, Jiri; Bornscheuer, Uwe; Vriend, Gert; Venselaar, Hanka

    2017-06-01

    The NewProt protein engineering portal is a one-stop-shop for in silico protein engineering. It gives access to a large number of servers that compute a wide variety of protein structure characteristics, supporting work on the modification of proteins through the introduction of (multiple) point mutations. The results can be inspected through multiple visualizers. The HOPE software is included to indicate mutations with possible undesired side effects. The HotSpot Wizard software is embedded for the design of mutations that modify a protein's activity, specificity, or stability. The NewProt portal is freely accessible at http://newprot.cmbi.umcn.nl/ and http://newprot.fluidops.net/. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. The DIAS/CEOS Water Portal, distributed system using brokering architecture

    NASA Astrophysics Data System (ADS)

    Miura, Satoko; Sekioka, Shinichi; Kuroiwa, Kaori; Kudo, Yoshiyuki

    2015-04-01

    The DIAS/CEOS Water Portal is one of the DIAS (Data Integration and Analysis System, http://www.editoria.u-tokyo.ac.jp/projects/dias/?locale=en_US) systems for data distribution, serving users including, but not limited to, scientists, decision makers, and officers such as river administrators. This portal has two main functions: one is to search and access data, and the other is to register and share use cases which use datasets provided via this portal. This presentation focuses on the first function, to search and access data. The portal system is distributed in the sense that, while the portal itself is located in Tokyo, the data are located in archive centers which are globally distributed. For example, some in-situ data are archived at the National Center for Atmospheric Research (NCAR) Earth Observing Laboratory in Boulder, Colorado, USA. The NWP station time series and global gridded model output data are archived at the Max Planck Institute for Meteorology (MPIM) in cooperation with the World Data Center for Climate in Hamburg, Germany. Part of the satellite data is archived in DIAS storage at the University of Tokyo, Japan. The portal itself does not store data. Instead, according to requests made by users on the web page, it retrieves data from distributed data centers on-the-fly and lets users download the data and view rendered images/plots. Although some data centers have unique metadata formats and/or data search protocols, our portal's brokering function enables users to search across various data centers at one time, like one-stop shopping. This portal is also connected to other data brokering systems, including the GEOSS DAB (Discovery and Access Broker). As a result, users can search over thousands of datasets and millions of files at one time. Our system mainly relies on the open source software GI-cat (http://essi-lab.eu/do/view/GIcat), the OpenSearch protocol and the OPeNDAP protocol to enable the above functions. Details on how it works will be introduced during the presentation. Users can access the DIAS/CEOS Water Portal system at http://waterportal.ceos.org/.
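
    To make the brokered-search idea concrete, the sketch below runs a keyword query against an OpenSearch endpoint of the kind GI-cat exposes and walks the Atom results. The endpoint URL is a placeholder, and the parameter names (start index, count, search terms) are assumptions; the real names come from the service's OpenSearch description document.

      # A minimal sketch of an OpenSearch keyword query against a GI-cat-style
      # broker; URL and parameter names are assumptions for illustration.
      import requests
      import xml.etree.ElementTree as ET

      OSD_ENDPOINT = "https://broker.example.org/gi-cat/services/opensearch"  # placeholder

      resp = requests.get(
          OSD_ENDPOINT,
          params={"si": 1, "ct": 10, "st": "precipitation"},  # start, count, terms (assumed)
          timeout=30,
      )
      resp.raise_for_status()

      # OpenSearch results are commonly returned as an Atom feed
      feed = ET.fromstring(resp.content)
      ns = {"atom": "http://www.w3.org/2005/Atom"}
      for entry in feed.findall("atom:entry", ns):
          print(entry.findtext("atom:title", default="(untitled)", namespaces=ns))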

  20. Software - Naval Oceanography Portal

    Science.gov Websites

    USNO › Earth Orientation › Software: Earth Orientation search databases, auxiliary software, supporting software, and the Earth Orientation Matrix Calculator.

  1. The GPlates Portal: Cloud-Based Interactive 3D Visualization of Global Geophysical and Geological Data in a Web Browser.

    PubMed

    Müller, R Dietmar; Qin, Xiaodong; Sandwell, David T; Dutkiewicz, Adriana; Williams, Simon E; Flament, Nicolas; Maus, Stefan; Seton, Maria

    2016-01-01

    The pace of scientific discovery is being transformed by the availability of 'big data' and open access, open source software tools. These innovations open up new avenues for how scientists communicate and share data and ideas with each other and with the general public. Here, we describe our efforts to bring to life our studies of the Earth system, both at present day and through deep geological time. The GPlates Portal (portal.gplates.org) is a gateway to a series of virtual globes based on the Cesium Javascript library. The portal allows fast interactive visualization of global geophysical and geological data sets, draped over digital terrain models. The globes use WebGL for hardware-accelerated graphics and are cross-platform and cross-browser compatible with complete camera control. The globes include a visualization of a high-resolution global digital elevation model and the vertical gradient of the global gravity field, highlighting small-scale seafloor fabric such as abyssal hills, fracture zones and seamounts in unprecedented detail. The portal also features globes portraying seafloor geology and a global data set of marine magnetic anomaly identifications. The portal is specifically designed to visualize models of the Earth through geological time. These space-time globes include tectonic reconstructions of the Earth's gravity and magnetic fields, and several models of long-wavelength surface dynamic topography through time, including the interactive plotting of vertical motion histories at selected locations. The globes put the on-the-fly visualization of massive data sets at the fingertips of end-users to stimulate teaching and learning and novel avenues of inquiry.

  2. The GPlates Portal: Cloud-Based Interactive 3D Visualization of Global Geophysical and Geological Data in a Web Browser

    PubMed Central

    Müller, R. Dietmar; Qin, Xiaodong; Sandwell, David T.; Dutkiewicz, Adriana; Williams, Simon E.; Flament, Nicolas; Maus, Stefan; Seton, Maria

    2016-01-01

    The pace of scientific discovery is being transformed by the availability of ‘big data’ and open access, open source software tools. These innovations open up new avenues for how scientists communicate and share data and ideas with each other and with the general public. Here, we describe our efforts to bring to life our studies of the Earth system, both at present day and through deep geological time. The GPlates Portal (portal.gplates.org) is a gateway to a series of virtual globes based on the Cesium Javascript library. The portal allows fast interactive visualization of global geophysical and geological data sets, draped over digital terrain models. The globes use WebGL for hardware-accelerated graphics and are cross-platform and cross-browser compatible with complete camera control. The globes include a visualization of a high-resolution global digital elevation model and the vertical gradient of the global gravity field, highlighting small-scale seafloor fabric such as abyssal hills, fracture zones and seamounts in unprecedented detail. The portal also features globes portraying seafloor geology and a global data set of marine magnetic anomaly identifications. The portal is specifically designed to visualize models of the Earth through geological time. These space-time globes include tectonic reconstructions of the Earth’s gravity and magnetic fields, and several models of long-wavelength surface dynamic topography through time, including the interactive plotting of vertical motion histories at selected locations. The globes put the on-the-fly visualization of massive data sets at the fingertips of end-users to stimulate teaching and learning and novel avenues of inquiry. PMID:26960151

  3. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.

  4. The Collaborative Information Portal and NASA's Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Mak, Ronald; Walton, Joan

    2005-01-01

    The Collaborative Information Portal was enterprise software developed jointly by the NASA Ames Research Center and the Jet Propulsion Laboratory for NASA's Mars Exploration Rover mission. Mission managers, engineers, scientists, and researchers used this Internet application to view current staffing and event schedules, download data and image files generated by the rovers, receive broadcast messages, and get accurate times in various Mars and Earth time zones. This article describes the features, architecture, and implementation of this software, and concludes with lessons we learned from its deployment and a look towards future missions.

  5. JPL Genesis and Rapid Intensification Processes (GRIP) Portal

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, P. Peggy; Vu, Quoc A.; Turk, Francis J.; Shen, Tsae-Pyng J.; Hristova-Veleva, Svetla M.; Licata, Stephen J.; Poulsen, William L.

    2012-01-01

    Satellite observations can play a very important role in airborne field campaigns, since they provide a comprehensive description of the environment that is essential for the experiment design, flight planning, and post-experiment scientific data analysis. In the past, it has been difficult to fully utilize data from multiple NASA satellites due to the large data volume, the complexity of accessing NASA's data in near-real-time (NRT), and the lack of software tools to interact with multi-sensor information. The JPL GRIP Portal is a Web portal that serves a comprehensive set of NRT observation data sets from NASA and NOAA satellites describing the atmospheric and oceanic environments related to the genesis and intensification of tropical storms in the North Atlantic Ocean. Together with model forecast data from four major global atmospheric models, this portal provided a useful tool for the scientists and forecasters planning and monitoring the NASA GRIP field campaign during the 2010 Atlantic Ocean hurricane season. The portal uses the Google Earth plug-in to visualize various types of data sets, such as 2D maps, wind vectors, streamlines, 3D data sets presented as series of vertical cross-sections or pointwise vertical profiles, and hurricane best tracks and forecast tracks. Additionally, it allows users to overlay multiple data sets, change the opacity of each image layer, generate animations on the fly with selected data sets, and compare the observation data with the model forecast using two independent calendars. The portal also provides the capability to identify the geographic location of any point of interest. In addition to supporting airborne mission planning, the NRT data and portal will serve as a very rich source of information during the post-field-campaign analysis stage of the airborne experiment. By including a diverse set of satellite observations and model forecasts, it provides a good spatial and temporal context for the high-resolution airborne observations, which are limited in space and time.

  6. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination.

    PubMed

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-10-01

    Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  7. Third party EPID with IGRT capability retrofitted onto an existing medical linear accelerator

    PubMed Central

    Odero, DO; Shimm, DS

    2009-01-01

    Radiation therapy requires precision to avoid unintended irradiation of normal organs. Electronic Portal Imaging Devices (EPIDs) can help with precise patient positioning for accurate treatment. EPIDs are now bundled with new linear accelerators, or they can be purchased from the linac manufacturer for retrofit. Retrofitting a third-party EPID to a linear accelerator can pose challenges. The authors describe a relatively inexpensive third-party CCD camera-based EPID manufactured by TheraView (Cablon Medical B.V.), installed onto a Siemens Primus linear accelerator, and integrated with a Lantis record-and-verify system, an Oldelft simulator with a Digital Therapy Imaging (DTI) unit, and a Philips ADAC Pinnacle treatment planning system (TPS). This system integrates well with existing equipment, and its software can process DICOM images from other sources. The system provides a complete imaging solution that eliminates the need for separate software for portal image viewing, interpretation, analysis, archiving, image-guided radiation therapy and other image management applications. It can also be accessed remotely via safe VPN tunnels. The TheraView EPID retrofit therefore presents an example of a less expensive alternative to linear accelerator manufacturers' proprietary EPIDs, suitable for implementation in radiation therapy departments in third-world countries, which are often faced with limited financial resources. PMID:21611056

  8. Third party EPID with IGRT capability retrofitted onto an existing medical linear accelerator.

    PubMed

    Odero, D O; Shimm, D S

    2009-07-01

    Radiation therapy requires precision to avoid unintended irradiation of normal organs. Electronic Portal Imaging Devices (EPIDs) can help with precise patient positioning for accurate treatment. EPIDs are now bundled with new linear accelerators, or they can be purchased from the linac manufacturer for retrofit. Retrofitting a third-party EPID to a linear accelerator can pose challenges. The authors describe a relatively inexpensive third-party CCD camera-based EPID manufactured by TheraView (Cablon Medical B.V.), installed onto a Siemens Primus linear accelerator, and integrated with a Lantis record-and-verify system, an Oldelft simulator with a Digital Therapy Imaging (DTI) unit, and a Philips ADAC Pinnacle treatment planning system (TPS). This system integrates well with existing equipment, and its software can process DICOM images from other sources. The system provides a complete imaging solution that eliminates the need for separate software for portal image viewing, interpretation, analysis, archiving, image-guided radiation therapy and other image management applications. It can also be accessed remotely via safe VPN tunnels. The TheraView EPID retrofit therefore presents an example of a less expensive alternative to linear accelerator manufacturers' proprietary EPIDs, suitable for implementation in radiation therapy departments in third-world countries, which are often faced with limited financial resources.

  9. The Orthanc Ecosystem for Medical Imaging.

    PubMed

    Jodogne, Sébastien

    2018-05-03

    This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.
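
    For a sense of how external software can drive the Orthanc server with minimal DICOM knowledge, the sketch below lists stored studies through Orthanc's documented REST API (endpoints /studies and /studies/{id}); the host, port, and credentials are placeholders for a local demo installation.

      # Sketch of querying an Orthanc server's REST API (default port 8042).
      # /studies and /studies/{id} are documented Orthanc endpoints; the host
      # and credentials below are placeholders for a local test install.
      import requests

      ORTHANC = "http://localhost:8042"
      auth = ("orthanc", "orthanc")  # demo credentials; change in production

      # List the identifiers of all studies stored in the archive
      study_ids = requests.get(f"{ORTHANC}/studies", auth=auth, timeout=30).json()

      for sid in study_ids[:5]:
          study = requests.get(f"{ORTHANC}/studies/{sid}", auth=auth, timeout=30).json()
          tags = study.get("MainDicomTags", {})
          print(sid, tags.get("StudyDescription", ""))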

  10. Description of the U.S. Geological Survey Geo Data Portal data integration framework

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.

    2012-01-01

    The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
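
    As a hedged illustration of the open-standard services such a framework builds on, the sketch below uses the OWSLib library to list the processes advertised by an OGC Web Processing Service. The endpoint URL reflects a historical Geo Data Portal address and is an assumption here; it may have moved, so check current USGS documentation.

      # Hedged sketch: listing processes on an OGC WPS server with OWSLib.
      # The URL is a historical Geo Data Portal endpoint and may have changed.
      from owslib.wps import WebProcessingService

      WPS_URL = "https://cida.usgs.gov/gdp/process/WebProcessingService"  # assumed

      wps = WebProcessingService(WPS_URL)  # fetches and parses GetCapabilities
      print(wps.identification.title)
      for process in wps.processes:
          print(process.identifier, "-", process.title)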

  11. LifeWatchGreece Portal development: architecture, implementation and challenges for a biodiversity research e-infrastructure.

    PubMed

    Gougousis, Alexandros; Bailly, Nicolas

    2016-01-01

    Biodiversity data are characterized by their cross-disciplinary character, an extremely broad range of data types and structures, and a plethora of different data sources providing resources for the same piece of information in heterogeneous ways. Since the web's inception two decades ago, there have been multiple initiatives to connect, aggregate, share, and publish biodiversity data, and to establish data flows and workflows in order to analyze them. The European programme LifeWatch aims at establishing a distributed network of nodes implementing virtual research environments in Europe to facilitate the work of biodiversity researchers and managers. LifeWatchGreece is one of these nodes, where a portal was developed offering access to a suite of virtual laboratories and e-services. Despite its strict definition in information technology, in practice "portal" is a fairly broad term that embraces many web architectures. In the biodiversity domain, the term "portal" is usually used to indicate either a web site that provides access to a single data repository or an aggregation of data repositories (like http://indiabiodiversity.org/, http://www.mountainbiodiversity.org/, http://data.freshwaterbiodiversity.eu), a web site that gathers information about various online biodiversity tools (like http://test-eubon.ebd.csic.es/, http://marine.lifewatch.eu/), or a web site that just gathers information and news about the biodiversity domain (like http://chm.moew.government.bg). LifeWatchGreece's portal takes the concept of a portal a step further. In strict IT terms, the LifeWatchGreece portal is partly a portal, partly a platform, and partly an aggregator. It includes a number of biodiversity-related web tools integrated into a centrally-controlled software ecosystem. This ecosystem includes subsystems for access control, traffic monitoring, user notifications and web tool management. These subsystems are shared by all the web tools that have been integrated into the portal and are thereby part of this ecosystem. These web tools are not external, completely independent web applications, as is the case in most other portals. A quite obvious (to the user) indication of this is the Single Sign-On (SSO) functionality for all tools and the common user interface wrapper that most of these tools use. Another example of a less obvious functionality is the common user profile that is shared and can be utilized by all tools (e.g., the user's timezone).

  12. VisEL: Visualisation of Expertise Level in a Special Interest Group Knowledge Portal

    NASA Astrophysics Data System (ADS)

    Zulhafizsyam Wan Ahmad, Wan Muhammad; Sulaiman, Shahida; Yusof, Umi Kalsom

    A variety of portals are available nowadays to support diverse purposes, such as commercial, publishing, personal, affinity and corporate portals. Affinity portals promote electronic communities that share a common interest, such as a special interest group (SIG). The knowledge portal is an emerging trend that builds on existing portal technology by properly representing the members' shared knowledge. Besides the textual representation of diverse expertise levels, graphical visualisation can support the requirements of searching for and representing expertise levels within an e-community. A number of SIG portals already exist. However, they do not visualise the expertise level of members effectively and accurately, and they make it difficult for users to find their targeted experts, for instance when searching for the member with the highest expertise level in order to hold a discussion and solve problems related to a project. The goal of this paper is to propose a graphical visualisation of expertise level method (VisEL) using an interactive tag cloud technique that represents the expertise level of each member based on their knowledge in a software engineering SIG portal.
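
    To make the tag-cloud idea concrete, the sketch below shows one simple, assumed scheme for mapping expertise scores to font sizes by linear interpolation; it illustrates the general technique only and is not the VisEL algorithm itself.

      # Illustrative only: linearly map expertise scores to tag-cloud font
      # sizes. This is a generic technique sketch, not the paper's method.
      def tag_font_sizes(scores, min_px=12, max_px=40):
          """Map each member's expertise score to a font size in pixels."""
          lo, hi = min(scores.values()), max(scores.values())
          span = (hi - lo) or 1  # avoid division by zero when all scores are equal
          return {
              member: round(min_px + (score - lo) / span * (max_px - min_px))
              for member, score in scores.items()
          }

      print(tag_font_sizes({"alice": 9.0, "bob": 3.5, "carol": 6.0}))
      # -> {'alice': 40, 'bob': 12, 'carol': 25}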

  13. The Power of Portals: Personalizing the Web To Build Community.

    ERIC Educational Resources Information Center

    Page, Dan

    2001-01-01

    Describes how the director of information systems for the computing and communications department and a team of software developers embarked on the task of creating and refining portal technology for a broad community of users with various relationships to the University of Washington. Discusses focus on individual needs; authentication, the…

  14. Developing Interoperable Air Quality Community Portals

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Husar, R. B.; Yang, C. P.; Robinson, E. M.; Fialkowski, W. E.

    2009-04-01

    Web portals are intended to provide consolidated discovery, filtering and aggregation of content from multiple, distributed web sources targeted at particular user communities. This paper presents a standards-based information architectural approach to developing portals aimed at air quality community collaboration in data access and analysis. An important characteristic of the approach is to advance beyond the present stand-alone design of most portals to achieve interoperability with other portals and information sources. We show how using metadata standards, web services, RSS feeds and other Web 2.0 technologies, such as Yahoo! Pipes and del.icio.us, helps increase interoperability among portals. The approach is illustrated within the context of the GEOSS Architecture Implementation Pilot where an air quality community portal is being developed to provide a user interface between the portals and clearinghouse of the GEOSS Common Infrastructure and the air quality community catalog of metadata and data services.
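
    As one concrete interoperability building block named above, the sketch below aggregates several RSS feeds into a single reverse-chronological stream using the feedparser library; the feed URLs are hypothetical stand-ins for air quality data services.

      # Minimal sketch of portal-side feed aggregation with feedparser;
      # the feed URLs below are hypothetical.
      import feedparser

      FEEDS = [
          "https://airquality.example.org/alerts.rss",    # hypothetical
          "https://portal.example.net/new-datasets.rss",  # hypothetical
      ]

      items = []
      for url in FEEDS:
          feed = feedparser.parse(url)
          for entry in feed.entries:
              items.append((entry.get("published", ""), entry.get("title", ""), url))

      # Merge the feeds into one reverse-chronological stream for the portal page
      for published, title, source in sorted(items, reverse=True)[:20]:
          print(published, "|", title)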

  15. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software searches for "hot spots" in the reconstructed 3D dose distribution. A hot spot is in this study defined as a 4 cm³ cube where the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
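
    The stated hot-spot criterion lends itself to a compact array formulation. The sketch below is a hedged reimplementation of that rule with NumPy/SciPy, assuming a regular dose grid in cGy and a nominal voxel size; the paper's actual code and averaging details are not reproduced here.

      # Hedged sketch of the hot-spot rule: flag voxels at the centre of a
      # ~4 cm^3 cube whose mean reconstructed dose exceeds the mean planned
      # dose by >= 20% and >= 50 cGy. Grid spacing is an assumption.
      import numpy as np
      from scipy.ndimage import uniform_filter

      def find_hot_spots(recon, planned, voxel_mm=4.0, cube_cm3=4.0,
                         rel_excess=0.20, abs_excess_cgy=50.0):
          """Boolean mask of hot-spot centres on 3D dose grids (doses in cGy)."""
          edge_mm = (cube_cm3 * 1000.0) ** (1.0 / 3.0)   # cube edge, ~15.9 mm
          size = max(1, int(round(edge_mm / voxel_mm)))  # edge length in voxels
          mean_recon = uniform_filter(recon, size=size)  # local cube averages
          mean_plan = uniform_filter(planned, size=size)
          return ((mean_recon >= mean_plan * (1.0 + rel_excess))
                  & (mean_recon - mean_plan >= abs_excess_cgy))

      recon = np.random.rand(64, 64, 64) * 200.0  # toy reconstructed dose grid
      plan = np.full((64, 64, 64), 100.0)         # toy planned dose grid
      if find_hot_spots(recon, plan).any():
          print("Hot spot detected: halt the linac")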

  16. NASA's Lunar and Planetary Mapping and Modeling Program

    NASA Astrophysics Data System (ADS)

    Law, E.; Day, B. H.; Kim, R. M.; Bui, B.; Malhotra, S.; Chang, G.; Sadaqathullah, S.; Arevalo, E.; Vu, Q. A.

    2016-12-01

    NASA's Lunar and Planetary Mapping and Modeling Program produces a suite of online visualization and analysis tools. Originally designed for mission planning and science, these portals offer great benefits for education and public outreach (EPO), providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's Science EPO Infrastructure, they are available as resources for NASA STEM EPO programs, and to the greater EPO community. As new missions are planned to a variety of planetary bodies, these tools are facilitating the public's understanding of the missions and engaging the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: the Lunar Mapping and Modeling Portal or LMMP (http://lmmp.nasa.gov), Vesta Trek (http://vestatrek.jpl.nasa.gov), and Mars Trek (http://marstrek.jpl.nasa.gov). Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL files that can be directed to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. Along with the web portals, the program supports additional clients, web services, and APIs that facilitate dissemination of planetary data to a range of external applications and venues. NASA challenges and hackathons are also providing members of the software development community opportunities to participate in tool development and leverage data from the portals.
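
    To illustrate the terrain-to-STL idea mentioned above, the sketch below writes a small height grid out as an ASCII STL surface. It is a simplified stand-in for the portals' export feature, with facet normals left as +z placeholders since most slicers recompute them.

      # Illustrative sketch: convert a terrain height grid to an ASCII STL
      # surface (two triangles per grid cell). Not the portals' own exporter.
      import numpy as np

      def heightmap_to_stl(z, path, xy_scale=1.0):
          """Write the upper surface of a height grid as ASCII STL triangles."""
          rows, cols = z.shape
          with open(path, "w") as f:
              f.write("solid terrain\n")
              for i in range(rows - 1):
                  for j in range(cols - 1):
                      a = (j * xy_scale, i * xy_scale, z[i, j])
                      b = ((j + 1) * xy_scale, i * xy_scale, z[i, j + 1])
                      c = (j * xy_scale, (i + 1) * xy_scale, z[i + 1, j])
                      d = ((j + 1) * xy_scale, (i + 1) * xy_scale, z[i + 1, j + 1])
                      for tri in ((a, b, c), (b, d, c)):  # two triangles per cell
                          f.write("  facet normal 0 0 1\n    outer loop\n")
                          for x, y, h in tri:
                              f.write(f"      vertex {x} {y} {h}\n")
                          f.write("    endloop\n  endfacet\n")
              f.write("endsolid terrain\n")

      heightmap_to_stl(np.random.rand(16, 16) * 5.0, "terrain.stl")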

  17. Digital Partnerships for Health: Steps to develop a community-specific health portal aimed at promoting health and well-being

    PubMed Central

    Kukafka, Rita; Khan, Sharib A.; Hutchinson, Carly; McFarlane, Delano J.; Li, Jianhua; Ancker, Jessica S.; Cohall, Alwyn

    2007-01-01

    We describe the steps taken by the Harlem Health Promotion Center to develop a community-specific health web portal aimed at promoting health and well-being in Harlem. Methods and results, beginning with data collection and moving on to elucidating requirements for the web portal, are discussed. Sentiments of distrust in medical institutions and the desire for community-specific content and resources were among the needs emerging from our data analysis. These findings guided our decision to customize social software designed to foster connections, collaboration, flexibility, and interactivity; an "architecture of participation". While we maintain that leveraging social software may indeed be the way to build healthy communities and support learning and engagement in underserved communities, our conclusion calls for careful thinking, testing and evaluation research to establish best-practice models for leveraging these emerging technologies to support health improvements in the community. PMID:18693872

  18. Integrative Analysis of Complex Cancer Genomics and Clinical Profiles Using the cBioPortal

    PubMed Central

    Gao, Jianjiong; Aksoy, Bülent Arman; Dogrusoz, Ugur; Dresdner, Gideon; Gross, Benjamin; Sumer, S. Onur; Sun, Yichao; Jacobsen, Anders; Sinha, Rileen; Larsson, Erik; Cerami, Ethan; Sander, Chris; Schultz, Nikolaus

    2014-01-01

    The cBioPortal for Cancer Genomics (http://cbioportal.org) provides a Web resource for exploring, visualizing, and analyzing multidimensional cancer genomics data. The portal reduces molecular profiling data from cancer tissues and cell lines into readily understandable genetic, epigenetic, gene expression, and proteomic events. The query interface combined with customized data storage enables researchers to interactively explore genetic alterations across samples, genes, and pathways and, when available in the underlying data, to link these to clinical outcomes. The portal provides graphical summaries of gene-level data from multiple platforms, network visualization and analysis, survival analysis, patient-centric queries, and software programmatic access. The intuitive Web interface of the portal makes complex cancer genomics profiles accessible to researchers and clinicians without requiring bioinformatics expertise, thus facilitating biological discoveries. Here, we provide a practical guide to the analysis and visualization features of the cBioPortal for Cancer Genomics. PMID:23550210
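
    As an illustration of programmatic access, the sketch below queries the portal's present-day public REST API for available studies. This endpoint postdates the 2014 paper, which documented earlier web-service interfaces, so treat the path and response fields as assumptions to verify against current cBioPortal API documentation.

      # Hedged sketch: list studies via the present-day cBioPortal REST API.
      # The /api/studies path and response fields are assumptions here; the
      # 2014 paper described earlier web-service interfaces.
      import requests

      resp = requests.get(
          "https://www.cbioportal.org/api/studies",
          headers={"Accept": "application/json"},
          timeout=30,
      )
      resp.raise_for_status()
      for study in resp.json()[:5]:
          print(study.get("studyId"), "-", study.get("name"))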

  19. Tectonic Storytelling with Open Source and Digital Object Identifiers - a case study about Plate Tectonics and the Geopark Bergstraße-Odenwald

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Barmuta, Jan; Klump, Jens; Neumann, Janna; Plank, Margret

    2014-05-01

    The communication of research advances to the general public, for both education and decision making, is an important aspect of scientific work. An even more crucial task is to gain recognition within the scientific community, which is judged by impact factors and citation counts. Recently, the latter concepts have been extended from textual publications to include data and software publications. This paper presents a case study in science communication and data citation. For this, tectonic models, Free and Open Source Software (FOSS), best practices for data citation, and a multimedia online portal for scientific content are combined. This approach creates mutual benefits for the stakeholders: target audiences receive information on the latest research results, while the use of Digital Object Identifiers (DOI) increases the recognition and citation of the underlying scientific data. This creates favourable conditions for every researcher, as DOI names ensure citeability and long-term availability of scientific research. In the developed application, the FOSS tool for tectonic modelling, GPlates, is used to visualise and manipulate plate-tectonic reconstructions and associated data through geological time. These capabilities are augmented by the Science on a Halfsphere (SoaH) project with a robust and intuitive visualisation hardware environment. The tectonic models used for science communication are provided by the AGH University of Science and Technology. They focus on the Silurian to Early Carboniferous evolution of Central Europe (Bohemian Massif) and were interpreted for the area of the Geopark Bergstraße-Odenwald based on the GPlates/SoaH hardware and software stack. As scientific storytelling is volatile by nature, recordings are a natural means of preservation for further use, reference and analysis. For this, the upcoming portal for audiovisual media of the German National Library of Science and Technology (TIB) is expected to become a critical service infrastructure. It allows complex search queries, including metadata such as DOIs and media fragment identifiers (MFI), thereby linking data citation and science communication.

  20. Use of Portal Monitors for Detection of Technogenic Radioactive Sources in Scrap Metal

    NASA Astrophysics Data System (ADS)

    Solovev, D. B.; Merkusheva, A. E.

    2017-11-01

    The article considers how primary radiation control of scrap metal is organized at specialized enterprises engaged in its processing and storage, using radiation portal monitors as the primary technical equipment. The relevance of this topic, the validity of implementing radiation control with radiation portal monitors, and the physical and organizational bases of radiation control are considered in detail. Emphasis is placed on the considerable increase in the number of technogenic radioactive sources detected in scrap metal, which results in radioactive metal structures entering service as various building products. One reason for this increase in technogenic radioactive sources arriving for processing with scrap metal is the absence of recommendations on operating radiation portal monitors. The practical part of the article offers recommendations on tuning the operating modes of radiation portal monitors to account for weather effects, thus considerably increasing the detection rate of technogenic radioactive sources.

  1. The Use of Gamma-Ray Imaging to Improve Portal Monitor Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziock, Klaus-Peter; Collins, Jeff; Fabris, Lorenzo

    2008-01-01

    We have constructed a prototype, rapid-deployment portal monitor that uses visible-light and gamma-ray imaging to allow simultaneous monitoring of multiple lanes of traffic from the side of a roadway. Our Roadside Tracker uses automated target acquisition and tracking (TAT) software to identify and track vehicles in visible-light images. The field of view of the visible camera overlaps with and is calibrated to that of a one-dimensional gamma-ray imager. The TAT code passes information on when vehicles enter and exit the system field of view and when they cross gamma-ray pixel boundaries. Based on this information, the gamma-ray imager "harvests" the gamma-ray data specific to each vehicle, integrating its radiation signature for the entire time that it is in the field of view. In this fashion we are able to generate vehicle-specific radiation signatures and avoid the source confusion problems that plague nonimaging approaches to the same problem.

  2. Framework Development Supporting the Safety Portal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng

    2015-07-01

    In a collaborative scientific research arena it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and are able to accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from anywhere, regardless of computing platform, given that the platform has available Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report discusses the current development of a cloud-based web portal for PRA tools.

  3. Inside a VAMDC data node—putting standards into practical software

    NASA Astrophysics Data System (ADS)

    Regandell, Samuel; Marquart, Thomas; Piskunov, Nikolai

    2018-03-01

    Access to molecular and atomic data is critical for many forms of remote sensing analysis across different fields. Many atomic and molecular databases are, however, highly specialised for their intended application, complicating the querying and combination of data between sources. The Virtual Atomic and Molecular Data Centre, VAMDC, is an electronic infrastructure that allows each database to register as a 'node'. Through services such as VAMDC's portal website, users can then access and query all nodes in a homogenised way. Today all major atomic and molecular databases are attached to VAMDC. This article describes the software tools we developed to help data providers create and manage a VAMDC node. It gives an overview of the VAMDC infrastructure and of the various standards it uses. The article then discusses the development choices made and how the standards are implemented in practice. It concludes with a full example of implementing a VAMDC node using a real-life case, as well as future plans for the node software.
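
    As a hedged sketch of how a client queries a registered node, the example below issues a VAMDC-TAP-style synchronous request. The node URL is a placeholder, and the parameter spellings (LANG=VSS2, FORMAT=XSAMS) and the VSS2 clause are best-effort assumptions that should be checked against the current VAMDC standards documents.

      # Hedged sketch of a VAMDC-TAP synchronous query; node URL, parameter
      # names, and the VSS2 clause are assumptions for illustration.
      import requests

      NODE = "https://vamdc-node.example.org/tap/sync"  # placeholder node endpoint

      resp = requests.get(
          NODE,
          params={
              "REQUEST": "doQuery",
              "LANG": "VSS2",
              "FORMAT": "XSAMS",
              "QUERY": ("SELECT * WHERE RadTransWavelength >= 5000.0 "
                        "AND RadTransWavelength <= 5010.0"),  # Angstrom (assumed)
          },
          timeout=60,
      )
      resp.raise_for_status()
      print(resp.text[:400])  # beginning of the returned XSAMS document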

  4. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    PubMed

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline, from image segmentation, three-dimensional (3D) solid modeling, and mesh generation to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve the user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  5. Septic thrombophlebitis of the portal venous system: clinical and imaging findings in thirty-three patients.

    PubMed

    Ames, Jennifer T; Federle, Michael P

    2011-07-01

    Our purpose was to review the clinical and imaging findings in a series of patients with septic thrombophlebitis of the portal venous system in order to define criteria that might allow more confident and timely diagnosis. This is a retrospective case series. The clinical and imaging features were analyzed in 33 subjects with septic thrombophlebitis of the portal venous system. All 33 patients with septic thrombophlebitis of the portal venous system had pre-disposing infectious or inflammatory processes. Contrast-enhanced CT studies of patients with septic thrombophlebitis typically demonstrate an infectious gastrointestinal source (82%), thrombosis (70%), and/or gas (21%) of the portal system or its branches, and intrahepatic abnormalities such as a transient hepatic attenuation difference (THAD) (42%) or abscess (61%). Septic thrombophlebitis of the portal system is often associated with an infectious source in the gastrointestinal tract and sepsis. Contrast-enhanced CT demonstrates an infectious gastrointestinal source, thrombosis or gas within the portal system or its branches, and intrahepatic abnormalities such as abscess in most cases. We report a THAD in several of our patients, an observation that was not made in prior reports of septic thrombophlebitis.

  6. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    PubMed

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
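
    A minimal sketch of calling the term-search Web service follows. The data.bioontology.org host and apikey parameter belong to the current BioPortal REST API and are assumptions here, since the service URLs documented in the 2011 article have since been superseded.

      # Hedged sketch: BioPortal term search. Host, parameters, and response
      # fields follow the current REST API as an assumption; register at
      # bioportal.bioontology.org for a free API key.
      import requests

      API_KEY = "YOUR-BIOPORTAL-API-KEY"  # placeholder

      resp = requests.get(
          "https://data.bioontology.org/search",
          params={"q": "melanoma", "apikey": API_KEY},
          timeout=30,
      )
      resp.raise_for_status()
      for result in resp.json().get("collection", [])[:5]:
          print(result.get("prefLabel"), "-", result.get("@id"))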

  7. Environmental Information Management For Data Discovery and Access System

    NASA Astrophysics Data System (ADS)

    Giriprakash, P.

    2011-01-01

    Mercury is a federated metadata harvesting, search and retrieval tool based on both open source software and software developed at Oak Ridge National Laboratory. It was originally developed for NASA, and the Mercury development consortium now includes funding from NASA, USGS, and DOE. A major new version of Mercury was developed during 2007 and released in early 2008. This new version provides orders of magnitude improvements in search speed, support for additional metadata formats, integration with Google Maps for spatial queries, support for RSS delivery of search results, and ready customization to meet the needs of the multiple projects which use Mercury. For the end users, Mercury provides a single portal to very quickly search for data and information contained in disparate data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow the users to perform simple, fielded, spatial and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data.

  8. U.S. Geological Survey spatial data access

    USGS Publications Warehouse

    Faundeen, John L.; Kanengieter, Ronald L.; Buswell, Michael D.

    2002-01-01

    The U.S. Geological Survey (USGS) has reviewed its progress in improving access to its spatial data holdings over the Web. The USGS EROS Data Center has created three major Web-based interfaces to deliver spatial data to the general public; they are Earth Explorer, the Seamless Data Distribution System (SDDS), and the USGS Web Mapping Portal. Lessons were learned in developing these systems, and various resources were needed for their implementation. The USGS serves as a fact-finding agency in the U.S. Government that collects, monitors, analyzes, and provides scientific information about natural resource conditions and issues. To carry out its mission, the USGS has created and managed spatial data since its inception. Originally relying on paper maps, the USGS now uses advanced technology to produce digital representations of the Earth's features. The spatial products of the USGS include both source and derivative data. Derivative datasets include Digital Orthophoto Quadrangles (DOQ), Digital Elevation Models, Digital Line Graphs, land-cover Digital Raster Graphics, and the seamless National Elevation Dataset. These products, created with automated processes, use aerial photographs, satellite images, or other cartographic information such as scanned paper maps as source data. With Earth Explorer, users can search multiple inventories through metadata queries and can browse satellite and DOQ imagery. They can place orders and make payment through secure credit card transactions. Some USGS spatial data can be accessed with SDDS. The SDDS uses an ArcIMS map service interface to identify the user's areas of interest and determine the output format; it allows the user to either download the actual spatial data directly for small areas or place orders for larger areas to be delivered on media. The USGS Web Mapping Portal provides views of national and international datasets through an ArcIMS map service interface. In addition, the map portal posts news about new map services available from the USGS, many simultaneously published on the Environmental Systems Research Institute Geography Network. These three information systems use new software tools and expanded hardware to meet the requirements of the users. The systems are designed to handle the required workload and are relatively easy to enhance and maintain. The software tools give users a high level of functionality and help the systems conform to industry standards. The hardware and software architecture is designed to handle the large amounts of spatial data and Internet traffic required by the information systems. Last, customer support was needed to answer questions, monitor e-mail, and report customer problems.

  9. A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.

    2012-04-01

    Cloud computing is establishing itself worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We replaced the demanding user-side hardware and software requirements with remote access to high-performance grid-computing facilities. As a result, data processing can be done quasi in real-time, controlled ubiquitously over the Internet through a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface (CRS) stack, a data-driven imaging method that requires no user interaction at run time, such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking can benefit largely from the hardware parallelism provided by the cloud deployment. The resulting outputs (post-stack section, coherence, and NMO-velocity panels) are used to generate a smooth migration-velocity model. Residual static corrections are calculated as a by-product of the stack and can be applied iteratively. As a final step, a time-migrated subsurface image is obtained by a parallelized Kirchhoff time migration scheme. Processing can be done step-by-step or using a graphical workflow editor that can launch a series of pipelined tasks. The status of submitted jobs is monitored by a dedicated service. All results are stored in project directories, where they can be downloaded or viewed directly in the browser. Currently, the portal has access to three research clusters with a total of 70 nodes of 4 cores each. They are shared with four other cloud-computing applications bundled within the GRIDA3 project. To demonstrate the functionality of our "seismic cloud lab", we will present results obtained for three different types of data, all taken from hydrogeophysical studies: (1) a seismic reflection data set, made of compressional waves from explosive sources, recorded in Muravera, Sardinia; (2) a shear-wave data set from Sardinia; (3) a multi-offset Ground-Penetrating-Radar data set from Larreule, France. The presented work was funded by the government of the Autonomous Region of Sardinia and by the Italian Ministry of Research and Education.

  10. Neutron detection with a NaI spectrometer using high-energy photons

    NASA Astrophysics Data System (ADS)

    Holm, Philip; Peräjärvi, Kari; Sihvonen, Ari-Pekka; Siiskonen, Teemu; Toivonen, Harri

    2013-01-01

    Neutrons can be indirectly detected by high-energy photons. The performance of a 4″×4″×16″ NaI portal monitor was compared to a 3He-based portal monitor with a comparable cross-section of the active volume. Measurements were performed with bare and shielded 252Cf and AmBe sources. With an optimum converter and moderator structure for the NaI detector, the detection efficiencies and minimum detectable activities of the portal monitors were similar. The NaI portal monitor preserved its detection efficiency much better with shielded sources, making the method very interesting for security applications. For heavily shielded sources, the NaI detector was 2-3 times more sensitive than the 3He-based detector.

  11. Wired Widgets: Agile Visualization for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Gerschefske, K.; Witmer, J.

    2012-09-01

    Continued advancement in sensors and analysis techniques has resulted in a wealth of Space Situational Awareness (SSA) data, made available via tools and Service Oriented Architectures (SOA) such as those in the Joint Space Operations Center Mission Systems (JMS) environment. Current visualization software cannot quickly adapt to rapidly changing missions and data, preventing operators and analysts from performing their jobs effectively. The value of this wealth of SSA data is not fully realized, as the operators' existing software is not built with the flexibility to consume new or changing sources of data or to rapidly customize their visualization as the mission evolves. While tools like the JMS user-defined operational picture (UDOP) have begun to fill this gap, this paper presents a further evolution, leveraging Web 2.0 technologies for maximum agility. We demonstrate a flexible web widget framework with inter-widget data sharing, publish-subscribe eventing, and an API providing the basis for consumption of new data sources and adaptable visualization. Wired Widgets offers cross-portal widgets along with a widget communication framework and development toolkit for rapid new widget development, giving operators the ability to answer relevant questions as the mission evolves. Wired Widgets has been applied in a number of dynamic mission domains including disaster response, combat operations, and noncombatant evacuation scenarios. The variety of applications demonstrates that Wired Widgets provides a flexible, data-driven solution for visualization in changing environments. In this paper, we show how, deployed in the Ozone Widget Framework portal environment, Wired Widgets can provide agile, web-based visualization to support the SSA mission. Furthermore, we discuss how the tenets of agile visualization can generally be applied to the SSA problem space to provide operators flexibility, potentially informing future acquisition and system development.
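
    To make the publish-subscribe eventing concrete, the toy event bus below mimics the pattern in Python. The real Wired Widgets framework is JavaScript running in the Ozone portal, and none of its actual API names are reproduced here; this is a generic sketch of the technique only.

      # Illustrative toy publish-subscribe bus of the kind that underlies
      # inter-widget communication; not the Wired Widgets API itself.
      from collections import defaultdict

      class EventBus:
          def __init__(self):
              self._subscribers = defaultdict(list)

          def subscribe(self, channel, callback):
              self._subscribers[channel].append(callback)

          def publish(self, channel, payload):
              for callback in self._subscribers[channel]:
                  callback(payload)

      bus = EventBus()
      # A map widget announces the selected satellite; a detail widget reacts.
      bus.subscribe("track.selected", lambda sat: print("Showing details for", sat))
      bus.publish("track.selected", "NORAD 25544")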

  12. The design and fabrication of two portal vein flow phantoms by different methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yunker, Bryan E., E-mail: bryan.yunker@ucdenver.edu; Lanning, Craig J.; Shandas, Robin

    2014-02-15

    Purpose: This study outlines the design and fabrication techniques for two portal vein flow phantoms. Methods: A materials study was performed as a precursor to this phantom fabrication effort, and the desired material properties are restated for continuity. A three-dimensional portal vein pattern was created from the Visible Human database. The portal vein pattern was used to fabricate two flow phantoms by different methods, with identical interior surface geometry, using computer-aided design software tools and rapid prototyping techniques. One portal flow phantom was fabricated within a solid block of clear silicone for use on a table with ultrasound or within medical imaging systems such as MRI, CT, PET, or SPECT. The other portal flow phantom was fabricated as a thin-walled tubular latex structure for use in water tanks with ultrasound imaging. Both phantoms were evaluated for usability and durability. Results: Both phantoms were fabricated successfully and passed durability criteria for flow testing in the next project phase. Conclusions: The fabrication methods and materials employed for the study yielded durable portal vein phantoms.

  13. Software and hardware infrastructure for research in electrophysiology

    PubMed Central

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Řondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Štěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time-consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of a software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the overall architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storage and a hardware stimulator communicating with an EEG amplifier and recording software. PMID:24639646

  15. Tethys: A Platform for Water Resources Modeling and Decision Support Apps

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Swain, N. R.

    2015-12-01

    The interactive nature of web applications or "web apps" makes them an excellent medium for conveying complex scientific concepts to lay audiences and for creating decision support tools that harness cutting-edge modeling techniques. However, the technical expertise required to develop web apps represents a barrier for would-be developers. This barrier can be characterized by the following hurdles that developers must overcome: (1) identify, select, and install software that provides the spatial and computational capabilities commonly required for water resources modeling; (2) orchestrate the use of multiple free and open source software (FOSS) projects and navigate their differing application programming interfaces; (3) learn the multi-language programming skills required for modern web development; and (4) develop a web-secure and fully featured web portal to host the app. Tethys Platform has been developed to lower this technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. It includes (1) a suite of FOSS that addresses the unique data and computational needs common to water resources web app development, (2) a Python software development kit that streamlines development, and (3) a customizable web portal that is used to deploy the completed web apps. Tethys synthesizes several software projects including PostGIS, 52°North WPS, GeoServer, Google Maps™, OpenLayers, and Highcharts. It has been used to develop a broad array of web apps for water resources modeling and decision support for several projects including CI-WATER, HydroShare, and the National Flood Interoperability Experiment. The presentation will include live demos of some of the apps that have been developed using Tethys to demonstrate its capabilities.
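
    As a rough illustration of the development model the Python SDK enables, the sketch below shows the kind of app class a Tethys project scaffolds and registers with the portal. The class and attribute names follow Tethys documentation as best we recall; treat them as assumptions to verify against the current SDK rather than a definitive example.

    ```python
    # Hypothetical Tethys app class; names per Tethys docs as recalled --
    # verify the import path and attributes against the current SDK.
    from tethys_sdk.base import TethysAppBase

    class FloodViewer(TethysAppBase):
        """A made-up water-resources web app registered with the portal."""
        name = 'Flood Viewer'
        package = 'flood_viewer'   # must match the app's Python package name
        index = 'home'             # controller serving the landing page
        icon = 'flood_viewer/images/icon.png'
        root_url = 'flood-viewer'  # URL prefix under the portal
        color = '#2c3e50'
    ```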

  16. Dynamic mobility applications open source application development portal : Task 4 : system requirements specifications : final report.

    DOT National Transportation Integrated Search

    2016-10-12

    This document describes the System Requirements Specifications (SyRS) of the Dynamic Mobility Applications (DMA) Open Source Application Development Portal (OSADP) system in detail according to IEEE Std. 1233-1998. The requirement statements discuss...

  17. Policy analysis and recommendations for the open source application development portal (OSADP).

    DOT National Transportation Integrated Search

    2012-06-01

    This white paper addresses the policy and institutional issues that are associated with the development of an open source applications development portal (OSADP), part of a larger research effort being conducted under the ITS Programs Dynamic Mobi...

  18. NASA Software Lets You Explore Mars, the Asteroid Vesta and the Moon

    NASA Image and Video Library

    2016-10-06

    NASA wants you to use your web browser to explore Mars, the Moon and the asteroid Vesta! These three portals are part of NASA's suite of planetary mapping and modeling web portals. They make it easy for mission planners, scientists, students and the public to visualize details on the surfaces of Mars, the Moon and Vesta, as seen by a variety of instruments aboard a number of spacecraft.

  19. The Electronic View Box: a software tool for radiation therapy treatment verification.

    PubMed

    Bosch, W R; Low, D A; Gerber, R L; Michalski, J M; Graham, M V; Perez, C A; Harms, W B; Purdy, J A

    1995-01-01

    We have developed a software tool for interactively verifying treatment plan implementation. The Electronic View Box (EVB) tool copies the paradigm of current practice but does so electronically. A portal image (online portal image or digitized port film) is displayed side by side with a prescription image (digitized simulator film or digitally reconstructed radiograph). The user can measure distances between features in prescription and portal images and "write" on the display, either to approve the image or to indicate required corrective actions. The EVB tool also provides several features not available in conventional verification practice using a light box. The EVB tool has been written in ANSI C using the X Window System. The tool makes use of the Virtual Machine Platform and Foundation Library specifications of the NCI-sponsored Radiation Therapy Planning Tools Collaborative Working Group, for portability into any treatment planning system that conforms to these specifications. The present EVB tool is based on an earlier Verification Image Review tool, but with a substantial redesign of the user interface. A graphical user interface prototyping system was used in iteratively refining the tool layout, allowing rapid modification of the interface in response to user comments. Features of the EVB tool include: 1) hierarchical selection of digital portal images based on physician name, patient name, and field identifier; 2) side-by-side presentation of prescription and portal images at equal magnification and orientation, with independent grayscale controls; 3) a "trace" facility for outlining anatomical structures; 4) a "ruler" facility for measuring distances; 5) zoomed display of corresponding regions in both images; 6) image contrast enhancement; and 7) communication of portal image evaluation results (approval, block modification, repeat image acquisition, etc.). The EVB tool facilitates the rapid comparison of prescription and portal images and permits electronic communication of corrections in port shape and positioning.

  20. Organic Scintillation Detectors for Spectroscopic Radiation Portal Monitors

    NASA Astrophysics Data System (ADS)

    Paff, Marc Gerrit

    Thousands of radiation portal monitors have been deployed worldwide to detect and deter the smuggling of nuclear and radiological materials that could be used in nefarious acts. Radiation portal monitors are often installed at bottlenecks where large numbers of people or goods must pass. Examples of use include scanning cargo containers at shipping ports, vehicles at border crossings, and people at high-profile functions and events. Traditional radiation portal monitors contain separate detectors for passively measuring neutron and gamma-ray count rates. 3He tubes embedded in polyethylene and slabs of plastic scintillator are the most common detector materials used in radiation portal monitors. The radiation portal monitor alarm mechanism relies on measuring radiation count rates above user-defined alarm thresholds. These alarm thresholds are set above natural background count rates; setting them requires balancing the minimization of false alarms caused by natural background against sensitivity to weakly emitting threat sources. Current radiation portal monitor designs suffer from frequent nuisance alarms. These nuisance alarms are most frequently caused by shipments of cargo containing large quantities of naturally occurring radioactive material, such as kitty litter, as well as by people who have recently undergone a nuclear medicine procedure, particularly 99mTc treatments. Current radiation portal monitors typically lack spectroscopic capabilities, so nuisance alarms must be screened out in time-intensive secondary inspections with handheld radiation detectors. Radiation portal monitors using organic liquid scintillation detectors were designed, built, and tested. A number of algorithms were developed to perform on-the-fly radionuclide identification of single and combination radiation sources moving past the portal monitor at speeds up to 2.2 m/s. The portal monitor designs were tested extensively with a variety of shielded and unshielded radiation sources, including special nuclear material, at the European Commission Joint Research Centre in Ispra, Italy. Common medical isotopes were measured at the C.S. Mott Children's Hospital and added to the radionuclide identification algorithms.

  1. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry - the LMMP Portal - by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT) - an open source data catalog, archive, file management, and data grid framework; OpenSSO - an open source access management and federation platform; Solr - an open source enterprise search platform; Redmine - an open source project collaboration and management framework; GDAL - an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.

  2. Organic liquid scintillation detectors for on-the-fly neutron/gamma alarming and radionuclide identification in a pedestrian radiation portal monitor

    NASA Astrophysics Data System (ADS)

    Paff, Marc Gerrit; Ruch, Marc L.; Poitrasson-Riviere, Alexis; Sagadevan, Athena; Clarke, Shaun D.; Pozzi, Sara

    2015-07-01

    We present new experimental results from a radiation portal monitor based on the use of organic liquid scintillators. The system was tested as part of a 3He-free radiation portal monitor testing campaign at the European Commission's Joint Research Centre in Ispra, Italy, in February 2014. The radiation portal monitor was subjected to a wide range of test conditions described in ANSI N42.35, including a variety of gamma-ray sources and a 20,000 n/s 252Cf source. A false-alarm test checked whether radiation portal monitors ever alarmed in the presence of only natural background; the University of Michigan Detection for Nuclear Nonproliferation Group's system triggered zero false alarms in 2739 trials. It consistently alarmed on a variety of gamma-ray sources travelling at 1.2 m/s at a 70 cm source-to-detector distance. The neutron source was detected at speeds up to 3 m/s and in configurations with up to 8 cm of high-density polyethylene shielding. The success of on-the-fly radionuclide identification varied with the gamma-ray source measured as well as with which of two radionuclide identification methods was used. Both methods used a least-squares comparison of the measured pulse height distributions to library spectra to pick the best match; the methods differed in how the pulse height distributions were modified prior to the least-squares comparison. Correct identification rates were as high as 100% for highly enriched uranium, but as low as 50% for 241Am. Both radionuclide identification algorithms produced mixed results, but the concept of using liquid scintillation detectors for gamma-ray and neutron alarming in radiation portal monitors was validated.
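
    The least-squares matching step described in this abstract can be illustrated with a short sketch: normalize the measured pulse height distribution and each library spectrum to unit area, then pick the library entry minimizing the sum of squared residuals. The three-bin spectra below are made up; the actual preprocessing applied before the comparison is what distinguished the two methods.

    ```python
    import numpy as np

    def identify_radionuclide(measured, library):
        """Pick the library spectrum that best matches a measured pulse
        height distribution by least squares (illustrative sketch)."""
        m = measured / measured.sum()  # shape-only comparison, independent of activity
        scores = {name: np.sum((m - s / s.sum()) ** 2) for name, s in library.items()}
        return min(scores, key=scores.get), scores

    # Made-up 3-bin library spectra for demonstration only:
    library = {"137Cs": np.array([5.0, 3.0, 1.0]),
               "60Co":  np.array([2.0, 4.0, 3.0])}
    best, scores = identify_radionuclide(np.array([50.0, 29.0, 11.0]), library)
    print(best)  # -> "137Cs" for this made-up measurement
    ```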

  3. EUDAT B2FIND : A Cross-Discipline Metadata Service and Discovery Portal

    NASA Astrophysics Data System (ADS)

    Widmann, Heinrich; Thiemann, Hannes

    2016-04-01

    The European Data Infrastructure (EUDAT) project aims at a pan-European environment that supports a variety of multiple research communities and individuals in managing the rising tide of scientific data through advanced data management technologies. This led to the establishment of the community-driven Collaborative Data Infrastructure, which implements common data services and storage resources to tackle the basic requirements and the specific challenges of international and interdisciplinary research data management. The metadata service B2FIND plays a central role in this context by providing a simple and user-friendly discovery portal to find research data collections stored in EUDAT data centers or in other repositories. For this we store the diverse metadata collected from heterogeneous sources in a comprehensive joint metadata catalogue and make them searchable in an open data portal. The implemented metadata ingestion workflow consists of three steps. First the metadata records - provided either by various research communities or via other EUDAT services - are harvested. Afterwards the raw metadata records are converted and mapped to unified key-value dictionaries as specified by the B2FIND schema. The semantic mapping of the non-uniform, community-specific metadata to homogeneous, structured datasets is the most subtle and challenging task. To assure and improve the quality of the metadata, this mapping process is accompanied by:

    • iterative and intense exchange with the community representatives,
    • usage of controlled vocabularies and community-specific ontologies, and
    • formal and semantic validation.

    Finally the mapped and checked records are uploaded as datasets to the catalogue, which is based on the open source data portal software CKAN. CKAN provides a rich RESTful JSON API and uses SOLR for dataset indexing, enabling users to query and search the catalogue. The homogenization of the community-specific data models and vocabularies enables not only the uniform presentation of these datasets as tables of field-value pairs but also faceted, spatial and temporal search in the B2FIND metadata portal. Furthermore the service provides transparent access to the scientific data objects through the references and identifiers given in the metadata. B2FIND offers support for new communities interested in publishing their data within EUDAT. We present here the functionality and the features of the B2FIND service and give an outlook on further developments, such as interfaces to external libraries and the use of Linked Data.
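
    The final upload step - pushing a mapped record into the CKAN-based catalogue - can be sketched against CKAN's standard action API (package_create). The portal URL, API key, and record fields below are placeholders, not B2FIND's actual configuration.

    ```python
    import requests

    CKAN_URL = "https://b2find.example.org"  # placeholder portal URL
    API_KEY = "secret-api-key"               # placeholder credential

    def upload_record(record):
        """Create one dataset in a CKAN catalogue via the action API.
        `record` is a mapped key-value dictionary of the kind the
        B2FIND-style semantic mapping step produces (fields illustrative)."""
        payload = {
            "name": record["identifier"],      # unique, URL-safe dataset name
            "title": record["title"],
            "notes": record.get("description", ""),
            "owner_org": record["community"],  # CKAN organization of the community
            # Non-core fields travel as CKAN 'extras' key-value pairs.
            "extras": [{"key": k, "value": v} for k, v in record["extra"].items()],
        }
        r = requests.post(f"{CKAN_URL}/api/3/action/package_create",
                          json=payload, headers={"Authorization": API_KEY})
        r.raise_for_status()
        return r.json()["result"]
    ```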

  4. Solar System Treks: Interactive Web Portals for STEM, Exploration and Beyond

    NASA Astrophysics Data System (ADS)

    Law, E.; Day, B. H.; Viotti, M.

    2017-12-01

    NASA's Solar System Treks project produces a suite of online visualization and analysis tools for lunar and planetary mapping and modeling. These portals offer great benefits for education and public outreach, providing access to data from a wide range of instruments aboard a variety of past and current missions. As a component of NASA's STEM Activation Infrastructure, they are available as resources for NASA STEM programs and for the greater STEM community. As new missions are planned to a variety of planetary bodies, these tools facilitate public understanding of the missions and engage the public in the process of identifying and selecting where these missions will land. There are currently three web portals in the program: Moon Trek (https://moontrek.jpl.nasa.gov), Mars Trek (https://marstrek.jpl.nasa.gov), and Vesta Trek (https://vestatrek.jpl.nasa.gov). A new release of Mars Trek includes new tools and data products focusing on human landing site selection. Backed by evidence-based cognitive and computer science findings, an additional version is available for educational and public audiences in support of learning along novice-to-expert pathways, enabling authentic, real-world interaction with planetary data. Portals for additional planetary bodies are planned. As web-based toolsets, the portals do not require users to purchase or install any software beyond current web browsers. The portals provide analysis tools for measurement and study of planetary terrain. They allow data to be layered and adjusted to optimize visualization. Visualizations are easily stored and shared. The portals provide 3D visualization and give users the ability to mark terrain for generation of STL/OBJ files that can be sent to 3D printers. Such 3D prints are valuable tools in museums, public exhibits, and classrooms - especially for the visually impaired. The program supports additional clients, web services, and APIs facilitating dissemination of planetary data to external applications and venues. NASA challenges and hackathons also provide members of the software development community opportunities to participate in tool development and leverage data from the portals.

  5. Integrating Thematic Web Portal Capabilities into the NASA Earthdata Web Infrastructure

    NASA Technical Reports Server (NTRS)

    Wong, Minnie; Baynes, Kathleen E.; Huang, Thomas; McLaughlin, Brett

    2015-01-01

    This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data, and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos, and animations; an interactive tool to view and access sea level change data; and a dashboard showing sea level change indicators.

  6. Audiovisual heritage preservation in Earth and Space Science Informatics: Videos from Free and Open Source Software for Geospatial (FOSS4G) conferences in the TIB|AV-Portal.

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Marín Arraiza, Paloma; Plank, Margret

    2016-04-01

    The influence of Free and Open Source Software (FOSS) projects on Earth and Space Science Informatics (ESSI) continues to grow, particularly in the emerging context of Data Science and Open Science. The scientific significance and heritage of FOSS projects is covered only to a limited extent by traditional scientific journal articles: audiovisual conference recordings contain significant information for analysis, reference and citation. In the context of data-driven research, this audiovisual content needs to be accessible through effective search capabilities, enabling the content to be searched in depth and retrieved. This also ensures that the content producers receive credit for their efforts within their respective communities. For Geoinformatics and ESSI, one distinguished driver is the OSGeo Foundation (OSGeo), founded in 2006 to support and promote the interdisciplinary collaborative development of open geospatial technologies and data. Its organisational structure is based on software projects that have successfully passed the OSGeo incubation process, proving their compliance with FOSS licence models. This quality assurance is crucial for transparent and unhindered application in (Open) Science. The main communication channels within and between the OSGeo-hosted community projects for face-to-face meetings are conferences on national, regional and global scales. Video recordings have complemented the scientific proceedings since 2006. During the last decade, the growing body of OSGeo videos has been negatively affected by content loss, obsolescence of video technology and dependence on commercial video portals. Even worse, distributed storage and a lack of metadata prevent concise and efficient access to the content. This limits the retrospective analysis of video content from past conferences. It also indicates a need for reliable, standardized, comparable audiovisual repositories for the future, as the number of OSGeo projects continues to grow - and so does the number of topics to be addressed at conferences. Up to now, commercial Web 2.0 platforms like YouTube and Vimeo were used. However, these platforms lack capabilities for long-term archiving and scientific citation, such as persistent identifiers that permit the citation of specific intervals of the overall content. To address these issues, the scientific library community has started to implement improved multimedia archiving and retrieval services for scientific audiovisual content which fulfil these requirements. Using the reference case of the OSGeo conference video recordings, this paper gives an overview of the new and growing collection activities of the German National Library of Science and Technology (TIB) for audiovisual content in Geoinformatics/ESSI in the TIB|AV-Portal. Following a successful start in 2014 and positive response from the OSGeo community, the TIB acquisition strategy for OSGeo video material was extended to include German, European, North American and global conference content. The collection grows steadily through new conference content and through harvesting of past conference videos from commercial Web 2.0 platforms like YouTube and Vimeo. This positions the TIB|AV-Portal as a reliable and concise long-term resource for innovation mining, education and scholarly research within the ESSI context, both in academia and in industry.

  7. Eurogrid: a new glideinWMS based portal for CDF data analysis

    NASA Astrophysics Data System (ADS)

    Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.

    2012-12-01

    The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal that is easy to integrate into the existing CDF computing model, completely transparent to the user, and requiring a minimum amount of maintenance support from the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. In official use since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support and development.

  8. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web-service-based queries. Together, these servers and the central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system, with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series, and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit, which provides it with map-based data interaction and visualization and a plug-in interface that can be used by third-party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third-party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.

  9. An overview of the challenges in designing, integrating, and delivering BARD: a public chemical biology resource and query portal across multiple organizations, locations, and disciplines

    PubMed Central

    de Souza, Andrea; Bittker, Joshua; Lahr, David; Brudz, Steve; Chatwin, Simon; Oprea, Tudor I.; Waller, Anna; Yang, Jeremy; Southall, Noel; Guha, Rajarshi; Schurer, Stephan; Vempati, Uma; Southern, Mark R.; Dawson, Eric S.; Clemons, Paul A.; Chung, Thomas D.Y.

    2015-01-01

    Recent industry-academic partnerships involve collaboration across disciplines, locations, and organizations using publicly funded “open-access” and proprietary commercial data sources. These require effective integration of chemical and biological information from diverse data sources, presenting key informatics, personnel, and organizational challenges. BARD (BioAssay Research Database) was conceived to address these challenges and to serve as a community-wide resource and intuitive web portal for public-sector chemical biology data. Its initial focus is to enable scientists to more effectively use the NIH Roadmap Molecular Libraries Program (MLP) data generated from 3-year pilot and 6-year production phases of the Molecular Libraries Probe Production Centers Network (MLPCN), currently in its final year. BARD evolves the current data standards through structured assay and result annotations that leverage the BioAssay Ontology (BAO) and other industry-standard ontologies, and a core hierarchy of assay definition terms and data standards defined specifically for small-molecule assay data. We have initially focused on migrating the highest-value MLP data into BARD and bringing it up to this new standard. We review the technical and organizational challenges overcome by the inter-disciplinary BARD team, veterans of public and private sector data-integration projects, collaborating to describe (functional specifications), design (technical specifications), and implement this next-generation software solution. PMID:24441647

  10. Developing a Taxonomy of Characteristics and Features of Collaboration Tools for Teams in Distributed Environments

    DTIC Science & Technology

    2007-09-01

    [Table fragments, garbled in extraction: a taxonomy table of collaboration-tool features listing software name, company, URL, and feature descriptions for products including BlackBerry (Research In Motion, http://www.blackberry.com/products/blackberry/index.shtml), Bricolage (http://www.bricolage.cc; customizable workflow control over editorial content), and Nuxeo Collaborative Portal (workspaces with add/edit/delete of content through a web interface).]

  11. The GPlates Portal: Cloud-based interactive 3D and 4D visualization of global geological and geophysical data and models in a browser

    NASA Astrophysics Data System (ADS)

    Müller, Dietmar; Qin, Xiaodong; Sandwell, David; Dutkiewicz, Adriana; Williams, Simon; Flament, Nicolas; Maus, Stefan; Seton, Maria

    2017-04-01

    The pace of scientific discovery is being transformed by the availability of 'big data' and open access, open source software tools. These innovations open up new avenues for how scientists communicate and share data and ideas with each other, and with the general public. Here, we describe our efforts to bring to life our studies of the Earth system, both at present day and through deep geological time. The GPlates Portal (portal.gplates.org) is a gateway to a series of virtual globes based on the Cesium Javascript library. The portal allows fast interactive visualization of global geophysical and geological data sets, draped over digital terrain models. The globes use WebGL for hardware-accelerated graphics and are cross-platform and cross-browser compatible, with complete camera control. The globes include a visualization of a high-resolution global digital elevation model and the vertical gradient of the global gravity field, highlighting small-scale seafloor fabric such as abyssal hills, fracture zones and seamounts in unprecedented detail. The portal also features globes portraying seafloor geology and a global data set of marine magnetic anomaly identifications. The portal is specifically designed to visualize models of the Earth through geological time. These space-time globes include tectonic reconstructions of the Earth's gravity and magnetic fields, and several models of long-wavelength surface dynamic topography through time, including the interactive plotting of vertical motion histories at selected locations. The portal has been visited over half a million times since its inception in October 2015, as tracked by Google Analytics, and the globes have been featured in numerous media articles around the world. This demonstrates the high demand for fast visualization of global spatial big data, both for the present day and through geological time. The globes put the on-the-fly visualization of massive data sets at the fingertips of end users to stimulate teaching and learning and novel avenues of inquiry. This technology offers many future opportunities for providing additional functionality, especially on-the-fly big data analytics. Müller, R.D., Qin, X., Sandwell, D.T., Dutkiewicz, A., Williams, S.E., Flament, N., Maus, S. and Seton, M., 2016, The GPlates Portal: Cloud-based interactive 3D visualization of global geophysical and geological data in a web browser, PLoS ONE 11(3): e0150883. doi:10.1371/journal.pone.0150883

  12. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) with more than an order of magnitude higher resolution than those previously available, LIDAR data allow Earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for them to be scientifically useful. Although expensive commercial LIDAR software applications are available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing these datasets. By using remote data storage and high-performance compute resources, the OpenTopography Portal simplifies data access and standard LIDAR processing tasks for the Earth science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools that enable users to generate custom digital elevation models to best fit their science applications. The cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful cyberinfrastructure resources instead of in their own labs provides users a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the variations in algorithm parameters, since all of the processing is done remotely and numerous jobs can be submitted in sequence. The web-based software also eliminates the need for users to deal with the hassles and costs associated with software installation and licensing, while providing adequate disk space for storage and personal job archival capability. Although currently limited to data access and DEM generation tasks, the OpenTopography system is modular in design and can be modified to accommodate new processing tools as they become available. We are currently exploring implementation of higher-level DEM analysis tasks in OpenTopography, since such processing is often computationally intensive and thus lends itself to the utilization of cyberinfrastructure. Products derived from OpenTopography processing are available in a variety of formats, ranging from simple Google Earth visualizations of LIDAR-derived hillshades to various GIS-compatible grid formats. To serve community users less interested in data processing, OpenTopography also hosts 1 km^2 digital elevation model tiles as well as Google Earth image overlays for a synoptic view of the data.
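
    At its core, the DEM generation such a service offers reduces an irregular point cloud to a regular grid. A minimal local-binning sketch is shown below; it simply averages elevations per cell, whereas a production service exposes proper interpolation options (IDW, splines, kriging).

    ```python
    import numpy as np

    def points_to_dem(x, y, z, cell=1.0):
        """Grid a LIDAR point cloud into a DEM by averaging the elevations
        of all points falling in each square cell (illustrative sketch)."""
        ix = ((x - x.min()) / cell).astype(int)
        iy = ((y - y.min()) / cell).astype(int)
        total = np.zeros((iy.max() + 1, ix.max() + 1))
        count = np.zeros_like(total)
        np.add.at(total, (iy, ix), z)  # sum of elevations per cell
        np.add.at(count, (iy, ix), 1)  # number of returns per cell
        with np.errstate(invalid="ignore"):
            return total / count       # NaN where a cell holds no points
    ```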

  13. Customizable scientific web-portal for DIII-D nuclear fusion experiment

    NASA Astrophysics Data System (ADS)

    Abla, G.; Kim, E. N.; Schissel, D. P.

    2010-04-01

    Increasing utilization of the Internet and convenient web technologies have made the web portal a major application interface for remote participation in and control of scientific instruments. While web portals have provided a centralized gateway for multiple computational services, the amount of visual output is often overwhelming due to the high volume of data generated by complex scientific instruments and experiments. Since each scientist may have different priorities and areas of interest in an experiment, filtering and organizing information based on the individual user's needs can increase the usability and efficiency of a web portal. DIII-D is the largest magnetic nuclear fusion device in the US. A web portal has been designed to support the experimental activities of DIII-D researchers worldwide. It offers a customizable interface with personalized page layouts and a list of services for users to select. Each individual user can create a unique working environment to fit his or her own needs and interests. Customizable services include real-time experiment status monitoring, diagnostic data access, and interactive data analysis and visualization. The web portal also supports interactive collaboration by providing a collaborative logbook and online instant announcement services. The DIII-D web portal development utilizes a multi-tier software architecture and Web 2.0 technologies and tools, such as AJAX and Django, to develop a highly interactive and customizable user interface.
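
    The real-time status monitoring service described here is, at heart, a JSON endpoint that the browser polls via AJAX. A minimal Django view of that shape is sketched below; the Shot class and its fields are hypothetical stand-ins, not the DIII-D portal's actual schema.

    ```python
    from dataclasses import dataclass
    from datetime import datetime, timezone

    from django.http import JsonResponse

    @dataclass
    class Shot:  # hypothetical stand-in for the experiment database record
        number: int
        state: str
        timestamp: datetime

    def current_shot():
        # Hypothetical data access; the real portal queries DIII-D systems.
        return Shot(123456, "between shots", datetime.now(timezone.utc))

    def experiment_status(request):
        """AJAX-pollable JSON status, in the spirit of the portal's
        real-time experiment status monitoring service."""
        shot = current_shot()
        return JsonResponse({"shot_number": shot.number,
                             "state": shot.state,
                             "updated": shot.timestamp.isoformat()})
    ```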

  14. Elastography methods for the non-invasive assessment of portal hypertension.

    PubMed

    Roccarina, Davide; Rosselli, Matteo; Genesca, Joan; Tsochatzis, Emmanuel A

    2018-02-01

    The gold standard to assess the presence and severity of portal hypertension remains the hepatic vein pressure gradient, however the recent development of non-invasive assessment using elastography techniques offers valuable alternatives. In this review, we discuss the diagnostic accuracy and utility of such techniques in patients with portal hypertension due to cirrhosis. Areas covered: A literature search focused on liver and spleen stiffness measurement with different elastographic techniques for the assessment of the presence and severity of portal hypertension and oesophageal varices in people with chronic liver disease. The combination of elastography with parameters such as platelet count and spleen size is also discussed. Expert commentary: Non-invasive assessment of liver fibrosis and portal hypertension is a validated tool for the diagnosis and follow-up of patients. Baveno VI recommended the combination of transient elastography and platelet count for ruling out varices needing treatment in patients with compensated advanced chronic liver disease. Assessment of aetiology specific cut-offs for ruling in and ruling out clinically significant portal hypertension is an unmet clinical need. The incorporation of spleen stiffness measurements in non-invasive algorithms using validated software and improved measuring scales might enhance the non-invasive diagnosis of portal hypertension in the next 5 years.

  15. TCGA Expedition: A Data Acquisition and Management System for TCGA Data

    PubMed Central

    Chandran, Uma R.; Medvedeva, Olga P.; Barmada, M. Michael; Blood, Philip D.; Chakka, Anish; Luthra, Soumya; Ferreira, Antonio; Wong, Kim F.; Lee, Adrian V.; Zhang, Zhihui; Budden, Robert; Scott, J. Ray; Berndt, Annerose; Berg, Jeremy M.; Jacobson, Rebecca S.

    2016-01-01

    Background The Cancer Genome Atlas Project (TCGA) is a National Cancer Institute effort to profile at least 500 cases of 20 different tumor types using genomic platforms and to make these data, both raw and processed, available to all researchers. TCGA data are currently over 1.2 petabytes in size and include whole genome sequence (WGS), whole exome sequence, methylation, RNA expression, proteomic, and clinical datasets. Publicly accessible TCGA data are released through public portals, but many challenges exist in navigating and using data obtained from these sites. We developed TCGA Expedition to support the research community focused on computational methods for cancer research. Data obtained, versioned, and archived using TCGA Expedition support command-line access at high-performance computing facilities as well as some functionality with third-party tools. For a subset of TCGA data collected at the University of Pittsburgh, we also re-associate TCGA data with de-identified data from the electronic health records. Here we describe the software as well as the architecture of our repository, methods for loading of TCGA data to multiple platforms, and security and regulatory controls that conform to federal best practices. Results TCGA Expedition software consists of a set of scripts written in Bash, Python and Java that download, extract, harmonize, version and store all TCGA data and metadata. The software generates a versioned, participant- and sample-centered, local TCGA data directory with metadata structures that directly reference the local data files as well as the original data files. The software supports flexible searches of the data via a web portal, user-centric data tracking tools, and data provenance tools. Using this software, we created a collaborative repository, the Pittsburgh Genome Resource Repository (PGRR), that enabled investigators at our institution to work with all TCGA data formats and to interrogate these data with analysis pipelines and associated tools. WGS data are especially challenging for individual investigators to use, due to issues with downloading, storage, and processing; having locally accessible WGS BAM files has proven invaluable. Conclusion Our open-source, freely available TCGA Expedition software can be used to create a local collaborative infrastructure for acquiring, managing, and analyzing TCGA data and other large public datasets. PMID:27788220
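
    The versioned, participant- and sample-centered directory with back-referencing metadata can be pictured with a small sketch; the layout and field names below are illustrative, not TCGA Expedition's actual schema.

    ```python
    import json
    import shutil
    from pathlib import Path

    def store_versioned(root, participant, sample, src_file, version, source_url):
        """Place one downloaded file into a participant/sample-centered,
        versioned directory and write metadata that references both the
        local copy and the original source (illustrative layout)."""
        dest_dir = Path(root) / participant / sample / f"v{version}"
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest = dest_dir / Path(src_file).name
        shutil.copy2(src_file, dest)
        meta = {"participant": participant, "sample": sample,
                "version": version, "local_path": str(dest),
                "original_source": source_url}
        (dest_dir / (dest.name + ".meta.json")).write_text(json.dumps(meta, indent=2))
        return dest
    ```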

  16. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.
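
    The deployment convenience containerization buys can be shown with the Docker SDK for Python: a pre-built image is pulled and started with one call. The image name and port mapping below are illustrative; consult Unidata's repositories for the actual images and settings.

    ```python
    import docker  # the Docker SDK for Python

    def deploy(image="unidata/cloudidv:latest", host_port=8080):
        """Start a containerized service in the background, mapping its
        web port to the host. Image name and ports are illustrative."""
        client = docker.from_env()
        container = client.containers.run(
            image,
            detach=True,                    # run in the background
            ports={"8080/tcp": host_port},  # container port -> host port
        )
        return container.id
    ```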

  17. Teachers' Acceptance and Use of an Educational Portal

    ERIC Educational Resources Information Center

    Pynoo, Bram; Tondeur, Jo; van Braak, Johan; Duyck, Wouter; Sijnave, Bart; Duyck, Philippe

    2012-01-01

    In this study, teachers' acceptance and use of an educational portal is assessed based on data from two sources: usage data (number of logins, downloads, uploads, reactions and pages viewed) and an online acceptance questionnaire. The usage data is extracted on two occasions from the portal's database: at survey completion (T1) and twenty-two…

  18. International Cancer Genome Consortium Data Portal--a one-stop shop for cancer genomics data.

    PubMed

    Zhang, Junjun; Baran, Joachim; Cros, A; Guberman, Jonathan M; Haider, Syed; Hsu, Jack; Liang, Yong; Rivkin, Elena; Wang, Jianxin; Whitty, Brett; Wong-Erasmus, Marie; Yao, Long; Kasprzyk, Arek

    2011-01-01

    The International Cancer Genome Consortium (ICGC) is a collaborative effort to characterize genomic abnormalities in 50 different cancer types. To make this data available, the ICGC has created the ICGC Data Portal. Powered by the BioMart software, the Data Portal allows each ICGC member institution to manage and maintain its own databases locally, while seamlessly presenting all the data in a single access point for users. The Data Portal currently contains data from 24 cancer projects, including ICGC, The Cancer Genome Atlas (TCGA), Johns Hopkins University, and the Tumor Sequencing Project. It consists of 3478 genomes and 13 cancer types and subtypes. Available open access data types include simple somatic mutations, copy number alterations, structural rearrangements, gene expression, microRNAs, DNA methylation and exon junctions. Additionally, simple germline variations are available as controlled access data. The Data Portal uses a web-based graphical user interface (GUI) to offer researchers multiple ways to quickly and easily search and analyze the available data. The web interface can assist in constructing complicated queries across multiple data sets. Several application programming interfaces are also available for programmatic access. Here we describe the organization, functionality, and capabilities of the ICGC Data Portal.
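
    Because the Data Portal is powered by BioMart, programmatic access follows BioMart's convention of POSTing an XML query to a martservice endpoint. The endpoint URL, dataset, filter, and attribute names below are placeholders to be looked up in the portal's documentation, not verified identifiers.

    ```python
    import requests

    MARTSERVICE = "https://dcc.example.org/biomart/martservice"  # placeholder URL

    # BioMart XML query; dataset/filter/attribute names are placeholders.
    QUERY = """<?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE Query>
    <Query virtualSchemaName="default" formatter="TSV" header="1" uniqueRows="1">
      <Dataset name="example_simple_somatic_mutation">
        <Filter name="gene_symbol" value="TP53"/>
        <Attribute name="mutation_id"/>
        <Attribute name="chromosome"/>
        <Attribute name="consequence_type"/>
      </Dataset>
    </Query>"""

    rows = requests.post(MARTSERVICE, data={"query": QUERY}).text
    print(rows.splitlines()[:5])  # first few tab-separated result rows
    ```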

  19. NASA Sea Level Change Portal - It's not just another portal site

    NASA Astrophysics Data System (ADS)

    Huang, T.; Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Moore, B.; Moore, J.; Boeck, A.

    2017-12-01

    The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is designed as a "one-stop" source for current sea level change information, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. With increasing global temperatures warming the ocean and melting ice sheets and glaciers, there is an immediate need both for accelerating sea level change research and for making this research accessible to scientists in disparate disciplines, to the general public, and to policy makers and businesses. The immersive and innovative NASA portal, which debuted at the 2015 AGU Fall Meeting, attracts thousands of daily visitors and over 30K followers on Facebook®. Behind its intuitive interface is an extensible architecture that integrates site content, data from various sources, visualization, horizontally scalable geospatial data analytic technology (called NEXUS), and an interactive 3D simulation platform (called the Virtual Earth System Laboratory). We will present an overview of the NASA portal and some of our architectural decisions, along with discussion of our open-source, cloud-based data analytic technology that enables on-the-fly analysis of heterogeneous data.

  20. NDFOM Description for DNDO Summer Internship Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Budden, Brent Scott

    2017-12-01

    Nuclear Detection Figure of Merit (NDFOM) is a DNDO-funded project at LANL to develop a software framework that allows a user to evaluate a radiation detection scenario of interest, quickly obtaining results on detector performance. It is intended as a "first step" in detector performance assessment, and is meant to be easily employed by subject matter experts (SMEs) and non-SMEs alike. The generic scenario consists of a potential source moving past a detector at a relative velocity and with a distance of closest approach. Such a scenario is capable of describing, e.g., vehicles driving through portal monitors, border patrol scanning suspected illicit materials with a handheld instrument, and first responders with backpack- or pager-based detectors (see Fig. 1). The backend library is prepopulated by the NDFOM developers to include sources and detectors of interest to DNDO and its community.
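
    The generic scenario lends itself to a quick first-order estimate: with closest approach d0 and speed v, the squared source-detector distance is d(t)^2 = d0^2 + (v*t)^2, and integrating an inverse-square count rate over the pass gives the expected counts. The sketch below is only that textbook model with made-up numbers, not NDFOM's actual figure-of-merit calculation.

    ```python
    import numpy as np

    def drive_by_counts(S, eff_area, d0, v, t_half=10.0, dt=0.01):
        """Expected counts from a point source passing a detector:
        emission rate S (1/s), detector area x efficiency eff_area (m^2),
        closest approach d0 (m), speed v (m/s). Illustrative inverse-square
        model only -- not NDFOM's actual method."""
        t = np.arange(-t_half, t_half, dt)        # time about closest approach
        d2 = d0**2 + (v * t)**2                   # squared source-detector distance
        rate = S * eff_area / (4.0 * np.pi * d2)  # instantaneous count rate
        return rate.sum() * dt                    # time-integrated counts

    # e.g. a 2e4 n/s source passing at 2 m closest approach and 2.2 m/s:
    print(drive_by_counts(S=2e4, eff_area=0.05, d0=2.0, v=2.2))
    ```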

  1. [Using modern information technology in the practice of sanitary-epidemiological surveillance during the XXII Olympic Winter Games and XI Paralympic Winter Games in Sochi].

    PubMed

    Popova, A Yu; Kuzkin, B P; Demina, Yu V; Dubyansky, V M; Kulichenko, A N; Maletskaya, O V; Shayakhmetov, O Kh; Semenko, O V; Nazarenko, Yu V; Agapitov, D S; Mezentsev, V M; Kharchenko, T V; Efremenko, D V; Oroby, V G; Klindukhov, V P; Grechanaya, T V; Nikolaevich, P N; Tesheva, S Ch; Rafeenko, G K

    2015-01-01

    To improve sanitary and epidemiological surveillance at the Olympic Games, a GIS-based system for monitoring facilities and the epidemiological situation in the Sochi region was developed. The system is based on the ArcGIS for Server software package, version 10.2, with web components and custom software developed in Java, served by an Apache web server. The system was used to solve the following tasks: stratification of the Olympic region by individual and aggregate epidemiological risk of acute intestinal infections of various etiologies; ranking of epidemiologically important facilities by their sanitary and hygienic conditions; and monitoring of infectious diseases in real time, according to the preliminary diagnosis. GIS monitoring has shown its effectiveness: information received from various sources was consolidated on a single portal and was available in real time to all the specialists involved in ensuring epidemiological well-being, who used it in their work during the Olympic Games in Sochi.

  2. Non-variceal upper gastrointestinal bleeding in cirrhotic patients in Nile Delta.

    PubMed

    Gabr, Mamdouh Ahmed; Tawfik, Mohamed Abd El-Raouf; El-Sawy, Abd Allah Ahmed

    2016-01-01

    Acute upper gastrointestinal bleeding (AUGIB) in cirrhotic patients occurs mainly from esophageal and gastric varices; however, quite a large number of cirrhotic patients bleed from other sources as well. The aim of the present work was to determine the prevalence of non-variceal UGIB, as well as its different causes, among cirrhotic portal hypertensive patients in the Nile Delta. Emergency upper gastrointestinal (UGI) endoscopy for AUGIB was done in 650 patients. Of these, the 550 (84.6%) patients who were proven to have cirrhosis were the subjects of the present study. Of all cirrhotic portal hypertensive patients, 415 (75.5%) bled from variceal sources (esophageal and gastric) while 135 (24.5%) bled from non-variceal sources. Among variceal sources of bleeding, esophageal varices were much more common than gastric varices. Peptic ulcer was the most common non-variceal source of bleeding. Non-variceal bleeding in cirrhosis was not frequent; its sources included peptic ulcer, portal hypertensive gastropathy, and erosive disease of the stomach and duodenum.

  3. The VTIE telescope resource management system

    NASA Astrophysics Data System (ADS)

    Busschots, B.; Keating, J. G.

    2005-06-01

    The VTIE Telescope Resource Management System (TRMS) provides a framework for managing a distributed group of internet telescopes as a single "Virtual Observatory". The TRMS provides hooks that allow it to be connected to any Java-based web portal and for a Java-based scheduler to be added to it. The TRMS represents each telescope and observatory in the system with a software agent and then allows the scheduler and web portal to communicate with these distributed resources in a simple, transparent way, hence allowing the scheduler and portal designers to concentrate only on what they wish to do with these resources rather than on how to communicate with them. This paper outlines the structure and implementation of this framework.

  4. Monte Carlo simulation of portal dosimetry on a rectilinear voxel geometry: a variable gantry angle solution.

    PubMed

    Chin, P W; Spezi, E; Lewis, D G

    2003-08-21

    A software solution has been developed to carry out Monte Carlo simulations of portal dosimetry using the BEAMnrc/DOSXYZnrc code at oblique gantry angles. The solution is based on an integrated phantom, whereby the effect of incident beam obliquity is included using geometric transformations. The geometric transformations are accurate to within +/- 1 mm and +/- 1 degree with respect to exact values calculated using trigonometry. An application to portal image prediction of an inhomogeneous phantom demonstrated good agreement with measured data, where the root-mean-square of the difference was under 2% within the field. Thus, we achieved a dose model framework capable of handling arbitrary gantry angles, voxel-by-voxel phantom description and realistic particle transport throughout the geometry.
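
    The geometric idea - expressing the oblique beam in the phantom's rectilinear frame by rotating coordinates about the isocenter - can be shown in a few lines. The axis convention below is a simplification for illustration, not the actual BEAMnrc/DOSXYZnrc source geometry.

    ```python
    import numpy as np

    def to_phantom_frame(point, gantry_deg):
        """Rotate a point from the gantry (beam) frame into the fixed
        phantom frame, for a gantry rotation about the y axis through
        an isocenter at the origin (simplified axis convention)."""
        g = np.radians(gantry_deg)
        rot = np.array([[ np.cos(g), 0.0, np.sin(g)],
                        [ 0.0,       1.0, 0.0      ],
                        [-np.sin(g), 0.0, np.cos(g)]])
        return rot @ np.asarray(point, dtype=float)

    # A beam-axis point 100 cm upstream of the isocenter, gantry at 30 degrees:
    print(np.round(to_phantom_frame([0.0, 0.0, -100.0], 30.0), 3))
    ```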

  5. Intervention Therapy for Portal Vein Stenosis/Occlusion After Pediatric Liver Transplantation.

    PubMed

    Gao, Haijun; Wang, Hao; Chen, Guang; Yi, Zhengjia

    2017-04-18

    BACKGROUND The aim of this study was to investigate the outcomes and stent implantation timing of portal vein stenosis intervention after pediatric liver transplantation (pLT). MATERIAL AND METHODS The clinical data of 30 children with post-liver-transplantation portal vein stenosis/occlusion (PVS/O) between Jan 2008 and Jun 2015 were retrospectively analyzed. In successfully re-opened cases, balloon angioplasty or stent implantation was used. SPSS 13.0 software was used for statistical analysis, with paired t tests of the pressure gradient across the stenosis, the diameter of and flow rate within the stenosis, the platelet count, and albumin in the PVS children before and after balloon angioplasty; p<0.05 was considered statistically significant. Among the 30 patients, 6 received a stent implant as their first treatment, 22 received balloon angioplasty as their first treatment, and in 2 patients re-opening could not be achieved. RESULTS The diameter of the stenotic segment, portal vein velocity, pressure gradient across the stenosis, and platelet count in these children with portal vein stenosis/occlusion (PVS/O) showed statistically significant differences between values before and after intervention (p<0.05), but albumin showed no statistically significant difference (p>0.05). CONCLUSIONS Intervention therapy for portal vein stenosis after pediatric liver transplantation (pLT-PVS) is a safe and effective treatment, and patients with portal vein torsion, intimal tearing, or long-segment portal vein occlusion should undergo stent implantation.
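
    The paired pre/post comparison reported here can be reproduced in a few lines with SciPy rather than SPSS; the velocity values below are made up solely to show the mechanics of a paired t test.

    ```python
    from scipy import stats

    # Hypothetical pre/post-angioplasty portal vein velocities (cm/s),
    # one pair per child; values invented for illustration.
    before = [12.0, 15.5, 10.8, 14.2, 11.9, 13.3]
    after  = [28.4, 31.0, 25.7, 30.2, 27.5, 29.8]

    t, p = stats.ttest_rel(before, after)  # paired t test, as in the study
    print(f"t = {t:.2f}, p = {p:.4f}")     # p < 0.05 -> significant change
    ```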

  6. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding the spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and the Open Geospatial Consortium KML standards show great promise for collaborative exploration of such events using visual analytical approaches. However, there are currently two barriers to wider use of these approaches. First, there is no easy way to use open source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is not yet available; users can share their files in a portal but have no means to explore them visually without leaving the portal environment they are familiar with. We have developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet 2.0 standard (JSR 286), and it is currently deployable in one of the most popular open source portal frameworks, Liferay. We have also developed an open source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate the tool with example cases in which drought and storm events can be explored in both their time and space dimensions within this web-based KML animation portlet. The tool provides an easy-to-use, browser-based portal environment in which multiple users can collaboratively share and explore their time-aware KML files and improve their understanding of the spatiotemporal dynamics of hydrological events.
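
    A time-aware KML file is an ordinary KML document whose features carry TimeSpan (or TimeStamp) elements that time-slider clients such as Google Earth can animate. The sketch below hand-rolls such a placemark; it is illustrative only and does not use the kml-soc-ncsa toolkit's actual API.

```python
# Minimal sketch of a time-aware KML placemark, the kind of output a
# shapefile/GeoTIFF-to-KML conversion produces.

def timespan_placemark(name, lon, lat, begin, end):
    return f"""
  <Placemark>
    <name>{name}</name>
    <TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>"""

body = timespan_placemark("storm cell", -88.2, 40.1,
                          "2008-06-01T00:00:00Z", "2008-06-02T00:00:00Z")
kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
       f'{body}\n</Document></kml>')
print(kml)  # Google Earth's time slider animates TimeSpan elements
```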

  7. Geometrical verification system using Adobe Photoshop in radiotherapy.

    PubMed

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films, since scanned images can be viewed simultaneously in this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured the distance between the isocenter on the simulation film and that on the portal film by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured the same distances in the same way. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes (p < 0.007). The geometrical verification system can run on any PC with Adobe Photoshop installed and is useful for improving the accuracy of verification.

  8. An assessment of a film enhancement system for use in a radiation therapy department.

    PubMed

    Solowsky, E L; Reinstein, L E; Meek, A G

    1990-01-01

    The clinical uses of a radiotherapy film enhancement system are explored. The primary functions of the system are to improve the quality of poorly exposed simulator and portal films, and to compare the two films to determine whether patient or block positioning errors are present. Other features include: the production of inexpensive, high quality hardcopy images of simulation films and initial portal films for chart documentation; the capacity to overlay lateral simulation films with sagittal MRI films to aid in field design; and a mode to zoom in on individual CT or MRI images and enlarge them for video display during chart rounds or instructional sessions. This commercially available system comprises a microcomputer, frame grabber, CCD camera with zoom lens, and a high-resolution thermal printer. The user-friendly software is menu driven and uses both a keyboard and a trackball. At the heart of the software is a very fast Adaptive Histogram Equalization (AHE) routine, which enhances and improves the readability of most portal films. The system has been evaluated for several disease sites, and its advantages and limitations will be presented.
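
    Adaptive Histogram Equalization applies histogram equalization within local image regions; the sketch below shows the simpler global variant on a synthetic 8-bit image to make the core remapping step concrete (it is not the commercial system's routine).

```python
import numpy as np

def equalize(img):
    """Global histogram equalization of an 8-bit image.

    AHE applies the same remapping within local tiles; this global
    version is the simplest variant of the idea.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = cdf / cdf[-1]                        # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]                            # remap every pixel

# Example: a synthetic low-contrast "portal film".
film = np.clip(np.random.normal(120, 10, (64, 64)), 0, 255).astype(np.uint8)
print(film.std(), equalize(film).std())        # contrast stretch is visible
```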

  9. My World Is Your World: Web Portal Design For Environmental Data

    NASA Astrophysics Data System (ADS)

    Laney, C.; Cody, R. P.; Gaylord, A. G.; Kassin, A.; Manley, W. F.; Score, R.; Tweedie, C. E.

    2013-12-01

    In the environmental sciences, researchers are increasingly relying on automated sensors as necessary components of their work. There are many software packages available that will help users download data from internet-connected data loggers; process, store, document, and analyze the data; or provide web-based geoportals for visualization and sharing of both spatial and time-series data. However, few (if any) software packages provide a complete, end-to-end system that will meet all of the needs of any given research group. Such systems often need to be designed and built as needed. Our group specializes in creating such systems. Our portals provide rapid data discovery and contextualization, and promote collaboration. We work at multiple scales, from a small lab working at a single site in the Chihuahuan desert (SEL-Jornada), to a community portal for environmental data from Barrow, Alaska (Barrow Area Information Database Information Management System [BAID-IMS]), to a project-tracking system for US Arctic research efforts (Arctic Research Mapping Application/Arctic Observing Viewer [ARMAP/AON]). Here, we share our experiences of creating scalable systems and improving practices that address both user community and research needs.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals into readable information, and is capable of encrypting data using the 256-bit Advanced Encryption Standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles, giving the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals; a SQL server is employed for this purpose. An XML script updates the database once information is sent from the application software. The design of the web page mirrors the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name and password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer that is directly connected to the RFID devices and can be controlled locally or remotely. Multiple local computers manage different sites or transport vehicles. Control from remote sites and transmission of information to the central database server take place over a secured internet connection. The information stored in the central database server is shown on the web page, which users can view over the internet. A dedicated and secured web and database server (https) is used to provide information security.
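
    The abstract specifies 256-bit AES but not a cipher mode; the sketch below illustrates the encryption step using the authenticated AES-GCM mode from the widely used Python cryptography package, with an invented tag reading as payload.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Sketch of the AES-256 step described above; GCM mode is assumed here
# for illustration, and the payload is invented.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

reading = b'{"tag": "A1B2C3", "portal": "door-3", "ts": "2010-01-01T12:00Z"}'
nonce = os.urandom(12)                         # unique per message
ciphertext = aesgcm.encrypt(nonce, reading, None)

# The receiver (web/database server) decrypts with the shared key.
assert aesgcm.decrypt(nonce, ciphertext, None) == reading
```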

  12. OntoCAT -- simple ontology search and integration in Java, R and REST/JavaScript

    PubMed Central

    2011-01-01

    Background Ontologies have become an essential asset in the bioinformatics toolbox, and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately and map their contents to research data, and much of this effort is replicated across different research groups. Results OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy-to-learn Java, Bioconductor/R and REST web service commands, enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. Conclusions OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. Availability http://www.ontocat.org PMID:21619703

  13. OntoCAT--simple ontology search and integration in Java, R and REST/JavaScript.

    PubMed

    Adamusiak, Tomasz; Burdett, Tony; Kurbatova, Natalja; Joeri van der Velde, K; Abeygunawardena, Niran; Antonakaki, Despoina; Kapushesky, Misha; Parkinson, Helen; Swertz, Morris A

    2011-05-29

    Ontologies have become an essential asset in the bioinformatics toolbox, and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately and map their contents to research data, and much of this effort is replicated across different research groups. OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy-to-learn Java, Bioconductor/R and REST web service commands, enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. http://www.ontocat.org.
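
    For a sense of what OntoCAT abstracts away, the sketch below issues the kind of raw REST search that services like OLS answer; the endpoint and response fields follow the public EBI OLS API and should be treated as approximate, not as OntoCAT's own interface.

```python
import requests

# The kind of raw REST query OntoCAT hides behind uniform commands.
# Endpoint and field names follow the public EBI OLS search API as of
# writing; treat them as approximate.
resp = requests.get("https://www.ebi.ac.uk/ols/api/search",
                    params={"q": "portal vein", "rows": 5})
resp.raise_for_status()

for doc in resp.json()["response"]["docs"]:
    print(doc.get("obo_id"), doc.get("label"))
```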

  14. eXtended CASA Line Analysis Software Suite (XCLASS)

    NASA Astrophysics Data System (ADS)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, obtained via the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface to the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
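
    For an isothermal object in one dimension, the emergent intensity reduces to B_nu(T)(1 - e^(-tau_nu)), with B_nu the Planck function. The sketch below evaluates that closed form; it omits the source size, dust and background terms the real myXCLASS treats.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
K = 1.380649e-23     # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def planck(nu, T):
    """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    return (2 * H * nu**3 / C**2) / np.expm1(H * nu / (K * T))

def isothermal_slab(nu, T, tau):
    """Emergent intensity of an isothermal slab: B_nu(T) * (1 - e^-tau).

    Minimal sketch of the 1D isothermal solution only.
    """
    return planck(nu, T) * (1.0 - np.exp(-tau))

# Example: a 100 GHz line from 50 K gas at moderate optical depth.
print(isothermal_slab(100e9, 50.0, 1.5))
```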

  15. The NIH Library of Integrated Network-Based Cellular Signatures (LINCS) Program | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    By generating and making public data that indicates how cells respond to various genetic and environmental stressors, the LINCS project will help us gain a more detailed understanding of cell pathways and aid efforts to develop therapies that might restore perturbed pathways and networks to their normal states. The LINCS website is a source of information for the research community and general public about the LINCS project. This website along with the LINCS Data Portal contains details about the assays, cell types, and perturbagens that are currently part of the library, as well as links to participating sites, data releases from the sites, and software that can be used for analyzing the data.

  16. The climate4impact platform: Providing, tailoring and facilitating climate model data access

    NASA Astrophysics Data System (ADS)

    Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael

    2017-04-01

    One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing, web coverage and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted at specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. The portal provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, use cases and best practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface based on the ADAGUC web mapping tools. In the latest version, statistical downscaling services provided by the Santander Meteorology Group Downscaling Portal were integrated. An innovative interface to these statistical downscaling services will be released in the upcoming version; this will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the results of the global climate models of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data. The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiment (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some could later be integrated into the European Copernicus platform.

  17. Science Gateways, Scientific Workflows and Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, enable large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced use of cyberinfrastructure, enabling "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to data-mine GPS time series are workflows; the results, and the ability to make downstream products, may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakeSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend on a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. Such services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service with these features.

  18. Interactive video audio system: communication server for INDECT portal

    NASA Astrophysics Data System (ADS)

    Mikulec, Martin; Voznak, Miroslav; Safarik, Jakub; Partila, Pavol; Rozhon, Jan; Mehic, Miralem

    2014-05-01

    The paper presents the IVAS system developed within the EU FP7 INDECT project. The INDECT project aims at developing tools for enhancing the security of citizens and protecting the confidentiality of recorded and stored information; it is part of the Seventh Framework Programme of the European Union. We contribute to the INDECT portal and the Interactive Video Audio System (IVAS). The IVAS system provides a communication gateway between police officers working in a dispatching centre and police officers in the field. The officers in the dispatching centre can obtain information about all online police officers in the field, command them via text messages, voice or video calls, and manage multimedia files from CCTV cameras or other sources that may be of interest to officers in the field. The police officers in the field are equipped with smartphones or tablets. Besides common communication, they can view pictures or videos sent by the commander and respond to commands via text or multimedia messages captured by their devices. Our IVAS system is unique in that we are developing it according to specific requirements from the Police of the Czech Republic. The IVAS communication system is designed to use modern Voice over Internet Protocol (VoIP) services. The whole solution is based on open source software, including the Linux and Android operating systems. The technical details of our solution are presented in the paper.

  19. An Overview of the Challenges in Designing, Integrating, and Delivering BARD: A Public Chemical-Biology Resource and Query Portal for Multiple Organizations, Locations, and Disciplines.

    PubMed

    de Souza, Andrea; Bittker, Joshua A; Lahr, David L; Brudz, Steve; Chatwin, Simon; Oprea, Tudor I; Waller, Anna; Yang, Jeremy J; Southall, Noel; Guha, Rajarshi; Schürer, Stephan C; Vempati, Uma D; Southern, Mark R; Dawson, Eric S; Clemons, Paul A; Chung, Thomas D Y

    2014-06-01

    Recent industry-academic partnerships involve collaboration among disciplines, locations, and organizations using publicly funded "open-access" and proprietary commercial data sources. These require the effective integration of chemical and biological information from diverse data sources, which presents key informatics, personnel, and organizational challenges. The BioAssay Research Database (BARD) was conceived to address these challenges and serve as a community-wide resource and intuitive web portal for public-sector chemical-biology data. Its initial focus is to enable scientists to more effectively use the National Institutes of Health Roadmap Molecular Libraries Program (MLP) data generated from the 3-year pilot and 6-year production phases of the Molecular Libraries Probe Production Centers Network (MLPCN), which is currently in its final year. BARD evolves the current data standards through structured assay and result annotations that leverage BioAssay Ontology and other industry-standard ontologies, and a core hierarchy of assay definition terms and data standards defined specifically for small-molecule assay data. We initially focused on migrating the highest-value MLP data into BARD and bringing it up to this new standard. We review the technical and organizational challenges overcome by the interdisciplinary BARD team, veterans of public- and private-sector data-integration projects, who are collaborating to describe (functional specifications), design (technical specifications), and implement this next-generation software solution. © 2014 Society for Laboratory Automation and Screening.

  20. Access High Quality Imagery from the NOAA View Portal

    NASA Astrophysics Data System (ADS)

    Pisut, D.; Powell, A. M.; Loomis, T.; Goel, V.; Mills, B.; Cowan, D.

    2013-12-01

    NOAA curates a vast treasure trove of environmental data, but one that is sometimes not easily accessed, especially for education, outreach, and media purposes. Traditional data portals in NOAA require extensive knowledge of the specific names of observation platforms, models, and analyses, along with the nomenclature for variable outputs. A new website and web mapping service (WMS) from NOAA attempts to remedy these issues. The NOAA View data imagery portal provides a seamless entry point into data from across the agency: satellite, models, in-situ analysis, etc. The system gives the user the ability to browse and animate data and to download high-resolution (e.g., 4,000 x 2,000 pixel) imagery, Google Earth files, and even proxy data files. The WMS architecture also allows the resources to be ingested into other software systems or applications.
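
    Because the portal is exposed as a standard WMS, imagery can be requested with an ordinary GetMap query. The sketch below builds such a request; the base URL and layer name are placeholders, with real values discoverable from the service's GetCapabilities response.

```python
from urllib.parse import urlencode

# Sketch of a standard WMS 1.3.0 GetMap request of the kind a service
# like NOAA View answers. Base URL and layer name are hypothetical.
base = "https://example.noaa.gov/wms"
params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "sea_surface_temperature",   # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",             # lat/lon order in WMS 1.3.0
    "WIDTH": "4000", "HEIGHT": "2000",
    "FORMAT": "image/png",
}
print(f"{base}?{urlencode(params)}")
```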

  1. Open source clinical portals: a model for healthcare information systems to support care processes and feed clinical research. An Italian case of design, development, reuse, and exploitation.

    PubMed

    Locatelli, Paolo; Baj, Emanuele; Restifo, Nicola; Origgi, Gianni; Bragagia, Silvia

    2011-01-01

    Open source is a still largely unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter presents the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. It also represents a rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at the Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

  2. Galaxy Portal: interacting with the galaxy platform through mobile devices.

    PubMed

    Børnich, Claus; Grytten, Ivar; Hovig, Eivind; Paulsen, Jonas; Čech, Martin; Sandve, Geir Kjetil

    2016-06-01

    We present the Galaxy Portal app, an open-source interface to the Galaxy system for smartphones and tablets. The Galaxy Portal provides convenient and efficient monitoring of job completion, as well as opportunities for inspection of results and execution history. In addition to being useful to the Galaxy community, we believe the app also exemplifies a useful way of exploiting mobile interfaces for research/high-performance computing resources in general. The source is freely available under a GPL license on GitHub, along with user documentation, pre-compiled binaries and instructions for several platforms: https://github.com/Tarostar/QMLGalaxyPortal It is available for iOS version 7 (and newer) through the Apple App Store, and for Android version 4.1 (API 16) or newer through Google Play. geirksa@ifi.uio.no. © The Author 2016. Published by Oxford University Press.

  3. Influence of Extraterrestrial Radiation on Radiation Portal Monitors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Paul E.; Kouzes, Richard T.

    2009-06-01

    Cosmic radiation and solar flares can be a major source of background radiation at the Earth’s surface. This paper examines the relationship between extraterrestrial radiation and the detectable background in radiation portal monitors used for homeland security applications. Background radiation data from 13 radiation portal monitor facilities are examined and compared against external sources of data related to extraterrestrial radiation, including measurements at neutron monitors located at 53 cosmic-ray observatories around the Earth, four polar orbiting satellites, three geostationary satellites, ground-based geomagnetic field data from observatories around the Earth, a solar magnetic index, solar radio flux data, and sunspot activity data. Four years of data (January 2003 through December 2006) are used in this study, covering the latter part of Solar Cycle 23 as solar activity was on the decline. The analysis shows a significant relationship between some extraterrestrial radiation and the background detected in the radiation portal monitors. A demonstrable decline is shown in the average gamma ray and neutron background at the radiation portal monitors as solar activity declined over the period of the study.

  4. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System (OBIS) was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an open source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved performance and the online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a cloud computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid redevelopment of the OBIS infrastructure and ensured complete standards compliance.
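
    A stack like this typically serves the portal's map queries straight from PostGIS. The sketch below shows the kind of bounding-box query involved; the connection string, table and column names are hypothetical, not OBIS's actual schema.

```python
import psycopg2

# The kind of spatial query a PostgreSQL/PostGIS layer answers for a
# biogeography portal. All identifiers below are hypothetical.
conn = psycopg2.connect("dbname=obis user=reader")
cur = conn.cursor()
cur.execute(
    """
    SELECT species_name, ST_X(geom), ST_Y(geom)
    FROM occurrences
    WHERE ST_Within(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
    """,
    (-10.0, 40.0, 5.0, 55.0),  # lon/lat bounding box (xmin, ymin, xmax, ymax)
)
for name, lon, lat in cur.fetchall():
    print(name, lon, lat)
```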

  5. Integrating Space Communication Network Capabilities via Web Portal Technologies

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.; Lee, Carlyn-Ann; Lau, Chi-Wung; Cheung, Kar-Ming; Levesque, Michael; Carruth, Butch; Coffman, Adam; Wallace, Mike

    2014-01-01

    We have developed a service portal prototype as part of an investigation into the feasibility of using Java portlet technology as a means of providing integrated access to NASA communications network services. Portal servers provide an attractive platform for this role due to the various built-in collaboration applications they provide, combined with the possibility of developing custom inter-operating portlets to extend their functionality while preserving common presentation and behavior. This paper describes various options for integrating network services related to planning and scheduling, and results based on use of a popular open-source portal framework. Plans are underway to develop an operational SCaN Service Portal, building on the experiences reported here.

  6. Pattern-based information portal for business plan co-creation

    NASA Astrophysics Data System (ADS)

    Bontchev, Boyan; Ruskov, Petko; Tanev, Stoyan

    2011-03-01

    Creating business plans helps entrepreneurs manage the identification of business opportunities and commit the necessary resources for process evolution. Applying patterns in business plan creation facilitates the identification of effective solutions that were adopted in the past and may provide a basis for adopting similar solutions in the future within a given business context. The article presents the system design of an information portal for business plan co-creation based on patterns. The portal will provide start-ups and entrepreneurs with ready-to-modify business plan patterns in order to help them develop effective and efficient business plans, and will facilitate entrepreneurs in co-experimenting and co-learning more frequently and faster. Moreover, the paper focuses on the software architecture of the pattern-based portal and explains the functionality of its modules, namely the pattern designer, the pattern repository services and the agent-based pattern implementers. It explains their roles in business process co-creation, in storing and managing formally described patterns, and in selecting the patterns best suited to a specific business case. Thus, innovative entrepreneurs will be guided by the portal in co-writing winning business plans and staying competitive in the present-day dynamic globalized environment.

  8. [The EU Portal: Implementation, importance, and features].

    PubMed

    von Aschen, Harald; Krafft, Hartmut

    2017-08-01

    The European Medicines Agency (EMA) is developing a web-based EU portal with a database "at Union level as a single entry point for the submission of data and information relating to clinical trials in accordance with" the new EU regulation No. 536/2014. The specifications are mostly published, but some documents are still missing. Because the project is integrated with and dependent on other projects, further specification upgrades are possible. The IT solution is under ongoing development until project completion in the third quarter of 2019. The EU Portal and the database will be audited; if the audit is successful, the new regulation will come into force in October 2018, and use of the EU Portal will then be mandatory, with some transition rules. The software development of the portal is restricted to the regulation and the requirements derived from it; it is not possible to implement any national requirements. We describe in this paper the current key functionalities of the portal and try to derive requirements for a national IT system. On 16.06.2017 the EMA Management Board announced that the development of the new portal had been delayed and that the new regulation is now foreseen to come into effect in 2019 at the earliest. The press release can be found here: http://www.ema.europa.eu/ema/index.jsp?curl=pages/news_and_events/news/2017/06/news_detail_002764.jsp%26mid=WC0b01ac058004d5c1 (accessed: 12.07.2017).

  9. The NIDDK Information Network: A Community Portal for Finding Data, Materials, and Tools for Researchers Studying Diabetes, Digestive, and Kidney Diseases

    PubMed Central

    Whetzel, Patricia L.; Grethe, Jeffrey S.; Banks, Davis E.; Martone, Maryann E.

    2015-01-01

    The NIDDK Information Network (dkNET; http://dknet.org) was launched to serve the needs of basic and clinical investigators in metabolic, digestive and kidney disease by facilitating access to research resources that advance the mission of the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK). By research resources, we mean the multitude of data, software tools, materials, services, projects and organizations available to researchers in the public domain. Most of these are accessed via web-accessible databases or web portals, each developed, designed and maintained by numerous different projects, organizations and individuals. While many of the large government-funded databases, maintained by agencies such as the European Bioinformatics Institute and the National Center for Biotechnology Information, are well known to researchers, many more that have been developed by and for the biomedical research community are unknown or underutilized. At least part of the problem is the nature of dynamic databases, which are considered part of the “hidden” web, that is, content that is not easily accessed by search engines. dkNET was created specifically to address the challenge of connecting researchers to research resources via these types of community databases and web portals. dkNET functions as a “search engine for data”, searching across millions of database records contained in hundreds of biomedical databases developed and maintained by independent projects around the world. A primary focus of dkNET is the centers and projects specifically created to provide high quality data and resources to NIDDK researchers. Through the novel data ingest process used in dkNET, additional data sources can easily be incorporated, allowing it to scale with the growth of digital data and the needs of the dkNET community. Here, we provide an overview of the dkNET portal and its functions. We show how dkNET can be used to address a variety of use cases that involve searching for research resources. PMID:26393351

  10. EMODnet Physics in the EMODnet program phase 3

    NASA Astrophysics Data System (ADS)

    Novellino, Antonio; Gorringe, Patrick; Schaap, Dick; Pouliquen, Sylvie; Rickards, Lesley; Thijsse, Peter; Manzella, Giuseppe

    2017-04-01

    Access to marine data is of vital importance for marine research and a key issue for various studies, from climate change prediction to offshore engineering. Giving access to and harmonising marine data from different sources will help industry, public authorities and researchers find the data and make more effective use of them to develop new products and services and to improve our understanding of how the seas behave. The aim of EMODnet Physics is the provision of a combined array of services and functionalities (facilities for viewing and downloading, dashboard reporting and machine-to-machine communication services) for obtaining, free of charge, data, metadata and data products on the physical conditions of European sea basins and oceans from many different distributed databases. Moreover, the system provides full interoperability with third-party software through WMS services, web services and web catalogues in order to exchange data and products according to the most recent standards, which assures users access to data of consistent quality and format. The portal provides access to data and products of: wave height and period; temperature and salinity of the water column; wind speed and direction; horizontal velocity of the water column; light attenuation; sea ice coverage; and sea level trends. EMODnet Physics is continuously enhancing the number and type of platforms in the system by unlocking and providing high quality data from a growing network. The system now integrates information from more than 12,000 stations and includes two ready-to-use data products: Ice Map and Sea Level Trends. The final aim of EMODnet Physics is to federate different portals and be a portal of portals, further extending the number and type of data (e.g., water noise, river data) and platforms (e.g., animal-borne instruments) feeding the system and improving its capacity to produce data and products that match the needs of current and potential new end users and intermediate users.

  11. The Most Common Geometric and Semantic Errors in CityGML Datasets

    NASA Astrophysics Data System (ADS)

    Biljecki, F.; Ledoux, H.; Du, X.; Stoter, J.; Soon, K. H.; Khoo, V. H. S.

    2016-10-01

    To be used as input in most simulation and modelling software, 3D city models should be geometrically and topologically valid and semantically rich. In this paper we investigate the quality of currently available CityGML datasets: we validate the geometry/topology of the 3D primitives (Solid and MultiSurface), and we validate whether the semantics of the boundary surfaces of buildings are correct. We have analysed all the CityGML datasets we could find, both from city portals and from different websites, plus a few that were made available to us. We have thus validated 40M surfaces in 16M 3D primitives and 3.6M buildings found in 37 CityGML datasets originating from 9 countries and produced by several companies with diverse software and acquisition techniques. The results indicate that CityGML datasets without errors are rare, and those that are nearly valid are mostly simple LOD1 models. We report on the most common errors we have found and analyse them. One main observation is that many of these errors could be automatically fixed or prevented with simple modifications to the modelling software. Our principal aim is to highlight the most common errors so that they are not repeated in the future. We hope that our paper and the open-source software we have developed will help raise awareness of data quality among data providers and 3D GIS software producers.
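
    Several of the most common geometric errors, such as unclosed polygon rings and consecutive duplicate vertices, can be caught with very simple checks. The sketch below illustrates two such checks; it is a toy illustration, not the validation software used in the paper.

```python
# Two simple geometric checks that catch common CityGML ring errors:
# an unclosed LinearRing and consecutive duplicate vertices.

def ring_is_closed(ring, tol=1e-6):
    """A GML LinearRing must repeat its first vertex at the end."""
    return all(abs(a - b) <= tol for a, b in zip(ring[0], ring[-1]))

def has_duplicate_vertices(ring, tol=1e-6):
    """Consecutive identical vertices degenerate the ring."""
    return any(
        all(abs(a - b) <= tol for a, b in zip(p, q))
        for p, q in zip(ring, ring[1:])
    )

ring = [(0, 0, 0), (10, 0, 0), (10, 10, 0), (0, 10, 0), (0, 0, 0)]
print(ring_is_closed(ring), has_duplicate_vertices(ring))  # True False
```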

  12. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    NASA Astrophysics Data System (ADS)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a real problem in the 21st-century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be made openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase of the Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a Java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful both for practical applications, as a teaching tool and case study for groundwater management, and for informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded OntoSoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance of the original case study to serve as a live demonstration of the decision support tool, assuring the original version remains usable, and 2) an open version of the codebase, executable installation files, and a developer guide available via an open repository, assuring the source for the application is accessible with version control and open to new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.

  13. On flattening filter‐free portal dosimetry

    PubMed Central

    Novais, Juan Castro; Molina López, María Yolanda; Maqueda, Sheila Ruiz

    2016-01-01

    Varian introduced (in 2010) the option of removing the flattening filter (FF) in their C‐Arm linacs for intensity‐modulated treatments. This mode, called flattening filter‐free (FFF), offers the advantage of a greater dose rate. Varian's "Portal Dosimetry" is an electronic portal imaging device (EPID)‐based tool for IMRT verification. This tool lacks the capability of verifying FFF modes due to saturation and the lack of an image prediction algorithm (note: the latest versions of this software and EPID correct these issues). The objective of the present study is to investigate the feasibility of such verifications with the older versions of the software and EPID. By placing the EPID at a greater distance, images can be acquired without saturation, yielding a linearity similar to that of the flattened mode. For image prediction, a method was optimized based on the clinically used algorithm (the analytical anisotropic algorithm (AAA)) over a homogeneous phantom, tailoring the depth inside the phantom and its electronic density. An application was developed to convert a dose plane (in DICOM format) to Varian's custom format for Portal Dosimetry. The proposed method was used for the verification of test and clinical fields for the three qualities used in our institution for IMRT: 6X, 6FFF and 10FFF. The method developed yielded a positive verification (more than 95% of the points pass a 2%/2 mm gamma) for both the clinical and test fields, and was also capable of "predicting" static and wedged fields. A workflow for the verification of FFF fields was thus developed. The method relies on the clinical algorithm used for dose calculation, is able to verify the FFF modes, and is also useful for machine quality assurance. The procedure described does not require new hardware and could be used as a verification of Varian's Portal Dose Image Prediction. PACS number(s): 87.53.Kn, 87.55.T‐, 87.56.bd, 87.59.‐e PMID:27455487
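
    The acceptance criterion quoted above is a gamma analysis at 2%/2 mm. The sketch below computes a naive global gamma index on 1D profiles to make the metric concrete; clinical Portal Dosimetry comparisons are 2D and use optimized search strategies.

```python
import numpy as np

def gamma_1d(ref, meas, spacing_mm, dd=0.02, dta_mm=2.0):
    """Naive 1D gamma index (global 2%/2 mm by default).

    ref/meas are dose profiles on the same grid; this brute-force sketch
    only demonstrates the metric itself.
    """
    x = np.arange(len(ref)) * spacing_mm
    d_max = ref.max()
    gammas = np.empty(len(meas))
    for i in range(len(meas)):
        dist2 = ((x - x[i]) / dta_mm) ** 2          # distance-to-agreement
        dose2 = ((ref - meas[i]) / (dd * d_max)) ** 2  # dose difference
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

ref = np.exp(-((np.arange(100) - 50) / 20.0) ** 2)  # synthetic profile
meas = np.roll(ref, 1) * 1.01                       # 1 mm shift, 1% scaling
g = gamma_1d(ref, meas, spacing_mm=1.0)
print(f"pass rate: {100 * (g <= 1).mean():.1f}%")   # points with gamma <= 1
```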

  14. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle: the software supports project design and registration, empowers users to do all-digital project management, and finally provides the means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics, and perform future re-analysis on high-performance computing systems via coupling with workflow management systems. The integration of project management, data management and workflow resources in one place presents clear advantages over existing solutions.

  15. Development of the UTAUT2 model to measure the acceptance of medical laboratory portals by patients in Shiraz.

    PubMed

    Ravangard, Ramin; Kazemi, Zhila; Abbasali, Somaye Zaker; Sharifian, Roxana; Monem, Hossein

    2017-02-01

    One of the main stages in achieving success is the acceptance of a technology by its users; hence, identifying the factors that contribute to successful acceptance of information technology is vital. One such factor is usability. This study investigated software usability within the "Unified Theory of Acceptance and Use of Technology 2 (UTAUT2)" model as applied to patients' use of medical diagnosis laboratories' electronic portals. This cross-sectional study was carried out on 170 patients in 2015. A 27-item questionnaire adopted from previous research and the Usability Evaluation questionnaire were used for data collection. Data were analyzed using Structural Equation Modeling (SEM) with a Partial Least Squares approach, in SPSS 20.0 and Smart-PLS V3.0. The results showed that the construct of intention to use had significant associations with price value (t-value=2.77), hedonic motivation (t-value=4.46), habit (t-value=1.99) and usability (t-value=5.2), and that the construct of usage behavior was significantly associated with usability (t-value=3.45) and intention to use (t-value=2.03). Considering these results, the following recommendations can be made to increase patients' use of such portals: informing patients about the advantages of using the portals, designing portals in a simple and understandable form, increasing the portals' attractiveness, etc.

  16. VDJServer: A Cloud-Based Analysis Portal and Data Commons for Immune Repertoire Sequences and Rearrangements.

    PubMed

    Christley, Scott; Scarborough, Walter; Salinas, Eddie; Rounds, William H; Toby, Inimary T; Fonner, John M; Levin, Mikhail K; Kim, Min; Mock, Stephen A; Jordan, Christopher; Ostmeyer, Jared; Buntzman, Adam; Rubelt, Florian; Davila, Marco L; Monson, Nancy L; Scheuermann, Richard H; Cowell, Lindsay G

    2018-01-01

    Recent technological advances in immune repertoire sequencing have created tremendous potential for advancing our understanding of adaptive immune response dynamics in various states of health and disease. Immune repertoire sequencing produces large, highly complex data sets, however, which require specialized methods and software tools for their effective analysis and interpretation. VDJServer is a cloud-based analysis portal for immune repertoire sequence data that provides access to a suite of tools for a complete analysis workflow, including modules for preprocessing and quality control of sequence reads, V(D)J gene segment assignment, repertoire characterization, and repertoire comparison. VDJServer also provides sophisticated visualizations for exploratory analysis. It is accessible through a standard web browser via a graphical user interface designed for use by immunologists, clinicians, and bioinformatics researchers. VDJServer provides a data commons for public sharing of repertoire sequencing data, as well as private sharing of data between users. We describe the main functionality and architecture of VDJServer and demonstrate its capabilities with use cases from cancer immunology and autoimmunity. VDJServer provides a complete analysis suite for human and mouse T-cell and B-cell receptor repertoire sequencing data. The combination of its user-friendly interface and high-performance computing allows large immune repertoire sequencing projects to be analyzed with no programming or software installation required. VDJServer is a web-accessible cloud platform that provides, through a graphical user interface, access to a data management infrastructure, a collection of analysis tools covering all steps in an analysis, and an infrastructure for sharing data along with workflows, results, and computational provenance. VDJServer is a free, publicly available, open-source licensed resource.

  17. The PO.DAAC Portal and its use of the Drupal Framework

    NASA Astrophysics Data System (ADS)

    Alarcon, C.; Huang, T.; Bingham, A.; Cosic, S.

    2011-12-01

    The Physical Oceanography Distributed Active Archive Center portal (http://podaac.jpl.nasa.gov) is the primary interface for discovering and accessing oceanographic datasets collected from the vantage point of space. In addition, it provides information about NASA's satellite missions and operational activities at the data center. Recently the portal underwent a major redesign and was redeployed on the Drupal framework, chosen for its flexibility, open source community, and modular infrastructure. The portal features efficient content addition and management, mailing lists, forums, role-based access control, and a faceted dataset browse capability. The dataset browsing was built as a custom Drupal module and integrates with a Solr search engine.
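
    A faceted browse module of this kind typically issues faceted queries against the Solr HTTP API. The sketch below shows such a query; the host, core and field names are hypothetical, not PO.DAAC's actual index.

```python
import requests

# The kind of faceted query a custom portal module sends to Solr.
# Host, core and field names below are hypothetical.
resp = requests.get(
    "http://localhost:8983/solr/datasets/select",
    params={
        "q": "sea surface temperature",
        "facet": "true",
        "facet.field": ["processing_level", "sensor"],  # repeated params
        "wt": "json",
    },
)
resp.raise_for_status()
counts = resp.json()["facet_counts"]["facet_fields"]
print(counts)   # e.g. {'processing_level': ['L2', 120, 'L4', 35], ...}
```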

  18. Exploring the Cost and Functionality of MEDCOM Web Services

    DTIC Science & Technology

    2005-10-24

    This report surveys MEDCOM intranet/Internet services, including the backend database software (e.g., Oracle, Microsoft SQL Server) supporting their content. The Department of Defense (DoD) service branches funded and deployed an Internet portal, TRICARE Online, to serve as an information conduit; unlike a public website, the information contained on an intranet is traditionally limited to the members of the hosting command.

  20. Detection with Enhanced Energy Windowing Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bass, David A.; Enders, Alexander L.

    2016-12-01

    This document reviews the progress of Phase I of the Detection with Enhanced Energy Windowing (DEEW) project. The DEEW project implements software incorporating an algorithm that reviews data generated by radiation portal monitors and applies advanced and novel techniques for detecting radiological and fissile material while not alarming on Naturally Occurring Radioactive Material (NORM). Independent testing indicated that the Enhanced Energy Windowing algorithm showed promise at reducing the probability of alarm in the stream of commerce compared with existing and other developmental algorithms, while still maintaining adequate sensitivity to threats. This document contains a brief description of the project, instructions for setting up and running the applications, and guidance to make reviewing the output files and source code easier.
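
    Energy windowing, in general, compares counts in selected gamma-energy windows against the pattern expected for NORM rather than relying on gross counts. The sketch below is a schematic of that general idea with invented windows and thresholds; the enhanced DEEW algorithm itself is not disclosed in this summary.

```python
# Schematic of the generic energy-windowing idea: alarm when the ratio of
# counts between gamma-energy windows deviates strongly from the ratio
# expected for naturally occurring radioactive material. All window
# boundaries and thresholds below are invented for illustration.

def window_ratio_alarm(spectrum_counts, low_window, high_window,
                       norm_ratio, threshold=3.0):
    low = sum(spectrum_counts[low_window[0]:low_window[1]])
    high = sum(spectrum_counts[high_window[0]:high_window[1]])
    if high == 0:
        return False
    ratio = low / high
    return abs(ratio - norm_ratio) > threshold

counts = [5, 9, 40, 120, 80, 30, 12, 6, 3, 1]   # toy 10-bin spectrum
print(window_ratio_alarm(counts, (2, 5), (5, 9), norm_ratio=2.0))
```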

  1. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    NASA Astrophysics Data System (ADS)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system for conducting SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures it according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility and made available to users via a web portal that facilitates highly parallelized analysis.

  2. Actionable, long-term stable and semantic web compatible identifiers for access to biological collection objects

    PubMed Central

    Hyam, Roger; Hagedorn, Gregor; Chagnoux, Simon; Röpert, Dominik; Casino, Ana; Droege, Gabi; Glöckler, Falko; Gödderz, Karsten; Groom, Quentin; Hoffmann, Jana; Holleman, Ayco; Kempa, Matúš; Koivula, Hanna; Marhold, Karol; Nicolson, Nicky; Smith, Vincent S.; Triebel, Dagmar

    2017-01-01

    With biodiversity research activities being increasingly shifted to the web, the need for a system of persistent and stable identifiers for physical collection objects becomes increasingly pressing. The Consortium of European Taxonomic Facilities agreed on a common system of HTTP-URI-based stable identifiers which is now rolled out to its member organizations. The system follows Linked Open Data principles and implements redirection mechanisms to human-readable and machine-readable representations of specimens facilitating seamless integration into the growing semantic web. The implementation of stable identifiers across collection organizations is supported with open source provider software scripts, best practices documentations and recommendations for RDF metadata elements facilitating harmonized access to collection information in web portals. Database URL: http://cetaf.org/cetaf-stable-identifiers PMID:28365724
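
    The redirection mechanism described rests on standard HTTP content negotiation: the same stable identifier resolves to a human-readable page or an RDF document depending on the Accept header. A minimal sketch of that interaction, using a made-up specimen URI rather than a real CETAF identifier:

    ```python
    import urllib.request

    # Hypothetical stable identifier for a collection object; real CETAF
    # identifiers are HTTP URIs minted by the holding institution.
    uri = "http://example.org/specimen/ABC123"

    for accept in ("text/html", "application/rdf+xml"):
        req = urllib.request.Request(uri, headers={"Accept": accept})
        # urllib follows the provider's redirect to the representation.
        with urllib.request.urlopen(req) as resp:
            print(accept, "->", resp.geturl(), resp.headers.get("Content-Type"))
    ```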

  3. Enabling a systems biology knowledgebase with gaggle and firegoose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baliga, Nitin S.

    The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we made substantial progress on the development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save them to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an OpenCPU server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the OpenCPU server. The cloud-based framework facilitates collaboration between researchers from multiple organizations. We have made a number of enhancements to the cmonkey2 application to enable and improve integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.
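
    What makes the OpenCPU piece work is that it exposes R functions over plain HTTP, so a browser page (or a goose plugin) can run server-side R without any local installation. A minimal sketch of the call pattern, assuming a local OpenCPU server with the stats package; the host and port are illustrative:

    ```python
    import urllib.parse
    import urllib.request

    # POSTing to /ocpu/library/{package}/R/{function}/json executes the R
    # function and returns its value as JSON (standard OpenCPU pattern).
    host = "http://localhost:5656"  # illustrative single-user OpenCPU server
    url = f"{host}/ocpu/library/stats/R/rnorm/json"
    data = urllib.parse.urlencode({"n": 5}).encode()

    with urllib.request.urlopen(url, data=data) as resp:
        print(resp.read().decode())  # e.g. [0.12, -1.04, ...]
    ```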

  4. Best Practices for Making Scientific Data Discoverable and Accessible through Integrated, Standards-Based Data Portals

    NASA Astrophysics Data System (ADS)

    Lucido, J. M.

    2013-12-01

    Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly-available data to address broadly-scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance and relating data sets to one another. Metadata also enable thematic, geospatial and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, chemical, and biological constituents as they allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating native data exposed via web services on-the-fly into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), which is a system that provides users with data access, subsetting and geospatial processing of large and complex climate and land use data, exemplifies the application of International Standards Organization (ISO) metadata records to enhance data discovery for both human and machine interpretation. Lastly, the Water Quality Portal (http://www.waterqualitydata.us/) achieves interoperable dissemination of water quality data by referencing a vocabulary service for mapping constituents and methods between the USGS and USEPA. The NGWMN Data Portal, Geo Data Portal and Water Quality Portal are three examples of best practices when implementing data portals that provide distributed scientific data in an integrated, standards-based approach.
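
    As a concrete example of this standards-based access, the Water Quality Portal exposes its harmonized USGS/USEPA data through a REST interface. The sketch below shows a typical query; the endpoint and parameter names follow the portal's documented conventions as best I know them and should be verified against the current documentation at http://www.waterqualitydata.us/ before use:

    ```python
    import csv
    import io
    import urllib.parse
    import urllib.request

    # Query the Water Quality Portal for pH results in Wisconsin (US:55).
    base = "https://www.waterqualitydata.us/data/Result/search"
    params = {
        "statecode": "US:55",
        "characteristicName": "pH",
        "startDateLo": "01-01-2020",  # MM-DD-YYYY, per portal convention
        "mimeType": "csv",
    }
    url = f"{base}?{urllib.parse.urlencode(params)}"

    with urllib.request.urlopen(url) as resp:
        reader = csv.reader(io.TextIOWrapper(resp, encoding="utf-8"))
        header = next(reader)
        print(len(header), "columns, e.g.:", header[:5])
    ```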

  5. DOIDB: Reusing DataCite's search software as metadata portal for GFZ Data Services

    NASA Astrophysics Data System (ADS)

    Elger, K.; Ulbricht, D.; Bertelmann, R.

    2016-12-01

    GFZ Data Services is the central service point for the publication of research data at the Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences (GFZ). It provides data publishing services to scientists of GFZ, associated projects, and associated institutions. The publishing services aim to make research data and physical samples visible and citable by assigning persistent identifiers (DOI, IGSN) and by complementing existing IT infrastructure. To integrate several research domains, a modular software stack made of free software components has been created to manage data and metadata as well as register persistent identifiers [1]. The pivotal component for the registration of DOIs is the DOIDB. It has been derived from three software components provided by DataCite [2] that moderate the registration of DOIs and the deposition of metadata, allow the dissemination of metadata, and provide a user interface to navigate and discover datasets. The DOIDB acts as a proxy to the DataCite infrastructure and, in addition to the DataCite metadata schema, allows metadata to be deposited and disseminated following the ISO19139 and NASA GCMD DIF schemas. The search component has been modified to meet the requirements of a geosciences metadata portal. In particular, it has been altered to make use of Apache SOLR's capability to index and query spatial coordinates. Furthermore, the user interface has been adjusted to provide a first impression of the data by showing a map, summary information, and subjects. DOIDB and its components are available on GitHub [3]. We present a software solution for the registration of DOIs that allows the integration of existing data systems, keeps track of registered DOIs, and provides a metadata portal to discover datasets [4].
    [1] Ulbricht, D.; Elger, K.; Bertelmann, R.; Klump, J. panMetaDocs, eSciDoc, and DOIDB—An Infrastructure for the Curation and Publication of File-Based Datasets for GFZ Data Services. ISPRS Int. J. Geo-Inf. 2016, 5, 25. http://doi.org/10.3390/ijgi5030025
    [2] https://github.com/datacite
    [3] https://github.com/ulbricht/search/tree/doidb, https://github.com/ulbricht/mds/tree/doidb, https://github.com/ulbricht/oaip/tree/doidb
    [4] http://doidb.wdc-terra.org
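
    Because DOIDB mirrors DataCite's MDS components, DOI registration through it should follow the familiar two-step MDS pattern: deposit the metadata XML, then bind the DOI to a landing-page URL. A sketch of that pattern against a hypothetical DOIDB/MDS endpoint; host, credentials, and DOI are placeholders, not real GFZ values:

    ```python
    import base64
    import urllib.request

    # Hypothetical DOIDB/MDS endpoint and credentials; DOIDB proxies
    # these calls on to the DataCite infrastructure.
    HOST = "https://doidb.example.org/mds"
    AUTH = base64.b64encode(b"DATACENTRE.USER:secret").decode()

    def mds(method, path, body, content_type):
        req = urllib.request.Request(f"{HOST}/{path}", data=body, method=method)
        req.add_header("Authorization", f"Basic {AUTH}")
        req.add_header("Content-Type", content_type)
        with urllib.request.urlopen(req) as resp:
            return resp.status, resp.read().decode()

    # Step 1: deposit metadata (a real deposit needs a full DataCite record).
    metadata = b"""<?xml version="1.0" encoding="UTF-8"?>
    <resource xmlns="http://datacite.org/schema/kernel-4">
      <!-- truncated placeholder metadata -->
    </resource>"""
    print(mds("POST", "metadata", metadata, "application/xml;charset=UTF-8"))

    # Step 2: bind the DOI to the dataset's landing page.
    body = b"doi=10.5880/EXAMPLE.2016.001\nurl=https://dataservices.example.org/dataset/1"
    print(mds("PUT", "doi/10.5880/EXAMPLE.2016.001", body, "text/plain;charset=UTF-8"))
    ```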

  6. Radionuclide identification algorithm for organic scintillator-based radiation portal monitor

    NASA Astrophysics Data System (ADS)

    Paff, Marc Gerrit; Di Fulvio, Angela; Clarke, Shaun D.; Pozzi, Sara A.

    2017-03-01

    We have developed an algorithm for on-the-fly radionuclide identification for radiation portal monitors using organic scintillation detectors. The algorithm was demonstrated on experimental data acquired with our pedestrian portal monitor on moving special nuclear material and industrial sources at a purpose-built radiation portal monitor testing facility. The experimental data also included common medical isotopes. The algorithm takes the power spectral density of the cumulative distribution function of the measured pulse height distributions and matches these to reference spectra using a spectral angle mapper. F-score analysis showed that the new algorithm exhibited significant performance improvements over previously implemented radionuclide identification algorithms for organic scintillators. Reliable on-the-fly radionuclide identification would help portal monitor operators more effectively screen out the hundreds of thousands of nuisance alarms they encounter annually due to recent nuclear-medicine patients and cargo containing naturally occurring radioactive material. Portal monitor operators could instead focus on the rare but potentially high-impact incidents of nuclear and radiological material smuggling for which portal monitors are intended.
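
    The matching step reduces to two simple operations: a power spectral density computed from the cumulative pulse-height distribution, and a spectral angle between measured and reference feature vectors. A numpy sketch of that core follows; the binning, normalization, and decision thresholds of the actual system are not specified in the abstract, and the reference spectra here are synthetic toys:

    ```python
    import numpy as np

    def psd_of_cdf(pulse_height_hist):
        """Power spectral density of the cumulative distribution function
        of a measured pulse height distribution."""
        cdf = np.cumsum(pulse_height_hist) / np.sum(pulse_height_hist)
        return np.abs(np.fft.rfft(cdf)) ** 2

    def spectral_angle(a, b):
        """Spectral angle mapper: angle (radians) between feature vectors.
        A smaller angle means a closer match."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    # Toy example: identify a measurement against two synthetic references.
    rng = np.random.default_rng(0)
    references = {
        "Cs-137-like": rng.gamma(2.0, 1.0, 256),
        "Co-60-like": rng.gamma(5.0, 1.0, 256),
    }
    measured = references["Cs-137-like"] + rng.normal(0, 0.05, 256)

    meas_feat = psd_of_cdf(measured)
    best = min(references,
               key=lambda k: spectral_angle(meas_feat, psd_of_cdf(references[k])))
    print("best match:", best)
    ```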

  7. Molecular property diagnostic suite (MPDS): Development of disease-specific open source web portals for drug discovery.

    PubMed

    Nagamani, S; Gaur, A S; Tanneeru, K; Muneeswaran, G; Madugula, S S; Consortium, Mpds; Druzhilovskiy, D; Poroikov, V V; Sastry, G N

    2017-11-01

    Molecular property diagnostic suite (MPDS) is a Galaxy-based open source drug discovery and development platform. MPDS web portals are designed for several diseases, such as tuberculosis, diabetes mellitus, and other metabolic disorders, specifically aimed to evaluate and estimate the drug-likeness of a given molecule. MPDS consists of three modules, namely data libraries, data processing, and data analysis tools which are configured and interconnected to assist drug discovery for specific diseases. The data library module encompasses vast information on chemical space, wherein the MPDS compound library comprises 110.31 million unique molecules generated from public domain databases. Every molecule is assigned with a unique ID and card, which provides complete information for the molecule. Some of the modules in the MPDS are specific to the diseases, while others are non-specific. Importantly, a suitably altered protocol can be effectively generated for another disease-specific MPDS web portal by modifying some of the modules. Thus, the MPDS suite of web portals shows great promise to emerge as disease-specific portals of great value, integrating chemoinformatics, bioinformatics, molecular modelling, and structure- and analogue-based drug discovery approaches.

  8. Portal for Families Overcoming Neurodevelopmental Disorders (PFOND): Implementation of a Software Framework for Facilitated Community Website Creation by Nontechnical Volunteers.

    PubMed

    Ye, Xin Cynthia; Ng, Isaiah; Seid-Karbasi, Puya; Imam, Tuhina; Lee, Cheryl E; Chen, Shirley Yu; Herman, Adam; Sharma, Balraj; Johal, Gurinder; Gu, Bobby; Wasserman, Wyeth W

    2013-08-06

    The Portal for Families Overcoming Neurodevelopmental Disorders (PFOND) provides a structured Internet interface for the sharing of information with individuals struggling with the consequences of rare developmental disorders. Large disease-impacted communities can support fundraising organizations that disseminate Web-based information through elegant websites run by professional staff. Such quality resources for families challenged by rare disorders are infrequently produced and, when available, are often dependent upon the continued efforts of a single individual. The project endeavors to create an intuitive Web-based software system that allows a volunteer with limited technical computer skills to produce a useful rare disease website in a short time period. Such a system should provide access to emerging news and research findings, facilitate community participation, present summary information about the disorder, and allow for transient management by volunteers who are likely to change periodically. The prototype portal was implemented using the WordPress software system with both existing and customized supplementary plug-in software modules. Gamification scoring features were implemented in a module, allowing editors to measure progress. The system was installed on a Linux-based computer server, accessible across the Internet through standard Web browsers. A prototype PFOND system was implemented and tested. The prototype system features a structured organization with distinct partitions for background information, recent publications, and community discussions. The software design allows volunteer editors to create a themed website, implement a limited set of topic pages, and connect the software to dynamic RSS feeds providing information about recent news or advances. The prototype was assessed by a fraction of the disease sites developed (8 out of 27), including Aarskog-Scott syndrome, Aniridia, Adams-Oliver syndrome, Cat Eye syndrome, Kabuki syndrome, Leigh syndrome, Peters anomaly, and Rothmund-Thomson syndrome. The editor progress score was used to measure performance for a portion of sites. The PFOND system provides a convenient and structured Internet resource for the facilitated creation of information resources for families confronted by rare disorders. The system empowers volunteers to participate in the creation of quality content, while allowing for the inevitable turnover of contributors over time. The next phase of PFOND development will focus on volunteer participation in system development and community engagement.
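
    The "dynamic RSS feeds" piece is straightforward to reproduce: the site pulls a feed of recent publications or news and renders the newest entries. A sketch using the feedparser library; the feed URL is a placeholder, not an actual PFOND resource:

    ```python
    import feedparser  # pip install feedparser

    # Placeholder feed URL; a PFOND site would point this at, e.g., a
    # disorder-specific literature or news feed.
    FEED_URL = "https://example.org/rare-disorder-news.rss"

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries[:5]:
        # Render the newest items as a simple headline list.
        print(entry.get("published", "n.d."), "-", entry.title, "->", entry.link)
    ```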

  9. Portal for Families Overcoming Neurodevelopmental Disorders (PFOND): Implementation of a Software Framework for Facilitated Community Website Creation by Nontechnical Volunteers

    PubMed Central

    Imam, Tuhina; Lee, Cheryl E; Chen, Shirley Yu; Herman, Adam; Sharma, Balraj; Johal, Gurinder; Gu, Bobby

    2013-01-01

    Background The Portal for Families Overcoming Neurodevelopmental Disorders (PFOND) provides a structured Internet interface for the sharing of information with individuals struggling with the consequences of rare developmental disorders. Large disease-impacted communities can support fundraising organizations that disseminate Web-based information through elegant websites run by professional staff. Such quality resources for families challenged by rare disorders are infrequently produced and, when available, are often dependent upon the continued efforts of a single individual. Objective The project endeavors to create an intuitive Web-based software system that allows a volunteer with limited technical computer skills to produce a useful rare disease website in a short time period. Such a system should provide access to emerging news and research findings, facilitate community participation, present summary information about the disorder, and allow for transient management by volunteers who are likely to change periodically. Methods The prototype portal was implemented using the WordPress software system with both existing and customized supplementary plug-in software modules. Gamification scoring features were implemented in a module, allowing editors to measure progress. The system was installed on a Linux-based computer server, accessible across the Internet through standard Web browsers. Results A prototype PFOND system was implemented and tested. The prototype system features a structured organization with distinct partitions for background information, recent publications, and community discussions. The software design allows volunteer editors to create a themed website, implement a limited set of topic pages, and connect the software to dynamic RSS feeds providing information about recent news or advances. The prototype was assessed by a fraction of the disease sites developed (8 out of 27), including Aarskog-Scott syndrome, Aniridia, Adams-Oliver syndrome, Cat Eye syndrome, Kabuki syndrome, Leigh syndrome, Peters anomaly, and Rothmund-Thomson syndrome. The editor progress score was used to measure performance for a portion of sites. Conclusions The PFOND system provides a convenient and structured Internet resource for the facilitated creation of information resources for families confronted by rare disorders. The system empowers volunteers to participate in the creation of quality content, while allowing for the inevitable turnover of contributors over time. The next phase of PFOND development will focus on volunteer participation in system development and community engagement. PMID:23920006

  10. PREP: Portal for Readiness Exercises & Planning v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noel, Todd; Le, Tam; McNeil, Carrie

    2016-10-28

    The software includes a web-based template for recording actions taken during emergency preparedness exercises and planning workshops. In addition, a virtual outbreak prevention simulation exercise is also included. Both tools interact with a server which records user decisions and communications.

  11. Clinical Trial Electronic Portals for Expedited Safety Reporting: Recommendations from the Clinical Trials Transformation Initiative Investigational New Drug Safety Advancement Project.

    PubMed

    Perez, Raymond P; Finnigan, Shanda; Patel, Krupa; Whitney, Shanell; Forrest, Annemarie

    2016-12-15

    Use of electronic clinical trial portals has increased in recent years to assist with sponsor-investigator communication, safety reporting, and clinical trial management. Electronic portals can help reduce time and costs associated with processing paperwork and add security measures; however, there is a lack of information on clinical trial investigative staff's perceived challenges and benefits of using portals. The Clinical Trials Transformation Initiative (CTTI) sought to (1) identify challenges to investigator receipt and management of investigational new drug (IND) safety reports at oncologic investigative sites and coordinating centers and (2) facilitate adoption of best practices for communicating and managing IND safety reports using electronic portals. CTTI, a public-private partnership to improve the conduct of clinical trials, distributed surveys and conducted interviews in an opinion-gathering effort to record investigator and research staff views on electronic portals in the context of the new safety reporting requirements described in the US Food and Drug Administration's final rule (Code of Federal Regulations Title 21 Section 312). The project focused on receipt, management, and review of safety reports as opposed to the reporting of adverse events. The top challenge investigators and staff identified in using individual sponsor portals was remembering several complex individual passwords to access each site. Also, certain tasks are time-consuming (eg, downloading reports) due to slow sites or difficulties associated with particular operating systems or software. To improve user experiences, respondents suggested that portals function independently of browsers and operating systems, have intuitive interfaces with easy navigation, and incorporate additional features that would allow users to filter, search, and batch safety reports. Results indicate that an ideal system for sharing expedited IND safety information is through a central portal used by all sponsors. Until this is feasible, electronic reporting portals should at least have consistent functionality. CTTI has issued recommendations to improve the quality and use of electronic portals. ©Raymond P Perez, Shanda Finnigan, Krupa Patel, Shanell Whitney, Annemarie Forrest. Originally published in JMIR Cancer (http://cancer.jmir.org), 15.12.2016.

  12. Global Ocean Currents Database

    NASA Astrophysics Data System (ADS)

    Boyer, T.; Sun, L.

    2016-02-01

    NOAA's National Centers for Environmental Information has released an ocean currents database portal that aims 1) to integrate global ocean currents observations from a variety of instruments with different resolutions, accuracies, and responses to spatial and temporal variability into a uniform network common data form (NetCDF) format and 2) to provide dedicated online data discovery and access to NCEI-hosted and distributed data sources for ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform type and by spatial/temporal ranges of interest. The dedicated web application is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), which let GOCD users work with data files in their favorite analysis and visualization client software without downloading them to their local machines. The potential users of the ocean currents database include, but are not limited to, 1) ocean modelers for model skill assessments, 2) scientists and researchers studying the impact of ocean circulation on climate variability, 3) the ocean shipping industry for safe navigation and finding optimal routes for ship fuel efficiency, 4) ocean resources managers planning optimal sites for waste and sewage dumping and for renewable hydro-kinetic energy, and 5) state and federal governments needing historical (analyzed) ocean circulation as an aid for search and rescue.
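
    The practical payoff of OPeNDAP support is server-side subsetting: client software opens the remote NetCDF file by URL and pulls only the slices it needs. A sketch using the netCDF4 library; the dataset URL, variable name, and dimensionality are illustrative placeholders, not actual GOCD paths:

    ```python
    from netCDF4 import Dataset  # requires netCDF4 built with OPeNDAP support

    # Placeholder OPeNDAP URL; real GOCD files are served through the
    # portal's THREDDS/OPeNDAP catalogs.
    url = "https://example.noaa.gov/thredds/dodsC/gocd/sample_currents.nc"

    with Dataset(url) as ds:
        u = ds.variables["u"]      # assumed eastward current component,
        print(u.dimensions, u.shape)  # assumed (time, depth, lat, lon)
        surface = u[0, 0, :, :]    # only this slice crosses the network
        print("mean surface u:", surface.mean())
    ```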

  13. International Portal

    EIA Publications

    The International Energy Portal includes a powerful data browser that provides country-level energy data; many countries have at least 30 years of historical data. The data browser provides users the ability to view and download complete datasets for consumption, production, trade, reserves, and carbon dioxide emissions for different fuels and energy sources.

  14. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

    The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented and produced advanced analysis tools and software packages. A single seismic data portal provides a single access point and overview for European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others:
    - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from more than 53 observatories. This data is continuously monitored, quality controlled, and archived in the European Integrated Distributed waveform Archive (EIDA).
    - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analysis and the actual software are included in the database.
    - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000 - 1963, M ≥ 5.8), including analysis tools.
    - Data from three one-year OBS deployments at three sites (Atlantic, Ionian and Ligurian Sea) within the general SEED format, thus creating the core integrated database for ocean-, sea-, and land-based seismological observatories.
    Tools to facilitate analysis and data mining of the RI datasets are:
    - A comprehensive set of European seismological velocity reference models, including a standardized model description with several visualisation tools, currently being adapted on a global scale.
    - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast testing and model validation approach, and the core hazard portal developed with the same technologies as the NERIES data portal.
    - Homogeneous shakemap estimation tools implemented at several large European observatories and a complementary new loss estimation software tool.
    - A comprehensive set of new techniques for geotechnical site characterization, with relevant software packages documented and maintained (www.geopsy.org).
    - A set of software packages for data mining, data reduction, data exchange and information management in seismology, serving as research and observatory analysis tools.
    NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010 - 2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES further provided the proof of concept for the ESFRI2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010 - 2014) is also funded by the EC.

  15. Personal Alarm System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-04-17

    Software that runs on smartphones and desktop web browsers and notifies border officials of radiation alarms. It displays images and data associated with an alarm and provides a variety of reports. DOE had a need for discreet notification; PAS replaces the lights and sounds of a Radiation Portal Monitor.

  16. The CPEX Data Portal: Bringing Together Different Types of Data for Different Types of Users

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Li, P.; Vu, Q. A.; Hristova-Veleva, S. M.; Turk, J.; Lambrigtsen, B.

    2017-12-01

    The NASA Convective Processes Experiment (CPEX) aircraft field campaign took place in the summer of 2017 in the North Atlantic / Caribbean Ocean region. During this campaign, the NASA DC-8 aircraft carried several instruments that took measurements with the goal of collecting data to help answer questions about convective storm initiation, organization, growth, and dissipation. To help researchers answer science questions about convective storms, the CPEX Data Portal (https://cpexportal.jpl.nasa.gov) was created to bring together relevant satellite and model data, along with aircraft data observed during the campaign. The CPEX Data Portal was designed for two major functions: 1) assist with mission planning by providing a near real-time snapshot of what was going on in the broader North Atlantic domain and 2) bring together different types of data after the aircraft flights had finished to allow researchers to dive deeper into the data. Both functions necessitated collecting a host of disparate data from different instrument types that inherently have differences in resolution, spatial and temporal domain, and quality. Additionally, users of this data portal had varying levels of experience with the different data types (e.g., some had used aircraft data before, but not satellite data). Users were also at different points in their careers: both students and seasoned researchers participated in the campaign and brought different understandings of the physical processes depicted in the portal's visualizations. The CPEX Data Portal team used the existing JPL Tropical Cyclone Information System's near real-time data portal software package to launch a campaign-specific portal to host data during and after the CPEX campaign. This web-based portal includes the ability to visualize pre-generated images of physical quantities from satellites, models, and aircraft instruments, and brings them together in a common virtual globe for given spatial and temporal criteria. Users can also utilize on-line analysis tools to further interrogate the data. In this talk, we will describe how the CPEX Data Portal was able to curate campaign-relevant data, display different types of data in different ways, and make the data digestible and informative for different types of Earth science data user communities.

  17. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using "service casts" and "interest casts" (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH's Mining Workflow Composer and the open-source Active BPEL engine, and JPL's SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicate ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the "sociological" problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  18. [A computerized system for the management of letters of authorization for access to sensitive data in a research and teaching hospital].

    PubMed

    Bodina, Annalisa; Brizzolara, Antonella; Vadruccio, Gianluca; Castaldi, Silvana

    2012-01-01

    This paper describes the experience of a hospital that introduced a system for the computerized management of letters of authorization allowing healthcare workers to access sensitive health data, through the use of open source software. A new corporate intranet portal was created, with access given only to the privacy contacts of each operational unit of the hospital. Once a privacy contact has entered the relevant user authorizations, these must be approved first by the Directors of the respective operational units and finally by the privacy officer. The introduction of this system has allowed a systematic approach to the management of authorizations for access to health data by hospital staff, regular updating and monitoring of the authorizations, and the start of a process of digitalization of documents.

  19. Promoting scientific collaboration and research through integrated social networking capabilities within the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.

    2009-04-01

    LiDAR (Light Detection And Ranging) topography data offer earth scientists the opportunity to study the earth's surface at very high resolutions. As a result, the popularity of these data is growing dramatically. However, the management, distribution, and analysis of community LiDAR data sets pose a challenge due to their massive size (multi-billion-point, multi-terabyte). We have also found that many earth science users of these data sets lack the computing resources and expertise required to process these data. We have developed the OpenTopography Portal to democratize access to these large and computationally challenging data sets. The OpenTopography Portal uses cyberinfrastructure technology developed by the GEON project to provide access to LiDAR data in a variety of formats. LiDAR data products available range from simple Google Earth visualizations of LiDAR-derived hillshades to 1 km2 tiles of standard digital elevation model (DEM) products as well as LiDAR point cloud data and user-generated custom DEMs. We have found that the wide spectrum of LiDAR users spans variable scientific applications, computing resources, and technical experience, and thus requires a data system with multiple distribution mechanisms and platforms to serve a broader range of user communities. Because the volume of LiDAR topography data available is rapidly expanding, and data analysis techniques are evolving, there is a need for the user community to be able to communicate and interact to share knowledge and experiences. To address this need, the OpenTopography Portal enables social networking capabilities through a variety of collaboration tools, web 2.0 technologies and customized usage pattern tracking. Fundamentally, these tools offer users the ability to communicate, to access and share documents, to participate in discussions, and to keep up to date on upcoming events and emerging technologies. The OpenTopography Portal achieves the social networking capabilities by integrating various software technologies and platforms. These include the Expression Engine Content Management System (CMS) that comes with pre-packaged collaboration tools like blogs and wikis, the Gridsphere portal framework that contains the primary GEON LiDAR System portlet with user job monitoring capabilities, and a Java web-based discussion forum application (Jforums), all seamlessly integrated under one portal. The OpenTopography Portal also provides an integrated authentication mechanism between the various CMS collaboration tools and the core Gridsphere-based portlets. The integration of these various technologies allows for enhanced user interaction capabilities within the portal. By integrating popular collaboration tools like discussion forums and blogs we can promote conversation and openness among users. The ability to ask questions and share expertise in forum discussions allows users to easily find information and interact with users facing similar challenges. The OpenTopography Blog enables our domain experts to post ideas, news items, commentary, and other resources in order to foster discussion and information sharing. The content management capabilities of the portal allow for easy updates to information in the form of publications, documents, and news articles. Access to the most current information fosters better decision-making.
As has become the standard for web 2.0 technologies, the OpenTopography Portal is fully RSS enabled to allow users of the portal to keep track of news items, forum discussions, blog updates, and system outages. We are currently exploring how the information captured by the user and job monitoring components of the Gridsphere-based GEON LiDAR System can be harnessed to provide a recommender system that will help users to identify appropriate processing parameters and to locate related documents and data. By seamlessly integrating the various platforms and technologies under a single portal, we can take advantage of popular online collaboration tools that are either standalone or restricted to particular software platforms. The availability of these collaboration tools along with the data will foster more community interaction and increase the strength and vibrancy of the LiDAR topography user community.

  20. BioMart Central Portal: an open database network for the biological community

    PubMed Central

    Guberman, Jonathan M.; Ai, J.; Arnaiz, O.; Baran, Joachim; Blake, Andrew; Baldock, Richard; Chelala, Claude; Croft, David; Cros, Anthony; Cutts, Rosalind J.; Di Génova, A.; Forbes, Simon; Fujisawa, T.; Gadaleta, E.; Goodstein, D. M.; Gundem, Gunes; Haggarty, Bernard; Haider, Syed; Hall, Matthew; Harris, Todd; Haw, Robin; Hu, S.; Hubbard, Simon; Hsu, Jack; Iyer, Vivek; Jones, Philip; Katayama, Toshiaki; Kinsella, R.; Kong, Lei; Lawson, Daniel; Liang, Yong; Lopez-Bigas, Nuria; Luo, J.; Lush, Michael; Mason, Jeremy; Moreews, Francois; Ndegwa, Nelson; Oakley, Darren; Perez-Llamas, Christian; Primig, Michael; Rivkin, Elena; Rosanoff, S.; Shepherd, Rebecca; Simon, Reinhard; Skarnes, B.; Smedley, Damian; Sperling, Linda; Spooner, William; Stevenson, Peter; Stone, Kevin; Teague, J.; Wang, Jun; Wang, Jianxin; Whitty, Brett; Wong, D. T.; Wong-Erasmus, Marie; Yao, L.; Youens-Clark, Ken; Yung, Christina; Zhang, Junjun; Kasprzyk, Arek

    2011-01-01

    BioMart Central Portal is a first of its kind, community-driven effort to provide unified access to dozens of biological databases spanning genomics, proteomics, model organisms, cancer data, ontology information and more. Anybody can contribute an independently maintained resource to the Central Portal, allowing it to be exposed to and shared with the research community, and linking it with the other resources in the portal. Users can take advantage of the common interface to quickly utilize different sources without learning a new system for each. The system also simplifies cross-database searches that might otherwise require several complicated steps. Several integrated tools streamline common tasks, such as converting between ID formats and retrieving sequences. The combination of a wide variety of databases, an easy-to-use interface, robust programmatic access and the array of tools make Central Portal a one-stop shop for biological data querying. Here, we describe the structure of Central Portal and show example queries to demonstrate its capabilities. Database URL: http://central.biomart.org. PMID:21930507
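
    Programmatic access to a BioMart goes through its martservice endpoint, which accepts an XML query document and streams back tab-separated results. A sketch of that query pattern; the dataset, filter, and attribute names follow Ensembl BioMart conventions and are assumptions for illustration, and the endpoint shown is Ensembl's public mart rather than Central Portal itself:

    ```python
    import urllib.parse
    import urllib.request
    from itertools import islice

    # XML query per the BioMart martservice convention; the dataset,
    # filter, and attribute names here are illustrative.
    query = """<?xml version="1.0" encoding="UTF-8"?>
    <Query virtualSchemaName="default" formatter="TSV" header="0" uniqueRows="1" count="">
      <Dataset name="hsapiens_gene_ensembl" interface="default">
        <Filter name="chromosome_name" value="21"/>
        <Attribute name="ensembl_gene_id"/>
        <Attribute name="external_gene_name"/>
      </Dataset>
    </Query>"""

    url = ("http://www.ensembl.org/biomart/martservice?"
           + urllib.parse.urlencode({"query": query}))
    with urllib.request.urlopen(url) as resp:
        for line in islice(resp, 5):  # first few rows: gene_id <tab> name
            print(line.decode().rstrip())
    ```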

  1. BioMart Central Portal: an open database network for the biological community.

    PubMed

    Guberman, Jonathan M; Ai, J; Arnaiz, O; Baran, Joachim; Blake, Andrew; Baldock, Richard; Chelala, Claude; Croft, David; Cros, Anthony; Cutts, Rosalind J; Di Génova, A; Forbes, Simon; Fujisawa, T; Gadaleta, E; Goodstein, D M; Gundem, Gunes; Haggarty, Bernard; Haider, Syed; Hall, Matthew; Harris, Todd; Haw, Robin; Hu, S; Hubbard, Simon; Hsu, Jack; Iyer, Vivek; Jones, Philip; Katayama, Toshiaki; Kinsella, R; Kong, Lei; Lawson, Daniel; Liang, Yong; Lopez-Bigas, Nuria; Luo, J; Lush, Michael; Mason, Jeremy; Moreews, Francois; Ndegwa, Nelson; Oakley, Darren; Perez-Llamas, Christian; Primig, Michael; Rivkin, Elena; Rosanoff, S; Shepherd, Rebecca; Simon, Reinhard; Skarnes, B; Smedley, Damian; Sperling, Linda; Spooner, William; Stevenson, Peter; Stone, Kevin; Teague, J; Wang, Jun; Wang, Jianxin; Whitty, Brett; Wong, D T; Wong-Erasmus, Marie; Yao, L; Youens-Clark, Ken; Yung, Christina; Zhang, Junjun; Kasprzyk, Arek

    2011-01-01

    BioMart Central Portal is a first of its kind, community-driven effort to provide unified access to dozens of biological databases spanning genomics, proteomics, model organisms, cancer data, ontology information and more. Anybody can contribute an independently maintained resource to the Central Portal, allowing it to be exposed to and shared with the research community, and linking it with the other resources in the portal. Users can take advantage of the common interface to quickly utilize different sources without learning a new system for each. The system also simplifies cross-database searches that might otherwise require several complicated steps. Several integrated tools streamline common tasks, such as converting between ID formats and retrieving sequences. The combination of a wide variety of databases, an easy-to-use interface, robust programmatic access and the array of tools make Central Portal a one-stop shop for biological data querying. Here, we describe the structure of Central Portal and show example queries to demonstrate its capabilities.

  2. GPS Software Packages Deliver Positioning Solutions

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  3. Negotiable Technology Licensing | NREL

    Science.gov Websites

    NREL has more than 800 patented or patent-pending technologies available for licensing, offered to both small and large businesses with the technical and financial resources necessary to bring them to market. Since 2000, NREL has executed more than 250 licenses.

  4. GrameneMart: the biomart data portal for the gramene project

    USDA-ARS?s Scientific Manuscript database

    The Gramene project was an early adopter of the BioMart software, which remains an integral and well-used component of the Gramene web site. BioMart accessible data sets include plant gene annotations, plant variation catalogues, genetic markers, physical mapping entities, public DNA/mRNA sequences ...

  5. Neutron Scattering Home Page (Low-Graphics)

    Science.gov Websites

    A new portal for neutron scattering has just been established, with sections covering facilities, reference material, software, conferences, announcements, and mailing lists. We encourage everyone interested in neutron scattering to take full advantage of this resource.

  6. Ontology-Driven Discovery of Scientific Computational Entities

    ERIC Educational Resources Information Center

    Brazier, Pearl W.

    2010-01-01

    Many geoscientists use modern computational resources, such as software applications, Web services, scientific workflows and datasets that are readily available on the Internet, to support their research and many common tasks. These resources are often shared via human contact and sometimes stored in data portals; however, they are not necessarily…

  7. Viewing Files — EDRN Public Portal

    Cancer.gov

    In addition to standard HTML Web pages, our web site contains other file formats. You may need additional software or browser plug-ins to view some of the information available on our site. This document lists each format, along with links to the corresponding freely available plug-ins or viewers.

  8. 77 FR 42419 - Airworthiness Directives; Honeywell International, Inc. Global Navigation Satellite Sensor Units

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-19

    ... following methods: Federal eRulemaking Portal: Go to http://www.regulations.gov . Follow the instructions... software problem is due to a mathematical rounding error, which results in misleading information. At this... air commerce by prescribing regulations for practices, methods, and procedures the Administrator finds...

  9. Consortial IT Services: Collaborating To Reduce the Pain.

    ERIC Educational Resources Information Center

    Klonoski, Ed

    The Connecticut Distance Learning Consortium (CTDLC) provides its 32 members with Information Technologies (IT) services including a portal Web site, course management software, course hosting and development, faculty training, a help desk, online assessment, and a student financial aid database. These services are supplied to two- and four-year…

  10. First field trial of Virtual Network Operator oriented network on demand (NoD) service provisioning over software defined multi-vendor OTN networks

    NASA Astrophysics Data System (ADS)

    Li, Yajie; Zhao, Yongli; Zhang, Jie; Yu, Xiaosong; Chen, Haoran; Zhu, Ruijie; Zhou, Quanwei; Yu, Chenbei; Cui, Rui

    2017-01-01

    A Virtual Network Operator (VNO) is a provider and reseller of network services from other telecommunications suppliers. These network providers are categorized as virtual because they do not own the underlying telecommunication infrastructure. In terms of business operation, a VNO can provide customers with personalized services by leasing network infrastructure from traditional network providers. The unique business models of VNOs lead to the emergence of network on demand (NoD) services. Conventional network provisioning involves a series of manual operations and configurations, which is costly in time. Considering the advantages of Software Defined Networking (SDN), this paper proposes a novel NoD service provisioning solution to satisfy the private network needs of VNOs. The solution was first verified in a real software defined multi-domain optical network with multi-vendor OTN equipment. With the proposed solution, NoD services can be deployed via online web portals in near-real time. It reinvents the customer experience and redefines how network services are delivered to customers via an online self-service portal. Ultimately, this means a customer will be able to simply go online, click a few buttons, and have new services almost instantaneously.

  11. Clinical results of computerized tomography-based simulation with laser patient marking.

    PubMed

    Ragan, D P; Forman, J D; He, T; Mesina, C F

    1996-02-01

    The accuracy of a patient treatment portal marking device and computerized tomography (CT) simulation has been clinically tested. A CT-based simulator has been assembled based on a commercial CT scanner. This includes visualization software and a computer-controlled laser drawing device. This laser drawing device is used to transfer the setup, central axis, and/or radiation portals from the CT simulator to the patient for appropriate patient skin marking. A protocol for clinical testing is reported. Twenty-five prospectively, sequentially accessioned patients have been analyzed. The simulation process can be completed in an average time of 62 min. In many cases, the treatment portals can be designed and the patient marked in one session. Mechanical accuracy of the system was found to be within +/- 1 mm. The portal projection accuracy in clinical cases is observed to be better than +/- 1.2 mm. Operating costs are equivalent to the conventional simulation process it replaces. Computed tomography simulation is a clinically accurate substitute for conventional simulation when used with an appropriate patient marking system and digitally reconstructed radiographs. Personnel time spent in CT simulation is equivalent to time in conventional simulation.

  12. Motion estimation accuracy for visible-light/gamma-ray imaging fusion for portable portal monitoring

    NASA Astrophysics Data System (ADS)

    Karnowski, Thomas P.; Cunningham, Mark F.; Goddard, James S.; Cheriyadat, Anil M.; Hornback, Donald E.; Fabris, Lorenzo; Kerekes, Ryan A.; Ziock, Klaus-Peter; Gee, Timothy F.

    2010-01-01

    The use of radiation sensors as portal monitors is increasing due to heightened concerns over the smuggling of fissile material. Portable systems that can detect significant quantities of fissile material that might be present in vehicular traffic are of particular interest. We have constructed a prototype, rapid-deployment gamma-ray imaging portal monitor that uses machine vision and gamma-ray imaging to monitor multiple lanes of traffic. Vehicles are detected and tracked by using point detection and optical flow methods as implemented in the OpenCV software library. Points are clustered together, but imperfections in the detected points and tracks cause errors in the accuracy of the vehicle position estimates. The resulting errors cause a "blurring" effect in the gamma image of the vehicle. To minimize these errors, we have compared a variety of motion estimation techniques, including an estimate using the median of the clustered points, a "best-track" filtering algorithm, and a constant velocity motion estimation model. The accuracy of these methods is contrasted against a manually verified ground-truth measurement by quantifying the root-mean-square differences in the times at which the vehicles cross the gamma-ray image pixel boundaries.
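
    The point detection and optical flow stage described maps directly onto standard OpenCV calls: detect corner features, then track them frame to frame with pyramidal Lucas-Kanade. A minimal sketch of that loop; the video path and tuning parameters are illustrative, and the RST's clustering and track-filtering stages are omitted:

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("traffic.mp4")  # illustrative video source
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Detect corner features to track (Shi-Tomasi).
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)

    while True:
        ok, frame = cap.read()
        if not ok or pts is None or len(pts) == 0:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pyramidal Lucas-Kanade optical flow, previous to current frame.
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good_new = new_pts[status.flatten() == 1]
        good_old = pts[status.flatten() == 1]
        # Per-point displacement; a tracker would cluster these into vehicles.
        motion = (good_new - good_old).reshape(-1, 2)
        if len(motion):
            print("median dx/dy:", np.median(motion, axis=0))
        prev_gray, pts = gray, good_new.reshape(-1, 1, 2)

    cap.release()
    ```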

  13. The International Human Epigenome Consortium Data Portal.

    PubMed

    Bujold, David; Morais, David Anderson de Lima; Gauthier, Carol; Côté, Catherine; Caron, Maxime; Kwan, Tony; Chen, Kuang Chung; Laperle, Jonathan; Markovits, Alexei Nordell; Pastinen, Tomi; Caron, Bryan; Veilleux, Alain; Jacques, Pierre-Étienne; Bourque, Guillaume

    2016-11-23

    The International Human Epigenome Consortium (IHEC) coordinates the production of reference epigenome maps through the characterization of the regulome, methylome, and transcriptome from a wide range of tissues and cell types. To define conventions ensuring the compatibility of datasets and establish an infrastructure enabling data integration, analysis, and sharing, we developed the IHEC Data Portal (http://epigenomesportal.ca/ihec). The portal provides access to >7,000 reference epigenomic datasets, generated from >600 tissues, which have been contributed by seven international consortia: ENCODE, NIH Roadmap, CEEHRC, Blueprint, DEEP, AMED-CREST, and KNIH. The portal enhances the utility of these reference maps by facilitating the discovery, visualization, analysis, download, and sharing of epigenomics data. The IHEC Data Portal is the official source to navigate through IHEC datasets and represents a strategy for unifying the distributed data produced by international research consortia. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.

  14. Visualization of the Capitellum During Elbow Arthroscopy: A Comparison of 3 Portal Techniques.

    PubMed

    Trofa, David P; Gancarczyk, Stephanie M; Lombardi, Joseph M; Makhni, Eric C; Popkin, Charles A; Ahmad, Christopher S

    2017-06-01

    Capitellar osteochondritis dissecans (OCD) is a debilitating condition of unknown etiology for which various arthroscopic treatments are available. Prior data suggest that greater than 75% of the capitellum can be visualized arthroscopically through a dual lateral portal approach. However, there is no literature assessing arthroscopic visualization of the capitellum via alternative portals. To determine the percentage of capitellum visualized using the dual lateral, distal ulnar and soft spot, and posterolateral and soft spot portal configurations in a cadaver model. Descriptive laboratory study. Arthroscopy was performed on 12 fresh-frozen cadaver elbows, 4 for each of the following approaches: dual lateral, distal ulna, and posterolateral. Electrocautery was used to mark the most anterior, posterior, medial, and lateral points seen on the capitellum. The radiocapitellar joint was subsequently exposed through an extensile posterior dissection, and the surface anatomy was reconstructed using the Microscribe 3D digitizing system. Using Rhinoceros software, the percentage of capitellum surface area visualized by each approach was determined. The mean percentage of capitellum visualized for the dual lateral, distal ulna, and posterolateral approaches was approximately 68.8%, 66.3%, and 63.5%, respectively. There was no significant difference between the percentage of capitellum seen among approaches (P = .68). On average, 66.5% of the capitellum was visible through these 3 arthroscopic approaches to the elbow. Approximately 66.5% of the capitellum is visualized through the popularized posterior arthroscopic portals, with no significant differences found between the 3 investigated approaches. As determined in this cadaveric model investigation, each portal technique provides equivalent visualization for capitellar OCD pathology.

  15. Final Report: Non-Visible, Automated Target Acquisition and Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziock, Klaus-Peter; Fabris, Lorenzo; Goddard, James K.

    The Roadside Tracker (RST) represents a new approach to radiation portal monitors. It uses a combination of gamma-ray and visible-light imaging to localize gamma-ray radiation sources to individual vehicles in free-flowing, multi-lane traffic. Deployed as two trailers parked on either side of the roadway (Fig. 1), the RST scans passing traffic with two large gamma-ray imagers, one mounted in each trailer. The system compensates for vehicle motion through the imager's fields of view by using automated target acquisition and tracking (TAT) software applied to a stream of video images. Once a vehicle has left the field of view, the radiation image of that vehicle is analyzed for the presence of a source, and if one is found, an alarm is sounded. The gamma-ray image is presented to the operator together with the video image of the traffic stream when the vehicle was approximately closest to the system (Fig. 2). The offending vehicle is identified with a bounding box to distinguish it from other vehicles that might be present at the same time. The system was developed under a previous grant from the Department of Homeland Security's (DHS's) Domestic Nuclear Detection Office (DNDO). This report documents work performed with follow-on funding from DNDO to further advance the development of the RST. Specifically, the primary thrust was to extend the performance envelope of the system by replacing the visible-light video cameras used by the TAT software with sensors that would allow operation at night and during inclement weather. In particular, it was desired to allow operation after dark without requiring external lighting. As part of this work, the system software was also upgraded to allow the use of 64-bit computers, the current generation operating system (OS) and software development environment (Windows 7 vs. Windows XP, and current Visual Studio.Net), and improved software version control (Git vs. SourceSafe). With the upgraded performance allowed by new computers, and the additional memory available in a 64-bit OS, the system was able to handle greater traffic densities, and this also allowed the addition of the ability to handle stop-and-go traffic.

  16. Uses of megavoltage digital tomosynthesis in radiotherapy

    NASA Astrophysics Data System (ADS)

    Sarkar, Vikren

    With the advent of intensity modulated radiotherapy, radiation treatment plans are becoming more conformal to the tumor, with decreasing margins. It is therefore of prime importance that the patient be positioned correctly prior to treatment, so image-guided treatment is necessary for intensity modulated radiotherapy plans to be implemented successfully. Current advanced imaging devices require costly hardware and software upgrades, and radiation imaging solutions, such as cone beam computed tomography, may deliver extra radiation dose to the patient in order to acquire better quality images. Thus, there is a need to extend the capabilities of existing imaging devices while reducing cost and radiation dose. Existing electronic portal imaging devices can be used to generate computed tomography-like tomograms from projection images acquired over a small angle using the technique of cone-beam digital tomosynthesis. Since it uses a fraction of the images required for computed tomography reconstruction, this technique correspondingly delivers only a fraction of the imaging dose to the patient. Furthermore, cone-beam digital tomosynthesis can be offered as a software-only solution as long as a portal imaging device is available. In this study, the feasibility of performing digital tomosynthesis using individually-acquired megavoltage images from a charge coupled device-based electronic portal imaging device was investigated. Three digital tomosynthesis reconstruction algorithms, shift-and-add, filtered back-projection, and the simultaneous algebraic reconstruction technique, were compared with respect to final image quality and the radiation dose delivered during imaging. A software platform, DART, was created using a combination of the Matlab and C++ languages. The platform allows for the registration of a reference Cone Beam Digital Tomosynthesis (CBDT) image set against a daily acquired set to determine how to shift the patient prior to treatment. Finally, the software was extended to investigate whether the digital tomosynthesis dataset could be used in an adaptive radiotherapy regimen, using the Pinnacle treatment planning software to recalculate the dose delivered. The feasibility study showed that the megavoltage CBDT images visually agreed with corresponding megavoltage computed tomography images. The comparative study showed that the best compromise between image quality and imaging dose is obtained when 11 projection images, acquired over an imaging angle of 40°, are used with the filtered back-projection algorithm. DART was successfully used to register reference and daily image sets to within 1 mm in-plane and 2.5 mm out-of-plane. The DART platform was also used to generate updated files from which the Pinnacle treatment planning system calculated the updated dose in a rigidly shifted patient. These doses were then used to calculate a cumulative dose distribution that a physician could use as a reference to decide when the treatment plan should be updated. In conclusion, this study showed that a software solution can extend existing electronic portal imaging devices to function as cone-beam digital tomosynthesis devices and meet the daily requirements of image-guided intensity modulated radiotherapy treatments. The DART platform also has the potential to be used as part of an adaptive radiotherapy solution.
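
    Of the three reconstruction algorithms compared, shift-and-add is simple enough to sketch in a few lines. The following Python/NumPy fragment is an illustrative reconstruction under assumed geometry (parallel-ray approximation, integer pixel shifts); it is a sketch, not the DART implementation:

        import numpy as np

        def shift_and_add(projections, angles_deg, depths_mm, pixel_mm=1.0):
            """Shift-and-add tomosynthesis: bring one plane into focus at a time.

            projections : (n_proj, rows, cols) array of portal images
            angles_deg  : acquisition angle of each projection
            depths_mm   : plane depths (relative to the isocentre) to reconstruct
            Structures in the chosen plane align after the depth-dependent
            lateral shift and reinforce; out-of-plane structures blur out.
            """
            planes = []
            for depth in depths_mm:
                acc = np.zeros_like(projections[0], dtype=float)
                for img, theta in zip(projections, np.radians(angles_deg)):
                    shift_px = int(round(depth * np.tan(theta) / pixel_mm))
                    acc += np.roll(img, shift_px, axis=1)  # lateral shift only
                planes.append(acc / len(projections))
            return np.stack(planes)

        # e.g. 11 projections spread over a 40-degree arc, as in the study
        angles = np.linspace(-20.0, 20.0, 11)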

  17. ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.

    PubMed

    Carter, Kim W; Francis, Richard W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z

    2016-04-01

    Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data, that are sympathetic to these issues, and that allow flexible and detailed statistical analyses are therefore critically needed. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling', where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at http://bioinformatics.childhealthresearch.org.au/software/vipar/. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.
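
    The 'virtual pooling' step lends itself to a small illustration: per-site extracts are pulled into memory on demand, concatenated, analysed, and then released, with nothing written to central storage. Below is a minimal sketch in Python with pandas, assuming the secure transport and harmonization have already happened; the site structure, column names and the stand-in analysis are hypothetical, and this is not the ViPAR code:

        import io
        import pandas as pd

        def fetch_site_extract(site):
            """Stand-in for the secure per-request retrieval step (ViPAR federates
            remote databases over SSH); here each site simply returns CSV bytes."""
            return site["fetch"]()  # hypothetical callable supplied per site

        def pooled_analysis(sites):
            # Pool temporarily in memory only; nothing is written to disk.
            frames = [pd.read_csv(io.BytesIO(fetch_site_extract(s))) for s in sites]
            pooled = pd.concat(frames, ignore_index=True)
            result = pooled.groupby("exposure")["outcome"].mean()  # stand-in analysis
            del pooled, frames  # virtually pooled data released once the job is done
            return result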

  18. Preliminary Studies for a CBCT Imaging Protocol for Offline Organ Motion Analysis: Registration Software Validation and CTDI Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falco, Maria Daniela, E-mail: mdanielafalco@hotmail.co; Fontanarosa, Davide; Miceli, Roberto

    2011-04-01

    Cone-beam X-ray volumetric imaging in the treatment room allows online correction of set-up errors and offline assessment of residual set-up errors and organ motion. In this study the registration algorithm of the X-ray volume imaging software (XVI, Elekta, Crawley, United Kingdom), which manages a commercial cone-beam computed tomography (CBCT)-based positioning system, has been tested using a homemade and an anthropomorphic phantom to: (1) assess its performance in detecting known translational and rotational set-up errors and (2) transfer the transformation matrix of its registrations into a commercial treatment planning system (TPS) for offline organ motion analysis. Furthermore, the CBCT dose index has been measured for a particular site (prostate: 120 kV, 1028.8 mAs, approximately 640 frames) using a standard Perspex cylindrical body phantom (diameter 32 cm, length 15 cm) and a 10-cm-long pencil ionization chamber. We have found that known displacements were correctly calculated by the registration software to within 1.3 mm and 0.4°. For the anthropomorphic phantom, only translational displacements have been considered. Both studies have shown errors within the intrinsic uncertainty of our system for translational displacements (estimated as 0.87 mm) and rotational displacements (estimated as 0.22°). The resulting table translations proposed by the system to correct the displacements were also checked with portal images and found to place the isocenter of the plan on the linac isocenter to within an error of 1 mm, which is the dimension of the spherical lead marker inserted at the center of the homemade phantom. The registration matrix translated into the TPS image fusion module correctly reproduced the alignment between planning CT scans and CBCT scans. Finally, measurements of the CBCT dose index indicate that CBCT acquisition delivers less dose than conventional CT scans and portal images acquired with an electronic portal imaging device. The registration software was found to be accurate, its registration matrix can easily be translated into the TPS, and a low dose is delivered to the patient during image acquisition. These results can help in designing imaging protocols for offline evaluations.

  19. The Finnish disease heritage database (FinDis) update-a database for the genes mutated in the Finnish disease heritage brought to the next-generation sequencing era.

    PubMed

    Polvi, Anne; Linturi, Henna; Varilo, Teppo; Anttonen, Anna-Kaisa; Byrne, Myles; Fokkema, Ivo F A C; Almusa, Henrikki; Metzidis, Anthony; Avela, Kristiina; Aula, Pertti; Kestilä, Marjo; Muilu, Juha

    2013-11-01

    The Finnish Disease Heritage Database (FinDis) (http://findis.org) was originally published in 2004 as a centralized information resource for rare monogenic diseases enriched in the Finnish population. The FinDis database originally contained 405 causative variants for 30 diseases. At the time, the FinDis database was a comprehensive collection of data, but since then, a large amount of new information has emerged, making the necessity to update the database evident. We collected this information and updated the database to contain genes and causative variants for 35 diseases, including six more genes and more than 1,400 additional disease-causing variants. Information on the causative variants for each gene is collected under the LOVD 3.0 platform, enabling easy updating. The FinDis portal provides a centralized resource and user interface to link information on each disease and gene with variant data in the LOVD 3.0 platform. The software written to achieve this has been open-sourced and made available on GitHub (http://github.com/findis-db), allowing biomedical institutions in other countries to present their national data in a similar way, and to both contribute to, and benefit from, standardized variation data. The updated FinDis portal provides a unique resource to assist patient diagnosis, research, and the development of new cures. © 2013 WILEY PERIODICALS, INC.

  20. Information, Courses, Community: Fostering Student Engagement with MyArcadia

    ERIC Educational Resources Information Center

    Bedi, Param

    2005-01-01

    In the spring of 2004, Arcadia University launched MyArcadia, the campus web portal. MyArcadia gives students, faculty, and staff access to online courses and departmental web sites. The portal is also the main source for campus announcements and event listings, and provides a single sign-on link to campus email. This report gives a detailed…

  1. Data-based Considerations in Portal Radiation Monitoring of Cargo Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weier, Dennis R.; O'Brien, Robert F.; Ely, James H.

    2004-07-01

    Radiation portal monitoring of cargo vehicles often includes a configuration of four-panel monitors that record gamma and neutron counts from vehicles transporting cargo. As vehicles pass the portal monitors, they generate a count profile over time that can be compared to the average panel background counts obtained just prior to the time the vehicle entered the area of the monitors. Pacific Northwest National Laboratory has accumulated considerable data regarding such background radiation and vehicle profiles from portal installations, as well as in experimental settings using known sources and cargos. Several considerations have a bearing on how alarm thresholds are set in order to maintain sensitivity to radioactive sources while also keeping the rate of false or nuisance alarms at a manageable level. False alarms are statistical anomalies, while nuisance alarms occur due to the presence of naturally occurring radioactive material (NORM) in cargo, for example, kitty litter. Considerations discussed include:
    • Background radiation suppression due to the shadow shielding from the vehicle.
    • The impact of the relative placement of the four panels on alarm decision criteria.
    • Use of plastic scintillators to separate gamma counts into energy windows.
    • The utility of using ratio criteria for the energy window counts rather than simply using total window counts.
    • Detection likelihood for these various decision criteria based on computer-simulated injections of sources into vehicle profiles.
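
    As a hedged illustration of the kind of decision criteria listed above, the sketch below sets a gross-count alarm threshold a fixed number of Poisson standard deviations above background, and adds a simple low/high energy-window ratio test of the sort used to separate NORM cargo. Every constant, function name, and window boundary is illustrative, not a value taken from the PNNL data:

        import math

        def gross_count_alarm(counts, background_mean, k=5.0):
            """Alarm when a profile exceeds background by k Poisson sigmas."""
            threshold = background_mean + k * math.sqrt(background_mean)
            return counts > threshold

        def window_ratio_alarm(low_window, high_window, norm_ratio=2.0, tol=0.5):
            """Flag profiles whose low/high energy-window ratio falls outside
            the band expected for NORM cargo (band is illustrative only)."""
            ratio = low_window / max(high_window, 1)
            return abs(ratio - norm_ratio) > tol

        # A vehicle profile might alarm if either criterion trips, e.g.:
        # gross_count_alarm(5400, background_mean=5000)
        # window_ratio_alarm(low_window=3600, high_window=1200)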

  2. ACPYPE - AnteChamber PYthon Parser interfacE.

    PubMed

    Sousa da Silva, Alan W; Vranken, Wim F

    2012-07-23

    ACPYPE (or AnteChamber PYthon Parser interfacE) is a wrapper script around the ANTECHAMBER software that simplifies the generation of small molecule topologies and parameters for a variety of molecular dynamics programmes like GROMACS, CHARMM and CNS. It is written in the Python programming language and was developed as a tool for interfacing with other Python based applications such as the CCPN software suite (for NMR data analysis) and ARIA (for structure calculations from NMR data). ACPYPE is open source code, under GNU GPL v3, and is available as a stand-alone application at http://www.ccpn.ac.uk/acpype and as a web portal application at http://webapps.ccpn.ac.uk/acpype. We verified the topologies generated by ACPYPE in three ways: by comparing with default AMBER topologies for standard amino acids; by generating and verifying topologies for a large set of ligands from the PDB; and by recalculating the structures for 5 protein-ligand complexes from the PDB. ACPYPE is a tool that simplifies the automatic generation of topology and parameters in different formats for different molecular mechanics programmes, including calculation of partial charges, while being object oriented for integration with other applications.
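
    For orientation, driving ACPYPE typically amounts to a single command wrapping ANTECHAMBER. A hedged sketch of invoking it from Python follows; the -i, -c and -n options and the output directory reflect the ACPYPE documentation as best we recall, so check acpype --help before relying on them:

        import subprocess

        # Generate topologies for a small molecule (AM1-BCC charges, net charge 0).
        # Flags are assumptions from the ACPYPE docs; verify with `acpype --help`.
        subprocess.run(
            ["acpype", "-i", "ligand.mol2", "-c", "bcc", "-n", "0"],
            check=True,
        )
        # ACPYPE is expected to write a ligand.acpype/ directory with topology
        # and parameter files for GROMACS, CHARMM and CNS, among other formats.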

  3. DES Science Portal: II- Creating Science-Ready Catalogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fausti Neto, Angelo; et al.

    We present a novel approach for creating science-ready catalogs through a software infrastructure developed for the Dark Energy Survey (DES). We integrate the data products released by the DES Data Management and additional products created by the DES collaboration in an environment known as DES Science Portal. Each step involved in the creation of a science-ready catalog is recorded in a relational database and can be recovered at any time. We describe how the DES Science Portal automates the creation and characterization of lightweight catalogs for DES Year 1 Annual Release, and show its flexibility in creating multiple catalogs with different inputs and configurations. Finally, we discuss the advantages of this infrastructure for large surveys such as DES and the Large Synoptic Survey Telescope. The capability of creating science-ready catalogs efficiently and with full control of the inputs and configurations used is an important asset for supporting science analysis using data from large astronomical surveys.

  4. Keeping Research Data from the Continental Deep Drilling Programme (KTB) Accessible and Taking First Steps Towards Digital Preservation

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Ulbricht, D.; Conze, R.

    2014-12-01

    The Continental Deep Drilling Programme (KTB) was a scientific drilling project from 1987 to 1995 near Windischeschenbach, Bavaria. The main super-deep borehole reached a depth of 9,101 meters into the Earth's continental crust. The project used the most current equipment for data capture and processing. After the end of the project, key data were disseminated through the web portal of the International Continental Scientific Drilling Program (ICDP). The scientific reports were published as printed volumes. As similar projects have also experienced, it becomes increasingly difficult to maintain a data portal over a long time. Changes in software and underlying hardware make a migration of the entire system inevitable. Around 2009 the data presented on the ICDP web portal were migrated to the Scientific Drilling Database (SDDB) and published through DataCite using Digital Object Identifiers (DOI) as persistent identifiers. The SDDB portal used a relational database with a complex data model to store data and metadata. A PHP-based Content Management System with custom modifications made it possible to navigate and browse datasets using the metadata and then download them. The data repository software eSciDoc allows self-contained packages consistent with the OAIS reference model to be stored. Each package consists of binary data files and XML metadata. Using a REST API, the packages can be stored in the eSciDoc repository and searched using the XML metadata. During the last maintenance cycle of the SDDB, the data and metadata were migrated into the eSciDoc repository. Discovery metadata were generated following the GCMD-DIF, ISO19115 and DataCite schemas. The eSciDoc repository allows an arbitrary number of XML metadata records to be stored with each data object. In addition to descriptive metadata, each data object may contain pointers to related materials, such as IGSN metadata to link datasets to physical specimens, or identifiers of literature interpreting the data. Datasets are presented by XSLT stylesheet transformation using the stored metadata. The presentation shows several migration cycles of data and metadata, which were driven by aging software systems. Currently the datasets reside as self-contained entities in a repository system that is ready for digital preservation.

  5. Simultaneous detection of colonic epithelial cells in portal venous and peripheral blood during colorectal cancer surgery.

    PubMed

    Tien, Yu-Wen; Lee, Po-Huang; Wang, Shih-Ming; Hsu, Su-Ming; Chang, King-Jen

    2002-01-01

    This study was designed to show, in certain patients, that colonic epithelial cells can be present in peripheral blood while absent in portal venous blood. The circulating colorectal epithelial cells were detected by a reverse transcriptase-polymerase chain reaction assay, which involved amplifying guanylyl cyclase C transcripts. Portal venous and peripheral blood samples were obtained at intervals from 58 patients undergoing colorectal cancer surgery. Circulating colonic epithelial cells were more frequently detected in portal venous blood than in peripheral blood only before mobilization of the tumor-bearing colon segment in patients with tumors of Stage B. In five other patients, before mobilization of their tumor-bearing colon segments, and in another three patients, during the mobilization, colorectal epithelial cells were detected in peripheral blood but not in portal venous blood. These eight patients had Stage C or D tumors. In 8 of 58 patients, colorectal epithelial cells were detected in peripheral but not in portal venous blood. Metastatic deposits in lymphatic vessels or liver might be the source of these cells.

  6. A method to determine the planar dose distributions in patient undergone radiotherapy

    NASA Astrophysics Data System (ADS)

    Cilla, S.; Viola, P.; Augelli, B. G.; D'Onofrio, G.; Grimaldi, L.; Craus, M.; Digesù, C.; Deodato, F.; Macchia, G.; Morganti, A. G.; Fidanzio, A.; Azario, L.; Piermattei, A.

    2008-06-01

    A 2D-array equipped with 729 vented plane-parallel ion chambers has been calibrated as a portal dose detector for radiotherapy in vivo measurements. The array was positioned, by means of a radiographic film stand, at 120 cm from the source, orthogonal to the radiotherapy beam delivered with the gantry angle at 180°. Collisions between the 2D-array and the patient's couch were thereby avoided. In this work, using the measurements of the portal detector, we present a method to reconstruct the dose variations in patients treated with step-and-shoot intensity-modulated beams (IMRT) for head-neck tumours. For this treatment, morphological changes often occur during the fractionated therapy. In a first step, in-house software supplied the comparison between the measured portal dose and the one computed by a commercial treatment planning system within the field of view of the computed tomography (CT) scanner. For each patient, the percentage Pγ of chambers where the comparison agreed within selected acceptance criteria was determined 8 times. At the first radiotherapy fraction, the γ-index analysis supplied Pγ values of about 95% within acceptance criteria for dose difference, ΔD, and distance to agreement, Δd, equal to 5% and 4 mm, respectively. These acceptance criteria took into account small errors in the patient's set-up reproducibility and the accuracy of the portal dose calculated by the treatment planning system (TPS), in particular when the beam was attenuated by inhomogeneous tissues and the shapes of the head-neck body contours were irregular. During the treatment, some patients showed a reduction of Pγ below 90% because the patient's morphology changed over the course of radiotherapy. In a second step, a method based on dosimetric measurements in standard phantoms supplied the percentage dose variations in a coronal plane of the patient from the percentage dose variations measured by the 2D-array portal detector. The results showed that the dose variations due to the change in the patient's morphology reached 15%, and these discrepancies were displayed on the digitally reconstructed radiography of the patient. The dose discrepancies were confirmed by the hybrid plan obtained by the treatment planning system. These good results show that, once portal dose distributions are available for other gantry angles as well, these tests could be introduced into the clinical protocol to provide additional support in deciding when to repeat the patient's CT scan and re-plan the IMRT dose calculation.
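
    The γ-index comparison used for Pγ is a standard, well-defined computation and is worth making concrete. Below is a compact brute-force 2D sketch in Python/NumPy with ΔD = 5% (global, relative to the maximum computed dose) and Δd = 4 mm as in the study; the uniform grid-spacing argument and the exhaustive search are simplifications, and this is not the in-house software described above:

        import numpy as np

        def gamma_pass_rate(measured, computed, spacing_mm, dd_pct=5.0, dta_mm=4.0):
            """Fraction of points with gamma <= 1 (global dose normalisation)."""
            dd = dd_pct / 100.0 * computed.max()
            ny, nx = measured.shape
            ys, xs = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
            passed = 0
            for i in range(ny):
                for j in range(nx):
                    dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * spacing_mm ** 2
                    dose2 = (computed - measured[i, j]) ** 2
                    gamma2 = dist2 / dta_mm ** 2 + dose2 / dd ** 2
                    passed += gamma2.min() <= 1.0  # best match over all points
            return passed / (ny * nx)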

  7. Earth Orientation - Naval Oceanography Portal

    Science.gov Websites

  8. EO Information Center - Naval Oceanography Portal

    Science.gov Websites

  9. Almanacs and Other Publications - Naval Oceanography Portal

    Science.gov Websites

  10. Recording the LHCb data and software dependencies

    NASA Astrophysics Data System (ADS)

    Trisovic, Ana; Couturier, Ben; Gibson, Val; Jones, Chris

    2017-10-01

    In recent years awareness of the importance of preserving the experimental data and scientific software at CERN has been rising. To support this effort, we present a novel approach to structuring the dependencies of the LHCb data and software so as to make them more accessible in the long-term future. In this paper, we detail the implementation of a graph database of these dependencies. We list the implications that can be deduced from graph mining (such as a search for legacy software), with emphasis on data preservation. Furthermore, we introduce a methodology for recreating the LHCb data, thus supporting reproducible research and data stewardship. Finally, we describe how this information is made available to users on a web portal that promotes data and analysis preservation and good practice with analysis documentation.
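
    The dependency-graph idea can be illustrated in a few lines. A sketch using Python's networkx follows; the node names are invented for illustration, and the paper's implementation uses a dedicated graph database, not networkx:

        import networkx as nx

        g = nx.DiGraph()
        # Edges point from an artifact to what it depends on (hypothetical names).
        g.add_edge("analysis-ntuple-v2", "stripping-21")
        g.add_edge("stripping-21", "reco-14")
        g.add_edge("stripping-21", "DaVinci/v36r1")
        g.add_edge("reco-14", "Brunel/v44r5")

        # Everything needed to recreate the ntuple (its full provenance):
        print(nx.descendants(g, "analysis-ntuple-v2"))

        # Legacy-software search: which products still rely on an old release?
        print(nx.ancestors(g, "Brunel/v44r5"))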

  11. Search without Boundaries Using Simple APIs

    USGS Publications Warehouse

    Tong, Qi

    2009-01-01

    The U.S. Geological Survey (USGS) Library, where the author serves as the digital services librarian, is increasingly challenged to make it easier for users to find information from many heterogeneous information sources. Information is scattered across different software applications (e.g., library catalog, federated search engine, link resolver, and vendor websites), and each specializes in one thing. How could the library integrate the functionalities of one application with another and provide a single point of entry for users to search across? To improve the user experience, the library launched an effort to integrate the federated search engine into the library's intranet website. The result is a simple search box that leverages the federated search engine's built-in application programming interfaces (APIs). In this article, the author describes how this project demonstrated the power of APIs and their potential to be used by other enterprise search portals inside or outside of the library.
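
    The pattern described, a simple search box backed by the federated engine's APIs, reduces to an HTTP call plus a render step. A hedged Python sketch against a hypothetical endpoint follows; the engine's actual URL, parameters, and response fields are not given in the article, so all of them are assumptions:

        import requests

        SEARCH_URL = "https://search.example.usgs.gov/api/search"  # hypothetical

        def federated_search(query, max_hits=10):
            resp = requests.get(SEARCH_URL, params={"q": query, "n": max_hits},
                                timeout=30)
            resp.raise_for_status()
            # Assumed response shape: {"results": [{"title": ..., "url": ...}]}
            return [(hit["title"], hit["url"]) for hit in resp.json()["results"]]

        for title, url in federated_search("aquifer recharge"):
            print(f"{title}\n  {url}")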

  12. EXP-PAC: providing comparative analysis and storage of next generation gene expression data.

    PubMed

    Church, Philip C; Goscinski, Andrzej; Lefèvre, Christophe

    2012-07-01

    Microarrays and, more recently, RNA sequencing have led to an increase in available gene expression data. How to manage and store these data is becoming a key issue. In response, we have developed EXP-PAC, a web-based software package for the storage, management and analysis of gene expression and sequence data. Unique to this package are SQL-based querying of gene expression data sets, distributed normalization of raw gene expression data, and analysis of gene expression data across experiments and species. This package has been populated with lactation data in the international milk genomic consortium web portal (http://milkgenomics.org/). Source code is also available and can be hosted on a Windows, Linux or Mac APACHE server connected to a private or public network (http://mamsap.it.deakin.edu.au/~pcc/Release/EXP_PAC.html). Copyright © 2012 Elsevier Inc. All rights reserved.
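
    The SQL-based querying highlighted above is easy to illustrate against a toy schema. A sketch with Python's built-in sqlite3 follows; the table layout, gene names and values are invented, as EXP-PAC's actual schema is not described in the abstract:

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE expression (gene TEXT, sample TEXT, "
                    "condition TEXT, value REAL)")
        con.executemany("INSERT INTO expression VALUES (?, ?, ?, ?)", [
            ("CSN2", "s1", "lactating", 980.0), ("CSN2", "s2", "dry", 12.0),
            ("LALBA", "s1", "lactating", 450.0), ("LALBA", "s2", "dry", 30.0),
        ])

        # Mean expression per gene and condition, expressed directly in SQL:
        for row in con.execute(
                "SELECT gene, condition, AVG(value) FROM expression "
                "GROUP BY gene, condition ORDER BY gene"):
            print(row)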

  13. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories.

    PubMed

    Hanson-Smith, Victor; Johnson, Alexander

    2016-07-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and "resurrect" (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server.

  14. PhyloBot: A Web Portal for Automated Phylogenetics, Ancestral Sequence Reconstruction, and Exploration of Mutational Trajectories

    PubMed Central

    Hanson-Smith, Victor; Johnson, Alexander

    2016-01-01

    The method of phylogenetic ancestral sequence reconstruction is a powerful approach for studying evolutionary relationships among protein sequence, structure, and function. In particular, this approach allows investigators to (1) reconstruct and “resurrect” (that is, synthesize in vivo or in vitro) extinct proteins to study how they differ from modern proteins, (2) identify key amino acid changes that, over evolutionary timescales, have altered the function of the protein, and (3) order historical events in the evolution of protein function. Widespread use of this approach has been slow among molecular biologists, in part because the methods require significant computational expertise. Here we present PhyloBot, a web-based software tool that makes ancestral sequence reconstruction easy. Designed for non-experts, it integrates all the necessary software into a single user interface. Additionally, PhyloBot provides interactive tools to explore evolutionary trajectories between ancestors, enabling the rapid generation of hypotheses that can be tested using genetic or biochemical approaches. Early versions of this software were used in previous studies to discover genetic mechanisms underlying the functions of diverse protein families, including V-ATPase ion pumps, DNA-binding transcription regulators, and serine/threonine protein kinases. PhyloBot runs in a web browser, and is available at the following URL: http://www.phylobot.com. The software is implemented in Python using the Django web framework, and runs on elastic cloud computing resources from Amazon Web Services. Users can create and submit jobs on our free server (at the URL listed above), or use our open-source code to launch their own PhyloBot server. PMID:27472806

  15. Development of the UTAUT2 model to measure the acceptance of medical laboratory portals by patients in Shiraz

    PubMed Central

    Ravangard, Ramin; Kazemi, Zhila; Abbasali, Somaye Zaker; Sharifian, Roxana; Monem, Hossein

    2017-01-01

    Introduction: One of the main stages in achieving success is the acceptance of a technology by its users. Hence, identifying the factors effective in the successful acceptance of information technology is necessary and vital. One such factor is usability. This study aimed to investigate software usability within the "Unified Theory of Acceptance and Use of Technology 2 (UTAUT2)" model in patients' use of medical diagnosis laboratories' electronic portals in 2015. Methods: This cross-sectional study was carried out on 170 patients in 2015. A 27-item questionnaire adopted from previous research and the Usability Evaluation questionnaire were used for data collection. Data were analyzed using Structural Equation Modeling (SEM) with the Partial Least Squares approach, in SPSS 20.0 and Smart-PLS V3.0. Results: The results showed that the construct of intention to use had significant associations with price value (t-value=2.77), hedonic motivation (t-value=4.46), habit (t-value=1.99) and usability (t-value=5.2), and the construct of usage behavior with usability (t-value=3.45) and intention to use (t-value=2.03). Conclusion: Considering the results of this study, the following recommendations can be made to increase patients' use of the portals: informing patients about the advantages of using these portals, designing portals in a simple and understandable form, increasing the portals' attractiveness, etc. PMID:28465819

  16. Web catalog of oceanographic data using GeoNetwork

    NASA Astrophysics Data System (ADS)

    Marinova, Veselka; Stefanov, Asen

    2017-04-01

    Most of the data collected, analyzed and used by the Bulgarian oceanographic data center (BgODC) from scientific cruises, Argo floats, ferry boxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. In order to meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as those from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open source software solutions able to create such a spatial data infrastructure (SDI). Ultimately, GeoNetwork opensource was chosen, as it is already widespread. This software is a free and effective solution for implementing an SDI at the organization level. It is platform independent and runs under many operating systems. Filling the catalog goes through these practical steps:
    • Managing and storing data reliably within an MS SQL spatial database;
    • Registering maps and data of various formats and sources in GeoServer (a popular open source geospatial server that integrates with GeoNetwork);
    • Adding metadata and publishing geospatial data through the GeoNetwork interface.
    GeoServer and GeoNetwork are based on Java, so they require a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive, due to the availability of many features necessary for producing "INSPIRE compliant" metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching within the catalog is based upon geographic extent, theme type and free-text search.

  17. Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog

    NASA Technical Reports Server (NTRS)

    Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William

    2017-01-01

    NSBRI (National Space Biomedical Research Institute) funded a research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog: develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments; increase user efficiency and satisfaction; and institute commonality across multiple exercise systems. The project utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use, to minimize training time and realize early user efficiency, and carried a requirement to test the software in an analog environment. Top-level project aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use a virtual environment to provide remote socialization with family and friends, and improve exercise technique, adherence, motivation and ultimately performance outcomes.

  18. Visualization of the Capitellum During Elbow Arthroscopy: A Comparison of 3 Portal Techniques

    PubMed Central

    Trofa, David P.; Gancarczyk, Stephanie M.; Lombardi, Joseph M.; Makhni, Eric C.; Popkin, Charles A.; Ahmad, Christopher S.

    2017-01-01

    Background: Capitellar osteochondritis dissecans (OCD) is a debilitating condition of unknown etiology for which various arthroscopic treatments are available. Prior data suggest that greater than 75% of the capitellum can be visualized arthroscopically through a dual lateral portal approach. However, there is no literature assessing arthroscopic visualization of the capitellum via alternative portals. Purpose: To determine the percentage of capitellum visualized using the dual lateral, distal ulnar and soft spot, and posterolateral and soft spot portal configurations in a cadaver model. Study Design: Descriptive laboratory study. Methods: Arthroscopy was performed on 12 fresh-frozen cadaver elbows, 4 for each of the following approaches: dual lateral, distal ulna, and posterolateral. Electrocautery was used to mark the most anterior, posterior, medial, and lateral points seen on the capitellum. The radiocapitellar joint was subsequently exposed through an extensile posterior dissection, and the surface anatomy was reconstructed using the Microscribe 3D digitizing system. Using Rhinoceros software, the percentage of capitellum surface area visualized by each approach was determined. Results: The mean percentage of capitellum visualized for the dual lateral, distal ulna, and posterolateral approaches was approximately 68.8%, 66.3%, and 63.5%, respectively. There was no significant difference between the percentage of capitellum seen among approaches (P = .68). On average, 66.5% of the capitellum was visible through these 3 arthroscopic approaches to the elbow. Conclusion: Approximately 66.5% of the capitellum is visualized through the popularized posterior arthroscopic portals, with no significant differences found between the 3 investigated approaches. Clinical Relevance: As determined in this cadaveric model investigation, each portal technique provides equivalent visualization for capitellar OCD pathology. PMID:28680895

  19. Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis

    NASA Astrophysics Data System (ADS)

    Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.

    2016-08-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include an eigenvector-approach foldtest, two reversal tests (including a Monte Carlo simulation on mean directions), and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally, we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in the coordinates of the major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regression techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g. hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases. This exported file contains all data and can later be imported into the application by other researchers. The accessibility and simplicity with which paleomagnetic data can be interpreted, analyzed, visualized, and shared makes Paleomagnetism.org of interest to the community.
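
    Among the statistics-portal tools listed above, the Fisher mean direction is the central computation and compact enough to sketch. The following Python/NumPy function uses the standard Fisher (1953) formulas for the resultant length R, the precision parameter k, and the 95% confidence cone; it is an independent illustration, not code from Paleomagnetism.org:

        import numpy as np

        def fisher_mean(dec_deg, inc_deg, p=0.05):
            """Fisher statistics for a set of paleomagnetic directions."""
            dec, inc = np.radians(dec_deg), np.radians(inc_deg)
            xyz = np.column_stack((np.cos(inc) * np.cos(dec),
                                   np.cos(inc) * np.sin(dec),
                                   np.sin(inc)))
            r_vec = xyz.sum(axis=0)                       # vector sum of unit vectors
            n, r = len(xyz), np.linalg.norm(r_vec)
            mean_dec = np.degrees(np.arctan2(r_vec[1], r_vec[0])) % 360
            mean_inc = np.degrees(np.arcsin(r_vec[2] / r))
            k = (n - 1) / (n - r)                         # precision parameter
            a95 = np.degrees(np.arccos(
                1 - (n - r) / r * ((1 / p) ** (1 / (n - 1)) - 1)))
            return mean_dec, mean_inc, k, a95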

  20. Workflow-Based Software Development Environment

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (a collaborative web environment).
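
    The core behaviour described, decomposing a project into tasks and notifying team members only once a task's predecessors are complete, is at heart dependency-ordered scheduling. A minimal sketch using Python's standard-library graphlib follows; the task names are hypothetical, and SDA itself is built on a BPM engine (TieFlow), not on code like this:

        from graphlib import TopologicalSorter

        # Task -> set of tasks that must complete first (hypothetical process)
        process = {
            "code review": {"implement"},
            "implement":   {"design"},
            "unit test":   {"implement"},
            "release":     {"code review", "unit test"},
        }

        ts = TopologicalSorter(process)
        ts.prepare()
        while ts.is_active():
            for task in ts.get_ready():     # tasks whose predecessors are done
                print("notify assignee:", task)
                ts.done(task)               # mark complete, unblocking successors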

  1. [Postoperative complications and survival analysis of 1 118 cases of open splenectomy and azygoportal disconnection in the treatment of portal hypertension].

    PubMed

    Qi, R Z; Zhao, X; Wang, S Z; Zhang, K; Chang, Z Y; Hu, X L; Wu, M L; Zhang, P R; Yu, L X; Xiao, C H; Shi, X J; Li, Z W

    2018-06-01

    Objective: To analyze the short-term and long-term postoperative complications of open splenectomy and azygoportal disconnection in patients with portal hypertension. Methods: There were 1 118 cases with portal hypertension who underwent open splenectomy and azygoportal disconnection from April 2010 to September 2015 at the Department of Surgery, People's Liberation Army 302 Hospital. A retrospective case investigation and telephone follow-up were conducted in October 2016. All patients had a history of upper gastrointestinal bleeding before operation. Short-term complications after surgery were recorded, including secondary laparotomy for postoperative abdominal hemostasis, severe infection, intake disorders, liver insufficiency, postoperative portal vein thrombosis and perioperative mortality. Long-term data, including postoperative upper gastrointestinal rebleeding, postoperative survival rate and incidence of postoperative malignancy, were recorded as well. GraphPad Prism 5 software was used for survival analysis and charting. Results: Postoperative short-term complications in the 1 118 patients included secondary laparotomy for postoperative abdominal hemostasis (1.8%, 21/1 118), severe infection (2.9%, 32/1 118), intake disorders (1.0%, 11/1 118), liver dysfunction (1.6%, 18/1 118), postoperative portal vein thrombosis (47.1%, 526/1 118) and perioperative mortality (0.5%, 5/1 118). After telephone follow-up, long-term data were complete for 942 patients, including 1-, 3- and 5-year postoperative upper gastrointestinal rebleeding rates (4.4%, 12.1%, 17.2%) and 1-, 3- and 5-year postoperative survival rates (97.0%, 93.5%, 90.3%); the incidences of postoperative malignant tumors at 1, 3 and 5 years were 1.7%, 4.4% and 6.2%. Conclusions: Reasonable selection of surgical indications and timing, proper performance of the operation, and effective perioperative management of portal hypertension are directly related to the patient's short-term prognosis after surgery. Surgical intervention can reduce the rate of upper gastrointestinal rebleeding and improve survival, without increasing the incidence of malignant tumors.

  2. The Disease Portals, disease-gene annotation and the RGD disease ontology at the Rat Genome Database.

    PubMed

    Hayman, G Thomas; Laulederkind, Stanley J F; Smith, Jennifer R; Wang, Shur-Jen; Petri, Victoria; Nigam, Rajni; Tutaj, Marek; De Pons, Jeff; Dwinell, Melinda R; Shimoyama, Mary

    2016-01-01

    The Rat Genome Database (RGD; http://rgd.mcw.edu/) provides critical datasets and software tools to a diverse community of rat and non-rat researchers worldwide. To meet the needs of the many users whose research is disease oriented, RGD has created a series of Disease Portals and has prioritized its curation efforts on the datasets important to understanding the mechanisms of various diseases. Gene-disease relationships for three species, rat, human and mouse, are annotated to capture biomarkers, genetic associations, molecular mechanisms and therapeutic targets. To generate gene-disease annotations more effectively and in greater detail, RGD initially adopted the MEDIC disease vocabulary from the Comparative Toxicogenomics Database and adapted it for use by expanding this framework with the addition of over 1000 terms to create the RGD Disease Ontology (RDO). The RDO provides the foundation for, at present, 10 comprehensive disease area-related dataset and analysis platforms at RGD, the Disease Portals. Two major disease areas are the focus of data acquisition and curation efforts each year, leading to the release of the related Disease Portals. Collaborative efforts to realize a more robust disease ontology are underway. Database URL: http://rgd.mcw.edu. © The Author(s) 2016. Published by Oxford University Press.

  3. A Knowledge Portal and Collaboration Environment for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.

    2008-12-01

    Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.

  4. PLOCAN glider portal: a gateway for useful data management and visualization system

    NASA Astrophysics Data System (ADS)

    Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María

    2014-05-01

    Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide vast amounts of data on spatio-temporal scales. Multi-platform infrastructures like PLOCAN operate a variety of autonomous Lagrangian and Eulerian devices that collect information which is then transferred to land in near-real time. Managing all this data collection in an efficient way is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, which offers a deeper understanding of the ocean but requires a bigger effort in the data management process. There are general requirements for data management in such environments, including processing raw data at different levels to obtain valuable information, storing data coherently and providing accurate products to final users according to their specific needs. Managing large amounts of data can certainly be tedious and complex without the right tools and operational procedures; hence automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant, since scientists tend to assimilate different sources for comparison and validation. The use of web applications has boosted the necessary scientific dissemination. Within this context, PLOCAN has implemented a set of independent but compatible applications to process, store and disseminate information gathered through different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent and quality-controlled datasets, as well as fostering open access to glider data.

  5. Quality evaluation of portal sites in health system, as a tool for education and learning.

    PubMed

    Hejazi, Sayed Mehdi; Sarmadi, Sima

    2013-01-01

    The main objective of creating a portal is to make information services available to users who need them for the performance of their duties and responsibilities, regardless of the sources. This article attempts to consider the parameters by which such sites can be evaluated, since these criteria can be effective in designing and implementing such portals. Moreover, portal sites in the health system of every country make it possible for leaders, policy makers, and directors to use them for system education, as a tool for new learning technologies. One of the main decisions each manager has to make is the precise selection of appropriate portal sites. This is a descriptive and qualitative study. The research sample was 53 computer professionals working in the area of computer programming and design. In the first part of the study a questionnaire was sent to the participants, and in the second part, based on their responses to the questionnaire, the participants were interviewed and the main themes of the study were formulated. The validity and reliability of the questionnaire were confirmed. The study results were summarized in 10 themes and 50 sub-categories. The main themes included portal requirements, security, management and efficiency, user friendliness, built-in applications, portal flexibility, interoperability, and support systems. Portal sites in any education system make it possible for health system leaders and policy makers to manage their organization's information system efficiently and effectively. One of the major decisions each manager has to make is the precise selection of an appropriate approach to portal site design and development. The themes and sub-categories of this study could help health system managers, policy makers and information technology professionals to make appropriate decisions regarding portal design and development.

  6. Standardisation of radiation portal monitor controls and readouts.

    PubMed

    Tinker, M

    2010-10-01

    There is an urgent need to standardise the numbering configuration of radiation portal monitor sensing panels. Currently, manufacturers use conflicting numbering schemes that may confuse operators of these varied systems. A similar problem is encountered with the varied choices of coloured indicator lights and coloured print lines designated for gamma and neutron alarms. In addition, second-party software that changes the alarm colour scheme may also have been installed. Furthermore, no provision exists for colour-blind operators or for workstations that print alarms in black ink only. These inconsistencies and confusing set-ups could inadvertently cause a misinterpretation of an alarm, resulting in the potential release of a radiological hazard into a sovereign country. These issues are discussed, and a proposed solution is offered.

  7. The MER/CIP Portal for Ground Operations

    NASA Technical Reports Server (NTRS)

    Chan, Louise; Desai, Sanjay; D'Ortenzio, Matthew; Filman, Robert E.; Heher, Dennis M.; Hubbard, Kim; Johan, Sandra; Keely, Leslie; Magapu, Vish; Mak, Ronald

    2003-01-01

    We developed the Mars Exploration Rover/Collaborative Information Portal (MER/CIP) to facilitate MER operations. MER/CIP provides a centralized, one-stop delivery platform integrating science and engineering data from several distributed heterogeneous data sources. Key issues for MER/CIP include: 1) Scheduling and schedule reminders; 2) Tracking the status of daily predicted outputs; 3) Finding and analyzing data products; 4) Collaboration; 5) Announcements; 6) Personalization.

  8. Organizational Analysis of the United States Army Evaluation Center

    DTIC Science & Technology

    2014-12-01

    analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, metrics review... Research, Development, Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as “one of the... research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis

  9. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be discoverable using tag search, and advertised using “service casts” and “interest casts” (Atom feeds). Multiple science workflow systems will be plugged into the system, with initial support for UAH’s Mining Workflow Composer and the open-source Active BPEL engine, and JPL’s SciFlo engine and the VizFlow visual programming interface. With the ability to share and execute analysis workflows, Talkoot portals can be used to do collaborative science in addition to communicating ideas and results. It will be useful for different science domains, mission teams, research projects and organizations. Thus, it will help to solve the “sociological” problem of bringing together disparate groups of researchers, and the technical problem of advertising, discovering, developing, documenting, and maintaining inter-agency science workflows. The presentation will discuss the goals of and barriers to Science 2.0, the social web technologies employed in the Talkoot software appliance (e.g. CMS, social tagging, personal presence, advertising by feeds, etc.), illustrate the resulting collaborative capabilities, and show early prototypes of the web interfaces (e.g. embedded workflows).

  10. A Web-Based Earth-Systems Knowledge Portal and Collaboration Platform

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.; Turner, A. K.

    2010-12-01

    In support of complex water-resource sustainability projects in the Great Basin region of the United States, Earth Knowledge, Inc. has developed several web-based data management and analysis platforms that have been used by its scientists, clients, and the public to facilitate information exchange, collaboration, and decision making. These platforms support accurate water-resource decision making by combining second-generation internet (Web 2.0) technologies with traditional 2D GIS and web-based 2D and 3D mapping systems such as Google Maps and Google Earth. Most data management and analysis systems use traditional software systems to address the data needs and usage behavior of the scientific community. In contrast, these platforms employ more accessible open-source and “off-the-shelf” consumer-oriented, hosted web services. They exploit familiar software tools using industry-standard protocols, formats, and APIs to discover, process, fuse, and visualize earth, engineering, and social science datasets. Thus, they respond to the information needs and web-interface expectations of both subject-matter experts and the public. Because the platforms continue to gather and store all the contributions of their broad spectrum of users, each new assessment leverages the data, information, and expertise derived from previous investigations. In the last year, Earth Knowledge completed a conceptual system design and feasibility study for a platform comprising a Knowledge Portal, which provides access to users wishing to retrieve information or knowledge developed by the science enterprise; a Collaboration Environment Module, a framework that links the user-access functions to a Technical Core supporting technical and scientific analyses (Data Management, Analysis and Modeling, and Decision Management); and essential system administrative functions within an Administrative Module. The overriding technical challenge is the design and development of a single technical platform that is accessed through a flexible series of knowledge-portal and collaboration-environment styles reflecting the information needs and user expectations of a diverse community of users. Recent investigations have defined the information needs and expectations of the major end users and have also reviewed and assessed a wide variety of modern web-based technologies. Combining these efforts produced design specifications and recommendations for the selection and integration of web- and client-based tools. When fully developed, the resulting platform will: support new, advanced information systems and decision environments that take full advantage of multiple data sources and platforms; provide a distribution network tailored to the timely delivery of products to the broad range of users needed to support applications in disaster management, resource management, energy, and urban sustainability; establish new integrated multiple-user requirements and knowledge databases that support researchers and promote infusion of successful technologies into existing processes; and develop new decision-support strategies and presentation methodologies for applied earth science applications to reduce risk, cost, and time.

  11. Guiding Users to Sea Level Change Data Through Content

    NASA Astrophysics Data System (ADS)

    Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Boeck, A.; Moore, B.; Moore, J.

    2017-12-01

    The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is an immersive and innovative web portal for sea level change research that addresses the needs of diverse audiences, from scientists across disparate disciplines to the general public, policy makers, and businesses. Since sea level change research involves vast amounts of data from multiple fields, it becomes increasingly important to find novel and effective ways to guide users to the data they need. News articles published on the portal contain links to relevant data. The Missions section highlights missions and projects and provides a logical grouping of the data. Tools available on the portal, such as the Data Analysis Tool, a data visualization and high-performance analysis environment for sea level research, and the Virtual Earth System Laboratory, a 3D simulation application, describe and link to their source data. With over 30K Facebook followers and over 23K Twitter followers, the portal outreach team also leverages social media to guide users to relevant data. This presentation focuses on how the portal uses news articles, mission and project pages, tools, and social media to connect users to the data.

  12. From EGEE Operations Portal towards EGI Operations Portal

    NASA Astrophysics Data System (ADS)

    Cordier, Hélène; L'Orphelin, Cyril; Reynaud, Sylvain; Lequeux, Olivier; Loikkanen, Sinikka; Veyre, Pierre

    Grid operators in EGEE have been using a dedicated dashboard as their central operational tool, which has remained stable and scalable over the last 5 years despite continuous upgrades driven by specifications from users, monitoring tools, and data providers. In EGEE-III, the recent regionalisation of operations led the Operations Portal developers to conceive a standalone instance of this tool. We will see how the dashboard reorganization paved the way for the re-engineering of the portal itself. The outcome is an easily deployable package customized with relevant information sources and specific decentralized operational requirements. This package is composed of Lavoisier, a generic and scalable data access mechanism; Symfony, a well-known PHP framework chosen for configuration flexibility; and a MySQL database. VO life-cycle and operational information, EGEE broadcast and downtime notifications are next in the major reorganization, until all other key features of the Operations Portal are migrated to the framework. Feature specifications will be sketched at the same time to adapt to EGI requirements and to upgrade. Future work on feature regionalisation, on new advanced features and on strategy planning will be tracked in EGI-InSPIRE through the Operations Tools Advisory Group (OTAG), where all users, customers and third parties of the Operations Portal are represented from January 2010.

  13. Tsunami.gov: NOAA's Tsunami Information Portal

    NASA Astrophysics Data System (ADS)

    Shiro, B.; Carrick, J.; Hellman, S. B.; Bernard, M.; Dildine, W. P.

    2014-12-01

    We present the new Tsunami.gov website, which delivers a single authoritative source of tsunami information for the public and emergency management communities. The site efficiently merges information from NOAA's Tsunami Warning Centers (TWCs) by way of a comprehensive XML feed called Tsunami Event XML (TEX). The resulting unified view allows users to quickly see the latest tsunami alert status in geographic context without having to understand the complex TWC areas of responsibility. The new site provides for the creation of a wide range of products beyond the traditional ASCII-based tsunami messages. The publication of modern formats such as the Common Alerting Protocol (CAP) can drive geographically aware emergency alert systems like FEMA's Integrated Public Alert and Warning System (IPAWS). Other popular information delivery systems are supported as well, including email, text messaging, and social media updates. The Tsunami.gov portal allows NOAA staff to easily edit content and provides the facility for users to customize their viewing experience. In addition to access by the public, emergency managers and government officials may be offered the capability to log into the portal for special access rights to decision-making and administrative resources relevant to their respective tsunami warning systems. The site follows modern HTML5 responsive design practices for optimized use on mobile as well as non-mobile platforms. It meets all federal security and accessibility standards. Moving forward, we hope to expand Tsunami.gov to encompass tsunami-related content currently offered on separate websites, including the NOAA Tsunami Website, the National Tsunami Hazard Mitigation Program, the NOAA Center for Tsunami Research, the National Geophysical Data Center's Tsunami Database, and the National Data Buoy Center's DART Program. This project is part of the larger Tsunami Information Technology Modernization Project, which is consolidating the software architectures of NOAA's existing TWCs into a single system. We welcome your feedback to help Tsunami.gov become an effective public resource for tsunami information and a medium to enable better global tsunami warning coordination.
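
    For readers unfamiliar with CAP, the sketch below parses the core fields of a minimal CAP 1.2 alert. The sample document is invented for illustration; real Tsunami.gov products carry many more fields (areas, polygons, references).

    ```python
    from xml.etree import ElementTree as ET

    # Minimal sketch of reading a CAP 1.2 alert, the format that feeds
    # systems such as IPAWS. The sample alert below is invented.
    CAP = "urn:oasis:names:tc:emergency:cap:1.2"
    sample = f"""<alert xmlns="{CAP}">
      <identifier>EX-2014-0001</identifier>
      <sender>example@tsunami.gov</sender>
      <sent>2014-12-01T12:00:00-00:00</sent>
      <status>Exercise</status>
      <msgType>Alert</msgType>
      <scope>Public</scope>
      <info>
        <category>Geo</category>
        <event>Tsunami Warning</event>
        <urgency>Immediate</urgency>
        <severity>Extreme</severity>
        <certainty>Observed</certainty>
      </info>
    </alert>"""

    root = ET.fromstring(sample)
    ns = {"cap": CAP}
    info = root.find("cap:info", ns)
    print(root.findtext("cap:identifier", namespaces=ns),
          info.findtext("cap:event", namespaces=ns),
          info.findtext("cap:severity", namespaces=ns))
    ```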

  14. Enabling Discoveries in Earth Sciences Through the Geosciences Network (GEON)

    NASA Astrophysics Data System (ADS)

    Seber, D.; Baru, C.; Memon, A.; Lin, K.; Youn, C.

    2005-12-01

    Taking advantage of state-of-the-art information technology resources, GEON researchers are building a cyberinfrastructure designed to enable data sharing, semantic data integration, high-end computation and 4D visualization in easy-to-use web-based environments. The GEON Network currently allows users to search and register Earth science resources such as data sets (GIS layers, GMT files, GeoTIFF images, ASCII files, relational databases, etc.), software applications or ontologies. Portal-based access mechanisms enable developers to build dynamic user interfaces to conduct advanced processing and modeling efforts across distributed computers and supercomputers. Researchers and educators can access the networked resources through the GEON portal and its portlets, which were developed to support better and more comprehensive scientific and educational studies. For example, the SYNSEIS portlet in GEON enables users to access seismic waveforms in near-real time from the IRIS Data Management Center, easily build a 3D geologic model within the area spanned by the seismic station(s) and the epicenter, and perform a 3D synthetic seismogram analysis to understand the lithospheric structure and earthquake source parameters for any given earthquake in the US. Similarly, GEON's workbench area enables users to create their own work environment; copy, visualize and analyze any data sets within the network; and create subsets of the data sets for their own purposes. Since all these resources are built as part of a Service-Oriented Architecture (SOA), they can also be used in other development platforms. One such platform is the Kepler workflow system, which can access web-service-based resources and provides users with graphical programming interfaces to build a model to conduct computations and/or visualization efforts using the networked resources. Developments in the area of semantic integration of the networked datasets continue to advance, and prototype studies can be accessed via the GEON portal at www.geongrid.org.
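
    The waveform-retrieval step that SYNSEIS performs against the IRIS Data Management Center can be approximated today with ObsPy's FDSN client; ObsPy, the station codes and the event time below are our own choices for illustration, not part of GEON.

    ```python
    # Sketch of fetching IRIS waveforms around a hypothetical origin time,
    # roughly the retrieval step that precedes a synthetic-seismogram
    # comparison. Requires the obspy package.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")
    origin = UTCDateTime("2005-06-14T17:10:12")  # hypothetical event time
    st = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=origin,
                              endtime=origin + 600)
    st.plot()  # quick look at the observed trace
    ```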

  15. Gadolinium-doped water cerenkov-based neutron and high energy gamma-ray detector and radiation portal monitoring system

    DOEpatents

    Dazeley, Steven A; Svoboda, Robert C; Bernstein, Adam; Bowden, Nathaniel

    2013-02-12

    A water-Cerenkov-based neutron and high-energy gamma-ray detector and radiation portal monitoring system using water doped with a Gadolinium (Gd)-based compound as the Cerenkov radiator. An optically opaque enclosure surrounds a detection chamber filled with the Cerenkov radiator, and photomultiplier tubes (PMTs) are optically coupled to the chamber to detect Cerenkov radiation generated in the radiator either by incident high-energy gamma rays or by gamma rays induced by the capture on Gd of incident neutrons from a fission source. The PMT signals are then used to determine time correlations indicative of neutron multiplicity events characteristic of a fission source.
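
    A toy numerical sketch of the time-correlation idea: uncorrelated background pulses produce few close pairs, whereas fission neutrons captured on Gd arrive in correlated bursts that inflate the pair count. The window length and rates below are invented for illustration, not values from the patent.

    ```python
    import numpy as np

    # Count "prompt pairs" of detected pulses closer than a coincidence
    # window; a fission source raises this count above background.
    WINDOW_US = 30.0  # hypothetical coincidence window, microseconds

    def correlated_pairs(times_us: np.ndarray) -> int:
        """Count successive pulse pairs closer than the window."""
        gaps = np.diff(np.sort(times_us))
        return int((gaps < WINDOW_US).sum())

    rng = np.random.default_rng(0)
    background = np.cumsum(rng.exponential(500.0, size=1000))  # uncorrelated
    print("background pairs:", correlated_pairs(background))
    # A fission source would add bursts of near-simultaneous captures,
    # inflating the pair count well above this background rate.
    ```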

  16. A Framework for Collaborative Review of Candidate Events in High Data Rate Streams: the V-Fastr Experiment as a Case Study

    NASA Astrophysics Data System (ADS)

    Hart, Andrew F.; Cinquini, Luca; Khudikyan, Shakeh E.; Thompson, David R.; Mattmann, Chris A.; Wagstaff, Kiri; Lazio, Joseph; Jones, Dayton

    2015-01-01

    “Fast radio transients” are defined here as bright millisecond pulses of radio-frequency energy. These short-duration pulses can be produced by known objects such as pulsars or potentially by more exotic objects such as evaporating black holes. The identification and verification of such an event would be of great scientific value. This is one major goal of the Very Long Baseline Array (VLBA) Fast Transient Experiment (V-FASTR), a software-based detection system installed at the VLBA. V-FASTR uses a “commensal” (piggy-back) approach, analyzing all array data continually during routine VLBA observations and identifying candidate fast transient events. Raw data can be stored from a buffer memory, which enables a comprehensive off-line analysis; this is invaluable for validating the astrophysical origin of any detection. Candidates discovered by the automatic system must be reviewed each day by analysts to identify any promising signals that warrant a more in-depth investigation. To support the timely analysis of fast transient detection candidates by V-FASTR scientists, we have developed a metadata-driven, collaborative candidate review framework. The framework consists of two parts: a metadata-processing software pipeline, composed of open-source components and project-specific code written expressly to extract and catalog metadata from the incoming V-FASTR data products, and a web-based data portal that facilitates browsing and inspection of the available metadata for candidate events extracted from the VLBA radio data.
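
    A minimal sketch of the cataloging step of such a pipeline: walk incoming data products, extract a few descriptive fields, and write a JSON catalog a portal can browse. The file layout, extension and fields are invented; V-FASTR's actual metadata model is richer.

    ```python
    import json
    from pathlib import Path

    # Toy metadata-extraction pipeline: one record per incoming product,
    # appended to a JSON catalog for the review portal.
    def extract_metadata(product: Path) -> dict:
        stat = product.stat()
        return {
            "name": product.name,         # candidate file name
            "size_bytes": stat.st_size,   # stand-in for richer fields
            "modified": stat.st_mtime,
        }

    def build_catalog(incoming_dir: str, catalog_file: str) -> None:
        records = [extract_metadata(p)
                   for p in sorted(Path(incoming_dir).glob("*.fits"))]
        Path(catalog_file).write_text(json.dumps(records, indent=2))

    # build_catalog("incoming", "catalog.json")  # e.g. run per transfer
    ```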

  17. [Learning from regional differences: online platform: http://www.versorgungsatlas.de].

    PubMed

    Mangiapane, S

    2014-02-01

    In 2011, the Central Research Institute of Ambulatory Health Care in Germany (ZI) published the website http://www.versorgungsatlas.de, a portal that presents results of regional health services research in Germany. The web portal provides a publicly accessible source of information and a growing number of selected analyses focusing on regional variation in health care. Each topic is presented in terms of interactive maps, tables, and diagrams, and is supplemented by a paper that examines the results in detail and provides an explanation of the findings. The portal has been designed to provide a forum on which health services researchers can publish results derived from various data sources of different institutions in Germany and can comment on results already available on http://www.versorgungsatlas.de. For health policy actors, the discussion of regional differences offers a new, previously unavailable basis for determining region-specific treatment needs and for health-care management aimed at high-quality care for every resident.

  18. Cyberspace: Devolution and Recovery

    DTIC Science & Technology

    2011-03-23

    time of the source of the burst, and we do not know if it was accidental, an act of God, or a malicious attack. The remainder of a speech like... Security Mailing List, Federal Vulnerability Knowledgebase (VKB), US-CERT Portal, US-CERT Einstein Program, Internet Health and Status Service... The US-CERT portal is a website dedicated to sharing relevant information with participants. The Einstein Program is a program that allows for the

  19. Environmental Data Store: A Web-Based System Providing Management and Exploitation for Multi-Data-Type Environmental Data

    NASA Astrophysics Data System (ADS)

    Ji, P.; Piasecki, M.

    2012-12-01

    With the rapid growth in data volumes, data diversity and data demands from multi-disciplinary research efforts, data management and exploitation increasingly face significant challenges for the environmental science community. We describe the Environmental Data Store (EDS), a web-based, open-source system we are developing to manage and exploit multi-data-type environmental data. EDS provides repository services for six fundamental data types that meet the demands of multi-disciplinary environmental research: a) Time Series Data, b) GeoSpatial Data, c) Digital Data, d) Ex-Situ Sampling Data, e) Modeling Data, and f) Raster Data. Through its data portal, EDS allows efficient consumption of these six data types from a data pool made up of data nodes corresponding to the different types, including iRODS, ODM, THREDDS, ESSDB, and GeoServer. The EDS data portal offers a unified submission interface for these different data types; provides fully integrated, scalable search across content from the underlying data systems; and also features mapping, analysis, exporting and visualization through integration with other software. EDS builds on a number of existing systems, follows widely used data standards, and emphasizes thematic, semantic, and syntactic support for submission and search, in order to advance multi-disciplinary environmental research. The system will be installed and developed at the CrossRoads initiative at the City College of New York.
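
    A toy sketch of the "single search across heterogeneous data nodes" idea: the portal fans a query out to per-type search functions and merges the hits. The node names and query interface below are invented for illustration.

    ```python
    # Hypothetical per-type search backends; in EDS these would be
    # queries against systems such as ODM (time series) or GeoServer
    # (raster), not the stubs shown here.
    def search_time_series(q: str) -> list[str]:
        return [f"time-series hit for {q!r}"]

    def search_raster(q: str) -> list[str]:
        return [f"raster hit for {q!r}"]

    DATA_NODES = {
        "time_series": search_time_series,
        "raster": search_raster,
    }

    def federated_search(query: str) -> list[str]:
        """Fan the query out to every data node and merge the results."""
        hits = []
        for node, search in DATA_NODES.items():
            hits.extend(f"[{node}] {h}" for h in search(query))
        return hits

    print(federated_search("dissolved oxygen"))
    ```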

  20. Mercury: Reusable software application for Metadata Management, Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce E.

    2009-12-01

    Mercury is a federated metadata harvesting, data discovery and access tool based on both open source packages and custom-developed software. It was originally developed for NASA, and the Mercury development consortium now includes funding from NASA, USGS, and DOE. Mercury is itself a reusable toolset for metadata, with current use in 12 different projects. Mercury also supports the reuse of metadata by enabling searching across a range of metadata specifications and standards including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115. Mercury provides a single portal to information contained in distributed data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow users to perform simple, fielded, spatial and temporal searches across these metadata sources. One of the major goals of the recent redesign of Mercury was to improve the software's reusability across the projects which currently fund its continuing development. These projects span a range of land, atmosphere, and ocean ecological communities and have a number of common needs for metadata searches, but they also have a number of needs specific to one or a few projects. To balance these common and project-specific needs, Mercury’s architecture includes three major reusable components: a harvester engine, an indexing system, and a user interface component. The harvester engine is responsible for harvesting metadata records from various distributed servers around the USA and around the world. The harvester software is packaged in such a way that all the Mercury projects use the same harvester scripts, with each project driven by a set of configuration files. The harvested files are then passed to the indexing system, where each of the fields in these structured metadata records is indexed properly, so that the query engine can perform simple, keyword, spatial and temporal searches across these metadata sources. The search user interface software has two API categories: a common core API, which is used by all the Mercury user interfaces for querying the index, and a customized API for project-specific user interfaces. For our work in producing a reusable, portable, robust, feature-rich application, Mercury received a 2008 NASA Earth Science Data Systems Software Reuse Working Group Peer-Recognition Software Reuse Award. The new Mercury system is based on a Service-Oriented Architecture and effectively reuses components for various services such as the Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. The software also provides various search services including RSS, Geo-RSS, OpenSearch, Web Services and Portlets; an integrated shopping cart to order datasets from various data centers (ORNL DAAC, NSIDC); and integrated visualization tools. Other features include filtering and dynamic sorting of search results, book-markable search results, and the ability to save, retrieve, and modify search criteria.
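
    A sketch of a configuration-driven harvest loop in the spirit of the shared-scripts-plus-per-project-config design described above; the endpoints and configuration keys are invented, and Mercury's real configuration differs.

    ```python
    import requests

    # Hypothetical per-project configuration: one shared harvest script,
    # different sources and formats per project.
    PROJECT_CONFIG = {
        "project": "example-daac",
        "sources": [
            {"url": "https://metadata.example.org/records.xml",
             "format": "fgdc"},
        ],
    }

    def harvest(config: dict) -> list[tuple[str, bytes]]:
        """Fetch each configured source and tag it with its format."""
        records = []
        for source in config["sources"]:
            resp = requests.get(source["url"], timeout=30)
            resp.raise_for_status()
            records.append((source["format"], resp.content))
        return records  # handed off to the indexing component

    # records = harvest(PROJECT_CONFIG)
    ```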

  1. SU-F-T-262: Commissioning Varian Portal Dosimetry for EPID-Based Patient Specific QA in a Non-Aria Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, M; Knutson, N; University of Rhode Island, Kingston, RI

    2016-06-15

    Purpose: To develop an in-house program facilitating a workflow that allows Electronic Portal Imaging Device (EPID) patient-specific quality assurance (QA) measurements to be acquired and analyzed in the Portal Dosimetry application (Varian Medical Systems, Palo Alto, CA) using a non-Aria Record and Verify (R&V) system (MOSAIQ, Elekta, Crawley, UK) to deliver beams in standard clinical treatment mode. Methods: Initial calibration of an in-house software tool includes characterization of EPID dosimetry parameters by importing DICOM images of varying delivered MUs to determine linear mapping factors that convert image pixel values to Varian-defined Calibrated Units (CU). Using this information, the Portal Dose Image Prediction (PDIP) algorithm was commissioned by converting images of various field sizes to output factors using the Eclipse Scripting Application Programming Interface (ESAPI) and converting a delivered configuration fluence to absolute dose units. To verify the algorithm configuration, an integrated image was acquired, exported directly from the R&V client, automatically converted to a compatible, calibrated dosimetric image, and compared to a PDIP-calculated image using Varian’s Portal Dosimetry application. Results: For two C-Series and one TrueBeam Varian linear accelerators, gamma comparisons (global 3%/3 mm) of PDIP-predicted dosimetric images and images converted via the in-house system demonstrated agreement for ≥99% of all pixels, exceeding vendor-recommended commissioning guidelines. Conclusion: Combining a programmatic image-conversion tool with ESAPI allows for an efficient and accurate method of patient IMRT QA incorporating a 3rd-party R&V system.
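
    For context, a global gamma comparison like the 3%/3 mm test above can be sketched as follows. This is a naive brute-force implementation under simplifying assumptions (note the edge wraparound of np.roll), not Varian's algorithm.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, evl, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
        """Global gamma pass rate: fraction of pixels with gamma <= 1."""
        dmax = ref.max()                       # global dose normalization
        r = int(np.ceil(dist_tol_mm / spacing_mm))  # search radius, pixels
        yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
        # squared, tolerance-normalized distance for every offset
        dist2 = ((yy * spacing_mm) ** 2 + (xx * spacing_mm) ** 2) / dist_tol_mm ** 2
        gamma2 = np.full(ref.shape, np.inf)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                shifted = np.roll(evl, (dy, dx), axis=(0, 1))  # wraps at edges
                dose2 = ((shifted - ref) / (dose_tol * dmax)) ** 2
                gamma2 = np.minimum(gamma2, dose2 + dist2[dy + r, dx + r])
        return float((np.sqrt(gamma2) <= 1.0).mean())

    # Invented data: a predicted image and a slightly rescaled "measurement"
    rng = np.random.default_rng(1)
    predicted = rng.random((64, 64))
    print(gamma_pass_rate(predicted, predicted * 1.01, spacing_mm=0.39))
    ```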

  2. Patterns in Patient Access and Utilization of Online Medical Records: Analysis of MyChart

    PubMed Central

    2018-01-01

    Background Electronic patient portals provide a new method for sharing personal medical information with individual patients. Objective Our aim was to review utilization patterns of the largest online patient portal in Canada's largest city. Methods We conducted a 4-year time-trend analysis of aggregated anonymous utilization data of the MyChart patient portal at Sunnybrook Health Sciences Centre in Ontario, Canada, from January 1, 2012, through December 31, 2015. Prespecified analyses examined trends related to day (weekend vs weekday), season (July vs January), year (2012 vs 2015), and an extreme adverse weather event (ice storm of December 20-26, 2013). Primary endpoints included three measures of patient portal activity: registrations, logins, and pageviews. Results We identified 32,325 patients who registered for a MyChart account during the study interval. Time-trend analysis showed no sign of attenuating registrations over time. Logins were frequent, averaged 734 total per day, and showed an increasing trend over time. Pageviews mirrored logins, averaged about 3029 total per day, and equated to about 5 pageviews during the average login. The most popular pageviews were clinical notes, followed by laboratory results and medical imaging reports. All measures of patient activity were lower on weekends compared to weekdays (P<.001) yet showed no significant changes related to seasons or extreme weather. No major security breach, malware attack, or software failure occurred during the study. Conclusions Online patient portals can provide a popular and reliable system for distributing personal medical information to active patients and may merit consideration for hospitals. PMID:29410386
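
    A minimal sketch of the weekday-versus-weekend comparison reported above, on invented daily counts (the study's data are not public):

    ```python
    import numpy as np
    import pandas as pd
    from scipy import stats

    # Invented daily login counts centered on the reported 734/day mean.
    rng = np.random.default_rng(0)
    dates = pd.date_range("2012-01-01", periods=28)
    df = pd.DataFrame({"date": dates,
                       "logins": rng.poisson(734, size=len(dates))})
    df["weekend"] = df["date"].dt.dayofweek >= 5

    print(df.groupby("weekend")["logins"].mean())  # weekday vs weekend means
    weekday = df.loc[~df["weekend"], "logins"]
    weekend = df.loc[df["weekend"], "logins"]
    print(stats.mannwhitneyu(weekday, weekend))    # nonparametric comparison
    ```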

  3. Model My Watershed: A high-performance cloud application for public engagement, watershed modeling and conservation decision support

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Tarboton, D. G.; Horsburgh, J. S.; Mayorga, E.; McFarland, M.; Robbins, A.; Haag, S.; Shokoufandeh, A.; Evans, B. M.; Arscott, D. B.

    2017-12-01

    The Model My Watershed web app (https://app.wikiwatershed.org/) and the BiG-CZ Data Portal (http://portal.bigcz.org/) are web applications that share a common codebase and a common goal: to deliver high-performance discovery, visualization and analysis of geospatial data through an intuitive user interface in the web browser. Model My Watershed (MMW) was designed as a decision support system for watershed conservation implementation. The BiG-CZ Data Portal was designed to provide context and background data for research sites. Users begin by creating an Area of Interest via an automated watershed delineation tool, a free-draw tool, selection of a predefined area such as a county or USGS Hydrologic Unit (HUC), or upload of a custom polygon. Both web apps visualize and provide summary statistics of land use, soil groups, streams, climate and other geospatial information. MMW then allows users to run a watershed model to simulate different scenarios of human impacts on stormwater runoff and water quality. The BiG-CZ Data Portal allows users to search for scientific and monitoring data within the Area of Interest, and also serves as a prototype for the upcoming Monitor My Watershed web app. Both systems integrate with CUAHSI cyberinfrastructure, including visualizing observational data from the CUAHSI Water Data Center and storing user data via CUAHSI HydroShare. Both systems also integrate with the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a system for crowd-sourcing environmental monitoring data using open-source sensor stations (http://envirodiy.org/mayfly/) and based on the Observations Data Model v2.
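
    A sketch of the kind of Area-of-Interest summary statistics both apps compute, here as local raster clipping with rasterio; the land-cover file name and AOI polygon are placeholders, and the real apps perform this server-side against national datasets.

    ```python
    import numpy as np
    import rasterio
    from rasterio.mask import mask

    # Hypothetical inputs: a land-cover GeoTIFF and a GeoJSON-style AOI
    # polygon in the raster's coordinate system.
    aoi = {
        "type": "Polygon",
        "coordinates": [[[-75.2, 39.9], [-75.1, 39.9], [-75.1, 40.0],
                         [-75.2, 40.0], [-75.2, 39.9]]],
    }

    with rasterio.open("nlcd_land_cover.tif") as src:
        clipped, _ = mask(src, [aoi], crop=True, nodata=0)

    # Percentage of each land-cover class inside the AOI
    classes, counts = np.unique(clipped[clipped != 0], return_counts=True)
    for cls, n in zip(classes, counts):
        print(f"land-cover class {cls}: {100 * n / counts.sum():.1f}%")
    ```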

  4. The ADRICOSM STAR GeoPortal

    NASA Astrophysics Data System (ADS)

    Giorgetti, Alessandra; Cesarini, Claudia; Gambetta, Marco; Reseghetti, Franco; Vinci, Matteo

    2010-05-01

    From Stockholm (1972) to Rio de Janeiro (1992) and Johannesburg (2002), environmental protection objectives have been related to the principles of sustainable development. This includes the following important components: participation, information, communication, training (capacity building) and education. Better information ensures more participation from individuals and allows citizens to take part in many different actions that can influence the policy process. Participation in political decisions needs access to reliable and quality-controlled information. The ADRICOSM Portal was developed in order to manage data diversity, provide access to any kind of product, and ensure metadata completeness and accuracy. The product, as defined in ADRICOSM, is anything that can be offered to a client and that might satisfy a want or need. This implied the implementation of services that take into consideration the diversity of the objects to be provided to users: observations, model outputs, maps, etc. The implementation of the portal was based on two metadata levels: 1. the directory level, consisting of broad descriptions of the contents of data sets, used to locate data sets of potential interest; and 2. the data level, consisting of the actual data objects. The portal was developed as a simplified front end for the partners' data management systems, with emphasis on federated access points focused on thematic aspects. This was based on the idea that specialized customer-related access points are better run by delegated teams of experts who know the needs of different customers and can define the user software most suited to them. The data management systems provide facilities for two data tracks, one in real time (or near-real time) and one in delayed mode. Both tracks are based on the same data sources and transmission systems, but the data follow different routes and are processed differently depending on user requirements. The real-time data and model products are checked after first use, for upgrading and additional quality control, to ensure that all relevant data are preserved for final archival. In summary, the ADRICOSM GeoPortal has two objectives: to support data exchange in ADRICOSM STAR and the implementation of tools that facilitate access to data and information through thematic portals, converting individual data-provider systems into a federation; and to provide users with the necessary information on the content of data that can be accessed in the federated systems. The portal is organised in three modules plus a catalogue. Module 1 gives information on the project and its objectives, the data policy, the reasons to provide products and services, and links to the themes; skilled users can jump to the data/products/services provider by clicking on the geographical map. Module 2 provides more specific information on the centres providing data/products/services. Module 3 provides direct access to partner products, containing a short description of each product and a link. Module 4 contains the catalogue of each partner product and is based on What, Where, When and How.

  5. Design and implementation of a portal for the medical equipment market: MEDICOM.

    PubMed

    Palamas, S; Kalivas, D; Panou-Diamandi, O; Zeelenberg, C; van Nimwegen, C

    2001-01-01

    The MEDICOM (Medical Products Electronic Commerce) Portal provides the electronic means for medical-equipment manufacturers to communicate online with their customers while supporting the Purchasing Process and Post Market Surveillance. The Portal offers a powerful Internet-based search tool for finding medical products and manufacturers. Its main advantage is the fast, reliable and up-to-date retrieval of information while eliminating all unrelated content that a general-purpose search engine would retrieve. The Universal Medical Device Nomenclature System (UMDNS) registers all products. The Portal accepts end-user requests and generates a list of results containing text descriptions of devices, UMDNS attribute values, and links to manufacturer Web pages and online catalogues for access to more-detailed information. Device short descriptions are provided by the corresponding manufacturer. The Portal offers technical support for integration of the manufacturers' Web sites with itself. The network of the Portal and the connected manufacturers' sites is called the MEDICOM system. To establish an environment hosting all the interactions of consumers (health care organizations and professionals) and providers (manufacturers, distributors, and resellers of medical devices). The Portal provides the end-user interface, implements system management, and supports database compatibility. The Portal hosts information about the whole MEDICOM system (Common Database) and summarized descriptions of medical devices (Short Description Database); the manufacturers' servers present extended descriptions. The Portal provides end-user profiling and registration, an efficient product-searching mechanism, bulletin boards, links to on-line libraries and standards, on-line information for the MEDICOM system, and special messages or advertisements from manufacturers. Platform independence and interoperability characterize the system design. Relational Database Management Systems are used for the system's databases. The end-user interface is implemented using HTML, Javascript, Java applets, and XML documents. Communication between the Portal and the manufacturers' servers is implemented using a CORBA interface. Remote administration of the Portal is enabled by dynamically-generated HTML interfaces based on XML documents. A representative group of users evaluated the system. The aim of the evaluation was validation of the usability of all of MEDICOM's functionality. The evaluation procedure was based on ISO/IEC 9126 Information technology - Software product evaluation - Quality characteristics and guidelines for their use. The overall user evaluation of the MEDICOM system was very positive. The MEDICOM system was characterized as an innovative concept that brings significant added value to medical-equipment commerce. The eventual benefits of the MEDICOM system are (a) establishment of a worldwide-accessible marketplace between manufacturers and health care professionals that provides up-to-date and high-quality product information in an easy and friendly way and (b) enhancement of the efficiency of marketing procedures and after-sales support.

  6. Design and Implementation of a Portal for the Medical Equipment Market: MEDICOM

    PubMed Central

    Kalivas, Dimitris; Panou-Diamandi, Ourania; Zeelenberg, Cees; van Nimwegen, Chris

    2001-01-01

    Background The MEDICOM (Medical Products Electronic Commerce) Portal provides the electronic means for medical-equipment manufacturers to communicate online with their customers while supporting the Purchasing Process and Post Market Surveillance. The Portal offers a powerful Internet-based search tool for finding medical products and manufacturers. Its main advantage is the fast, reliable and up-to-date retrieval of information while eliminating all unrelated content that a general-purpose search engine would retrieve. The Universal Medical Device Nomenclature System (UMDNS) registers all products. The Portal accepts end-user requests and generates a list of results containing text descriptions of devices, UMDNS attribute values, and links to manufacturer Web pages and online catalogues for access to more-detailed information. Device short descriptions are provided by the corresponding manufacturer. The Portal offers technical support for integration of the manufacturers' Web sites with itself. The network of the Portal and the connected manufacturers' sites is called the MEDICOM system. Objective To establish an environment hosting all the interactions of consumers (health care organizations and professionals) and providers (manufacturers, distributors, and resellers of medical devices). Methods The Portal provides the end-user interface, implements system management, and supports database compatibility. The Portal hosts information about the whole MEDICOM system (Common Database) and summarized descriptions of medical devices (Short Description Database); the manufacturers' servers present extended descriptions. The Portal provides end-user profiling and registration, an efficient product-searching mechanism, bulletin boards, links to on-line libraries and standards, on-line information for the MEDICOM system, and special messages or advertisements from manufacturers. Platform independence and interoperability characterize the system design. Relational Database Management Systems are used for the system's databases. The end-user interface is implemented using HTML, Javascript, Java applets, and XML documents. Communication between the Portal and the manufacturers' servers is implemented using a CORBA interface. Remote administration of the Portal is enabled by dynamically-generated HTML interfaces based on XML documents. A representative group of users evaluated the system. The aim of the evaluation was validation of the usability of all of MEDICOM's functionality. The evaluation procedure was based on ISO/IEC 9126 Information technology - Software product evaluation - Quality characteristics and guidelines for their use. Results The overall user evaluation of the MEDICOM system was very positive. The MEDICOM system was characterized as an innovative concept that brings significant added value to medical-equipment commerce. Conclusions The eventual benefits of the MEDICOM system are (a) establishment of a worldwide-accessible marketplace between manufacturers and health care professionals that provides up-to-date and high-quality product information in an easy and friendly way and (b) enhancement of the efficiency of marketing procedures and after-sales support. PMID:11772547

  7. Building Portals for Evidence-Informed Education: Lessons from the Dead. A Case Study of the Development of a National Portal Intended to Enhance Evidence Informed Professionalism in Education

    ERIC Educational Resources Information Center

    Blamires, Mike

    2015-01-01

    Access to high-quality evidence has been cited as central to the enhancement of teacher professionalism. This is not a given and teacher access to high-quality evidence requires significant planning and effort. This paper considers the creation of quality-assured reviews to build sustainable quality-assured evidence sources that inform the…

  8. Human liver segments: role of cryptic liver lobes and vascular physiology in the development of liver veins and left-right asymmetry.

    PubMed

    Hikspoors, Jill P J M; Peeters, Mathijs M J P; Kruepunga, Nutmethee; Mekonen, Hayelom K; Mommen, Greet M C; Köhler, S Eleonore; Lamers, Wouter H

    2017-12-07

    Couinaud based his well-known subdivision of the liver into (surgical) segments on the branching order of portal veins and the location of hepatic veins. However, both segment boundaries and number remain controversial due to an incomplete understanding of the role of liver lobes and vascular physiology on hepatic venous development. Human embryonic livers (5-10 weeks of development) were visualized with Amira 3D-reconstruction and Cinema 4D-remodeling software. Starting at 5 weeks, the portal and umbilical veins sprouted portal-vein branches that, at 6.5 weeks, had been pruned to 3 main branches in the right hemi-liver, whereas all (>10) persisted in the left hemi-liver. The asymmetric branching pattern of the umbilical vein resembled that of a "distributing" vessel, whereas the more symmetric branching of the portal trunk resembled a "delivering" vessel. At 6 weeks, 3-4 main hepatic-vein outlets drained into the inferior caval vein, of which that draining the caudate lobe formed the intrahepatic portion of the caval vein. More peripherally, 5-6 major tributaries drained both dorsolateral regions and the left and right ventromedial regions, implying a "crypto-lobar" distribution. Lobar boundaries, even in non-lobated human livers, and functional vascular requirements account for the predictable topography and branching pattern of the liver veins, respectively.

  9. The Enzyme Portal: a case study in applying user-centred design methods in bioinformatics.

    PubMed

    de Matos, Paula; Cham, Jennifer A; Cao, Hong; Alcántara, Rafael; Rowland, Francis; Lopez, Rodrigo; Steinbeck, Christoph

    2013-03-20

    User-centred design (UCD) is a type of user interface design in which the needs and desires of users are taken into account at each stage of the design process for a service or product, often for software applications and websites. Its goal is to facilitate the design of software that is both useful and easy to use. To achieve this, you must characterise users' requirements, design suitable interactions to meet their needs, and test your designs using prototypes and real-life scenarios. For bioinformatics, there is little practical information available regarding how to carry out UCD in practice. To address this we describe a complete, multi-stage UCD process used for creating a new bioinformatics resource for integrating enzyme information, called the Enzyme Portal (http://www.ebi.ac.uk/enzymeportal). This freely available service mines and displays data about proteins with enzymatic activity from public repositories via a single search, and includes biochemical reactions, biological pathways, small-molecule chemistry, disease information, 3D protein structures and relevant scientific literature. We employed several UCD techniques, including persona development, interviews, 'canvas sort' card sorting, user workflows, usability testing and others. Our hope is that this case study will motivate the reader to apply similar UCD approaches to their own software design for bioinformatics. Indeed, we found the benefits included more effective decision-making for design ideas and technologies; enhanced team-working and communication; cost effectiveness; and ultimately a service that more closely meets the needs of our target audience.

  10. VRE4EIC: A Reference Architecture and Components for Research Access

    NASA Astrophysics Data System (ADS)

    Bailo, Daniele; Jeffery, Keith G.; Atakan, Kuvvet; Harrison, Matt

    2017-04-01

    VRE4EIC (www.vre4eic.eu) is an EC H2020 project with the objective of providing a reference architecture and components for a VRE (Virtual Research Environment). SGs (Science Gateways) in North America and VLs (Virtual Laboratories) in Australasia are similar - but significantly different - concepts. A VRE provides not only access to ICT services, data, software components and equipment, but also a collaborative working environment for cooperation, and supports the research lifecycle from idea to publication. Europe has a large number of RIs (Research Infrastructures); the major ones are coordinated and planned through the ESFRI (European Strategy Forum on Research Infrastructures) roadmap. Most RIs - such as EPOS - provide a user-interface portal function, ranging from (1) a simple list of assets (such as services, datasets, software components, workflows, equipment, experts... although many provide only information about data) with URLs upon which the user can click to download; (2) to an end-user facility for constructing queries to find relevant assets and subsets of them, more or less integrated as a downloaded combined dataset; (3) and, in a few cases, to a facility for constructing workflows to achieve the scientific objective. The portal has the scope of the individual RI. The aim of VRE4EIC is to provide a reference architecture, software components and a prototype implementation of a VRE which allows user access and all the portal functions (and more) not only to an individual RI - such as EPOS - but across RIs, thus encouraging multidisciplinary research. Two RIs, EPOS and ENVRIplus (itself spanning 21 RIs), are represented within the project as requirements stakeholders, validators of the architecture and evaluators of the prototype system developed. The characterisation of many more RIs - and their requirements - has been done to ensure wide applicability. The virtualisation across RIs is achieved by using a rich metadata catalog based on CERIF (Common European Research Information Format: an EU Recommendation to Member States, supported, developed and promoted by euroCRIS, www.eurocris.org). The VRE4EIC catalog system harvests from individual RI catalogs (with conversion, since they use many different metadata formats) to give the user of VRE4EIC a 'canonical view' over the RIs and their assets. The VRE4EIC user interface provides portal functions for each and all RIs, but also a workflow construction facility. The project expects the RIs to use middleware developed in other projects to facilitate workflow deployment across the eIs (e-Infrastructures) such as GEANT, EUDAT, EGI and OpenAIRE, and will itself use the same mechanisms. After 15 months of the project we have validated the requirements from the RIs, defined the architecture and started work on the metadata mapping and conversion. The intention is to have the prototype at M24 for evaluation by the RI partners (and some external RIs), leading to a refined architecture and software stack for production use after M36.
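
    A toy illustration of the canonical-view idea: per-format mappers convert heterogeneous RI records into one set of canonical fields. The field names below are invented and far simpler than real CERIF.

    ```python
    # Hypothetical harvest-and-map step in the spirit of the VRE4EIC
    # canonical view; real CERIF mappings are far richer than this.
    def map_dublin_core(record: dict) -> dict:
        """Map a Dublin Core-style record to the canonical view."""
        return {
            "title": record.get("dc:title", ""),
            "creator": record.get("dc:creator", ""),
            "identifier": record.get("dc:identifier", ""),
        }

    def map_iso19115(record: dict) -> dict:
        """Map a (flattened) ISO 19115-style record to the canonical view."""
        return {
            "title": record.get("citation_title", ""),
            "creator": record.get("responsible_party", ""),
            "identifier": record.get("file_identifier", ""),
        }

    MAPPERS = {"dc": map_dublin_core, "iso19115": map_iso19115}

    def harvest(records: list[tuple[str, dict]]) -> list[dict]:
        """Convert heterogeneous RI records into one canonical catalog."""
        return [MAPPERS[fmt](rec) for fmt, rec in records]

    print(harvest([("dc", {"dc:title": "Seismic catalogue"}),
                   ("iso19115", {"citation_title": "GPS velocities"})]))
    ```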

  11. An Open Data Platform in the framework of the EGI-LifeWatch Competence Center

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Yaiza Rodríguez Marrero, Ana

    2016-04-01

    The working pilot of an Open Data Platform supporting the full data cycle in research is presented. It aims to preserve knowledge explicitly, starting with the description of the case studies, and integrating data and software management and preservation on an equal basis. The uninterrupted support chain starts at the data-acquisition level and extends to support for reuse and publication in an open framework, providing integrity and provenance controls. The LifeWatch Open Science Framework is a pilot web portal, developed in collaboration with different commercial companies, that enriches and integrates different data-lifecycle tools in order to address the management of the different steps: data planning, gathering, storing, curation, preservation, sharing, discovering, etc. To achieve this goal, the platform includes the following features. Data Management Planning: a tool to set up a structure for the data, including what data will be generated and how they will be exploited, re-used, curated, preserved, etc.; it takes a semantic approach, including references to ontologies to express what data will be gathered. Close to instrumentation: the portal includes a distributed storage system that can be used both for storing data from instruments and for output data from analyses; all of that data can be shared. Analysis: resources from the EGI Federated Cloud are accessible within the portal, so that users can exploit computing resources to perform analyses and other processes, including workflows. Preservation: data can be preserved in different systems, and DOIs can be minted not only for datasets but also for software, DMPs, etc. The presentation will show the different components of the framework as well as how it can be extrapolated to other communities.

  12. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand across public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, originally written for Amazon EC2, to allow the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack the infrastructure has been kept more general, permitting other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to obtain a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach to the existing Grid solution. License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.

  13. Small animal magnetic resonance imaging: an efficient tool to assess liver volume and intrahepatic vascular anatomy.

    PubMed

    Melloul, Emmanuel; Raptis, Dimitri A; Boss, Andreas; Pfammater, Thomas; Tschuor, Christoph; Tian, Yinghua; Graf, Rolf; Clavien, Pierre-Alain; Lesurtel, Mickael

    2014-04-01

    To develop a noninvasive technique to assess liver volumetry and intrahepatic portal vein anatomy in a mouse model of liver regeneration. Fifty-two C57BL/6 male mice underwent magnetic resonance imaging (MRI) of the liver using a 4.7 T small-animal MRI system after no treatment, 70% partial hepatectomy (PH), or selective portal vein embolization. The protocol consisted of the following sequences: a three-dimensional-encoded spoiled gradient-echo sequence (repetition time/echo time 15/2.7 ms, flip angle 20°) for volumetry, and a two-dimensional-encoded time-of-flight angiography sequence (repetition time/echo time 18/6.4 ms, flip angle 80°) for vessel visualization. Liver volume and portal vein segmentation were performed using dedicated postprocessing software. In animals with portal vein embolization, portography served as the reference standard. True liver volume was measured after sacrificing the animals. Measurements were carried out by two independent observers with subsequent analysis by the Cohen κ-test for interobserver agreement. MRI liver volumetry highly correlated with the true liver volume measured using a conventional method in both the untreated liver and the liver remnant after 70% PH, with a high interobserver correlation coefficient of 0.94 (95% confidence interval, 0.80-0.98 for untreated liver [P < 0.001] and 0.90-0.97 after 70% PH [P < 0.001]). The diagnostic accuracy of magnetic resonance angiography for the occlusion of one branch of the portal vein was 0.95 (95% confidence interval, 0.84-1). The level of agreement between the two observers for the description of intrahepatic vascular anatomy was excellent (Cohen κ value = 0.925). This protocol may be used for noninvasive liver volumetry and visualization of portal vein anatomy in mice. It will support the dynamic study of new strategies to enhance liver regeneration in vivo. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Implementation of an Enterprise Information Portal (EIP) in the Loyola University Health System

    PubMed Central

    Price, Ronald N.; Hernandez, Kim

    2001-01-01

    Loyola University Chicago Stritch School of Medicine and Loyola University Medical Center have long histories in the development of applications to support the institutions' missions of education, research and clinical care. In late 1998, the institutions' application development group undertook an ambitious program to re-architect more than 10 years of legacy application development (30+ core applications) into a unified World Wide Web (WWW) environment. The primary project objectives were to construct an environment that would support the rapid development of n-tier, web-based applications while providing standard methods for user authentication/validation, security/access control and definition of a user's organizational context. The project's efforts resulted in Loyola's Enterprise Information Portal (EIP), which meets the aforementioned objectives. This environment: 1) allows access to other vertical Intranet portals (e.g., electronic medical record, patient satisfaction information and faculty effort); 2) supports end-user desktop customization; and 3) provides a means for a standardized application “look and feel.” The portal was constructed utilizing readily available hardware and software. Server hardware consists of multiprocessor (Intel Pentium 500 MHz) Compaq 6500 servers with one gigabyte of random access memory and 75 gigabytes of hard disk storage. Microsoft SQL Server was selected to house the portal's internal security data structures. Netscape Enterprise Server was selected for the web server component of the environment, and Allaire's ColdFusion was chosen for the access and application tiers. Total cost for the portal environment was less than $40,000. User data storage is accomplished through two Microsoft SQL Servers and an existing Sun Microsystems enterprise server with eight processors and 750 gigabytes of disk storage running the Sybase relational database manager. Total storage capacity for all systems exceeds one terabyte. In the past 12 months, the EIP has supported development of more than 88 applications and is utilized by more than 2,200 users.

  15. SU-D-210-06: Feasibility for Monitoring the Head of the Pancreas Motion Through a Surrogate Using Ultrasound During Radiation Therapy Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Omari, E; Noid, G; Ehlers, C

    Purpose: Substantial target motion during the delivery of radiation therapy (RT) for pancreatic cancer is well recognized as a major limiting factor on RT effectiveness. The aim of this work is to monitor intra-fractional motion of the pancreas using ultrasound during RT delivery. Methods: Transabdominal ultrasound B-mode images were collected from 5 volunteers using a research version of the Clarity Autoscan System (Elekta). The autoscan transducer with a center frequency of 5 MHz was utilized for the scans. Imaging parameters were adjusted to acquire images at the desired depth with good contrast and a wide sweep angle. Since well-defined boundaries of the pancreas can be difficult to find on ultrasound B-mode images, the portal vein was selected as a surrogate for motion estimation of the head of the pancreas. The selection was due to its anatomical location posterior to the neck of the pancreas and its close proximity to the pancreas head. The portal vein was contoured on the ultrasound images acquired during simulation using the Clarity Research AFC Workstation software. Volunteers were set up in a manner similar to the simulation for their monitoring session, and the ultrasound transducer was mounted on an arm fixed to the couch. A video segment of the portal vein motion was captured. Results: The portal vein was visualized and segmented. Successful monitoring sessions of the portal vein were observed. In addition, our results showed that the ultrasound transducer itself reduces breathing-related motion, analogous to the use of a compression plate to suppress respiration motion during thorax or abdominal irradiation. Conclusion: We demonstrate the feasibility of tracking the pancreas through localization of the portal vein using abdominal ultrasound. This will allow real-time tracking of intra-fractional motion to justify PTV margins and to account for unusual motions, thus improving normal-tissue sparing. This research was funded in part by Elekta Inc.

  16. Development of Watch Schedule Using Rules Approach

    NASA Astrophysics Data System (ADS)

    Jurkevicius, Darius; Vasilecas, Olegas

    The software for schedule creation and optimization solves a difficult, important and practical problem. The proposed solution is an online employee portal where administrator users can create and manage watch schedules and employee requests. Each employee can log in with his/her own account and see his/her assignments, manage requests, etc. Employees designated as administrators can perform the employee scheduling online, manage requests, etc. This scheduling software allows users not only to see the initial and optimized watch schedule in a simple and understandable form, but also to create special rules and criteria and to input their business rules. Using these rules, the system automatically generates the watch schedule.
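
    A minimal sketch of rules-driven schedule generation as described above; the rules, employees and greedy strategy are illustrative, not the system's actual algorithm.

    ```python
    # Toy rules-based watch scheduler: honour day-off requests, forbid
    # back-to-back watches, and balance the load greedily.
    employees = ["Alice", "Bob", "Carol", "Dave"]
    requests_off = {"Bob": {2}, "Carol": {0}}  # employee -> unavailable days

    def violates_rules(emp: str, day: int, schedule: list[str]) -> bool:
        if day in requests_off.get(emp, set()):
            return True                    # rule: honour day-off requests
        if schedule and schedule[-1] == emp:
            return True                    # rule: no back-to-back watches
        return False

    schedule: list[str] = []
    for day in range(7):
        # prefer the employee with the fewest assignments so far
        for emp in sorted(employees, key=schedule.count):
            if not violates_rules(emp, day, schedule):
                schedule.append(emp)
                break

    print(schedule)  # one watch assignment per day
    ```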

  17. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume, however, that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling such researchers to profit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, and the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.

  18. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

    Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of the available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers, who lack these skills. A portal enabling such researchers to profit from the new technologies is still missing. Results We designed biowep, a web based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, and the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  19. Patterns in Patient Access and Utilization of Online Medical Records: Analysis of MyChart.

    PubMed

    Redelmeier, Donald A; Kraus, Nicole C

    2018-02-06

    Electronic patient portals provide a new method for sharing personal medical information with individual patients. Our aim was to review utilization patterns of the largest online patient portal in Canada's largest city. We conducted a 4-year time-trend analysis of aggregated anonymous utilization data of the MyChart patient portal at Sunnybrook Health Sciences Centre in Ontario, Canada, from January 1, 2012, through December 31, 2015. Prespecified analyses examined trends related to day (weekend vs weekday), season (July vs January), year (2012 vs 2015), and an extreme adverse weather event (ice storm of December 20-26, 2013). Primary endpoints included three measures of patient portal activity: registrations, logins, and pageviews. We identified 32,325 patients who registered for a MyChart account during the study interval. Time-trend analysis showed no sign of attenuating registrations over time. Logins were frequent, averaged 734 total per day, and showed an increasing trend over time. Pageviews mirrored logins, averaged about 3029 total per day, and equated to about 5 pageviews during the average login. The most popular pageviews were clinical notes, followed by laboratory results and medical imaging reports. All measures of patient activity were lower on weekends compared to weekdays (P<.001) yet showed no significant changes related to seasons or extreme weather. No major security breach, malware attack, or software failure occurred during the study. Online patient portals can provide a popular and reliable system for distributing personal medical information to active patients and may merit consideration for hospitals.

  20. Screening portal, system and method of using same

    DOEpatents

    Linker, Kevin L.; Hunter, John A.; Brusseau, Charles A.

    2013-04-30

    A portal, system and method for screening an object for a target substance is provided. The portal includes an inflatable bladder expandable to form a test space for receiving the object and a plurality of nozzles positioned about the inflatable bladder. The nozzles are in fluid communication with a fluid source for directing air over the object whereby samples are removed from the object for examination. A collector is operatively connected to the inflatable bladder for collecting the samples removed from the object. A detector is operatively connected to the collector for examining the removed samples for the presence of the target substance. At least one preconcentrator may be operatively connected to the collector for concentrating the samples collected thereby.

  1. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.
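
    As a toy illustration of the provide-register-discover-consume paradigm described above, the minimal Python sketch below models a service registry in which a provider publishes a data service and a consumer discovers and calls it. The registry class, service name, and endpoint are invented placeholders, not the actual LANCE or GES DISC interfaces.

        # Hypothetical illustration of provide-register-discover-consume;
        # names and endpoints are invented, not the real AIRS NRT services.
        class ServiceRegistry:
            def __init__(self):
                self._services = {}

            def register(self, name, metadata, endpoint):
                # "Provide" and "register": a provider publishes its service.
                self._services[name] = {"metadata": metadata, "endpoint": endpoint}

            def discover(self, keyword):
                # "Discover": consumers search the catalogue by keyword.
                return {n: r for n, r in self._services.items()
                        if keyword.lower() in r["metadata"].lower()}

        registry = ServiceRegistry()
        registry.register(
            name="airs-nrt-granules",
            metadata="AIRS near-real-time atmospheric soundings, low latency",
            endpoint=lambda bbox: f"granules intersecting {bbox}",  # stand-in call
        )

        # "Consume": call the endpoint of every matching service.
        for name, rec in registry.discover("near-real-time").items():
            print(name, "->", rec["endpoint"]((-90, 0, -60, 30)))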

  2. WormQTL—public archive and analysis web portal for natural variation data in Caenorhabditis spp

    PubMed Central

    Snoek, L. Basten; Van der Velde, K. Joeri; Arends, Danny; Li, Yang; Beyer, Antje; Elvin, Mark; Fisher, Jasmin; Hajnal, Alex; Hengartner, Michael O.; Poulin, Gino B.; Rodriguez, Miriam; Schmid, Tobias; Schrimpf, Sabine; Xue, Feng; Jansen, Ritsert C.; Kammenga, Jan E.; Swertz, Morris A.

    2013-01-01

    Here, we present WormQTL (http://www.wormqtl.org), an easily accessible database enabling search, comparative analysis and meta-analysis of all data on variation in Caenorhabditis spp. Over the past decade, Caenorhabditis elegans has become instrumental for molecular quantitative genetics and the systems biology of natural variation. These efforts have resulted in a wealth of phenotypic, high-throughput molecular and genotypic data across different developmental worm stages and environments in hundreds of C. elegans strains. WormQTL provides a workbench of analysis tools for genotype–phenotype linkage and association mapping based on, but not limited to, R/qtl (http://www.rqtl.org). All data can be uploaded and downloaded using simple delimited text or Excel formats and are accessible via a public web user interface for biologists, and via R statistical and web service interfaces for bioinformaticians, based on the open source MOLGENIS and xQTL workbench software. WormQTL welcomes data submissions from other worm researchers. PMID:23180786
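
    The genotype-phenotype linkage scans that such a workbench automates can be illustrated with a minimal single-marker regression; the sketch below uses synthetic data in Python and is not the R/qtl implementation itself.

        # Single-marker association scan on synthetic data (illustration only).
        import numpy as np

        rng = np.random.default_rng(0)
        n_strains, n_markers = 100, 50
        G = rng.integers(0, 2, size=(n_strains, n_markers))      # 0/1 genotypes
        y = G[:, 7] * 1.5 + rng.normal(size=n_strains)           # QTL at marker 7

        def marker_scan(G, y):
            """Return |t| of the regression slope for each marker; a large
            value suggests linkage between marker and phenotype."""
            stats = []
            yc = y - y.mean()
            for j in range(G.shape[1]):
                x = G[:, j] - G[:, j].mean()
                slope = (x @ yc) / (x @ x)
                resid = yc - slope * x
                se = np.sqrt((resid @ resid) / (len(y) - 2) / (x @ x))
                stats.append(abs(slope / se))
            return np.array(stats)

        print("strongest marker:", marker_scan(G, y).argmax())   # expect 7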

  3. WormQTL--public archive and analysis web portal for natural variation data in Caenorhabditis spp.

    PubMed

    Snoek, L Basten; Van der Velde, K Joeri; Arends, Danny; Li, Yang; Beyer, Antje; Elvin, Mark; Fisher, Jasmin; Hajnal, Alex; Hengartner, Michael O; Poulin, Gino B; Rodriguez, Miriam; Schmid, Tobias; Schrimpf, Sabine; Xue, Feng; Jansen, Ritsert C; Kammenga, Jan E; Swertz, Morris A

    2013-01-01

    Here, we present WormQTL (http://www.wormqtl.org), an easily accessible database enabling search, comparative analysis and meta-analysis of all data on variation in Caenorhabditis spp. Over the past decade, Caenorhabditis elegans has become instrumental for molecular quantitative genetics and the systems biology of natural variation. These efforts have resulted in a wealth of phenotypic, high-throughput molecular and genotypic data across different developmental worm stages and environments in hundreds of C. elegans strains. WormQTL provides a workbench of analysis tools for genotype-phenotype linkage and association mapping based on, but not limited to, R/qtl (http://www.rqtl.org). All data can be uploaded and downloaded using simple delimited text or Excel formats and are accessible via a public web user interface for biologists, and via R statistical and web service interfaces for bioinformaticians, based on the open source MOLGENIS and xQTL workbench software. WormQTL welcomes data submissions from other worm researchers.

  4. A Randomized Trial Comparing Classical Participatory Design to VandAID, an Interactive CrowdSourcing Platform to Facilitate User-centered Design.

    PubMed

    Dufendach, Kevin R; Koch, Sabine; Unertl, Kim M; Lehmann, Christoph U

    2017-10-26

    Early involvement of stakeholders in the design of medical software is particularly important due to the need to incorporate complex knowledge and actions associated with clinical work. Standard user-centered design methods include focus groups and participatory design sessions with individual stakeholders, which generally limit user involvement to a small number of individuals due to the significant time investments from designers and end users. The goal of this project was to reduce the effort required for end users to participate in co-design of a software user interface by developing an interactive web-based crowdsourcing platform. In a randomized trial, we compared a new web-based crowdsourcing platform to standard participatory design sessions. We developed an interactive, modular platform that allows responsive remote customization and design feedback on a visual user interface based on user preferences. The responsive canvas is a dynamic HTML template that responds in real time to user preference selections. Upon completion, the design team can view the user's interface creations through an administrator portal and download the structured selections through a REDCap interface. We have created a software platform that allows users to customize a user interface and see the results of that customization in real time, receiving immediate feedback on the impact of their design choices. Neonatal clinicians used the new platform to successfully design and customize a neonatal handoff tool. They received no specific instruction and yet were able to use the software easily and reported high usability. VandAID, a new web-based crowdsourcing platform, can involve multiple users in user-centered design simultaneously and provides a means of obtaining design feedback remotely. The software can provide design feedback at any stage in the design process, but it will be of greatest utility for specifying user requirements and evaluating iterative designs with multiple options.
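
    The "responsive canvas" idea can be sketched in a few lines: user preference selections are substituted into an HTML template and the page is re-rendered immediately, so each design choice gives instant visual feedback. The template and field names below are hypothetical, not the actual VandAID implementation.

        # Hypothetical responsive-canvas sketch: preferences -> rendered HTML.
        from string import Template

        CANVAS = Template("""
        <div class="handoff" style="font-size:${font_size}px">
          <h2>${section_title}</h2>
          <ul>${vitals_block}</ul>
        </div>
        """)

        def render(preferences):
            # Re-render the canvas whenever a selection changes.
            return CANVAS.substitute(preferences)

        selections = {"font_size": 14,
                      "section_title": "Neonatal Handoff",
                      "vitals_block": "<li>HR</li><li>SpO2</li>"}
        print(render(selections))       # immediate feedback on the design
        selections["font_size"] = 18    # the user tweaks one preference...
        print(render(selections))       # ...and sees the impact in real time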

  5. Integrating thematic web portal capabilities into the NASA Earthdata Web Infrastructure

    NASA Astrophysics Data System (ADS)

    Wong, M. M.; McLaughlin, B. D.; Huang, T.; Baynes, K.

    2015-12-01

    The National Aeronautics and Space Administration (NASA) acquires and distributes an abundance of Earth science data on a daily basis to a diverse user community worldwide. To assist the scientific community and general public in achieving a greater understanding of the interdisciplinary nature of Earth science and of key environmental and climate change topics, the NASA Earthdata web infrastructure is integrating new methods of presenting and providing access to Earth science information, data, research and results. This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data, and a dashboard showing sea level change indicators. Earthdata is a part of the Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources: satellites, aircraft, field measurements, and various other programs. It comprises twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access clients (Reverb and Earthdata Search), a dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative, and a host of other discipline-specific data discovery, data access, data subsetting and visualization tools.

  6. Using the CPTAC Assay Portal to identify and implement highly characterized targeted proteomics assays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.

    2016-02-12

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The positions of the peptide analytes for which assays are available are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and post-translational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.
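
    The fielded queries described above can be pictured as filters over a catalogue of assay records; the minimal Python sketch below uses invented records and field names rather than actual CPTAC portal content or its API.

        # Hypothetical in-memory stand-in for portal queries (not the real API).
        assays = [
            {"protein": "TP53", "peptide": "PEPTIDEK", "pathway": "apoptosis",
             "chromosome": "17", "sop": "SOP-001"},
            {"protein": "EGFR", "peptide": "SAMPLEPK", "pathway": "MAPK signaling",
             "chromosome": "7", "sop": "SOP-002"},
        ]

        def find_assays(catalogue, pathway=None, chromosome=None):
            """Return assays matching every supplied query field."""
            hits = catalogue
            if pathway:
                hits = [a for a in hits if pathway.lower() in a["pathway"].lower()]
            if chromosome:
                hits = [a for a in hits if a["chromosome"] == chromosome]
            return hits

        for a in find_assays(assays, pathway="apoptosis"):
            print(a["protein"], a["peptide"], a["sop"])   # reagents/SOP lookup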

  7. Communication library for run-time visualization of distributed, asynchronous data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowlan, J.; Wightman, B.T.

    1994-04-01

    In this paper we present a method for collecting and visualizing data generated by a parallel computational simulation during run time. Data distributed across multiple processes is sent across parallel communication lines to a remote workstation, which sorts and queues the data for visualization. We have implemented our method in a set of tools called PORTAL (for Parallel aRchitecture data-TrAnsfer Library). The tools comprise generic routines for sending data from a parallel program (callable from either C or FORTRAN), a semi-parallel communication scheme currently built upon Unix Sockets, and a real-time connection to the scientific visualization program AVS. Our method is most valuable when used to examine large datasets that can be efficiently generated and do not need to be stored on disk. The PORTAL source libraries, detailed documentation, and a working example can be obtained by anonymous ftp from info.mcs.anl.gov, from the file portal.tar.Z in the directory pub/portal.
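
    The pattern PORTAL implements, in which distributed senders tag data and a remote workstation collects, sorts, and queues it for visualization, can be sketched with stock TCP sockets. The message framing below (a packed timestep/value pair) is invented for illustration and is not the actual PORTAL wire format.

        # Toy sender/receiver for run-time visualization data (framing invented).
        import socket, struct, threading

        def sender(addr, rank, steps):
            # Worker side: tag each value with its timestep and send it on.
            with socket.create_connection(addr) as s:
                for t in steps:
                    value = float(rank * 100 + t)            # stand-in field data
                    s.sendall(struct.pack("!if", t, value))  # (timestep, payload)

        def receiver(srv, n_messages):
            # Workstation side: collect frames, sort by timestep, then display.
            frames = []
            conn, _ = srv.accept()
            with conn:
                for _ in range(n_messages):
                    t, v = struct.unpack("!if", conn.recv(8))
                    frames.append((t, v))
            for t, v in sorted(frames):
                print(f"step {t}: value {v}")

        srv = socket.create_server(("127.0.0.1", 5055))      # listen first
        rx = threading.Thread(target=receiver, args=(srv, 3))
        rx.start()
        sender(("127.0.0.1", 5055), rank=1, steps=[2, 0, 1]) # out-of-order send
        rx.join()
        srv.close()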

  8. Using the CPTAC Assay Portal to Identify and Implement Highly Characterized Targeted Proteomics Assays.

    PubMed

    Whiteaker, Jeffrey R; Halusa, Goran N; Hoofnagle, Andrew N; Sharma, Vagisha; MacLean, Brendan; Yan, Ping; Wrobel, John A; Kennedy, Jacob; Mani, D R; Zimmerman, Lisa J; Meyer, Matthew R; Mesri, Mehdi; Boja, Emily; Carr, Steven A; Chan, Daniel W; Chen, Xian; Chen, Jing; Davies, Sherri R; Ellis, Matthew J C; Fenyö, David; Hiltke, Tara; Ketchum, Karen A; Kinsinger, Chris; Kuhn, Eric; Liebler, Daniel C; Liu, Tao; Loss, Michael; MacCoss, Michael J; Qian, Wei-Jun; Rivers, Robert; Rodland, Karin D; Ruggles, Kelly V; Scott, Mitchell G; Smith, Richard D; Thomas, Stefani; Townsend, R Reid; Whiteley, Gordon; Wu, Chaochao; Zhang, Hui; Zhang, Zhen; Rodriguez, Henry; Paulovich, Amanda G

    2016-01-01

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The positions of the peptide analytes for which assays are available are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and posttranslational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.

  9. Understanding cancer survivors' information needs and information-seeking behaviors for complementary and alternative medicine from short- to long-term survival: a mixed-methods study.

    PubMed

    Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T Elizabeth

    2018-01-01

    The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. A mixed-methods approach was used with cancer survivors from the "Assessment of Patients' Experience with Cancer Care" 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families' and friends' provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients.

  10. Understanding cancer survivors’ information needs and information-seeking behaviors for complementary and alternative medicine from short- to long-term survival: a mixed-methods study

    PubMed Central

    Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T. Elizabeth

    2018-01-01

    Objective The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. Methods A mixed-methods approach was used with cancer survivors from the “Assessment of Patients’ Experience with Cancer Care” 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Results Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families’ and friends’ provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Conclusion Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients. PMID:29339938

  11. Genomics Portals: integrative web-platform for mining genomics data.

    PubMed

    Shinde, Kaustubh; Phatak, Mukta; Johannes, Freudenberg M; Chen, Jing; Li, Qian; Vineet, Joshi K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario

    2010-01-13

    A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into the functioning of living systems. The Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes the Genomics Portals platform a unique tool for interpreting the results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals back-end databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.

  12. Genomics Portals: integrative web-platform for mining genomics data

    PubMed Central

    2010-01-01

    Background A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into the functioning of living systems. Results The Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. Conclusion The integrated access to primary genomics data, functional knowledge and analytical tools makes the Genomics Portals platform a unique tool for interpreting the results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals back-end databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org. PMID:20070909

  13. A framework for collaborative review of candidate events in high data rate streams: The V-FASTR experiment as a case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Andrew F.; Cinquini, Luca; Khudikyan, Shakeh E.

    2015-01-01

    “Fast radio transients” are defined here as bright millisecond pulses of radio-frequency energy. These short-duration pulses can be produced by known objects such as pulsars or potentially by more exotic objects such as evaporating black holes. The identification and verification of such an event would be of great scientific value. This is one major goal of the Very Long Baseline Array (VLBA) Fast Transient Experiment (V-FASTR), a software-based detection system installed at the VLBA. V-FASTR uses a “commensal” (piggy-back) approach, analyzing all array data continually during routine VLBA observations and identifying candidate fast transient events. Raw data can be stored from a buffer memory, which enables a comprehensive off-line analysis. This is invaluable for validating the astrophysical origin of any detection. Candidates discovered by the automatic system must be reviewed each day by analysts to identify any promising signals that warrant a more in-depth investigation. To support the timely analysis of fast transient detection candidates by V-FASTR scientists, we have developed a metadata-driven, collaborative candidate review framework. The framework consists of a software pipeline for metadata processing, composed of both open source software components and project-specific code written expressly to extract and catalog metadata from the incoming V-FASTR data products, and a web-based data portal that facilitates browsing and inspection of the available metadata for candidate events extracted from the VLBA radio data.

  14. Simplex GPS and InSAR Inversion Software

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Pierce, Marlon E.

    2012-01-01

    Changes in the shape of the Earth's surface can be routinely measured with precisions better than centimeters. Processes below the surface often drive these changes and as a result, investigators require models with inversion methods to characterize the sources. Simplex inverts any combination of GPS (global positioning system), UAVSAR (uninhabited aerial vehicle synthetic aperture radar), and InSAR (interferometric synthetic aperture radar) data simultaneously for elastic response from fault and fluid motions. It can be used to solve for multiple faults and parameters, all of which can be specified or allowed to vary. The software can be used to study long-term tectonic motions and the faults responsible for those motions, or can be used to invert for co-seismic slip from earthquakes. Solutions involving estimation of fault motion and changes in fluid reservoirs such as magma or water are possible. Any arbitrary number of faults or parameters can be considered. Simplex specifically solves for any of location, geometry, fault slip, and expansion/contraction of a single or multiple faults. It inverts GPS and InSAR data for elastic dislocations in a half-space. Slip parameters include strike slip, dip slip, and tensile dislocations. It includes a map interface for both setting up the models and viewing the results. Results, including faults, and observed, computed, and residual displacements, are output in text format, a map interface, and can be exported to KML. The software interfaces with the QuakeTables database allowing a user to select existing fault parameters or data. Simplex can be accessed through the QuakeSim portal graphical user interface or run from a UNIX command line.
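
    The core inversion step can be illustrated with a linearized forward model d = G m, where m holds fault slip components and G maps unit slips to surface displacements. In the sketch below, G is a random placeholder standing in for the elastic half-space Green's functions Simplex actually evaluates, and the estimation is plain least squares rather than the full search over fault geometry.

        # Linear slip-inversion sketch; G is a placeholder, not Okada-type
        # Green's functions, and real use would also estimate fault geometry.
        import numpy as np

        rng = np.random.default_rng(1)
        n_obs, n_params = 30, 3                  # GPS components vs slip terms
        G = rng.normal(size=(n_obs, n_params))   # placeholder Green's functions
        m_true = np.array([1.2, -0.4, 0.0])      # strike slip, dip slip, opening
        d = G @ m_true + rng.normal(scale=0.05, size=n_obs)  # noisy observations

        m_hat, *_ = np.linalg.lstsq(G, d, rcond=None)
        resid = d - G @ m_hat
        print("estimated slip:", m_hat.round(3))
        print("rms residual:", float(np.sqrt((resid ** 2).mean())))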

  15. Dynamics behaviour of an elastic non-ideal (NIS) portal frame, including fractional nonlinearities

    NASA Astrophysics Data System (ADS)

    Balthazar, J. M.; Brasil, R. M. L. F.; Felix, J. L. P.; Tusset, A. M.; Picirillo, V.; Iluik, I.; Rocha, R. T.; Nabarrete, A.; Oliveira, C.

    2016-05-01

    This paper overviews recent developments on some problems related to elastic structures, such as portal frames, taking into account the full interactions of the vibrating systems with an energy source of limited power supply (small motors, electro-mechanical shakers). We include a discussion of fractional (rational) damping and stiffness effects in the adopted modelling. This was a plenary lecture delivered at the event Mechanics of Slender Structures, organized in Northampton, England, 21-22 September 2015.

  16. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether it be in situ, remotely sensed, or recorded by automated data logging equipment. Spreadsheets, personal databases, text files and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers and modelers, the data need to be stored in a format that is easily identifiable, accessible and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry-standard relational database comprises spatial and temporal data tables, shape files and supporting metadata, accessible over the network through a menu-driven web-based portal or spatially through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.
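
    A minimal sketch of the storage idea, assuming invented table and column names: relational tables keyed by site and time, plus a metadata table so the data stay easily identifiable, all queryable with a single join.

        # Toy relational layout for field observations (schema is invented).
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE site (site_id INTEGER PRIMARY KEY, name TEXT,
                           lat REAL, lon REAL);
        CREATE TABLE obs  (site_id INTEGER REFERENCES site,
                           t TEXT, variable TEXT, value REAL, units TEXT);
        CREATE TABLE meta (variable TEXT PRIMARY KEY, description TEXT,
                           source TEXT);
        """)
        db.execute("INSERT INTO site VALUES (1, 'Station A', 26.1, -81.8)")
        db.execute("INSERT INTO obs VALUES (1, '2006-07-01T00:00', 'stage', 0.42, 'm')")
        db.execute("INSERT INTO meta VALUES ('stage', 'water level', 'data logger')")

        print(db.execute("""SELECT s.name, o.t, o.value, o.units, m.description
                            FROM obs o JOIN site s USING (site_id)
                            JOIN meta m USING (variable)""").fetchone())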

  17. The Impact of Electronic Patient Portals on Patient Care: A Systematic Review of Controlled Trials

    PubMed Central

    Ammenwerth, Elske; Schnell-Inderst, Petra

    2012-01-01

    Background Modern information technology is changing and provides new challenges to health care. The emergence of the Internet and the electronic health record (EHR) has brought new opportunities for patients to play a more active role in their care. Although in many countries patients have the right to access their clinical information, electronic access to clinical records is not common. Patient portals consist of provider-tethered applications that allow patients to electronically access health information that is documented and managed by a health care institution. Although patient portals are already being implemented, it is still unclear in which ways these technologies can influence patient care. Objective To systematically review the available evidence on the impact of electronic patient portals on patient care. Methods A systematic search was conducted using PubMed and other sources to identify controlled experimental or quasi-experimental studies on the impact of patient portals that were published between 1990 and 2011. A total of 1,306 references from all the publication hits were screened, and 13 papers were retrieved for full text analysis. Results We identified 5 papers presenting 4 distinct studies. There were no statistically significant changes between intervention and control groups in the 2 randomized controlled trials investigating the effect of patient portals on health outcomes. Significant changes in the patient portal group, compared to a control group, could be observed for the following parameters: quicker decrease in office visit rates and slower increase in telephone contacts; increase in number of messages sent; changes of the medication regimen; and better adherence to treatment. Conclusions The number of available controlled studies with regard to patient portals is low. Even though patient portals are often discussed as a way to empower patients and improve quality of care, there is insufficient evidence to support this assumption. PMID:23183044

  18. Trident: scalable compute archives: workflows, visualization, and analysis

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. those associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise, even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using NodeJS, AngularJS, and HighCharts JavaScript libraries among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub-workflows, (3) ImageX, an interactive image visualization service, (4) an authentication and authorization service, (5) a data service that handles archival, staging and serving of data products, and (6) a notification service that serves the statistical collation and reporting needs of various projects. Several other additional components are under development. Trident is an umbrella project that evolved from the One Degree Imager, Portal, Pipeline, and Archive (ODI-PPA) project, which we had initially refactored toward (1) a powerful analysis/visualization portal for Globular Cluster System (GCS) survey data collected by IU researchers, (2) a data search and download portal for the IU Electron Microscopy Center's data (EMC-SCA), and (3) a prototype archive for the Ludwig Maximilian University's Wide Field Imager. The new Trident software has been used to deploy (1) a metadata quality control and analytics portal (RADY-SCA) for DICOM-formatted medical imaging data produced by the IU Radiology Center, (2) several prototype workflows for different domains, (3) a snapshot tool within IU's Karst Desktop environment, and (4) a limited component set to serve GIS data within the IU GIS web portal. Trident SCA systems leverage supercomputing and storage resources at Indiana University but can be configured to make use of any cloud/grid resource, from local workstations/servers to (inter)national supercomputing facilities such as XSEDE.
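
    One Trident-style microservice can be sketched with the Python standard library: a light-weight HTTP service exposing a single REST resource. The /progress route and its payload below are invented for illustration; the actual Trident services are written in NodeJS, PHP/Zend, and Python behind a REST API and/or message bus.

        # Hypothetical progress microservice (route and payload invented).
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        WORKFLOWS = {"wf-42": {"state": "running", "percent": 73}}

        class ProgressService(BaseHTTPRequestHandler):
            def do_GET(self):
                wf = self.path.rstrip("/").split("/")[-1]   # GET /progress/wf-42
                body = json.dumps(WORKFLOWS.get(wf, {"error": "unknown workflow"}))
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body.encode())

        if __name__ == "__main__":
            HTTPServer(("127.0.0.1", 8080), ProgressService).serve_forever()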

  19. Mercury- Distributed Metadata Management, Data Discovery and Access System

    NASA Astrophysics Data System (ADS)

    Palanisamy, Giri; Wilson, Bruce E.; Devarakonda, Ranjeet; Green, James M.

    2007-12-01

    Mercury is a federated metadata harvesting, search and retrieval tool based on both open source and ORNL-developed software. It was originally developed for NASA, and the Mercury development consortium now includes funding from NASA, USGS, and DOE. Mercury supports various metadata standards including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115 (under development). Mercury provides a single portal to information contained in disparate data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow users to perform simple, fielded, spatial and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury supports various projects including: ORNL DAAC, NBII, DADDI, LBA, NARSTO, CDIAC, OCEAN, I3N, IAI, ESIP and ARM. The new Mercury system is based on a Service Oriented Architecture and supports various services such as Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. The system also provides various search services including RSS, Geo-RSS, OpenSearch, Web Services and Portlets. Other features include filtering and dynamic sorting of search results, book-markable search results, and the ability to save, retrieve, and modify search criteria.
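
    The harvest-and-centralize pattern is easy to sketch: metadata records are pulled from distributed nodes into one index that answers fielded searches, while the data themselves stay with the providers. The node contents and URLs below are invented examples.

        # Toy federated-metadata index (records and URLs are invented).
        node_a = [{"title": "Soil respiration, site 4",
                   "keywords": ["carbon", "flux"],
                   "data_url": "https://node-a.example/ds/4"}]
        node_b = [{"title": "Canopy LAI survey",
                   "keywords": ["vegetation"],
                   "data_url": "https://node-b.example/ds/9"}]

        def harvest(*nodes):
            index = []
            for node in nodes:          # a real harvester polls remote servers
                index.extend(node)
            return index

        def fielded_search(index, field, term):
            return [r for r in index
                    if term.lower() in str(r.get(field, "")).lower()]

        central = harvest(node_a, node_b)
        for hit in fielded_search(central, "keywords", "carbon"):
            print(hit["title"], "->", hit["data_url"])  # data stays at provider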

  20. Vigil: Providing Trust for Enhanced Security in Pervasive Systems

    DTIC Science & Technology

    2005-01-01

    environment consisting of Bluetooth, Infrared, 802.11b and Ethernet. Vigil is the extension and culmination of our two previous projects: Centaurus [22] and ... Centaurus2 [36]. The main design goal of the Centaurus project was the development of a framework for building portals to services using various ... types of mobile devices. Centaurus provides a uniform infrastructure for access to heterogeneous hardware and software components. It uses a language

  1. Updates to the Virtual Atomic and Molecular Data Centre

    NASA Astrophysics Data System (ADS)

    Hill, Christian; Tennyson, Jonathan; Gordon, Iouli E.; Rothman, Laurence S.; Dubernet, Marie-Lise

    2014-06-01

    The Virtual Atomic and Molecular Data Centre (VAMDC) has established a set of standards for the storage and transmission of atomic and molecular data and an SQL-based query language (VSS2) for searching online databases, known as nodes. The project has also created an online service, the VAMDC Portal, through which all of these databases may be searched and their results compared and aggregated. Since its inception four years ago, the VAMDC e-infrastructure has grown to encompass over 40 databases, including HITRAN, in more than 20 countries, and engages actively with scientists on six continents. Associated with the portal is a growing suite of software tools for the transformation of data from its native, XML-based XSAMS format to a range of more convenient human-readable (such as HTML) and machine-readable (such as CSV) formats. The relational database for HITRAN, created as part of the VAMDC project, is a flexible and extensible data model which is able to represent a wider range of parameters than the current fixed-format text-based one. Over the next year, a new online interface to this database will be tested, released and fully documented; this web application, HITRANonline, will fully replace the ageing and incomplete JavaHAWKS software suite.
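
    The XSAMS-to-CSV style of conversion those tools perform can be sketched with a generic XML-to-CSV transform; the element and attribute names below are invented stand-ins, not the actual XSAMS schema.

        # Generic XML -> CSV conversion sketch (tags are invented, not XSAMS).
        import csv, io
        import xml.etree.ElementTree as ET

        XML = """<Transitions>
          <Transition wavenumber="1234.56" intensity="1.2e-21" lower="0" upper="1"/>
          <Transition wavenumber="2345.67" intensity="3.4e-22" lower="1" upper="2"/>
        </Transitions>"""

        root = ET.fromstring(XML)
        out = io.StringIO()
        writer = csv.writer(out)
        writer.writerow(["wavenumber", "intensity", "lower", "upper"])
        for t in root.iter("Transition"):
            writer.writerow([t.get("wavenumber"), t.get("intensity"),
                             t.get("lower"), t.get("upper")])
        print(out.getvalue())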

  2. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract away from the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its use was based upon the team's experience with it and the benefits received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, team experience with this component helped drive the choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints to our REST-based services. For this product there was little past experience, but given our service-based approach it seemed to be a natural fit. Given our exposure to open source, we will discuss the tradeoffs and benefits of the choices made. Moreover, we will dive into the context in which the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across the open source solutions and the attributes that can vary the impression one gets. This comprehensive account of our endeavor should aid others in their assessment and use of open source.

  3. The Enzyme Portal: a case study in applying user-centred design methods in bioinformatics

    PubMed Central

    2013-01-01

    User-centred design (UCD) is a type of user interface design in which the needs and desires of users are taken into account at each stage of the design process for a service or product; often for software applications and websites. Its goal is to facilitate the design of software that is both useful and easy to use. To achieve this, you must characterise users’ requirements, design suitable interactions to meet their needs, and test your designs using prototypes and real life scenarios. For bioinformatics, there is little practical information available regarding how to carry out UCD in practice. To address this we describe a complete, multi-stage UCD process used for creating a new bioinformatics resource for integrating enzyme information, called the Enzyme Portal (http://www.ebi.ac.uk/enzymeportal). This freely-available service mines and displays data about proteins with enzymatic activity from public repositories via a single search, and includes biochemical reactions, biological pathways, small molecule chemistry, disease information, 3D protein structures and relevant scientific literature. We employed several UCD techniques, including: persona development, interviews, ‘canvas sort’ card sorting, user workflows, usability testing and others. Our hope is that this case study will motivate the reader to apply similar UCD approaches to their own software design for bioinformatics. Indeed, we found the benefits included more effective decision-making for design ideas and technologies; enhanced team-working and communication; cost effectiveness; and ultimately a service that more closely meets the needs of our target audience. PMID:23514033

  4. DART, a platform for the creation and registration of cone beam digital tomosynthesis datasets.

    PubMed

    Sarkar, Vikren; Shi, Chengyu; Papanikolaou, Niko

    2011-04-01

    Digital tomosynthesis is an imaging modality that allows for tomographic reconstructions using only a fraction of the images needed for CT reconstruction. Since it offers the advantages of tomographic images with a smaller imaging dose delivered to the patient, the technique offers much promise for use in patient positioning prior to radiation delivery. This paper describes a software environment developed to help in the creation of digital tomosynthesis image sets from digital portal images using three different reconstruction algorithms. The software then allows for use of the tomograms for patient positioning or for dose recalculation if shifts are not applied, possibly as part of an adaptive radiotherapy regimen.
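
    The simplest of the reconstruction algorithms alluded to, shift-and-add, is easy to sketch: each projection is shifted so the plane of interest stays registered across views, and averaging then brings that plane into focus while blurring everything else. The geometry below is idealized and the code is only a toy, not the DART implementation.

        # Shift-and-add tomosynthesis on a synthetic feature (toy geometry).
        import numpy as np

        def shift_and_add(projections, shifts):
            """Shift each projection by its per-view offset (pixels along the
            sweep direction), then average to focus the chosen plane."""
            stack = [np.roll(p, s, axis=1) for p, s in zip(projections, shifts)]
            return np.mean(stack, axis=0)

        views, shifts = [], [-2, 0, 2]
        for s in shifts:
            img = np.zeros((5, 9))
            img[2, 4 - s] = 1.0        # feature displaced differently per view
            views.append(img)

        plane = shift_and_add(views, shifts)
        print(plane[2].round(2))       # feature reinforced at column 4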

  5. JPL Earth Science Center Visualization Multitouch Table

    NASA Astrophysics Data System (ADS)

    Kim, R.; Dodge, K.; Malhotra, S.; Chang, G.

    2014-12-01

    The JPL Earth Science Center Visualization Table combines specialized software and hardware to provide multitouch, multiuser, and remote display control, creating seamlessly integrated experiences for visualizing JPL missions and their remote sensing data. The software is fully GIS-capable through time-aware OGC WMTS, using the Lunar Mapping and Modeling Portal as the GIS backend to continuously ingest and retrieve real-time remote sensing data and satellite location data. The 55-inch and 82-inch unlimited-finger-count multitouch displays allow multiple users to explore JPL Earth missions and visualize remote sensing data through a very intuitive and interactive touch graphical user interface. To improve the integrated experience, the Earth Science Center Visualization Table team developed network streaming, which allows the table software to stream data visualizations to a nearby remote display through the computer network. This visualization/presentation tool not only supports Earth science operations but is specifically designed for education and public outreach, and will contribute significantly to STEM. Our presentation will include an overview of our software and hardware and a showcase of our system.

  6. BiobankConnect: software to rapidly connect data elements for pooled analysis across biobanks using ontological and lexical indexing.

    PubMed

    Pang, Chao; Hendriksen, Dennis; Dijkstra, Martijn; van der Velde, K Joeri; Kuiper, Joel; Hillege, Hans L; Swertz, Morris A

    2015-01-01

    Pooling data across biobanks is necessary to increase statistical power, reveal more subtle associations, and synergize the value of data sources. However, searching for desired data elements among the thousands of available elements and harmonizing differences in terminology, data collection, and structure, is arduous and time consuming. To speed up biobank data pooling we developed BiobankConnect, a system to semi-automatically match desired data elements to available elements by: (1) annotating the desired elements with ontology terms using BioPortal; (2) automatically expanding the query for these elements with synonyms and subclass information using OntoCAT; (3) automatically searching available elements for these expanded terms using Lucene lexical matching; and (4) shortlisting relevant matches sorted by matching score. We evaluated BiobankConnect using human curated matches from EU-BioSHaRE, searching for 32 desired data elements in 7461 available elements from six biobanks. We found 0.75 precision at rank 1 and 0.74 recall at rank 10 compared to a manually curated set of relevant matches. In addition, best matches chosen by BioSHaRE experts ranked first in 63.0% and in the top 10 in 98.4% of cases, indicating that our system has the potential to significantly reduce manual matching work. BiobankConnect provides an easy user interface to significantly speed up the biobank harmonization process. It may also prove useful for other forms of biomedical data integration. All the software can be downloaded as a MOLGENIS open source app from http://www.github.com/molgenis, with a demo available at http://www.biobankconnect.org.
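
    Steps (2) through (4) can be sketched with ordinary string matching: expand the desired element with synonyms, score every available element lexically, and shortlist by score. Here difflib stands in for Lucene, and the synonym table is an invented stand-in for the BioPortal/OntoCAT expansion.

        # Toy query expansion + lexical shortlist (difflib stands in for Lucene).
        from difflib import SequenceMatcher

        SYNONYMS = {"blood pressure": ["systolic pressure",
                                       "diastolic pressure", "BP"]}
        available = ["sbp_mmhg systolic pressure at rest",
                     "smoking status of participant",
                     "bp diastolic pressure sitting"]

        def score(query, element):
            return SequenceMatcher(None, query.lower(), element.lower()).ratio()

        def match(desired, elements, top=3):
            expanded = [desired] + SYNONYMS.get(desired, [])      # step (2)
            scored = [(max(score(q, e) for q in expanded), e)     # step (3)
                      for e in elements]
            return sorted(scored, reverse=True)[:top]             # step (4)

        for s, element in match("blood pressure", available):
            print(f"{s:.2f}  {element}")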

  7. BiobankConnect: software to rapidly connect data elements for pooled analysis across biobanks using ontological and lexical indexing

    PubMed Central

    Pang, Chao; Hendriksen, Dennis; Dijkstra, Martijn; van der Velde, K Joeri; Kuiper, Joel; Hillege, Hans L; Swertz, Morris A

    2015-01-01

    Objective Pooling data across biobanks is necessary to increase statistical power, reveal more subtle associations, and synergize the value of data sources. However, searching for desired data elements among the thousands of available elements and harmonizing differences in terminology, data collection, and structure, is arduous and time consuming. Materials and methods To speed up biobank data pooling we developed BiobankConnect, a system to semi-automatically match desired data elements to available elements by: (1) annotating the desired elements with ontology terms using BioPortal; (2) automatically expanding the query for these elements with synonyms and subclass information using OntoCAT; (3) automatically searching available elements for these expanded terms using Lucene lexical matching; and (4) shortlisting relevant matches sorted by matching score. Results We evaluated BiobankConnect using human curated matches from EU-BioSHaRE, searching for 32 desired data elements in 7461 available elements from six biobanks. We found 0.75 precision at rank 1 and 0.74 recall at rank 10 compared to a manually curated set of relevant matches. In addition, best matches chosen by BioSHaRE experts ranked first in 63.0% and in the top 10 in 98.4% of cases, indicating that our system has the potential to significantly reduce manual matching work. Conclusions BiobankConnect provides an easy user interface to significantly speed up the biobank harmonization process. It may also prove useful for other forms of biomedical data integration. All the software can be downloaded as a MOLGENIS open source app from http://www.github.com/molgenis, with a demo available at http://www.biobankconnect.org. PMID:25361575

  8. Marine Web Portal as an Interface between Users and Marine Data and Information Sources

    NASA Astrophysics Data System (ADS)

    Palazov, A.; Stefanov, A.; Marinova, V.; Slabakova, V.

    2012-04-01

    Fundamental to the success of a marine data and information management system, and to effective support of marine and maritime economic activities, are the speed and ease with which users can identify, locate, access, exchange and use oceanographic and marine data and information. Many activities and bodies have been identified as users of marine data and information, such as: science, government and local authorities, port authorities, shipping, marine industry, fishery and aquaculture, the tourist industry, environmental protection, coast protection, oil spill response, search and rescue, national security, civil protection, and the general public. On the other hand, diverse sources of real-time and historical marine data and information exist, and they are generally fragmented, distributed across different places, and sometimes unknown to users. The marine web portal concept is to build a common web-based interface that provides users fast and easy access to all available marine data and information sources, both historical and real-time, such as marine databases, observing systems, forecasting systems, atlases, etc. The service is regionally oriented to meet user needs. The main advantage of the portal is that it provides a general look "at a glance" at all available marine data and information, and directs users to easily discover the data and information of interest. A personalization capability is planned, which will give users an instrument to tailor the visualization to their personal needs.

  9. Benchmarking the ATLAS software through the Kit Validation engine

    NASA Astrophysics Data System (ADS)

    De Salvo, Alessandro; Brasolin, Franco

    2010-04-01

    Measuring the performance of experiment software is very important in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, and the online analysis and display of the results will be presented. The results of the measurements on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help define the performance metrics for High Energy Physics applications, based on the real experiment software.
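
    The metrics involved (CPU time and memory of a benchmarked workload) can be sketched by timing a child process and reading its resource usage; this illustrates the measurement idea only, not the Kit Validation harness itself, and the resource module is Unix-only.

        # Benchmark a child process: CPU time and peak RSS (Unix-only sketch).
        import resource, subprocess, sys

        def benchmark(cmd):
            before = resource.getrusage(resource.RUSAGE_CHILDREN)
            subprocess.run(cmd, check=True)
            after = resource.getrusage(resource.RUSAGE_CHILDREN)
            return {"cpu_user_s": after.ru_utime - before.ru_utime,
                    "cpu_sys_s": after.ru_stime - before.ru_stime,
                    "max_rss": after.ru_maxrss}  # kB on Linux, bytes on macOS

        workload = [sys.executable, "-c", "sum(i * i for i in range(10**6))"]
        print(benchmark(workload))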

  10. A cross-functional service-oriented architecture to support real-time information exchange in emergency medical response.

    PubMed

    Hauenstein, Logan; Gao, Tia; Sze, Tsz Wo; Crawford, David; Alm, Alex; White, David

    2006-01-01

    Real-time information communication presents a persistent challenge to the emergency response community. During a medical emergency, various first response disciplines, including Emergency Medical Service (EMS), Fire, and Police, and multiple health service facilities, including hospitals, auxiliary care centers and public health departments, all using disparate information technology systems, must coordinate their efforts by sharing real-time information. This paper describes a service-oriented architecture (SOA) that uses shared data models of emergency incidents to support the exchange of data between heterogeneous systems. This architecture is employed in the Advanced Health and Disaster Aid Network (AID-N) system, a testbed investigating information technologies to improve interoperation among multiple emergency response organizations in the Washington DC Metropolitan region. This architecture allows us to enable real-time data communication between three deployed systems: 1) a pre-hospital patient care reporting software system used on all ambulances in Arlington County, Virginia (MICHAELS), 2) a syndromic surveillance system used by public health departments in the Washington area (ESSENCE), and 3) a hazardous material reference software system (WISER) developed by the National Library of Medicine. Additionally, we have extended our system to communicate with three new data sources: 1) wireless automated vital sign sensors worn by patients, 2) web portals for admitting hospitals, and 3) PDAs used by first responders at emergency scenes to input data (SIRP).
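    The core of the shared-data-model approach is that every participating system serializes to and from one neutral incident schema. A hedged Python sketch, with field names invented for illustration (the actual AID-N data model is not given in the abstract):

```python
# Illustrative shared incident schema exchanged as JSON over an SOA bus.
import json
from dataclasses import dataclass, asdict

@dataclass
class IncidentRecord:
    incident_id: str
    agency: str          # e.g. "EMS", "Fire", "Police"
    patient_count: int
    location: str
    vitals: dict         # latest sensor readings, keyed by patient id

record = IncidentRecord("inc-042", "EMS", 2, "38.88,-77.10",
                        {"p1": {"hr": 92, "spo2": 97}})

wire = json.dumps(asdict(record))              # what travels between systems
received = IncidentRecord(**json.loads(wire))  # any consumer reconstructs it
print(received.agency, received.vitals["p1"]["hr"])
```

    Because every producer and consumer agrees only on this one schema, adding a new data source (a sensor, a portal, a PDA) does not require point-to-point integrations with the existing systems.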

  11. Musicking Online: Organizing Reference Sources in the Digital Age

    ERIC Educational Resources Information Center

    Szeto, Kimmy

    2012-01-01

    Online music research resources took center stage at the second plenary session "Wrangling the Information Universe: Moving From Institutional Portals to a Shared Resource for Online Music Sources" held on Friday, February 17, 2012, at the Music Library Association (MLA) 2012 Annual Meeting in Dallas, Texas. The Reference Sources…

  12. Multigraph: Reusable Interactive Data Graphs

    NASA Astrophysics Data System (ADS)

    Phillips, M. B.

    2010-12-01

    There are surprisingly few good software tools available for presenting time series data on the internet. The most common practice is to use a desktop program such as Excel or Matlab to save a graph as an image which can be included in a web page like any other image. This disconnects the graph from the data in a way that makes updating a graph with new data a cumbersome manual process, and it limits the user to one particular view of the data. The Multigraph project defines an XML format for describing interactive data graphs, and software tools for creating and rendering those graphs in web pages and other internet connected applications. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions; the user can pan and zoom by clicking and dragging, in a familiar "Google Maps" kind of way. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. The Multigraph XML format, or "MUGL" for short, provides a concise description of the visual properties of a graph, such as axes, plot styles, data sources, labels, etc., as well as interactivity properties such as how and whether the user can pan or zoom along each axis. Multigraph reads a file in this format, draws the described graph, and allows the user to interact with it. Multigraph software currently includes a Flash application for embedding graphs in web pages, a Flex component for embedding graphs in larger Flex/Flash applications, and a plugin for creating graphs in the WordPress content management system. Plans for the future include a Java version for desktop viewing and editing, a command line version for batch and server side rendering, and possibly Android and iPhone versions. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. [Figure: interactive Multigraph display of real-time weather data]
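    To illustrate the configuration-file idea, the fragment below parses a MUGL-style XML description in Python; the element and attribute names are invented for illustration, so consult www.multigraph.org for the real MUGL schema.

```python
# Parsing a hypothetical MUGL-like graph description.
import xml.etree.ElementTree as ET

mugl = """
<mugl>
  <horizontalaxis id="time" min="2010-01-01" max="2010-12-31"/>
  <verticalaxis id="temp" min="-10" max="40"/>
  <plot axes="time temp" style="line" datasource="http://example.org/data"/>
</mugl>
"""

root = ET.fromstring(mugl)
for elem in root:
    if elem.tag.endswith("axis"):
        print(elem.tag, elem.get("id"), elem.get("min"), elem.get("max"))
plot = root.find("plot")
print("plot of", plot.get("axes"), "from", plot.get("datasource"))
```

    The appeal of the approach is that the graph stays connected to its data source: re-rendering the page fetches fresh data, rather than a stale exported image.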

  13. Evaluating the Impact of Conceptual Knowledge Engineering on the Design and Usability of a Clinical and Translational Science Collaboration Portal

    PubMed Central

    Payne, Philip R.O.; Borlawsky, Tara B.; Rice, Robert; Embi, Peter J.

    2010-01-01

    With the growing prevalence of large-scale, team science endeavors in the biomedical and life science domains, the impetus to implement platforms capable of supporting asynchronous interaction among multidisciplinary groups of collaborators has increased commensurately. However, there is a paucity of literature describing systematic approaches to identifying the information needs of targeted end-users for such platforms, and the translation of such requirements into practicable software component design criteria. In previous studies, we have reported upon the efficacy of employing conceptual knowledge engineering (CKE) techniques to systematically address both of the preceding challenges in the context of complex biomedical applications. In this manuscript we evaluate the impact of CKE approaches relative to the design of a clinical and translational science collaboration portal, and report upon preliminary qualitative user satisfaction with the resulting system. PMID:21347146

  14. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited numbers of open-source software in China, "Red Flag Linux" stands out most strikingly, commanding 30 percent share of Chinese software market. Unlike the spontaneity of open-source movement in North America, open-source software development in…

  15. Predicting film dose to aid in cassette placement for radiation therapy portal verification film images.

    PubMed

    Keys, Richard A; Marks, James E; Haus, Arthur G

    2002-12-01

    EC film has improved portal localization images with better contrast and improved distinction of bony structures and air-tissue interfaces. A cassette with slower speed screens was used with EC film to image the treatment portal during the entire course of treatment (verification) instead of taking separate films after treatment. Measurements of film density vs source to film distance (SFD) were made using 15 and 25 cm thick water phantoms with both 6 and 18 MV photons from 1 to 40 cm past the phantom. A characteristic (H & D) curve was measured in air to compare dose to film density. Results show the reduction in radiation between patient and cassette more closely follows an "inverse cube law" than an inverse square law. Formulas to calculate radiation exposure to the film, and the desired SFD, were based on patient tumor dose, calculation of the exit dose, and the inverse cube relationship. A table of exposure techniques based on the SFD for a given tumor dose was evaluated and compared to conventional techniques. Although the film has a high contrast, there is enough latitude that excellent films can be achieved using a fixed SFD based simply on the tumor dose and beam energy. Patient diameter has a smaller effect. The benefits of imaging portal films during the entire treatment are more reliability in the accuracy of the portal image, the ability to detect patient motion, and a reduction in the time it takes to take portal images.
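    The reported fall-off can be formalized as follows (the notation is ours, not the authors' published formula): if the dose past the patient scales with the inverse cube of distance, then

```latex
% Hedged formalization of the reported "inverse cube" behaviour.
D(d) \approx D_{\text{exit}}\left(\frac{d_0}{d}\right)^{3}
\qquad\Longrightarrow\qquad
\mathrm{SFD} \approx d_0\left(\frac{D_{\text{exit}}}{D_{\text{film}}}\right)^{1/3},
```

    where D_exit is the exit dose at a reference distance d_0 past the patient and D_film is the dose that produces the desired film density. Inverting the cube law this way is what makes it possible to pick the SFD from the tumor dose and beam energy alone.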

  16. RDS-SL VS Communication System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-12

    The RDS-SL VS Communication System is a component of the Radiation Detection System for Strategic, Low-Volume Seaports. Its purpose is to acquire real-time data from radiation portal monitors and cameras, record that data in a database, and make it available to system operators and administrators via a web interface. The software system contains two components: a standalone data acquisition and storage component and an ASP.NET web application that implements the web interface.

  17. Applying Multiple Methods to Comprehensively Evaluate a Patient Portal’s Effectiveness to Convey Information to Patients

    PubMed Central

    Krist, Alex H; Aycock, Rebecca A; Kreps, Gary L

    2016-01-01

    Background Patient portals have yet to achieve their full potential for enhancing health communication and improving health outcomes. Although the Patient Protection and Affordable Care Act in the United States mandates the utilization of patient portals, and usage continues to rise, their impact has not been as profound as anticipated. Objective The objective of our case study was to evaluate how well portals convey information to patients. To demonstrate how multiple methodologies could be used to evaluate and improve the design of patient-centered portals, we conducted an in-depth evaluation of an exemplar patient-centered portal designed to promote preventive care to consumers. Methods We used 31 critical incident patient interviews, 2 clinician focus groups, and a thematic content analysis to understand patients’ and clinicians’ perspectives, as well as theoretical understandings of the portal’s use. Results We gathered over 140 critical incidents, 71.8% (102/142) negative and 28.2% (40/142) positive. Positive incident categories were (1) instant medical information access, (2) clear health information, and (3) patient vigilance. Negative incident categories were (1) standardized content, (2) desire for direct communication, (3) website functionality, and (4) difficulty interpreting laboratory data. Thematic analysis of the portal’s immediacy resulted in high scores in the attributes enhances understanding (18/23, 78%), personalization (18/24, 75%), and motivates behavior (17/24, 71%), but low levels of interactivity (7/24, 29%) and engagement (2/24, 8%). Two overarching themes emerged to guide portal refinements: (1) communication can be improved with directness and interactivity and (2) perceived personalization must be greater to engage patients. Conclusions Results suggest that simple modifications, such as increased interactivity and personalized messages, can make portals customized, robust, easily accessible, and trusted information sources. PMID:27188953

  18. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    NASA Astrophysics Data System (ADS)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Because of the inherent heterogeneity of environmental datasets, and their size, which can reach tens of terabytes for a single dataset, present-day studies of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of three basic parts: a computational kernel developed using the ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for developing typical components of a web mapping application's graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographic information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library for graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on this software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets currently available for processing include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already in use in scientific research; in particular, it was recently used successfully to analyze climate change in Siberia and its impacts in the region. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.

  19. U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly

    2008-04-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) is a centralized collection of sensor data of various modalities that are co-located and co-registered. The signatures include ground and air vehicles, personnel, mortar, artillery, small arms gunfire from potential sniper weapons, explosives, and many other high value targets. This data is made available to Department of Defense (DoD) and DoD contractors, intelligence agencies, other government agencies (OGA), and academia for use in developing target detection, tracking, and classification algorithms and systems to protect our Soldiers. A platform independent Web interface disseminates the signatures to researchers and engineers within the scientific community. Hierarchical Data Format 5 (HDF5) signature models provide an excellent solution for the sharing of complex multimodal signature data for algorithmic development and database requirements. Many open source tools for viewing and plotting HDF5 signatures are available over the Web. Seamless integration of HDF5 signatures is possible in both proprietary computational environments, such as MATLAB, and Free and Open Source Software (FOSS) computational environments, such as Octave and Python, for performing signal processing, analysis, and algorithm development. Future developments include extending the Web interface into a portal system for accessing ARL algorithms and signatures, High Performance Computing (HPC) resources, and integrating existing database and signature architectures into sensor networking environments.
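    Reading such a signature file is straightforward in any HDF5-capable environment. A hedged Python sketch using the h5py library (the group/dataset layout shown is hypothetical, not the actual MMSDB schema):

```python
# Reading a multimodal signature from HDF5 with h5py (pip install h5py).
# The layout assumed here (top-level groups per modality, each holding a
# "samples" dataset with a "sample_rate" attribute) is illustrative only.
import h5py

with h5py.File("signature_0001.h5", "r") as f:
    for modality in f:                      # iterate co-registered modalities
        ds = f[modality]["samples"]
        rate = ds.attrs.get("sample_rate")
        print(modality, ds.shape, ds.dtype, "at", rate, "Hz")
    acoustic = f["acoustic"]["samples"][:]  # load one channel into memory
```

    The same file opens identically from MATLAB, Octave, or Python, which is the portability argument the abstract makes for HDF5.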

  20. [Remarks on injuries which were the source of tetanus and were based on observations from the Clinic of Infectious Diseases in Cracow].

    PubMed

    Garlicki, A; Caban, J; Krukowiecki, J; Bociaga-Jasik, M; Kluba-Wojewoda, U

    1998-01-01

    From 1992 to 1996, 95 patients with tetanus were treated in the Chair and Department of Infectious Diseases in Cracow. Most of them came from rural areas and were of advanced age (median 68 years). Small, trivial skin injuries were the most often identified portal of entry. Only a few patients consulted a doctor after the injury for prophylaxis against tetanus. The authors emphasise that small skin injuries, which may be a portal of entry for tetanus, should not be neglected.

  1. Mfold web server for nucleic acid folding and hybridization prediction.

    PubMed

    Zuker, Michael

    2003-07-01

    The abbreviated name, 'mfold web server', describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software to the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single strand frequency plots and 'energy dot plots', are available for the folding of single sequences. A variety of 'bulk' servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as 'MFOLDROOT'.

  2. NABIC: A New Access Portal to Search, Visualize, and Share Agricultural Genomics Data.

    PubMed

    Seol, Young-Joo; Lee, Tae-Ho; Park, Dong-Suk; Kim, Chang-Kug

    2016-01-01

    The National Agricultural Biotechnology Information Center developed an access portal to search, visualize, and share agricultural genomics data with a focus on South Korean information and resources. The portal features an agricultural biotechnology database containing a wide range of omics data from public and proprietary sources. We collected 28.4 TB of data from 162 agricultural organisms, with 10 types of omics data comprising next-generation sequencing sequence read archive, genome, gene, nucleotide, DNA chip, expressed sequence tag, interactome, protein structure, molecular marker, and single-nucleotide polymorphism datasets. Our genomic resources contain information on five animals, seven plants, and one fungus, which is accessed through a genome browser. We also developed a data submission and analysis system as a web service, with easy-to-use functions and cutting-edge algorithms, including those for handling next-generation sequencing data.

  3. SU-F-T-283: A Novel Device to Enable Portal Dosimetry for Flattening Filter Free Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faught, A; Wu, Q; Adamson, J

    Purpose: Varian’s electronic portal imaging device (EPID) based portal dosimetry tool is a popular and effective means of performing IMRT QA. EPIDs for older models of the TrueBeam accelerator utilize a 40cm×30cm Image Detection Unit (IDU) that saturates at the center for standard source-to-imager distances with high dose rate flattening filter free (FFF) beams. This makes portal dosimetry impossible and an alternative means of IMRT QA necessary. We developed a filter to attenuate the beam to a dose rate measurable by the IDU for portal dosimetry IMRT QA. Methods: Multipurpose 304 stainless steel plates were placed on an accessory tray to attenuate the beam. Profiles of an open field measured on the IDU were acquired with varying numbers of plates to assess the thickness needed to reduce the maximum dose rates of 6XFFF and 10XFFF beams to measurable levels. A new portal dose image prediction (PDIP) model was commissioned based on open field measurements with the plates in position, and a modified beam profile was input to the portal dosimetry calibration at the console to empirically correct for attenuation and scatter. The portal dosimetry tool was used to assess agreement between predicted and measured doses for open 25×25 cm² fields and intensity modulated fields using 6XFFF and 10XFFF beams. Results: Thicknesses of 2.5 cm and 3.8 cm of steel were required to reduce the highest dose rates to a measurable level for 6XFFF and 10XFFF, respectively. Gamma analysis using a 3%/3mm relative criterion with the filter in place and the new PDIP model resulted in 98.2% and 93.6% of pixels passing for the open 6XFFF and 10XFFF fields, respectively, while intensity modulated fields showed passing rates of 98.2% and 99.0%. Conclusion: Use of the filter allows portal dosimetry to be used for IMRT QA of FFF plans in place of purchasing a second option for IMRT QA.
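    The required plate thickness can be roughly anticipated from the textbook narrow-beam attenuation model (a simplification that ignores scatter and beam hardening, which is why the authors recommission the PDIP model empirically rather than relying on calculation):

```latex
% Narrow-beam exponential attenuation; a first-order estimate only.
I(x) = I_0\, e^{-\mu x}
\qquad\Longrightarrow\qquad
x = \frac{1}{\mu}\,\ln\frac{I_0}{I_{\max}},
```

    where mu is the effective linear attenuation coefficient of steel at the beam energy, I_0 the unattenuated dose rate at the imager, and I_max the highest dose rate the IDU can record without saturating. The thicker plate stack needed for 10XFFF follows directly from its higher maximum dose rate and greater beam penetration.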

  4. Improving the Transparency of IAEA Safeguards Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toomey, Christopher; Hayman, Aaron M.; Wyse, Evan T.

    2011-07-17

    In 2008, the Standing Advisory Group on Safeguards Implementation (SAGSI) indicated that the International Atomic Energy Agency's (IAEA) Safeguards Implementation Report (SIR) has not kept pace with the evolution of safeguards and provided the IAEA with a set of recommendations for improvement. The SIR is the primary mechanism for providing an overview of safeguards implementation in a given year and reporting on the annual safeguards findings and conclusions drawn by the Secretariat. As the IAEA transitions to State-level safeguards approaches, SIR reporting must adapt to reflect these evolutionary changes. This evolved report will better reflect the IAEA's transition to a more qualitative and information-driven approach, based upon State-as-a-whole considerations. This paper applies SAGSI's recommendations to the development of multiple models for an evolved SIR and finds that an SIR repurposed as a 'safeguards portal' could significantly enhance information delivery, clarity, and transparency. In addition, this paper finds that the 'portal concept' also appears to have value as a standardized information presentation and analysis platform for use by Country Officers, for continuity of knowledge purposes, and the IAEA Secretariat in the safeguards conclusion process. Accompanying this paper is a fully functional prototype of the 'portal' concept, built using commercial software and IAEA Annual Report data.

  5. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence that arise at various stages of the data lifecycle have been identified. Identifying these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  6. Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)

    NASA Astrophysics Data System (ADS)

    Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David

    2018-01-01

    Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether from emission, scattering, or transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and intermodel comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or "checking in" new model simulations that are in accordance with the community-derived standards. Additionally, the results of intermodel comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of various planetary targets.

  7. Automated sensor networks to advance ocean science

    NASA Astrophysics Data System (ADS)

    Schofield, O.; Orcutt, J. A.; Arrott, M.; Vernon, F. L.; Peach, C. L.; Meisinger, M.; Krueger, I.; Kleinert, J.; Chao, Y.; Chien, S.; Thompson, D. R.; Chave, A. D.; Balasuriya, A.

    2010-12-01

    The National Science Foundation has funded the Ocean Observatories Initiative (OOI), which over the next five years will deploy infrastructure to expand scientists’ ability to remotely study the ocean. The deployed infrastructure will be linked by a robust cyberinfrastructure (CI) that will integrate marine observatories into a coherent system-of-systems. OOI is committed to engaging the ocean sciences community during the construction phase. For the CI, this is being enabled by using a “spiral design strategy” allowing for input throughout the construction phase. In Fall 2009, the OOI CI development team used an existing ocean observing network in the Mid-Atlantic Bight (MAB) to test OOI CI software. The objective of this CI test was to aggregate data from ships, autonomous underwater vehicles (AUVs), shore-based radars, and satellites and make it available to five different data-assimilating ocean forecast models. Scientists used these multi-model forecasts to automate future glider missions in order to demonstrate the feasibility of two-way interactivity between the sensor web and predictive models. The CI software coordinated and prioritized the shared resources, allowing for the semi-automated reconfiguration of asset tasking and thus enabling autonomous execution of observation plans for the fixed and mobile observation platforms. Efforts were coordinated through a web portal that provided an access point for the observational data and model forecasts. Researchers could use the CI software in tandem with the web data portal to assess the performance of individual numerical model results, or multi-model ensembles, through real-time comparisons with satellite, shore-based radar, and in situ robotic measurements. The resulting sensor net will enable a new means to explore and study the world’s oceans by providing scientists a responsive network that can be accessed via any wireless network.

  8. The Impact of an eHealth Portal on Health Care Professionals' Interaction with Patients: Qualitative Study.

    PubMed

    Das, Anita; Faxvaag, Arild; Svanæs, Dag

    2015-11-24

    People who undergo weight loss surgery require a comprehensive treatment program to achieve successful outcomes. eHealth solutions, such as secure online portals, create new opportunities for improved health care delivery and care, but depend on the organizational delivery systems and on the health care professionals providing it. So far, these have received limited attention and the overall adoption of eHealth solutions remains low. In this study, a secure eHealth portal was implemented in a bariatric surgery clinic and offered to their patients. During the study period of 6 months, 60 patients and 5 health care professionals had access. The portal included patient information, self-management tools, and communication features for online dialog with peers and health care providers at the bariatric surgery clinic. The aim of this study was to characterize and assess the impact of an eHealth portal on health care professionals' interaction with patients in bariatric surgery. This qualitative case study involved a field study consisting of contextual interviews at the clinic involving observing and speaking with personnel in their actual work environment. Semi-structured in-depth interviews were conducted with health care professionals who interacted with patients through the portal. Analysis of the collected material was done inductively using thematic analysis. The analysis revealed two main dimensions of using an eHealth portal in bariatric surgery: the transparency it represents and the responsibility that follows by providing it. The professionals reported the eHealth portal as (1) a source of information, (2) a gateway to approach and facilitate the patients, (3) a medium for irrevocable postings, (4) a channel that exposes responsibility and competence, and (5) a tool in the clinic. By providing an eHealth portal to patients in a bariatric surgery program, health care professionals can observe patients' writings and revelations thereby capturing patient challenges and acting and implementing measures. Interacting with patients through the portal can prevent dropouts and deterioration of patients' health. However, professionals report on organizational challenges and personal constraints related to communicating with patients in writing online. Further development of guidelines and education of health care professionals about how to handle, prioritize, communicate, and facilitate patients online is required in addition to increased attention to the organizational infrastructures and incentives for enabling such solutions in health care.

  9. The Impact of an eHealth Portal on Health Care Professionals’ Interaction with Patients: Qualitative Study

    PubMed Central

    Faxvaag, Arild; Svanæs, Dag

    2015-01-01

    Background People who undergo weight loss surgery require a comprehensive treatment program to achieve successful outcomes. eHealth solutions, such as secure online portals, create new opportunities for improved health care delivery and care, but depend on the organizational delivery systems and on the health care professionals providing it. So far, these have received limited attention and the overall adoption of eHealth solutions remains low. In this study, a secure eHealth portal was implemented in a bariatric surgery clinic and offered to their patients. During the study period of 6 months, 60 patients and 5 health care professionals had access. The portal included patient information, self-management tools, and communication features for online dialog with peers and health care providers at the bariatric surgery clinic. Objective The aim of this study was to characterize and assess the impact of an eHealth portal on health care professionals’ interaction with patients in bariatric surgery. Methods This qualitative case study involved a field study consisting of contextual interviews at the clinic involving observing and speaking with personnel in their actual work environment. Semi-structured in-depth interviews were conducted with health care professionals who interacted with patients through the portal. Analysis of the collected material was done inductively using thematic analysis. Results The analysis revealed two main dimensions of using an eHealth portal in bariatric surgery: the transparency it represents and the responsibility that follows by providing it. The professionals reported the eHealth portal as (1) a source of information, (2) a gateway to approach and facilitate the patients, (3) a medium for irrevocable postings, (4) a channel that exposes responsibility and competence, and (5) a tool in the clinic. Conclusions By providing an eHealth portal to patients in a bariatric surgery program, health care professionals can observe patients’ writings and revelations thereby capturing patient challenges and acting and implementing measures. Interacting with patients through the portal can prevent dropouts and deterioration of patients’ health. However, professionals report on organizational challenges and personal constraints related to communicating with patients in writing online. Further development of guidelines and education of health care professionals about how to handle, prioritize, communicate, and facilitate patients online is required in addition to increased attention to the organizational infrastructures and incentives for enabling such solutions in health care. PMID:26601678

  10. Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view

    NASA Astrophysics Data System (ADS)

    Locati, Mario; Rovida, Andrea; Albini, Paola

    2014-05-01

    Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are forced to perform their daily job using general purpose tools and/or to code their own to address their specific tasks. The lack of out-of-the-box tools expressly conceived to deal with historical data leads to a huge amount of time lost performing tedious tasks: searching for the data, and manually reformatting it in order to jump from one tool to the other, sometimes causing a loss of the original data. This reality is common to all activities related to the study of earthquakes of the past centuries, from the interpretation of historical sources to the compilation of earthquake catalogues. A platform able to preserve historical earthquake data, trace back their sources, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (Global Earthquake History, GEM), two new data portals were designed and implemented. The European portal "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA) are aimed at addressing at least some of the above mentioned issues. The availability of these new portals and their well-defined standards makes the development of side tools for archiving, publishing and processing the available historical earthquake data easier than before. The AHEAD and GHEA portals, their underlying technologies and the developed side tools are presented.

  11. Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.

    2013-01-01

    The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate. Thus the need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data, in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network based variable star classifier, which is designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.

  12. Software Selection: A Primer on Source and Evaluation.

    ERIC Educational Resources Information Center

    Burston, Jack

    2003-01-01

    Provides guidance on making decisions regarding the selection of foreign language instructional software. Identifies sources of foreign language software, indicates sources of foreign language software reviews, and outlines essential procedures of software evaluation. (Author/VWL)

  13. Mercury: An Example of Effective Software Reuse for Metadata Management, Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce E.

    2008-12-01

    Mercury is a federated metadata harvesting, data discovery and access tool based on both open source packages and custom developed software. Though originally developed for NASA, the Mercury development consortium now includes funding from NASA, USGS, and DOE. Mercury supports the reuse of metadata by enabling searching across a range of metadata specifications and standards including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115. Mercury provides a single portal to information contained in distributed data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow users to perform simple, fielded, spatial and temporal searches across these metadata sources. One of the major goals of the recent redesign of Mercury was to improve software reusability across the 12 projects which currently fund the continuing development of Mercury. These projects span a range of land, atmosphere, and ocean ecological communities and have a number of common needs for metadata searches, but they also have a number of needs specific to one or a few projects. To balance these common and project-specific needs, Mercury's architecture has three major reusable components: a harvester engine, an indexing system and a user interface component. The harvester engine is responsible for harvesting metadata records from various distributed servers around the USA and around the world. The harvester software was packaged in such a way that all Mercury projects use the same harvester scripts, with each project driven by a set of project-specific configuration files. The harvested files are structured metadata records that are indexed consistently against the search library API, so that the system can offer simple, fielded, spatial and temporal search capabilities. This backend component is supported by a very flexible, easy-to-use graphical user interface driven by cascading style sheets, which makes reusable design implementation even simpler. The new Mercury system is based on a Service Oriented Architecture and effectively reuses components for various services such as the Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. The software also provides various search services including RSS, Geo-RSS, OpenSearch, Web Services and Portlets, an integrated shopping cart to order datasets from various data centers (ORNL DAAC, NSIDC), and integrated visualization tools. Other features include filtering and dynamic sorting of search results, bookmarkable search results, and the ability to save, retrieve, and modify search criteria.
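    The "one harvester, many configurations" pattern can be sketched in a few lines of Python; the file names and keys below are illustrative, not Mercury's actual configuration format.

```python
# Shared harvester script driven by per-project configuration files
# (illustrative sketch of the pattern, not the Mercury codebase).
import json

def harvest(config_path):
    with open(config_path) as fh:
        cfg = json.load(fh)                  # project-specific settings
    for server in cfg["metadata_servers"]:
        print(f"[{cfg['project']}] harvesting {cfg['standard']} "
              f"records from {server}")
        # ... fetch records, validate against cfg["standard"], index ...

# Each project reuses the same script with its own config file, e.g.:
# harvest("ornl_daac.json"); harvest("nsidc.json")
```

    Keeping all project differences in data rather than code is what lets twelve differently funded projects share one maintained harvester.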

  14. Mercury: An Example of Effective Software Reuse for Metadata Management, Data Discovery and Access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devarakonda, Ranjeet

    2008-01-01

    Mercury is a federated metadata harvesting, data discovery and access tool based on both open source packages and custom developed software. Though originally developed for NASA, the Mercury development consortium now includes funding from NASA, USGS, and DOE. Mercury supports the reuse of metadata by enabling searching across a range of metadata specifications and standards including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115. Mercury provides a single portal to information contained in distributed data management systems. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces then allow users to perform simple, fielded, spatial and temporal searches across these metadata sources. One of the major goals of the recent redesign of Mercury was to improve software reusability across the 12 projects which currently fund the continuing development of Mercury. These projects span a range of land, atmosphere, and ocean ecological communities and have a number of common needs for metadata searches, but they also have a number of needs specific to one or a few projects. To balance these common and project-specific needs, Mercury's architecture has three major reusable components: a harvester engine, an indexing system and a user interface component. The harvester engine is responsible for harvesting metadata records from various distributed servers around the USA and around the world. The harvester software was packaged in such a way that all Mercury projects use the same harvester scripts, with each project driven by a set of project-specific configuration files. The harvested files are structured metadata records that are indexed consistently against the search library API, so that the system can offer simple, fielded, spatial and temporal search capabilities. This backend component is supported by a very flexible, easy-to-use graphical user interface driven by cascading style sheets, which makes reusable design implementation even simpler. The new Mercury system is based on a Service Oriented Architecture and effectively reuses components for various services such as the Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. The software also provides various search services including RSS, Geo-RSS, OpenSearch, Web Services and Portlets, an integrated shopping cart to order datasets from various data centers (ORNL DAAC, NSIDC), and integrated visualization tools. Other features include filtering and dynamic sorting of search results, bookmarkable search results, and the ability to save, retrieve, and modify search criteria.

  15. From a Content Delivery Portal to a Knowledge Management System for Standardized Cancer Documentation.

    PubMed

    Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard

    2017-01-01

    Heterogeneous tumor documentation and the challenges of interpreting medical terms lead to problems in analyzing data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts of different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It is targeted at user groups such as documentation officers, physicians and patients. Linkage to other information sources such as PubMed and MeSH was realized.

  16. Designing a Web-Based Learning Portal for Geographic Visualization and Analysis in Public Health

    PubMed Central

    Robinson, Anthony C.; Roth, Robert E.; MacEachren, Alan M.

    2011-01-01

    Interactive mapping and spatial analysis tools are underutilized by health researchers and decision-makers due to scarce training materials, few examples demonstrating the successful use of geographic visualization, and poor mechanisms for sharing results generated by geovisualization. We report here on the development of the Geovisual EXplication (G-EX) Portal, a web-based application designed to connect researchers in geovisualization and related mapping sciences to users who are working in public health and epidemiology. This paper focuses on the design and development of the G-EX Portal Learn module, a set of tools intended to disseminate learning artifacts. Initial design and development of the G-EX Portal has been guided by our past research on use and usability of geovisualization in public health. As part of the iterative design and development process, we conducted a needs assessment survey with targeted end-users that we report on here. The survey focused on users’ current learning habits, their preferred kind of learning artifacts, and issues they may have with contributing learning artifacts to web portals. Survey results showed that users desire a diverse set of learning artifacts in terms of both formats and topics covered. Results also revealed a willingness of users to contribute both learning artifacts and personal information that would help other users to evaluate the credibility of the learning artifact source. We include a detailed description of the G-EX Portal Learn module and focus on modifications to the design of the Learn module as a result from feedback we received from our survey. PMID:21937462

  17. Numerical Investigation of Rockfall Impacts on Muckpiles for Underground Portals

    NASA Astrophysics Data System (ADS)

    Effeindzourou, Anna; Giacomini, Anna; Thoeni, Klaus; Sloan, Scott W.

    2017-06-01

    Small-scale waste rock piles or muckpiles are commonly used as energy absorption barriers in various surface mining applications. This paper numerically investigates the impact behaviour of blocks on muckpiles used as a cushion layer on top of underground portal entries. A three-dimensional discrete element model is implemented in the open-source framework YADE and validated using full-scale experimental data. The model makes it possible to estimate the energy absorption capacity of the muckpile and the impact forces acting on the portal structure. It also provides valuable information on the rebound characteristics, which is useful for defining potential safety areas in the vicinity of an underground entry. In order to show its capabilities, the model is applied to a large number of cases representing potential design conditions. The influence of block mass, impact velocity and absorbing cushion thickness on the forces at the base of the muckpile and the rebound trajectories after impact is investigated.

  18. NABIC: A New Access Portal to Search, Visualize, and Share Agricultural Genomics Data

    PubMed Central

    Seol, Young-Joo; Lee, Tae-Ho; Park, Dong-Suk; Kim, Chang-Kug

    2016-01-01

    The National Agricultural Biotechnology Information Center developed an access portal to search, visualize, and share agricultural genomics data with a focus on South Korean information and resources. The portal features an agricultural biotechnology database containing a wide range of omics data from public and proprietary sources. We collected 28.4 TB of data from 162 agricultural organisms, with 10 types of omics data comprising next-generation sequencing sequence read archive, genome, gene, nucleotide, DNA chip, expressed sequence tag, interactome, protein structure, molecular marker, and single-nucleotide polymorphism datasets. Our genomic resources contain information on five animals, seven plants, and one fungus, which is accessed through a genome browser. We also developed a data submission and analysis system as a web service, with easy-to-use functions and cutting-edge algorithms, including those for handling next-generation sequencing data. PMID:26848255

  19. Experiments with Cross-Language Information Retrieval on a Health Portal for Psychology and Psychotherapy.

    PubMed

    Andrenucci, Andrea

    2016-01-01

    Few studies have been performed within cross-language information retrieval (CLIR) in the field of psychology and psychotherapy. The aim of this paper is to analyze and assess the quality of available query translation methods for CLIR on a health portal for psychology. A test base of 100 user queries, 50 Multi Word Units (WUs) and 50 Single WUs, was used. Swedish was the source language and English the target language. Query translation methods based on machine translation (MT) and dictionary look-up were utilized in order to submit query translations to two search engines: Google Site Search and Quick Ask. Standard IR evaluation measures and a qualitative analysis were utilized to assess the results. The lexicon extracted with word alignment of the portal's parallel corpus provided better statistical results among the dictionary look-ups. Google Translate provided more linguistically correct translations overall and also delivered better retrieval results in MT.
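    Dictionary-based query translation of the kind evaluated here reduces to replacing each source-language term with all of its target-language candidates. A hedged Python sketch, with an invented mini-lexicon standing in for the word-aligned lexicon described above:

```python
# Dictionary look-up query translation (Swedish -> English) for CLIR.
# The lexicon entries are invented for illustration.
LEXICON = {
    "ångest": ["anxiety"],
    "panikattack": ["panic attack"],
    "sömnproblem": ["sleep problems", "insomnia"],
}

def translate_query(query):
    """Replace each source word with all target candidates, OR-ing them."""
    terms = []
    for word in query.lower().split():
        terms.append(LEXICON.get(word, [word]))  # pass unknown words through
    # Build a boolean query: (a OR b) AND (c OR d) ...
    return " AND ".join("(" + " OR ".join(t) + ")" for t in terms)

print(translate_query("ångest sömnproblem"))
# (anxiety) AND (sleep problems OR insomnia)
```

    The classic weakness of this approach, visible even in the sketch, is translation ambiguity: every sense of a source word is OR-ed into the query, which is one reason the MT-based translations could retrieve better in some cases.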

  20. PubMed searches: overview and strategies for clinicians.

    PubMed

    Lindsey, Wesley T; Olin, Bernie R

    2013-04-01

    PubMed is a biomedical and life sciences database maintained by a division of the National Library of Medicine known as the National Center for Biotechnology Information (NCBI). It is a large resource with more than 5600 journals indexed and greater than 22 million total citations. Searches conducted in PubMed provide references that are more specific for the intended topic compared with other popular search engines. Effective PubMed searches allow the clinician to remain current on the latest clinical trials, systematic reviews, and practice guidelines. PubMed continues to evolve by allowing users to create a customized experience through the My NCBI portal, new arrangements and options in search filters, and supporting scholarly projects through exportation of citations to reference managing software. Prepackaged search options available in the Clinical Queries feature also allow users to efficiently search for clinical literature. PubMed also provides information regarding the source journals themselves through the Journals in NCBI Databases link. This article provides an overview of the PubMed database's structure and features as well as strategies for conducting an effective search.
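    For scripted access, the same searches can also be run against NCBI's public E-utilities API rather than the web interface (the E-utilities are a documented NCBI service, though the abstract itself describes only the website). A minimal Python example, requiring network access:

```python
# Querying PubMed via the NCBI E-utilities esearch endpoint.
import json
import urllib.parse
import urllib.request

term = 'asthma AND "systematic review"[pt]'   # [pt] tags the publication type
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urllib.parse.urlencode({"db": "pubmed", "term": term,
                                 "retmax": 5, "retmode": "json"}))
with urllib.request.urlopen(url) as resp:
    result = json.load(resp)["esearchresult"]
print(result["count"], "hits; first PMIDs:", result["idlist"])
```

    The field tags used here are the same ones the article discusses for interactive searching, so a strategy refined in the browser transfers directly to scripts.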

  1. EnviroDIY ModularSensors: A Library to give Environmental Sensors a Common Interface of Functions for use with Arduino-Compatible Dataloggers

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Damiano, S. G.; Hicks, S.; Horsburgh, J. S.

    2017-12-01

    EnviroDIY is a community for do-it-yourself environmental science and monitoring (https://envirodiy.org), largely focused on sharing ideas for developing Arduino-compatible open-source sensor stations, similar to the EnviroDIY Mayfly datalogger (http://envirodiy.org/mayfly/). Here we present the ModularSensors Arduino code library (https://github.com/EnviroDIY/ModularSensors), designed to give all sensors and variables a common interface of functions and returns and to make it very easy to iterate through and log data from many sensors and variables. This library was written primarily for the EnviroDIY Mayfly, but we have begun to test it on other Arduino-based boards. We will show the large number of developed sensor interfaces, and examples of using this library code to stream near real-time data to the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a data and software system based on the Observations Data Model v2 (http://www.odm2.org).
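    The library itself is Arduino C++, but the "common interface" pattern it provides is easy to convey in Python: every sensor exposes the same setup/update/read calls, so a logger can iterate over an arbitrary mix of sensors. A hedged sketch of the pattern (not the actual ModularSensors API):

```python
# Common-interface sensor pattern, illustrated with fake sensors.
from abc import ABC, abstractmethod

class Sensor(ABC):
    @abstractmethod
    def setup(self): ...
    @abstractmethod
    def update(self): ...              # take a fresh reading
    @abstractmethod
    def variables(self) -> dict: ...   # variable name -> latest value

class FakeThermometer(Sensor):
    def setup(self): self._t = None
    def update(self): self._t = 21.5   # would poll hardware here
    def variables(self): return {"temp_C": self._t}

class FakeDepthGauge(Sensor):
    def setup(self): self._d = None
    def update(self): self._d = 0.42
    def variables(self): return {"depth_m": self._d}

sensors = [FakeThermometer(), FakeDepthGauge()]
for s in sensors:
    s.setup()
    s.update()
record = {k: v for s in sensors for k, v in s.variables().items()}
print(record)  # one row per logging interval would be stored or streamed
```

    Because the logging loop never touches sensor-specific code, adding a new instrument means writing one new subclass rather than modifying the logger.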

  2. Developing a research and practice tool to measure walkability: a demonstration project.

    PubMed

    Giles-Corti, Billie; Macaulay, Gus; Middleton, Nick; Boruff, Bryan; Bull, Fiona; Butterworth, Iain; Badland, Hannah; Mavoa, Suzanne; Roberts, Rebecca; Christian, Hayley

    2014-12-01

    Growing evidence shows that higher-density, mixed-use, pedestrian-friendly neighbourhoods encourage active transport, including transport-related walking. Despite widespread recognition of the benefits of creating more walkable neighbourhoods, there remains a gap between the rhetoric of the need for walkability and the creation of walkable neighbourhoods. Moreover, there is little objective data to benchmark the walkability of neighbourhoods within and between Australian cities in order to monitor planning and design intervention progress and to assess built environment and urban policy interventions required to achieve increased walkability. This paper describes a demonstration project that aimed to develop, trial and validate a 'Walkability Index Tool' that could be used by policy makers and practitioners to assess the walkability of local areas; or by researchers to access geospatial data assessing walkability. The overall aim of the project was to develop an automated geospatial tool capable of creating walkability indices for neighbourhoods at user-specified scales. The tool is based on open-source software architecture, within the Australian Urban Research Infrastructure Network (AURIN) framework, and incorporates key sub-component spatial measures of walkability (street connectivity, density and land use mix). Using state-based data, we demonstrated it was possible to create an automated walkability index. However, due to the lack of consistent national data measuring land use mix, at this stage it has not been possible to create a national walkability measure. The next stage of the project is to increase the usability of the tool within the AURIN portal and to explore options for alternative spatial data sources that will enable the development of a valid national walkability index. AURIN's open-source Walkability Index Tool is a first step in demonstrating the potential benefit of a tool that could measure walkability across Australia. It also demonstrates the value of making accurate spatial data available for research purposes. SO WHAT?: There remains a gap between urban policy and practice, in terms of creating walkable neighbourhoods. When fully implemented, AURIN's walkability tool could be used to benchmark Australian cities against which planning and urban design decisions could be assessed to monitor progress towards achieving policy goals. Making cleaned data readily available for research purposes through a common portal could also save time and financial resources.
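    A common way to build such a composite index, and a plausible reading of the tool's three sub-components, is to z-score each spatial measure and sum them. The sketch below follows that convention from the walkability literature; it is not necessarily the exact AURIN formula, and the sample values are invented.

```python
# Composite walkability index: sum of z-scored component measures.
from statistics import mean, stdev

def zscores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# One value per neighbourhood; units are illustrative.
connectivity = [45, 80, 120, 60]     # intersections per km^2
density      = [8, 25, 40, 12]       # dwellings per hectare
land_use_mix = [0.3, 0.7, 0.9, 0.4]  # entropy score in [0, 1]

walkability = [sum(t) for t in zip(zscores(connectivity),
                                   zscores(density),
                                   zscores(land_use_mix))]
for i, w in enumerate(walkability):
    print(f"neighbourhood {i}: walkability {w:+.2f}")
```

    Z-scoring puts the three measures, which arrive in incompatible units, on a common scale before they are combined, which is also why inconsistent national land use data blocks a national index: the components must be comparable across the whole study area.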

  3. MediaTracker system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandoval, D. M.; Strittmatter, R. B.; Abeyta, J. D.

    2004-01-01

    The initial objectives of this effort were to provide a hardware and software platform that could address the requirements for the accountability of classified removable electronic media and vault access logging. The MediaTracker system software assists classified media custodians in managing vault access logging and media tracking, to prevent the inadvertent violation of rules or policies for access to a restricted area and for the movement and use of tracked items. The MediaTracker system includes software tools to track and account for high-consequence security assets and high-value items. The overall benefits include: (1) real-time access to the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of CREM, the MediaTracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The MediaTracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the suite provides a Laboratory-approved media tracking system. Using commercial touch-screen and bar-code technology, the MediaTracker (MT) component provides an efficient and effective means to meet current Laboratory requirements and provides newly engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can accommodate several methods of positive identification, including smart cards and biometrics. Currently, three mechanisms provide added security for accountability and tracking purposes. The first is a portable, hand-held inventory scanner, which allows the custodian to physically track items that are not accessible within a particular area. The second is a radio frequency identification (RFID) monitoring portal, which tracks and logs in a database all activity of tagged items that pass through the portal. The third is electronic tagging of a flash memory device for automated inventory of CREM in storage; by modifying this USB device, the user is provided with added assurance that the data cannot be obtained from any other computer.

  4. Air transportation energy consumption - Yesterday, today, and tomorrow

    NASA Technical Reports Server (NTRS)

    Mascy, A. C.; Williams, L. J.

    1975-01-01

    The energy consumption by aviation is reviewed and projections of its growth are discussed. Forecasts of domestic passenger demand are presented, and the effect of restricted fuel supply and increased fuel prices is considered. The most promising sources for aircraft fuels, their availability and cost, and possible alternative fuels are reviewed. The energy consumption by various air and surface transportation modes is identified and compared on typical portal-to-portal trips. A measure of the indirect energy consumed by ground and air modes is defined. Historical trends in aircraft energy intensities are presented and the potential fuel savings with new technologies are discussed.

  5. Web-Based Software for Managing Research

    NASA Technical Reports Server (NTRS)

    Hoadley, Sherwood T.; Ingraldi, Anthony M.; Gough, Kerry M.; Fox, Charles; Cronin, Catherine K.; Hagemann, Andrew G.; Kemmerly, Guy T.; Goodman, Wesley L.

    2007-01-01

    aeroCOMPASS is a software system, originally designed to aid in the management of wind tunnels at Langley Research Center, that could be adapted to provide similar aid to other enterprises in which research is performed in common laboratory facilities by users who may be geographically dispersed. Included in aeroCOMPASS is Web-interface software that provides a single, convenient portal to a set of project- and test-related software tools and other application programs. The heart of aeroCOMPASS is a user-oriented document-management software subsystem that enables geographically dispersed users to easily share and manage a variety of documents. A principle of "write once, read many" is implemented throughout aeroCOMPASS to eliminate the need for multiple entry of the same information. The Web framework of aeroCOMPASS provides links to client-side application programs that are fully integrated with databases and server-side application programs. Other subsystems of aeroCOMPASS include ones for reserving hardware, tracking requests and feedback from users, generating interactive notes, administering a customer-satisfaction questionnaire, managing execution of tests, managing archives of metadata about tests, planning tests, and providing online help and instruction for users.

  6. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine

    PubMed Central

    2011-01-01

    Background Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. Results We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python web framework Django is presented. Conclusions By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype. PMID:21569484

  7. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine.

    PubMed

    Bath, Timothy G; Bozdag, Selcuk; Afzal, Vackar; Crowther, Daniel

    2011-05-13

    Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of the processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python web framework Django is presented. By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype.
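    To make the "very simple data model" concrete, here is a hypothetical Django-style models.py sketch in the spirit of the abstract; the class and field names are invented for illustration, not BonsaiLIMS's actual schema:

      from django.db import models


      class Sample(models.Model):
          barcode = models.CharField(max_length=32, unique=True)
          sample_type = models.CharField(max_length=64)   # e.g. "plasma", "biopsy"
          received = models.DateTimeField(auto_now_add=True)
          parent = models.ForeignKey(                      # aliquots/derivatives
              "self", null=True, blank=True, on_delete=models.SET_NULL
          )

          def __str__(self):
              return self.barcode


      class Event(models.Model):
          """Free-form audit trail: every action on a sample is one Event row."""
          sample = models.ForeignKey(Sample, on_delete=models.CASCADE)
          action = models.CharField(max_length=128)        # e.g. "aliquoted", "shipped"
          actor = models.CharField(max_length=64)
          timestamp = models.DateTimeField(auto_now_add=True)

    A two-table core like this is easy to extend (add an Event subtype, attach assay results) without redesigning the model, which is the kind of flexibility the abstract emphasizes.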

  8. Using open-source programs to create a web-based portal for hydrologic information

    NASA Astrophysics Data System (ADS)

    Kim, H.

    2013-12-01

    Some hydrologic data sets, such as basin climatology, precipitation, and terrestrial water storage, are not easily obtainable and distributable due to their size and complexity. We present a Hydrologic Information Portal (HIP) that has been implemented at the University of California Center for Hydrologic Modeling (UCCHM) and that is organized around the large river basins of North America. The portal can be accessed through a modern web browser, enabling easy access to and visualization of such hydrologic data sets. The main features of our HIP include a set of data visualization tools with which users can search, retrieve, analyze, integrate, organize, and map data within large river basins. Recent information technologies such as Google Maps, Tornado (an asynchronous Python web server), NumPy/SciPy (scientific libraries for Python) and d3.js (a JavaScript visualization library) were incorporated into the HIP to ease navigation of large data sets. With these open-source libraries, the HIP gives public users a way to combine and explore various data sets by generating multiple chart types (line, bar, pie, scatter plot) directly from the Google Maps viewport. Every rendered object, such as a basin shape on the viewport, is clickable, and this is the first step in accessing the visualization of data sets.
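    As a sketch of the serving pattern described (a Tornado backend handing JSON to client-side d3.js charts), consider the following minimal, hypothetical Python example; the endpoint name and in-memory data are placeholders:

      import json
      import tornado.ioloop
      import tornado.web

      # Stand-in for a large gridded archive; in practice this would be
      # subset from disk with NumPy/SciPy.
      PRECIP = {"mississippi": [2.1, 0.0, 5.4, 1.2], "columbia": [7.8, 3.3, 0.1, 2.2]}


      class BasinSeriesHandler(tornado.web.RequestHandler):
          def get(self, basin):
              series = PRECIP.get(basin.lower())
              if series is None:
                  raise tornado.web.HTTPError(404, reason="unknown basin")
              self.set_header("Content-Type", "application/json")
              self.write(json.dumps({"basin": basin, "precip_mm": series}))


      def make_app():
          # d3.js on the client fetches /basin/<name> and draws the chart.
          return tornado.web.Application([(r"/basin/([A-Za-z]+)", BasinSeriesHandler)])


      if __name__ == "__main__":
          make_app().listen(8888)
          tornado.ioloop.IOLoop.current().start()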

  9. Repository on maternal child health: health portal to improve access to information on maternal child health in India.

    PubMed

    Khanna, Rajesh; Karikalan, N; Mishra, Anil Kumar; Agarwal, Anchal; Bhattacharya, Madhulekha; Das, Jayanta K

    2013-01-02

    Quality and essential health information is considered one of the most cost-effective interventions to improve health in a developing country. Healthcare portals have revolutionized access to health information and knowledge using the Internet and related technologies, but their usage is far from satisfactory in India. This article describes a health portal developed in India aimed at providing one-stop access to efficiently search, organize and share maternal child health information relevant from a public health perspective in the country. The portal 'Repository on Maternal Child Health' was developed using an open source content management system, and standardized processes were followed for the collection, selection, categorization and presentation of resource materials. Its usage is evaluated using key performance indicators obtained from Google Analytics, and its quality assessed using a standardized knowledge management checklist. The results are discussed in relation to improving the quality of, and access to, health information. The portal was launched in July 2010 and provides free access to the full text of 900 resource materials categorized under specific topics and themes. During the subsequent 18 months, 52,798 visits were registered from 174 countries across the world, and more than three-fourths of the visits were from India alone. Nearly 44,000 unique visitors visited the website and spent an average time of 4 minutes 26 seconds. The overall bounce rate was 27.6%. An increase in the number of unique visitors was found to be significantly associated with an increase in the average time on site (p-value 0.01), an increase in web traffic through search engines (p-value 0.00), and a decrease in the bounce rate (p-value 0.03). There was a high degree of agreement between the two experts regarding the quality assessment carried out under the three domains of knowledge access, knowledge creation and knowledge transfer (Kappa statistic 0.72). Efficient management of health information is imperative for informed decision making, and digital repositories have nowadays become the preferred source of information management. The growing popularity of the portal indicates the potential of such initiatives in improving access to quality and essential health information in India. There is a need to develop similar mechanisms for other health domains and to interlink them to facilitate access to a variety of health information from a single platform.

  10. Repository on maternal child health: Health portal to improve access to information on maternal child health in India

    PubMed Central

    2013-01-01

    Background Quality and essential health information is considered one of the most cost-effective interventions to improve health in a developing country. Healthcare portals have revolutionized access to health information and knowledge using the Internet and related technologies, but their usage is far from satisfactory in India. This article describes a health portal developed in India aimed at providing one-stop access to efficiently search, organize and share maternal child health information relevant from a public health perspective in the country. Methods The portal 'Repository on Maternal Child Health' was developed using an open source content management system, and standardized processes were followed for the collection, selection, categorization and presentation of resource materials. Its usage is evaluated using key performance indicators obtained from Google Analytics, and its quality assessed using a standardized knowledge management checklist. The results are discussed in relation to improving the quality of, and access to, health information. Results The portal was launched in July 2010 and provides free access to the full text of 900 resource materials categorized under specific topics and themes. During the subsequent 18 months, 52,798 visits were registered from 174 countries across the world, and more than three-fourths of the visits were from India alone. Nearly 44,000 unique visitors visited the website and spent an average time of 4 minutes 26 seconds. The overall bounce rate was 27.6%. An increase in the number of unique visitors was found to be significantly associated with an increase in the average time on site (p-value 0.01), an increase in web traffic through search engines (p-value 0.00), and a decrease in the bounce rate (p-value 0.03). There was a high degree of agreement between the two experts regarding the quality assessment carried out under the three domains of knowledge access, knowledge creation and knowledge transfer (Kappa statistic 0.72). Conclusions Efficient management of health information is imperative for informed decision making, and digital repositories have nowadays become the preferred source of information management. The growing popularity of the portal indicates the potential of such initiatives in improving access to quality and essential health information in India. There is a need to develop similar mechanisms for other health domains and to interlink them to facilitate access to a variety of health information from a single platform. PMID:23281735

  11. NOAA's contribution to an informed society anticipating and responding to climate and its impacts through Climate.gov

    NASA Astrophysics Data System (ADS)

    Niepold, F.

    2012-12-01

    Societal concern about the impacts of climate change is growing. Citizens in public and private sectors want easy access to credible climate science information to help them make informed decisions affecting their lives and livelihoods. Weather and climate influence almost every sector of society, and affect up to 40 percent of the United States' $10 trillion annual economy (NRC, 2003, "Satellite Observations of the Earth's Environment: Accelerating the Transition of Research to Operations"). As the leading provider of climate, weather, and water information to the nation and the world, NOAA is a logical source for citizens to turn to for climate information. NOAA must expand and improve the way it communicates, educates, reaches out to, and engages with public stakeholders to better meet the nation's needs for timely, authoritative climate data and information. Citizens are increasingly going online to seek credible, authoritative climate information. However, users report having difficulty locating and using NOAA's online data products and services; resolving this online accessibility issue will be one of the Climate Portal's main benefits. The use of portal technology and emerging data integration and visualization tools provides an opportunity for NOAA to bring together multiple datasets from diverse disciplines and sources to deliver a more comprehensive picture of climate in the context of affected resources, communities and businesses. Additional benefits include wider extension of NOAA's data to other media such as television and free-choice learning venues, thereby increasing public exposure and engagement. The Climate Portal teams take an audience-focused approach to promoting climate science literacy among the public. The program communicates the challenges, processes, and results of NOAA-supported climate science through stories and data visualizations on the Web and in popular media. They provide information to a range of audiences to enhance society's ability to understand, plan for, and respond to climate variability and change. As part of a broad NOAA effort, the Climate Portal teams are working to design, test, and develop the NOAA Climate Services portal (climate.gov), which will provide ready access to climate data, information resources and educational products. The portal features customized interfaces for four audiences: scientists and sectoral data users, policy leaders, educators and students, and the public. The portal delivers climate science content that is free, readily accessible, and easily understandable, provided in flexible formats that maximize its usefulness. Important measures of success for NOAA's climate services will be the ease with which diverse public user communities are able to access and use the data products and information services that NOAA provides, the frequency with which they do so, and the trust they place in NOAA's climate resources. In addition to data and products, the Portal will offer a broad array of climate communications, outreach, and educational materials that demonstrate NOAA's leadership in providing climate science research, observations, and modeling products as a service to society. This session will discuss the partnerships and recent advancements of the climate portal and its plans for the coming year.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vang, Leng; Prescott, Steven R; Smith, Curtis

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to a shared set of information, documents and software tools, and are able to accurately maintain and track historical changes in models. A cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, given that the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews development of a Cloud-based Architecture Capabilities (CAC) web portal for PRA tools.

  13. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  14. Three-dimensional Image Fusion Guidance for Transjugular Intrahepatic Portosystemic Shunt Placement.

    PubMed

    Tacher, Vania; Petit, Arthur; Derbel, Haytham; Novelli, Luigi; Vitellius, Manuel; Ridouani, Fourat; Luciani, Alain; Rahmouni, Alain; Duvoux, Christophe; Salloum, Chady; Chiaradia, Mélanie; Kobeiter, Hicham

    2017-11-01

    To assess the safety, feasibility and effectiveness of image fusion guidance, combining pre-procedural portal phase computed tomography with intraprocedural fluoroscopy, for transjugular intrahepatic portosystemic shunt (TIPS) placement. All consecutive cirrhotic patients presenting at our interventional unit for TIPS creation from January 2015 to January 2016 were prospectively enrolled. Procedures were performed under general anesthesia in an interventional suite equipped with a flat panel detector, cone-beam computed tomography (CBCT) and image fusion technique. All TIPSs were placed under image fusion guidance. After hepatic vein catheterization, an unenhanced CBCT acquisition was performed and co-registered with the pre-procedural portal phase CT images. A virtual path between the hepatic vein and a portal branch was made using the virtual needle path trajectory software. Subsequently, the 3D virtual path was overlaid on 2D fluoroscopy for guidance during portal branch cannulation. Safety, feasibility, effectiveness and per-procedural data were evaluated. Sixteen patients (12 males; median age 56 years) were included. Procedures were technically feasible in 15 of the 16 patients (94%). One procedure was aborted due to hepatic vein catheterization failure related to severe liver distortion. No periprocedural complications occurred within 48 h of the procedure. The median dose-area product was 91 Gy·cm2, fluoroscopy time 15 min, procedure time 40 min and contrast media consumption 65 mL. Clinical benefit of the TIPS placement was observed in nine patients (56%). This study suggests that 3D image fusion guidance for TIPS is feasible, safe and effective. By identifying a virtual needle path, CBCT enables real-time multiplanar guidance and may facilitate TIPS placement.

  15. A multiresolution processing method for contrast enhancement in portal imaging.

    PubMed

    Gonzalez-Lopez, Antonio

    2018-06-18

    Portal images have a unique feature among the imaging modalities used in radiotherapy: they provide direct visualization of the irradiated volumes. However, contrast and spatial resolution are strongly limited due to the high energy of the radiation sources. Because of this, imaging modalities using x-ray energy beams have gained importance in the verification of patient positioning, replacing portal imaging. The purpose of this work was to develop a method for the enhancement of local contrast in portal images. The method operates on the subbands of a wavelet decomposition of the image, re-scaling them in such a way that coefficients in the high- and medium-resolution subbands are amplified, an approach totally different from those operating on the image histogram that are widely used nowadays. Portal images of an anthropomorphic phantom were acquired with an electronic portal imaging device (EPID). Then, different re-scaling strategies were investigated, studying the effects of the scaling parameters on the enhanced images. The effect of using different types of transforms was also studied. Finally, the implemented methods were combined with histogram equalization methods such as contrast-limited adaptive histogram equalization (CLAHE), and these combinations were compared. Uniform amplification of the detail subbands shows the best results in contrast enhancement. On the other hand, linear re-scaling of the high-resolution subbands increases the visibility of fine detail in the images, at the expense of an increase in noise levels. Also, since processing is applied only to the detail subbands, not to the approximation, the mean gray level of the image is minimally modified and no further display adjustments are required. It is shown that re-scaling the detail subbands of portal images can be used as an efficient method for the enhancement of both the local contrast and the resolution of these images. © 2018 Institute of Physics and Engineering in Medicine.
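    The described processing can be sketched in a few lines of Python with the PyWavelets package: decompose, uniformly amplify the detail subbands, leave the approximation untouched, and reconstruct. The wavelet choice and gain below are illustrative assumptions, not the paper's exact parameters:

      import numpy as np
      import pywt


      def enhance_contrast(image: np.ndarray, gain: float = 2.0, levels: int = 3):
          coeffs = pywt.wavedec2(image, "db2", level=levels)
          approx, details = coeffs[0], coeffs[1:]
          # Uniform amplification of every detail subband (cH, cV, cD) at each
          # level; the approximation is kept, so the mean gray level barely moves.
          boosted = [tuple(gain * band for band in level) for level in details]
          return pywt.waverec2([approx] + boosted, "db2")


      if __name__ == "__main__":
          img = np.random.rand(256, 256)  # stand-in for an EPID portal image
          out = enhance_contrast(img)
          print(img.mean(), out.mean())   # means stay close; detail is amplified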

  16. Visit statistics on institutional web portals as an indicator of public response to outreach initiatives

    NASA Astrophysics Data System (ADS)

    Lares, M.

    The presence of institutions on the internet is nowadays very important to strengthen communication channels, both internal and with the general public. The Córdoba Observatory has several web portals, including the official web page, a blog and a presence on several social networks. These are one of the fundamental pillars for outreach activities, and serve as communication channels for events and scientific, academic, and outreach news. They are also a source of information for the staff, as well as of data related to the Observatory's internal organization and scientific production. Several statistical studies are presented, based on data taken from the visits to the official web pages. I comment on some aspects of the role of web pages as a source of consultation and as a quick response to information needs. FULL TEXT IN SPANISH

  17. Searching Online Chemical Data Repositories via the ChemAgora Portal.

    PubMed

    Zanzi, Antonella; Wittwehr, Clemens

    2017-12-26

    ChemAgora, a web application designed and developed in the context of the "Data Infrastructure for Chemical Safety Assessment" (diXa) project, provides search capabilities over chemical data from resources available online, enabling users to cross-reference their search results with both regulatory chemical information and public chemical databases. ChemAgora, through an on-the-fly search, indicates whether a chemical is known or unknown in each of the external data sources and provides clickable links leading to the third-party web site pages containing the information. The original purpose of the ChemAgora application was to correlate studies stored in the diXa data warehouse with available chemical data. Since the end of the diXa project, ChemAgora has evolved into an independent portal, currently accessible directly through the ChemAgora home page, with improved search capabilities over online data sources.

  18. Intestinal fate of dietary zinc and copper: Postprandial net fluxes of these trace elements in portal vein of pigs.

    PubMed

    Matte, J Jacques; Girard, Christiane L; Guay, Frédéric

    2017-12-01

    In pigs, the assessment of the bioavailability of dietary trace minerals with classical approaches, such as relative bioavailability estimates or digestive tract balances, has often generated inconsistent responses. In the present study, net portal-drained-viscera fluxes were monitored after a meal to assess the intestinal absorption of zinc (Zn) and copper (Cu) according to dietary sources and levels of these trace minerals. Twelve pigs were surgically equipped with portal and carotid catheters and a portal ultrasonic flow probe for 12-h postprandial measurements. In a cross-over design, pigs received boluses of inorganic (I) or organic (O) dietary Cu and Zn at adequate (A; 20 and 200 mg, respectively) or high (H; 40 and 400 mg, respectively) levels just before a 0.8-kg meal (semi-purified diet). Regardless of treatment, arterial Zn increased by 72% at 45 min postprandial and gradually declined thereafter (P<0.01). Arterial Zn was greater by 11% after O than I (P=0.02) and by 19% after H than A (P<0.01) meals. Net portal-drained-viscera fluxes of Zn during the first 240 min postprandial were greater by 44% after O than I (P=0.10) and by 51% after H than A (P=0.07) meals. For Cu, portal-drained-viscera fluxes up to 240 min postprandial were greater (P=0.03) after A than H meals. These results suggest that Zn is absorbed rapidly, likely in the upper digestive tract of pigs and, regardless of dietary level, more efficiently after O meals. It appears that H levels of both Zn and Cu interfered with the intestinal absorption of Cu and/or stimulated post-absorption enterocyte sequestration of this mineral. Crown Copyright © 2017. Published by Elsevier GmbH. All rights reserved.

  19. The Geo Data Portal an Example Physical and Application Architecture Demonstrating the Power of the "Cloud" Concept.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of the Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted in desktop GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services, as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed, very large data sets. The GDP project has developed a reusable architecture and advanced processing services that currently access archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high-bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data-serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services, allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offers an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
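    Point 2) is the key to avoiding replication. A minimal Python sketch of that pattern using xarray against an OPeNDAP endpoint follows; the URL and variable name are placeholders, not an actual GDP archive:

      import xarray as xr

      URL = "https://example.org/thredds/dodsC/downscaled/climate.nc"  # hypothetical

      ds = xr.open_dataset(URL)                 # lazy: no data transferred yet
      subset = ds["tasmax"].sel(                # only this slab crosses the wire
          lat=slice(42.0, 47.0),
          lon=slice(-93.0, -86.0),
          time=slice("2050-01-01", "2050-12-31"),
      )
      summary = subset.mean(dim=("lat", "lon"))  # a spatial summary, as the GDP offers
      print(summary.values)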

  20. How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for the Engineering Support capability for NASA's Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison; and b) provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to the collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use

  1. Giovanni: The Bridge between Data and Science

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Lynnes, Christopher; Kempler, Steven J.

    2012-01-01

    NASA Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) is a web-based remote sensing and model data visualization and analysis system developed by the Goddard Earth Sciences Data and Information Services Center (GES DISC). This web-based tool facilitates data discovery, exploration and analysis of large amounts of global and regional data sets, covering atmospheric dynamics, atmospheric chemistry, hydrology, oceanography, and the land surface. Data analysis functions include lat-lon maps, time series, scatter plots, correlation maps, differences, cross-sections, vertical profiles, animations, and more. Visualization options enable comparisons of multiple variables and easier refinement. Recently, new features have been developed, such as interactive scatter plots and maps. Performance is also being improved, in some cases by an order of magnitude for certain analysis functions with optimized software. We are working toward merging the current Giovanni portals into a single omnibus portal with all variables in one (virtual) location, to help users find a variable easily and to enhance the intercomparison capability.

  2. dCache, towards Federated Identities & Anonymized Delegation

    NASA Astrophysics Data System (ADS)

    Ashish, A.; Millar, AP; Mkrtchyan, T.; Fuhrmann, P.; Behrmann, G.; Sahakyan, M.; Adeyemi, O. S.; Starek, J.; Litvintsev, D.; Rossi, A.

    2017-10-01

    For over a decade, dCache has relied on the authentication and authorization infrastructure (AAI) offered by VOMS, Kerberos, Xrootd, etc. Although the established infrastructure has worked well and provided sufficient security, the implementation of procedures and the underlying software is often seen as a burden, especially by smaller communities trying to adopt existing HEP software stacks [1]. Moreover, scientists are increasingly dependent on service portals for data access [2]. In this paper, we describe how federated identity management systems can facilitate the transition from traditional AAI infrastructure to novel solutions like OpenID Connect. We investigate the advantages offered by OpenID Connect with regard to 'delegation of authentication' and 'credential delegation for offline access'. Additionally, we demonstrate how macaroons can provide a more fine-grained authorization mechanism that supports anonymized delegation.
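    As an illustration of the macaroon mechanism itself (not dCache's actual implementation or caveat language), here is a minimal Python sketch using the pymacaroons library:

      from pymacaroons import Macaroon, Verifier

      key = "secret-signing-key"  # known only to the storage service

      # The service mints a token scoped to one path with read-only access...
      m = Macaroon(location="dcache.example.org", identifier="token-42", key=key)
      m.add_first_party_caveat("activity = DOWNLOAD")
      m.add_first_party_caveat("path = /data/run7/")

      # ...and hands the serialized token to a portal, which can pass it on
      # (or attenuate it further) without revealing who minted it.
      token = m.serialize()

      # On each request, the service checks every caveat against the context.
      v = Verifier()
      v.satisfy_exact("activity = DOWNLOAD")
      v.satisfy_exact("path = /data/run7/")
      print(v.verify(Macaroon.deserialize(token), key))  # True -> access granted

    Because any holder can append caveats but none can remove them, a portal can narrow a token before forwarding it, which is what makes the anonymized, fine-grained delegation described above possible.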

  3. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  4. ODI - Portal, Pipeline, and Archive (ODI-PPA): a web-based astronomical compute archive, visualization, and analysis service

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Harbeck, Daniel R.; Boroson, Todd; Liu, Wilson; Kotulla, Ralf; Shaw, Richard; Henschel, Robert; Rajagopal, Jayadev; Stobie, Elizabeth; Knezek, Patricia; Martin, R. Pierre; Archbold, Kevin

    2014-07-01

    The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers a modern web interface acting as a single point of access to their data, along with rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets and to enhance the WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive with built-in frameworks including: (1) Collections, which allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or data processing tasks; (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5-capable web browser, with overlaid standard-catalog and Source Extractor-generated source markers; (3) a Workflow framework, which enables rapid integration of data processing pipelines on an associated compute cluster and lets users request that such pipelines be executed on their data via custom user interfaces. ODI-PPA is made up of several lightweight services connected by a message bus; the web portal is built using the Twitter/Bootstrap, AngularJS and jQuery JavaScript libraries, with backend services written in PHP (using the Zend framework) and Python; it leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.

  5. ImageX: new and improved image explorer for astronomical images and beyond

    NASA Astrophysics Data System (ADS)

    Hayashi, Soichi; Gopu, Arvind; Kotulla, Ralf; Young, Michael D.

    2016-08-01

    The One Degree Imager - Portal, Pipeline, and Archive (ODI-PPA) has included the Image Explorer interactive image visualization tool since it went operational. Portal users were able to quickly open several ODI images within any HTML5-capable web browser, adjust the scaling, apply color maps, and perform other basic image visualization steps typically done on a desktop client like DS9. However, the original design of the Image Explorer required lossless PNG tiles to be generated and stored for all raw and reduced ODI images, thereby taking up tens of TB of spinning disk space even though only a small fraction of those images were being accessed by portal users at any given time. It also caused significant overhead on the portal web application and the Apache webserver used by ODI-PPA, and we found it hard to merge in improvements made to a similar deployment in another project's portal. To address these concerns, we re-architected the Image Explorer from scratch and came up with ImageX, a set of microservices that are part of the IU Trident project software suite, with rapid interactive visualization capabilities useful for ODI data and beyond. We generate a full-resolution JPEG image for each raw and reduced ODI FITS image before producing a JPEG tileset that can be rendered using the ImageX frontend code at various locations within a web portal (for example: in tabular image listings, or in views allowing quick perusal of a set of thumbnails or other image-sifting activities). The new design has decreased spinning disk requirements, uses AngularJS for the client-side Model/View code (instead of depending on the backend PHP Model/View/Controller code previously used), uses OpenSeaDragon to render the tile images, and uses nginx and a lightweight NodeJS application to serve tile images, thereby decreasing the time-to-first-byte latency by a few orders of magnitude. We plan to extend ImageX to non-FITS images, including electron microscopy and radiology scan images, and to extend its feature set to include basic functions like image overlays and colormaps. Users needing more advanced visualization and analysis capabilities could use a desktop tool like DS9+IRAF on another IU Trident project called StarDock, without having to download gigabytes of FITS image data.
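    The tiling step itself can be sketched in a few lines of Python with Pillow: cut the full-resolution JPEG into fixed-size tiles at successively halved zoom levels, the kind of pyramid a tile viewer can request piecemeal. The directory layout below is illustrative, not ImageX's actual on-disk format:

      import os
      from PIL import Image

      TILE = 256


      def build_pyramid(src_path: str, out_dir: str):
          img = Image.open(src_path).convert("RGB")
          level = 0
          while True:
              os.makedirs(f"{out_dir}/{level}", exist_ok=True)
              w, h = img.size
              for y in range(0, h, TILE):
                  for x in range(0, w, TILE):
                      tile = img.crop((x, y, min(x + TILE, w), min(y + TILE, h)))
                      tile.save(f"{out_dir}/{level}/{x // TILE}_{y // TILE}.jpg",
                                quality=85)
              if max(w, h) <= TILE:
                  break  # coarsest level fits in a single tile
              img = img.resize((max(1, w // 2), max(1, h // 2)))
              level += 1

    Because each tile is a plain JPEG file on disk, a static server such as nginx can hand them out directly, which is consistent with the latency gains described above.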

  6. Improving Transparency in the Reporting of Safeguards Implementation: FY11 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toomey, Christopher; Odlaug, Christopher S.; Wyse, Evan T.

    2011-09-30

    In 2008, the Standing Advisory Group on Safeguards Implementation (SAGSI) indicated that the International Atomic Energy Agency's (IAEA) Safeguards Implementation Report (SIR) had not kept pace with the evolution of safeguards and provided the IAEA with a set of recommendations for improvement. The SIR is the primary mechanism for providing an overview of safeguards implementation in a given year and reporting on the annual safeguards findings and conclusions drawn by the Secretariat. As the IAEA transitions to State-level safeguards approaches, SIR reporting must adapt to reflect these evolutionary changes. This evolved report will better reflect the IAEA's transition to a more qualitative and information-driven approach, based upon State-as-a-whole considerations. This paper applies SAGSI's recommendations to the development of multiple models for an evolved SIR and finds that an SIR repurposed as a 'safeguards portal' could significantly enhance information delivery, clarity, and transparency. In addition, this paper finds that the 'portal concept' also appears to have value as a standardized information presentation and analysis platform for use by Country Officers, for continuity-of-knowledge purposes, and by the IAEA Secretariat in the safeguards conclusion process. Accompanying this paper is a fully functional prototype of the 'portal' concept, built using commercial software and IAEA Annual Report data and available for viewing at http://safeguardsportal.pnnl.gov.

  7. Modeling Magnetic Properties in EZTB

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; vonAllmen, Paul

    2007-01-01

    A software module that calculates magnetic properties of a semiconducting material has been written for incorporation into, and execution within, the Easy (Modular) Tight-Binding (EZTB) software infrastructure. [EZTB is designed to model the electronic structures of semiconductor devices ranging from bulk semiconductors to quantum wells, quantum wires, and quantum dots. EZTB implements an empirical tight-binding mathematical model of the underlying physics.] This module can model the effect of a magnetic field applied along any direction and does not require any adjustment of model parameters. The module has thus far been applied to study the performance of silicon-based quantum computers in the presence of magnetic fields and of miscut angles in quantum wells. The module is expected to assist experimentalists in fabricating a spin qubit in a Si/SiGe quantum dot. This software can be executed in almost any Unix operating system, utilizes parallel computing, and can be run as a Web-portal application program. The module has been validated by comparison of its predictions with experimental data available in the literature.

  8. Mfold web server for nucleic acid folding and hybridization prediction

    PubMed Central

    Zuker, Michael

    2003-01-01

    The abbreviated name, ‘mfold web server’, describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software to the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single strand frequency plots and ‘energy dot plots’, are available for the folding of single sequences. A variety of ‘bulk’ servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as ‘MFOLDROOT’. PMID:12824337

  9. Scientific Use Cases for the Virtual Atomic and Molecular Data Center

    NASA Astrophysics Data System (ADS)

    Dubernet, M. L.; Aboudarham, J.; Ba, Y. A.; Boiziot, M.; Bottinelli, S.; Caux, E.; Endres, C.; Glorian, J. M.; Henry, F.; Lamy, L.; Le Sidaner, P.; Møller, T.; Moreau, N.; Rénié, C.; Roueff, E.; Schilke, P.; Vastel, C.; Zwoelf, C. M.

    2014-12-01

    The VAMDC Consortium is a worldwide consortium that federates interoperable atomic and molecular databases through an e-science infrastructure. The data it contains are of the highest scientific quality and are crucial for many applications: astrophysics, atmospheric physics, fusion, plasma and lighting technologies, health, etc. In this paper we present astrophysical scientific use cases for the VAMDC e-infrastructure. These cover very different applications, such as: (i) modeling the spectra of interstellar objects using the myXCLASS software tool implemented in the Common Astronomy Software Applications package (CASA), or using the CASSIS software tool in its stand-alone version or implemented in the Herschel Interactive Processing Environment (HIPE); (ii) the use of Virtual Observatory tools accessing VAMDC databases; (iii) access to VAMDC from the Paris solar BASS2000 portal; (iv) the combination of tools and databases from the APIS service (Auroral Planetary Imaging and Spectroscopy); and (v) the combination of heterogeneous data for application to the interstellar medium using the SPECTCOL tool.

  10. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single metadata catalog. The entire BiG CZ Software system is being developed on public repositories as a modular suite of open source software projects. It will be built around a new Observations Data Model Version 2.0 (ODM2) that has been developed by members of the BiG CZ project team, with community input, under separate funding.

  11. Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.

    PubMed

    Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas

    2016-01-01

    More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  12. Data You May Like: A Recommender System for Research Data Discovery

    NASA Astrophysics Data System (ADS)

    Devaraju, A.; Davy, R.; Hogan, D.

    2016-12-01

    Various data portals have been developed to facilitate access to research datasets from different sources, for example the Data Publisher for Earth & Environmental Science (PANGAEA), the Registry of Research Data Repositories (re3data.org), and the National Geoscience Data Centre (NGDC). Due to data quantity and heterogeneity, finding relevant datasets on these portals may be difficult and tedious. Keyword searches based on specific metadata elements or multi-key indexes may return irrelevant results, and faceted searches may be unsatisfactory and time-consuming, especially when facet values are exhaustive. We need a much more intelligent way to complement existing search mechanisms in order to enhance the user experience of these portals. We developed a recommender system that helps users find the most relevant research datasets on CSIRO's Data Access Portal (DAP). The system is based on content-based filtering. We computed the similarity of datasets based on data attributes (e.g., descriptions, fields of research, location, contributors, and provenance) and on inference from transaction logs (e.g., the relations among datasets and between queries and datasets). We improved recommendation quality by assigning weights to the data similarities; the weight values are drawn from a survey of data users. The recommender results for a given dataset are accessible programmatically via a web service. Taking both data attributes and user actions into account, the recommender system makes it easier for researchers to find and reuse data offered through the data portal.
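    A minimal sketch of the content-based filtering step, assuming TF-IDF cosine similarity over two metadata fields and invented weights (the paper derives its weights from a user survey):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      datasets = [
          {"description": "soil moisture observations murray darling basin",
           "keywords": "hydrology soil agriculture"},
          {"description": "rainfall grids murray darling basin 1900-2015",
           "keywords": "hydrology climate precipitation"},
          {"description": "marine plankton survey great barrier reef",
           "keywords": "oceanography biology"},
      ]

      WEIGHTS = {"description": 0.7, "keywords": 0.3}  # hypothetical survey-derived weights


      def similarity_matrix(field):
          texts = [d[field] for d in datasets]
          tfidf = TfidfVectorizer().fit_transform(texts)
          return cosine_similarity(tfidf)


      # Combine the per-field similarity matrices with their weights.
      combined = sum(w * similarity_matrix(f) for f, w in WEIGHTS.items())

      # Recommend for dataset 0: rank the others by combined similarity.
      scores = sorted(((combined[0, j], j) for j in range(1, len(datasets))),
                      reverse=True)
      print(scores)  # the rainfall dataset should outrank the plankton survey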

  13. The Emergence of Open-Source Software in North America

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…

  14. KnowLife: a versatile approach for constructing a large knowledge graph for biomedical sciences.

    PubMed

    Ernst, Patrick; Siu, Amy; Weikum, Gerhard

    2015-05-14

    Biomedical knowledge bases (KBs) have become important assets in the life sciences. Prior work on KB construction has three major limitations. First, most biomedical KBs are manually built and curated, and cannot keep up with the rate at which new findings are published. Second, for automatic information extraction (IE), the text genre of choice has been scientific publications, neglecting sources like health portals and online communities. Third, most prior work on IE has focused on the molecular level or chemogenomics only, like protein-protein interactions or gene-drug relationships, or has solely addressed highly specific topics such as drug effects. We address these three limitations with a versatile and scalable approach to automatic KB construction. Using a small number of seed facts for distant supervision of pattern-based extraction, we harvest a huge number of facts in an automated manner without requiring any explicit training. We extend previous techniques for pattern-based IE with confidence statistics, and we combine this recall-oriented stage with logical reasoning for consistency constraint checking to achieve high precision. To our knowledge, this is the first method that uses consistency checking for biomedical relations. Our approach can easily be extended to incorporate additional relations and constraints. We ran extensive experiments not only on scientific publications, but also on encyclopedic health portals and online communities, creating different KBs based on different configurations. We assess the size and quality of each KB in terms of number of facts and precision. The best configured KB, KnowLife, contains more than 500,000 facts at a precision of 93% for 13 relations covering genes, organs, diseases, symptoms, treatments, as well as environmental and lifestyle risk factors. KnowLife is a large knowledge base for the health and life sciences, automatically constructed from different Web sources. As a unique feature, KnowLife is harvested from different text genres such as scientific publications, health portals, and online communities. Thus, it has the potential to serve as a one-stop portal for a wide range of relations and use cases. To showcase its breadth and usefulness, we make the KnowLife KB accessible through the health portal (http://knowlife.mpi-inf.mpg.de).
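
    The recall-oriented stage lends itself to a small sketch: seed facts label sentences, connecting phrases become patterns with frequency-based confidence, and the patterns then yield new candidate facts. Everything below (seeds, corpus, scoring) is a toy assumption, and the paper's consistency-reasoning stage is omitted.

        # Distant supervision for pattern-based relation extraction (toy sketch).
        import re
        from collections import Counter

        seeds = {("aspirin", "headache")}          # seed facts for relation "treats"
        corpus = [
            "Aspirin is commonly used to treat headache in adults.",
            "Ibuprofen is commonly used to treat fever.",
            "Aspirin may cause stomach irritation.",
        ]

        def middle_pattern(sentence, e1, e2):
            """Return the lowercased span between the two entities, or None."""
            m = re.search(rf"{e1}\b(.*?)\b{e2}", sentence, flags=re.IGNORECASE)
            return m.group(1).strip().lower() if m else None

        # 1) Harvest patterns from sentences that mention a seed pair.
        patterns = Counter()
        for s in corpus:
            for e1, e2 in seeds:
                p = middle_pattern(s, e1, e2)
                if p:
                    patterns[p] += 1  # frequency as a crude confidence statistic

        # 2) Apply harvested patterns to extract new candidate facts.
        entities = ["aspirin", "ibuprofen", "headache", "fever"]
        for s in corpus:
            for e1 in entities:
                for e2 in entities:
                    if e1 != e2 and middle_pattern(s, e1, e2) in patterns:
                        print((e1, "treats", e2))  # e.g. ('ibuprofen', 'treats', 'fever')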

  15. Interactive Maps on War and Peace: A WebGIS Application for Civic Education

    NASA Astrophysics Data System (ADS)

    Wirkus, Lars; Strunck, Alexander

    2013-04-01

    War and violent conflict are omnipresent, be it war in the Middle East, violent conflicts in failed states or increasing military expenditures and exports/imports of military goods. To understand certain conflicts or peace processes and their possible interrelation, to conduct a well-founded political discussion and to support or influence decision-making, one matter is of special importance: easily accessible and, in particular, reliable data and information. Against this background, the Bonn International Center for Conversion (BICC), in close cooperation with the German Federal Agency for Civic Education (bpb), has been developing a map-based information portal on war and peace with various thematic modules for the latter's online service (http://sicherheitspolitik.bpb.de). The portal will eventually offer nine such modules that are intended to give various target groups, such as interested members of the public, teachers and learners, policymakers and representatives of the media, access to the required information in the form of an interactive, country-based global overview or a comparison of different issues. Five thematic modules have been completed so far: war and conflict, peace and demobilization, military capacities, resources and conflict, and conventional weapons. The portal offers a broad spectrum of different data processing and visualization tools. Its central feature is an interactive mapping component based on WebGIS and a relational database. Content and data provided through thematic maps in the form of WebGIS layers are generally supplemented by infographics, data tables and short articles providing deeper knowledge on the respective issue. All modules and their sub-chapters are introduced by background texts. These put all interactive maps of a module into an appropriate context and help the users understand the interrelations between the various layers. If a layer is selected, all corresponding texts and graphics are shown automatically below the map. Data tables are offered if the copyright of the datasets allows such use. All data of all thematic modules are presented in country profiles in a consolidated manner. The portal has been created with open source software. PostgreSQL and PostGIS, MapServer, OpenLayers, MapProxy and cmsmadesimple are combined to manipulate and transform global data sets into interactive thematic maps. A purpose-programmed layer selection menu enables users to select single layers or to combine up to three matching layers from all possible pre-set layer combinations. This applies both to fields of topics within a module and across various modules. Due to the complexity of the structure and visualization constraints, no more than three layers can be combined. The WebGIS-based information portal on war and peace is an excellent example of how GIS technologies can be used for education and outreach. Not only can they play a crucial role in supporting the educational mandate and mission of certain institutions. They can also directly support various target groups in obtaining the knowledge needed by providing a collection of straightforwardly designed, ready-to-use data, infographics and maps.

  16. Schroedinger’s code: Source code availability and transparency in astrophysics

    NASA Astrophysics Data System (ADS)

    Ryan, PW; Allen, Alice; Teuben, Peter

    2018-01-01

    Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal's 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October 2017.
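
    A minimal sketch of the kind of link-rot test described above, assuming a simple regular-expression URL extraction and an HTTP HEAD check with a GET fallback; the sample text is illustrative.

        # Extract URLs from text and test whether each is still reachable.
        import re
        import requests

        text = "Code at https://example.org/code and data at https://example.org/data."
        urls = [u.rstrip(".,;)") for u in re.findall(r"https?://\S+", text)]

        for url in urls:
            try:
                r = requests.head(url, allow_redirects=True, timeout=10)
                if r.status_code >= 400:  # some servers reject HEAD; retry with GET
                    r = requests.get(url, allow_redirects=True, timeout=10)
                status = r.status_code
            except requests.RequestException as exc:
                status = f"unreachable ({type(exc).__name__})"
            print(url, "->", status)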

  17. Free for All: Open Source Software

    ERIC Educational Resources Information Center

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  18. A Science Portal and Archive for Extragalactic Globular Cluster Systems Data

    NASA Astrophysics Data System (ADS)

    Young, Michael; Rhode, Katherine L.; Gopu, Arvind

    2015-01-01

    For several years we have been carrying out a wide-field imaging survey of the globular cluster populations of a sample of giant spiral, S0, and elliptical galaxies with distances of ~10-30 Mpc. We use mosaic CCD cameras on the WIYN 3.5-m and Kitt Peak 4-m telescopes to acquire deep BVR imaging of each galaxy and then analyze the data to derive global properties of the globular cluster system. In addition to measuring the total numbers, specific frequencies, spatial distributions, and color distributions for the globular cluster populations, we have produced deep, high-quality images and lists of tens to thousands of globular cluster candidates for the ~40 galaxies included in the survey. With the survey nearing completion, we have been exploring how to efficiently disseminate not only the overall results, but also all of the relevant data products, to the astronomical community. Here we present our solution: a scientific portal and archive for extragalactic globular cluster systems data. With a modern and intuitive web interface built on the same framework as the WIYN One Degree Imager Portal, Pipeline, and Archive (ODI-PPA), our system will provide public access to the survey results and the final stacked mosaic images of the target galaxies. In addition, the astrometric and photometric data for thousands of identified globular cluster candidates, as well as for all point sources detected in each field, will be indexed and searchable. Where available, spectroscopic follow-up data will be paired with the candidates. Advanced imaging tools will enable users to overlay the cluster candidates and other sources on the mosaic images within the web interface, while metadata charting tools will allow users to rapidly and seamlessly plot the survey results for each galaxy and the data for hundreds of thousands of individual sources. Finally, we will appeal to other researchers with similar data products and work toward making our portal a central repository for data related to well-studied giant galaxy globular cluster systems. This work is supported by NSF Faculty Early Career Development (CAREER) award AST-0847109.

  19. Dynamic mobility applications open source application development portal : Task 3.3 : concept of operations : final report.

    DOT National Transportation Integrated Search

    2016-10-12

    The Dynamic Mobility Applications (DMA) program seeks to promote the highest level of collaboration and preservation of intellectual capital generated from application development and associated research activities funded by the program. The program ...

  20. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet an organization's requirements is a difficult aspect of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed, and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS methods. The experimental results showed that the GNUmed and OpenEMR packages ranked higher than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
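
    A minimal TOPSIS sketch in Python may clarify the ranking step. The decision matrix, criteria, and weights below are toy assumptions; in the paper's method the weights would come from the AHP stage.

        # TOPSIS: rank candidates by closeness to an ideal solution.
        import numpy as np

        # Rows: candidate EMR packages; columns: criteria scores (assumed values).
        X = np.array([
            [8.0, 7.0, 9.0],   # package A
            [6.0, 9.0, 7.0],   # package B
            [7.0, 6.0, 8.0],   # package C
        ])
        weights = np.array([0.5, 0.3, 0.2])     # e.g. from AHP pairwise comparisons
        benefit = np.array([True, True, True])  # all criteria "higher is better"

        # 1) Vector-normalize each column, then apply the weights.
        V = weights * X / np.linalg.norm(X, axis=0)

        # 2) Ideal and anti-ideal solutions per criterion.
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

        # 3) Closeness coefficient: distance to anti-ideal over total distance.
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti, axis=1)
        closeness = d_neg / (d_pos + d_neg)

        print(np.argsort(-closeness))  # candidate indices, best first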

  1. GRIP Collaboration Portal: Information Management for a Hurricane Field Campaign

    NASA Astrophysics Data System (ADS)

    Conover, H.; Kulkarni, A.; Garrett, M.; Smith, T.; Goodman, H. M.

    2010-12-01

    NASA’s Genesis and Rapid Intensification Processes (GRIP) experiment, carried out in August and September of 2010, was a complex operation involving three aircraft and their crews based at different airports, a dozen instrument teams, mission scientists, weather forecasters, project coordinators and a variety of other participants. In addition, GRIP was coordinated with concurrent airborne missions: NOAA’s IFEX and, later, the NSF-funded PREDICT. The GRIP Collaboration Portal was developed to facilitate communication within and between the different teams and to serve as an information repository for the field campaign, providing a single access point for project documents, plans, weather forecasts, flight reports and quicklook data. The portal was developed using the Drupal open source content management framework. This presentation will cover both technology and participation issues. Specific examples include: Drupal’s large and diverse open source developer community is an advantage in that we were able to reuse many modules rather than develop capabilities from scratch, but integrating multiple modules developed by many people adds to the overall complexity of the site. Many of the communication capabilities provided by the site, such as discussion forums and blogs, were not used; participants were diligent about posting necessary documents, but the favored communication method remained email. Drupal's developer-friendly nature allowed for quick development of the customized functionality needed to accommodate the rapidly changing requirements of the GRIP experiment. (Figure: DC-8 overflight of Hurricane Earl during the GRIP mission.)

  2. Performance of an RPM based on Gd-lined plastic scintillator for neutron and gamma detection [ANIMMA-2015-IO-372]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanchini, Erica

    A Radiation Portal Monitor (RPM) was developed by the Istituto Nazionale di Fisica Nucleare (INFN) and Ansaldo Nucleare (ANN) within the FP7 SCINTILLA European project. The system was designed to detect both gamma and neutron radiation with a single technology. It is conceived to monitor vehicles and cargo containers in transit across borders or ports, to find radioactive elements and to prevent illegal trafficking of strategic nuclear materials. The system is based on a ³He-free neutron detection technology using plastic scintillators coupled to gadolinium to detect and discriminate gamma from neutron signals. During the 3 years of the SCINTILLA project, the construction and testing of the first two prototypes drove the definition of the final layout of a full RPM system consisting of two twin pillars forming a portal for vehicle and cargo container scanning. A custom System Control Software (SCS) manages the electronics of the RPM, the ancillary devices and the data analysis. The combination of the detector layout and the software functionalities enables the system both to distinguish neutrons from gammas and to identify the energy range of a detected gamma source. The system was initially characterized via static tests with gamma and neutron sources in the INFN laboratory. These measurements were used to calibrate the detector, evaluate the response of the single pillars as well as of the full system, and optimize the RPM configuration and discrimination algorithm. During this phase, specific tests were performed to study the stability of the system over time, monitoring the measured neutron and gamma count rates over periods of several weeks. The results allow us to demonstrate the reliability and robustness of the RPM. In a second phase, the RPM performance was studied via dynamic tests performed during the SCINTILLA test and benchmark campaigns. These measurements took place in the JRC ITRAP+10 facility at Ispra (Varese, Italy). The laboratory is equipped with an experimental set-up for dynamic tests of multiple systems according to international standards. The measurements utilized radioactive sources with activities selected according to ANSI and IEC standards to test the detector alarm performance in terms of gamma and neutron response, sensitivity to high gamma fields, sensitivity to moderated neutron sources, and false alarm rates (FAR). In addition, the RPM was tested in challenging configurations exceeding the requirements set by international standards to determine the real limits of the system. The results obtained during these campaigns demonstrated that the system detection efficiency is not only compliant with international standards for its category but often exceeds them, demonstrating the validity of the chosen technology and of the implemented layout. The positive performance also showed the effectiveness of the SCS and its functionalities. To further demonstrate the system capabilities, a test of the RPM in a real-life environment is planned for the near future by installing the detectors in a seaport. In this presentation I will give an overview of the RPM characteristics, of its performance as determined in the test campaigns mentioned above and of future plans, to demonstrate how this technology can be an effective choice for the realization of ³He-free RPM detectors. (authors)

  3. Climate tools in mainstream Linux distributions

    NASA Astrophysics Data System (ADS)

    McKinstry, Alastair

    2015-04-01

    Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs surfaces security weaknesses that could previously be ignored. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, enabling libraries and components (e.g. Python modules) to be integrated requires planning by their writers: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.

  4. Developing open-source codes for electromagnetic geophysics using industry support

    NASA Astrophysics Data System (ADS)

    Key, K.

    2017-12-01

    Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.

  5. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
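
    A hedged sketch of the kind of multiresolution analysis involved: a multi-level 2-D discrete wavelet decomposition of a phantom-like image, with detail-band energies as crude feature indicators. The synthetic image, wavelet choice, and energy score are assumptions, not the paper's actual detection algorithm.

        # Multi-level 2-D wavelet decomposition of a synthetic "phantom" image.
        import numpy as np
        import pywt

        # Smooth noisy background plus a small bright speck standing in for
        # a microcalcification.
        rng = np.random.default_rng(0)
        image = rng.normal(100.0, 2.0, (256, 256))
        image[120:123, 120:123] += 40.0

        # Three-level 2-D discrete wavelet decomposition.
        coeffs = pywt.wavedec2(image, "db2", level=3)

        # Detail-band energy per level; coeffs[1:] runs from the coarsest
        # to the finest detail level. Small high-contrast features show up
        # as excess energy in the fine-scale bands.
        for level, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
            energy = sum(float(np.sum(c ** 2)) for c in (cH, cV, cD))
            print(f"detail level {level} energy: {energy:.1f}")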

  6. LabKey Server NAb: A tool for analyzing, visualizing and sharing results from neutralizing antibody assays

    PubMed Central

    2011-01-01

    Background: Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results: We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb) assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions: Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the LabKey Server NAb tool without installing the software by using the Atlas Science Portal (https://atlas.scharp.org). Atlas is an installation of LabKey Server. PMID:21619655
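
    The core calculation the tool automates can be sketched as follows: percent neutralization per dilution from luminescence readings and control wells, a fitted dose-response curve, and a titer read off at a benchmark (50%) neutralization. The readings and the four-parameter logistic model below are illustrative assumptions.

        # Percent neutralization and titer estimation from luminescence (sketch).
        import numpy as np
        from scipy.optimize import curve_fit, brentq

        dilutions = np.array([20, 60, 180, 540, 1620, 4860], dtype=float)
        sample_lum = np.array([5e3, 1.2e4, 4.0e4, 9.0e4, 1.4e5, 1.7e5])
        virus_ctrl, cell_ctrl = 1.8e5, 2e3  # no-serum and no-virus control wells

        # Percent neutralization relative to the control wells.
        neut = 100 * (virus_ctrl - sample_lum) / (virus_ctrl - cell_ctrl)

        def logistic4(log_dil, bottom, top, log_ec50, slope):
            """Four-parameter logistic dose-response curve."""
            return bottom + (top - bottom) / (1 + 10 ** ((log_dil - log_ec50) * slope))

        x = np.log10(dilutions)
        popt, _ = curve_fit(logistic4, x, neut, p0=[0, 100, np.log10(500), 1.0])

        # Titer = reciprocal dilution where the fitted curve crosses 50%.
        titer = 10 ** brentq(lambda v: logistic4(v, *popt) - 50, x.min(), x.max())
        print(f"ID50 titer ~ {titer:.0f}")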

  7. The CLIMB Geoportal - A web-based dissemination and documentation platform for hydrological modelling data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer

    2015-04-01

    Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services are basically meant for distributing, documenting and visualizing (spatial) data in a standardized manner; an important but challenging task, especially in large scientific projects with a high number of data suppliers and producers from various countries. This presentation focuses on introducing the free, open-source geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu) that serves as the central platform for interchanging project-related spatial data and information. In this collaboration, financed by the EU FP7 framework and coordinated at LMU Munich, 21 partner institutions from nine European and non-European countries were involved. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data about the current state and future changes of the hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling outcomes - validated by the CLIMB partners - are offered to the public in the form of Web Map Services (WMS), whereas downloading the underlying data itself through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators, such as discharge and a drought index, as well as uncertainty measures, including their changes over time, is provided at different spatial resolutions. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB project. The implementation of the CLIMB Geoportal is based on version 2.0c5 of the open source geospatial content management system GeoNode. It includes a GeoServer instance for providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well as a built-in WebGIS client based on GeoExt (GeoExplorer). PostgreSQL enhanced by PostGIS, in versions 9.2.1/2.0.1, serves as the database backend for all base data of the study sites and for the time series of relevant hydrological indicators. Spatial model results in raster format are stored file-based as GeoTIFFs. Due to the high number of model outputs, the generation of metadata (xml) and graphical rendering instructions (sld) associated with each single layer of the WMS has been done automatically using the statistical software R. Additional applications that have been programmed during the project period include a Java-based interface for comfortable download of the climate data that was initially needed as input for hydrological modeling, as well as a tool for displaying time series of selected risk indicators, directly integrated into the portal structure and implemented using Python (Django) and JavaScript. The presented CLIMB Geoportal shows that relevant results of even large international research projects, involving many partners and varying national standards in data handling, can be effectively disseminated to stakeholders, policy makers and other interested parties. Thus, it is a successful example of using free and open-source software for providing long-term visibility and access to data produced within a particular (environmental) research project.
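
    For a reader wanting to consume such services programmatically, a hedged OWSLib sketch follows; the endpoint URL, layer name, and bounding box are placeholders, not the CLIMB portal's actual configuration.

        # Requesting a map image from an OGC WMS endpoint with OWSLib.
        from owslib.wms import WebMapService

        # Placeholder endpoint; substitute the real GeoServer WMS URL.
        wms = WebMapService("http://lgi-climbsrv.example/geoserver/wms", version="1.1.1")
        print(list(wms.contents))  # advertised layers

        img = wms.getmap(
            layers=["climb:drought_index_2050"],  # hypothetical layer name
            srs="EPSG:4326",
            bbox=(8.0, 38.5, 10.0, 41.5),         # lon/lat box, e.g. around Sardinia
            size=(600, 800),
            format="image/png",
        )
        with open("drought_index.png", "wb") as f:
            f.write(img.read())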

  8. [The use of open source software in graphic anatomic reconstructions and in biomechanic simulations].

    PubMed

    Ciobanu, O

    2009-01-01

    The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open-source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open-source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of the open-source software available today for 3D reconstruction and biomechanical simulation. The use of open-source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.
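
    A minimal sketch of such an open-source pipeline, assuming pydicom for loading the CT series and scikit-image's marching cubes for surface reconstruction; the file paths and the bone threshold are illustrative assumptions.

        # DICOM CT series -> Hounsfield volume -> triangulated 3-D surface.
        import glob
        import numpy as np
        import pydicom
        from skimage import measure

        # 1) Load and sort the slices of one CT series (path is an assumption).
        slices = [pydicom.dcmread(f) for f in glob.glob("ct_series/*.dcm")]
        slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))
        volume = np.stack([ds.pixel_array.astype(np.int16) for ds in slices])

        # 2) Convert raw pixel values to Hounsfield units.
        hu = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)

        # 3) Extract a triangulated surface; marching cubes at ~300 HU acts
        #    as a simple bone-threshold segmentation.
        verts, faces, normals, values = measure.marching_cubes(hu, level=300.0)
        print(f"{len(verts)} vertices, {len(faces)} triangles")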

  9. Three-dimensional path planning software-assisted transjugular intrahepatic portosystemic shunt: a technical modification.

    PubMed

    Tsauo, Jiaywei; Luo, Xuefeng; Ye, Linchao; Li, Xiao

    2015-06-01

    This study was designed to report our results with a modified technique of three-dimensional (3D) path planning software-assisted transjugular intrahepatic portosystemic shunt (TIPS) creation. 3D path planning software was recently developed to facilitate TIPS creation by using two carbon dioxide portograms acquired at least 20° apart to generate a 3D path for overlay needle guidance. However, one shortcoming is that puncturing along the overlay would be technically impossible if the angle of the liver access set and the angle of the 3D path are not the same. To solve this problem, a prototype of the 3D path planning software was fitted with a utility to calculate the angle of the 3D path. Using this, we modified the angle of the liver access set accordingly during the procedure in ten patients. Failure for technical reasons occurred in three patients (unsuccessful wedged hepatic venography in two cases, software technical failure in one case). The procedure was successful in the remaining seven patients, and only one needle pass was required to obtain portal vein access in each case. The course of puncture was comparable to the 3D path in all patients. No procedure-related complications occurred. Adjusting the angle of the liver access set to match the angle of the 3D path determined by the software appears to be a favorable modification to the technique of 3D path planning software-assisted TIPS.

  10. Library Employment Sources on the Internet

    ERIC Educational Resources Information Center

    Barr, Catherine

    2012-01-01

    This article presents a list of online resources for library job seekers. This includes general sites/portals like American Library Association (ALA): Education & Careers and Canadian Library Association: Library Careers. It also includes sites by sector, employment agencies/commercial services, listservs and networking sites.

  11. GEODE (Geo-Data Explorer) - A U.S. Geological Survey Application for Data Retrieval, Display, and Analysis through the Internet

    USGS Publications Warehouse

    Levine, Marc; Schultz, Adam

    2001-01-01

    GEODE (Geo-Data Explorer) is a free service offered by the U.S. Geological Survey (USGS) on the Internet at http://geode.usgs.gov (fig. 1). It provides digital geographically referenced data to the desktop computers of any user, including policymakers, land and resource managers, educators, industries, and private citizens. The ultimate goal of GEODE is to provide diverse users a gateway (data portal) that will supply real-time data and analysis over the Internet without the need for special hardware, software, and training.

  12. Integrated Lunar Information Architecture for Decision Support Version 3.0 (ILIADS 3.0)

    NASA Technical Reports Server (NTRS)

    Talabac, Stephen; Ames, Troy; Blank, Karin; Hostetter, Carl; Brandt, Matthew

    2013-01-01

    ILIADS 3.0 provides the data management capabilities to access CxP-vetted lunar data sets from the LMMP-provided Data Portal and the LMMP-provided On-Moon lunar data product server. (LMMP stands for Lunar Mapping and Modeling Project.) It also provides specific quantitative analysis functions to meet the stated LMMP Level 3 functional and performance requirements specifications that were approved by the CxP. The purpose of ILIADS 3.0 is to provide an integrated, rich-client lunar GIS software application.

  13. Providing scalable system software for high-end simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenberg, D.

    1997-12-31

    Detailed, full-system, complex physics simulations have been shown to be feasible on systems containing thousands of processors. In order to manage these computer systems it has been necessary to create scalable system services. In this talk Sandia's research on scalable systems will be described. The key concepts of low overhead data movement through portals and of flexible services through multi-partition architectures will be illustrated in detail. The talk will conclude with a discussion of how these techniques can be applied outside of the standard monolithic MPP system.

  14. Semantic e-Learning: Next Generation of e-Learning?

    NASA Astrophysics Data System (ADS)

    Konstantinos, Markellos; Penelope, Markellou; Giannis, Koutsonikos; Aglaia, Liopa-Tsakalidi

    Semantic e-learning aspires to be the next generation of e-learning, since the understanding of learning materials and knowledge semantics allows their advanced representation, manipulation, sharing, exchange and reuse, and ultimately promotes efficient online experiences for users. In this context, the paper firstly explores some fundamental Semantic Web technologies and then discusses current and potential applications of these technologies in the e-learning domain, namely, Semantic portals, Semantic search, personalization, recommendation systems, social software and Web 2.0 tools. Finally, it highlights future research directions and open issues of the field.

  15. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) specification to enable the sharing of transportation data across the federal level (with data from BTS/DOT), the state level (through VDOT) and industry (through Intergraph). CEOSR develops WFS solutions and technical documents using Intergraph software and disseminates them through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph; the ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with operational framework data service systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services; 2) VDOT's state transportation data and GIS infrastructure; and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web ``POST'' requests for transportation framework data as listed in Table 1; 2) builds a WFS service that can return data conforming to the drafted ANSI/INCITS L1 standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The WFS approach proved more efficient for sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhanced the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
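
    A hedged sketch of querying such a transportation framework WFS for features as GML, using OWSLib; the endpoint, feature type name, and bounding box are placeholders, not the project's actual service.

        # Requesting features from an OGC WFS endpoint with OWSLib.
        from owslib.wfs import WebFeatureService

        # Placeholder endpoint; substitute the real WFS URL.
        wfs = WebFeatureService("http://transport.example/wfs", version="1.1.0")
        print(list(wfs.contents))  # advertised feature types

        resp = wfs.getfeature(
            typename="framework:roads",       # hypothetical feature type
            bbox=(-77.5, 38.6, -77.0, 39.0),  # lon/lat box in northern Virginia
            maxfeatures=50,
        )
        with open("roads.gml", "wb") as f:
            f.write(resp.read())  # GML response, as described in the abstract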

  16. Stream Flow Prediction and Flood Mapping in the Hindu Kush-Himalaya with the ICIMOD Water Resources App Portal (IWRAP)

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Ames, D. P.; Jones, N.; Souffront, M.

    2016-12-01

    Earth observations of precipitation, temperature, moisture, and other atmospheric and land surface conditions form the foundation of global hydrologic forecasts that are increasingly available as native as well as derived products. The European Centre for Medium-Range Weather Forecasts (ECMWF) has developed such products for global flood awareness, which can be downscaled to smaller regions and used for stream flow prediction in underserved areas such as the Hindu Kush-Himalaya. Combined with digital elevation data, now available at 30 meters through the Shuttle Radar Topography Mission (SRTM), reconnaissance-level flood maps can be generated across wide regions where this would otherwise not be possible, and where additional information is available to drive higher-resolution models, the same forecasts can provide forcing inflows for improved flood maps. Advances in cloud computing offer a unique opportunity to facilitate deployment of water resources models as decision-making tools in the cloud-based ICIMOD Water Resources App Portal, or IWRAP. The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, problems that are exacerbated in the Hindu Kush-Himalaya where both financial and technical capacity are limited. All that is needed to use the system is an Internet connection and a web browser. We will take advantage of these technologies to develop tools that can be centrally maintained but openly accessible. Advanced mapping and visualization will make results intuitive and the derived information actionable. We will also take advantage of the emerging standards for sharing water information across the web, using the OGC- and WMO-approved WaterML standards. This will make our tools interoperable, and we will help train those we work with so that tools and data from other projects can both consume and share with the tools developed in our project.

  17. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized or formally studied, so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  18. Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software

    ERIC Educational Resources Information Center

    Hemphill, Thomas A.

    2005-01-01

    This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…

  19. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  20. Postprandial PYY increase by resistant starch supplementation is independent of net portal appearance of short-chain fatty acids in pigs.

    PubMed

    Ingerslev, Anne Krog; Mutt, Shivaprakash Jagalur; Lærke, Helle Nygaard; Hedemann, Mette Skou; Theil, Peter Kappel; Nielsen, Kirstine Lykke; Jørgensen, Henry; Herzig, Karl-Heinz; Bach Knudsen, Knud Erik

    2017-01-01

    Increased dietary fiber (DF) fermentation and short-chain fatty acid (SCFA) production may stimulate peptide tyrosine-tyrosine (PYY) secretion. In this study, the effects of hindgut SCFA production on postprandial PYY plasma levels were assessed using different experimental diets in a porto-arterial catheterized pig model. The pigs were fed experimental diets varying in source and levels of DF for one week in 3×3 Latin square designs. The DF sources were whole-wheat grain, wheat aleurone, rye aleurone-rich flour, rye flakes, and resistant starch. Postprandial blood samples were collected from the catheters and analyzed for PYY levels and net portal appearance (NPA) of PYY was correlated to NPA of SCFA. No significant effects of diets on NPA of PYY were observed (P > 0.05), however, resistant starch supplementation increased postprandial NPA of PYY levels by 37 to 54% compared with rye-based and Western-style control diets (P = 0.19). This increase was caused by higher mesenteric artery and portal vein PYY plasma levels (P < 0.001) and was independent of SCFA absorption (P > 0.05). The PYY levels were higher in response to the second daily meal compared with the first daily meal (P < 0.001), but similar among diets (P > 0.10). In conclusion, the increased postprandial PYY responses in pigs fed with different levels and sources of DF are not caused by an increased SCFA absorption and suggest that other mechanisms such as neural reflexes and possibly an increased flow of digesta in the small intestine may be involved. The content of DF and SCFA production did not affect PYY levels.

  1. Web portal on environmental sciences "ATMOS''

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Lykosov, V. N.; Fazliev, A. Z.

    2006-06-01

    The web portal ATMOS (http://atmos.iao.ru and http://atmos.scert.ru), developed under an INTAS grant, makes available to the international research community, environmental managers, and the interested public a bilingual information source for the domain of Atmospheric Physics and Chemistry, and the related application domain of air quality assessment and management. It offers access to integrated thematic information, experimental data, analytical tools and models, case studies, and related information and educational resources compiled, structured, and edited by the partners into a coherent and consistent thematic information resource. While offering the usual components of a thematic site such as link collections, user group registration, a discussion forum, a news section etc., the site is distinguished by its scientific information services and tools: on-line models and analytical tools, and data collections and case studies together with tutorial material. The portal is organized as a set of interrelated scientific sites, which address basic branches of Atmospheric Sciences and Climate Modeling as well as the applied domains of Air Quality Assessment and Management, Modeling, and Environmental Impact Assessment. Each scientific site is an information-computational system open for external access, realized by means of Internet technologies. The main basic science topics are devoted to Atmospheric Chemistry, Atmospheric Spectroscopy and Radiation, Atmospheric Aerosols, and Atmospheric Dynamics and Atmospheric Models, including climate models. The portal ATMOS reflects the current tendency of the Environmental Sciences to transform into exact (quantitative) sciences, and it is an effective example of the integration of modern Information Technologies and Environmental Sciences. This makes the portal both an auxiliary instrument supporting interdisciplinary projects on regional environments and an extensive educational resource in this important domain.

  2. Characterization of a prototype neutron portal monitor detector

    NASA Astrophysics Data System (ADS)

    Nakhoul, Nabil

    The main objective of this thesis is to provide characterization measurements on a prototype neutron portal monitor (NPM) detector constructed at the University of Massachusetts Lowell. NPM detectors are deployed at all United States border crossings and shipping ports to stop the illicit transfer of weapons-grade plutonium (WGPu) into our country. This large prototype detector with its 0.93 square meter face area is based on thermal neutron capture in 6Li as an alternate technology to the current, very expensive, 3He-based NPM. A neutron detection efficiency of 27.5 % is measured with a 252Cf source which has a spontaneous fission neutron spectrum very similar to that of 240Pu in WGPu. Measurements with an intense 137Cs source establish the extreme insensitivity of the prototype NPM to gamma-ray backgrounds with only one additional count registered for 1.1 million incident gamma rays. This detector also has the ability to locate neutron sources to within an angle of a few degrees. Its sensitivity is further demonstrated by discovering in a few-second measurement the presence of a 2 curie PuBe neutron source even at a distance of 95.5 feet. This thesis also covers in considerable detail the design features that give rise to both a high intrinsic neutron detection efficiency and an extreme gamma-ray insensitivity.

  3. Popularity of Russian information sources of medical education.

    PubMed

    Vasilyeva, Irina V; Arseniev, Sergey B

    2014-01-01

    The aim of the present study is to analyze the popularity of medical education information sources: the medical education site <webmedinfo.ru>, the medical information portal <meduniver.com>, the medical portal for students <6years.net>, and the electronic libraries of medical literature <booksmed.com>, <medliter.ru> and <medbook.net.ru>. Three sites (<www.webmedinfo.ru>, <meduniver.com> and <6years.net>) provide sources of medical literature, educational videos, medical histories, medical papers and popular medical literature. The three other sites (<www.booksmed.com>, <www.medliter.ru> and <www.medbook.net.ru>) provide electronic medical books on various subjects. Using the on-line programs Alexa and Cy-pr, we analyzed the websites' ratings and identified the main data and time-varying data of the sites. An Alexa Rank rating was determined for each site. Our study has shown that the most popular information source of medical education among the six studied sites for Russian users is <meduniver.com>; the site <booksmed.com> is in second place according to the Alexa Rank rating, and the site <webmedinfo.ru> is in second place according to the citation index in Yandex. The most popular medical site for electronic medical books is <booksmed.com>.

  4. Smart Grid Information Clearinghouse (SGIC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, Saifur

    Since the Energy Independence and Security Act of 2007 was enacted, a large number of websites have discussed the smart grid and relevant information, including those from government, academia, industry, the private sector and regulators. These websites collect information independently; smart grid information was therefore quite scattered and dispersed. The objective of this work was to develop, populate, manage and maintain the public Smart Grid Information Clearinghouse (SGIC) web portal. The information on the SGIC website is comprehensive and includes smart grid information, research & development, demonstration projects, technical standards, cost & benefit analyses, business cases, legislation, policy & regulation, and other information on lessons learned and best practices. The content on the SGIC website is logically grouped to allow easy browsing, searching and sorting. In addition to the browse and search features, the SGIC web portal also allows users to share their smart grid information with others through our online content submission platform. The Clearinghouse web portal therefore serves as a first-stop shop for smart grid information that collects smart grid information in a non-biased, non-promotional manner and can provide a missing link from information sources to end users and better serve users’ needs. The web portal is available at www.sgiclearinghouse.org. This report summarizes the work performed during the course of the project (September 2009 – August 2014). Section 2.0 lists the SGIC Advisory Committee and User Group members. Section 3.0 discusses the SGIC information architecture and web-based database application functionalities. Section 4.0 summarizes SGIC features and functionalities, including its search, browse and sort capabilities, web portal social networking, online content submission platform and security measures implemented. Section 5.0 discusses SGIC web portal contents, including smart grid 101, smart grid projects, deployment experience (i.e., use cases, lessons learned, cost-benefit analyses and business cases), in-depth information (i.e., standards, technology, cyber security, legislation, education and training and demand response), as well as international information. Section 6.0 summarizes SGIC statistics from the launch of the portal on July 07, 2010 to August 31, 2014. Section 7.0 summarizes publicly available information as a result of this work.

  5. The effect of partial portal decompression on portal blood flow and effective hepatic blood flow in man: a prospective study.

    PubMed

    Rosemurgy, A S; McAllister, E W; Godellas, C V; Goode, S E; Albrink, M H; Fabri, P J

    1995-12-01

    With the advent of transjugular intrahepatic porta-systemic stent shunt and the wider application of the surgically placed small diameter prosthetic H-graft portacaval shunt (HGPCS), partial portal decompression in the treatment of portal hypertension has received increased attention. The clinical results supporting the use of partial portal decompression are its low incidence of variceal rehemorrhage due to decreased portal pressures and its low rate of hepatic failure, possibly due to maintenance of blood flow to the liver. Surprisingly, nothing is known about changes in portal hemodynamics and effective hepatic blood flow following partial portal decompression. To prospectively evaluate changes in portal hemodynamics and effective hepatic blood flow brought about by partial portal decompression, the following were determined in seven patients undergoing HGPCS: intraoperative pre- and postshunt portal vein pressures and portal vein-inferior vena cava pressure gradients, intraoperative pre- and postshunt portal vein flow, and pre- and postoperative effective hepatic blood flow. With HGPCS, portal vein pressures and portal vein-inferior vena cava pressure gradients decreased significantly, although portal pressures remained above normal. In contrast to the significant decreases in portal pressures, portal vein blood flow and effective hepatic blood flow do not decrease significantly. Changes in portal vein pressures and portal vein-inferior vena cava pressure gradients are great when compared to changes in portal vein flow and effective hepatic blood flow. Reduction of portal hypertension with concomitant maintenance of hepatic blood flow may explain why hepatic dysfunction is avoided following partial portal decompression.

  6. SOURCE WATER CONTROL WITHIN THE MARY MURPHY MINE

    EPA Science Inventory

    The Mary Murphy mine is located in Chaffee County, Colorado, approximately 12 miles southwest of Buena Vista in the San Isabel National Forest. The mine drains water from multiple portals into Chalk Creek; this mine water contains elevated levels of zinc and cadmium which exce...

  7. Toxic Substances Portal- Arsenic

    MedlinePlus

    ... Has the federal government made recommendations to protect human health? The EPA has set limits on the amount of arsenic that industrial sources can release to the environment and has restricted or cancelled many of the uses of arsenic in pesticides. EPA has set a limit of 0.01 ...

  8. 48 CFR 227.7203-17 - Overseas contracts with foreign sources.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...

  9. 48 CFR 227.7203-17 - Overseas contracts with foreign sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...

  10. 48 CFR 227.7203-17 - Overseas contracts with foreign sources.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...

  11. 48 CFR 227.7203-17 - Overseas contracts with foreign sources.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...

  12. 48 CFR 227.7203-17 - Overseas contracts with foreign sources.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...

  13. Nurturing reliable and robust open-source scientific software

    NASA Astrophysics Data System (ADS)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo (zenodo.org). However, citations to these sources are not always recognized when computing citation metrics. In summary, the widespread development of reliable and robust open-source software relies on the creation of formal training programs in software development best practices and the recognition of software as a valid form of scholarship.

  14. The NIH 3D Print Exchange: A Public Resource for Bioscientific and Biomedical 3D Prints.

    PubMed

    Coakley, Meghan F; Hurt, Darrell E; Weber, Nick; Mtingwa, Makazi; Fincher, Erin C; Alekseyev, Vsevelod; Chen, David T; Yun, Alvin; Gizaw, Metasebia; Swan, Jeremy; Yoo, Terry S; Huyen, Yentram

    2014-09-01

    The National Institutes of Health (NIH) has launched the NIH 3D Print Exchange, an online portal for discovering and creating bioscientifically relevant 3D models suitable for 3D printing, to provide both researchers and educators with a trusted source to discover accurate and informative models. There are a number of online resources for 3D prints, but there is a paucity of scientific models, and the expertise required to generate and validate such models remains a barrier. The NIH 3D Print Exchange fills this gap by providing novel, web-based tools that empower users with the ability to create ready-to-print 3D files from molecular structure data, microscopy image stacks, and computed tomography scan data. The NIH 3D Print Exchange facilitates open data sharing in a community-driven environment, and also includes various interactive features, as well as information and tutorials on 3D modeling software. As the first government-sponsored website dedicated to 3D printing, the NIH 3D Print Exchange is an important step forward to bringing 3D printing to the mainstream for scientific research and education.

  15. Graphics interfaces and numerical simulations: Mexican Virtual Solar Observatory

    NASA Astrophysics Data System (ADS)

    Hernández, L.; González, A.; Salas, G.; Santillán, A.

    2007-08-01

    Preliminary results associated with the computational development and creation of the Mexican Virtual Solar Observatory (MVSO) are presented. Basically, the MVSO prototype consists of two parts: the first is related to observations that have been made during the past ten years at the Solar Observation Station (EOS) and at the Carl Sagan Observatory (OCS) of the Universidad de Sonora in Mexico. The second part is associated with the creation and manipulation of a database produced by numerical simulations related to solar phenomena, for which we use the MHD ZEUS-3D code. The development of this prototype was made using MySQL, Apache, Java and VSO 1.2, based on GNU software and the 'open source' philosophy. A graphic user interface (GUI) was created in order to make web-based, remote numerical simulations. For this purpose, Mono was used because it provides the necessary software to develop and run .NET client and server applications on Linux. Although this project is still under development, we hope to have access, by means of this portal, to other virtual solar observatories and to be able to count on a database created through numerical simulations or, where appropriate, to perform simulations associated with solar phenomena.

  16. KNMI Data Centre: Easy access for all

    NASA Astrophysics Data System (ADS)

    van de Vegte, John; Som de Cerff, Wim; Plieger, Maarten; de Vreede, Ernst; Sluiter, Raymond; Willem Noteboom, Jan; van der Neut, Ian; Verhoef, Hans; van Versendaal, Robert; van Binnendijk, Martin; Kalle, Henk; Knopper, Arthur; Spit, Jasper; Mastop, Joeri; Klos, Olaf; Calis, Gijs; Ha, Siu-Siu; van Moosel, Wim; Klein Ikkink, Henk-Jan; Tosun, Tuncay

    2013-04-01

    KNMI is the Dutch institute for weather, climate research and seismology. It disseminates weather information to the public at large, the government, aviation and the shipping industry in the interest of safety, the economy and a sustainable environment. To gain insight into long-term developments, KNMI conducts research on climate change. Making the knowledge, data and information on hand at KNMI accessible is a core activity. A huge part of the KNMI information comes from numerical models, in-situ sensor networks and remote sensing satellites. This digital collection is mostly available internally only and consists of non-searchable, non-standardized file formats, lacking documentation and references to scientific publications. The KNMI Data Centre (KDC) project tackles these issues. In the project, a user-driven development approach with SCRUM was chosen to get maximum user involvement in a relatively short development timeframe. Building on open standards and proven open source technology (which includes in-house developed software like ADAGUC WMS and Portal) resulted in a first release in December 2012. This presentation will focus on the technical challenges of KDC, the development strategy and the initial usage results of the data centre.
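
    Since ADAGUC serves maps through the standard OGC WMS interface, a map request is a plain HTTP query. The sketch below builds a GetMap URL; the host and layer name are placeholders, not the live KDC service.

        # Build an OGC WMS 1.3.0 GetMap request URL. The endpoint and
        # layer are hypothetical; only the parameter names are standard.
        from urllib.parse import urlencode

        params = {
            "service": "WMS", "version": "1.3.0", "request": "GetMap",
            "layers": "precipitation",          # hypothetical layer name
            "crs": "EPSG:4326", "bbox": "50,3,54,8",
            "width": 800, "height": 600, "format": "image/png",
        }
        print("https://example.knmi.nl/wms?" + urlencode(params))  # hypothetical host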

  17. Portal venous stent placement for treatment of portal hypertension caused by benign main portal vein stenosis.

    PubMed

    Shan, Hong; Xiao, Xiang-Sheng; Huang, Ming-Sheng; Ouyang, Qiang; Jiang, Zai-Bo

    2005-06-07

    To evaluate the value of endovascular stent placement in the treatment of portal hypertension caused by benign main portal vein stenosis. Portal vein stents were implanted in six patients with benign main portal vein stenosis (inflammatory stenosis in three cases, stenosis following liver transplantation in the other three). Changes in portal vein pressure, portal vein patency, relative clinical symptoms, complications, and survival were evaluated. Six metallic stents were successfully placed across the portal vein stenotic or obstructive lesions in six patients. Mean portal venous pressure decreased significantly after stent implantation, from 37.3 ± 4.7 cm H2O to 18.0 ± 1.9 cm H2O. Portal blood flow was restored and the symptoms caused by portal hypertension were eliminated. There were no severe procedure-related complications. The patients were followed up for 1-48 mo. The portal vein remained patent during follow-up. All patients survived except for one patient who died of other complications of liver transplantation. Percutaneous portal vein stent placement for the treatment of portal hypertension caused by benign main portal vein stenosis is safe and effective.

  18. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.

  19. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  20. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    PubMed

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
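
    For readers unfamiliar with the Bland-Altman statistics used above, the sketch below computes the bias and 95% limits of agreement for two hypothetical series of AP measurements; the numbers are invented for illustration.

        import numpy as np

        # Paired arterial perfusion (AP) readings from two software
        # packages (synthetic values, for illustration only).
        ap_software_a = np.array([28.1, 31.4, 25.9, 30.2, 27.5])
        ap_software_b = np.array([26.8, 33.0, 24.1, 31.9, 26.0])

        diff = ap_software_a - ap_software_b
        bias = diff.mean()             # mean difference (bias)
        loa = 1.96 * diff.std(ddof=1)  # half-width of 95% limits of agreement
        print(f"bias = {bias:.2f}; limits of agreement = "
              f"[{bias - loa:.2f}, {bias + loa:.2f}]")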

  1. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  2. Open access to high-level data and analysis tools in the CMS experiment at the LHC

    DOE PAGES

    Calderon, A.; Colling, D.; Huffman, A.; ...

    2015-12-23

    The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton-proton collision data (by volume) at 7 TeV from 2010 in CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and histogram application that run in the browser. In addition a virtual machine containing a CMS software environment along with XRootD access to the data is available. Within the virtual machine the public can analyse CMS data; example code is provided. Finally, we describe the accompanying tools and documentation and discuss the first experiences of data use.

  3. Allen Brain Atlas: an integrated spatio-temporal portal for exploring the central nervous system

    PubMed Central

    Sunkin, Susan M.; Ng, Lydia; Lau, Chris; Dolbeare, Tim; Gilbert, Terri L.; Thompson, Carol L.; Hawrylycz, Michael; Dang, Chinh

    2013-01-01

    The Allen Brain Atlas (http://www.brain-map.org) provides a unique online public resource integrating extensive gene expression data, connectivity data and neuroanatomical information with powerful search and viewing tools for the adult and developing brain in mouse, human and non-human primate. Here, we review the resources available at the Allen Brain Atlas, describing each product and data type [such as in situ hybridization (ISH) and supporting histology, microarray, RNA sequencing, reference atlases, projection mapping and magnetic resonance imaging]. In addition, standardized and unique features in the web applications are described that enable users to search and mine the various data sets. Features include both simple and sophisticated methods for gene searches, colorimetric and fluorescent ISH image viewers, graphical displays of ISH, microarray and RNA sequencing data, Brain Explorer software for 3D navigation of anatomy and gene expression, and an interactive reference atlas viewer. In addition, cross data set searches enable users to query multiple Allen Brain Atlas data sets simultaneously. All of the Allen Brain Atlas resources can be accessed through the Allen Brain Atlas data portal. PMID:23193282

  4. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of colocated relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations such as DoD contractors and other government agencies through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine if the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
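
    The row-level filtering step can be pictured with the sketch below; the policy rule, role name, and record fields are invented for illustration and do not reflect the actual ARL/ITA implementation.

        # Toy policy-enforcement proxy: every policy must permit a row
        # before it is returned to the client.
        from typing import Callable

        Row = dict
        Policy = Callable[[str, Row], bool]   # (role, row) -> allowed?

        def unclassified_only(role: str, row: Row) -> bool:
            # Hypothetical rule: contractors may only see unclassified rows.
            return role != "contractor" or row.get("classification") == "unclassified"

        def filter_result_set(role, rows, policies):
            """Return only the rows that every policy permits for this role."""
            return [r for r in rows if all(p(role, r) for p in policies)]

        rows = [
            {"id": 1, "classification": "unclassified", "signature": "acoustic"},
            {"id": 2, "classification": "secret", "signature": "seismic"},
        ]
        print(filter_result_set("contractor", rows, [unclassified_only]))  # row 1 only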

  5. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using commercial Materialise Mimics software and open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of mandible were reconstructed using both commercial Materialise Mimics and open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. Both models were compared using Wilcoxon Signed Rank Test and Hausdorff Distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using MITK open-source software is comparable to the commercial MIMICS software. Therefore, open-source software could be used in clinical setting for pre-operative planning to minimise the operational cost.

  6. Can your software engineer program your PLC?

    NASA Astrophysics Data System (ADS)

    Borrowman, Alastair J.; Taylor, Philip

    2016-07-01

    The use of Programmable Logic Controllers (PLCs) in the control of large physics experiments is ubiquitous. The programming of these controllers is normally the domain of engineers with a background in electronics; this paper introduces PLC program development from the software engineer's perspective. PLC programs provide the link between control software running on PC architecture systems and physical hardware controlled and monitored by digital and analog signals. The higher-level software running on the PC is typically responsible for accepting operator input and from this deciding when and how hardware connected to the PLC is controlled. The PLC accepts demands from the PC, considers the current state of its connected hardware and if correct to do so (based upon interlocks or other constraints) adjusts its hardware output signals appropriately for the PC's demands. A published ICD (Interface Control Document) defines the PLC memory locations available to be written and read by the PC to control and monitor the hardware. Historically the method of programming PLCs has been ladder diagrams that closely resemble circuit diagrams; however, PLC manufacturers nowadays also provide, and promote, the use of higher-level programming languages. Based on techniques used in the development of high-level PC software to control PLCs for multiple telescopes, this paper examines the development of PLC programs to operate the hardware of a medical cyclotron beamline controlled from a PC using the Experimental Physics and Industrial Control System (EPICS), which is also widely used in telescope control. The PLC used is the new generation Siemens S7-1200, programmed using Siemens' Pascal-based Structured Control Language (SCL), which is their implementation of Structured Text (ST). The approach described is that of a software engineer, utilising the Siemens Totally Integrated Automation (TIA) Portal integrated development environment (IDE) to create modular PLC programs based upon reusable functions capable of being unit tested without the PLC connected to hardware. Emphasis has been placed on designing an interface between EPICS and SCL that enforces correct operation of hardware through stringent separation of PC-accessible PLC memory and hardware I/O addresses used only by the PLC. The paper also introduces the method used to automate, from the same source document, the creation of the PLC memory structure (tag) definitions (defining memory used to access hardware I/O and memory accessed by the PC) and of the PC program data structures (EPICS database records) used to access the permitted PLC addresses. From direct experience, this paper demonstrates the advantages of PLC program development being shared between electronic and software engineers, enabling use of the most appropriate processes from the perspectives of both the hardware and the higher-level software used to control it.
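
    The single-source generation idea can be sketched as follows: one signal table drives both the PLC tag export and the EPICS database records. The CSV layout and both output syntaxes are simplified illustrations, not the exact Siemens or EPICS formats used by the authors.

        # Generate PLC tag definitions and EPICS records from one table.
        import csv, io

        SIGNALS = (
            "name,plc_address,epics_record,description\n"
            "beam_on,%DB1.DBX0.0,bi,Beamline on/off status\n"
            "magnet_current,%DB1.DBD4,ai,Magnet current readback\n"
        )

        def generate(source_csv):
            plc_tags, epics_db = [], []
            for row in csv.DictReader(io.StringIO(source_csv)):
                plc_tags.append(f'{row["name"]} AT {row["plc_address"]}')
                epics_db.append(
                    f'record({row["epics_record"]}, "{row["name"]}") {{\n'
                    f'    field(DESC, "{row["description"]}")\n}}'
                )
            return "\n".join(plc_tags), "\n\n".join(epics_db)

        tags, db = generate(SIGNALS)
        print(tags)
        print(db)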

  7. Building a Snow Data System on the Apache OODT Open Technology Stack

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Painter, T. H.; Mattmann, C. A.; Hart, A. F.; Ramirez, P.; Zimdars, P.; Bryant, A. C.; Snow Data System Team

    2011-12-01

    Snow cover and its melt dominate regional climate and hydrology in many of the world's mountainous regions. One-sixth of Earth's population depends on snow- or glacier-melt for water resources. Operationally, seasonal forecasts of snowmelt-generated streamflow are leveraged through empirical relations based on past snowmelt periods. These historical data show that climate is changing, but the changes reduce the reliability of the empirical relations. Therefore optimal future management of snowmelt derived water resources will require explicit physical models driven by remotely sensed snow property data. Toward this goal, the Snow Optics Laboratory at the Jet Propulsion Laboratory has initiated a near real-time processing pipeline to generate and publish post-processed snow data products within a few hours of satellite acquisition. To solve this challenge, a Scientific Data Management and Processing System was required and the JPL Team leveraged an open-source project called Object Oriented Data Technology (OODT). OODT was developed within NASA's Jet Propulsion Laboratory across the last 10 years. OODT has supported various scientific data management and processing projects, providing solutions in the Earth, Planetary, and Medical science fields. It became apparent that the project needed to be opened to a larger audience to foster and promote growth and adoption. OODT was open-sourced at the Apache Software Foundation in November 2010 and has a growing community of users and committers that are constantly improving the software. Leveraging OODT, the JPL Snow Data System (SnowDS) Team was able to install and configure a core Data Management System (DMS) that would download MODIS raw data files and archive the products in a local repository for post processing. The team has since built an online data portal, and an algorithm-processing pipeline using the Apache OODT software as the foundation. We will present the working SnowDS system with its core remote sensing components: the MODIS Snow Covered Area and Grain size model (MODSCAG) and the MODIS Dust Radiative Forcing in Snow (MOD-DRFS). These products will be delivered in near real time to water managers and the broader cryosphere and climate community beginning in Winter 2012. We will then present the challenges and opportunities we see in the future as the SnowDS matures and contributions are made back to the OODT project.
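
    The crawl-and-archive stage of such a pipeline can be pictured with the plain-Python sketch below; Apache OODT itself is a Java framework, and the directory names and polling interval here are hypothetical.

        # Poll a staging area for newly downloaded MODIS granules and move
        # them into a local archive, where post-processing would pick them up.
        import shutil, time
        from pathlib import Path

        STAGING = Path("staging")   # where new granules appear (hypothetical)
        ARCHIVE = Path("archive")   # local repository for post-processing

        def crawl_once():
            STAGING.mkdir(exist_ok=True)
            ARCHIVE.mkdir(exist_ok=True)
            for granule in STAGING.glob("*.hdf"):
                dest = ARCHIVE / granule.name
                if not dest.exists():          # skip files already archived
                    shutil.move(str(granule), str(dest))
                    print(f"archived {granule.name}; queue processing")

        if __name__ == "__main__":
            while True:
                crawl_once()
                time.sleep(3600)   # poll hourly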

  8. Leukemia and risk of recurrent Escherichia coli bacteremia: genotyping implicates E. coli translocation from the colon to the bloodstream.

    PubMed

    Samet, A; Sledzińska, A; Krawczyk, B; Hellmann, A; Nowicki, S; Kur, J; Nowicki, B

    2013-11-01

    In patients with leukemia, the portal(s) and reasons for the persistence of an Escherichia coli recurrent bacteremia remain unclear. Adult Hematology Clinic (AHC) databases at the State Clinical Hospital in Gdańsk were reviewed to evaluate the frequency of E. coli bacteremia between 2002 and 2005. Blood and bowel E. coli strains were obtained and the genetic relatedness of the strains was analyzed. The rate of E. coli bacteremia per 1,000 admissions at the AHC was higher (85.0) than in the other clinics of the hospital (2.9), p < 0.001. A higher mortality was observed in patients with a history of E. coli versus non-E. coli bacteremia [30/95 (31 %) vs. 53/430 (12 %), p < 0.001]; 72.8 % of patients with leukemia had an unknown source of bacteremia. In 2005, 6 out of 25 (24 %) patients with leukemia had ≥2 episodes of E. coli-positive blood cultures. These gastrointestinal E. coli isolates were replaced within 3-8 weeks with a new E. coli H genotype. A recurrent episode of bacteremia was usually caused by an infection with a transient E. coli H genotype identical to that found in the subject's bowel. Consistent with the definition of bowel/blood translocation, the bowel appeared to be a portal for E. coli in these subjects and, hence, a clear source for their recurring bacteremia.

  9. OSIRIX: open source multimodality image navigation software

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman

    2005-04-01

    The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need for high-end expensive hardware or software. We also elected to develop our system on new open source software libraries allowing other institutions and developers to contribute to this project. OsiriX is free and open-source imaging software designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/

  10. Using a Simple Knowledge Organization System to facilitate Catalogue and Search for the ESA CCI Open Data Portal

    NASA Astrophysics Data System (ADS)

    Wilson, Antony; Bennett, Victoria; Donegan, Steve; Juckes, Martin; Kershaw, Philip; Petrie, Ruth; Stephens, Ag; Waterfall, Alison

    2016-04-01

    The ESA Climate Change Initiative (CCI) is a €75m programme that runs from 2009-2016, with a goal to provide stable, long-term, satellite-based essential climate variable (ECV) data products for climate modellers and researchers. As part of the CCI, ESA have funded the Open Data Portal project to establish a central repository to bring together the data from these multiple sources and make it available in a consistent way, in order to maximise its dissemination amongst the international user community. Search capabilities are a critical component to attaining this goal. To this end, the project is providing dataset-level metadata in the form of ISO 19115 records served via a standard OGC CSW interface. In addition, the Open Data Portal is re-using the search system from the Earth System Grid Federation (ESGF), successfully applied to support CMIP5 (5th Coupled Model Intercomparison Project) and obs4MIPs. This uses a tightly defined controlled vocabulary of metadata terms, the Data Reference Syntax (DRS), which encompasses different aspects of the data. This system has facilitated the construction of a powerful faceted search interface to enable users to discover data at the individual file level of granularity through ESGF's web portal frontend. The use of a consistent set of model experiments for CMIP5 allowed the definition of a uniform DRS for all model data served from ESGF. For CCI, however, there are thirteen ECVs, each of which is derived from multiple sources and different science communities, resulting in highly heterogeneous metadata. An analysis has been undertaken of the concepts in use, with the aim of producing a CCI DRS that could provide a single authoritative source for cataloguing and searching the CCI data for the Open Data Portal. The use of SKOS (Simple Knowledge Organization System) and OWL (Web Ontology Language) to represent the DRS is a natural fit, providing controlled vocabularies as well as a way to represent relationships between similar terms used in different ECVs. An iterative approach has been adopted for the model development, working closely with domain experts and drawing on practical experience working with content in the input datasets. Tooling has been developed to enable the definition of vocabulary terms via a simple spreadsheet format which can then be automatically converted into Turtle notation and uploaded to the CCI DRS vocabulary service. With a baseline model established, work is underway to develop an ingestion pipeline to import validated search metadata into the ESGF and OGC CSW search services. In addition to the search terms indexed into the ESGF search system, ISO 19115 records will also be similarly tagged during this process with search terms from the data model. In this way it will be possible to construct a faceted search user interface for the Portal which can yield linked search results for data at both the file and dataset levels of granularity. It is hoped that this will also provide a rich range of content for third-party organisations wishing to incorporate access to CCI data in their own applications and services.
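
    The spreadsheet-to-Turtle conversion described above can be pictured with the sketch below; the namespace, columns, and concept identifiers are invented for illustration and are not the actual CCI vocabulary.

        # Turn tabular vocabulary rows into SKOS concepts in Turtle notation.
        rows = [
            {"id": "sea_surface_temperature", "label": "Sea Surface Temperature",
             "broader": "ecv"},
            {"id": "soil_moisture", "label": "Soil Moisture", "broader": "ecv"},
        ]

        PREFIXES = ("@prefix skos: <http://www.w3.org/2004/02/skos/core#> .\n"
                    "@prefix ex: <http://example.org/cci/> .\n")

        def to_turtle(rows):
            stmts = [PREFIXES]
            for r in rows:
                stmts.append(
                    f'ex:{r["id"]} a skos:Concept ;\n'
                    f'    skos:prefLabel "{r["label"]}"@en ;\n'
                    f'    skos:broader ex:{r["broader"]} .'
                )
            return "\n".join(stmts)

        print(to_turtle(rows))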

  11. SU-G-TeP4-02: A Method for Evaluating the Direct Impact of Failed IMRT QAs On Patient Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geneser, S; Butkus, M

    Purpose: We developed a method to calculate patient doses corresponding to IMRT QA measurements in order to determine and assess the actual dose delivered for plans with failed (or borderline) IMRT QA. This work demonstrates the feasibility of automatically computing delivered patient dose from portal dosimetry measurements in the Varian TPS system, which would provide a valuable and clinically viable IMRT QA tool for physicists and physicians. Methods: IMRT QA fluences were measured using portal dosimetry, processed using in-house matlab software, and imported back into Eclipse to calculate dose on the planning CT. To validate the proposed workflow, the Eclipse-calculated portal dose for a 5-field sliding window prostate boost plan was processed as described above. The resulting dose was compared to the planned dose and found to be within 0.5 Gy. Two IMRT QA results for the prostate boost plan (one that failed and one that passed) were processed and the resulting patient doses were evaluated. Results: The max dose difference between IMRT QA #1 and the original planned and approved dose is 4.5 Gy, while the difference between the planned and IMRT QA #2 dose is 4.0 Gy. The inferior portion of the PTV is slightly underdosed in both plans, and the superior portion is slightly overdosed. The patient dose resulting from IMRT QA #1 and #2 differs by only 0.5 Gy. With this new information, it may be argued that the evaluated plan alteration to obtain passing gamma analysis produced clinically irrelevant differences. Conclusion: Evaluation of the delivered QA dose on the planning CT provides valuable information about the clinical relevance of failed or borderline IMRT QAs. This particular workflow demonstrates the feasibility of pushing the measured IMRT QA portal dosimetry results directly back onto the patient planning CT within the Varian system.

  12. SU-E-T-647: Quality Assurance of VMAT by Gamma Analysis Dependence On Low-Dose Threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, J; Kim, M; Lee, S

    2015-06-15

    Purpose: The AAPM TG-119 instructed institutions to use low-dose threshold (LDT) of 10% or a ROI determined by the jaw when they collected gamma analysis QA data of planar dose distribution. Also, based on a survey by Nelms and Simon, more than 70% of institutions use a LDT between 0% and 10% for gamma analysis. However, there are no clinical data to quantitatively demonstrate the impact of the LDT on the gamma index. Therefore, we performed a gamma analysis with LDTs of 0% to 15% according to both global and local normalization and different acceptance criteria: 3%/3 mm, 2%/2 mm, and 1%/1 mm. Methods: A total of 30 treatment plans—10 head and neck, 10 brain, and 10 prostate cancer cases—were randomly selected from the Varian Eclipse TPS, retrospectively. For the gamma analysis, a predicted portal image was acquired through a portal dose calculation algorithm in the Eclipse TPS, and a measured portal image was obtained using a Varian Clinac iX and an EPID. Then, the gamma analysis was performed using the Portal Dosimetry software. Results: For the global normalization, the gamma passing rate (%GP) decreased as the LDT increased, and all cases of low-dose thresholds exhibited a %GP above 95% for both the 3%/3 mm and 2%/2 mm criteria. However, for local normalization, the %GP increased as LDT increased. The gamma passing rate with a LDT of 10% increased by 6.86%, 9.22% and 6.14% compared with 0% for the head and neck, brain, and prostate cases under the 3%/3 mm criteria, respectively. Conclusion: Applying the LDT in the global normalization does not have a critical impact on judging patient-specific QA results. However, the LDT for the local normalization should be carefully selected, because applying the LDT could cause the average %GP to increase rapidly.
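
    The sketch below shows how the LDT and the normalization choice enter a passing-rate calculation. A real gamma analysis also searches over a distance-to-agreement criterion; this simplified version keeps only the dose-difference term and uses synthetic dose planes.

        import numpy as np

        rng = np.random.default_rng(0)
        planned = rng.uniform(0.0, 2.0, size=(64, 64))   # synthetic dose plane
        measured = planned + rng.normal(0.0, 0.02, planned.shape)

        def passing_rate(planned, measured, criterion=0.03, ldt=0.10, local=False):
            mask = planned >= ldt * planned.max()   # drop pixels below the LDT
            norm = planned[mask] if local else planned.max()   # local vs global
            ok = np.abs(measured[mask] - planned[mask]) <= criterion * norm
            return 100.0 * ok.mean()

        for ldt in (0.0, 0.05, 0.10, 0.15):
            print(f"LDT {ldt:4.0%}: "
                  f"global {passing_rate(planned, measured, ldt=ldt):5.1f}%, "
                  f"local {passing_rate(planned, measured, ldt=ldt, local=True):5.1f}%")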

  13. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    ERIC Educational Resources Information Center

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  14. Open-source software: not quite endsville.

    PubMed

    Stahl, Matthew T

    2005-02-01

    Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.

  15. Preparing a scientific manuscript in Linux: Today's possibilities and limitations.

    PubMed

    Tchantchaleishvili, Vakhtang; Schmitto, Jan D

    2011-10-22

    An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow one to prepare a submission-ready scientific manuscript without the need for proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux.

  16. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational environments and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially to help analyze EarthScope's USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well-established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, which includes geological models that are accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, number of time steps to compute, and number of stations. By enabling portal-based access to such a computational environment, coupled with a dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and speed up the scientific discovery process.
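
    Building such an XML input programmatically might look like the sketch below; the element and attribute names are invented for illustration, since the abstract does not publish the actual SYNSEIS dialect.

        # Assemble a hypothetical E3D-style run description in XML.
        import xml.etree.ElementTree as ET

        run = ET.Element("e3d_run", time_steps="5000")
        ET.SubElement(run, "source", lat="36.1", lon="-117.8", depth_km="8.0")
        model = ET.SubElement(run, "velocity_model")
        ET.SubElement(model, "layer", vp="5.6", vs="3.2",
                      density="2700", thickness_km="10")
        stations = ET.SubElement(run, "stations")
        ET.SubElement(stations, "station", name="STA1", lat="36.5", lon="-118.0")

        print(ET.tostring(run, encoding="unicode"))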

  17. Computational knowledge integration in biopharmaceutical research.

    PubMed

    Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim

    2003-09-01

    An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.

  18. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    NASA Astrophysics Data System (ADS)

    Zhao, P.; Lynnes, C.; Vollmer, B.; Savtchenko, A. K.; Yang, W.

    2011-12-01

    Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real time Capability for EOS (LANCE) provide information on the global and regional atmospheric state with very low latency. An open and interoperable platform is useful to facilitate access to and integration of LANCE AIRS NRT data. This paper discusses the use of open-source software components to build Web services for publishing and accessing AIRS NRT data in the context of Service Oriented Architecture (SOA). The AIRS NRT data have also been made available through an OPeNDAP server. OPeNDAP allows several open-source netCDF-based tools such as Integrated Data Viewer, Ferret and Panoply to directly display the Level 2 data over the network. To enable users to locate swath data files in the OPeNDAP server that lie within a certain geographical area, graphical "granule maps" are being added to show the outline of each file on a map of the Earth. The metadata of the AIRS NRT data and services is then explored to implement information advertisement and discovery in catalogue systems. Datacasting, an RSS-based technology for accessing Earth Science data and information, which facilitates subscriptions to AIRS NRT data availability as well as filtering, downloading and viewing data, is also discussed. To provide an easy entry point to AIRS NRT data and services, a Web portal designed for customized data downloading and visualization is introduced.
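
    Reading a remote granule over OPeNDAP takes only a few lines with the netCDF4 library (when it is built with DAP support); the URL and variable name below are placeholders, not a real LANCE endpoint.

        # Open an OPeNDAP URL exactly as if it were a local file.
        from netCDF4 import Dataset

        URL = "https://example.gov/opendap/AIRS_NRT/granule.nc"   # hypothetical
        with Dataset(URL) as ds:
            print(ds.variables.keys())          # discover available variables
            field = ds.variables["TAirSup"][:]  # hypothetical variable name
            print(field.shape)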

  19. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high-latency communications links to examine how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The tested open source IPsec software did not meet all of the requirements. Software changes were suggested to meet the requirements.

  20. Resources monitoring and automatic management system for multi-VO distributed computing system

    NASA Astrophysics Data System (ADS)

    Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.

    2017-10-01

    Multi-VO support based on DIRAC has been set up to provide workload and data management for several high-energy experiments at IHEP. To monitor and manage the heterogeneous resources which belong to different Virtual Organizations in a uniform way, a resource monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC is presented in this paper. The system is composed of three parts: information collection, status decision and automatic control, and information display. Information collection includes active and passive ways of gathering status from different sources, storing the results in databases. Status decision and automatic control evaluate resource status and take control actions on resources automatically through pre-defined policies and actions. The monitoring information is displayed on a web portal. Both real-time and historical information can be obtained from the web portal. All the implementations are based on the DIRAC framework. Information and control, including sites, policies, and web portals for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
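
    The status-decision loop can be pictured as policies that map collected status information to a resource state, with control actions fired on the verdict; this is a toy sketch of the idea, not DIRAC's actual RSS API.

        # Each policy inspects collected status info and may return a verdict.
        POLICIES = [
            lambda info: "Banned" if info["failed_jobs_ratio"] > 0.5 else None,
            lambda info: "Degraded" if info["free_disk_gb"] < 10 else None,
        ]

        def evaluate(info):
            """Return the first verdict a policy produces, else 'Active'."""
            for policy in POLICIES:
                verdict = policy(info)
                if verdict:
                    return verdict
            return "Active"

        site_info = {"failed_jobs_ratio": 0.62, "free_disk_gb": 120}
        state = evaluate(site_info)
        if state == "Banned":
            print("action: remove site from the mask and notify admins")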

  1. A non-ideal portal frame energy harvester controlled using a pendulum

    NASA Astrophysics Data System (ADS)

    Iliuk, I.; Balthazar, J. M.; Tusset, A. M.; Piqueira, J. R. C.; Rodrigues de Pontes, B.; Felix, J. L. P.; Bueno, Á. M.

    2013-09-01

    A model of an energy harvester based on a simple portal frame structure is presented. The system is considered a non-ideal system (NIS) due to the interaction between the energy source, a DC motor with limited power supply, and the structure. The nonlinearities present in the piezoelectric material are considered in the piezoelectric coupling mathematical model. The system is a bi-stable Duffing oscillator presenting chaotic behavior. By analyzing the average power variation and bifurcation diagrams, the value of the control variable that optimizes power, or the average value that stabilizes the chaotic system in a periodic orbit, is determined. The sensitivity of the control to parametric errors in the damping and stiffness parameters of the portal frame is determined. The proposed passive control technique uses a simple pendulum tuned to the vibration of the structure to improve the energy harvesting. The results show that with the implementation of the control strategy it is possible to eliminate the need for active or semi-active control, which are usually more complex. The control also provides a way to regulate the energy captured to a desired operating frequency.
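
    For orientation, a generic bi-stable Duffing form is shown below; this is a textbook-style sketch, not the authors' exact equations of motion, which also include the piezoelectric coupling and the motor dynamics.

        \ddot{x} + \mu\,\dot{x} - \alpha\,x + \beta\,x^{3} = F(t), \qquad \alpha,\ \beta > 0

    The negative linear stiffness together with the positive cubic term creates two potential wells (bi-stability); in the non-ideal case, F(t) is not a prescribed harmonic but depends on the state of the limited-power DC motor.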

  2. Automatic indexing in a drug information portal.

    PubMed

    Sakji, Saoussen; Letord, Catherine; Dahamna, Badisse; Kergourlay, Ivan; Pereira, Suzanne; Joubert, Michel; Darmoni, Stéfan

    2009-01-01

    The objective of this work is to create a bilingual (French/English) Drug Information Portal (DIP) in a multi-terminological context, and to emphasize its exploitation through automatic ATC indexing, allowing users to obtain more pertinent information about the substances, organs or systems on which drugs act and about their therapeutic and chemical characteristics. The development of the DIP was based on the CISMeF portal, which catalogues and indexes the most important and quality-controlled sources of institutional health information in French. DIP provides specific functionalities and uses specific drug terminologies such as the ATC classification, which is used to automatically index the DIP resources. DIP is the result of collaboration between the CISMeF team and the VIDAL Company, specialized in drug information. DIP is conceived to facilitate user information retrieval. The ATC automatic indexing provided relevant results in 76% of cases. In a multi-terminological context, and within the drug field, indexing drugs with the appropriate codes and/or terms proved very important for appropriate information storage and retrieval. The main challenge in the coming year is to increase the accuracy of the approach.
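
    Dictionary-based automatic indexing of this kind can be pictured with the sketch below; the two-entry ATC dictionary is a tiny illustrative excerpt, not the portal's actual terminology service.

        # Match known drug terms in a resource title and return ATC codes.
        ATC = {
            "ibuprofen": "M01AE01",
            "paracetamol": "N02BE01",
        }

        def index_resource(text):
            """Return the ATC codes of every dictionary term found in the text."""
            found = {code for term, code in ATC.items() if term in text.lower()}
            return sorted(found)

        print(index_resource("Comparison of Ibuprofen and paracetamol for pain relief"))
        # -> ['M01AE01', 'N02BE01']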

  3. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  4. Superselective intra-arterial hepatic injection of indocyanine green (ICG) for fluorescence image-guided segmental positive staining: experimental proof of the concept.

    PubMed

    Diana, Michele; Liu, Yu-Yin; Pop, Raoul; Kong, Seong-Ho; Legnèr, Andras; Beaujeux, Remy; Pessaux, Patrick; Soler, Luc; Mutter, Didier; Dallemagne, Bernard; Marescaux, Jacques

    2017-03-01

    Intraoperative liver segmentation can be obtained by means of percutaneous intra-portal injection of a fluorophore and illumination with a near-infrared light source. However, the percutaneous approach is challenging in the minimally invasive setting. We aimed to evaluate the feasibility of fluorescence liver segmentation by superselective intra-hepatic arterial injection of indocyanine green (ICG). Eight pigs (mean weight: 26.01 ± 5.21 kg) were involved. Procedures were performed in a hybrid experimental operative suite equipped with the Artis Zeego®, a multiaxis robotic angiography system. A pneumoperitoneum was established and four laparoscopic ports were introduced. The celiac trunk was catheterized, and a microcatheter was advanced into different segmental hepatic artery branches. A near-infrared laparoscope (D-Light P, Karl Storz) was used to detect the fluorescent signal. To assess the correspondence between arterial-based fluorescence demarcation and liver volume, metallic markers were placed along the fluorescent border, followed by 3D CT scanning after injecting intra-arterial radiological contrast (n = 3). To assess the correspondence between arterial and portal supplies, percutaneous intra-portal angiography and intra-arterial angiography were performed simultaneously (n = 1). A bright fluorescence signal enhancing the demarcation of target segments was obtained from 0.1 mg/mL within a matter of seconds. Correspondence between the volume of hepatic segments and arterial territories was confirmed by CT angiography. Higher background fluorescence noise was found after positive staining by intra-portal ICG injection, due to parenchymal accumulation and porto-systemic shunting. Intra-hepatic arterial ICG injection rapidly highlights hepatic target segment borders, with a better signal-to-background ratio as compared to portal vein injection, in the experimental setting.

  5. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    NASA Astrophysics Data System (ADS)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies, focusing on SMEs that develop web applications as Management Information Systems rather than e-commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  6. Building a Cloud Infrastructure for a Virtual Environmental Observatory

    NASA Astrophysics Data System (ADS)

    El-khatib, Y.; Blair, G. S.; Gemmell, A. L.; Gurney, R. J.

    2012-12-01

    Environmental science is often fragmented: data is collected by different organizations using mismatched formats and conventions, and models are misaligned and run in isolation. Cloud computing offers considerable potential for resolving such issues by supporting data from different sources and at various scales, and by integrating models to create more sophisticated and collaborative software services. The Environmental Virtual Observatory pilot (EVOp) project, funded by the UK Natural Environment Research Council, aims to demonstrate how cloud computing principles and technologies can be harnessed to develop more effective solutions to pressing environmental issues. The EVOp infrastructure is a tailored one, constructed from resources in both private clouds (owned and managed by us) and public clouds (leased from third-party providers). All system assets are accessible via a uniform web service interface in order to enable versatile and transparent resource management, and to support fundamental infrastructure properties such as reliability and elasticity. The abstraction that this 'everything as a service' principle brings also supports mashups, i.e. combining different web services (such as models) and data resources of different origins (in situ gauging stations, warehoused data stores, external sources, etc.). We adopt the RESTful style of web services in order to draw a clear line between client and server (i.e. cloud host) and also to keep the server completely stateless. This significantly improves the scalability of the infrastructure and enables easy infrastructure management. For instance, tasks such as load balancing and failure recovery are greatly simplified without the need for techniques such as advance resource reservation or shared block devices. Upon this infrastructure, we developed a web portal composed of a bespoke collection of web-based visualization tools to help bring out relationships or patterns within the data. The portal was designed for use without any programming prerequisites by stakeholders from different backgrounds such as scientists, policy makers, local communities, and the general public. The development of the portal was carried out using an iterative behaviour-driven approach. We have developed six distinct storyboards to determine the requirements of different users. From these, we identified two storyboards to implement during the pilot phase. The first explores flooding at a local catchment scale for farmers and the public. We simulate hydrological interactions to determine where saturated land-surface areas develop. Model parameter values resembling catchment characteristics can be specified either explicitly (for domain specialists) or indirectly using one of several predefined land use scenarios (for less familiar audiences). The second storyboard investigates diffuse agricultural pollution at a national level, with regulators as users. We study the flux of nitrogen and phosphorus from land to rivers and coastal regions at various scales of drainage and reporting units. This is particularly useful for uncovering the impact of existing policy instruments, or the risk from future environmental changes, on the levels of N and P flux.

  7. Three-Dimensional Path Planning Software-Assisted Transjugular Intrahepatic Portosystemic Shunt: A Technical Modification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsauo, Jiaywei, E-mail: 80732059@qq.com; Luo, Xuefeng, E-mail: luobo-913@126.com; Ye, Linchao, E-mail: linchao.ye@siemens.com

    2015-06-15

    Purpose: This study was designed to report our results with a modified technique of three-dimensional (3D) path planning software-assisted transjugular intrahepatic portosystemic shunt (TIPS). Methods: 3D path planning software was recently developed to facilitate TIPS creation by using two carbon dioxide portograms acquired at least 20° apart to generate a 3D path for overlay needle guidance. However, one shortcoming is that puncturing along the overlay would be technically impossible if the angle of the liver access set and the angle of the 3D path are not the same. To solve this problem, a prototype 3D path planning software was fitted with a utility to calculate the angle of the 3D path. Using this, we modified the angle of the liver access set accordingly during the procedure in ten patients. Results: Failure for technical reasons occurred in three patients (unsuccessful wedged hepatic venography in two cases, software technical failure in one case). The procedure was successful in the remaining seven patients, and only one needle pass was required to obtain portal vein access in each case. The course of puncture was comparable to the 3D path in all patients. No procedure-related complication occurred following the procedures. Conclusions: Adjusting the angle of the liver access set to match the angle of the 3D path determined by the software appears to be a favorable modification to the technique of 3D path planning software-assisted TIPS.

  8. Meeting Reference Responsibilities through Library Web Sites.

    ERIC Educational Resources Information Center

    Adams, Michael

    2001-01-01

    Discusses library Web sites and explains some of the benefits when libraries make their sites into reference portals, linking them to other useful Web sites. Topics include print versus Web information sources; limitations of search engines; which Web sites to include, including criteria for inclusion; and organizing the sites. (LRW)

  9. 75 FR 18607 - Mandatory Reporting of Greenhouse Gases: Petroleum and Natural Gas Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-12

    ... any of the following methods: Federal eRulemaking Portal: http://www.regulations.gov . Follow the... the Source Category D. Selection of Reporting Threshold E. Selection of Proposed Monitoring Methods F... rule and the monitoring methods proposed. This section then provides a brief summary of, and rationale...

  10. 78 FR 69097 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... (CRM) system to improve the response to correspondences from individuals seeking information from a... and private sector sources.'' The SalesForce CRM provides a centralized portal to manage frequently... topics. Depending on the topic searched, the CRM queries the database of pre-approved questions and...

  11. Naval Oceanography Portal

    Science.gov Websites

    The U.S. Naval Observatory provides a wide range of astronomical data and products, and serves as the official source of time for the U.S. Department of Defense and a standard of time for the entire United States.

  12. SHIFT: Shared Information Framework and Technology Concept

    DTIC Science & Technology

    2009-02-01

    easy to get an overview of all tasks and of what is happening around each mission. Federated Search allows actors to search multiple data sources...is sent to every individual database in the portal or federated search list. 4.7. SHIFT and the world around it: The idea in SHIFT is to enable

  13. The social disutility of software ownership.

    PubMed

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  14. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  15. Free and open source software for the manipulation of digital images.

    PubMed

    Solomon, Robert W

    2009-06-01

    Free and open source software is a type of software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (GNU Image Manipulation Program) is a free program comparable to Photoshop, and versions are available for Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation, because free and open source software is available for the user to download at will.

  16. ESA's Multi-mission Sentinel-1 Toolbox

    NASA Astrophysics Data System (ADS)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is a new open source software package for scientific learning, research, and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality, including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to science and application users in support of ESA's operational SAR mission, and by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing, and C-S. The SNAP architecture is well suited to Earth Observation processing and analysis due to the following technological innovations: extensibility, portability, a modular rich client platform, generic EO data abstraction, tiled memory management, and a graph processing framework. The project has developed new tools for working with Sentinel-1 data, in particular the new interferometric TOPSAR mode. TOPSAR complex coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the Spectral Diversity [4] method has been developed, as well as special azimuth handling in the coherence, interferogram, and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembly, TOPSAR deburst and sub-swath merging, terrain-flattening radiometric normalization, and visualization for L2 OCN products. The Toolbox also provides several new tools for exploitation of polarimetric data, including speckle filters, decompositions, and classifiers. It will also include tools for large data stacks, supervised and unsupervised classification, improved vector handling, and change detection. Architectural improvements such as smart memory configuration, task queuing, and optimizations for complex data will provide better support and performance for very large products and stacks. In addition, a Cloud Exploitation Platform Extension (CEP) has been developed to add the capability to smoothly utilize a cloud computing platform where EO data repositories and high-performance processing capabilities are available. The extension to the Sentinel Application Platform would ease entry into cloud processing services supporting bulk processing on high-performance clusters. Since December 2016, the COMET-LiCS InSAR portal (http://comet.nerc.ac.uk/COMET-LiCS-portal/) has been live, delivering interferograms and coherence estimates over the entire Alpine-Himalayan belt. The portal already contains tens of thousands of products, which can be browsed in a user-friendly interface and downloaded for free by the general public. For our processing, we use the facilities of the Climate and Environmental Monitoring from Space (CEMS) facility, where we have large storage and processing resources at our disposal and a complete duplicate of the Sentinel-1 archive is maintained. This greatly simplifies the infrastructure we had to develop for automated processing of large areas.
Here we will give an overview of the current status of the processing system, as well as discuss future plans. We will cover the infrastructure we developed to automatically produce interferograms and its challenges, and the processing strategy for time series analysis. We will outline the objectives of the system in the near and distant future, and a roadmap for its continued development. Finally, we will highlight some of the scientific results and projects linked to the system.

  17. Integrating open-source software applications to build molecular dynamics systems.

    PubMed

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and mis2lmp are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-a and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  18. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructures (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed on desktops, laptops, or computing clusters so that they effectively leverage national and regional scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 the project-based learning course 'Applied Cyberinfrastructure Concepts' (ISTA 420/520) at the University of Arizona focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m-2 yr-1) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course's 25 students, with varying levels of computational skill and no prior domain background in the geosciences, collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS, and SAGA together with the Makeflow and Work Queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code. The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over Spring 2015. All of the code, documentation, and workflow description are currently available on GitHub, and a public data portal is in development. We present a case study of how students reacted to the challenge of a real science problem, their interactions with end-users, what went right, and what could be done better in the future.

  19. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis, and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research, and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community through 1) enabling seamless discovery, access, and retrieval of distributed data; 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service; 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services; 4) automating or semi-automating multi-source geospatial data integration; 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities; 6) enabling online geospatial process modeling and execution; and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies that can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  20. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    ERIC Educational Resources Information Center

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  1. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...

  2. Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.

    ERIC Educational Resources Information Center

    Newby, Gregory B.; Greenberg, Jane; Jones, Paul

    2003-01-01

    Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)

  3. Daily Planet Redesign: eZ Publish Web Content Management Implementation

    NASA Technical Reports Server (NTRS)

    Dutra, Jayne E.

    2006-01-01

    This viewgraph presentation reviews the redesign of the Daily Planet newsletter as a content management implementation project. The site is an internal news site that acts as a communication vehicle for a large volume of content. The objectives for the site redesign were: (1) clean visual design; (2) facilitation of publication processes; (3) a more efficient maintenance mode; (4) automated publishing to the internal portal; (5) better navigation through improved site IA; (6) archiving and retrieval functionality; (7) back to basics on fundamental business goals. CM is a process, not a software package.

  4. Scalable web services for the PSIPRED Protein Analysis Workbench.

    PubMed

    Buchan, Daniel W A; Minneci, Federico; Nugent, Tim C O; Bryson, Kevin; Jones, David T

    2013-07-01

    Here, we present the new UCL Bioinformatics Group's PSIPRED Protein Analysis Workbench. The Workbench unites all of our previously available analysis methods into a single web-based framework. The new web portal provides a greatly streamlined user interface with a number of new features to allow users to better explore their results. We offer a number of additional services to enable computationally scalable execution of our prediction methods; these include SOAP and XML-RPC web server access and new HADOOP packages. All software and services are available via the UCL Bioinformatics Group website at http://bioinf.cs.ucl.ac.uk/.
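
    For readers unfamiliar with XML-RPC access of the kind mentioned above, the sketch below shows the general calling pattern in Python. The endpoint URL and method names are hypothetical placeholders, not the Workbench's documented interface; consult the UCL Bioinformatics Group website for the actual services.

      # General XML-RPC calling pattern (endpoint and method names hypothetical).
      import xmlrpc.client

      proxy = xmlrpc.client.ServerProxy("http://bioinf.cs.ucl.ac.uk/psipred/api")  # hypothetical URL
      sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
      job_id = proxy.submit(sequence)       # hypothetical method name
      print(proxy.fetch_result(job_id))     # hypothetical method name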

  5. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    PubMed Central

    Mathew, Cherian; Obst, Matthias; Vicario, Saverio; Haines, Robert; Williams, Alan R.; de Jong, Yde; Goble, Carole

    2014-01-01

    The compilation and cleaning of data needed for analyses and prediction of species distributions is a time consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web-portal which does not require additional software installations by users. PMID:25535486

  6. Data to knowledge: how to get meaning from your result.

    PubMed

    Berman, Helen M; Gabanyi, Margaret J; Groom, Colin R; Johnson, John E; Murshudov, Garib N; Nicholls, Robert A; Reddy, Vijay; Schwede, Torsten; Zimmerman, Matthew D; Westbrook, John; Minor, Wladek

    2015-01-01

    Structural and functional studies require the development of sophisticated 'Big Data' technologies and software to increase the knowledge derived and ensure reproducibility of the data. This paper presents summaries of the Structural Biology Knowledge Base, the VIPERdb Virus Structure Database, evaluation of homology modeling by the Protein Model Portal, the ProSMART tool for conformation-independent structure comparison, the LabDB 'super' laboratory information management system and the Cambridge Structural Database. These techniques and technologies represent important tools for the transformation of crystallographic data into knowledge and information, in an effort to address the problem of non-reproducibility of experimental results.

  7. CHRPR Operations Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Windsor, Bradford T.; Woodring, Mitchell L.; Myjak, Mitchell J.

    2012-08-21

    1.0 Overview. The TSA Systems VM-250AGN portal monitor is a set of two pillars made to detect nuclear material in a vehicle. Each pillar contains two polyvinyl toluene (PVT) plastic gamma ray detectors and four 3He neutron detectors, as well as a power supply and electronics to process the output from these detectors. Pacific Northwest National Laboratory has designed and built a continuous high-resolution PVT readout (CHRPR) for the TSA portal to allow spectral readout from the gamma and neutron detectors. The CHRPR helps differentiate between different types of radioactive material through increased spectroscopic capability and associated developments. The TSA VM-250AGN continually monitors the natural neutron and gamma ray background around the pillars. When the system is installed, the two pillars are placed on either side of a roadway, and a vehicle presence sensor records the passage of cars between them. When radiation measurements exceed a preset alarm threshold, the system alarms to let the user know that radioactive material is present. Time-stamped measurements are continually sent to a computer, where they can be recorded via a Windows terminal or the TSA RAVEN software. For each pillar in the original TSA model, output from each detector is amplified and shaped by a single channel analyzer, the SCA-775. Information from both SCA-775s is passed to the SC-770 in the master pillar, which is the detector interface module and main data processor. It counts electrical pulses and uses program software to output total readings to the computer, as well as trigger any appropriate alarms. The CHRPR allows a parallel approach to recording radiation readings from the TSA system. After installing the CHRPR system, all TSA power and signal connections are unchanged. The CHRPR captures electrical pulses containing detector and occupancy sensor information from the SCA-775 on either side. These pulses are converted to signals with a time width proportional to the amplitude, via voltage-to-pulse-width converters (VPW). The time widths are then digitized by a field programmable gate array (FPGA) and transmitted over Ethernet to a data acquisition computer. The CHRPR records the magnitude of each pulse to a continuous event-mode file for each detector and occupancy sensor. This manual begins with CHRPR installation instructions, followed by a section on CHRPR software. Afterward is a brief overview of how the TSA system works, then an explanation of the CHRPR. This manual is meant as a supplement to the TSA VM-250AGN manual, which can be found at http://tsasystems.com/library/manuals/pm700agn-vm250agn_manual.pdf . That manual is the manufacturer's guide for the installation, programming, and maintenance of the portal system.

  8. The SOOS Data Portal, providing access to Southern Oceans data

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Finney, Kim; Blain, Peter; Taylor, Fiona; Newman, Louise; Meredith, Mike; Schofield, Oscar

    2013-04-01

    The Southern Ocean Observing System (SOOS) is an international initiative to enhance, coordinate and expand the strategic observations of the Southern Ocean that are required to address key scientific and societal challenges. A key component of SOOS will be the creation and maintenance of a Southern Ocean Data Portal to provide improved access to historical and ongoing data (Schofield et al., 2012, Eos, Vol. 93, No. 26, pp 241-243). The scale of this effort will require strong leveraging of existing data centres, new cyberinfrastructure development efforts, and defined data collection, quality control, and archiving procedures across the international community. The task of assembling the SOOS data portal is assigned to the SOOS Data Management Sub-Committee. The information infrastructure chosen for the SOOS data portal is based on the Australian Ocean Data Network (AODN, http://portal.aodn.org.au). The AODN infrastructure is built on open-source tools, and the use of international standards ensures efficiency of data exchange and interoperability between contributing systems. OGC standard web service protocols are used for serving data via the internet: Web Map Service (WMS) for visualisation, Web Feature Service (WFS) for data download, and Catalogue Service for the Web (CSW) for catalogue exchange. The portal offers a number of tools to access and visualize data: a Search link to the metadata catalogue enables search and discovery by simple text search, geographic area, temporal extent, keyword, parameter, organisation, or any combination of these, allowing users to gain access to further information and/or the data for download (searches can also be restricted to items which have data to download, attached map layers, or both); a Map interface supports discovery and display of data, with the ability to change the style and opacity of layers, add additional data layers via OGC Web Map Services, and view animated time-series data streams; and data can be easily accessed and downloaded, including directly from OPeNDAP/THREDDS servers. The SOOS data portal (http://soos.aodn.org.au/soos) aims to make access to Southern Ocean data a simple process. The initial layout classifies data into six themes - Heat and Freshwater; Circulation; Ice-sheets and Sea level; Carbon; Sea-ice; and Ecosystems - with the ability to integrate layers between themes. The portal is in its infancy (the pilot launched in January 2013) with a limited number of datasets available; however, the number of datasets is expected to grow rapidly as the international community becomes fully engaged.
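
    The OGC protocols named above follow fixed request grammars, so a map layer is fetched with nothing more than a parameterized URL. The sketch below assembles a WMS 1.3.0 GetMap request in Python; the parameter names come from the WMS 1.3.0 specification, while the host and layer name are hypothetical.

      # Building a standard WMS 1.3.0 GetMap URL (host and layer hypothetical).
      from urllib.parse import urlencode

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.3.0",
          "REQUEST": "GetMap",
          "LAYERS": "soos:sea_ice_extent",   # hypothetical layer name
          "CRS": "EPSG:4326",
          "BBOX": "-90,-180,-40,180",        # lat/lon axis order in WMS 1.3.0
          "WIDTH": "1024",
          "HEIGHT": "512",
          "FORMAT": "image/png",
      }
      print("http://portal.example.org/wms?" + urlencode(params))  # hypothetical host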

  9. Enhanced metabolite generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chidambaram, Devicharan

    The present invention relates to the enhanced production of metabolites by a process whereby a carbon source is oxidized with a fermentative microbe in a compartment having a portal. An electron acceptor is added to the compartment to assist the microbe in the removal of excess electrons. The electron acceptor accepts electrons from the microbe after oxidation of the carbon source. Other transfers of electrons can take place to enhance the production of the metabolite, such as acids, biofuels or brewed beverages.

  10. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed, and converted into the correct format before time series analysis tools can be executed, and data must be prepared for use in different existing software packages. Several packages, such as TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations, are provided as open-source software and can be executed from the command line, which is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed that provides access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data are then processed and provided as a time series CSV file. Afterwards, the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further usage. As a first use case, we built a complementary web-based system with NASA MODIS products for Germany and parts of Siberia, based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible, so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30, 833-845. Verbesselt, J., R. Hyndman, G. Newnham and D. Culvenor (2010). Detecting trend and seasonal changes in satellite image time series. Remote Sensing of Environment, 114, 106-115. DOI: 10.1016/j.rse.2009.08.014. Forkel, M., N. Carvalhais, J. Verbesselt, M. Mahecha, C. Neigh and M. Reichstein (2013). Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology. Remote Sensing 5, 2113-2144.
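
    The workflow this record describes (extract a per-point time series server-side, serve it as CSV, then hand it to an analysis tool) can be sketched as follows. All function and dataset names here are stand-ins, not the Earth Observation Monitor's actual API.

      # Hypothetical sketch of the extract-then-analyse pipeline described above.
      import csv

      def extract_time_series(lat, lon, dataset):
          # Stand-in for server-side subsetting of a MODIS product.
          return [("2013-01-01", 0.41), ("2013-01-17", 0.44), ("2013-02-02", 0.47)]

      def write_csv(series, path):
          with open(path, "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["date", "NDVI"])
              writer.writerows(series)

      series = extract_time_series(50.9, 11.6, "MOD13Q1")  # hypothetical point in Germany
      write_csv(series, "timeseries.csv")  # the CSV is then passed to a tool such as BFAST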

  11. Preparing a scientific manuscript in Linux: Today's possibilities and limitations

    PubMed Central

    2011-01-01

    Background: An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow them to prepare a submission-ready scientific manuscript without the need for proprietary software. Findings: Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for the preparation of a publication-ready scientific manuscript in a Linux-based operating system, and discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246

  12. Content analysis of cancer blog posts.

    PubMed

    Kim, Sujin

    2009-10-01

    The efficacy of user-defined subject tagging and software-generated subject tagging for describing and organizing cancer blog contents was explored. The Technorati search engine was used to search the blogosphere for cancer blog postings generated during a two-month period. Postings were mined for relevant subject concepts, and blogger-defined tags and Text Analysis Portal for Research (TAPoR) software-defined tags were generated for each message. Descriptive data were collected, and the blogger-defined tags were compared with software-generated tags. Three standard vocabularies (Opinion Templates, Basic Resource, and Medical Subject Headings [MeSH] Resource) were used to assign subject terms to the blogs, with results compared for efficacy in information retrieval. Descriptive data showed that most of the studied cancer blogs (80%) contained fewer than 500 words each. The numbers of blogger-defined tags per posting (M = 4.49 per posting) were significantly smaller than the TAPoR keywords (M = 23.55 per posting). Both blogger-defined subject tags and software-generated subject tags were often overly broad or overly narrow in focus, producing less than effective search results for those seeking to extract information from cancer blogs. Additional exploration into methods for systematically organizing cancer blog postings is necessary if blogs are to become stable and efficacious information resources for cancer patients, friends, families, or providers.

  13. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures, or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
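
    The CSV scripting model described above amounts to replaying rows of timed setpoints. The Python sketch below shows one way such a protocol file could be executed; the column names and the set_setpoint stub are hypothetical, and the project's GitHub repository documents the real format.

      # Minimal sketch of a CSV protocol executor (column names hypothetical).
      import csv
      import time

      def set_setpoint(name, value):
          print(f"setting {name} -> {value}")  # stand-in for a write to the bioreactor

      def run_protocol(path):
          start = time.time()
          with open(path, newline="") as f:
              for row in csv.DictReader(f):  # expected columns: seconds,parameter,value
                  wait = float(row["seconds"]) - (time.time() - start)
                  if wait > 0:
                      time.sleep(wait)
                  set_setpoint(row["parameter"], float(row["value"]))

      # run_protocol("protocol.csv")  # e.g. rows: 0,agitation,200 and 3600,temperature,30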

  14. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures, or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  15. Percutaneous Portal Vein Access and Transhepatic Tract Hemostasis

    PubMed Central

    Saad, Wael E. A.; Madoff, David C.

    2012-01-01

    Percutaneous portal vein interventions require minimally invasive access to the portal venous system. Common approaches to the portal vein include transjugular hepatic vein to portal vein access and direct transhepatic portal vein access. A major concern of the transhepatic route is the risk of postprocedural bleeding, which is increased when patients are anticoagulated or receiving pharmaceutical thrombolytic therapy. Thus percutaneous portal vein access and subsequent closure are important technical parts of percutaneous portal vein procedures. At present, various techniques have been used for either portal access or subsequent transhepatic tract closure and hemostasis. Regardless of the method used, meticulous technique is required to achieve the overall safety and effectiveness of portal venous procedures. This article reviews the various techniques of percutaneous transhepatic portal vein access and the various closure and hemostatic methods used to reduce the risk of postprocedural bleeding. PMID:23729976

  16. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and, if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal's 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available, and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best 90% were available over our testing period.
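
    A link-persistence test of the kind described reduces to issuing a lightweight request per URL and recording whether it still resolves. The Python sketch below mirrors that check in its simplest form; it is not the authors' actual script, and real link checkers typically also follow redirects and retry transient failures.

      # Simple availability check per URL (illustrative, not the study's script).
      import urllib.request

      def is_alive(url, timeout=10):
          try:
              req = urllib.request.Request(url, method="HEAD")
              with urllib.request.urlopen(req, timeout=timeout) as resp:
                  return resp.status < 400
          except Exception:
              return False

      urls = ["https://ascl.net", "http://example.org/defunct-code-page"]
      print({u: is_alive(u) for u in urls})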

  17. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 for the intra-class correlation coefficient was reached with the results of this study. With the obtained results we consider the developed software (RombergLab) to be validated balance assessment software, whose reliability depends on the technical specifications of the force platform used. Objective: to develop and validate posturography software and share its source code in open source terms. Prospective non-randomized validation study: 20 consecutive adults underwent two balance assessment tests; six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation index of the sway area, obtained from the center of pressure variations in both devices across the six conditions, was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94). A Bland and Altman graphic concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.
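
    For reference, one common definition of the sway area is the area of the 95% confidence ellipse fitted to the centre-of-pressure (COP) trace. The paper does not state which definition RombergLab uses, so the Python sketch below is an illustrative formula choice only.

      # Sway area as the 95% confidence ellipse of COP samples (one common definition).
      import math

      def sway_area_95(xs, ys):
          n = len(xs)
          mx, my = sum(xs) / n, sum(ys) / n
          sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
          syy = sum((y - my) ** 2 for y in ys) / (n - 1)
          sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
          det = sxx * syy - sxy ** 2
          return math.pi * 5.991 * math.sqrt(det)  # chi-square(2 dof, 95%) = 5.991

      print(sway_area_95([0.1, 0.3, -0.2, 0.0], [0.05, -0.1, 0.2, 0.15]))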

  18. An Investigation of an Open-Source Software Development Environment in a Software Engineering Graduate Course

    ERIC Educational Resources Information Center

    Ge, Xun; Huang, Kun; Dong, Yifei

    2010-01-01

    A semester-long ethnography study was carried out to investigate project-based learning in a graduate software engineering course through the implementation of an Open-Source Software Development (OSSD) learning environment, which featured authentic projects, learning community, cognitive apprenticeship, and technology affordances. The study…

  19. Patient portal doldrums: does an exam room promotional video during an office visit increase patient portal registrations and portal use?

    PubMed

    North, Frederick; Hanna, Barbara K; Crane, Sarah J; Smith, Steven A; Tulledge-Scheitel, Sidna M; Stroebel, Robert J

    2011-12-01

    The patient portal is a web service which allows patients to view their electronic health record, communicate online with their care teams, and manage healthcare appointments and medications. Despite advantages of the patient portal, registrations for portal use have often been slow. Using a secure video system on our existing exam room electronic health record displays during regular office visits, the authors showed patients a video which promoted use of the patient portal. The authors compared portal registrations and portal use following the video to providing a paper instruction sheet and to a control (no additional portal promotion). From the 12,050 office appointments examined, portal registrations within 45 days of the appointment were 11.7%, 7.1%, and 2.5% for video, paper instructions, and control respectively (p<0.0001). Within 6 months following the interventions, 3.5% in the video cohort, 1.2% in the paper, and 0.75% of the control patients demonstrated portal use by initiating portal messages to their providers (p<0.0001).

  20. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    PubMed

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  1. The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software

    PubMed Central

    Ackerman, Michael J.; Yoo, Terry S.

    2003-01-01

    From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278

  2. Photo-Modeling and Cloud Computing. Applications in the Survey of Late Gothic Architectural Elements

    NASA Astrophysics Data System (ADS)

    Casu, P.; Pisu, C.

    2013-02-01

    This work proposes the application of the latest methods of photo-modeling to the study of Gothic architecture in Sardinia. The aim is to assess the versatility and ease of use of such documentation tools for studying architecture and its ornamental details. The paper illustrates a procedure of integrated survey and restitution, with the purpose of obtaining an accurate 3D model of several Gothic portals. We combined the contact survey and the photographic survey oriented to photo-modelling. The software used is 123D Catch by Autodesk, a freely available image-based modelling (IBM) system. It is a web-based application that requires a few simple steps to produce a mesh from a set of unoriented photos. We tested the application on four portals, working at different scales of detail: first the whole portal, and then the different architectural elements that compose it. We were able to model all the elements and to quickly extrapolate simple sections, in order to compare the moldings and highlight similarities and differences. Working at different sites and at different scales of detail allowed us to test the procedure under different conditions of exposure, sunshine, accessibility, surface degradation, and material type, and with different equipment and operators, showing whether the final result could be affected by these factors. We tested a procedure, articulated in a few repeatable steps, that can be applied, with the right corrections and adaptations, to similar cases and/or larger or smaller elements.

  3. A community effort to construct a gravity database for the United States and an associated Web portal

    USGS Publications Warehouse

    Keller, Gordon R.; Hildenbrand, T.G.; Kucks, R.; Webring, M.; Briesacher, A.; Rujawitz, K.; Hittleman, A.M.; Roman, D.R.; Winester, D.; Aldouri, R.; Seeley, J.; Rasillo, J.; Torres, R.; Hinze, W. J.; Gates, A.; Kreinovich, V.; Salayandia, L.

    2006-01-01

    Potential field data (gravity and magnetic measurements) are both useful and cost-effective tools for many geologic investigations, and significant amounts of these data are traditionally in the public domain. A new magnetic database for North America was released in 2002; as a result, a cooperative effort between government agencies, industry, and universities to compile an upgraded digital gravity anomaly database, grid, and map for the conterminous United States was initiated and is the subject of this paper. This database is being crafted into a data system that is accessible through a Web portal. The data system features the database, software tools, and convenient access. The Web portal will enhance the quality and quantity of data contributed to the gravity database, which will be a shared community resource. The system's totally digital nature ensures that it will be flexible, so that it can grow and evolve as new data, processing procedures, and modeling and visualization tools become available. Another goal of this Web-based data system is to facilitate the efforts of researchers and students who wish to collect data from regions currently not represented adequately in the database. The primary goal of upgrading the United States gravity database and this data system is to provide more reliable data that support societal and scientific investigations of national importance. An additional motivation is the international intent to compile an enhanced North American gravity database, which is critical to understanding regional geologic features, the tectonic evolution of the continent, and other issues that cross national boundaries.

  4. 29 CFR 785.34 - Effect of section 4 of the Portal-to-Portal Act.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 3 2014-07-01 2014-07-01 false Effect of section 4 of the Portal-to-Portal Act. 785.34... of Principles Traveltime § 785.34 Effect of section 4 of the Portal-to-Portal Act. The Portal Act... employee and activities that are incidental to the use of such vehicle for commuting are not considered...

  5. 29 CFR 785.34 - Effect of section 4 of the Portal-to-Portal Act.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false Effect of section 4 of the Portal-to-Portal Act. 785.34... of Principles Traveltime § 785.34 Effect of section 4 of the Portal-to-Portal Act. The Portal Act... employee and activities that are incidental to the use of such vehicle for commuting are not considered...

  6. 29 CFR 785.34 - Effect of section 4 of the Portal-to-Portal Act.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 3 2013-07-01 2013-07-01 false Effect of section 4 of the Portal-to-Portal Act. 785.34... of Principles Traveltime § 785.34 Effect of section 4 of the Portal-to-Portal Act. The Portal Act... employee and activities that are incidental to the use of such vehicle for commuting are not considered...

  7. 29 CFR 785.34 - Effect of section 4 of the Portal-to-Portal Act.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 3 2012-07-01 2012-07-01 false Effect of section 4 of the Portal-to-Portal Act. 785.34... of Principles Traveltime § 785.34 Effect of section 4 of the Portal-to-Portal Act. The Portal Act... employee and activities that are incidental to the use of such vehicle for commuting are not considered...

  8. The case for open-source software in drug discovery.

    PubMed

    DeLano, Warren L

    2005-02-01

    Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.

  9. Extrahepatic portal vein obstruction and portal vein thrombosis in special situations: Need for a new classification.

    PubMed

    Wani, Zeeshan A; Bhat, Riyaz A; Bhadoria, Ajeet S; Maiwall, Rakhi

    2015-01-01

    Extrahepatic portal vein obstruction is a vascular disorder of the liver that results in obstruction and cavernomatous transformation of the portal vein, with or without the involvement of the intrahepatic portal vein, splenic vein, or superior mesenteric vein. Portal vein obstruction due to chronic liver disease, neoplasm, or surgery is a separate entity and is not the same as extrahepatic portal vein obstruction. Patients with extrahepatic portal vein obstruction are generally young and belong mostly to Asian countries. From a management point of view, it is therefore very important to define portal vein thrombosis as acute or chronic. Portal vein thrombosis in certain situations, such as liver transplantation and the postsurgical/post-transplant period, is an evolving area and needs extensive research. There is a need for a new classification that covers all aspects of the entity. In the current review, the most recent literature on extrahepatic portal vein obstruction is reviewed and summarized.

  10. Extrahepatic Portal Vein Obstruction and Portal Vein Thrombosis in Special Situations: Need for a New Classification

    PubMed Central

    Wani, Zeeshan A.; Bhat, Riyaz A.; Bhadoria, Ajeet S.; Maiwall, Rakhi

    2015-01-01

    Extrahepatic portal vein obstruction is a vascular disorder of the liver that results in obstruction and cavernomatous transformation of the portal vein, with or without the involvement of the intrahepatic portal vein, splenic vein, or superior mesenteric vein. Portal vein obstruction due to chronic liver disease, neoplasm, or surgery is a separate entity and is not the same as extrahepatic portal vein obstruction. Patients with extrahepatic portal vein obstruction are generally young and belong mostly to Asian countries. From a management point of view, it is therefore very important to define portal vein thrombosis as acute or chronic. Portal vein thrombosis in certain situations, such as liver transplantation and the postsurgical/post-transplant period, is an evolving area and needs extensive research. There is a need for a new classification that covers all aspects of the entity. In the current review, the most recent literature on extrahepatic portal vein obstruction is reviewed and summarized. PMID:26021771

  11. Case report: patient portal versus telephone recruitment for a surgical research study.

    PubMed

    Baucom, R B; Ousley, J; Poulose, B K; Rosenbloom, S T; Jackson, G P

    2014-01-01

    Patient portal adoption has rapidly increased over the last decade. Most patient portal research has been done in primary care or medical specialties, and few studies have examined their use in surgical patients or for recruiting research subjects. No known studies have compared portal messaging with other approaches of recruitment. This case report describes our experience with patient portal versus telephone recruitment for a study involving long-term follow up of surgical patients. Participants were recruited for a study of recurrence after ventral hernia repair through telephone calls and patient portal messaging based on registration status with the portal. Potential subjects who did not have a portal account or whose portal messages were returned after 5 days were called. The proportion of participants enrolled with each method was determined and demographics of eligible patients, portal users, and participants were compared. 1359 patients were eligible for the hernia study, and enrollment was 35% (n=465). Most participants were recruited by telephone (84%, n=391); 16% (n=74) were recruited through portal messaging. Forty-four percent of eligible participants had a registered portal account, and 14% of users responded to the recruitment message. Portal users were younger than non-users (55 vs. 58 years, p<0.001); participants recruited through the portal versus telephone were also younger (54 vs. 59 years, p=0.001). Differences in the sex and racial distributions between users and non-users and between portal and telephone recruits were not significant. Portal versus telephone recruitment for a surgical research study demonstrated modest portal recruitment rates and similar demographics between recruitment methods. Published studies of portal-only recruitment in primary care or medical-specialty patient populations have demonstrated higher enrollment rates, but this case study demonstrates that portal recruitment for research studies in the surgical population is feasible, and it offers convenience to patients and researchers.

  12. NREL: Renewable Resource Data Center - Geothermal Resource Related Links

    Science.gov Websites

    Comprehensive geothermal resource information is also available from the following sources: the U.S. Department of Energy Geothermal Technologies Office; the National Geothermal Data System, a portal to geothermal data; and the Southern Methodist University Geothermal Laboratory.

  13. TOXNET and Beyond - Using the NLM's Environmental Health and Toxicology Portal - February

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templin-Branner, W.

    2010-02-24

    The purpose of this training is to familiarize participants with reliable online environmental health and toxicology information, from the National Library of Medicine and other reliable sources. Skills and knowledge acquired in this training class will enable participants to access, utilize, and refer others to environmental health and toxicology information.

  14. Challenges of Implementing Free and Open Source Software (FOSS): Evidence from the Indian Educational Setting

    ERIC Educational Resources Information Center

    Thankachan, Briju; Moore, David Richard

    2017-01-01

    The use of Free and Open Source Software (FOSS), a subset of Information and Communication Technology (ICT), can reduce the cost of purchasing software. Despite the savings in the initial purchase price, however, deploying software entails a total cost that goes beyond that price. Total cost is a silent issue of FOSS and can only…

  15. Footing the bill: patient portals, part I.

    PubMed

    Lawrence, Daphne

    2009-05-01

    Tie financial portal strategy into overall portal strategy. Savings from patient portals for finance come in the areas of call center volumes, bill pay, scheduling, and increased volume. Financial functions on the patient portal should be balanced with clinical functions. Improve the revenue cycle process before going to a portal.

  16. Requirements Engineering in Building Climate Science Software

    NASA Astrophysics Data System (ADS)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks both to build a software system according to product requirements and to conduct its work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure affects whether the software team or the users have control of, and responsibility for, making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users but gives them less control over it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.

  17. Open Source Software in Medium Size Organizations: Key Factors for Adoption

    ERIC Educational Resources Information Center

    Solomon, Jerry T.

    2010-01-01

    For-profit organizations are constantly evaluating new technologies to gain competitive advantage. One such technology, application software, has changed significantly over the past 25 years with the introduction of Open Source Software (OSS). In contrast to commercial software that is developed by private companies and sold to organizations, OSS…

  18. Space Environment Forecasting with Neutron Monitors: Establishing a novel service for the ESA SSA Program

    NASA Astrophysics Data System (ADS)

    Papaioannou, Athanasios; Mavromichalaki, Helen; Souvatzoglou, George; Paschalis, Pavlos; Sarlanis, Christos; Dimitroulakos, John; Gerontidou, Maria

    2013-04-01

    High-energy particles released from the Sun during a solar flare or a very energetic coronal mass ejection result in a significant intensity increase in neutron monitor measurements, known as Ground Level Enhancements (GLEs). Due to their space weather impact (i.e., risks and failures in communication and navigation systems, spacecraft electronics and operations, space power systems, manned space missions, and commercial aircraft operations), it is crucial to establish a real-time operational system that can issue reliable and timely GLE Alerts. Currently, the Cosmic Ray group of the National and Kapodistrian University of Athens is working towards the establishment of a Neutron Monitor Service that will be made available via the Space Weather Portal operated by the European Space Agency (ESA) under the Space Situational Awareness (SSA) Program. To this end, a web interface providing data from multiple Neutron Monitor stations, as well as an upgraded GLE Alert, will be provided. Both services are now under testing and validation, and they will probably enter an operational phase next year. The core of this Neutron Monitor Service is the GLE Alert software; therefore, the main goal of this research effort is to upgrade the existing GLE Alert software, to minimize the probability of a false alarm, and to enhance the usability of the corresponding results. The ESA Neutron Monitor Service builds upon the infrastructure made available with the implementation of the High-Resolution Neutron Monitor Database (NMDB). In this work, the structure of the Neutron Monitor Service for the ESA SSA Program and the impact of the novel GLE Alert Service that will be made available to future users via the ESA SSA web portal are presented and discussed.

  19. SU-F-T-486: A Simple Approach to Performing Light Versus Radiation Field Coincidence Quality Assurance Using An Electronic Portal Imaging Device (EPID)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herchko, S; Ding, G

    2016-06-15

    Purpose: To develop an accurate, straightforward, and user-independent method for performing light versus radiation field coincidence quality assurance utilizing EPID images, a simple phantom made of readily accessible materials, and a free software program. Methods: A simple phantom consisting of a blocking tray, graph paper, and high-density wire was constructed. The phantom was used to accurately set the size of a desired light field and imaged on the electronic portal imaging device (EPID). A macro written for use in ImageJ, a free image processing program, was then used to determine the radiation field size, utilizing the high-density wires on the phantom for a pixel-to-distance calibration. The macro also performs an analysis on the measured radiation field using the tolerances recommended in the AAPM Task Group 142 report. To verify the accuracy of this method, radiochromic film was used to qualitatively demonstrate agreement between the film and EPID results, and an additional ImageJ macro was used to quantitatively compare the radiation field sizes measured with the EPID and film images. Results: The results of this technique were benchmarked against film measurements, which have been the gold standard for testing light versus radiation field coincidence. The agreement between this method and film measurements was within 0.5 mm. Conclusion: Because of the operator dependency associated with tracing light fields and measuring radiation fields by hand when using film, this method allows for a more accurate comparison between the light and radiation fields with minimal operator dependency. Removing the need for radiographic or radiochromic film also eliminates a recurring cost and increases procedural efficiency.
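
    The core of the field-size measurement described above can be sketched in a few lines. The following Python fragment is a minimal illustration, not the authors' ImageJ macro: the profile shape, the 50% intensity edge convention, and the wire positions and spacing are assumptions made for the example. It calibrates pixel size from two wires of known separation, measures the field width at half maximum, and checks it against a 2 mm tolerance.

        import numpy as np

        def field_width_mm(profile, wire_px, wire_mm):
            """Measure field width from a 1-D EPID intensity profile.

            profile : intensity values along one axis of the EPID image
            wire_px : pixel positions of two high-density wires
            wire_mm : known physical separation of those wires (mm)
            """
            mm_per_px = wire_mm / abs(wire_px[1] - wire_px[0])   # pixel-to-distance calibration
            half_max = 0.5 * (profile.max() + profile.min())     # 50% intensity marks the field edge
            inside = np.where(profile >= half_max)[0]            # pixels inside the radiation field
            return (inside[-1] - inside[0]) * mm_per_px

        # Toy example: a nominal 100 mm field sampled on a synthetic profile.
        x = np.arange(512)
        profile = np.where(abs(x - 256) < 130, 1000.0, 40.0)     # flat field with sharp edges
        width = field_width_mm(profile, wire_px=(156, 356), wire_mm=77.0)
        print("width %.1f mm, tolerance pass: %s" % (width, abs(width - 100.0) <= 2.0))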

  20. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  1. Portal Vein Stenting for Portal Biliopathy with Jaundice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyun, Dongho, E-mail: mesentery@naver.com; Park, Kwang Bo, E-mail: kbjh.park@samsung.com; Lim, Seong Joo

    2016-04-15

    Portal biliopathy refers to obstruction of the bile duct by dilated peri- or para-ductal collateral channels following occlusion of the main portal vein from various causes. Surgical shunt operations or endoscopic treatment have been reported. Herein, we report a case of portal biliopathy that was successfully treated by interventional portal vein recanalization.

  2. Evolving the NCSA CyberCollaboratory for Distributed Environmental Observatory Networks

    NASA Astrophysics Data System (ADS)

    Myers, J.; Liu, Y.; Minsker, B.; Futrelle, J.; Downey, S.; Kim, I.; Rantanen, E.

    2007-12-01

    Since 2004, NCSA's CyberCollaboratory, which is built on top of the open source Liferay portal framework, has been evolving as part of NCSA's efforts to build national cyberinfrastructure to support collaborative research in environmental engineering and the hydrological sciences, allowing users to efficiently share content (sensors, data, models, documents, etc.) in a context-sensitive way (e.g., providing different tools and data based on group affiliation and geospatial context). During this period, we provided the CyberCollaboratory to users in the CLEANER (Collaborative Large-scale Engineering Analysis Network for Environmental Research, now WATer and Environmental Research Systems (WATERS) network) Project Office and several CLEANER/WATERS testbed projects. Preliminary statistics show that one in four of the more than 400 registered users contributed content (such as messages, documents, and wikis), with many others reading or accessing that content. During the course of this use, and in evaluations by others including representatives from the CUAHSI (Consortium of Universities for the Advancement of Hydrologic Science) community, we have received significant feedback on issues of usability and suitability for the various communities involved in environmental observatories. Much of this feedback applies to collaborative portals in general, and some reflects a comparison of portals with newer Web 2.0-style social-networking sites. For example, users working in multiple groups found it difficult to get an overview of all of their activities and found differences in group layouts confusing. Users also found the standard account creation and group management processes cumbersome compared to inviting people to be friends on social sites, and wanted a better sense of presence and social networks within the portal. The fragmentation of group documents between local stores, the portal document repository, and email, along with issues of "lost updates," was another significant concern. This poster reviews the usability feedback, identifies key issues that hinder traditional portal-based collaboration environments, and presents design changes made to the CyberCollaboratory to address them. Feedback on the effectiveness of the new design from hydrologists and environmental researchers and preliminary results from a formal usability study will also be presented.

  3. Radiation portal monitor system and method

    DOEpatents

    Morris, Christopher [Los Alamos, NM; Borozdin, Konstantin N [Los Alamos, NM; Green, J Andrew [Los Alamos, NM; Hogan, Gary E [Los Alamos, NM; Makela, Mark F [Los Alamos, NM; Priedhorsky, William C [Los Alamos, NM; Saunders, Alexander [Los Alamos, NM; Schultz, Larry J [Los Alamos, NM; Sossong, Michael J [Los Alamos, NM

    2009-12-15

    A portal monitoring system has a cosmic ray charged particle tracker with a plurality of drift cells. The drift cells, which can be for example aluminum drift tubes, can be arranged at least above and below a volume to be scanned to thereby track incoming and outgoing charged particles, such as cosmic ray muons, whilst also detecting gamma rays. The system can selectively detect devices or materials, such as iron, lead, gold and/or tungsten, occupying the volume from multiple scattering of the charged particles passing through the volume and can also detect any radioactive sources occupying the volume from gamma rays emitted therefrom. If necessary, the drift tubes can be sealed to eliminate the need for a gas handling system. The system can be employed to inspect occupied vehicles at border crossings for nuclear threat objects.

  4. Methionine metabolism in piglets fed DL-methionine or its hydroxy analogue was affected by distribution of enzymes oxidizing these sources to keto-methionine.

    PubMed

    Fang, Zhengfeng; Luo, Hefeng; Wei, Hongkui; Huang, Feiruo; Qi, Zhili; Jiang, Siwen; Peng, Jian

    2010-02-10

    Previous evidence shows that the extensive catabolism of dietary essential amino acids (AA) by the intestine results in decreased availability of these AA for protein synthesis in extraintestinal tissues. This raises the possibility that extraintestinal availability of AA may be improved by supplying the animal with an AA source more of which can bypass the intestine. To test this hypothesis, six barrows (35-day-old, 8.6 +/- 1.4 kg), implanted with arterial, portal, and mesenteric catheters, were fed a DL-methionine (DL-MET) or DL-2-hydroxy-4-methylthiobutyrate (DL-HMTB) diet once hourly and infused intramesenterically with 1% p-amino hippurate. Although the directly available L-MET in the DL-MET diet was about 1.2-fold that in the DL-HMTB diet, the net portal appearance of L-MET was not different between the two diets. Compared with the low mRNA abundance and low activity of D-2-hydroxy acid dehydrogenase (D-HADH) and L-2-hydroxy acid oxidase (L-HAOX) in the intestine, the high mRNA abundance and high activity of D-AA oxidase (D-AAOX) indicated that the intestine had a relatively higher capacity for utilizing D-MET than DL-HMTB for L-MET synthesis and its subsequent metabolism. However, in contrast to the much lower D-AAOX activity (nmol/g tissue) in the stomach than in the liver and kidney, both D-HADH and L-HAOX activities in the stomach were comparable with those in the liver and/or kidney, indicating a substantial capacity of the stomach to convert DL-HMTB to L-MET. Collectively, the difference in distribution of activity and mRNA abundance of D-AAOX, D-HADH, and L-HAOX in the piglets may offer a biological basis for the similar portal appearance of L-MET between the DL-MET and DL-HMTB diets, and thus may provide new important insights into the nutritional efficiency of different L-MET sources.

  5. Increasing Health Portal Utilization in Cardiac Ambulatory Patients: A Pilot Project.

    PubMed

    Shaw, Carmen L; Casterline, Gayle L; Taylor, Dennis; Fogle, Maureen; Granger, Bradi

    2017-10-01

    Increasing health portal participation actively engages patients in their care and improves outcomes. The primary aim of this project was to increase patient health portal utilization. Nurses used a tablet-based demo to teach patients how to navigate the health portal. Assigning health videos to the portal was a tactic used to increase utilization. Each patient participant was surveyed about health portal utilization at the initial nurse navigator appointment, on the day of the procedure, and 30 days after discharge. Seventy-three percent (n = 14) of the 19 selected patients received the intervention; 36% (n = 4) of patients reported using a health portal feature; the meaningful use metric increased from 12% before the intervention to 16% after it; and 16% and 18% of patients viewed assigned videos in their health portal prior to the procedure and after hospital discharge, respectively. Patients need a reason to access their health portal. Education alone is not enough to motivate patient portal use. Further research is needed to specify what tactics are required to motivate patients to use their health portals.

  6. Portal hypertension: a review of portosystemic collateral pathways and endovascular interventions.

    PubMed

    Pillai, A K; Andring, B; Patel, A; Trimmer, C; Kalva, S P

    2015-10-01

    The portal vein is formed at the confluence of the splenic and superior mesenteric vein behind the head of the pancreas. Normal blood pressure within the portal system varies between 5 and 10 mmHg. Portal hypertension is defined when the gradient between the portal and systemic venous blood pressure exceeds 5 mmHg. The most common cause of portal hypertension is cirrhosis. In cirrhosis, portal hypertension develops due to extensive fibrosis within the liver parenchyma causing increased vascular resistance. In addition, the inability of the liver to metabolise certain vasodilators leads to hyperdynamic splanchnic circulation resulting in increased portal blood flow. Decompression of the portal pressure is achieved by formation of portosystemic collaterals. In this review, we will discuss the pathophysiology, anatomy, and imaging findings of spontaneous portosystemic collaterals and clinical manifestations of portal hypertension with emphasis on the role of interventional radiology in the management of complications related to portal hypertension. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  7. The BAOBAB data portal and DACCIWA database

    NASA Astrophysics Data System (ADS)

    Brissebrat, Guillaume; Belmahfoud, Nizar; Cloché, Sophie; Ferré, Hélène; Fleury, Laurence; Mière, Arnaud; Ramage, Karim

    2017-04-01

    In the framework of the African Monsoon Multidisciplinary Analyses (AMMA) programme, several tools have been developed to boost data and information exchange between researchers from different disciplines: a user-friendly data management and dissemination system, quasi-real-time display websites, and a collaborative scientific paper exchange tool. The information system is enriched by past and ongoing projects (IMPETUS, FENNEC, ESCAPE, QweCI, ACASIS, DACCIWA...) addressing meteorology, atmospheric chemistry, hydrology, extreme events, health, and the adaptation of human societies. It is becoming a reference information system on environmental issues in West Africa: BAOBAB (Base Afrique de l'Ouest beyond AMMA Base). The projects include airborne, ground-based and ocean measurements, social science surveys, satellite data use, modelling studies and value-added product development. The BAOBAB data portal therefore provides access to a large amount and variety of data: - 250 local observation datasets collected by operational networks since 1850, long-term monitoring research networks and intensive scientific campaigns; - 1350 outputs of a socio-economic questionnaire; - 60 operational satellite products and several research products; - 10 output sets of operational meteorological and ocean models and 15 of research simulations. Data documentation complies with international metadata standards, and data are delivered in standard formats. The data request interface takes full advantage of the database's relational structure and enables users to build multicriteria requests (period, area, property…). The BAOBAB data portal counts about 900 registered users and 50 data requests every month. The databases and data portal have been developed and are operated jointly by SEDOO and ESPRI in France: http://baobab.sedoo.fr. The ongoing DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions over West Africa) project uses the BAOBAB portal to distribute its data: http://baobab.sedoo.fr/DACCIWA/. 30 datasets are already available: - Local observations from the DACCIWA supersites at Savé (Benin), Kumasi (Ghana), and Ile-Ife (Nigeria); - Radiosonde data from stations in Benin, Cameroon, Côte d'Ivoire, Ghana and Nigeria. During the June-July 2016 DACCIWA campaign, a day-to-day chart display website was designed and operated to monitor meteorological and environmental information and to meet the observational teams' needs: - Quicklooks from DACCIWA supersite instruments; - Atmospheric and chemical model outputs; - Satellite products (Eumetsat, TERRA-MODIS...). This website (http://dacciwa.sedoo.fr) now constitutes a synthetic view of the campaign and a preliminary investigation tool for researchers. Similar websites remain online for past campaigns: AMMA 2006 (http://aoc.amma-international.org) and FENNEC 2011 (http://fenoc.sedoo.fr). Since 2011, the same software has enabled a group of French and Senegalese researchers and forecasters to exchange, in near real time, physical indices and diagnostics calculated from operational numerical weather forecasts, satellite products and in situ operational observations throughout the monsoon season, in order to better assess, understand and anticipate the monsoon's intraseasonal variability (http://misva.sedoo.fr). Another similar website is dedicated to heat wave diagnosis and monitoring (http://acasis.sedoo.fr); it aims to become an operational component of national early warning systems.
Every scientist is invited to make use of the BAOBAB online tools and data. Scientists or project leaders who have data management needs for existing or future datasets concerning West Africa are welcome to use the BAOBAB framework and to contact baobab@sedoo.fr.

  8. Developing an Open Source Option for NASA Software

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Parks, John W. (Technical Monitor)

    2003-01-01

    We present arguments in favor of developing an Open Source option for NASA software; in particular, we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one - the Mozilla license - for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.

  9. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  10. The Role of Organizational Sub-Cultures in Higher Education Adoption of Open Source Software (OSS) for Teaching/Learning

    ERIC Educational Resources Information Center

    Williams van Rooij, Shahron

    2010-01-01

    This paper contrasts the arguments offered in the literature advocating the adoption of open source software (OSS)--software delivered with its source code--for teaching and learning applications, with the reality of limited enterprise-wide deployment of those applications in U.S. higher education. Drawing on the fields of organizational…

  11. Looking toward the Future: A Case Study of Open Source Software in the Humanities

    ERIC Educational Resources Information Center

    Quamen, Harvey

    2006-01-01

    In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…

  12. Free/Libre Open Source Software Implementation in Schools: Evidence from the Field and Implications for the Future

    ERIC Educational Resources Information Center

    Lin, Yu-Wei; Zini, Enrico

    2008-01-01

    This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to support the needs of local users better. This also allows users to participate more proactively in the development…

  13. Shaping Software Engineering Curricula Using Open Source Communities: A Case Study

    ERIC Educational Resources Information Center

    Bowring, James; Burke, Quinn

    2016-01-01

    This paper documents four years of a novel approach to teaching a two-course sequence in software engineering as part of the ABET-accredited computer science curriculum at the College of Charleston. This approach is team-based and centers on learning software engineering in the context of open source software projects. In the first course, teams…

  14. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  15. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  16. Magnetic Resonance Imaging of a Liver Hydatid Cyst Invading the Portal Vein and Causing Portal Cavernomatosis.

    PubMed

    Herek, Duygu; Sungurtekin, Ugur

    2015-01-01

    Hepatic hydatid cysts rarely invade portal veins causing portal cavernomatosis as a secondary complication. We report the case of a patient with direct invasion of the right portal vein by hydatid cysts causing portal cavernomatosis diagnosed via magnetic resonance imaging (MRI). The presented case highlights the useful application of MRI with T2-weighted images and gadolinium-enhanced T1-weighted images in the diagnosis of hepatic hydatid lesions presenting with a rare complication of portal cavernomatosis.

  17. Gramene 2013: comparative plant genomics resources.

    PubMed

    Monaco, Marcela K; Stein, Joshua; Naithani, Sushma; Wei, Sharon; Dharmawardhana, Palitha; Kumari, Sunita; Amarasinghe, Vindhya; Youens-Clark, Ken; Thomason, James; Preece, Justin; Pasternak, Shiran; Olson, Andrew; Jiao, Yinping; Lu, Zhenyuan; Bolser, Dan; Kerhornou, Arnaud; Staines, Dan; Walts, Brandon; Wu, Guanming; D'Eustachio, Peter; Haw, Robin; Croft, David; Kersey, Paul J; Stein, Lincoln; Jaiswal, Pankaj; Ware, Doreen

    2014-01-01

    Gramene (http://www.gramene.org) is a curated online resource for comparative functional genomics in crops and model plant species, currently hosting 27 fully and 10 partially sequenced reference genomes in its build number 38. Its strength derives from the application of a phylogenetic framework for genome comparison and the use of ontologies to integrate structural and functional annotation data. Whole-genome alignments complemented by phylogenetic gene family trees help infer syntenic and orthologous relationships. Genetic variation data, sequences and genome mappings available for 10 species, including Arabidopsis, rice and maize, help infer putative variant effects on genes and transcripts. The pathways section also hosts 10 species-specific metabolic pathways databases developed in-house or by our collaborators using Pathway Tools software, which facilitates searches for pathway, reaction and metabolite annotations, and allows analyses of user-defined expression datasets. Recently, we released a Plant Reactome portal featuring 133 curated rice pathways. This portal will be expanded for Arabidopsis, maize and other plant species. We continue to provide genetic and QTL maps and marker datasets developed by crop researchers. The project provides a unique community platform to support scientific research in plant genomics including studies in evolution, genetics, plant breeding, molecular biology, biochemistry and systems biology.

  18. Management and Analysis of Radiation Portal Monitor Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, Nathan C; Alcala, Scott; Crye, Jason Michael

    2014-01-01

    Oak Ridge National Laboratory (ORNL) receives, archives, and analyzes data from radiation portal monitors (RPMs). Over time the amount of data submitted for analysis has grown significantly, and in fiscal year 2013, ORNL received 545 gigabytes of data representing more than 230,000 RPM operating days. This data comes from more than 900 RPMs. ORNL extracts this data into a relational database, which is accessed through a custom software solution called the Desktop Analysis and Reporting Tool (DART). DART is used by data analysts to complete a monthly lane-by-lane review of RPM status. Recently ORNL has begun to extend its data analysis based on program-wide data processing in addition to the lane-by-lane review. Program-wide data processing includes the use of classification algorithms designed to identify RPMs with specific known issues and clustering algorithms intended to identify as-yet-unknown issues or new methods and measures for use in future classification algorithms. This paper provides an overview of the architecture used in the management of this data, performance aspects of the system, and additional requirements and methods used in moving toward an increased program-wide analysis paradigm.

  19. Atomic and Molecular Databases, VAMDC (Virtual Atomic and Molecular Data Centre)

    NASA Astrophysics Data System (ADS)

    Dubernet, Marie-Lise; Zwölf, Carlo Maria; Moreau, Nicolas; Awa Ba, Yaya; VAMDC Consortium

    2015-08-01

    The "Virtual Atomic and Molecular Data Centre Consortium",(VAMDC Consortium, http://www.vamdc.eu) is a Consortium bound by an Memorandum of Understanding aiming at ensuring the sustainability of the VAMDC e-infrastructure. The current VAMDC e-infrastructure inter-connects about 30 atomic and molecular databases with the number of connected databases increasing every year: some databases are well-known databases such as CDMS, JPL, HITRAN, VALD,.., other databases have been created since the start of VAMDC. About 90% of our databases are used for astrophysical applications. The data can be queried, retrieved, visualized in a single format from a general portal (http://portal.vamdc.eu) and VAMDC is also developing standalone tools in order to retrieve and handle the data. VAMDC provides software and support in order to include databases within the VAMDC e-infrastructure. One current feature of VAMDC is the constrained environnement of description of data that ensures a higher quality for distribution of data; a future feature is the link of VAMDC with evaluation/validation groups. The talk will present the VAMDC Consortium and the VAMDC e infrastructure with its underlying technology, its services, its science use cases and its etension towards other communities than the academic research community.

  20. The Anatomy of a Grid portal

    NASA Astrophysics Data System (ADS)

    Licari, Daniele; Calzolari, Federico

    2011-12-01

    In this paper we introduce a new approach to Grid portals, with reference to our implementation. L-GRID is a light portal for accessing the EGEE/EGI Grid infrastructure via the Web, allowing users to submit their jobs from a common Web browser in a few minutes, without any knowledge of the Grid infrastructure. It provides control over the complete lifecycle of a Grid job, from submission and status monitoring to output retrieval. The system, implemented as a client-server architecture, is based on the Globus Grid middleware. The client-side application is based on a Java applet; the server relies on a Globus User Interface. There is no need for user registration on the server side, and the user needs only his or her own X.509 personal certificate. The system is user-friendly, secure (it uses the SSL protocol and mechanisms for dynamic delegation and identity creation in public key infrastructures), highly customizable, open source, and easy to install. The X.509 personal certificate never leaves the local machine. The system reduces the time spent on job submission while granting higher efficiency and a better security level in proxy delegation and management.

  1. A simple quality assurance test tool for the visual verification of light and radiation field congruence using an electronic portal imaging device and computed radiography

    PubMed Central

    2012-01-01

    Background The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment, so it is imperative that it be congruent with the radiation field. Method A simple quality assurance tool has been designed for a rapid and simple test of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be detected to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group 142 recommendation of a 2 mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence. PMID:22452821

  2. A Laboratory-Based System for Managing and Distributing Publically Funded Geochemical Data in a Collaborative Environment

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Brown, A.; Liffers, M.

    2015-12-01

    Publicly funded laboratories have a responsibility to generate, archive, and disseminate analytical data to the research community. Laboratory managers know, however, that a long tail of analytical effort never escapes researchers' thumb drives once they leave the lab. This work reports on a research data management project (the Digital Mineralogy Library, DML) in which integrated hardware and software systems automatically archive and deliver analytical data and metadata to institutional and community data portals. The scientific objective of the DML project was to quantify the modal abundance of heavy minerals extracted from key lithological units in Western Australia. The selected analytical platform was a TESCAN Integrated Mineral Analyser (TIMA), which uses EDS-based mineral classification software to image and quantify mineral abundance and grain size at micron-scale resolution. The analytical workflow used a bespoke laboratory information management system (LIMS) to orchestrate: (1) the preparation of grain mounts with embedded QR codes that serve as enduring links between physical samples and analytical data; (2) the assignment of an International Geo Sample Number (IGSN) and Digital Object Identifier (DOI) to each grain mount via the System for Earth Sample Registration (SESAR); (3) the assignment of a DOI to instrument metadata via Research Data Australia; (4) the delivery of TIMA analytical outputs, including spatially registered mineralogy images and mineral abundance data, to an institutionally based data management server; and (5) the downstream delivery of a final data product via a Google Maps interface such as the AuScope Discovery Portal (a sketch of how a LIMS might chain these steps is given below). The modular design of the system permits the networking of multiple instruments within a single site or across multiple collaborating research institutions. Although sharing analytical data does provide new opportunities for the geochemistry community, the creation of an open data network requires: (1) adopting open data reporting standards and conventions; (2) requiring instrument manufacturers and software developers to deliver and process data in formats compatible with open standards; and (3) public funding agencies incentivising researchers, laboratories, and institutions to make their data open and accessible to consumers.
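
    The five-step workflow above is essentially an orchestration problem. The Python sketch below is purely illustrative: the SESAR, DOI-minting, and portal interfaces are not specified in the abstract, so every helper here is a stub standing in for a real integration, and all names and URLs are hypothetical.

        import uuid

        # Stubs marking where the real integrations belong; none of these
        # reflect an actual SESAR, DOI, or AuScope API.
        def make_qr_label(payload):           return "QR[%s]" % payload
        def sesar_register(name, link):       return "IGSN-" + uuid.uuid4().hex[:8]
        def mint_doi(igsn):                   return "10.9999/example/" + igsn
        def archive_outputs(mount_id, files): print("archived", mount_id, files)
        def push_to_portal(mount_id, files):  print("published", mount_id, files)

        def register_mount(sample_name):
            """Steps 1-3: label the mount, register the sample, mint identifiers."""
            mount_id = uuid.uuid4().hex
            qr = make_qr_label("https://lims.example.org/mounts/" + mount_id)  # step 1
            igsn = sesar_register(sample_name, qr)                             # step 2
            doi = mint_doi(igsn)                                               # steps 2-3
            return {"mount": mount_id, "igsn": igsn, "doi": doi}

        def deliver(record, tima_outputs):
            """Steps 4-5: archive institutionally, then publish to the portal."""
            archive_outputs(record["mount"], tima_outputs)
            push_to_portal(record["mount"], tima_outputs)

        record = register_mount("heavy-mineral separate, Western Australia")
        deliver(record, ["mineral_map.tif", "abundance.csv"])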

  3. "One-Stop Shopping" for Ocean Remote-Sensing and Model Data

    NASA Technical Reports Server (NTRS)

    Li, P. Peggy; Vu, Quoc; Chao, Yi; Li, Zhi-Jin; Choi, Jei-Kook

    2006-01-01

    OurOcean Portal 2.0 (http://ourocean.jpl.nasa.gov) is a software system designed to enable users to easily gain access to ocean observation data, both remote-sensing and in-situ, configure and run an ocean model with observation data assimilated on a remote computer, and visualize both the observation data and the model outputs. At present, the observation data and models focus on the California coastal regions and Prince William Sound in Alaska. This system can be used to perform both real-time and retrospective analyses of remote-sensing data and model outputs. OurOcean Portal 2.0 incorporates state-of-the-art information technologies (IT) such as a MySQL database, a Java Web server (Apache/Tomcat), the Live Access Server (LAS), interactive graphics with a Java applet on the client side and MatLab/GMT on the server side, and distributed computing. OurOcean currently serves over 20 real-time or historical ocean data products. The data are served as pre-generated plots or in their native data format. For some of the datasets, users can choose different plotting parameters and produce customized graphics. OurOcean also serves 3D ocean model outputs generated by ROMS (Regional Ocean Modeling System) using LAS. The Live Access Server (LAS) software, developed by the Pacific Marine Environmental Laboratory (PMEL) of the National Oceanic and Atmospheric Administration (NOAA), is a configurable Web-server program designed to provide flexible access to geo-referenced scientific data. The model output can be viewed as plots of horizontal slices, depth profiles, or time sequences, or can be downloaded as raw data in different data formats, such as NetCDF, ASCII, binary, etc. The interactive visualization is provided by the graphics software Ferret, also developed by PMEL. In addition, OurOcean allows users with minimal computing resources to configure and run an ocean model with data assimilation on a remote computer. Users may select the forcing input, the data to be assimilated, the simulation period, and the output variables, and submit the model to run on a backend parallel computer. When the run is complete, the output is added to the LAS server for visualization and download.

  4. Variation in use of Internet-based patient portals by parents of children with chronic disease.

    PubMed

    Byczkowski, Terri L; Munafo, Jennifer K; Britto, Maria T

    2011-05-01

    To assess the use of Internet-based portals among families of children with chronic diseases and to describe characteristics of portal registrants and users. Retrospective observational study at Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio, using data from September 1, 2003, through February 29, 2008. Participants were parents of children with diabetes mellitus, juvenile idiopathic arthritis, or cystic fibrosis. Parents of children with a chronic disease were given the opportunity to access health-related information for their children via an Internet-based portal. Outcome measures were the percentage of families who obtained a portal account (registered), used the portal for the first time within 3 months and again 3 to 6 months after registration, the number of times logged in, and session length. Of 1900 families, 27.9% obtained a portal account. Of those, 47.8% used the portal within 3 months of registration and 15.9% continued to use the portal 3 to 6 months after registration. Families of African American patients and of patients insured by Medicaid were less likely to obtain a portal account. More outpatient visits and having private health insurance coverage were associated with increased portal registration and use. Understanding the feasibility of portal use by parents is an important first step to using portals for improving self-management, patient-provider interactions, and outcomes for children with chronic diseases. Subsequent studies should address parent perceptions of the value portals add to the management of the chronic disease of their child and ways to increase that value. Barriers to using portals among racial minorities and publicly insured families should also be studied to address disparities.

  5. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also offering rich web applications for ease of access, built with modern web technologies on open source software. This presentation showcases the use of open source software across our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); the open source MapServer is adopted as the WMS server. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data; its main purpose is public outreach, and it was developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. Flow itself is open source software developed by JAXA/ISAS under the BSD 3-Clause License, and the SPICE Toolkit is essential to compile it. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
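
    The KADIAS map display rests on the standard WMS GetMap request that OpenLayers issues to MapServer. The fragment below, which uses only the Python standard library, builds such a request by hand to show what travels over the wire; the endpoint, layer name, and bounding box are placeholders, not the actual DARTS service.

        from urllib.parse import urlencode

        # Standard WMS 1.1.1 GetMap parameters; any WMS server, including
        # MapServer, understands this query string.
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "kaguya_dem",              # placeholder layer name
            "STYLES": "",
            "SRS": "EPSG:4326",
            "BBOX": "-180,-90,180,90",           # minx,miny,maxx,maxy
            "WIDTH": "1024",
            "HEIGHT": "512",
            "FORMAT": "image/png",
        }
        url = "https://wms.example.org/cgi-bin/mapserv?" + urlencode(params)
        print(url)  # fetching this URL returns the rendered map image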

  6. Mouse and Rat Models of Induction of Hepatic Fibrosis and Assessment of Portal Hypertension.

    PubMed

    Klein, Sabine; Schierwagen, Robert; Uschner, Frank Erhard; Trebicka, Jonel

    2017-01-01

    Portal hypertension either develops due to progressive liver fibrosis or is the consequence of vascular liver diseases such as portal vein thrombosis or non-cirrhotic portal hypertension. This chapter focuses on different rodent models of liver fibrosis with portal hypertension, and also on a few models of non-cirrhotic portal hypertension. Importantly, once portal hypertension has developed, drug effects in the portal and systemic circulation must be properly assessed. The last part of the chapter is dedicated to techniques for assessing in vivo hemodynamics and to the ex vivo techniques of isolated liver perfusion and vascular contractility.

  7. Interim Open Source Software (OSS) Policy

    EPA Pesticide Factsheets

    This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.

  8. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is one of the important levers for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime, it is crucial that third-party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  9. A technique for reducing patient setup uncertainties by aligning and verifying daily positioning of a moving tumor using implanted fiducials

    PubMed Central

    Balter, Peter; Morice, Rodolfo C.; Choi, Bum; Kudchadker, Rajat J.; Bucci, Kara; Chang, Joe Y.; Dong, Lei; Tucker, Susan; Vedam, Sastry; Briere, Tina; Starkschall, George

    2008-01-01

    This study aimed to validate and implement a methodology in which fiducials implanted in the periphery of lung tumors can be used to reduce uncertainties in tumor location. Alignment software that matches marker positions on two‐dimensional (2D) kilovoltage portal images to positions on three‐dimensional (3D) computed tomography data sets was validated using static and moving phantoms. This software also was used to reduce uncertainties in tumor location in a patient with fiducials implanted in the periphery of a lung tumor. Alignment of fiducial locations in orthogonal projection images with corresponding fiducial locations in 3D data sets can position both static and moving phantoms with an accuracy of 1 mm. In a patient, alignment based on fiducial locations reduced systematic errors in the left–right direction by 3 mm and random errors by 2 mm, and random errors in the superior–inferior direction by 3 mm as measured by anterior–posterior cine images. Software that matches fiducial markers on 2D and 3D images is effective for aligning both static and moving fiducials before treatment and can be implemented to reduce patient setup uncertainties. PACS number: 81.40.Wx
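
    The alignment step, matching fiducial positions measured on orthogonal 2D portal images against the positions expected from the 3D planning data, reduces in the simplest case to a translation-only correction. The NumPy sketch below illustrates that simplest case only; it is not the validated software from the study, which also handles projection geometry and respiratory motion, and the coordinates are invented for the example.

        import numpy as np

        def couch_shift(measured, expected):
            """Translation-only setup correction from fiducial positions.

            measured, expected : (N, 3) arrays of marker coordinates in mm,
            e.g. reconstructed from two orthogonal portal images. For a pure
            translation, the least-squares optimum is simply the mean offset.
            """
            measured = np.asarray(measured, dtype=float)
            expected = np.asarray(expected, dtype=float)
            return (expected - measured).mean(axis=0)

        # Three fiducials, each displaced by roughly the same setup error:
        meas = [[1.2, -3.1, 0.4], [0.8, -2.9, 0.7], [1.0, -3.0, 0.4]]
        plan = [[0.0, 0.0, 0.0], [-0.3, 0.2, 0.3], [-0.1, 0.1, 0.0]]
        print(couch_shift(meas, plan))  # approx. [-1.1, 3.1, -0.4] mm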

  10. The Global Climate Dashboard: a Software Interface to Stream Comprehensive Climate Data

    NASA Astrophysics Data System (ADS)

    Gardiner, N.; Phillips, M.; NOAA Climate Portal Dashboard

    2011-12-01

    The Global Climate Dashboard is an integral component of NOAA's web portal to climate data, services, and value-added content for decision-makers, teachers, and the science-attentive public (www.climate.gov). The dashboard provides a rapid view of observational data that demonstrate climate change and variability, as well as outputs from the Coupled Model Intercomparison Project phase 3 (CMIP3), which was built to support the Intergovernmental Panel on Climate Change fourth assessment. The data shown in the dashboard therefore span a range of climate science disciplines, with applications that serve audiences with diverse needs. The dashboard is designed with reusable software components that allow it to be implemented incrementally on a wide range of platforms, including desktops, tablet devices, and mobile phones. The underlying software components support live streaming of data and provide a way of encapsulating graph styles and other presentation details into a device-independent standard format that results in a common visual look and feel across all platforms. Here we describe the pedagogical objectives, technical implementation, and deployment of the dashboard through climate.gov and partner web sites, and we describe plans to develop a mobile application using the same framework.

  11. The CARMEN software as a service infrastructure.

    PubMed

    Weeks, Michael; Jessop, Mark; Fletcher, Martyn; Hodge, Victoria; Jackson, Tom; Austin, Jim

    2013-01-28

    The CARMEN platform allows neuroscientists to share data, metadata, services, and workflows, and to execute these services and workflows remotely via a Web portal. This paper describes how we implemented a service-based infrastructure within the CARMEN Virtual Laboratory. A Software as a Service framework was developed to allow generic new and legacy code to be deployed as services on a heterogeneous execution framework. Users can submit analysis code, typically written in Matlab, Python, C/C++, or R, as non-interactive standalone command-line applications and wrap it as services in a form suitable for deployment on the platform. The CARMEN Service Builder tool enables neuroscientists to quickly wrap their analysis software for deployment to the CARMEN platform as a service, without knowledge of the service framework or the CARMEN system. A metadata schema describes each service in terms of both system and user requirements. The search functionality allows services to be quickly discovered among the many services available. Within the platform, services may be combined into more complicated analyses using the workflow tool. CARMEN and the service infrastructure are targeted towards the neuroscience community; however, it is a generic platform and can be targeted towards any discipline.
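
    The wrapping pattern described above, turning a standalone command-line analysis into something a platform can invoke non-interactively, can be sketched generically. The Python fragment below illustrates the idea under assumed file-based conventions; it is not the actual CARMEN Service Builder, and the flag names and JSON parameter file are inventions for the example.

        import json
        import pathlib
        import subprocess
        import tempfile

        def run_service(tool_cmd, input_file, params):
            """Run a command-line analysis tool as a non-interactive service call.

            tool_cmd   : path to the wrapped executable (e.g. compiled C/C++ code)
            input_file : data file staged by the platform
            params     : analysis parameters, passed via a JSON side file
            """
            workdir = pathlib.Path(tempfile.mkdtemp(prefix="service_job_"))
            (workdir / "params.json").write_text(json.dumps(params))
            out_file = workdir / "result.dat"
            proc = subprocess.run(
                [str(tool_cmd),
                 "--input", str(input_file),                 # hypothetical flags;
                 "--params", str(workdir / "params.json"),   # real tools define
                 "--output", str(out_file)],                 # their own interface
                capture_output=True, text=True)
            # The wrapper reports status, logs, and the output location back
            # to the platform instead of interacting with the user.
            return {"ok": proc.returncode == 0, "log": proc.stderr,
                    "output": str(out_file)}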

  12. Modified Anterolateral Portals in Elbow Arthroscopy: A Cadaveric Study on Safety.

    PubMed

    Thon, Stephen; Gold, Peter; Rush, Lane; O'Brien, Michael J; Savoie, Felix H

    2017-11-01

    To evaluate the proximity to the radial nerve, on cadaveric specimens, of 2 modified anterolateral portals used for elbow arthroscopy. Ten fresh cadaveric elbow specimens were prepared. Four-millimeter Steinmann pins were inserted into 3 anterolateral portal sites defined in relation to the lateral epicondyle: (1) the standard distal anterolateral portal, (2) a modified direct anterolateral portal, and (3) a modified proximal anterolateral portal. The modified portals were defined as follows: the direct portal 2 cm directly anterior to the lateral epicondyle, and the proximal portal 2 cm proximal and 2 cm directly anterior to the lateral epicondyle. Each elbow was then dissected to reveal the course of the radial nerve. Digital photographs were taken of each specimen, and the distance from the Steinmann pin to the radial nerve was measured. The modified proximal anterolateral and direct anterolateral portals were found to be a statistically significantly greater distance from the radial nerve compared with the distal portal site (P = .011 and P = .0011, respectively). No significant difference was found in the proximity of the radial nerve between the modified proximal and direct anterolateral portals (P = .25). Imaging was inadequate for one specimen at the proximal portal site, so 9 specimens were used for analysis of that portal, with 10 complete specimens for the other 2 sites. In cadaveric analysis, both the modified proximal and direct lateral portals provide adequate distance from the radial nerve and may be safe for clinical use. In this study, the distal anterolateral portal was in close proximity to the radial nerve and may result in iatrogenic injury in the clinical setting. This is a cadaveric analysis of 2 modified portal locations at the anterolateral elbow for use in elbow arthroscopy. Further clinical studies are needed prior to determining their absolute safety in comparison to previously identified portal sites. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  13. EBSDinterp 1.0: A MATLAB® Program to Perform Microstructurally Constrained Interpolation of EBSD Data.

    PubMed

    Pearce, Mark A

    2015-08-01

    EBSDinterp is a graphical user interface (GUI)-based MATLAB® program to perform microstructurally constrained interpolation of nonindexed electron backscatter diffraction data points. The area available for interpolation is restricted using variations in pattern quality or band contrast (BC). Areas of low BC are not available for interpolation, and therefore cannot be erroneously filled by adjacent grains "growing" into them. Points with the most indexed neighbors are interpolated first, and the required number of neighbors is reduced with each successive round until a minimum number of neighbors is reached. Further iterations allow more data points to be filled by reducing the BC threshold. This method ensures that the best-quality points (those with high BC and the most neighbors) are interpolated first, and that the interpolation is restricted to grain interiors before adjacent grains are grown together to produce a complete microstructure. The algorithm is implemented through a GUI, taking advantage of MATLAB®'s parallel processing toolbox to perform the interpolations rapidly so that a variety of parameters can be tested to ensure that the final microstructures are robust and artifact-free. The software is freely available through the CSIRO Data Access Portal (doi:10.4225/08/5510090C6E620) as both a compiled Windows executable and as source code.
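
    The interpolation order described above, highest-confidence points first under a band-contrast mask, can be condensed into a short sketch. The Python/NumPy fragment below is a simplified rendering of the published MATLAB approach, not a port of it: the 4-connected neighborhood, the single fixed BC threshold, and the first-neighbor "vote" are illustrative assumptions (np.roll also wraps at array edges, which a real implementation would avoid).

        import numpy as np

        def constrained_fill(phase, bc, bc_min=60, max_rounds=4):
            """Fill non-indexed EBSD pixels (phase == 0) from indexed neighbors.

            Pixels with band contrast below bc_min stay empty, so low-quality
            regions cannot be overgrown by adjacent grains.
            """
            phase = phase.copy()
            shifts = [(0, 1), (0, -1), (1, 0), (-1, 0)]       # 4-connected neighbors
            for need in (4, 3, 2, 1):                         # most-neighbors first
                for _ in range(max_rounds):
                    counts = np.zeros(phase.shape, dtype=int)
                    votes = np.zeros_like(phase)
                    for dy, dx in shifts:
                        nb = np.roll(phase, (dy, dx), axis=(0, 1))
                        counts += (nb > 0).astype(int)
                        votes = np.where((votes == 0) & (nb > 0), nb, votes)
                    fill = (phase == 0) & (bc >= bc_min) & (counts >= need)
                    if not fill.any():
                        break
                    phase[fill] = votes[fill]                 # adopt a neighboring grain id
            return phase

        rng = np.random.default_rng(0)
        phase = rng.integers(0, 3, (8, 8))   # 0 = non-indexed, 1-2 = grain ids
        bc = rng.integers(0, 100, (8, 8))    # band contrast map
        print(constrained_fill(phase, bc))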

  14. Digital Rebirth of the Greatest Church of Cluny Maior Ecclesia: from Optronic Surveys to Real Time Use of the Digital Model

    NASA Astrophysics Data System (ADS)

    Landrieu, J.; Père, C.; Rollier, J.; Castandet, S.; Schotte, G.

    2011-09-01

    Our multidisciplinary team has virtually reconstructed the greatest church of the Romanesque period in Europe. The third church of the Abbey of Cluny (12th c.) was destroyed after the French Revolution, leaving only 8% of the building standing. Many documents have been studied in order to include the latest archaeological knowledge in the virtual model. Most remains have been scanned for CAD restitution. The mock-up of the church required 1600 different digital files, including the scanned pieces and the anastylosis of a Romanesque portal, a Gothic façade, and a mosaic pavement. We faced various difficulties in assembling the different elements of the huge building and in incorporating the digitized parts. Our workflow consisted of generating geometrical shapes of the church, enriched with metadata such as texture and material. The whole mock-up was finally exported to dedicated software to run the rendering step. Our work resulted in a whole database of 3D models as well as 2D sources (plans, engravings, pictures...) accessible to the scientific community. The scientific perspectives focus on an immersive virtual representation of the great church at full scale and on access to the digital mock-up through Augmented Reality.

  15. Observ-OM and Observ-TAB: Universal syntax solutions for the integration, search, and exchange of phenotype and genotype information.

    PubMed

    Adamusiak, Tomasz; Parkinson, Helen; Muilu, Juha; Roos, Erik; van der Velde, Kasper Joeri; Thorisson, Gudmundur A; Byrne, Myles; Pang, Chao; Gollapudi, Sirisha; Ferretti, Vincent; Hillege, Hans; Brookes, Anthony J; Swertz, Morris A

    2012-05-01

    Genetic and epidemiological research increasingly employs large collections of phenotypic and molecular observation data from high-quality human and model organism samples. Standardization efforts have produced a few simple formats for the exchange of these various data, but a lightweight and convenient data representation scheme covering all data modalities does not exist, hindering successful data integration, such as the assignment of mouse models to orphan diseases and phenotypic clustering for pathways. We report a unified system to integrate and compare observation data across experimental projects, disease databases, and clinical biobanks. The core object model (Observ-OM) comprises only four basic concepts to represent any kind of observation: Targets, Features, Protocols (and their Applications), and Values. An easy-to-use file format (Observ-TAB) employs Excel to represent individual and aggregate data in straightforward spreadsheets. The system has been tested successfully on human biobank, genome-wide association study, quantitative trait locus, model organism, and patient registry data, using the MOLGENIS platform to quickly set up custom data portals. Our system will dramatically lower the barrier for future data sharing and facilitate integrated search across panels and species. All models, formats, documentation, and software are freely available as open source (LGPLv3) at http://www.observ-om.org. © 2012 Wiley Periodicals, Inc.
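
    A minimal sketch of the four Observ-OM concepts as plain Python dataclasses may help make the model concrete. Every field name beyond the four concept names themselves is an illustrative assumption, not part of the published model.

      from dataclasses import dataclass

      @dataclass
      class Target:                 # the entity observed, e.g. a patient or sample
          name: str

      @dataclass
      class Feature:                # the property observed, e.g. "body height"
          name: str
          unit: str = ""

      @dataclass
      class ProtocolApplication:    # one concrete application of a measurement Protocol
          protocol: str

      @dataclass
      class ObservedValue:          # a Value tying Target, Feature, and Protocol together
          target: Target
          feature: Feature
          application: ProtocolApplication
          value: str

      # One phenotypic observation expressed in the four-concept scheme:
      height = ObservedValue(
          Target("patient-042"),
          Feature("body height", unit="cm"),
          ProtocolApplication("clinic visit 1"),
          "178",
      )
      print(height)

    The appeal of so small a core is that one Observ-TAB spreadsheet row can carry any modality, phenotype, genotype, or aggregate statistic, by varying only the Feature and Protocol.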

  16. Contrast-enhanced sonography for quantitative assessment of portal hypertension in patients with liver cirrhosis.

    PubMed

    Qu, En-Ze; Zhang, Ying-Cai; Li, Zhi-Yan; Liu, Yang; Wang, Jin-Rui

    2014-11-01

    The clinical utility of contrast-enhanced sonography in portal hypertension remains unclear. We explored the feasibility of using contrast-enhanced sonography for noninvasive assessment of portal venous pressure. Twenty healthy individuals (control group; 9 men; mean age, 46.4 years) and 18 patients with portal hypertension (15 men; mean age, 46.2 years) were enrolled in this study. The portal hypertension group included patients who underwent splenectomy and pericardial blood vessel disarticulation at our hospital from October 2010 to March 2011. One week before surgery, patients with portal hypertension underwent preoperative liver contrast-enhanced sonography. Two-dimensional, Doppler, and contrast-enhanced sonographic parameters were compared between the groups. Portal venous pressure was measured intraoperatively by portal vein puncture in the portal hypertension group, and its relationship with the other parameters was analyzed. The 2-dimensional, Doppler, and contrast-enhanced sonographic parameters differed between the groups (P < .01). Portal venous pressure was inversely correlated with the area under the portal vein/hepatic artery time-intensity curve ratio (Qp/Qa), portal vein/hepatic artery strength ratio (Ip/Ia), and portal vein/hepatic artery wash-in perfusion slope ratio (βp/βa), with correlation coefficients of -0.701, -0.625, and -0.494, respectively. Measurement of the liver contrast-enhanced sonographic parameters Qp/Qa, Ip/Ia, and βp/βa could be used as a new quantitative method for noninvasively assessing portal venous pressure. © 2014 by the American Institute of Ultrasound in Medicine.
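
    As a worked illustration of the Qp/Qa parameter, the Python sketch below computes the area under portal-vein and hepatic-artery time-intensity curves and correlates per-patient ratios with measured pressures. All numbers, names, and units here are invented for illustration and are not taken from the study.

      import numpy as np
      from scipy.integrate import trapezoid
      from scipy.stats import pearsonr

      def qp_qa_ratio(t, portal, artery):
          """Area under the portal-vein TIC divided by that of the hepatic artery."""
          return trapezoid(portal, t) / trapezoid(artery, t)

      # Hypothetical per-patient Qp/Qa ratios and intraoperative portal venous
      # pressures (arbitrary units); invented values, shown only to demonstrate
      # the correlation computation.
      qp_qa = np.array([1.8, 1.5, 1.2, 1.0, 0.9])
      pressure = np.array([24.0, 28.0, 33.0, 37.0, 41.0])

      r, p = pearsonr(qp_qa, pressure)
      print(f"r = {r:.3f}, p = {p:.3f}")   # a negative r mirrors the reported trend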

  17. Paleomagnetism.org - An online multi-platform and open source environment for paleomagnetic analysis.

    NASA Astrophysics Data System (ADS)

    Koymans, Mathijs; Langereis, Cor; Pastor-Galán, Daniel; van Hinsbergen, Douwe

    2017-04-01

    This contribution gives an overview of Paleomagnetism.org (Koymans et al., 2016), an online environment for paleomagnetic analysis. The application is developed in JavaScript and is fully open source. It presents an interactive website in which paleomagnetic data can be interpreted, evaluated, visualized, and shared with others. The application has been available since late 2015 and has since evolved with the addition of a magnetostratigraphic tool, additional input formats, and features that emphasize the link between geomagnetism and tectonics. In the interpretation portal, principal component analysis (Kirschvink et al., 1981) can be applied to visualized demagnetization data (Zijderveld, 1967). Interpreted directions and great circles are combined using the iterative procedure described by McFadden and McElhinny (1988). The resulting directions can be further used in the statistics portal or exported as raw tabulated data and high-quality figures. The available tools in the statistics portal cover standard Fisher statistics for directional data and virtual geomagnetic poles (Fisher, 1953; Butler, 1992; Deenen et al., 2011). Other tools include the eigenvector-approach foldtest (Tauxe and Watson, 1994), a bootstrapped reversal test (Tauxe et al., 2009), and the classical reversal test (McFadden and McElhinny, 1990). An implementation exists for the detection and correction of inclination shallowing in sediments (Tauxe and Kent, 2004; Tauxe et al., 2008), along with a module to visualize apparent polar wander paths (Torsvik et al., 2012; Kent and Irving, 2010; Besse and Courtillot, 2002) for the large continent-bearing plates. A miscellaneous portal hosts a set of additional tools, including a bootstrapped oroclinal test (Pastor-Galán et al., 2016) for assessing possible linear relationships between strike and declination. Another available tool performs a net tectonic rotation analysis (after Morris et al., 1999) that restores a dyke to its paleo-vertical orientation and can be used to determine paleo-spreading directions fundamental to plate reconstructions. Paleomagnetism.org provides an integrated approach for researchers to export and share paleomagnetic data through a common interface. The portals create a custom exportable file that can be distributed and included in public databases. This file can be appended to a publication and would then contain all the paleomagnetic data discussed in it; other researchers can import the appended file into the application for review. The accessibility and simplicity with which paleomagnetic data can be interpreted, analyzed, visualized, and shared should make Paleomagnetism.org of interest to the paleomagnetic and tectonic communities.
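
    Among the statistics-portal tools, Fisher (1953) statistics are the workhorse. The sketch below, written in Python rather than the site's JavaScript and entirely independent of the portal's own code, computes a Fisher mean direction, precision parameter k, and 95% confidence cone (a95) from declination/inclination pairs using the standard formulas.

      import numpy as np

      def fisher_mean(decs, incs):
          """Fisher (1953) mean direction, precision k, and 95% cone (a95).

          decs, incs: declinations and inclinations in degrees.
          """
          d, i = np.radians(decs), np.radians(incs)
          # Unit vectors in the paleomagnetic convention: x north, y east, z down.
          x, y, z = np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)
          n = len(d)
          rx, ry, rz = x.sum(), y.sum(), z.sum()
          R = np.sqrt(rx**2 + ry**2 + rz**2)           # resultant vector length
          mean_dec = np.degrees(np.arctan2(ry, rx)) % 360.0
          mean_inc = np.degrees(np.arcsin(rz / R))
          k = (n - 1) / (n - R)                        # precision parameter
          a95 = np.degrees(np.arccos(
              1.0 - (n - R) / R * ((1.0 / 0.05) ** (1.0 / (n - 1)) - 1.0)))
          return mean_dec, mean_inc, k, a95

      # Four tightly clustered example directions (degrees):
      print(fisher_mean(np.array([5.0, 358.0, 2.0, 10.0]),
                        np.array([42.0, 45.0, 39.0, 44.0])))

    The closer the resultant length R is to the number of directions n, the tighter the cluster: k grows and the a95 cone shrinks.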

  18. Public health information and statistics dissemination efforts for Indonesia on the Internet.

    PubMed

    Hanani, Febiana; Kobayashi, Takashi; Jo, Eitetsu; Nakajima, Sawako; Oyama, Hiroshi

    2011-01-01

    To elucidate current issues related to health statistics dissemination efforts on the Internet in Indonesia and to propose a new dissemination website as a solution. A cross-sectional survey was conducted. Sources of statistics were identified using link relationships and Google™ searches. For each site, the menus used to locate statistics, the modes of presentation and means of access to statistics, and the available statistics were assessed. Assessment results were used to derive a design specification; a prototype system was developed and evaluated with a usability test. 49 sources were identified on 18 governmental, 8 international, and 5 non-governmental websites. Of the 49 menus identified, 33% used non-intuitive titles and led to inefficient searches; 69% of these were on government websites. Of the 31 websites, only 39% and 23% used graphs/charts and maps, respectively, for presentation. Further, only 32%, 39%, and 19% provided query, export, and print features, respectively. While >50% of sources reported morbidity, risk factor, and service provision statistics, <40% of sources reported health resource and mortality statistics. A statistics portal website was developed using the Joomla!™ content management system. A usability test demonstrated its potential to improve data accessibility. In this study, the government's efforts to disseminate statistics in Indonesia are supported by non-governmental and international organizations, but the existing information may not be very useful because it is: a) not widely distributed, b) difficult to locate, and c) not effectively communicated. Actions are needed to ensure information usability, and one such action is the development of a statistics portal website.

  19. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
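
    To give a flavor of item (2), the toy Python check below flags variables that are declared but never referenced in a FORTRAN-like source listing. It is a deliberately naive illustration of this class of static analysis, not a reconstruction of the AVFS tools; the parsing (modern "!" comments, a handful of type keywords) is an assumption for the sake of brevity.

      import re

      def unused_variables(source):
          """Report variables that are declared but never referenced.

          A naive sketch of a use/definition consistency check; real
          FORTRAN parsing is far stricter than this.
          """
          declared, used = set(), set()
          for line in source.upper().splitlines():
              code = line.split("!")[0]          # strip trailing comments
              m = re.match(r"\s*(INTEGER|REAL|LOGICAL)\s+(.+)", code)
              if m:
                  declared.update(v.strip() for v in m.group(2).split(","))
              else:
                  used.update(re.findall(r"[A-Z][A-Z0-9]*", code))
          return declared - used

      src = """      INTEGER I, J
            REAL X
            I = 1
            X = I * 2.0
      """
      print(unused_variables(src))   # -> {'J'}: declared but never used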

  20. 76 FR 66051 - Availability of the Fiscal Year 2010 United States Special Operations Command (USSOCOM) Inventory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ... Authorization Act for Fiscal Year 2008 (NDAA 08) Section 807, the Director of Procurement USSOCOM and the Office of the Director, Defense Procurement and Acquisition Policy, Office of Strategic Sourcing (DPAP/SS... services. The inventory will be published to the USSOCOM public portal Web site at the following location...
