Sample records for netcdf network common

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, James

    The Atmospheric Radiation Measurement (ARM) Program's standard data format is NetCDF 3 (Network Common Data Form). The objective of this tutorial is to provide a basic introduction to NetCDF with an emphasis on aspects of the ARM application of NetCDF. The goal is to provide basic instructions for reading and visualizing ARM NetCDF data with the expectation that these examples can then be applied to more complex applications.
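
    A minimal sketch of the read-and-visualize workflow this tutorial covers, using the netCDF4 and matplotlib Python libraries; the file name and the variable name "temp_mean" are hypothetical placeholders, not actual ARM product names.

        import matplotlib.pyplot as plt
        from netCDF4 import Dataset

        # Open an ARM-style NetCDF file and plot one time series.
        # File and variable names below are placeholders, not real ARM names.
        with Dataset("sgp_met_example.nc") as nc:
            time = nc.variables["time"]            # numeric time coordinate
            temp = nc.variables["temp_mean"]       # hypothetical variable
            time_vals = time[:]
            values = temp[:]
            time_units = getattr(time, "units", "time")
            temp_units = getattr(temp, "units", "")

        plt.plot(time_vals, values)
        plt.xlabel(time_units)
        plt.ylabel(f"temp_mean ({temp_units})")
        plt.show()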

  2. Implementing Network Common Data Form (netCDF) for the 3DWF Model

    DTIC Science & Technology

    2016-02-01

    format. In addition, data extraction from netCDF-formatted Weather Research and Forecasting (WRF) model results necessary for the 3DWF model's wind... [report contents fragment: Requirement for the 3DWF Model; 3. Implementing netCDF to the 3DWF Model; 3.1 Weather Research and Forecasting (WRF) domain and results; 3.2 Extracting Variables from netCDF-Formatted WRF Data File; 3.3 Converting the 3DWF's Results into netCDF; 4. Conclusion; 5. References; Appendix]

  3. Public-domain-software solution to data-access problems for numerical modelers

    USGS Publications Warehouse

    Jenter, Harry; Signell, Richard

    1992-01-01

    Unidata's network Common Data Form, netCDF, provides users with an efficient set of software for scientific data storage, retrieval, and manipulation. The netCDF file format is machine-independent, direct-access, self-describing, and in the public domain, thereby alleviating many problems associated with accessing output from large hydrodynamic models. NetCDF has programming interfaces in both the Fortran and C computer languages, with an interface to C++ planned for release in the future. NetCDF also has an abstract data type that relieves users from understanding details of the binary file structure; data are written and retrieved by an intuitive, user-supplied name rather than by file position. Users are aided further by Unidata's inclusion of the Common Data Language, CDL, a printable text equivalent of the contents of a netCDF file. Unidata provides numerous operators and utilities for processing netCDF files. In addition, a number of public-domain and proprietary netCDF utilities from other sources are available at this time or will be available later this year. The U.S. Geological Survey has produced and is producing a number of public-domain netCDF utilities.
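
    The by-name data access and the CDL text equivalent described here look roughly like the following sketch, which assumes the netCDF4 Python library and the standard ncdump utility; the file and variable names are placeholders.

        import subprocess
        from netCDF4 import Dataset

        # Data are retrieved by a user-supplied name, not by file position.
        with Dataset("model_output.nc") as nc:        # placeholder file name
            elevation = nc.variables["elevation"][:]  # hypothetical variable
            print(elevation.shape, elevation.dtype)

        # CDL: the printable text equivalent of the file ("-h" = header only).
        cdl = subprocess.run(["ncdump", "-h", "model_output.nc"],
                             capture_output=True, text=True).stdout
        print(cdl)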

  4. Geospatial Analysis Tool Kit for Regional Climate Datasets (GATOR) : An Open-source Tool to Compute Climate Statistic GIS Layers from Argonne Climate Modeling Results

    DTIC Science & Technology

    2017-08-01

    This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF)... [citation fragment: Network Common Data Form (NetCDF). UCAR/Unidata Program Center, Boulder, CO. Available at: http://www.unidata.ucar.edu/software/netcdf. Accessed on 6/20...] ...emissions diverge from each other regarding fossil fuel use, technology, and other socioeconomic factors. As a result, the estimated emissions for each of...

  5. Users' Manual and Installation Guide for the EverVIEW Slice and Dice Tool (Version 1.0 Beta)

    USGS Publications Warehouse

    Roszell, Dustin; Conzelmann, Craig; Chimmula, Sumani; Chandrasekaran, Anuradha; Hunnicut, Christina

    2009-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need arose for additional tools to view and manipulate NetCDF datasets, specifically to create subsets of large NetCDF files. To address this need, we created the EverVIEW Slice and Dice Tool to allow users to create subsets of grid-based NetCDF files. The major functions of this tool are (1) to subset NetCDF files both spatially and temporally; (2) to view the NetCDF data in table form; and (3) to export filtered data to a comma-separated value file format.
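
    The Slice and Dice Tool itself is a desktop application; the sketch below only illustrates the operations it describes (spatial and temporal subsetting plus CSV export) with the netCDF4, numpy, and csv Python libraries. All file, variable, and coordinate names are assumptions.

        import csv
        import numpy as np
        from netCDF4 import Dataset

        # Subset a grid-based NetCDF file spatially and temporally,
        # then export the result to comma-separated values.
        with Dataset("stage.nc") as nc:               # placeholder names
            lat = nc.variables["lat"][:]
            lon = nc.variables["lon"][:]
            ii = np.where((lat >= 25.0) & (lat <= 26.0))[0]   # spatial box
            jj = np.where((lon >= -81.0) & (lon <= -80.0))[0]
            # Temporal subset: first 30 time steps only.
            data = nc.variables["stage"][:30,
                                         ii.min():ii.max() + 1,
                                         jj.min():jj.max() + 1]

        data = np.ma.filled(data, np.nan)             # unmask missing values
        with open("subset.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_index", "lat_index", "lon_index", "value"])
            for t, i, j in np.ndindex(data.shape):
                writer.writerow([t, i, j, data[t, i, j]])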

  6. Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.

  7. NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data

    USGS Publications Warehouse

    Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.

    2005-01-01

    NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, JAVA, VISUAL FORTRAN, VISUAL BASIC, VISUAL C++, and DELPHI, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) the data conversion part is designed to convert binary raw data to and from NetCDF data. It can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) the visualization part is designed for displaying grid map series (playing forward or backward) with a simple map legend, and displaying temporal trend curves for data on individual map pixels; and 3) the modeling interface is designed for environmental model development, by which a set of integrated NetCDF functions is provided for processing NetCDF data. To demonstrate that NCWin can easily extend the functions of some current GIS software and Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint for showing NetCDF map animations are given.
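
    NCWin itself is a Windows COM component; as a language-neutral illustration of its raw-binary-to-NetCDF conversion idea, the following Python sketch converts a BSQ (band-sequential) file. The shape, data type, and file names are assumptions.

        import numpy as np
        from netCDF4 import Dataset

        # Convert raw BSQ (band-sequential) binary data to NetCDF.
        # Dimensions and dtype must be known in advance for raw binary.
        nbands, nrows, ncols = 3, 100, 120            # assumed layout
        raw = np.fromfile("image.bsq", dtype=np.float32)
        raw = raw.reshape(nbands, nrows, ncols)       # BSQ: band, row, column

        with Dataset("image.nc", "w") as nc:
            nc.createDimension("band", nbands)
            nc.createDimension("row", nrows)
            nc.createDimension("col", ncols)
            var = nc.createVariable("image", "f4", ("band", "row", "col"))
            var[:] = raw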

  8. Visualizing NetCDF Files by Using the EverVIEW Data Viewer

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.

  9. Collaborative Sharing of Multidimensional Space-time Data Using HydroShare

    NASA Astrophysics Data System (ADS)

    Gan, T.; Tarboton, D. G.; Horsburgh, J. S.; Dash, P. K.; Idaszak, R.; Yi, H.; Blanton, B.

    2015-12-01

    HydroShare is a collaborative environment being developed for sharing hydrological data and models. It includes the capability to upload data in many formats as resources that can be shared. The HydroShare data model for resources uses a specific format for the representation of each type of data and specifies metadata common to all resource types as well as metadata unique to specific resource types. The Network Common Data Form (NetCDF) was chosen as the format for multidimensional space-time data in HydroShare. NetCDF is widely used in hydrological and other geoscience modeling because it contains self-describing metadata and supports the creation of array-oriented datasets that may include three spatial dimensions, a time dimension, and other user-defined dimensions. For example, NetCDF may be used to represent precipitation or surface air temperature fields that have two dimensions in space and one dimension in time. This presentation will illustrate how NetCDF files are used in HydroShare. When a NetCDF file is loaded into HydroShare, header information is extracted using the "ncdump" utility. Python functions, developed for the Django web framework on which HydroShare is based, extract science metadata present in the NetCDF file, saving the user from having to enter it. Where the file follows the Climate and Forecast (CF) convention and Attribute Convention for Dataset Discovery (ACDD) standards, metadata is thus automatically populated. Users also have the ability to add metadata to the resource that may not have been present in the original NetCDF file. HydroShare's metadata editing functionality then writes this science metadata back into the NetCDF file to maintain consistency between the science metadata in HydroShare and the metadata in the NetCDF file. This further helps researchers easily add metadata information following the CF and ACDD conventions. Additional data inspection and subsetting functions were developed, taking advantage of Python and command-line libraries for working with NetCDF files. We describe the design and implementation of these features and illustrate how NetCDF files from a modeling application may be curated in HydroShare and thus enhance reproducibility of the associated research. We also discuss future development planned for multidimensional space-time data in HydroShare.
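
    A rough sketch of the metadata round trip described above, using the netCDF4 Python library: read the global attributes (where CF/ACDD-style metadata such as "title" and "summary" live), then write an edited value back so file and catalog stay consistent. The file name is a placeholder, and this is not HydroShare's actual code.

        from netCDF4 import Dataset

        # Extract global attributes (science metadata) from a NetCDF file.
        with Dataset("hydro_model.nc") as nc:
            metadata = {name: nc.getncattr(name) for name in nc.ncattrs()}
        print(metadata.get("title"), metadata.get("summary"))

        # Write user-edited metadata back into the file itself.
        with Dataset("hydro_model.nc", "a") as nc:
            nc.setncattr("summary", "Edited description of the model run.")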

  10. A data model of the Climate and Forecast metadata conventions (CF-1.6) with a software implementation (cf-python v2.1)

    NASA Astrophysics Data System (ADS)

    Hassell, David; Gregory, Jonathan; Blower, Jon; Lawrence, Bryan N.; Taylor, Karl E.

    2017-12-01

    The CF (Climate and Forecast) metadata conventions are designed to promote the creation, processing, and sharing of climate and forecasting data using Network Common Data Form (netCDF) files and libraries. The CF conventions provide a description of the physical meaning of data and of their spatial and temporal properties, but they depend on the netCDF file encoding which can currently only be fully understood and interpreted by someone familiar with the rules and relationships specified in the conventions documentation. To aid in development of CF-compliant software and to capture with a minimal set of elements all of the information contained in the CF conventions, we propose a formal data model for CF which is independent of netCDF and describes all possible CF-compliant data. Because such data will often be analysed and visualised using software based on other data models, we compare our CF data model with the ISO 19123 coverage model, the Open Geospatial Consortium CF netCDF standard, and the Unidata Common Data Model. To demonstrate that this CF data model can in fact be implemented, we present cf-python, a Python software library that conforms to the model and can manipulate any CF-compliant dataset.
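
    A minimal usage sketch of cf-python, the library the paper presents; it assumes cf-python is installed and uses a placeholder file name. cf.read returns the file's contents as field constructs of the CF data model.

        import cf  # cf-python

        # Read a CF-compliant dataset into CF data model field constructs.
        fields = cf.read("climate_output.nc")   # placeholder file name
        print(fields)                           # one summary per field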

  11. Displaying Composite and Archived Soundings in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Volkmer, Matthew R.; Blottman, Peter F.; Sharp, David W.

    2008-01-01

    In a previous task, the Applied Meteorology Unit (AMU) developed spatial and temporal climatologies of lightning occurrence based on eight atmospheric flow regimes. The AMU created climatological, or composite, soundings of wind speed and direction, temperature, and dew point temperature at four rawinsonde observation stations at Jacksonville, Tampa, Miami, and Cape Canaveral Air Force Station, for each of the eight flow regimes. The composite soundings were delivered to the National Weather Service (NWS) Melbourne (MLB) office for display using the National version of the Skew-T Hodograph analysis and Research Program (NSHARP) software program. The NWS MLB requested the AMU make the composite soundings available for display in the Advanced Weather Interactive Processing System (AWIPS), so they could be overlaid on current observed soundings. This will allow the forecasters to compare the current state of the atmosphere with climatology. This presentation describes how the AMU converted the composite soundings from NSHARP Archive format to Network Common Data Form (NetCDF) format, so that the soundings could be displayed in AWIPS. NetCDF is a set of data formats, programming interfaces, and software libraries used to read and write scientific data files. In AWIPS, each meteorological data type, such as soundings or surface observations, has a unique NetCDF format. Each format is described by a NetCDF template file. Although NetCDF files are in binary format, they can be converted to a text format called network Common data form Description Language (CDL). A software utility called ncgen is used to create a NetCDF file from a CDL file, while the ncdump utility is used to create a CDL file from a NetCDF file. AWIPS receives soundings in Binary Universal Form for the Representation of Meteorological data (BUFR) format (http://dss.ucar.edu/docs/formats/bufr/), and then decodes them into NetCDF format. Only two sounding files are generated in AWIPS per day. One file contains all of the soundings received worldwide between 0000 UTC and 1200 UTC, and the other includes all soundings between 1200 UTC and 0000 UTC. In order to add the composite soundings into AWIPS, a procedure was created to configure, or localize, AWIPS. This involved modifying and creating several configuration text files. A unique four-character site identifier was created for each of the 32 soundings so each could be viewed separately. The first three characters were based on the site identifier of the observed sounding, while the last character was based on the flow regime. While researching the localization process for soundings, the AMU discovered a method of archiving soundings so old soundings would not get purged automatically by AWIPS. This method could provide an alternative way of localizing AWIPS for composite soundings. In addition, this would allow forecasters to use archived soundings in AWIPS for case studies. A test sounding file in NetCDF format was written in order to verify the correct format for soundings in AWIPS. After the file was viewed successfully in AWIPS, the AMU wrote a software program in the Tool Command Language/Tool Kit (Tcl/Tk) language to convert the 32 composite soundings from NSHARP Archive to CDL format. The ncgen utility was then used to convert the CDL file to a NetCDF file. The NetCDF file could then be read and displayed in AWIPS.
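
    The CDL round trip described above can be scripted; a sketch driving the standard ncdump and ncgen utilities from Python (file names are placeholders):

        import subprocess

        # NetCDF (binary) -> CDL (text) with ncdump.
        with open("sounding.cdl", "w") as f:
            subprocess.run(["ncdump", "sounding.nc"], stdout=f, check=True)

        # CDL (text) -> NetCDF (binary) with ncgen; -o names the output file.
        subprocess.run(["ncgen", "-o", "sounding_copy.nc", "sounding.cdl"],
                       check=True)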

  12. Situational Lightning Climatologies for Central Florida: Phase III

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    This report describes work done by the Applied Meteorology Unit (AMU) to add composite soundings to the Advanced Weather Interactive Processing System (AWIPS). This allows National Weather Service (NWS) forecasters to compare the current atmospheric state with climatology. In a previous phase, the AMU created composite soundings for four rawinsonde observation stations in Florida, for each of eight flow regimes. The composite soundings were delivered to the NWS Melbourne (MLB) office for display using the NSHARP software program. NWS MLB requested that the AMU make the composite soundings available for display in AWIPS. The AMU first created a procedure to customize AWIPS so composite soundings could be displayed. A unique four-character identifier was created for each of the 32 composite soundings. The AMU wrote a Tool Command Language/Tool Kit (Tcl/Tk) software program to convert the composite soundings from NSHARP to Network Common Data Form (NetCDF) format. The NetCDF files were then displayable by AWIPS.

  13. NASA Briefing for Unidata

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2016-01-01

    The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office and the HDF Product Designer tool with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.

  14. Linking netCDF Data with the Semantic Web - Enhancing Data Discovery Across Domains

    NASA Astrophysics Data System (ADS)

    Biard, J. C.; Yu, J.; Hedley, M.; Cox, S. J. D.; Leadbetter, A.; Car, N. J.; Druken, K. A.; Nativi, S.; Davis, E.

    2016-12-01

    Geophysical data communities are publishing large quantities of data across a wide variety of scientific domains which are overlapping more and more. Whilst netCDF is a common format for many of these communities, it is only one of a large number of data storage and transfer formats. One of the major challenges ahead is finding ways to leverage these diverse data sets to advance our understanding of complex problems. We describe a methodology for incorporating Resource Description Framework (RDF) triples into netCDF files called netCDF-LD (netCDF Linked Data). NetCDF-LD explicitly connects the contents of netCDF files - both data and metadata - with external web-based resources, including vocabularies, standards definitions, and data collections, and through them, a whole host of related information. This approach also preserves and enhances the self-describing essence of the netCDF format and its metadata, whilst addressing the challenge of integrating various conventions into files. We present a case study illustrating how reasoning over RDF graphs can empower researchers to discover datasets across domain boundaries.
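
    A sketch of the core netCDF-LD idea, storing a URI link to an external vocabulary term as a variable attribute instead of a plain-text description; the attribute name, variable name, and URI below are illustrative assumptions, not the convention's actual specification.

        from netCDF4 import Dataset

        # Attach a Linked Data URI to a variable so its meaning can be
        # resolved against an external vocabulary (names are illustrative).
        with Dataset("obs.nc", "a") as nc:
            var = nc.variables["sea_water_temperature"]  # hypothetical
            var.setncattr("skos__exactMatch",
                          "http://vocab.example.org/collection/temperature")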

  15. A Prototype Web-based system for GOES-R Space Weather Data

    NASA Astrophysics Data System (ADS)

    Sundaravel, A.; Wilkinson, D. C.

    2010-12-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. The data from the satellite will be widely used by scientists for space weather modeling and predictions. This project looks into how these datasets can be made available to scientists on the Web and how to assist them in their research. We are working to develop a prototype web-based system that allows users to browse, search and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) format. NetCDF is a self-describing data format that contains both the metadata information and the data. The data is stored in an array-oriented fashion. The web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, the users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which the data is available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions which can be used by other applications to fetch data and use the data for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used in developing tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services. Grails is an open source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed developing software tools to generate NetCDF and CSV data files and also tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
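
    Data published through a THREDDS server can be read remotely over OPeNDAP, so client code retrieves only the slice it asks for; a sketch with the netCDF4 Python library, assuming the library was built with OPeNDAP support (the URL and variable name are placeholders).

        from netCDF4 import Dataset

        # Open a THREDDS-served dataset by its OPeNDAP URL; only the
        # requested subset travels over the network.
        url = "http://example.gov/thredds/dodsC/goesr/spaceweather.nc"
        with Dataset(url) as nc:
            flux = nc.variables["xray_flux"][-24:]   # hypothetical variable
            print(flux.shape)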

  16. A Study of NetCDF as an Approach for High Performance Medical Image Storage

    NASA Astrophysics Data System (ADS)

    Magnus, Marcone; Coelho Prado, Thiago; von Wangenhein, Aldo; de Macedo, Douglas D. J.; Dantas, M. A. R.

    2012-02-01

    The spread of telemedicine systems increases every day. Systems and PACS based on DICOM images have become common. This rise reflects the need to develop new storage systems that are more efficient and have lower computational costs. With this in mind, this article discusses a study of the NetCDF data format as a basic platform for storage of DICOM images. The case study compares an ordinary database, HDF5, and NetCDF for storing the medical images. Empirical results, using a real set of images, indicate that the time to retrieve large images from NetCDF has a higher latency compared to the other two methods. In addition, the latency is proportional to the file size, which represents a drawback for a telemedicine system that is characterized by a large number of large image files.
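
    A sketch of how such a retrieval-latency measurement can be made in Python; the archive layout (one variable holding an image stack) and the names are assumptions, not the study's actual benchmark code.

        import time
        from netCDF4 import Dataset

        # Time the retrieval of one stored image from a NetCDF archive.
        start = time.perf_counter()
        with Dataset("dicom_archive.nc") as nc:
            image = nc.variables["images"][0, :, :]   # first stored image
        elapsed = time.perf_counter() - start
        print(f"retrieved {image.shape} pixels in {elapsed:.3f} s")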

  17. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
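
    One common way to build an aggregated dataset from a collection of per-time-step output files, sketched with the xarray Python library (which needs dask for multi-file reads); the glob pattern is a placeholder, and this is not the paper's actual aggregation service.

        import xarray as xr

        # Aggregate many model output files into one logical dataset
        # along their shared coordinates (pattern is a placeholder).
        ds = xr.open_mfdataset("ocean_his_*.nc", combine="by_coords")
        print(ds)                      # lazily loaded, aggregated view
        ds.to_netcdf("aggregated.nc")  # optionally write one CF file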

  18. Tool to assess contents of ARM surface meteorology network netCDF files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staudt, A.; Kwan, T.; Tichler, J.

    The Atmospheric Radiation Measurement (ARM) Program, supported by the US Department of Energy, is a major program of atmospheric measurement and modeling designed to improve the understanding of processes and properties that affect atmospheric radiation, with a particular focus on the influence of clouds and the role of cloud radiative feedback in the climate system. The ARM Program will use three highly instrumented primary measurement sites. Deployment of instrumentation at the first site, located in the Southern Great Plains of the United States, began in May of 1992. The first phase of deployment at the second site in the Tropical Western Pacific is scheduled for late in 1995. The third site will be in the North Slope of Alaska and adjacent Arctic Ocean. To meet the scientific objectives of ARM, observations from the ARM sites are combined with data from other sources; these are called external data. Among these external data sets are surface meteorological observations from the Oklahoma Mesonet, a Kansas automated weather network, the Wind Profiler Demonstration Network (WPDN), and the National Weather Service (NWS) surface stations. Before combining these data with the Surface Meteorological Observations Station (SMOS) ARM data, it was necessary to assess the contents and quality of both the ARM and the external data sets. Since these data sets had previously been converted to netCDF format for use by the ARM Science Team, a tool was written to assess the contents of the netCDF files.
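
    A sketch of the kind of content-assessment pass such a tool performs, walking dimensions and variables and reporting value ranges and missing-value counts; the file name is a placeholder and numeric variables are assumed.

        import numpy as np
        from netCDF4 import Dataset

        # Summarize a netCDF file: dimensions, variables, value ranges,
        # and counts of missing (masked) samples.
        with Dataset("smos_station.nc") as nc:
            for name, dim in nc.dimensions.items():
                print(f"dimension {name}: size {len(dim)}")
            for name, var in nc.variables.items():
                data = var[:]                  # numeric variables assumed
                masked = int(np.ma.count_masked(data)) if np.ma.isMA(data) else 0
                print(f"{name}{var.dimensions}: min={data.min()}, "
                      f"max={data.max()}, missing={masked}")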

  19. Challenges to Standardization: A Case Study Using Coastal and Deep-Ocean Water Level Data

    NASA Astrophysics Data System (ADS)

    Sweeney, A. D.; Stroker, K. J.; Mungov, G.; McLean, S. J.

    2015-12-01

    Sea levels recorded at coastal stations and inferred from deep-ocean pressure observations at the seafloor are submitted for archive in multiple data and metadata formats. These formats include two forms of schema-less XML and a custom binary format accompanied by metadata in a spreadsheet. The authors report on efforts to use existing standards to make this data more discoverable and more useful beyond their initial use in detecting tsunamis. An initial review of data formats for sea level data around the globe revealed heterogeneity in presentation and content. In the absence of a widely-used domain-specific format, we adopted the general model for structuring data and metadata expressed by the Network Common Data Form (netCDF). netCDF has been endorsed by the Open Geospatial Consortium and has the advantages of small size when compared to equivalent plain text representation and provides a standard way of embedding metadata in the same file. We followed the orthogonal time-series profile of the Climate and Forecast discrete sampling geometries as the convention for structuring the data and describing metadata relevant for use. We adhered to the Attribute Convention for Data Discovery for capturing metadata to support user search. Beyond making it possible to structure data and metadata in a standard way, netCDF is supported by multiple software tools in providing programmatic cataloging, access, subsetting, and transformation to other formats. We will describe our successes and failures in adhering to existing standards and provide requirements for either augmenting existing conventions or developing new ones. Some of these enhancements are specific to sea level data, while others are applicable to time-series data in general.
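
    A minimal sketch of the orthogonal time-series layout of the CF discrete sampling geometries mentioned above: one time coordinate shared by all stations, with the global featureType attribute identifying the geometry. It is illustrative only; a fully compliant file would also carry a station-identifier variable with cf_role="timeseries_id".

        import numpy as np
        from netCDF4 import Dataset

        # Orthogonal time series: every station shares one time coordinate.
        with Dataset("water_level.nc", "w") as nc:
            nc.Conventions = "CF-1.6"
            nc.featureType = "timeSeries"
            nc.createDimension("station", 2)
            nc.createDimension("time", 3)
            time = nc.createVariable("time", "f8", ("time",))
            time.standard_name = "time"
            time.units = "seconds since 1970-01-01 00:00:00"
            time[:] = [0.0, 60.0, 120.0]
            level = nc.createVariable("sea_level", "f4", ("station", "time"))
            level.units = "m"
            level[:] = np.array([[1.0, 1.1, 1.2],
                                 [0.8, 0.9, 1.0]])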

  20. Carolinas Coastal Change Processes Project data report for observations near Diamond Shoals, North Carolina, January-May 2009

    USGS Publications Warehouse

    Armstrong, Brandy N.; Warner, John C.; Voulgaris, George; List, Jeffrey H.; Thieler, E. Robert; Martini, Marinna A.; Montgomery, Ellyn T.

    2011-01-01

    This Open-File Report provides information collected for an oceanographic field study that occurred during January-May 2009 to investigate processes that control the sediment transport dynamics at Diamond Shoals, North Carolina. The objective of this report is to make the data available in digital form and to provide information to facilitate further analysis of the data. The report describes the background, experimental setup, equipment, and locations of the sensor deployments. The edited data are presented in time-series plots for rapid visualization of the data set, and in data files that are in the Network Common Data Form (NetCDF). Supporting observational data are also included.

  1. Global Ocean Currents Database

    NASA Astrophysics Data System (ADS)

    Boyer, T.; Sun, L.

    2016-02-01

    NOAA's National Centers for Environmental Information has released an ocean currents database portal that aims 1) to integrate global ocean currents observations from a variety of instruments with different resolution, accuracy, and response to spatial and temporal variability into a uniform Network Common Data Form (NetCDF) format and 2) to provide dedicated online data discovery and access to NCEI-hosted and distributed data sources for ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform types and spatial/temporal ranges of interest. The dedicated web application is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely-used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), which let GOCD users work with data files in their favorite analysis and visualization client software without downloading them to their local machines. The potential users of the ocean currents database include, but are not limited to, 1) ocean modelers for model skill assessment, 2) scientists and researchers studying the impact of ocean circulation on climate variability, 3) the ocean shipping industry for safe navigation and for finding optimal routes for ship fuel efficiency, 4) ocean resources managers planning optimal sites for waste and sewage dumping and for renewable hydrokinetic energy, and 5) state and federal governments, to provide historical (analyzed) ocean circulation as an aid for search and rescue.

  2. Report on IVS-WG4

    NASA Astrophysics Data System (ADS)

    Gipson, John

    2011-07-01

    I describe the proposed data structure for storing, archiving, and processing VLBI data. In this scheme, most VLBI data is stored in NetCDF files. NetCDF has the advantage that there are interfaces to most common computer languages including Fortran, Fortran-90, C, C++, Perl, etc., and the most common operating systems including Linux, Windows, and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data, and also allows for extending the types of data used, e.g., source maps. I discuss the use of the new format in calc/solve and other VLBI analysis packages. I also discuss plans for transitioning to the new structure.

  3. A Climate Statistics Tool and Data Repository

    NASA Astrophysics Data System (ADS)

    Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.

    2017-12-01

    Researchers at Argonne National Laboratory and collaborating organizations have generated regional scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12 km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, is stored in Network Common Data Form (NetCDF) files, and the data volume is nearly 600 TB. A condensed 800 GB set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094) and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper will provide an overview of the project to generate the main and condensed data repositories, describe the Python tool and how to use it, present the GIS results of the computed examples, and discuss some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.
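
    A sketch of one of the summary statistics named above, counting days over 90°F per grid cell from a daily maximum temperature variable; the file and variable names are assumptions, and input in Kelvin is assumed.

        import numpy as np
        from netCDF4 import Dataset

        # Count days with daily maximum temperature above 90 deg F at
        # each grid cell (names assumed; input assumed to be in Kelvin).
        THRESHOLD_K = (90.0 - 32.0) * 5.0 / 9.0 + 273.15   # ~305.37 K

        with Dataset("wrf_tmax_daily.nc") as nc:
            tmax = nc.variables["tmax"][:]     # shape (day, y, x), Kelvin
        days_over_90 = np.sum(tmax > THRESHOLD_K, axis=0)
        print(days_over_90.shape, int(days_over_90.max()))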

  4. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.

  5. Data Container Study for Handling Array-based Data Using Rasdaman, Hive, Spark, and MongoDB

    NASA Astrophysics Data System (ADS)

    Xu, M.; Hu, F.; Yu, M.; Scheele, C.; Liu, K.; Huang, Q.; Yang, C. P.; Little, M. M.

    2016-12-01

    Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions in supporting big Earth observation data, we propose to investigate and compare four popular data container solutions: Rasdaman, Hive, Spark, and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e., dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these four data containers in terms of data discovery and access. The computing resources (e.g., CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) Rasdaman has the best performance for queries on statistical and operational functions, and supports the NetCDF data format better than HDF; 2) Rasdaman clustering configuration is more complex than the others; 3) Hive performs better on single pixel extraction from multiple images; and 4) except for single pixel extractions, Spark performs better than Hive and its performance is close to Rasdaman's. A comprehensive report will detail the experimental results and compare their pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.

  6. NetCDF4/HDF5 and Linked Data in the Real World - Enriching Geoscientific Metadata without Bloat

    NASA Astrophysics Data System (ADS)

    Ip, Alex; Car, Nicholas; Druken, Kelsey; Poudjom-Djomani, Yvette; Butcher, Stirling; Evans, Ben; Wyborn, Lesley

    2017-04-01

    NetCDF4 has become the dominant generic format for many forms of geoscientific data, leveraging (and constraining) the versatile HDF5 container format, while providing metadata conventions for interoperability. However, the encapsulation of detailed metadata within each file can lead to metadata "bloat", and difficulty in maintaining consistency where metadata is replicated to multiple locations. Complex conceptual relationships are also difficult to represent in simple key-value netCDF metadata. Linked Data provides a practical mechanism to address these issues by associating the netCDF files and their internal variables with complex metadata stored in Semantic Web vocabularies and ontologies, while complying with and complementing existing metadata conventions. One of the stated objectives of the netCDF4/HDF5 formats is that they should be self-describing: containing metadata sufficient for cataloguing and using the data. However, this objective can be regarded as only partially met where details of conventions and definitions are maintained externally to the data files. For example, one of the most widely used netCDF community standards, the Climate and Forecasting (CF) Metadata Convention, maintains standard vocabularies for a broad range of disciplines across the geosciences, but this metadata is currently neither readily discoverable nor machine-readable. We have previously implemented useful Linked Data and netCDF tooling (ncskos) that associates netCDF files, and individual variables within those files, with concepts in vocabularies formulated using the Simple Knowledge Organization System (SKOS) ontology. NetCDF files contain Uniform Resource Identifier (URI) links to terms represented as SKOS Concepts, rather than plain-text representations of those terms, so we can use simple, standardised web queries to collect and use rich metadata for the terms from any Linked Data-presented SKOS vocabulary. Geoscience Australia (GA) manages a large volume of diverse geoscientific data, much of which is being translated from proprietary formats to netCDF at NCI Australia. This data is made available through the NCI National Environmental Research Data Interoperability Platform (NERDIP) for programmatic access and interdisciplinary analysis. The netCDF files contain not only scientific data variables (e.g., gravity, magnetic, or radiometric values) but also domain-specific operational values (e.g., specific instrument parameters) best described fully in formal vocabularies. Our ncskos codebase provides access to multiple stores of detailed external metadata in a standardised fashion. Geophysical datasets are generated from a "survey" event, and GA maintains corporate databases of all surveys and their associated metadata. It is impractical to replicate the full source survey metadata into each netCDF dataset so, instead, we link the netCDF files to survey metadata using public Linked Data URIs. These URIs link to Survey class objects which we model as a subclass of Activity objects as defined by the PROV Ontology, and we provide URI resolution for them via a custom Linked Data API which draws current survey metadata from GA's in-house databases. We have demonstrated that Linked Data is a practical way to associate netCDF data with detailed, external metadata. This allows us to ensure that catalogued metadata is kept consistent with metadata points-of-truth, and we can infer complex conceptual relationships not possible with netCDF key-value attributes alone.

  7. Hydratools, a MATLAB® based data processing package for Sontek Hydra data

    USGS Publications Warehouse

    Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.

    2005-01-01

    The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
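
    Hydratools itself is MATLAB; as a rough Python illustration of the bad-data screening step described above, the sketch below masks out-of-range samples so they are stored as the variable's fill value, and records the step in the file history. The names and the valid range are assumptions.

        import numpy as np
        from netCDF4 import Dataset

        # Screen a velocity record: mask suspected spikes so they are
        # written back as missing data, and log the QC step.
        with Dataset("adv_burst.nc", "a") as nc:      # placeholder file
            var = nc.variables["u_velocity"]          # hypothetical variable
            data = var[:]                             # masked array
            data = np.ma.masked_where(np.abs(data) > 2.5, data)  # assumed range
            var[:] = data                 # masked points become fill values
            nc.history = getattr(nc, "history", "") + "; despiked u_velocity"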

  8. Using GDAL to Convert NetCDF 4 CF 1.6 to GeoTIFF: Interoperability Problems and Solutions for Data Providers and Distributors

    NASA Astrophysics Data System (ADS)

    Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.

    2015-12-01

    An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, includes the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e., columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6, but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident indicates that testing a candidate data product with one or more software products written to accept the advertised conventions is a practice which improves interoperability. Differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data, or both. The incident can also serve as a demonstration that data providers, distributors, and users can work together to improve data product quality and interoperability.
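
    The translation at the center of this incident can be scripted with GDAL's Python bindings; a sketch using the NETCDF subdataset syntax to select one variable (file and variable names are placeholders). GDAL reads the CRS from the CF grid_mapping machinery, which is exactly the kind of information that was missing in the case described.

        from osgeo import gdal

        # Translate one variable of a netCDF-4/CF-1.6 file to GeoTIFF.
        gdal.UseExceptions()
        src = 'NETCDF:"sea_ice.nc":ice_concentration'   # placeholder names
        gdal.Translate("ice_concentration.tif", src, format="GTiff")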

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.

  10. Serving Real-Time Point Observation Data in netCDF using Climate and Forecasting Discrete Sampling Geometry Conventions

    NASA Astrophysics Data System (ADS)

    Ward-Garrison, C.; May, R.; Davis, E.; Arms, S. C.

    2016-12-01

    NetCDF is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. The Climate and Forecasting (CF) metadata conventions for netCDF foster the ability to work with netCDF files in general and useful ways. These conventions include metadata attributes for physical units, standard names, and spatial coordinate systems. While these conventions have eased working with netCDF-formatted output from climate and forecast models, their use for point-based observation data has been less successful. Unidata has prototyped using the discrete sampling geometry (DSG) CF conventions to serve, using the THREDDS Data Server, the real-time point observation data flowing across the Internet Data Distribution (IDD). These data originate as text-format reports for individual stations (e.g., METAR surface data or TEMP upper-air data) and are converted and stored in netCDF files in real time. This work discusses the experiences and challenges of using the current CF DSG conventions for storing such real-time data. We also test how parts of netCDF's extended data model can address these challenges, in order to inform decisions for a future version of CF (CF 2.0) that would take advantage of features of the netCDF enhanced data model.

  11. Recommendations resulting from the SPDS Community-Wide Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Data Systems Panel identified three critical functionalities of a Space Physics Data System (SPDS): the delivery of self-documenting data, the existence of a matrix of translators between various standard formats (IDFS, CDF, netCDF, HDF, TENNIS, UCLA flat file, and FITS), and a network-based capability for browsing and examining inventory records for the system's data holdings. The recommendations resulting from the workshop include the philosophy, funding, and objectives of a SPDS. Access to quality data is seen as the most important objective by the Policy Panel, with curation and information about the data being integral parts of any accessible data set. The Data Issues Panel concluded that the SPDS can supply encouragement, guidelines, and ultimately provide a mechanism for financial support for data archiving, restoration, and curation. The Software Panel of the SPDS focused on defining the requirements and priorities for SPDS to support common data analysis and data visualization tools and packages.

  12. Task 28: Web Accessible APIs in the Cloud Trade Study

    NASA Technical Reports Server (NTRS)

    Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun

    2017-01-01

    This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project were to: conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, with Amazon Web Services (AWS) Simple Storage Service (S3) as the target environment; conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.

  13. Data Publishing and Sharing Via the THREDDS Data Repository

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.

    2007-12-01

    The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied via some intellectual challenge, but often located in different organizations and locations. A critical component to these endeavors is publishing and sharing of content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fashion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client initiated, on demand, location transparent storage for data of any type that can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to: * securely store and "own" data files and associated metadata * upload files via HTTP and gridftp * upload a collection of data as single file * modify and restructure repository contents * incorporate metadata provided by the user * generate additional metadata programmatically * edit individual metadata elements The TDR can exist separately from a TDS, serving content via HTTP. Also, it can work in conjunction with the TDS, which includes functionality to provide: * access to data in a variety of formats via -- OPeNDAP -- OGC Web Coverage Service (for gridded datasets) -- bulk HTTP file transfer * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats * serving of very large volume datasets, such as NEXRAD radar * aggregation into virtual datasets * subsetting via OPeNDAP and NetCDF Subsetting services This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.

  14. IVS Working Group 4: VLBI Data Structures

    NASA Astrophysics Data System (ADS)

    Gipson, J.

    2012-12-01

    I present an overview of the "openDB format" for storing, archiving, and processing VLBI data. In this scheme, most VLBI data is stored in NetCDF files. NetCDF has the advantage that there are interfaces to most common computer languages including Fortran, Fortran-90, C, C++, Perl, etc, and the most common operating systems including Linux, Windows, and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data. For example it allows you to easily change subsets of the data used in the analysis such as troposphere modeling, ionospheric calibration, editing, and ambiguity resolution. It also allows for extending the types of data used, e.g., source maps. I present a roadmap to transition to this new format. The new format can already be used by VieVS and by the global mode of solve. There are plans in work for other software packages to be able to use the new format.

  15. Unleashing Geophysics Data with Modern Formats and Services

    NASA Astrophysics Data System (ADS)

    Ip, Alex; Brodie, Ross C.; Druken, Kelsey; Bastrakova, Irina; Evans, Ben; Kemp, Carina; Richardson, Murray; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    Geoscience Australia (GA) is the national steward of large volumes of geophysical data extending over the entire Australasian region and spanning many decades. The volume and variety of data which must be managed, coupled with the increasing need to support machine-to-machine data access, mean that the old "click-and-ship" model delivering data as downloadable files for local analysis is rapidly becoming unviable - a "big data" problem not unique to geophysics. The Australian Government, through the Research Data Services (RDS) Project, recently funded the Australian National Computational Infrastructure (NCI) to organize a wide range of Earth Systems data from diverse collections including geoscience, geophysics, environment, climate, weather, and water resources onto a single High Performance Data (HPD) Node. This platform, which now contains over 10 petabytes of data, is called the National Environmental Research Data Interoperability Platform (NERDIP), and is designed to facilitate broad user access, maximise reuse, and enable integration. GA has contributed several hundred terabytes of geophysical data to the NERDIP. Historically, geophysical datasets have been stored in a range of formats, with metadata of varying quality and accessibility, and without standardised vocabularies. This has made it extremely difficult to aggregate original data from multiple surveys (particularly un-gridded geophysics point/line data) into standard formats suited to High Performance Computing (HPC) environments. To address this, it was decided to use the NERDIP-preferred Hierarchical Data Format (HDF) 5, which is a proven, standard, open, self-describing and high-performance format supported by extensive software tools, libraries and data services. The Network Common Data Form (NetCDF) 4 API facilitates the use of data in HDF5, whilst the NetCDF Climate & Forecasting conventions (NetCDF-CF) further constrain NetCDF4/HDF5 data so as to provide greater inherent interoperability. The first geophysical data collection selected for transformation by GA was Airborne ElectroMagnetics (AEM) data which was held in proprietary-format files, with associated ISO 19115 metadata held in a separate relational database. Existing NetCDF-CF metadata profiles were enhanced to cover AEM and other geophysical data types, and work is underway to formalise the new geophysics vocabulary as a proposed extension to the Climate & Forecasting conventions. The richness and flexibility of HDF5's internal indexing mechanisms has allowed lossless restructuring of the AEM data for efficient storage, subsetting and access via either the NetCDF4/HDF5 APIs or Open-source Project for a Network Data Access Protocol (OPeNDAP) data services. This approach not only supports large-scale HPC processing, but also interactive access to a wide range of geophysical data in user-friendly environments such as iPython notebooks and more sophisticated cloud-enabled portals such as the Virtual Geophysics Laboratory (VGL). As multidimensional AEM datasets are relatively complex compared to other geophysical data types, the general approach employed in this project for modernizing AEM data is likely to be applicable to other geophysics data types. When combined with the use of standards-based data services and APIs, a coordinated, systematic modernisation will result in vastly improved accessibility to, and usability of, geophysical data in a wide range of computational environments both within and beyond the geophysics community.
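
    The efficient storage and subsetting described above rest on HDF5 chunking and compression, which the netCDF-4 API exposes per variable; a generic sketch in Python (not GA's actual AEM schema, and all names and sizes are assumptions):

        import numpy as np
        from netCDF4 import Dataset

        # Store point-sampled geophysical data with chunking and zlib
        # compression so subsets can be read without scanning the file.
        with Dataset("aem_survey.nc", "w", format="NETCDF4") as nc:
            nc.createDimension("point", 100000)
            nc.createDimension("window", 30)
            em = nc.createVariable("em_response", "f4", ("point", "window"),
                                   zlib=True, complevel=4,
                                   chunksizes=(4096, 30))
            em[:] = np.random.rand(100000, 30).astype("f4")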

  16. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990s will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990s. This proposal encompasses the work required to build and demonstrate this capability.

  17. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990s will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990s. This proposal encompasses the work required to build and demonstrate this capability.

  18. Informatic infrastructure for Climatological and Oceanographic data based on THREDDS technology in a Grid environment

    NASA Astrophysics Data System (ADS)

    Tronconi, C.; Forneris, V.; Santoleri, R.

    2009-04-01

    CNR-ISAC-GOS is responsible for the Mediterranean Sea satellite operational system in the framework of the MOON Partnership. This Observing System acquires satellite data and produces Near Real Time, Delayed Time and Re-analysis Ocean Colour and Sea Surface Temperature products covering the Mediterranean and Black Seas and regional basins. In the framework of several projects (MERSEA, PRIMI, Adricosm Star, SeaDataNet, MyOcean, ECOOP), GOS is producing climatological/satellite datasets based on optimal interpolation and specific regional algorithms for chlorophyll, updated in Near Real Time and in delayed mode. GOS has built:
    • an informatic infrastructure for data repository and delivery based on THREDDS technology. The datasets are generated in NetCDF format, compliant with both the CF convention and the international satellite-oceanographic specifications, as prescribed by GHRSST (for SST). All data produced are made available to users through a THREDDS server catalog.
    • a LAS (Live Access Server), installed in order to exploit the potential of NetCDF data and the OPeNDAP URL. It provides flexible access to geo-referenced scientific data.
    • a Grid environment based on Globus Technologies (GT4) connecting more than one institute; in particular, exploiting the CNR and ESA clusters makes it possible to reprocess 12 years of chlorophyll data in less than one month (estimated processing time on a single-core PC: 9 months).
    In the poster we will give an overview of:
    • the features of the THREDDS catalogs, pointing out the powerful characteristics of this new middleware that has replaced the "old" OPENDAP server;
    • the importance of adopting a common format (such as NetCDF) for data exchange;
    • the tools (e.g. LAS) connected with THREDDS and the NetCDF format;
    • the Grid infrastructure at ISAC.
    We will also present specific basin-scale High Resolution products and Ultra High Resolution regional/coastal products available on these catalogs.
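
    For instance, a dataset published through such a THREDDS catalog can be opened directly over OPeNDAP, so that only the requested slices travel over the network (a sketch; the URL and variable name are placeholders, not real GOS endpoints):

        # Open a THREDDS-served dataset via its OPeNDAP URL with netCDF4-python.
        # The URL and variable name are placeholders.
        from netCDF4 import Dataset

        url = "http://example.org/thredds/dodsC/medsea/sst_l4.nc"
        with Dataset(url) as nc:
            sst = nc.variables["analysed_sst"]
            field = sst[0, :, :]   # fetch the first time step only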

  19. Moving from HDF4 to HDF5/netCDF-4

    NASA Technical Reports Server (NTRS)

    Pourmal, Elena; Yang, Kent; Lee, Joe

    2017-01-01

    In this presentation, we will go over the major differences between the two file formats and libraries, and will talk about the HDF5 features that users should consider when designing new products in HDF5/netCDF-4. We will also discuss the h4h5tools toolkit that can facilitate conversion of data in existing HDF4 files to HDF5 and netCDF-4, and we will engage the participants in a discussion of how The HDF Group can help with the transition and adoption of HDF5 and netCDF-4.

  20. CMGTooL user's manual

    USGS Publications Warehouse

    Xu, Jingping; Lightsom, Fran; Noble, Marlene A.; Denham, Charles

    2002-01-01

    During the past several years, the sediment transport group in the Coastal and Marine Geology Program (CMGP) of the U.S. Geological Survey has made major revisions to its methodology of processing, analyzing, and maintaining the variety of oceanographic time-series data. First, CMGP completed the transition of its oceanographic time-series database to a self-documenting NetCDF (Rew et al., 1997) data format. Second, CMGP’s oceanographic data variety and complexity have greatly expanded from traditional 2-dimensional, single-point time-series measurements (e.g., electromagnetic current meters, transmissometers) to more advanced 3-dimensional and profiling time-series measurements, owing to many new acquisitions of modern instruments such as the Acoustic Doppler Current Profiler (RDI, 1996), Acoustic Doppler Velocimeter, Pulse-Coherence Acoustic Doppler Profiler (SonTek, 2001), and Acoustic Backscatter Sensor (Aquatec). In order to accommodate the NetCDF format of data from the new instruments, a software package for processing, analyzing, and visualizing time-series oceanographic data was developed. It is named CMGTooL. The CMGTooL package contains two basic components: a user-friendly GUI for NetCDF file analysis, processing and manipulation; and a data-analysis program library. Most of the routines in the library are stand-alone programs suitable for batch processing. CMGTooL is written in the MATLAB computing language (The Mathworks, 1997); therefore users must have MATLAB installed on their computer in order to use this software package. In addition, MATLAB’s Signal Processing Toolbox is required by some CMGTooL routines. Like most MATLAB programs, all CMGTooL codes are compatible with different computing platforms including PC, Mac, and UNIX machines. (Note: CMGTooL has been tested on platforms that run MATLAB 5.2 (Release 10) or lower versions. Some of the commands related to the Mac may not be compatible with later releases of MATLAB.) The GUI and some of the library routines call low-level NetCDF file I/O, variable and attribute functions. These NetCDF-exclusive functions are supported by a MATLAB toolbox named NetCDF, created by Dr. Charles Denham. This toolbox has to be installed in order to use the CMGTooL GUI. The CMGTooL GUI calls several routines that were initially developed by others. The authors would like to acknowledge the following scientists for their ideas and codes: Dr. Rich Signell (USGS), Dr. Chris Sherwood (USGS), and Dr. Bob Beardsley (WHOI). Many terms that carry special meanings in either MATLAB or the NetCDF Toolbox are used in this manual. Users are encouraged to read the MATLAB and NetCDF documents for reference.

  1. NetCDF-U - Uncertainty conventions for netCDF datasets

    NASA Astrophysics Data System (ADS)

    Bigagli, Lorenzo; Nativi, Stefano; Domenico, Ben

    2013-04-01

    To facilitate the automated processing of uncertain data (e.g. uncertainty propagation in modeling applications), we have proposed a set of conventions for expressing uncertainty information within the netCDF data model and format: the NetCDF Uncertainty Conventions (NetCDF-U). From a theoretical perspective, it can be said that no dataset is a perfect representation of the reality it purports to represent. Inevitably, errors arise from the observation process, including the sensor system and subsequent processing, differences in scales of phenomena and the spatial support of the observation mechanism, and lack of knowledge about the detailed conversion between the measured quantity and the target variable. This means that, in principle, all data should be treated as uncertain. The most natural representation of an uncertain quantity is in terms of random variables, with a probabilistic approach. However, it must be acknowledged that almost all existing data resources are not treated in this way. Most datasets come simply as a series of values, often without any uncertainty information. If uncertainty information is present, then it is typically within the metadata, as a data quality element. This is typically a global (dataset-wide) representation of uncertainty, often derived through some form of validation process. Typically, it is a statistical measure of spread, for example the standard deviation of the residuals. The introduction of a mechanism by which such descriptions of uncertainty can be integrated into existing geospatial applications is considered a practical step towards a more accurate modeling of our uncertain understanding of any natural process. Given the generality and flexibility of the netCDF data model, conventions on naming, syntax, and semantics have been adopted by several communities of practice, as a means of improving data interoperability. Some of the existing conventions include provisions on uncertain elements and concepts, but, to our knowledge, no general convention on the encoding of uncertainty has been proposed to date. In particular, the netCDF Climate and Forecast Conventions (NetCDF-CF), a de facto standard for a large amount of data in Fluid Earth Sciences, mention the issue but provide only limited support for uncertainty representation. NetCDF-U is designed to be fully compatible with NetCDF-CF, where possible adopting the same mechanisms (e.g. using the same attribute names with compatible semantics). The rationale for this is that a probabilistic description of scientific quantities is a crosscutting aspect, which may be modularized (note that a netCDF dataset may be compliant with more than one convention). The scope of NetCDF-U is to extend and qualify the netCDF classic data model (also known as netCDF3), to capture the uncertainty related to geospatial information encoded in that format. In the future, a netCDF4 approach for uncertainty encoding will be investigated. The NetCDF-U Conventions have the following rationale:
    • compatibility with the netCDF-CF Conventions 1.5;
    • human readability of the structure of conforming datasets;
    • minimal difference between certain/agnostic and uncertain representations of data (e.g. with respect to dataset structure).
    NetCDF-U is based on a generic mechanism for annotating netCDF data variables with probability theory semantics. The Uncertainty Markup Language (UncertML) 2.0 is used as a controlled conceptual model and vocabulary for NetCDF-U annotations.
The proposed mechanism anticipates generalized support for semantic annotations in netCDF. NetCDF-U defines syntactical conventions for encoding samples, summary statistics, and distributions, along with mechanisms for expressing dependency relationships among variables. The conventions were accepted as an Open Geospatial Consortium (OGC) Discussion Paper (OGC 11-163); related discussions are conducted on a public forum hosted by the OGC. NetCDF-U may have implications for future work directed at communicating geospatial data provenance and uncertainty in contexts other than netCDF. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
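
    The flavor of such annotations can be sketched as follows; note that the attribute names below are illustrative only, and the normative NetCDF-U syntax is defined in OGC 11-163:

        # Illustrative sketch of annotating a variable with UncertML-style
        # uncertainty semantics. These attribute names are NOT the normative
        # NetCDF-U syntax; see OGC 11-163 for that.
        from netCDF4 import Dataset

        with Dataset("temperature_u.nc", "w") as nc:
            nc.createDimension("x", 100)
            mean = nc.createVariable("temperature", "f4", ("x",))
            sd = nc.createVariable("temperature_sd", "f4", ("x",))
            mean.ancillary_variables = "temperature_sd"   # link mean to its spread
            sd.ref = "http://www.uncertml.org/statistics/standard-deviation"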

  2. Damsel: A Data Model Storage Library for Exascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Liao, Wei-keng

    Computational science applications have been described as having one of seven motifs (the “seven dwarfs”), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Form (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to assist new users in learning to use the library.

  3. Development of a gridded meteorological dataset over Java island, Indonesia 1985-2014.

    PubMed

    Yanto; Livneh, Ben; Rajagopalan, Balaji

    2017-05-23

    We describe a gridded daily meteorology dataset consisting of precipitation, minimum and maximum temperature over Java Island, Indonesia at 0.125°×0.125° (~14 km) resolution spanning the 30 years from 1985-2014. Importantly, this data set represents a marked improvement over existing gridded data sets over Java, with higher spatial resolution and derived exclusively from ground-based observations, unlike existing satellite or reanalysis-based products. Gap-infilling and gridding were performed via the Inverse Distance Weighting (IDW) interpolation method (radius, r, of 25 km and power of influence, α, of 3 as optimal parameters), restricted to only those stations with at least 3,650 days (~10 years) of valid data. We employed the MSWEP and CHIRPS rainfall products in cross-validation, which shows that the gridded rainfall presented here performs most reasonably. Visual inspection reveals increasing performance of the gridded precipitation from grid to watershed to island scale. The data set, stored in Network Common Data Form (NetCDF), is intended to support watershed-scale and island-scale studies of short-term and long-term climate, hydrology and ecology.
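
    The interpolation step can be sketched in a few lines (a minimal IDW implementation using the parameters stated above; this is an illustration, not the authors' code):

        # Minimal inverse-distance-weighting sketch with the stated parameters
        # (search radius r = 25 km, power of influence alpha = 3).
        import numpy as np

        def idw(x, y, values, xi, yi, r=25.0, alpha=3.0):
            """Estimate the value at grid point (xi, yi) from station data."""
            d = np.hypot(x - xi, y - yi)        # station distances in km
            d = np.maximum(d, 1e-6)             # guard against zero distance
            use = (d <= r) & np.isfinite(values)
            if not use.any():
                return np.nan                   # no station within the radius
            w = 1.0 / d[use] ** alpha           # inverse-distance weights
            return np.sum(w * values[use]) / np.sum(w)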

  4. Atmospheric Radiation Measurement's Data Management Facility captures metadata and uses visualization tools to assist in routine data management.

    NASA Astrophysics Data System (ADS)

    Keck, N. N.; Macduff, M.; Martin, T.

    2017-12-01

    The Atmospheric Radiation Measurement (ARM) Data Management Facility (DMF) plays a critical support role in processing and curating data generated by the Department of Energy's ARM Program. Data are collected in near-real time from hundreds of observational instruments spread out all over the globe. Data are then ingested hourly to provide time-series data in NetCDF (Network Common Data Form) with standardized metadata. Based on automated processes and a variety of user reviews, the data may need to be reprocessed. Final data sets are then stored and accessed by users through the ARM Archive. Over the course of 20 years, a suite of data visualization tools has been developed to facilitate the operational processes that manage and maintain the more than 18,000 real-time events that move 1.3 TB of data each day through the various stages of the DMF's data system. This poster will present the resources and methodology used to capture metadata and the tools that assist in routine data management and discoverability.

  5. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF and GRIB), while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices, and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
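
    The core clipping idea, keeping only grid cells that fall inside a user-supplied geometry, can be sketched with Shapely, one of the libraries the project builds on (the grid, field, and area of interest below are made up):

        # Clip a gridded field to a vector area of interest with Shapely.
        # The grid, field, and polygon are made-up examples.
        import numpy as np
        from shapely.geometry import Point, Polygon

        aoi = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])            # area of interest
        lons, lats = np.meshgrid(np.arange(10.0), np.arange(10.0))
        data = np.random.rand(*lons.shape)                          # stand-in climate field
        inside = np.vectorize(lambda lo, la: aoi.contains(Point(lo, la)))(lons, lats)
        clipped = np.where(inside, data, np.nan)                    # NaN outside the AOI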

  6. netCDF Operators for Rapid Analysis of Measured and Modeled Swath-like Data

    NASA Astrophysics Data System (ADS)

    Zender, C. S.

    2015-12-01

    Swath-like data (hereafter SLD) are defined by non-rectangular and/or time-varying spatial grids in which one or more coordinates are multi-dimensional. It is often challenging and time-consuming to work with SLD, including all Level 2 satellite-retrieved data, non-rectangular subsets of Level 3 data, and model data on curvilinear grids. Researchers and data centers want user-friendly, fast, and powerful methods to specify, extract, serve, manipulate, and thus analyze, SLD. To meet these needs, large research-oriented agencies and modeling centers such as NASA, DOE, and NOAA increasingly employ the netCDF Operators (NCO), an open-source scientific data analysis software package applicable to netCDF and HDF data. NCO includes extensive, fast, parallelized regridding features to facilitate analysis and intercomparison of SLD and model data. Remote sensing, weather and climate modeling and analysis communities face similar problems in handling SLD, including how to easily:
    1. specify and mask irregular regions such as ocean basins and political boundaries in SLD (and rectangular) grids;
    2. bin, interpolate, average, or re-map SLD to regular grids;
    3. derive secondary data from given quality levels of SLD.
    These common tasks require a data extraction and analysis toolkit that is SLD-friendly and, like NCO, familiar in all these communities. With NCO, users can:
    1. quickly project SLD onto the most useful regular grids for intercomparison;
    2. access sophisticated statistical and regridding functions that are robust to missing data and allow easy specification of quality control metrics.
    These capabilities improve interoperability and software reuse and, because they apply to SLD, minimize transmission, storage, and handling of unwanted data. While SLD analysis still poses many challenges compared to regularly gridded, rectangular data, the custom analysis scripts SLD once required are now shorter, more powerful, and user-friendly.
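
    A simple instance of task 2 above, binning swath values onto a regular grid by cell averaging, can be sketched with NumPy (an illustration of the idea only, not NCO's regridding algorithm):

        # Bin swath-like data (2-D latitude/longitude coordinate arrays) onto
        # a regular grid by cell averaging. Illustrative only; not NCO's algorithm.
        import numpy as np

        def bin_swath(lat2d, lon2d, val2d, res=1.0):
            lat_edges = np.arange(-90.0, 90.0 + res, res)
            lon_edges = np.arange(-180.0, 180.0 + res, res)
            good = np.isfinite(val2d)                   # skip missing retrievals
            sums, _, _ = np.histogram2d(lat2d[good], lon2d[good],
                                        bins=[lat_edges, lon_edges],
                                        weights=val2d[good])
            counts, _, _ = np.histogram2d(lat2d[good], lon2d[good],
                                          bins=[lat_edges, lon_edges])
            with np.errstate(invalid="ignore"):
                return sums / counts                    # NaN where a cell got no samples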

  7. The Ocean Observatories Initiative: Data Acquisition Functions and Its Built-In Automated Python Modules

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Vardaro, M.; Crowley, M. F.; Glenn, S. M.; Schofield, O.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Fram, J. P.; Kerfoot, J.

    2016-02-01

    The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of oceanographic sensors. The Endurance Array in the Pacific Ocean consists of two separate lines off the coasts of Oregon and Washington. The Oregon line consists of 7 moorings, two cabled benthic experiment packages and 6 underwater gliders. The Washington line comprises 6 moorings and 6 gliders. Each mooring is outfitted with a variety of instrument packages. The raw data from these instruments are sent to shore via satellite communication and, in some cases, via fiber optic cable. Raw data are then sent to the cyberinfrastructure (CI) group at Rutgers, where they are aggregated, parsed into thousands of different data streams, and integrated into a software package called uFrame. The OOI CI delivers the data to the general public via a web interface that outputs data into commonly used scientific data file formats such as JSON, netCDF, and CSV. The Rutgers data management team has developed a series of command-line Python tools that streamline data acquisition in order to facilitate the QA/QC review process. The first step in the process is querying the uFrame database for a list of all available platforms. From this list, a user can choose a specific platform and automatically download all available datasets from it. Each downloaded dataset is plotted using a generalized Python netCDF plotting routine built on the matplotlib data visualization toolbox. This routine loads each netCDF file separately and outputs plots for each available parameter. These Python tools have been uploaded to a GitHub repository that is openly available to help facilitate OOI data access and visualization.
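
    The plotting pass described above might look roughly like this (a sketch; the actual OOI routines live in the team's GitHub repository, and the file name here is a placeholder):

        # Sketch of a generic netCDF plotting routine: one time-series figure
        # per time-dimensioned variable. The file name is a placeholder.
        import matplotlib.pyplot as plt
        from netCDF4 import Dataset

        with Dataset("deployment0001_ctdbp.nc") as nc:
            time = nc.variables["time"]
            for name, var in nc.variables.items():
                if var.dimensions == ("time",) and name != "time":
                    fig, ax = plt.subplots()
                    ax.plot(time[:], var[:], ".", markersize=2)
                    ax.set_xlabel(getattr(time, "units", "time"))
                    ax.set_ylabel(getattr(var, "units", ""))
                    ax.set_title(name)
                    fig.savefig(name + ".png")
                    plt.close(fig)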

  8. Workflow-Oriented Cyberinfrastructure for Sensor Data Analytics

    NASA Astrophysics Data System (ADS)

    Orcutt, J. A.; Rajasekar, A.; Moore, R. W.; Vernon, F.

    2015-12-01

    Sensor streams comprise an increasingly large part of Earth Science data. Analytics based on sensor data require an easy way to perform operations such as acquisition, conversion to physical units, metadata linking, sensor fusion, analysis and visualization on distributed sensor streams. Furthermore, embedding real-time sensor data into scientific workflows is of growing interest. We have implemented a scalable networked architecture that can be used to dynamically access packets of data in a stream from multiple sensors, and perform synthesis and analysis across a distributed network. Our system is based on the integrated Rule Oriented Data System (irods.org), which accesses sensor data from the Antelope Real Time Data System (brtt.com), and provides virtualized access to collections of data streams. We integrate real-time data streaming from different sources, collected for different purposes, on different time and spatial scales, and sensed by different methods. iRODS, noted for its policy-oriented data management, brings to sensor processing features and facilities such as single sign-on, third-party access control lists (ACLs), location transparency, logical resource naming, and server-side modeling capabilities, while reducing the burden on sensor network operators. Rich integrated metadata support also makes it straightforward to discover data streams of interest and maintain data provenance. The workflow support in iRODS readily integrates sensor processing into any analytical pipeline. The system is developed as part of the NSF-funded Datanet Federation Consortium (datafed.org). APIs for selecting, opening, reaping and closing sensor streams are provided, along with other helper functions to associate metadata and convert sensor packets into NetCDF and JSON formats. Near real-time sensor data including seismic sensors, environmental sensors, LIDAR and video streams are available through this interface. A system for archiving sensor data and metadata in NetCDF format has been implemented and will be demonstrated at AGU.

  9. Trade Study: Storing NASA HDF5/netCDF-4 Data in the Amazon Cloud and Retrieving Data via Hyrax Server / THREDDS Data Server

    NASA Technical Reports Server (NTRS)

    Habermann, Ted; Jelenak, Aleksander; Lee, Joe; Yang, Kent; Gallagher, James; Potter, Nathan

    2017-01-01

    As part of the overall effort to understand implications of migrating ESDIS data and services to the cloud we are testing several common OPeNDAP and HDF use cases against three architectures for general performance and cost characteristics. The architectures include retrieving entire files, retrieving datasets using HTTP range gets, and retrieving elements of datasets (chunks) with HTTP range gets. We will describe these architectures and discuss our approach to estimating cost.
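
    The range-get pattern behind the second and third architectures is plain HTTP; for example (the URL and byte offsets are placeholders):

        # Retrieve a single byte range of a remote object (e.g. one dataset
        # chunk) rather than the whole file. URL and offsets are placeholders.
        import requests

        resp = requests.get(
            "https://example-bucket.s3.amazonaws.com/granule.h5",
            headers={"Range": "bytes=4096-8191"},   # request 4 KiB of the object
            timeout=30,
        )
        assert resp.status_code == 206              # 206 = Partial Content
        chunk = resp.content                        # raw bytes for that range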

  10. A data delivery system for IMOS, the Australian Integrated Marine Observing System

    NASA Astrophysics Data System (ADS)

    Proctor, R.; Roberts, K.; Ward, B. J.

    2010-09-01

    The Integrated Marine Observing System (IMOS, www.imos.org.au), an AUD 150 million, 7-year project (2007-2013), is a distributed set of equipment and data-information services which, among many applications, collectively contribute to meeting the needs of marine climate research in Australia. The observing system provides data in the open oceans around Australia out to a few thousand kilometres as well as the coastal oceans, through 11 facilities which effectively observe and measure the 4-dimensional ocean variability and the physical and biological response of coastal and shelf seas around Australia. Through a national science rationale IMOS is organized as five regional nodes (Western Australia - WAIMOS, South Australia - SAIMOS, Tasmania - TASIMOS, New South Wales - NSWIMOS and Queensland - QIMOS) surrounded by an oceanic node (Blue Water and Climate). Operationally IMOS is organized as 11 facilities (Argo Australia, Ships of Opportunity, Southern Ocean Automated Time Series Observations, Australian National Facility for Ocean Gliders, Autonomous Underwater Vehicle Facility, Australian National Mooring Network, Australian Coastal Ocean Radar Network, Australian Acoustic Tagging and Monitoring System, Facility for Automated Intelligent Monitoring of Marine Systems, eMarine Information Infrastructure and Satellite Remote Sensing) delivering data. IMOS data are freely available to the public. The data, a combination of near real-time and delayed mode, are made available to researchers through the electronic Marine Information Infrastructure (eMII). eMII utilises the Australian Academic Research Network (AARNET) to support a distributed database on OPeNDAP/THREDDS servers hosted by regional computing centres. IMOS instruments are described through the OGC specification SensorML and wherever possible data are in CF-compliant netCDF format. Metadata, conforming to standard ISO 19115, are automatically harvested from the netCDF files and the metadata records catalogued in the OGC GeoNetwork Metadata Entry and Search Tool (MEST). Data discovery, access and download occur via web services through the IMOS Ocean Portal (http://imos.aodn.org.au), and tools for the display and integration of near real-time data are in development.

  11. The Weather Radar Toolkit, National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center's support of interoperability and the Global Earth Observation System of Systems (GEOSS)

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Del Greco, S.

    2006-12-01

    In February 2005, 61 countries around the world agreed on a 10-year plan to work towards building open systems for sharing geospatial data and services across different platforms worldwide. This system is known as the Global Earth Observation System of Systems (GEOSS). The objective of GEOSS focuses on easy access to environmental data and interoperability across different systems, allowing participating countries to measure the "pulse" of the planet in an effort to advance society. In support of GEOSS goals, NOAA's National Climatic Data Center (NCDC) has developed radar visualization and data exporter tools in an open systems environment. The NCDC Weather Radar Toolkit (WRT) loads Weather Surveillance Radar 1988 Doppler (WSR-88D) volume scan (S-band) data, known as Level-II, and derived products, known as Level-III, into an Open Geospatial Consortium (OGC) compliant environment. The application is written entirely in Java and will run on any Java-supported platform including Windows, Macintosh and Linux/Unix. The application is launched via Java Web Start and runs on the client machine while accessing these data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT allows the data to be manipulated to create custom mosaics, composites and precipitation estimates. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. By decoding the various radar formats into the NetCDF Common Data Model, the exported NetCDF data become interoperable with existing software packages including the THREDDS Data Server and the Integrated Data Viewer (IDV). The NCDC recently partnered with NOAA's National Severe Storms Lab (NSSL) to decode Sigmet C-band Doppler radar data, providing the NCDC Viewer/Data Exporter the functionality to read C-band data. This also supports a bilateral agreement between the United States and Canada for data sharing and supports interoperability between the US WSR-88D and Environment Canada radar networks. In addition, the NCDC partnered with the University of Oklahoma to develop decoders to read a test bed of distributed X-band radars that are funded through the Collaborative Adaptive Sensing of the Atmosphere (CASA) project. The NCDC is also archiving the National Mosaic and Next Generation QPE (Q2) products from NSSL, which provide products such as three-dimensional reflectivity, composite reflectivity and precipitation estimates at a 1 km resolution. These three sources of radar data are also supported in the WRT.

  12. A NetCDF version of the two-dimensional energy balance model based on the full multigrid algorithm

    NASA Astrophysics Data System (ADS)

    Zhuang, Kelin; North, Gerald R.; Stevens, Mark J.

    A NetCDF version of the two-dimensional energy balance model based on the full multigrid method in Fortran is introduced for both pedagogical and research purposes. Based on the land-sea-ice distribution, orbital elements, greenhouse gas concentrations, and albedo, the code calculates the global seasonal surface temperature. A step-by-step guide with examples is provided for practice.
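
    For reference, two-dimensional energy balance models in this lineage (following North and colleagues) solve an equation of the general form

        C(\hat{r}) \frac{\partial T}{\partial t} =
            \nabla \cdot \left( D(\hat{r}) \, \nabla T \right)
            + Q \, S(\hat{r}, t) \, a(\hat{r}) - \left( A + B \, T \right)

    where T is surface temperature, C an effective heat capacity, D a diffusion coefficient, Q one quarter of the solar constant, S the insolation distribution, a the co-albedo, and A + BT a linear parameterization of outgoing longwave radiation. This is the general form only; the exact formulation used by the code should be taken from its documentation.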

  13. Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF

    NASA Astrophysics Data System (ADS)

    Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.

    2014-12-01

    Climate observations and model simulations are generating vast amounts of climate data, and these data are accumulating at a rapid pace. Effectively managing and analyzing these data is essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention for dealing with the Big Data challenge. The maturity of the Infrastructure as a Service (IaaS) model of cloud computing further accelerates the adoption of Hadoop in solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system to enable parallel big data processing with MapReduce, as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing the data, converting it into other file formats, or loading it into HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.

  14. CMEMS (Copernicus Marine Environment Monitoring Service) In Situ Thematic Assembly Centre: A service for operational Oceanography

    NASA Astrophysics Data System (ADS)

    Manzano Muñoz, Fernando; Pouliquen, Sylvie; Petit de la Villeon, Loic; Carval, Thierry; Loubrieu, Thomas; Wedhe, Henning; Sjur Ringheim, Lid; Hammarklint, Thomas; Tamm, Susanne; De Alfonso, Marta; Perivoliotis, Leonidas; Chalkiopoulos, Antonis; Marinova, Veselka; Tintore, Joaquin; Troupin, Charles

    2016-04-01

    Copernicus, previously known as GMES (Global Monitoring for Environment and Security), is the European Programme for the establishment of a European capacity for Earth Observation and Monitoring. Copernicus aims to provide a sustainable service for Ocean Monitoring and Forecasting validated and commissioned by users. Since May 2015, the Copernicus Marine Environment Monitoring Service (CMEMS) has been working in an operational mode through a contract with service engagement (the result is regular data provision). Within CMEMS, the In Situ Thematic Assembly Centre (INSTAC) distributed service integrates in situ data from different sources for operational oceanography needs. CMEMS INSTAC collects, and carries out quality control in a homogeneous manner on, data from providers outside Copernicus (national and international networks), to fit the needs of internal and external users. CMEMS INSTAC is organized in 7 regional Dissemination Units (DUs) to rely on the EuroGOOS ROOSes. Each DU aggregates data and metadata provided by a series of Production Units (PUs) acting as interfaces for providers. Homogeneity and standardization are key features to ensure a coherent and efficient service. All DUs provide data in the OceanSITES NetCDF format 1.2 (based on NetCDF 3.6), which is CF-compliant, relies on SeaDataNet vocabularies and is able to handle profile and time-series measurements. All the products, both near real-time (NRT) and multi-year (REP), are available online to every registered CMEMS user through an FTP service. On top of the FTP service, INSTAC products are available through Oceanotron, an open-source data server dedicated to the dissemination of marine observations. It provides services such as aggregation on spatio-temporal coordinates and observed parameters, and subsetting on observed parameters and metadata. The accuracy of the data is checked at various levels. Quality control procedures are applied for the validity of the data, and correctness tests for the metadata of each NetCDF file. The quality control procedures for the data include different routines for NRT and REP products. Key Performance Indicators (KPIs) for monitoring purposes are also used in Copernicus. They allow periodic monitoring of the availability, quantity and quality of the INSTAC data integrated in the NRT products. Statistical reports are generated on a quarterly and yearly basis to provide more visibility on the coverage in space and time of the INSTAC NRT and REP products, as well as information on their quality. These reports are generated using Java and Python procedures developed within the INSTAC group. One of the most critical tasks for the DUs is to generate NetCDF files compliant with the agreed format. Many tools and programming libraries have been developed for that purpose, for instance the Unidata Java library. These tools provide NetCDF data management capabilities including creation, reading and modification. Some DUs have also developed regional data portals which offer useful information for users, including data charts, platform availability through interactive maps, KPIs and statistical figures, and direct access to the FTP service. The proposed presentation will detail the Copernicus in situ data service and the monitoring tools that have been developed by the INSTAC group.

  15. Development of a gridded meteorological dataset over Java island, Indonesia 1985–2014

    PubMed Central

    Yanto; Livneh, Ben; Rajagopalan, Balaji

    2017-01-01

    We describe a gridded daily meteorology dataset consisting of precipitation, minimum and maximum temperature over Java Island, Indonesia at 0.125°×0.125° (~14 km) resolution spanning the 30 years from 1985–2014. Importantly, this data set represents a marked improvement over existing gridded data sets over Java, with higher spatial resolution and derived exclusively from ground-based observations, unlike existing satellite or reanalysis-based products. Gap-infilling and gridding were performed via the Inverse Distance Weighting (IDW) interpolation method (radius, r, of 25 km and power of influence, α, of 3 as optimal parameters), restricted to only those stations with at least 3,650 days (~10 years) of valid data. We employed the MSWEP and CHIRPS rainfall products in cross-validation, which shows that the gridded rainfall presented here performs most reasonably. Visual inspection reveals increasing performance of the gridded precipitation from grid to watershed to island scale. The data set, stored in Network Common Data Form (NetCDF), is intended to support watershed-scale and island-scale studies of short-term and long-term climate, hydrology and ecology. PMID:28534871

  16. Ocean Tracking Network (OTN): Development of Oceanographic Data Integration with Animal Movement

    NASA Astrophysics Data System (ADS)

    Bajona, L.

    2016-02-01

    OTN is a $168-million ocean research and technology development platform headquartered at Dalhousie University, Canada. It uses acoustic and satellite telemetry to document, globally, the movements and survival of aquatic animals and their environmental correlates. The OTN mission is to foster conservation and sustainability of valued species by generating knowledge on the movement patterns of aquatic species in their changing environment. OTN's ever-expanding global network of acoustic receivers, listening for over 90 different key animal species, provides the data needed to develop, in collaboration with researchers, the integration of oceanographic data with animal movement. Presented here is the Data Management team's work to date, its status, and the challenges in OTN's move towards a community standard to enable sharing between projects nationally and internationally, permitting interoperability with other large national (e.g. CHONe, ArcticNET) and international (IOOS, IMOS) networks. This work includes co-development of the Animal Acoustic Telemetry (AAT) metadata standard and its implementation using an ERDDAP data server (NOAA Environmental Research Division's Data Access Program), facilitating ingestion for modelers (e.g., netCDF).

  17. Long-term oceanographic observations in Massachusetts Bay, 1989-2006

    USGS Publications Warehouse

    Butman, Bradford; Alexander, P. Soupy; Bothner, Michael H.; Borden, Jonathan; Casso, Michael A.; Gutierrez, Benjamin T.; Hastings, Mary E.; Lightsom, Frances L.; Martini, Marinna A.; Montgomery, Ellyn T.; Rendigs, Richard R.; Strahle, William S.

    2009-01-01

    This data report presents long-term oceanographic observations made in western Massachusetts Bay at long-term site A (LT-A) (42 deg 22.6' N., 70 deg 47.0' W.; nominal water depth 32 meters) from December 1989 through February 2006 and long-term site B (LT-B) (42 deg 9.8' N., 70 deg 38.4' W.; nominal water depth 22 meters) from October 1997 through February 2004 (fig. 1). The observations were collected as part of a U.S. Geological Survey (USGS) study designed to understand the transport and long-term fate of sediments and associated contaminants in Massachusetts Bay. The observations include time-series measurements of current, temperature, salinity, light transmission, pressure, oxygen, fluorescence, and sediment-trapping rate. About 160 separate mooring or tripod deployments were made on about 90 research cruises to collect these long-term observations. This report presents a description of the 16-year field program and the instrumentation used to make the measurements, an overview of the data set, more than 2,500 pages of statistics and plots that summarize the data, and the digital data in Network Common Data Form (NetCDF) format. This research was conducted by the USGS in cooperation with the Massachusetts Water Resources Authority and the U.S. Coast Guard.

  18. Data Access Services that Make Remote Sensing Data Easier to Use

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2010-01-01

    This slide presentation reviews some of the processes that NASA uses to make remote sensing data easy to use over the World Wide Web. This work involves much research into data formats, geolocation structures and quality indicators, often followed by coding a preprocessing program. Only then are the data usable within the analysis tool of choice. The Goddard Earth Sciences Data and Information Services Center is deploying a variety of data access services that are designed to dramatically shorten the time consumed in the data preparation step. On-the-fly conversion to the standard Network Common Data Form (netCDF) format with Climate-Forecast (CF) conventions imposes a standard coordinate system framework that makes data instantly readable through several tools, such as the Integrated Data Viewer, the Grid Analysis and Display System, Panoply and Ferret. A similar benefit is achieved by serving data through the Open-source Project for a Network Data Access Protocol (OPeNDAP), which also provides subsetting. The Data Quality Screening Service goes a step further in filtering out data points based on quality control flags, using science team recommendations or user-specified criteria. Further still is the Giovanni online analysis system, which goes beyond handling formatting and quality to provide visualization and basic statistics of the data. This general approach of automating the preparation steps has the important added benefit of enabling use of the data by non-human users (i.e., computer programs), which often make sub-optimal use of the available data due to the need to hard-code data preparation on the client side.
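
    The screening step amounts to masking values whose quality flag fails a chosen criterion, for example (variable names and the flag convention here are made up):

        # Mask data points whose quality flag exceeds a chosen threshold.
        # Variable names and the flag convention are made up.
        import numpy as np
        from netCDF4 import Dataset

        with Dataset("granule.nc") as nc:
            data = nc.variables["AirTemperature"][:]
            flags = nc.variables["AirTemperature_QC"][:]

        screened = np.ma.masked_where(flags > 1, data)   # keep only the best quality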

  19. Playing the Metadata Game: Technologies and Strategies Used by Climate Diagnostics Center for Cataloging and Distributing Climate Data.

    NASA Astrophysics Data System (ADS)

    Schweitzer, R. H.

    2001-05-01

    The Climate Diagnostics Center maintains a collection of gridded climate data primarily for use by local researchers. Because these data are available on fast digital storage and have been converted to netCDF using a standard metadata convention (called COARDS), we recognize that this data collection is also useful to the community at large. At CDC we try to use technology and metadata standards to reduce our costs associated with making these data available to the public. The World Wide Web has been an excellent technology platform for meeting that goal. Specifically, we have developed Web-based user interfaces that allow users to search, plot and download subsets from the data collection. We have also been exploring use of the Pacific Marine Environmental Laboratory's Live Access Server (LAS) as an engine for this task. This would result in further savings by allowing us to concentrate on customizing the LAS where needed, rather than developing and maintaining our own system. One such customization currently under development is the use of Java Servlets and JavaServer Pages in conjunction with a metadata database to produce a hierarchical user interface to LAS. In addition to these Web-based user interfaces, all of our data are available via the Distributed Oceanographic Data System (DODS). This allows other sites using LAS, and individuals using DODS-enabled clients, to use our data as if they were a local file. All of these technology systems are driven by metadata. When we began to create netCDF files, we collaborated with several other agencies to develop a netCDF convention (COARDS) for metadata. At CDC we have extended that convention to incorporate additional metadata elements to make the netCDF files as self-describing as possible. Part of the local metadata is a set of controlled names for the variable, level in the atmosphere and ocean, statistic and data set for each netCDF file. To allow searching and easy reorganization of these metadata, we loaded the metadata from the netCDF files into a MySQL database. The combination of the MySQL database and the controlled names makes it possible to automate the construction of user interfaces and standard-format metadata descriptions, like Federal Geographic Data Committee (FGDC) and Directory Interchange Format (DIF). These standard descriptions also include an association between our controlled names and standard keywords such as those developed by the Global Change Master Directory (GCMD). This talk will give an overview of each of these technologies and metadata standards as applied at the Climate Diagnostics Center. The talk will also discuss the pros and cons of each approach and areas for future development.
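
    The harvesting idea, reading each file's attributes once and loading them into a relational table from which interfaces can be generated, can be sketched as follows (SQLite stands in for MySQL here; the column and attribute names are illustrative):

        # Harvest netCDF metadata into a relational table so that user
        # interfaces and standard descriptions can be generated from queries.
        # SQLite stands in for MySQL; names are illustrative.
        import glob
        import sqlite3
        from netCDF4 import Dataset

        db = sqlite3.connect("metadata.db")
        db.execute("""CREATE TABLE IF NOT EXISTS files
                      (path TEXT, variable TEXT, units TEXT, title TEXT)""")
        for path in glob.glob("*.nc"):
            with Dataset(path) as nc:
                title = getattr(nc, "title", "")
                for name, var in nc.variables.items():
                    db.execute("INSERT INTO files VALUES (?, ?, ?, ?)",
                               (path, name, getattr(var, "units", ""), title))
        db.commit()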

  20. SchemaOnRead: A Package for Schema-on-Read in R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.

  1. Bit Grooming: statistically accurate precision-preserving quantization with compression, evaluated in the netCDF Operators (NCO, v4.4.8+)

    NASA Astrophysics Data System (ADS)

    Zender, Charles S.

    2016-09-01

    Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25-80 and 5-65 %, respectively, for single-precision values (the most common case for climate data) quantized to retain 1-5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1-2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing. Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
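
    The alternating shave/set idea can be sketched in a few lines of NumPy (a simplified illustration, not NCO's implementation; it handles float32 only and ignores special values such as NaN):

        # Sketch of Bit Grooming: alternately shave (zero) and set (one) the
        # least significant mantissa bits. Simplified; not NCO's implementation.
        import numpy as np

        def bit_groom(a, keep_bits):
            drop = 23 - keep_bits                 # IEEE 754 single: 23 mantissa bits
            bits = np.ascontiguousarray(a, dtype=np.float32).view(np.uint32).copy()
            flat = bits.ravel()
            shave = np.uint32(~((1 << drop) - 1) & 0xFFFFFFFF)
            setm = np.uint32((1 << drop) - 1)
            flat[0::2] &= shave                   # shave even-indexed values to zero
            flat[1::2] |= setm                    # set odd-indexed values to one
            return bits.view(np.float32)

    Keeping roughly ceil(3.32 × d) mantissa bits preserves d significant decimal digits; the actual storage saving then comes from running a lossless compressor such as DEFLATE over the groomed values.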

  2. Sharing electronic structure and crystallographic data with ETSF_IO

    NASA Astrophysics Data System (ADS)

    Caliste, D.; Pouillon, Y.; Verstraete, M. J.; Olevano, V.; Gonze, X.

    2008-11-01

    We present a library of routines whose main goal is to read and write exchangeable files (NetCDF file format) storing electronic structure and crystallographic information. It is based on the specification agreed inside the European Theoretical Spectroscopy Facility (ETSF). Accordingly, this library is nicknamed ETSF_IO. The purpose of this article is to give both an overview of the ETSF_IO library and a closer look at its usage. ETSF_IO is designed to be robust and easy to use, close to Fortran read and write routines. To facilitate its adoption, a complete documentation of the input and output arguments of the routines is available in the package, as well as six tutorials explaining in detail various possible uses of the library routines.
    Catalogue identifier: AEBG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Lesser General Public License
    No. of lines in distributed program, including test data, etc.: 63 156
    No. of bytes in distributed program, including test data, etc.: 363 390
    Distribution format: tar.gz
    Programming language: Fortran 95
    Computer: All systems with a Fortran 95 compiler
    Operating system: All systems with a Fortran 95 compiler
    Classification: 7.3, 8
    External routines: NetCDF, http://www.unidata.ucar.edu/software/netcdf
    Nature of problem: Store and exchange electronic structure and crystallographic data independently of the computational platform, language and generating software.
    Solution method: Implement a library based on both the NetCDF file format and an open specification (http://etsf.eu/index.php?page=standardization)

  3. Comparing NetCDF and SciDB on managing and querying 5D hydrologic dataset

    NASA Astrophysics Data System (ADS)

    Liu, Haicheng; Xiao, Xiao

    2016-11-01

    Efficiently extracting information from high-dimensional hydro-meteorological modelling datasets requires smart solutions. Traditional methods are mostly based on files, which can be edited and accessed handily but suffer from efficiency problems due to their contiguous storage structure. Databases have been proposed as an alternative for advantages such as native functionality for manipulating multidimensional (MD) arrays, smart caching strategies and scalability. In this research, NetCDF file-based solutions and the multidimensional array database management system (DBMS) SciDB, which applies a chunked storage structure, are benchmarked to determine the best solution for storing and querying a large 5D hydrologic modelling dataset. The effect of data storage configurations, including chunk size, dimension order and compression, on query performance is explored. Results indicate that the dimension order used to organize storage of the 5D data has a significant influence on query performance if the chunk size is very large, but the effect becomes insignificant when the chunk size is properly set. Compression in SciDB mostly has a negative influence on query performance. Caching is an advantage but may be influenced by the execution of other query processes. On the whole, the NetCDF solution without compression is in general more efficient than the SciDB DBMS.
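
    Chunked storage, the central configuration dimension benchmarked here, is exposed directly in the netCDF-4 API; for instance (the file, variable, and chunk sizes below are arbitrary examples):

        # Create a chunked 5-D variable; the chunk shape (and whether
        # compression is enabled) strongly affects query performance.
        # File, dimension, and chunk sizes are arbitrary examples.
        from netCDF4 import Dataset

        with Dataset("hydro5d.nc", "w", format="NETCDF4") as nc:
            for dim, size in [("run", 10), ("time", None), ("lev", 20),
                              ("lat", 100), ("lon", 100)]:
                nc.createDimension(dim, size)        # None makes "time" unlimited
            q = nc.createVariable("q", "f4",
                                  ("run", "time", "lev", "lat", "lon"),
                                  chunksizes=(1, 24, 5, 50, 50),
                                  zlib=False)         # compression off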

  4. US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access

    NASA Astrophysics Data System (ADS)

    Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.

    2012-04-01

    The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and OGC CSW 2.0.2. Currently, data services are being deployed for the US Dept. of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration are produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse and search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.

  5. SchemaOnRead: A Package for Schema-on-Read in R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.

  6. The UGRID Reader - A ParaView Plugin for the Visualization of Unstructured Climate Model Data in NetCDF Format

    NASA Astrophysics Data System (ADS)

    Brisc, Felicia; Vater, Stefan; Behrens, Joern

    2016-04-01

    We present the UGRID Reader, a visualization software component that implements the UGRID Conventions in ParaView. It currently supports the reading and visualization of 2D unstructured triangular, quadrilateral and mixed triangle/quadrilateral meshes, where the data can be defined per cell or per vertex. The Climate and Forecast Metadata Conventions (CF Conventions) have been established for many years as the standard framework for climate data written in NetCDF format. While they allow storing unstructured data simply as data defined at a series of points, they do not currently address the topology of the underlying unstructured mesh. However, it is often necessary to have additional mesh topology information, i.e. whether the data lie on a one-dimensional network, a 2D triangular mesh or a flexible mixed triangle/quadrilateral mesh, a 2D mesh with vertical layers, or a fully unstructured 3D mesh. The UGRID Conventions proposed by the UGRID Interoperability group aim to fill this void by extending the CF Conventions with topology specifications. As the UGRID Conventions are increasingly popular with an important subset of the CF community, they warrant the development of a customized tool for the visualization and exploration of UGRID-conforming data. The implementation of the UGRID Reader has been designed according to the ParaView plugin architecture. This approach allowed us to tap into the powerful reading and rendering capabilities of ParaView, while keeping the reader easy to install. We aim to support parallelism in order to process large data sets. Furthermore, our current application of the reader is the visualization of higher-order simulation output, which demands a special representation of the data within a cell.
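
    For readers unfamiliar with the convention, the sketch below writes a minimal UGRID-style 2D triangular mesh with netCDF4-python; file, variable names and coordinates are illustrative. The dimensionless "mesh" variable exists only to carry the topology metadata that the CF Conventions alone do not provide.

        from netCDF4 import Dataset

        with Dataset("mesh.nc", "w") as nc:
            nc.Conventions = "UGRID-1.0"
            nc.createDimension("node", 4)
            nc.createDimension("face", 2)
            nc.createDimension("max_face_nodes", 3)
            mesh = nc.createVariable("mesh", "i4")     # container for topology attributes
            mesh.cf_role = "mesh_topology"
            mesh.topology_dimension = 2
            mesh.node_coordinates = "node_x node_y"
            mesh.face_node_connectivity = "face_nodes"
            x = nc.createVariable("node_x", "f8", ("node",))
            y = nc.createVariable("node_y", "f8", ("node",))
            faces = nc.createVariable("face_nodes", "i4", ("face", "max_face_nodes"))
            faces.start_index = 0                      # zero-based connectivity
            x[:] = [0.0, 1.0, 1.0, 0.0]
            y[:] = [0.0, 0.0, 1.0, 1.0]
            faces[:] = [[0, 1, 2], [0, 2, 3]]          # two triangles sharing an edge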

  7. mizuRoute version 1: A river network routing tool for a continental domain water resources applications

    USGS Publications Warehouse

    Mizukami, Naoki; Clark, Martyn P.; Sampson, Kevin; Nijssen, Bart; Mao, Yixin; McMillan, Hilary; Viger, Roland; Markstrom, Steven; Hay, Lauren E.; Woods, Ross; Arnold, Jeffrey R.; Brekke, Levi D.

    2016-01-01

    This paper describes the first version of a stand-alone runoff routing tool, mizuRoute. The mizuRoute tool post-processes runoff outputs from any distributed hydrologic model or land surface model to produce spatially distributed streamflow at various spatial scales from headwater basins to continent-wide river systems. The tool can utilize both traditional grid-based river network and vector-based river network data. Both types of river network include river segment lines and the associated drainage basin polygons, but the vector-based river network can represent finer-scale river lines than the grid-based network. Streamflow estimates at any desired location in the river network can be easily extracted from the output of mizuRoute. The routing process is simulated as two separate steps. First, hillslope routing is performed with a gamma-distribution-based unit hydrograph to transport runoff from a hillslope to a catchment outlet. The second step is river channel routing, which is performed with one of two routing scheme options: (1) a kinematic wave tracking (KWT) routing procedure; and (2) an impulse response function – unit-hydrograph (IRF-UH) routing procedure. The mizuRoute tool also includes scripts (Python, NetCDF operators) to pre-process spatial river network data. This paper demonstrates mizuRoute's capabilities to produce spatially distributed streamflow simulations based on river networks from the United States Geological Survey (USGS) Geospatial Fabric (GF) data set, in which over 54 000 river segments and their contributing areas are mapped across the contiguous United States (CONUS). A brief analysis of model parameter sensitivity is also provided. The mizuRoute tool can assist model-based water resources assessments, including studies of the impacts of climate change on streamflow.
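
    The hillslope-routing step lends itself to a compact illustration. The Python sketch below (shape and timescale values are invented; mizuRoute itself is a Fortran tool that reads such parameters from its control files) builds a gamma-distribution unit hydrograph and convolves it with a runoff series:

        import numpy as np
        from scipy.stats import gamma

        dt = 3600.0                                  # time step [s]
        t = np.arange(1, 49) * dt                    # 48 hourly UH ordinates
        uh = gamma.pdf(t, a=2.5, scale=7200.0)       # gamma-shaped unit hydrograph
        uh /= uh.sum()                               # normalize to unit volume

        runoff = np.random.rand(240)                 # runoff per time step from a model
        outflow = np.convolve(runoff, uh)[:runoff.size]  # catchment-outlet response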

  8. Bit Grooming: Statistically accurate precision-preserving quantization with compression, evaluated in the netCDF operators (NCO, v4.4.8+)

    DOE PAGES

    Zender, Charles S.

    2016-09-19

    Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25–80 and 5–65 %, respectively, for single-precision values (the most common case for climate data) quantized to retain 1–5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1–2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing. Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
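
    The alternating shave/set quantization is simple to express in code. Below is a minimal Python sketch of the two-sided idea for float32 data; it is not the NCO implementation, which additionally handles missing values and other special cases.

        import numpy as np

        def bit_groom(values, keep_bits):
            """Zero (shave) / one (set) the 23-keep_bits least-significant
            mantissa bits of alternating float32 values."""
            v = np.ascontiguousarray(values, dtype=np.float32)
            tail = np.uint32((1 << (23 - keep_bits)) - 1)   # bits to quantize away
            bits = v.view(np.uint32)
            groomed = bits.copy()
            groomed[0::2] = bits[0::2] & ~tail              # even indices: shave to zero
            groomed[1::2] = bits[1::2] | tail               # odd indices: set to one
            out = groomed.view(np.float32)
            passthru = ~np.isfinite(v) | (v == 0.0)         # keep zeros, NaN, Inf intact
            out[passthru] = v[passthru]
            return out

        # Retaining 3 decimal digits needs roughly ceil(3 * log2(10)) = 10 mantissa bits.
        x = np.linspace(0.1, 1.0, 8, dtype=np.float32)
        print(bit_groom(x, keep_bits=10))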

  9. Investigating the feasibility of Visualising Complex Space Weather Data in a CAVE

    NASA Astrophysics Data System (ADS)

    Loughlin, S.; Habash Krause, L.

    2013-12-01

    The purpose of this study was to investigate the feasibility of visualising complex space weather data in a Cave Automatic Virtual Environment (CAVE). Space weather is increasingly causing disruptions on Earth, such as power outages and disrupted communication with satellites. We wanted to display space weather data within the CAVE since the data from instruments, models and simulations are typically too complex to understand on their own, especially when they have as many as 7 dimensions. To accomplish this, I created a VTK to NetCDF converter. NetCDF is a science data format which stores array-oriented scientific data. The format is maintained by the University Corporation for Atmospheric Research and is used extensively by the atmospheric and space communities.

  10. The International Satellite Cloud Climatology Project H-Series climate data record product

    NASA Astrophysics Data System (ADS)

    Young, Alisa H.; Knapp, Kenneth R.; Inamdar, Anand; Hankins, William; Rossow, William B.

    2018-03-01

    This paper describes the new global long-term International Satellite Cloud Climatology Project (ISCCP) H-series climate data record (CDR). The H-series data contain a suite of level 2 and 3 products for monitoring the distribution and variation of cloud and surface properties to better understand the effects of clouds on climate, the radiation budget, and the global hydrologic cycle. This product is currently available for public use and is derived from both geostationary and polar-orbiting satellite imaging radiometers with common visible and infrared (IR) channels. The H-series data currently span July 1983 to December 2009, with plans for continued production to extend the record to the present with regular updates. The H-series data are the longest combined geostationary and polar orbiter satellite-based CDR of cloud properties. Access to the data is provided in network common data form (netCDF) and archived by NOAA's National Centers for Environmental Information (NCEI) under the satellite Climate Data Record Program (https://doi.org/10.7289/V5QZ281S). The basic characteristics, history, and evolution of the dataset are presented herein, with particular emphasis on and discussion of the product changes between the H-series and the widely used predecessor D-series product, which also spans July 1983 through December 2009. Key refinements included in the ISCCP H-series CDR are based on improved quality control measures, modified ancillary inputs, higher spatial resolution input and output products, calibration refinements, and updated documentation and metadata to bring the H-series product into compliance with existing standards for climate data records.

  11. A polarimetric scattering database for non-spherical ice particles at microwave wavelengths

    NASA Astrophysics Data System (ADS)

    Lu, Yinghui; Jiang, Zhiyuan; Aydin, Kultegin; Verlinde, Johannes; Clothiaux, Eugene E.; Botta, Giovanni

    2016-10-01

    The atmospheric science community has entered a period in which electromagnetic scattering properties at microwave frequencies of realistically constructed ice particles are necessary for making progress on a number of fronts. One front includes retrieval of ice-particle properties and signatures from ground-based, airborne, and satellite-based radar and radiometer observations. Another front is evaluation of model microphysics by application of forward operators to their outputs and comparison to observations during case study periods. Yet a third front is data assimilation, where again forward operators are applied to databases of ice-particle scattering properties and the results compared to observations, with their differences leading to corrections of the model state. Over the past decade investigators have developed databases of ice-particle scattering properties at microwave frequencies and made them openly available. Motivated by and complementing these earlier efforts, a database containing polarimetric single-scattering properties of various types of ice particles at millimeter to centimeter wavelengths is presented. While the database presented here contains only single-scattering properties of ice particles in a fixed orientation, ice-particle scattering properties are computed for many different directions of the radiation incident on them. These results are useful for understanding the dependence of ice-particle scattering properties on ice-particle orientation with respect to the incident radiation. For ice particles that are small compared to the wavelength, the number of incident directions of the radiation is sufficient to compute reasonable estimates of their (randomly) orientation-averaged scattering properties. This database is complementary to earlier ones in that it contains complete (polarimetric) scattering property information for each ice particle - 44 plates, 30 columns, 405 branched planar crystals, 660 aggregates, and 640 conical graupel - and direction of incident radiation but is limited to four frequencies (X-, Ku-, Ka-, and W-bands), does not include temperature dependencies of the single-scattering properties, and does not include scattering properties averaged over randomly oriented ice particles. Rules for constructing the morphologies of ice particles from one database to the next often differ; consequently, analyses that incorporate all of the different databases will contain the most variability, while illuminating important differences between them. Publication of this database is in support of future analyses of this nature and comes with the hope that doing so helps contribute to the development of a database standard for ice-particle scattering properties, like the NetCDF (Network Common Data Form) CF (Climate and Forecast) or NetCDF CF/Radial metadata conventions.

  12. MK3TOOLS & NetCDF - storing VLBI data in a machine independent array oriented data format

    NASA Astrophysics Data System (ADS)

    Hobiger, T.; Koyama, Y.; Kondo, T.

    2007-07-01

    At the beginning of 2002 the International VLBI Service (IVS) agreed to introduce a Platform-Independent VLBI Exchange format (PIVEX), which would permit the exchange of observational data and stimulate research across different analysis groups. Unfortunately, PIVEX has never been implemented and many analysis software packages still depend on prior processing (e.g. ambiguity resolution and computation of ionosphere corrections) done by CALC/SOLVE. Thus MK3TOOLS, which handles MK3 databases without CALC/SOLVE being installed, has been developed. It uses the NetCDF format to store the data and, since interfaces exist for a variety of programming languages (FORTRAN, C/C++, Java, Perl, Python), it can be easily incorporated into existing and upcoming analysis software packages.

  13. Tools and strategies for instrument monitoring, data mining and data access

    NASA Astrophysics Data System (ADS)

    van Hees, R. M., ,, Dr

    2009-04-01

    The ever-growing size of data sets produced by various satellite instruments creates a challenge in data management. Three main tasks were identified: instrument performance monitoring, data mining by users and data deployment. In this presentation, I will discuss the three tasks and our solutions. As a practical example to illustrate the problem and make the discussion less abstract, I will use Sciamachy on board the ESA satellite Envisat. Since the launch of Envisat in March 2002, Sciamachy has performed nearly a billion science measurements as well as daily calibration measurements. The total size of the data set (not including reprocessed data) is over 30 TB, distributed over 150,000 files. [Instrument Monitoring] Most instruments produce house-keeping data, which may include time, geo-location, temperature of different parts of the instrument, and instrument settings and configuration. In addition, many instruments perform calibration measurements. Instrument performance monitoring requires automated analysis of critical parameters for events, and the option to inspect the behavior of various parameters over time off-line. We chose to extract the necessary information from the Sciamachy data products and store everything in one file, where we separated house-keeping data from calibration measurements. Due to the large volume and the need for quick random access, the Hierarchical Data Format (HDF5) was the obvious choice. The HDF5 format is self-describing and designed to organize different types of data in one file. For example, one data set may contain the meta data of the calibration measurements: time, geo-location, instrument settings, quality parameters (temperature of the instrument), while a second large data set contains the actual measurements. The HDF5 high-level packet table API is ideal for tables that only grow (by appending rows), while the HDF5 table API is better suited for tables where rows need to be updated, inserted or replaced. In particular, the packet table API allows very compact storage of compound data sets and very fast read/write access. Details about this implementation and its pitfalls will be given in the presentation. [Data Mining] The ability to select relevant data is a requirement that all data centers have to offer. The NL-SCIA-DC allows users to select data using several criteria including time, geo-location, type of observation and data quality. The results of a query are [i] the location and name of relevant data products (files), [ii] a listing of the meta data of the relevant measurements, or [iii] a listing of the measurements (level 2 or higher). For this application, we need the power of a relational database, the SQL language, and the availability of spatial functions. PostgreSQL, extended with PostGIS support, turned out to be a good choice. Common queries on tables with millions of rows can be executed within seconds. [Data Deployment] The dissemination of scientific data is often hampered by the use of many different formats to store the products. Therefore, time-consuming and inefficient conversions are needed to use data products of different origins. Within the Atmospheric Data Access for the Geospatial User Community (ADAGUC) project we provide selected space-borne atmospheric and land data sets in the same data format and consistent internal structure, so that users can easily use and combine data. The common format for storage is HDF5, but the netCDF-4 API is used to create the data sets. The standard for metadata and dataset attributes follows the netCDF Climate and Forecast conventions; in addition, metadata complying with the ISO 19115:2003 INSPIRE profile is added. The advantage of netCDF-4 is that the API is essentially equal to netCDF-3 (with a few extensions), while the data format is HDF5 (recognized by many scientific tools). The added metadata ensures product traceability. Details will be given in the presentation and several posters.
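
    As a hedged sketch of this deployment format (file and variable names are invented), the netCDF-4 API can write an HDF5-backed file carrying CF metadata like so:

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("no2_column.nc", "w", format="NETCDF4") as nc:   # HDF5 underneath
            nc.Conventions = "CF-1.6"
            nc.title = "Example atmospheric product"
            nc.createDimension("time", None)
            t = nc.createVariable("time", "f8", ("time",))
            t.units = "seconds since 2002-03-01 00:00:00"
            t.standard_name = "time"
            v = nc.createVariable("no2_column", "f4", ("time",), zlib=True)
            v.units = "mol m-2"
            v.long_name = "tropospheric NO2 column"
            t[:] = np.arange(10) * 60.0
            v[:] = np.zeros(10, dtype="f4")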

  14. Figure4

    EPA Pesticide Factsheets

    NetCDF files of PBL height (m), shortwave radiation, and 10 m wind speed from WRF, and ozone from CMAQ. The data are the standard deviation of these variables for each hour of the 4-day simulation. Figure 4 shows only one of the time periods: June 8, 2100 UTC. The NetCDF files have a time stamp (Times) that can be used to find this time in order to reproduce Figure 4. Also included is a data dictionary that describes the domain and all other attributes of the model simulation. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).

  15. eWaterCycle visualisation. combining the strength of NetCDF and Web Map Service: ncWMS

    NASA Astrophysics Data System (ADS)

    Hut, R.; van Meersbergen, M.; Drost, N.; Van De Giesen, N.

    2016-12-01

    As a result of the eWaterCycle global hydrological forecast we have created Cesium-ncWMS, a web application based on ncWMS and Cesium. ncWMS is a server-side application capable of reading any NetCDF file written using the Climate and Forecasting (CF) conventions and making the data available as a Web Map Service (WMS). ncWMS automatically determines the available variables in a file and creates maps colored according to the map data and a user-selected color scale. Cesium is a JavaScript 3D virtual globe library. It uses WebGL for rendering, which makes it very fast, and it is capable of displaying a wide variety of data types such as vectors, 3D models, and 2D maps. The forecast results are automatically uploaded to our web server running ncWMS. In turn, the web application can be used to change the settings for color maps and displayed data. The server uses the settings provided by the web application, together with the data in NetCDF, to provide WMS image tiles, time series data and legend graphics to the Cesium-ncWMS web application. The user can simultaneously zoom in to the very high resolution forecast results anywhere in the world, and get time series data for any point on the globe. The Cesium-ncWMS visualisation combines a global overview with locally relevant information in any browser. See the visualisation live at forecast.ewatercycle.org
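
    Because ncWMS speaks standard WMS, any HTTP client can request map tiles from it. A hedged Python example follows; the endpoint, layer, and style names are placeholders in the form ncWMS typically exposes:

        import requests

        params = {
            "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
            "LAYERS": "forecast/discharge",        # dataset/variable as served by ncWMS
            "STYLES": "default-scalar/rainbow",    # palette in ncWMS style syntax
            "CRS": "CRS:84", "BBOX": "-180,-90,180,90",
            "WIDTH": 1024, "HEIGHT": 512, "FORMAT": "image/png",
            "TIME": "2016-12-01T00:00:00Z",        # CF time slice to render
        }
        r = requests.get("https://example.org/ncWMS/wms", params=params)
        with open("discharge.png", "wb") as f:
            f.write(r.content)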

  16. SysSon - A Framework for Systematic Sonification Design

    NASA Astrophysics Data System (ADS)

    Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns

    2015-04-01

    SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used - e.g., in climate science. Both technical and socio-cultural barriers have to be overcome. The approach was developed together with climate scientists, who participated in contextual inquiries, usability tests and a collaborative design workshop. These extensive user tests informed our final software framework. As a frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.

  17. Autoplot: a Browser for Science Data on the Web

    NASA Astrophysics Data System (ADS)

    Faden, J.; Weigel, R. S.; West, E. E.; Merka, J.

    2008-12-01

    Autoplot (www.autoplot.org) is software for plotting data from many different sources and in many different file formats. Data from CDF, CEF, FITS, NetCDF, and OPeNDAP sources can be plotted, along with many others such as ASCII tables and Excel spreadsheets. This is done by adapting these various data formats and APIs into a common data model that borrows from the netCDF and CDF data models. Autoplot uses a web browser metaphor to simplify use. The user specifies a parameter URL, for example a CDF file accessible via http with a parameter name appended, and the file resource is downloaded and the parameter is rendered in a scientifically meaningful way. When data span multiple files, the user can use a file name template in the URL to aggregate (combine) a set of remote files. Thus the problem of aggregating data across file boundaries is handled on the client side, allowing simple web servers to be used. The das2 graphics library provides rich controls for exploring the data. Scripting is supported through Python, providing not just programmatic control but also the ability to calculate new parameters in a language that will look familiar to IDL and Matlab users. Autoplot is Java-based software and will run on most computers without a burdensome installation process. It can also be used as an applet or as a servlet that serves static images. Autoplot was developed as part of the Virtual Radiation Belt Observatory (ViRBO) project, and is also being used for the Virtual Magnetospheric Observatory (VMO). It is expected that this flexible, general-purpose plotting tool will be useful for allowing a data provider to add instant visualization capabilities to a directory of files, or for general use in the Virtual Observatory environment.

  18. EARLINET: potential operationality of a research network

    NASA Astrophysics Data System (ADS)

    Sicard, M.; D'Amico, G.; Comerón, A.; Mona, L.; Alados-Arboledas, L.; Amodeo, A.; Baars, H.; Baldasano, J. M.; Belegante, L.; Binietoglou, I.; Bravo-Aranda, J. A.; Fernández, A. J.; Fréville, P.; García-Vizcaíno, D.; Giunta, A.; Granados-Muñoz, M. J.; Guerrero-Rascado, J. L.; Hadjimitsis, D.; Haefele, A.; Hervo, M.; Iarlori, M.; Kokkalis, P.; Lange, D.; Mamouri, R. E.; Mattis, I.; Molero, F.; Montoux, N.; Muñoz, A.; Muñoz Porcar, C.; Navas-Guzmán, F.; Nicolae, D.; Nisantzi, A.; Papagiannopoulos, N.; Papayannis, A.; Pereira, S.; Preißler, J.; Pujadas, M.; Rizi, V.; Rocadenbosch, F.; Sellegri, K.; Simeonov, V.; Tsaknakis, G.; Wagner, F.; Pappalardo, G.

    2015-11-01

    In the framework of the ACTRIS (Aerosols, Clouds, and Trace Gases Research Infrastructure Network) summer 2012 measurement campaign (8 June-17 July 2012), EARLINET organized and performed a controlled exercise of feasibility to demonstrate its potential to perform operational, coordinated measurements and deliver products in near-real time. Eleven lidar stations participated in the exercise, which started on 9 July 2012 at 06:00 UT and ended 72 h later on 12 July at 06:00 UT. For the first time, the single calculus chain (SCC) - the common calculus chain developed within EARLINET for the automatic evaluation of lidar data from raw signals up to the final products - was used. All stations sent measurements of 1 h duration in real time to the SCC server in a predefined netcdf file format. The pre-processing of the data was performed in real time by the SCC, while the optical processing was performed in near-real time after the exercise ended. 98 and 79 % of the files sent to the SCC were successfully pre-processed and processed, respectively. Those percentages are quite high considering that no cloud screening was performed on the lidar data. The paper draws present and future SCC users' attention to the most critical parameters of the SCC product configuration and their optimal values, but also to the limitations inherent in the raw data. SCC direct and derived products obtained continuously under heterogeneous conditions are used to demonstrate two potential applications of the EARLINET infrastructure: the monitoring of a Saharan dust intrusion event and the evaluation of two dust transport models. The efforts made to define the measurements protocol and to configure the SCC properly pave the way for applying this protocol to specific applications such as the monitoring of special events, atmospheric modeling, climate research and calibration/validation activities of spaceborne observations.

  19. Woods Hole Image Processing System Software implementation; using NetCDF as a software interface for image processing

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    The Branch of Atlantic Marine Geology has been involved in the collection, processing and digital mosaicking of high, medium and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking has been accomplished with a dedicated, shore-based computer system. With the need to process sidescan data in the field with increased power and reduced cost of major workstations, a need to have an image processing package on a UNIX based computer system which could be utilized in the field as well as be more generally available to Branch personnel was identified. This report describes the initial development of that package referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.

  20. Scientific Data Storage for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Readey, J.

    2014-12-01

    Traditionally data storage used for geophysical software systems has centered on file-based systems and libraries such as NetCDF and HDF5. In contrast cloud based infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object based storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We'll also cover other advantages of this system such as enhanced metadata search.
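
    The object-plus-key-value split described above can be sketched with a few AWS calls (bucket and key names are hypothetical): each array chunk becomes an S3 object that is later fetched by key, the object-store analogue of a file-offset read.

        import boto3

        s3 = boto3.client("s3")

        # Store one encoded hyperslab of a variable as a standalone object.
        chunk = b"..."                              # bytes of an encoded array chunk
        s3.put_object(Bucket="my-geodata", Key="temperature/0.0.0", Body=chunk)

        # Retrieve it later by key; small metadata records would instead be
        # indexed in the complementary key-value database service.
        obj = s3.get_object(Bucket="my-geodata", Key="temperature/0.0.0")
        data = obj["Body"].read()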

  1. Which products are available for subsetting?

    Atmospheric Science Data Center

    2014-12-08

    ... users to create smaller files (subsets) of the original data by selecting desired parameters, parameter criterion, or latitude and ... fluxes, where the net flux is constrained to the global heat storage in netCDF format. Single Scanner Footprint TOA/Surface Fluxes ...

  2. CfRadial - CF NetCDF for Radar and Lidar Data in Polar Coordinates.

    NASA Astrophysics Data System (ADS)

    Dixon, M. J.; Lee, W. C.; Michelson, D.; Curtis, M.

    2016-12-01

    Since 1990, NCAR has supported over 20 different data formats for radar and lidar data in polar coordinates. Researchers, students and operational users spend unnecessary time handling a multitude of unique formats. CfRadial grew out of the need to simplify the use of these data and thereby to improve efficiency in research and operations. CfRadial adopts the well-known NetCDF framework, along with the Climate and Forecasting (CF) conventions such that data and metadata are accurately represented. Mobile platforms are also supported. The first major release, CfRadial version 1.1, occurred in February 2011, followed by minor updates. CfRadial has been adopted by NCAR as well as other agencies in the US and the UK. CfRadial development was boosted in 2015 through a two-year NSF EarthCube grant to improve CF in general. Version 1.4 was agreed upon in May 2016, adding explicit support for quality control fields and spectra. In Europe and Australia, EUMETNET OPERA's HDF5-based ODIM_H5 standard has been rapidly embraced as the modern standard for exchanging weather radar data for operations. ODIM_H5 exploits data groups, hierarchies, and built-in compression, characteristics that have been added to NetCDF4. A meeting of the WMO Task Team on Weather Radar Data Exchange (TT-WRDE) was held at NCAR in Boulder in July 2016, with a goal of identifying a single global standard for radar and lidar data in polar coordinates. CfRadial and ODIM_H5 were considered alongside the older and more rigid table-driven WMO BUFR and GRIB2 formats. TT-WRDE recommended that CfRadial 1.4 be merged with the sweep-oriented structure of ODIM_H5, making use of NetCDF groups, to produce a single format that will encompass the best ideas of both formats. That has led to the emergence of the CfRadial 2.0 standard. This format should meet the objectives of both the NSF EarthCube CF 2.0 initiative and the WMO TT-WRDE. It has the added benefit of improving data exchange between operational and research users, making operational data more readily available to researchers, and research algorithms more accessible to operational agencies.

  3. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer absorbing boundary condition. A hybrid-style programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance on systems ranging from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.

  4. Globally Gridded Satellite observations for climate studies

    USGS Publications Warehouse

    Knapp, K.R.; Ansari, S.; Bain, C.L.; Bourassa, M.A.; Dickinson, M.J.; Funk, Chris; Helms, C.N.; Hennon, C.C.; Holmes, C.D.; Huffman, G.J.; Kossin, J.P.; Lee, H.-T.; Loew, A.; Magnusdottir, G.

    2011-01-01

    Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them that no central archive of geostationary data for all international satellites exists, full temporal and spatial resolution data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multisatellite climate studies. The International Satellite Cloud Climatology Project (ISCCP) set the stage for overcoming these issues by archiving a subset of the full-resolution geostationary data at ~10-km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in Network Common Data Format (netCDF) using standards that permit a wide variety of tools and libraries to process the data quickly and easily. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.

  5. Sea ice in the Baltic Sea - revisiting BASIS ice, a historical data set covering the period 1960/1961-1978/1979

    NASA Astrophysics Data System (ADS)

    Löptien, U.; Dietze, H.

    2014-12-01

    The Baltic Sea is a seasonally ice-covered, marginal sea in central northern Europe. It is an essential waterway connecting highly industrialised countries. Because ship traffic is intermittently hindered by sea ice, the local weather services have been monitoring sea ice conditions for decades. In the present study we revisit a historical monitoring data set, covering the winters 1960/1961 to 1978/1979. This data set, dubbed Data Bank for Baltic Sea Ice and Sea Surface Temperatures (BASIS) ice, is based on hand-drawn maps that were collected and then digitised in 1981 in a joint project of the Finnish Institute of Marine Research (today the Finnish Meteorological Institute (FMI)) and the Swedish Meteorological and Hydrological Institute (SMHI). BASIS ice was designed for storage on punch cards and all ice information is encoded by five digits. This makes the data hard to access. Here we present a post-processed product based on the original five-digit code. Specifically, we convert to standard ice quantities (including information on ice types), which we distribute in the current and free Network Common Data Format (NetCDF). Our post-processed data set will help to assess numerical ice models and provide easy-to-access unique historical reference material for sea ice in the Baltic Sea. In addition we provide statistics showcasing the data quality. The website http://www.baltic-ocean.org hosts the post-processed data and the conversion code. The data are also archived at the Data Publisher for Earth & Environmental Science, PANGAEA (doi:10.1594/PANGAEA.832353).

  6. Acoustic Doppler Current Profiler Data Processing System manual [ADCP

    USGS Publications Warehouse

    Cote, Jessica M.; Hotchkiss, Frances S.; Martini, Marinna A.; Denham, Charles R.; revisions by Ramsey, Andree L.; Ruane, Stephen

    2000-01-01

    This open-file report describes the data processing software currently in use by the U.S. Geological Survey (USGS), Woods Hole Coastal and Marine Science Center (WHCMSC), to process time series of acoustic Doppler current data obtained by Teledyne RD Instruments Workhorse model ADCPs. The Sediment Transport Instrumentation Group (STG) at the WHCMSC has a long-standing commitment to providing scientists high quality oceanographic data published in a timely manner. To meet this commitment, STG has created this software to aid personnel in processing and reviewing data as well as evaluating hardware for signs of instrument malfunction. The output format for the data is network Common Data Form (netCDF), which meets USGS publication standards. Typically, ADCP data are recorded in beam coordinates. This conforms to the USGS philosophy of post-processing rather than internally processing data. By preserving the original data quality indicators as well as the initial data set, data can be evaluated and reprocessed for different types of analyses. Beam coordinate data are desirable for internal and surface wave experiments, for example. All the code in this software package is intended to run using the MATLAB program available from The Mathworks, Inc. As such, it is platform independent and can be adapted by the USGS and others for specialized experiments with non-standard requirements. The software is continuously being updated and revised as improvements are required. The most recent revision may be downloaded from: http://woodshole.er.usgs.gov/operations/stg/Pubs/ADCPtools/adcp_index.htm The USGS makes this software available at the user's discretion and responsibility.

  7. MWR3C physical retrievals of precipitable water vapor and cloud liquid water path

    DOE Data Explorer

    Cadeddu, Maria

    2016-10-12

    The data set contains physical retrievals of PWV and cloud LWP retrieved from MWR3C measurements during the MAGIC campaign. Additional data used in the retrieval process include radiosonde and ceilometer data. The retrieval is based on an optimal estimation technique that starts from a first guess and iteratively repeats the forward model calculations until a predefined convergence criterion is satisfied. The first guess is a vector of [PWV, LWP] from the neural network retrieval fields in the netcdf file. When convergence is achieved, the 'a posteriori' covariance is computed and its square root is expressed in the file as the retrieval 1-sigma uncertainty. The closest radiosonde profile is used for the radiative transfer calculations and ceilometer data are used to constrain the cloud base height. The RMS error between the brightness temperatures is computed at the last iteration as a consistency check and is written in the last column of the output file.
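
    For orientation, here is a toy Gauss-Newton optimal-estimation loop of the kind described above; this is an illustration, not the MWR3C production code, and F, jacobian, and the covariance matrices are user-supplied stand-ins.

        import numpy as np

        def retrieve(y, x_a, S_a, S_e, F, jacobian, tol=1e-4, max_iter=20):
            """Iterate from the first guess x_a (e.g. [PWV, LWP]) to convergence;
            returns the state and its a posteriori covariance S_hat, whose
            square-rooted diagonal is the 1-sigma retrieval uncertainty."""
            x = x_a.copy()
            S_a_inv, S_e_inv = np.linalg.inv(S_a), np.linalg.inv(S_e)
            for _ in range(max_iter):
                K = jacobian(x)                          # forward-model Jacobian at x
                S_hat = np.linalg.inv(S_a_inv + K.T @ S_e_inv @ K)
                x_new = x_a + S_hat @ K.T @ S_e_inv @ (y - F(x) + K @ (x - x_a))
                if np.max(np.abs(x_new - x)) < tol:      # predefined convergence criterion
                    return x_new, S_hat
                x = x_new
            return x, S_hat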

  8. Common Data Format (CDF) and Coordinated Data Analysis Web (CDAWeb)

    NASA Technical Reports Server (NTRS)

    Candey, Robert M.

    2010-01-01

    The Coordinated Data Analysis Web (CDAWeb) data browsing system provides plotting, listing and open access via FTP, HTTP, and web services (REST, SOAP, OPeNDAP) for data from most NASA Heliophysics missions and is heavily used by the community. Combining data from many instruments and missions enables broad research analysis and correlation and coordination with other experiments and missions. Crucial to its effectiveness is the use of a standard self-describing data format, in this case the Common Data Format (CDF), also developed at the Space Physics Data Facility, and the use of metadata standards (easily edited with SKTeditor). CDAWeb is based on a set of IDL routines, CDAWlib. The CDF project also maintains software and services for translating between many standard formats (CDF, netCDF, HDF, FITS, XML).

  9. Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Arms, S. C.

    2015-12-01

    Field data obtained from dataloggers often take the form of comma separated value (CSV) ASCII text files. While ASCII based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger generated ASCII output into Climate and Forecast (CF) compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built-in. One benefit of translating ASCII data into a machine readable format that follows open community-driven standards is that they are instantly able to take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
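
    The translation Rosetta performs can be suggested with a small netCDF4-python sketch; the file, variable names, and values are invented, and a full CF discrete-sampling-geometry file carries more metadata (e.g. a station identifier with cf_role="timeseries_id") than this minimal example.

        import numpy as np
        from netCDF4 import Dataset

        times = np.array([0.0, 3600.0, 7200.0])           # parsed from the CSV
        temps = np.array([21.3, 21.9, 22.4], dtype="f4")

        with Dataset("station.nc", "w") as nc:
            nc.Conventions = "CF-1.6"
            nc.featureType = "timeSeries"                 # discrete sampling geometry
            nc.createDimension("obs", times.size)
            t = nc.createVariable("time", "f8", ("obs",))
            t.units = "seconds since 2015-01-01 00:00:00"
            t.standard_name = "time"
            v = nc.createVariable("air_temperature", "f4", ("obs",))
            v.units = "degree_Celsius"
            v.standard_name = "air_temperature"
            v.coordinates = "time"
            t[:], v[:] = times, temps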

  10. Improving the Accessibility and Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from diverse end-user communities for geospatial tools that handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: (1) store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; (2) use NetCDF, GRIB, and HDF raster data formats directly across applications; and (3) publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.

  11. The Comparison of Point Data Models for the Output of WRF Hydro Model in the IDV

    NASA Astrophysics Data System (ADS)

    Ho, Y.; Weber, J.

    2017-12-01

    WRF Hydro netCDF output files contain streamflow, flow depth, longitude, latitude, altitude and stream order values for each forecast point. However, the data are not CF compliant. The total number of forecast points for the US CONUS is approximately 2.7 million, which poses a big challenge for any visualization and analysis tool. The IDV point cloud display shows point data as a set of points colored by parameter. This display is very efficient compared to a standard point type display for rendering large numbers of points. The one problem we have is that data I/O can become a bottleneck when dealing with a large collection of point input files. In this presentation, we will experiment with different point data models and their APIs to access the same WRF Hydro model output. The results will help us construct a CF-compliant netCDF point data format for the community.

  12. An open source Java web application to build self-contained Web GIS sites

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.

    2014-12-01

    This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. This project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows the display of 4D data stored in the NetCDF file format (widely used for storing environmental model datasets). Some of the features available on sites built with OWGIS are: multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS's main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.

  13. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows the parallel and fast calibration for several metabolites simultaneously. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for the installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant

  14. wsacrvpthrc.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-12-14

    The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  15. wsacrppivh.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-07-22

    The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  16. wsacrzrhiv.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-07-22

    The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  17. kasacrvpthrc.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-07-22

    The kasacr PCM process executed by the sacr3 binary reads in kasacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  18. The IAGOS information system

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie

    2015-04-01

    IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data since January 2015; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats, graphical tools (maps, scatter plots, etc.), standardized metadata (ISO 19115) and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will allow model outputs to be combined with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented through numerous web services, will improve the functionality of the web interfaces of each data centre.

  20. EARLINET: potential operationality of a research network

    NASA Astrophysics Data System (ADS)

    Sicard, M.; D'Amico, G.; Comerón, A.; Mona, L.; Alados-Arboledas, L.; Amodeo, A.; Baars, H.; Belegante, L.; Binietoglou, I.; Bravo-Aranda, J. A.; Fernández, A. J.; Fréville, P.; García-Vizcaíno, D.; Giunta, A.; Granados-Muñoz, M. J.; Guerrero-Rascado, J. L.; Hadjimitsis, D.; Haefele, A.; Hervo, M.; Iarlori, M.; Kokkalis, P.; Lange, D.; Mamouri, R. E.; Mattis, I.; Molero, F.; Montoux, N.; Muñoz, A.; Muñoz Porcar, C.; Navas-Guzmán, F.; Nicolae, D.; Nisantzi, A.; Papagiannopoulos, N.; Papayannis, A.; Pereira, S.; Preißler, J.; Pujadas, M.; Rizi, V.; Rocadenbosch, F.; Sellegri, K.; Simeonov, V.; Tsaknakis, G.; Wagner, F.; Pappalardo, G.

    2015-07-01

    In the framework of the ACTRIS summer 2012 measurement campaign (8 June-17 July 2012), EARLINET organized and performed a controlled exercise of feasibility to demonstrate its potential to perform operational, coordinated measurements and deliver products in near-real time. Eleven lidar stations participated in the exercise, which started on 9 July 2012 at 06:00 UT and ended 72 h later on 12 July at 06:00 UT. For the first time the Single-Calculus Chain (SCC), the common calculus chain developed within EARLINET for the automatic evaluation of lidar data from raw signals up to the final products, was used. All stations sent measurements of 1 h duration in real time to the SCC server in a predefined netcdf file format. The pre-processing of the data was performed in real time by the SCC, while the optical processing was performed in near-real time after the exercise ended. 98 and 84 % of the files sent to the SCC were successfully pre-processed and processed, respectively. Those percentages are quite high considering that no cloud screening was performed on the lidar data. The paper shows time series of continuously and homogeneously obtained products retrieved at different levels of the SCC: range-square-corrected signals (pre-processing) and daytime backscatter and nighttime extinction coefficient profiles (optical processing), as well as combined plots of all direct and derived optical products. The derived products include backscatter- and extinction-related Ångström exponents, lidar ratios and color ratios. The combined plots prove extremely valuable for aerosol classification. The efforts made to define the measurements protocol and to configure the SCC properly pave the way for applying this protocol to specific applications such as the monitoring of special events, atmospheric modelling, climate research and calibration/validation activities of spaceborne observations.

  1. Sea ice in the Baltic Sea - revisiting BASIS ice, a historical data set covering the period 1960/1961-1978/1979

    NASA Astrophysics Data System (ADS)

    Löptien, U.; Dietze, H.

    2014-06-01

    The Baltic Sea is a seasonally ice-covered, marginal sea situated in central northern Europe. It is an essential waterway connecting highly industrialised countries. Because ship traffic is intermittently hindered by sea ice, the local weather services have been monitoring sea ice conditions for decades. In the present study we revisit a historical monitoring data set covering the winters 1960/1961 to 1978/1979. This data set, dubbed Data Bank for Baltic Sea Ice and Sea Surface Temperatures (BASIS) ice, is based on hand-drawn maps that were collected and then digitised in 1981 in a joint project of the Finnish Institute of Marine Research (today the Finnish Meteorological Institute (FMI)) and the Swedish Meteorological and Hydrological Institute (SMHI). BASIS ice was designed for storage on punch cards and all ice information is encoded by five digits. This makes the data hard to access. Here we present a post-processed product based on the original five-digit code. Specifically, we convert to standard ice quantities (including information on ice types), which we distribute in the current and free Network Common Data Format (NetCDF). Our post-processed data set will help to assess numerical ice models and provide easy-to-access unique historical reference material for sea ice in the Baltic Sea. In addition we provide statistics showcasing the data quality. The website www.baltic-ocean.org hosts the post-processed data and the conversion code. The data are also archived at the Data Publisher for Earth & Environmental Science, PANGAEA (doi:10.1594/PANGAEA.832353).

  2. Data Container Study for Handling array-based data using Hive, Spark, MongoDB, SciDB and Rasdaman

    NASA Astrophysics Data System (ADS)

    Xu, M.; Hu, F.; Yang, J.; Yu, M.; Yang, C. P.

    2017-12-01

    Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions for supporting big Earth observation data, we propose to investigate and compare five popular data container solutions: Rasdaman, Hive, Spark, SciDB and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e. dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these data containers in terms of data discovery and access. The computing resources (e.g. CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that: 1) the popular data container clusters are able to handle large volumes of data, but their performance varies in different situations; meanwhile, there is a trade-off between data preprocessing, disk saving, query-time saving, and resource consumption; 2) ClimateSpark, MongoDB and SciDB perform the best among all the containers in all the query tests, and Hive performs the worst; 3) the studied data containers can be applied to other array-based datasets, such as high resolution remote sensing data and model simulation data; 4) Rasdaman's clustering configuration is more complex than the others. A comprehensive report will detail the experimental results and compare their pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.

  3. Operable Data Management for Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chavez, F. P.; Graybeal, J. B.; Godin, M. A.

    2004-12-01

    As oceanographic observing systems become more numerous and complex, data management solutions must follow. Most existing oceanographic data management systems fall into one of three categories: they have been developed as dedicated solutions, with limited application to other observing systems; they expect that data will be pre-processed into well-defined formats, such as netCDF; or they are conceived as robust, generic data management solutions, with complexity (high) and maturity and adoption rates (low) to match. Each approach has strengths and weaknesses; no approach yet fully addresses, nor takes advantage of, the sophistication of ocean observing systems as they are now conceived. In this presentation we describe critical data management requirements for advanced ocean observing systems of the type envisioned by ORION and IOOS. By defining common requirements -- functional, qualitative, and programmatic -- for all such ocean observing systems, the performance and nature of the general data management solution can be characterized. Issues such as scalability, maintaining metadata relationships, data access security, visualization, and operational flexibility suggest baseline architectural characteristics, which may in turn lead to reusable components and approaches. Interoperability with other data management systems, with standards-based solutions in metadata specification and data transport protocols, and with the data management infrastructure envisioned by IOOS and ORION can also be used to define necessary capabilities. Finally, some requirements for the software infrastructure of ocean observing systems can be inferred. Early operational results and lessons learned from the development and operation of MBARI ocean observing systems are used to illustrate key requirements, choices, and challenges. Reference systems include the Monterey Ocean Observing System (MOOS), its component software systems (Software Infrastructure and Applications for MOOS, and the Shore Side Data System), and the Autonomous Ocean Sampling Network (AOSN).

  4. FLASH_SSF_Aqua-FM3-MODIS_Version3C

    Atmospheric Science Data Center

    2018-04-04

    Order data: CERES Order Tool (netCDF); subset data: CERES Search and Subset Tool (HDF4 & netCDF). Parameters include Cloud Layer Area, Cloud Infrared Emissivity, Cloud Base Pressure, Surface (Radiative) Flux, TOA Flux, Surface Types, TOT Filtered Radiance, SW Filtered Radiance, and LW Flux. Data ordering and guide documents are available through Earthdata Search.

  5. FLASH_SSF_Terra-FM1-MODIS_Version3C

    Atmospheric Science Data Center

    2018-04-04

    Order data: CERES Order Tool (netCDF); subset data: CERES Search and Subset Tool (HDF4 & netCDF). Parameters include Cloud Layer Area, Cloud Infrared Emissivity, Cloud Base Pressure, Surface (Radiative) Flux, TOA Flux, Surface Types, TOT Filtered Radiance, SW Filtered Radiance, and LW Flux. Data ordering and guide documents are available through Earthdata Search.

  6. Trade Study: Storing NASA HDF5/netCDF-4 Data in the Amazon Cloud and Retrieving Data via the Hyrax Data Server

    NASA Technical Reports Server (NTRS)

    Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent

    2017-01-01

    This study explored three candidate architectures, with different types of objects and access paths, for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF4 data in a cloud (web object store) environment, with the target environment being the Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input into a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We describe the three architectures and the use cases, along with performance results and recommendations for further work.
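    From the client side, the appeal of such an architecture is that a Hyrax endpoint looks like any other OPeNDAP dataset, so only the requested hyperslab crosses the network. A sketch, with a hypothetical URL and variable name (the netCDF4 library must be built with DAP support):

        from netCDF4 import Dataset

        url = "https://hyrax.example.org/opendap/granules/sample.nc4"
        with Dataset(url) as ds:                # netCDF4 speaks DAP given a URL
            t2m = ds.variables["T2M"]           # hypothetical variable name
            tile = t2m[0, 100:110, 200:210]     # only this hyperslab is moved
            print(tile.mean())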

  7. 'Best' Practices for Aggregating Subset Results from Archived Datasets

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Perez, J.

    2013-12-01

    In response to the exponential growth in science data analysis and visualization capabilities, data centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Science Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files, as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed: - Applications of NetCDF CF conventions to aggregated level 2 satellite subsets - Data-provider-specific format requirements vs. generalized standards - Organization of the file structure of aggregated NetCDF subset output - Global attributes of individual subsetted files vs. aggregated results - Specific applications and framework used for subsetting and delivering derivative data files
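    The core of the aggregation step can be expressed compactly with xarray; the sketch below, with illustrative file names and attributes, concatenates per-granule subset files along time and stamps the result with CF-style global attributes:

        import xarray as xr

        granules = ["subset_001.nc", "subset_002.nc", "subset_003.nc"]
        parts = [xr.open_dataset(p) for p in granules]
        agg = xr.concat(parts, dim="time")          # aggregate along time

        agg.attrs["Conventions"] = "CF-1.6"
        agg.attrs["history"] = "aggregated from 3 subset granules"
        agg.to_netcdf("aggregated_subset.nc")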

  8. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial (mainly NetCDF and PostGIS) datasets as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, implementing the interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as an open API for browser-based client software. This part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part representing the Web GIS client, developed according to the "single page application" approach on the basis of the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to the interfaces of such popular desktop GIS applications as uDIG, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as a basis for Web GIS client development. In accordance with general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and displays map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers: • a tier of NetCDF metadata in JSON format; • a middleware tier of JavaScript objects implementing methods to work with NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services; • a graphical user interface tier of JavaScript objects realizing the general application business logic. The Web GIS provides launching of computational processing services to support tasks in the area of environmental monitoring, as well as presentation of calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and binary (NetCDF) formats. It has shown its effectiveness in the process of solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No. 16-19-10257.
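    The client-server exchanges rest on plain OGC requests; a hedged example of the kind of WMS GetMap call such a client issues (the endpoint and layer name are placeholders):

        import requests

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "climate:air_temperature",   # hypothetical layer
            "bbox": "60,50,120,80", "srs": "EPSG:4326",
            "width": 512, "height": 256, "format": "image/png",
        }
        resp = requests.get("https://sdi.example.org/geoserver/wms", params=params)
        with open("overlay.png", "wb") as f:
            f.write(resp.content)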

  9. The GEON Integrated Data Viewer (IDV) and IRIS DMC Services Illustrate CyberInfrastructure Support for Seismic Data Visualization and Interpretation

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Wier, S.; Ahern, T.; Casey, R.; Weertman, B.; Laughbon, C.

    2008-12-01

    UNAVCO and the IRIS DMC are data service partners for seismic visualization, particularly for hypocentral data and tomography. UNAVCO provides the GEON Integrated Data Viewer (IDV), an extension of the Unidata IDV: a free, interactive, research-level display and analysis tool for data in 3D (latitude, longitude, depth) and 4D (with time), located on or inside the Earth. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data in the context of new remote and shared data sources. The GEON IDV supports data access from sources using HTTP and FTP servers, OPeNDAP servers, THREDDS catalogs, RSS feeds, and WMS (web map) servers. The IRIS DMC (Data Management Center) has developed web services providing earthquake hypocentral data and seismic tomography model grids. These services can be called by the GEON IDV to access data at IRIS without copying files. The IRIS Earthquake Browser (IEB) is a web-based query tool for hypocentral data. The IEB combines the DMC's large database of more than 1,900,000 earthquakes with the Google Maps web interface. With the IEB you can quickly find earthquakes in any region of the globe and then import this information into the GEON Integrated Data Viewer, where the hypocenters may be visualized. You can select earthquakes by location region, time, depth, and magnitude. The IEB gives the IDV a URL to the selected data. The IDV then shows the data as maps or 3D displays, with interactive control of vertical scale, area, and map projection, and with symbol size and color controlled by magnitude or depth. The IDV can show progressive time animation of, for example, aftershocks filling a source region. The IRIS Tomoserver converts seismic tomography model output grids to NetCDF for use in the IDV. The Tomoserver accepts a tomographic model file as input from a user and provides an equivalent NetCDF file as output. The service supports the NA04, S3D, A1D and CUB input file formats, contributed by their respective creators. The NetCDF file is saved to a location that can be referenced with a URL on an IRIS server. The URL for the NetCDF file is provided to the user. The user can download the data from IRIS, or copy the URL directly into the IDV for interpretation, and the IDV will access the data at IRIS. The Tomoserver conversion software was developed by Instrumental Software Technologies, Inc. Use cases with the GEON IDV and IRIS DMC data services will be shown.
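    The essence of the Tomoserver conversion is writing the model grid into a self-describing NetCDF layout that the IDV can ingest. A sketch with assumed dimension and variable names (not the service's actual schema):

        import numpy as np
        from netCDF4 import Dataset

        lats = np.arange(-90.0, 91.0, 2.0)
        lons = np.arange(-180.0, 180.0, 2.0)
        depths = np.arange(0.0, 700.0, 50.0)                      # km
        dvs = np.random.randn(len(depths), len(lats), len(lons))  # toy anomalies

        with Dataset("tomo_model.nc", "w") as nc:
            for name, vals in [("depth", depths), ("lat", lats), ("lon", lons)]:
                nc.createDimension(name, len(vals))
                nc.createVariable(name, "f4", (name,))[:] = vals
            v = nc.createVariable("dvs", "f4", ("depth", "lat", "lon"))
            v.units = "percent"
            v[:] = dvs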

  10. LASP Time Series Server (LaTiS): Overcoming Data Access Barriers via a Common Data Model in the Middle Tier (Invited)

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Wilson, A.

    2010-12-01

    The Laboratory for Atmospheric and Space Physics at the University of Colorado has developed an Open Source, OPeNDAP compliant, Java Servlet based, RESTful web service to serve time series data. In addition to handling OPeNDAP style requests and returning standard responses, existing modules for alternate output formats can be reused or customized. It is also simple to reuse or customize modules to directly read various native data sources and even to perform some processing on the server. The server is built around a common data model based on the Unidata Common Data Model (CDM) which merges the NetCDF, HDF, and OPeNDAP data models. The server framework features a modular architecture that supports pluggable Readers, Writers, and Filters via the common interface to the data, enabling a workflow that reads data from their native form, performs some processing on the server, and presents the results to the client in its preferred form. The service is currently being used operationally to serve time series data for the LASP Interactive Solar Irradiance Data Center (LISIRD, http://lasp.colorado.edu/lisird/) and as part of the Time Series Data Server (TSDS, http://tsds.net/). I will present the data model and how it enables reading, writing, and processing concerns to be separated into loosely coupled components. I will also share thoughts for evolving beyond the time series abstraction and providing a general purpose data service that can be orchestrated into larger workflows.
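    In an OPeNDAP-style service like this, the output format is chosen by a URL suffix and subsetting is expressed as query constraints. A sketch of a client call against a hypothetical LaTiS dataset with assumed parameter names:

        import requests

        base = "https://lasp.example.edu/latis/tsi_timeseries"
        # suffix selects the Writer; constraints select variables and a time range
        url = base + ".csv?time,irradiance&time>=2010-01-01&time<2010-02-01"
        for line in requests.get(url).text.splitlines()[:5]:
            print(line)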

  11. Climate Prediction Center - NCEP Global Ocean Data Assimilation System:

    Science.gov Websites

    Climate Prediction Center page for the NCEP Global Ocean Data Assimilation System (GODAS). Monthly products are available in NetCDF and other formats; related links include the NOAA Ocean Climate Observation Program (OCO) and the Climate Test Bed. GODAS products from the National Centers for Environmental Prediction (NCEP) are a valuable community asset for monitoring different aspects of ocean climate.

  12. MyOcean Internal Information System (Dial-P)

    NASA Astrophysics Data System (ADS)

    Blanc, Frederique; Jolibois, Tony; Loubrieu, Thomas; Manzella, Giuseppe; Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    MyOcean is a three-year project (2008-2011) whose goal is the development and pre-operational validation of the GMES Marine Core Service for ocean monitoring and forecasting. It is a transition project that will guide the European "operational oceanography" community towards the operational phase of a GMES European service, which demands more European integration, more operationality, and more service. Observations, model-based data, and added-value products will be generated - and enhanced thanks to dedicated expertise - by the following production units: • Five Thematic Assembly Centers, each of them dealing with a specific set of observation data: Sea Level, Ocean Colour, Sea Surface Temperature, Sea Ice & Wind, and In Situ data; • Seven Monitoring and Forecasting Centers to serve the Global Ocean, the Arctic area, the Baltic Sea, the Atlantic North-West shelves area, the Atlantic Iberian-Biscay-Ireland area, the Mediterranean Sea and the Black Sea. Intermediate and final users will discover, view and get the products by means of a central web desk, a central re-active manned service desk and thematic experts distributed across Europe. The MyOcean Information System (MIS) considers the various aspects of an interoperable, federated information system. Data models support data and computer systems by providing the definition and format of data. The possibility of including this information in the data file depends on the data model adopted. In general, there is little effort in the current project to develop a 'generic' data model. A strong push to develop a common model is provided by the EU Directive INSPIRE. At present, there is no single de facto data format for storing observational data. Data formats are still evolving, with their underlying data models moving towards the concept of Feature Types based on ISO/TC211 standards. For example, Unidata is developing the Common Data Model, which can represent scientific data types such as point, trajectory, station, grid, etc., and which will be implemented in the netCDF format. SeaDataNet recommends the ODV and NetCDF formats. Another problem related to data curation and interoperability is the use of common vocabularies. Common vocabularies are developed in many international initiatives, such as GEMET (promoted by INSPIRE as a multilingual thesaurus), Unidata, SeaDataNet, and the Marine Metadata Initiative (MMI). The MIS is considering the SeaDataNet vocabulary as a base for interoperability. Four layers of interoperability, at different abstraction levels, can be defined: - Technical/basic: this layer is implemented at each TAC or MFC through an internet connection and basic services for data transfer and browsing (e.g., FTP, HTTP, etc.). - Syntactic: allowing the interchange of metadata and protocol elements. This layer corresponds to the definition of a Core Metadata Set, the format of exchange/delivery for the data and associated metadata, and possible software. It is implemented by the DIAL-P logical interface (e.g., adoption of an INSPIRE-compliant metadata set and common data formats). - Functional/pragmatic: based on a common set of functional primitives or on a common set of service definitions. This layer refers to the definition of services based on Web services standards. It is implemented by the DIAL-P logical interface (e.g., adoption of INSPIRE-compliant network services). - Semantic: allowing access to similar classes of objects and services across multiple sites, with multilinguality of content as one specific aspect. This layer corresponds to the MIS interface, terminology, and thesaurus. Given the above requirements, the proposed solution is a federation of systems, where the individual participants are self-contained autonomous systems but together form a consistent wider picture. A mid-tier integration layer mediates between existing systems, adapting their data and service model schemas to the MIS. The developed MIS is a read-only system, i.e., it does not allow updating (or inserting) data into the participant resource systems. The main advantages of the proposed approach are: • to enable information sources to join the MIS and publish their data and metadata in a secure way, without any modification to their existing resources and procedures and without any restriction of their autonomy; • to enable users to browse and query the MIS, receiving an aggregated result incorporating relevant data and metadata from across different sources; • to accommodate the growth of such an MIS, either in terms of its clients or of its information resources, as well as the evolution of the underlying data model.

  13. Collaboration tools and techniques for large model datasets

    USGS Publications Warehouse

    Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.

    2008-01-01

    In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. It is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ship or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters - no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
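    The low-bandwidth pattern described here looks like the following from a collaborator's desk: the remote dataset opens lazily over OPeNDAP and only the selected slab is transferred. The URL, variable, and dimension names are illustrative:

        import xarray as xr

        url = "http://models.example.org/thredds/dodsC/ocean/model_output.nc"
        ds = xr.open_dataset(url)                 # lazy: nothing transferred yet
        sst = ds["temp"].isel(time=-1, depth=0)   # hypothetical variable/dims
        patch = sst.sel(lat=slice(43, 46), lon=slice(12, 16)).load()
        print(float(patch.mean()))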

  14. The new IAGOS Database Portal

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Fontaine, Alain

    2016-04-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Database Portal (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). The new IAGOS Database Portal was released in December 2015. The main improvement is the implementation of interoperability with international portals and other databases in order to improve IAGOS data discovery. Within the framework of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the CAMS data center in Jülich (http://join.iek.fz-juelich.de). The CAMS (Copernicus Atmospheric Monitoring Service) project is a prominent user of the IGAS data network. The new portal provides improved and new services, such as download in NetCDF or NASA Ames formats, plotting tools (maps, time series, vertical profiles, etc.) and user management. Added-value products are available on the portal: back trajectories, origin of air masses, co-location with satellite data, etc. The link with the CAMS data center, through JOIN (Jülich OWS Interface), allows model outputs to be combined with IAGOS data for inter-comparison. Finally, IAGOS metadata have been standardized (ISO 19115) and now provide complete information about data traceability and quality.

  15. Software reuse example and challenges at NSIDC

    NASA Astrophysics Data System (ADS)

    Billingsley, B. W.; Brodzik, M.; Collins, J. A.

    2009-12-01

    NSIDC has created a new data discovery and access system, Searchlight, to provide users with the data they want in the format they want. NSIDC Searchlight supports discovery and access for disparate data types with on-the-fly reprojection, regridding and reformatting. Architected both to reuse open source systems and to be reused itself, Searchlight reuses GDAL and Proj4 for data manipulation and format conversions, the netCDF Java library for creating netCDF output, MapServer and OpenLayers for defining spatial criteria, and the JTS Topology Suite (JTS) in conjunction with Hibernate Spatial for database interaction and rich OGC-compliant spatial objects. The application reuses popular Java and JavaScript libraries including Struts 2, Spring, JPA (Hibernate), Sitemesh, JFreeChart, jQuery and DOJO, and a PostGIS PostgreSQL database. Future reuse of Searchlight components is supported at varying architectural levels, ranging from the database and model components to web services. We present the tools, libraries and programs that Searchlight has reused. We describe the architecture of Searchlight and explain the strategies deployed for reusing existing software and how Searchlight is built for reuse. We will discuss NSIDC reuse of the Searchlight components to support rapid development of new data delivery systems.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosler, Peter

    Stride Search provides a flexible tool for detecting storms or other extreme climate events in high-resolution climate data sets saved on uniform latitude-longitude grids in standard NetCDF format. Users provide the software with a quantitative description of a meteorological event they are interested in; the software searches a data set for locations in space and time that meet the user's description. In its first stage, Stride Search performs a spatial search of the data set at each timestep by dividing a search domain into circular sectors of constant geodesic radius. Data from a netCDF file are read into memory for each circular search sector. If the data meet or exceed a set of storm identification criteria (defined by the user), a storm is recorded to a linked list. The linked list is then examined, duplicate detections of the same storm are removed, and the results are written to an output file. The first stage's output file is read by a second program that builds storm tracks. Additional identification criteria may be applied at this stage to further classify storms. Storm tracks are the software's ultimate output, and routines are provided for formatting that output for various external software libraries for plotting and tabulating data.
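    The two-stage logic is easy to re-sketch (this is an illustration of the technique, not the project's code): sweep circular sectors at each timestep, keep sectors that satisfy the criterion, then discard duplicates closer than one search radius to an earlier hit.

        import numpy as np

        def geodesic_km(lat1, lon1, lat2, lon2):
            p = np.pi / 180.0
            a = (np.sin((lat2 - lat1) * p / 2) ** 2 + np.cos(lat1 * p) *
                 np.cos(lat2 * p) * np.sin((lon2 - lon1) * p / 2) ** 2)
            return 2 * 6371.0 * np.arcsin(np.sqrt(a))

        def search_timestep(field, lats, lons, centers, radius_km, threshold):
            hits = []
            for clat, clon in centers:                 # circular search sectors
                d = geodesic_km(lats[:, None], lons[None, :], clat, clon)
                sector = field[d <= radius_km]
                if sector.size and sector.max() >= threshold:  # storm criterion
                    hits.append((clat, clon, float(sector.max())))
            unique = []                                # drop duplicate detections
            for h in hits:
                if all(geodesic_km(h[0], h[1], u[0], u[1]) > radius_km
                       for u in unique):
                    unique.append(h)
            return unique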

  17. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives from the DEM, the effective precipitation, runoff, lithology and land use. All these parameters can be served by the client from other WFS services or by uploading and processing the data on the server. The user can select the option of creating the first and second derivatives from the DEM automatically on the server or of uploading data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using the NetCDF file format. The model is run in a monthly time step, and for each time step the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server along with the log file.
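    The LAI time-series upload boils down to packaging monthly grids with a proper time coordinate. A sketch with an invented grid size and attribute set:

        from datetime import datetime
        import numpy as np
        from netCDF4 import Dataset, date2num

        times = [datetime(2011, m, 15) for m in range(1, 13)]
        lai = np.random.rand(len(times), 120, 160).astype("f4")  # toy monthly LAI

        with Dataset("lai_timeseries.nc", "w") as nc:
            nc.createDimension("time", None)
            nc.createDimension("y", 120)
            nc.createDimension("x", 160)
            t = nc.createVariable("time", "f8", ("time",))
            t.units = "days since 2011-01-01"
            t[:] = date2num(times, t.units)
            v = nc.createVariable("lai", "f4", ("time", "y", "x"))
            v.long_name = "leaf area index"
            v[:] = lai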

  18. Using OPeNDAP's Data-Services Framework to Lift Mash-Ups above Blind Dates

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Fulker, D. W.

    2015-12-01

    OPeNDAP's data-as-service framework (Hyrax) matches diverse sources with many end-user tools and contexts. Keys to its flexibility include: A data model embracing tabular data alongside n-dim arrays and other structures useful in geoinformatics. A REST-like protocol that supports—via suffix notation—a growing set of output forms (netCDF, XML, etc.) plus a query syntax for subsetting. Subsetting applies (via constraints on column values) to tabular data or (via constraints on indices or coordinates) to array-style data. A handler-style architecture that admits a growing set of input types. Community members may contribute handlers, making Hyrax effective as middleware, where N sources are mapped to M outputs with order N+M effort (not NxM). Hyrax offers virtual aggregations of source data, enabling granularity aimed at users, not data-collectors. OPeNDAP-access libraries exist in multiple languages, including Python, Java, and C++. Recent enhancements are increasing this framework's interoperability (i.e., its mash-up) potential. Extensions implemented as servlets—running adjacent to Hyrax—are enriching the forms of aggregation and enabling new protocols: User-specified aggregations, namely, applying a query to (huge) lists of source granules, and receiving one (large) table or zipped netCDF file. OGC (Open Geospatial Consortium) protocols, WMS and WCS. A Webification (W10n) protocol that returns JavaScript Object Notation (JSON). Extensions to OPeNDAP's query language are reducing transfer volumes and enabling new forms of inspection. Advances underway include: Functions that, for triangular-mesh sources, return sub-meshes specified via geospatial bounding boxes. Functions that, for data from multiple, satellite-borne sensors (with differing orbits), select observations based on coincidence. Calculations of means, histograms, etc. that greatly reduce output volumes. Paths for communities to contribute new server functions (in Python, e.g.) that data providers may incorporate into Hyrax via installation parameters. One could say Hyrax itself is a mash-up, but we suggest it as an instrument for a mash-up artist's toolbox. This instrument can support mash-ups built on netCDF files, OGC protocols, JavaScript Web pages, and/or programs written in Python, Java, C or C++.
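    Two of the idioms above, suffix-selected output and constraint-based subsetting, look roughly like this on the wire (the server URL and variable names are placeholders; the constraint syntax follows the DAP2 style):

        import requests

        base = "https://hyrax.example.org/opendap/granule.h5"

        # array-style: one hyperslab of sst, returned as a netCDF file
        nc_bytes = requests.get(base + ".nc?sst[0][0:1:9][0:1:9]").content

        # tabular-style: two columns, filtered by a value constraint
        text = requests.get(base + ".ascii?depth,temp&temp>10").text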

  19. Enabling data-driven provenance in NetCDF, via OGC WPS operations. Climate Analysis services use case.

    NASA Astrophysics Data System (ADS)

    Mihajlovski, A.; Spinuso, A.; Plieger, M.; Som de Cerff, W.

    2016-12-01

    Modern climate analysis platforms provide generic and standardized ways of accessing data and processing services, typically supported by a wide range of OGC formats and interfaces. However, the problem of instrumentally tracing the lineage of the transformations occurring on a dataset and its provenance remains an open challenge. It requires standards-driven and interoperable solutions to facilitate understanding and the sharing of self-describing data products, fostering collaboration among peers. The CLIPC portal provided a real use case where the need for instrumented provenance management is fundamental. CLIPC provides a single point of access for scientific information on climate change. The data about the physical environment which are used to inform climate change policy and adaptation measures come from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses. This is made possible through the Copernicus Earth Observation Programme for Europe. With a backbone combining WPS and OPeNDAP services, CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses; 2. A climate impact tool kit to evaluate, rank and aggregate indicators. The climate impact tool kit is realised through the orchestration of a number of WPS instances that ingest, normalize and combine NetCDF files. The WPS instances enabling this specific computation are hosted by the climate4impact portal, which is a more generic climate data-access and processing service. In this context, guaranteeing validation and reproducibility of results is a clearly stated requirement to improve the quality of the results obtained by the combined analysis. Two core contributions are the enabling of a provenance wrapper around WPS services and the enabling of provenance tracing within the NetCDF format, which adopts and extends the W3C PROV model. To disseminate indicator data and create transformed data products, a standardized provenance, metadata and processing infrastructure has been researched for CLIPC. These efforts will lead towards the provision of tools for further web service processing development and optimisation, opening up possibilities to scale and administer abstract user- and data-driven workflows.
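    The simplest way to picture provenance travelling inside the file is a PROV-style record attached as a global attribute; the attribute name and JSON shape below are illustrative, not the convention the authors propose:

        import datetime
        import json
        from netCDF4 import Dataset

        prov = {
            "prov:entity": "tas_indicator.nc",
            "prov:wasGeneratedBy": "wps:ClimateIndicatorProcess",
            "prov:used": ["tas_model.nc", "tas_obs.nc"],
            "prov:generatedAtTime": datetime.datetime.utcnow().isoformat() + "Z",
        }

        with Dataset("tas_indicator.nc", "a") as nc:  # append to existing product
            nc.setncattr("provenance", json.dumps(prov))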

  20. Web-based Altimeter Service

    NASA Astrophysics Data System (ADS)

    Callahan, P. S.; Wilson, B. D.; Xing, Z.; Raskin, R. G.

    2010-12-01

    We have developed a web-based system to allow updating and subsetting of TOPEX data. The Altimeter Service will be operated by PODAAC alongside its other oceanographic data offerings. The Service could easily be expanded to other mission data. An Altimeter Service is crucial to the improvement and expanded use of altimeter data. A service is necessary for altimetry because the result of most interest - sea surface height anomaly (SSHA) - is composed of several components that are updated individually and irregularly by specialized experts. This makes it difficult for projects to provide the most up-to-date products. Some components are the subject of ongoing research, so the ability for investigators to make products for comparison or sharing is important. The service will allow investigators/producers to get their component models or processing into widespread use much more quickly. For coastal altimetry, the ability to subset the data to the area of interest and insert specialized models (e.g., tides) or data processing results is crucial. A key part of the Altimeter Service is having data producers provide updated or local models and data. In order for this to succeed, producers need to register their products with the Altimeter Service and to provide each product in a form consistent with the service update methods. We will describe the capabilities of the web service and the methods for providing new components. Currently the Service provides TOPEX GDRs with Retracking (RGDRs) in a netCDF format that has been coordinated with Jason data. Users can add new orbits, tide models, gridded geophysical fields such as mean sea surface, and along-track corrections as they become available and are installed by PODAAC. The updated fields are inserted into the netCDF files while the previous values are retained for comparison. The Service will also generate SSH and SSHA. In addition, the Service showcases a feature that plots any variable from netCDF files. The research described here was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
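    The update-with-retention behaviour can be pictured as writing the new component under the canonical name while the superseded values stay in the file under a versioned one. The variable and dimension names here are hypothetical:

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("rgdr_pass042.nc", "a") as nc:
            old = nc.variables["tide_model"][:]
            prev = nc.createVariable("tide_model_v1", "f4", ("record",))
            prev.comment = "superseded tide model, retained for comparison"
            prev[:] = old                               # keep previous values
            nc.variables["tide_model"][:] = old + np.float32(0.01)  # stand-in update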

  1. Construction of Hierarchical Models for Fluid Dynamics in Earth and Planetary Sciences : DCMODEL project

    NASA Astrophysics Data System (ADS)

    Takahashi, Y. O.; Takehiro, S.; Sugiyama, K.; Odaka, M.; Ishiwatari, M.; Sasaki, Y.; Nishizawa, S.; Ishioka, K.; Nakajima, K.; Hayashi, Y.

    2012-12-01

    Toward the understanding of fluid motions of planetary atmospheres and planetary interiors by performing multiple numerical experiments with multiple models, we are now proceeding with the ``dcmodel project'', in which a series of hierarchical numerical models of various complexity is developed and maintained. In the ``dcmodel project'', the numerical models are developed taking care of the following points: 1) a common ``style'' of program codes assuring readability of the software, 2) open-source codes of the models to the public, 3) scalability of the models assuring execution on various scales of computational resources, 4) stressing the importance of documentation and presenting a method for writing reference manuals. The lineup of the models and utility programs of the project is as follows: Gtool5, ISPACK/SPML, SPMODEL, Deepconv, Dcpam, and Rdoc-f95. In the following, the features of each component are briefly described. Gtool5 (Ishiwatari et al., 2012) is a Fortran90 library which provides data input/output interfaces and various utilities commonly used in the models of the dcmodel project. The self-descriptive data format netCDF is adopted as the IO format of Gtool5. The interfaces of the gtool5 library can reduce the number of operation steps for data IO in the program code of the models compared with the interfaces of the raw netCDF library. Further, by use of the gtool5 library, procedures for data IO and the addition of metadata for post-processing can be easily implemented in the program codes in a consolidated form independent of the size and complexity of the models. ``ISPACK'' is a spectral transformation library and ``SPML (SPMODEL library)'' (Takehiro et al., 2006) is its wrapper library. The most prominent feature of SPML is a series of array-handling functions with systematic function naming rules, which enables us to write codes in a form easily deduced from the mathematical expressions of the governing equations. ``SPMODEL'' (Takehiro et al., 2006) is a collection of various sample programs using ``SPML''. These sample programs provide the base kit for simple numerical experiments of geophysical fluid dynamics. For example, SPMODEL includes a 1-dimensional KdV equation model; 2-dimensional barotropic, shallow water, and Boussinesq models; and 3-dimensional MHD dynamo models in rotating spherical shells. These models are written in the common style in harmony with SPML functions. ``Deepconv'' (Sugiyama et al., 2010) and ``Dcpam'' are a cloud-resolving model and a general circulation model, respectively, for application to planetary atmospheres. ``Deepconv'' includes several physical processes appropriate for simulations of the Jupiter and Mars atmospheres, while ``Dcpam'' does so for simulations of Earth, Mars, and Venus-like atmospheres. ``Rdoc-f95'' is an automatic generator of reference manuals for Fortran90/95 programs, which is an extension of the Ruby documentation toolkit ``rdoc''. It analyzes the dependencies of modules, functions, and subroutines in multiple program source codes. At the same time, it can list the namelist variables in the programs.

  2. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    PubMed

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from the metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology), and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for the detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than that of the standard processing, and the tool also adds a quantitative estimate. The software tool set, in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry, shows great potential as a highly automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
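    Raw runs converted to netCDF in this way are readable with any netCDF library; the sketch below uses variable names common to ANDI-MS style files, which should be checked against the converter's actual output:

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("run_001.cdf") as nc:
            masses = nc.variables["mass_values"][:]        # flattened m/z values
            intens = nc.variables["intensity_values"][:]
            scan_index = nc.variables["scan_index"][:]     # offsets per scan
            i0, i1 = scan_index[100], scan_index[101]      # reconstruct scan 100
            spectrum = np.column_stack([masses[i0:i1], intens[i0:i1]])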

  3. SeaWiFS technical report series. Volume 19: Case studies for SeaWiFS calibration and validation, part 2

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Acker, James G. (Editor); Firestone, Elaine R. (Editor); Mcclain, Charles R.; Fraser, Robert S.; Mclean, James T.; Darzi, Michael; Firestone, James K.; Patt, Frederick S.; Schieber, Brian D.

    1994-01-01

    This document provides brief reports, or case studies, on a number of investigations and data set development activities sponsored by the Calibration and Validation Team (CVT) within the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Project. Chapter 1 is a comparison with the atmospheric correction of Coastal Zone Color Scanner (CZCS) data using two independent radiative transfer formulations. Chapter 2 is a study on lunar reflectance at the SeaWiFS wavelengths which was useful in establishing the SeaWiFS lunar gain. Chapter 3 reports the results of the first ground-based solar calibration of the SeaWiFS instrument. The experiment was repeated in the fall of 1993 after the instrument was modified to reduce stray light; the results from the second experiment will be provided in the next case studies volume. Chapter 4 is a laboratory experiment using trap detectors which may be useful tools in the calibration round-robin program. Chapter 5 is the original data format evaluation study conducted in 1992 which outlines the technical criteria used in considering three candidate formats, the hierarchical data format (HDF), the common data format (CDF), and the network CDF (netCDF). Chapter 6 summarizes the meteorological data sets accumulated during the first three years of CZCS operation which are being used for initial testing of the operational SeaWiFS algorithms and systems and would be used during a second global processing of the CZCS data set. Chapter 7 describes how near-real time surface meteorological and total ozone data required for the atmospheric correction algorithm will be retrieved and processed. Finally, Chapter 8 is a comparison of surface wind products from various operational meteorological centers and field observations. Surface winds are used in the atmospheric correction scheme to estimate glint and foam radiances.

  4. Owgis 2.0: Open Source Java Application that Builds Web GIS Interfaces for Desktop Andmobile Devices

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Chassignet, E.; Zavala-Hidalgo, J.; Pandav, H.; Velissariou, P.; Meyer-Baese, A.

    2016-12-01

    OWGIS is an open source Java and JavaScript application that builds easily configurable Web GIS sites for desktop and mobile devices. The current version of OWGIS generates mobile interfaces based on HTML5 technology and can be used to create mobile applications. The style of the generated websites can be modified using Compass, a well-known CSS authoring framework. In addition, OWGIS uses several Open Geospatial Consortium standards to request data from the most common map servers, such as GeoServer. It is also able to request data from ncWMS servers, allowing the websites to display 4D data from NetCDF files. The application is configured by XML files that define which layers (geographic datasets) are displayed on the Web GIS sites. Among other features, OWGIS allows for animations; streamlines from vector data; virtual globe display; vertical profiles and vertical transects; different color palettes; the ability to download data; and display of text in multiple languages. OWGIS users are mainly scientists in the oceanography, meteorology and climate fields.

  5. Design of FastQuery: How to Generalize Indexing and Querying System for Scientific Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jerry; Wu, Kesheng

    2011-04-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit are critical for facilitating interactive exploration of large datasets. These technologies rely on adding auxiliary information to existing datasets to accelerate query processing. To use these indices, we need to match the relational data model used by the indexing systems with the array data model used by most scientific data, and to provide an efficient input and output layer for reading and writing the indices. In this work, we present a flexible design that can be easily applied to most scientific data formats. We demonstrate this flexibility by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using simulation data from the particle accelerator and climate simulation communities. To demonstrate the effectiveness of the new design, we also present a detailed performance study using both synthetic and real scientific workloads.
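    The model-matching idea can be pictured as exposing each array cell as a row whose columns are the index values plus the cell value; the toy sketch below illustrates the mapping, not FastQuery's implementation:

        import numpy as np

        temp = np.random.rand(4, 8, 16).astype("f4")     # (time, lat, lon) array
        t, y, x = np.indices(temp.shape)

        # relational view: one row per cell, columns (time, lat, lon, temp)
        rows = np.column_stack([t.ravel(), y.ravel(), x.ravel(), temp.ravel()])
        hot = rows[rows[:, 3] > 0.95]                    # "WHERE temp > 0.95"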

  6. Kelly et al. (2016): Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter

    EPA Pesticide Factsheets

    In this study, modeled gas- and aerosol-phase ammonia, nitric acid, and hydrogen chloride are compared to measurements taken during a field campaign conducted in northern Colorado in February and March 2011. We compare the modeled and observed gas-particle partitioning, and assess potential reasons for discrepancies between the model and measurements. This data set contains scripts and data used for each figure in the associated manuscript. Figures are generated using the R statistical programming language. Data files are in either comma-separated value (CSV) format or netCDF, a standard self-describing binary data format commonly used in the earth and atmospheric sciences. This dataset is associated with the following publication: Kelly, J., K. Baker, C. Nolte, S. Napelenok, W.C. Keene, and A.A.P. Pszenny. Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter. ATMOSPHERIC ENVIRONMENT. Elsevier Science Ltd, New York, NY, USA, 131: 67-77, (2016).

  7. Error mitigation for CCSDS compressed imager data

    NASA Astrophysics Data System (ADS)

    Gladkova, Irina; Grossberg, Michael; Gottipati, Srikanth; Shahriar, Fazlul; Bonev, George

    2009-08-01

    To efficiently use the limited bandwidth available on the downlink from satellite to ground station, imager data are usually compressed before transmission. Transmission introduces unavoidable errors, which are only partially removed by forward error correction and packetization. In the case of the commonly used CCSDS Rice-based compression, a transmission error results in a contiguous sequence of dummy values along scan lines in a band of the imager data. We have developed a method capable of using the image statistics to provide a principled estimate of the missing data. Our method outperforms interpolation yet can be performed fast enough to provide uninterrupted data flow. The estimation of the lost data provides significant value to end users who may use only part of the data, may not have statistical tools, or may lack the expertise to mitigate the impact of the lost data. Since the locations of the lost data will be clearly marked as metadata in the HDF or NetCDF header, experts who prefer to handle error mitigation themselves will be free to use or ignore our estimates as they see fit.
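    Marking the reconstructed pixels is straightforward in NetCDF: a companion flag variable records which samples are estimates. The sketch below fills the gap by copying the neighbouring scan line, a crude stand-in for the statistical estimator described above, and all names are illustrative:

        import numpy as np
        from netCDF4 import Dataset

        band = np.random.rand(512, 512).astype("f4")
        lost = np.zeros_like(band, dtype="i1")
        lost[200, 100:180] = 1                      # dummy-value run on one line
        band[200, 100:180] = band[199, 100:180]     # crude neighbour-copy estimate

        with Dataset("imager_band07.nc", "w") as nc:
            nc.createDimension("y", 512)
            nc.createDimension("x", 512)
            nc.createVariable("band07", "f4", ("y", "x"))[:] = band
            m = nc.createVariable("band07_estimated", "i1", ("y", "x"))
            m.flag_values = np.array([0, 1], "i1")
            m.flag_meanings = "original estimated"
            m[:] = lost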

  8. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits, including scalability, built-in redundancy, and reduced total cost of ownership, as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost-effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
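    One way to read HDF5/netCDF-4 granules straight from an object store, short of an API-compatible client library, is a file-like wrapper over S3; the bucket, key, and dataset names below are placeholders:

        import h5py
        import s3fs

        fs = s3fs.S3FileSystem(anon=True)
        with fs.open("example-bucket/granules/sample.nc4", "rb") as f:
            with h5py.File(f, "r") as h5:         # h5py accepts file-like objects
                print(list(h5.keys()))
                tile = h5["T2M"][0, :10, :10]     # hypothetical dataset name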

  9. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes at local and regional scales. The model is based on a "shared nothing" distributed computing architecture and assumes the use of a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories in the framework of a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing and their locations, as well as descriptions and run options of the software components for data analysis and visualization. The model and the metadata database together will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
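    The metadata database can start as something as small as a SQLite table populated by scanning the file hierarchy; the schema and coordinate names below are illustrative:

        import glob
        import sqlite3
        from netCDF4 import Dataset

        con = sqlite3.connect("catalog.db")
        con.execute("""CREATE TABLE IF NOT EXISTS granules
                       (path TEXT, var TEXT, nt INTEGER, lat0 REAL, lat1 REAL)""")

        for path in glob.glob("archive/**/*.nc", recursive=True):
            with Dataset(path) as nc:
                lats = nc.variables["lat"][:]      # assumes a 'lat' coordinate
                nt = len(nc.dimensions["time"])
                for var in nc.variables:
                    con.execute("INSERT INTO granules VALUES (?,?,?,?,?)",
                                (path, var, nt,
                                 float(lats.min()), float(lats.max())))
        con.commit()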

  10. The Aegean Sea marine security decision support system

    NASA Astrophysics Data System (ADS)

    Perivoliotis, L.; Krokos, G.; Nittis, K.; Korres, G.

    2011-05-01

    As part of the integrated ECOOP (European Coastal Sea Operational observing and Forecasting System) project, HCMR upgraded the already existing standalone Oil Spill Forecasting System for the Aegean Sea, initially developed for the Greek Operational Oceanography System (POSEIDON), into an active element of the European Decision Support System (EuroDeSS). The system is accessible through a user-friendly web interface where the case scenarios can be fed into the oil spill drift model component, while the synthetic output contains detailed information about the distribution of oil spill particles and the oil spill budget and is provided both in a text-based ECOOP common output format and as a series of sequential graphics. The main development steps that were necessary for this transition were the modification of the forcing input data module to allow the import of other system products, which are usually provided in standard formats such as NetCDF, and the transformation of the model's calculation routines to allow the use of current, density and diffusivity data in z instead of sigma coordinates. During the implementation of the Aegean DeSS, the system was used in operational mode to support the Greek marine authorities in handling a real accident that took place in the North Aegean area. Furthermore, the introduction of common input and output files by all the partners of EuroDeSS extended the system's interoperability, thus facilitating data exchanges and comparison experiments.
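    The sigma-to-z transformation reduces, at a single grid point, to interpolating the profile from terrain-following depths onto fixed depths. A bare-bones sketch (real models apply the full sigma-coordinate stretching, not this linear toy):

        import numpy as np

        h = 85.0                                    # water depth here (m)
        sigma = np.linspace(-1.0, 0.0, 20)          # terrain-following levels
        z_sigma = sigma * h                         # their depths (m)
        u_sigma = 0.3 * (1.0 + z_sigma / h)         # toy current profile

        z_fixed = np.array([-50.0, -30.0, -10.0])   # target z levels
        u_z = np.interp(z_fixed, z_sigma, u_sigma)  # x must be ascending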

  11. The Aegean sea marine security decision support system

    NASA Astrophysics Data System (ADS)

    Perivoliotis, L.; Krokos, G.; Nittis, K.; Korres, G.

    2011-10-01

    As part of the integrated ECOOP (European Coastal Sea Operational observing and Forecasting System) project, HCMR upgraded the already existing standalone Oil Spill Forecasting System for the Aegean Sea, initially developed for the Greek Operational Oceanography System (POSEIDON), into an active element of the European Decision Support System (EuroDeSS). The system is accessible through a user-friendly web interface where the case scenarios can be fed into the oil spill drift model component, while the synthetic output contains detailed information about the distribution of oil spill particles and the oil spill budget and is provided both in a text-based ECOOP common output format and as a series of sequential graphics. The main development steps that were necessary for this transition were the modification of the forcing input data module to allow the import of other system products, which are usually provided in standard formats such as NetCDF, and the transformation of the model's calculation routines to allow the use of current, density and diffusivity data in z instead of sigma coordinates. During the implementation of the Aegean DeSS, the system was used in operational mode in order to support the Greek marine authorities in handling a real accident that took place in the North Aegean area. Furthermore, the introduction of common input and output files by all the partners of EuroDeSS extended the system's interoperability, thus facilitating data exchanges and comparison experiments.

  12. Wave data processing toolbox manual

    USGS Publications Warehouse

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to the deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox is referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox consolidates the processed files output by the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other contains the raw burst data (additional details are described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format are easy to disseminate, portable to any computer platform, and viewable with freely available public-domain software. Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data were collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
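    The value of the approach is that deployment metadata travels inside the statistics file itself. A sketch of that layout, with an assumed EPIC-style variable code and invented attribute values:

        import numpy as np
        from netCDF4 import Dataset

        hsig = np.array([0.8, 1.1, 0.9], "f4")   # toy per-burst wave heights

        with Dataset("dep1234_wave_stats.nc", "w") as nc:
            nc.MOORING = "1234"                  # metadata embedded with the data
            nc.INST_TYPE = "RD Instruments ADCP"
            nc.createDimension("burst", len(hsig))
            v = nc.createVariable("wh_4061", "f4", ("burst",))  # assumed EPIC code
            v.units = "m"
            v.long_name = "Significant Wave Height"
            v[:] = hsig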

  13. The IAGOS Information System

    NASA Astrophysics Data System (ADS)

    Boulanger, D.; Thouret, V.

    2016-12-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core and IAGOS-CARIBIC data. The IAGOS Data Portal (http://www.iagos.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles). New added-value products are available through the portal: back trajectories, origin of air masses, and co-location with satellite data. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, makes it possible to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the next year IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network) and the ENVRI+ European project. Measurement traceability and quality metadata will be made available and DOIs will be implemented.

  14. ClimateNet: A Machine Learning dataset for Climate Science Research

    NASA Astrophysics Data System (ADS)

    Prabhat, M.; Biard, J.; Ganguly, S.; Ames, S.; Kashinath, K.; Kim, S. K.; Kahou, S.; Maharaj, T.; Beckham, C.; O'Brien, T. A.; Wehner, M. F.; Williams, D. N.; Kunkel, K.; Collins, W. D.

    2017-12-01

    Deep Learning techniques have revolutionized commercial applications in computer vision, speech recognition and control systems. The key for all of these developments was the creation of a curated, labeled dataset, ImageNet, enabling multiple research groups around the world to develop methods, benchmark performance and compete with each other. The success of Deep Learning can be largely attributed to the broad availability of this dataset. Our empirical investigations have revealed that Deep Learning is similarly poised to benefit the task of pattern detection in climate science. Unfortunately, labeled datasets, a key prerequisite for training, are hard to find. Individual research groups are typically interested in specialized weather patterns, making it hard to unify and share datasets across groups and institutions. In this work, we propose ClimateNet: a dataset that provides labeled instances of extreme weather patterns, as well as the associated raw fields in model and observational output. We develop a schema in NetCDF to enumerate weather pattern classes/types and to store bounding boxes and pixel masks. We are also working on a TensorFlow implementation to natively import such NetCDF datasets, and are providing a reference convolutional architecture for binary classification tasks. Our hope is that researchers in climate science, as well as ML/DL, will be able to use (and extend) ClimateNet to make rapid progress in the application of Deep Learning to climate science research.
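
    A minimal sketch of what such a labeled-pattern schema could look like when written with the netCDF4 Python library; the dimension, variable, and class names below are illustrative assumptions, not the actual ClimateNet schema.

      import numpy as np
      from netCDF4 import Dataset

      with Dataset("climatenet_labels.nc", "w") as nc:
          # One record per labeled weather event (dimension names are illustrative)
          nc.createDimension("event", None)
          nc.createDimension("corner", 4)          # ymin, xmin, ymax, xmax
          nc.createDimension("lat", 180)
          nc.createDimension("lon", 360)

          cls = nc.createVariable("event_class", "i4", ("event",))
          cls.flag_values = np.array([0, 1], "i4")      # CF-style class encoding
          cls.flag_meanings = "tropical_cyclone atmospheric_river"

          bbox = nc.createVariable("bounding_box", "f4", ("event", "corner"))
          mask = nc.createVariable("pixel_mask", "i1", ("event", "lat", "lon"))

          # Write one example tropical-cyclone label
          cls[0] = 0
          bbox[0, :] = [10.0, 120.0, 25.0, 150.0]
          mask[0, :, :] = 0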

  15. Common Patterns with End-to-end Interoperability for Data Access

    NASA Astrophysics Data System (ADS)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done; however, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): on-the-fly type mapping and out-of-band communication. Type conversion takes place both for data and metadata because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique provides a way for users to exert fine-grained control over the data request, a potentially useful capability, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access concerns integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C and Java libraries that provide functions for newly written software; the netCDF library, which enables existing applications to read from servers using an older interface; and simple file transfers. These options affect seamlessness in that they represent tradeoffs between new development (required for the first option) and cumbersome extra user actions (required by the last). The middle option, adding new functionality to an existing library (netCDF), is very appealing because practice has shown that it can be effective over a wide range of clients, but such libraries are very hard to build: correctly writing a new implementation of an existing API that preserves the original's exact semantics can be a daunting task. In the example discussed here, we developed a new module for Kepler using OPeNDAP's Java API. This provided a way to leverage internal optimizations for data organization in Kepler, which we felt outweighed the additional cost of new development and the need for users to learn a new Kepler module. While common storage formats and open standards play an important role in data access, our work with the Kepler workflow system reinforces the experience that matching the data models of the data server (source) and user client (sink) and choosing the most appropriate integration strategy are critical to achieving interoperability.
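
    The out-of-band constraint-expression technique described above can be sketched with the netCDF4 Python library, whose underlying C library speaks DAP2 when given an http URL; the server URL and variable name here are hypothetical.

      from netCDF4 import Dataset

      # A DAP2 constraint expression appended to the dataset URL asks the server
      # to subset before transmission, e.g. every other point of a 2-D SST field:
      url = ("http://example.org/opendap/sst.nc"   # hypothetical OPeNDAP endpoint
             "?sst[0:2:179][0:2:359]")
      ds = Dataset(url)
      print(ds.variables["sst"].shape)  # the server returns only the constrained slab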

  16. Promoting discovery and access to real time observations produced by regional coastal ocean observing systems

    NASA Astrophysics Data System (ADS)

    Anderson, D. M.; Snowden, D. P.; Bochenek, R.; Bickel, A.

    2015-12-01

    In U.S. coastal waters, a network of eleven regional coastal ocean observing systems supports real-time coastal and ocean observing. The platforms supported and variables acquired are diverse, ranging from current-sensing high-frequency (HF) radar to autonomous gliders. The system incorporates data produced by other networks and experimental systems, further increasing the breadth of the collection. Strategies promoted by the U.S. Integrated Ocean Observing System (IOOS) ensure these data are not lost at sea. Every data set deserves a description: ISO- and FGDC-compliant metadata enable catalog interoperability and record sharing. Extensive use of netCDF with the Climate and Forecast convention (specifying both metadata and a structured format) has proven a powerful strategy to promote discovery, interoperability, and re-use of the data. To integrate specialized data, which are often obscure, quality control protocols are being developed to homogenize the QC and make these data easier to integrate. Data Assembly Centers have been established to integrate some specialized streams, including gliders, animal telemetry, and HF radar. Subsets of data that are ingested into the National Data Buoy Center are also routed to the Global Telecommunications System (GTS) of the World Meteorological Organization to ensure wide international distribution. From the GTS, data are assimilated into nowcast and forecast models, fed to other observing systems, and used to support observation-based decision making such as forecasts, warnings, and alerts. For a few years apps were a popular way to deliver these real-time data streams to phones and tablets; responsive and adaptive web sites are an emerging, flexible strategy to provide access to the regional coastal ocean observations.

  17. Obtaining and processing Daymet data using Python and ArcGIS

    USGS Publications Warehouse

    Bohms, Stefanie

    2013-01-01

    This set of scripts was developed to automate the process of downloading and mosaicking daily Daymet data to a user-defined extent using ArcGIS and the Python programming language. The three steps are downloading the Daymet tiles needed for the study area extent, converting the NetCDF files to TIFF raster format, and mosaicking those rasters into one file. The scripts are intended for all levels of experience with the Python programming language and require no scripting by the user.
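
    The published scripts rely on ArcGIS; the same convert-then-mosaic pattern can be sketched with the open-source xarray and rioxarray packages. The tile file names and CRS below are hypothetical placeholders, and this is not the USGS workflow.

      import xarray as xr
      import rioxarray  # registers the .rio accessor on xarray objects
      from rioxarray.merge import merge_arrays

      tiles = []
      for path in ["daymet_11207.nc", "daymet_11208.nc"]:   # hypothetical tile files
          da = xr.open_dataset(path)["prcp"].isel(time=0)   # one daily slice
          da = da.rio.write_crs("EPSG:4326")  # Daymet actually uses an LCC grid
          tiles.append(da)

      mosaic = merge_arrays(tiles)             # mosaic the tiles into one raster
      mosaic.rio.to_raster("prcp_mosaic.tif")  # write a GeoTIFF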

  18. Efforts to integrate CMIP metadata and standards into NOAA-GFDL's climate model workflow

    NASA Astrophysics Data System (ADS)

    Blanton, C.; Lee, M.; Mason, E. E.; Radhakrishnan, A.

    2017-12-01

    Modeling centers participating in CMIP6 run model simulations, publish requested model output (conforming to community data standards), and document models and simulations using ES-DOC. GFDL developed workflow software implementing best practices to meet these metadata and documentation requirements. The CMIP6 Data Request defines the variables that should be archived for each experiment and specifies their spatial and temporal structure. We used the Data Request's dreqPy Python library to write GFDL model configuration files as an alternative to hand-crafted tables. There was also a largely successful effort to standardize variable names within the model to reduce the overhead of translating "GFDL to CMOR" variables at a later stage in the pipeline. The ES-DOC ecosystem provides tools and standards to create, publish, and view various types of community-defined CIM documents, most notably model and simulation documents. Although ES-DOC will automatically create simulation documents during publishing by harvesting NetCDF global attributes, the information must be collected, stored, and placed in the NetCDF files by the workflow. We propose to develop a GUI to collect the simulation document precursors. In addition, a new MIP for CMIP6 (CPMIP, a comparison of the computational performance of climate models) is documented using machine and performance CIM documents. We used ES-DOC's pyesdoc Python library to automatically create these machine and performance documents. We hope that these and similar efforts will become permanent features of the GFDL workflow to facilitate future participation in CMIP-like activities.
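
    A minimal sketch of querying the Data Request with dreqPy, assuming its loadDreq entry point and the CMORvar collection; attribute names can differ between Data Request versions, so treat this as illustrative rather than a definitive recipe.

      from dreqPy import dreq

      dq = dreq.loadDreq()              # load the bundled CMIP6 Data Request
      cmv = dq.coll['CMORvar'].items    # requested variables with MIP-table context
      for v in cmv[:5]:
          # each entry names the variable and the MIP table it belongs to
          print(v.label, v.mipTable)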

  19. Development of Extended Content Standards for Biodiversity Data

    NASA Astrophysics Data System (ADS)

    Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu

    2013-04-01

    Interoperability in the field of biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, ...) and their supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, ...). To a large extent, these initiatives have focused on discoverability and on standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring the development of domain-dependent conceptual data models and the extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been partly standardized, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind, making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of biodiversity observation data into 'families' of use cases and its supporting data schema, on where gaps, if any, in the availability of content standards have been identified, and on how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF) with a use-case-dependent 'payload' embedded in the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions), across services. An explicit aim is to make the XML implementation specification re-usable as a DwC and a GML (SOS and WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, doi:10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy) (http://www.earthobservations.org/geobon_docs_20120227.shtml).

  20. JADDS - towards a tailored global atmospheric composition data service for CAMS forecasts and reanalysis

    NASA Astrophysics Data System (ADS)

    Stein, Olaf; Schultz, Martin G.; Rambadt, Michael; Saini, Rajveer; Hoffmann, Lars; Mallmann, Daniel

    2017-04-01

    Global model data of atmospheric composition produced by the Copernicus Atmosphere Monitoring Service (CAMS) has been collected since 2010 at FZ Jülich and serves as boundary conditions for use by Regional Air Quality (RAQ) modellers worldwide. RAQ models need time-resolved meteorological as well as chemical lateral boundary conditions for their individual model domains. While the meteorological data usually come from well-established global forecast systems, the chemical boundary conditions are not always well defined. In the past, many models used 'climatic' boundary conditions for tracer concentrations, which can lead to significant concentration biases, particularly for tracers with longer lifetimes, which can be transported over long distances (e.g. over the whole northern hemisphere) with the mean wind. The Copernicus approach utilizes extensive near-real-time assimilation of atmospheric composition data observed from space, which lends additional reliability to the global modelling data and is well received by the RAQ communities. An existing Web Coverage Service (WCS) for sharing these individually tailored model results is currently being re-engineered to make use of a modern, scalable database technology in order to improve performance, enhance flexibility, and allow the operation of catalogue services. The new Jülich Atmospheric Data Distributions Server (JADDS) adheres to the Web Coverage Service WCS 2.0 standard as defined by the Open Geospatial Consortium (OGC). This enables user groups to flexibly define the datasets they need by selecting a subset of chemical species or by restricting the geographical boundaries or the length of the time series. The data are made available in the form of different catalogues stored locally on our server. In addition, the Jülich OWS Interface (JOIN) provides interoperable web services allowing easy download and visualization of datasets delivered from WCS servers via the internet. We will present the prototype JADDS server and address the major issues identified when relocating large four-dimensional datasets into a RASDAMAN raster array database. So far, RASDAMAN support for data in netCDF format is limited with respect to metadata related to variables and axes. For community-wide accepted solutions, selected data coverages shall result in downloadable netCDF files including metadata complying with the netCDF CF Metadata Conventions standard (http://cfconventions.org/). This can be achieved by adding custom metadata elements for RASDAMAN bands (model levels) on data ingestion. Furthermore, an optimization strategy for the ingestion of several TB of 4D model output data will be outlined.
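
    A WCS 2.0 subsetting request of the kind JADDS is designed to serve can be sketched as a key-value-pair HTTP call with the requests library; the endpoint and coverage identifier below are hypothetical.

      import requests

      # OGC WCS 2.0 KVP request: trim a global ozone coverage in space and time
      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "GetCoverage",
          "coverageId": "CAMS_O3",                 # hypothetical coverage name
          "subset": ["Lat(35,60)", "Long(-10,30)",
                     'ansi("2017-01-01","2017-01-31")'],
          "format": "application/netcdf",
      }
      # requests repeats the "subset" parameter once per list entry, as WCS expects
      r = requests.get("https://example.org/rasdaman/ows", params=params)  # hypothetical endpoint
      open("o3_subset.nc", "wb").write(r.content)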

  1. Extending netCDF and CF conventions to support enhanced Earth Observation Ontology services: the Prod-Trees project

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Valentin, Bernard; Koubarakis, Manolis; Nativi, Stefano

    2013-04-01

    Access to Earth Observation products remains far from straightforward for end users in most domains. Semantically enabled search engines, generally accessible through web portals, have been developed. They allow searching for products by selecting application-specific terms and specifying basic geographical and temporal filtering criteria. Although this mostly suits the needs of the general public, the scientific communities require more advanced and controlled means of finding products. Ranges of validity, traceability (e.g. origin, applied algorithms), accuracy and uncertainty are concepts that are typically taken into account in research activities. The Prod-Trees (Enriching Earth Observation Ontology Services using Product Trees) project will enhance the CF-netCDF product format and vocabulary to allow storing metadata that better describe the products, and in particular EO products. The project will bring a standardized solution that permits annotating EO products in such a manner that official and third-party software libraries and tools will be able to search for products using advanced tags and controlled parameter names. Annotated EO products will automatically be supported by all compatible software. Because the entire product information will come from the annotations and the standards, there will be no need to integrate extra components and data structures that have not been standardized. In the course of the project, the most important and popular open-source software libraries and tools will be extended to support the proposed extensions of CF-netCDF. The results will be provided back to the respective owners and maintainers to ensure the best dissemination and adoption of the extended format. The project, funded by ESA, started in December 2012 and will end in May 2014. It is coordinated by Space Applications Services, and the Consortium includes CNR-IIA and the National and Kapodistrian University of Athens. The first activities included the elicitation of user requirements in order to identify gaps in the current CF and netCDF specifications for extended support of EO data discovery. To this aim, a Validation Group has been established, including members from organizations actively using the netCDF and CF standards. A questionnaire has been prepared and submitted to the Validation Group; it was designed to be filled in online but also to guide interviews. The presentation will focus on the project objectives, the first achievements (with particular reference to the results of the requirements analysis) and future plans.

  2. A Geospatial Database that Supports Derivation of Climatological Features of Severe Weather

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Ansari, S.; Del Greco, S.

    2007-12-01

    The Severe Weather Data Inventory (SWDI) at NOAA's National Climatic Data Center (NCDC) provides user access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include archives of:
    · NEXRAD Level-III point features describing general storm structure, hail, mesocyclone and tornado signatures
    · the National Weather Service Storm Events Database
    · National Weather Service Local Storm Reports collected from storm spotters
    · National Weather Service Warnings
    · lightning strikes from Vaisala's National Lightning Detection Network (NLDN)
    SWDI archives all of these datasets in a spatial database that allows for convenient searching and subsetting. These data are accessible via the NCDC web site, Web Feature Services (WFS) or automated web services. The results of interactive web page queries may be saved in a variety of formats, including plain text, XML, Google Earth's KMZ, standards-based NetCDF and Shapefile. NCDC's Storm Risk Assessment Project (SRAP) uses data from the SWDI database to derive gridded climatology products that show the spatial distribution of the frequency of various events. SRAP can also relate SWDI events to other spatial data such as roads, population, watersheds, and other geographic, sociological, or economic data to derive products that are useful in municipal planning, emergency management, the insurance industry, and other areas where there is a need to quantify and qualify how severe weather patterns affect people and property.

  3. Oceanotron, Scalable Server for Marine Observations

    NASA Astrophysics Data System (ADS)

    Loubrieu, T.; Bregent, S.; Blower, J. D.; Griffiths, G.

    2013-12-01

    Ifremer, the French marine institute, is deeply involved in data management for various ocean in situ observation programs (ARGO, OceanSites, GOSUD, ...) and for European programs networking ocean in situ observation data repositories (MyOcean, SeaDataNet, EMODnet). To consolidate the effort of implementing advanced data dissemination services (visualization, download with subsetting) for these programs and, more generally, for water-column observation repositories, Ifremer decided in 2010 to develop the oceanotron server. Given the diversity of data repository formats (RDBMS, netCDF, ODV, ...) and the variety of standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OPeNDAP, ...), the server is designed around plugins:
    - StorageUnits, which read specific data repository formats (netCDF/OceanSites, RDBMS schemas, the ODV binary format);
    - FrontDesks, which receive external requests and return results for interoperable protocols (OGC/WMS, OGC/SOS, OPeNDAP).
    In between, a third type of plugin may be inserted:
    - TransformationUnits, which perform ocean-specific transformations of the features (for example, conversion of vertical coordinates from pressure in decibars to meters below the sea surface).
    The server is released under an open-source license so that partners can develop their own plugins. Within the MyOcean project, the University of Reading has plugged in a WMS implementation as an oceanotron FrontDesk. The modules are connected by sharing the same information model for marine observations (or sampling features: vertical profiles, point series and trajectories), dataset metadata and queries. The shared information model is based on the OGC Observations & Measurements and Unidata Common Data Model initiatives, and is implemented in Java (http://www.ifremer.fr/isi/oceanotron/javadoc/). This internal interoperability layer makes it possible to capitalize on ocean domain expertise in software development without being tied to specific data formats or protocols. Oceanotron is deployed at seven European data centres for marine in situ observations within MyOcean. While additional extensions are still being developed, work is now under way on continuous and distributed integration (Jenkins, Maven), shared reference documentation (Alfresco), and code and release dissemination (SourceForge, GitHub) to promote new collaborative initiatives.

  4. Go_LIVE! - Global Near-real-time Land Ice Velocity data from Landsat 8 at NSIDC

    NASA Astrophysics Data System (ADS)

    Klinger, M. J.; Fahnestock, M. A.; Scambos, T. A.; Gardner, A. S.; Haran, T. M.; Moon, T. A.; Hulbe, C. L.; Berthier, E.

    2016-12-01

    The National Snow and Ice Data Center (NSIDC) is developing a processing and staging system under NASA funding for near-real-time global ice velocity data derived from Landsat 8 panchromatic imagery: Global Land Ice Velocity Extraction from Landsat (Go_LIVE). The system performs repeat-image feature tracking using the newly developed Python Correlation (PyCorr) software applied to image pairs covering all glaciers > 5 km2 as well as both ice sheets. We correlate each Landsat 8 path-row image with matching path-row images acquired within the previous 400 days. Real-Time (RT) panchromatic Landsat 8 L1T images have a geolocation accuracy of 5 meters and high radiometric sensitivity (12-bit), allowing for feature matching over low-contrast snow and ice surfaces. High-pass filters are applied to the imagery to enhance local surface texture and improve correlation returns. Despite the excellent geolocation accuracy of Landsat 8, the remaining error introduces an artificial offset in the velocity returns. To correct this error, we apply a shift to the x and y grids that brings the displacement field to zero over known stationary features such as bedrock. For ice sheet interiors, where stationary features do not exist, we use near-zero (< 10 m a-1) or slow-moving ice areas (10-25 m a-1) to refine velocities. Go_LIVE will eventually include Landsat 7, 5 and 4 imagery as well. Go_LIVE runs on the University of Colorado's supercomputer and PetaLibrary storage system to process 10,000 image pairs per hour. We are currently developing a web-based data access site at NSIDC. The data are provided in NetCDF (Network Common Data Form) as geolocated grids of x and y velocity components at 300 m spacing, with accompanying error and quality parameters. Extensive data sets currently exist for Alaskan, Antarctic, and Greenlandic ice areas, and are available upon request to NSIDC. Go_LIVE's goal for 2017 is a system that updates global ice velocity at few-day or shorter latency.
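
    The stationary-surface correction described above amounts to removing the displacement measured over bedrock from the whole field. Below is a minimal numpy sketch under that reading, with hypothetical array names; a median is used here for robustness, though the Go_LIVE procedure may differ in detail.

      import numpy as np

      def correct_offsets(dx, dy, stationary_mask):
          """Shift a displacement field so known-stationary pixels average to zero.

          dx, dy          : per-pixel displacement components (m)
          stationary_mask : boolean array marking bedrock / near-zero-motion pixels
          """
          # The median over stationary pixels estimates the geolocation-induced offset
          x_bias = np.nanmedian(dx[stationary_mask])
          y_bias = np.nanmedian(dy[stationary_mask])
          return dx - x_bias, dy - y_bias

      # Example with a synthetic 3 m systematic offset
      rng = np.random.default_rng(0)
      dx = rng.normal(3.0, 0.5, (100, 100))
      dy = rng.normal(3.0, 0.5, (100, 100))
      mask = np.zeros((100, 100), bool)
      mask[:10, :10] = True                 # a "bedrock" corner of the scene
      dx_c, dy_c = correct_offsets(dx, dy, mask)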

  5. Coupling West WRF to GSSHA with GSSHApy

    NASA Astrophysics Data System (ADS)

    Snow, A. D.

    2017-12-01

    West WRF output is gridded NetCDF containing the forcing data required to run a GSSHA simulation: precipitation, pressure, temperature, relative humidity, cloud cover, wind speed, and solar radiation. Tools to reproject, resample, and reformat these data for GSSHA have recently been added to the open-source Python library GSSHApy (https://github.com/ci-water/gsshapy). These tools connect the two models, making it possible to run forecasts that drive GSSHA with West WRF forcing data and produce both streamflow and lake-level predictions.
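
    A minimal sketch of the extract-and-resample step with xarray; the file name is hypothetical, the variable names are common WRF output fields rather than the exact GSSHApy interface, and the resampling assumes the time coordinate has already been decoded to datetimes.

      import xarray as xr

      ds = xr.open_dataset("westwrf_d02.nc")   # hypothetical West WRF output file

      # Typical WRF field names: T2 (2-m temperature), PSFC (surface pressure),
      # RAINC + RAINNC (accumulated convective + grid-scale precipitation)
      temp_c = ds["T2"] - 273.15               # Kelvin -> Celsius
      precip = ds["RAINC"] + ds["RAINNC"]

      # Resample hourly output to the 15-minute step a hydrologic model might need
      # (assumes a decoded datetime coordinate named "Time")
      precip_15min = precip.resample(Time="15min").interpolate("linear")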

  6. figure1.nc

    EPA Pesticide Factsheets

    NetCDF file of the SREF standard deviation of wind speed and direction that was used to inject variability in the FDDA input. Variable U_NDG_OLD contains the standard deviation of wind speed (m/s); variable V_NDG_OLD contains the standard deviation of wind direction (deg). This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres, American Geophysical Union, Washington, DC, USA, 120(23): 12,259-12,280 (2015).

  7. The Time Series Data Server (TSDS) for Standards-Compliant, Convenient, and Efficient Access to Time Series Data

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Weigel, R. S.; Wilson, A.; Ware Dewolfe, A.

    2009-12-01

    Data analysis in the physical sciences is often plagued by the difficulty of acquiring the desired data. A great deal of work has been done in the area of metadata and data discovery; however, many such discoveries simply provide links that lead directly to a data file. Often these files are impractically large, containing more time samples or variables than desired, and are slow to access. Once these files are downloaded, format issues further complicate using the data. Some data servers have begun to address these problems by improving data virtualization and ease of use; however, these services often don't scale to large datasets. Also, the generic nature of the data models used by these servers, while providing greater flexibility, may complicate setting up such a service for data providers and omit semantics that would otherwise simplify use for clients, machine or human. The Time Series Data Server (TSDS) aims to address these problems within the limited, yet common, domain of time series data. With the simplifying assumption that all data products served are a function of time, the server can optimize data access based on time subsets, a common use case. The server also supports requests for specific variables, which can be of type scalar, structure, or sequence, and it supports data types with higher-level semantics, such as "spectrum." The TSDS is implemented using Java Servlet technology; it can be dropped into any servlet container and customized for a data provider's needs. The interface is based on OPeNDAP (http://opendap.org) and conforms to the Data Access Protocol (DAP) 2.0, a NASA standard (ESDS-RFC-004), which defines a simple HTTP request and response paradigm. Thus a TSDS instance is a compliant OPeNDAP server that can be accessed by any OPeNDAP client or directly via RESTful web service requests. The TSDS reads the data that it serves into a common data model via the NetCDF Markup Language (NcML, http://www.unidata.ucar.edu/software/netcdf/ncml/), which enables dataset virtualization: an NcML file can expose a single file, a subset, or an aggregation of files as a single logical dataset. With the appropriate NcML adapter, the TSDS can read data from its native format, eliminating the need for data providers to reformat their data and lowering the barrier to integration. Data can even be read via remote services, which is important for enabling VxOs to be truly virtual. The TSDS provides reading, writing, and filtering capabilities through a modular framework. A collection of standard modules is available, and customized modules are easy to create and integrate. This way the TSDS can read and write data in a variety of formats and apply filters to them in a manner customizable to meet the needs of both data providers and consumers. The TSDS is currently in use serving solar irradiance data from the LASP Interactive Solar IRradiance Datacenter (LISIRD, http://lasp.colorado.edu/lisird/), and is being introduced into the space physics virtual observatory community. The TSDS software is open source and available at SourceForge.

  8. On common noise-induced synchronization in complex networks with state-dependent noise diffusion processes

    NASA Astrophysics Data System (ADS)

    Russo, Giovanni; Shorten, Robert

    2018-04-01

    This paper is concerned with the study of common noise-induced synchronization phenomena in complex networks of diffusively coupled nonlinear systems. We consider the case where common noise propagation depends on the network state and, as a result, the noise diffusion process at the nodes depends on the state of the network. For such networks, we present an algebraic sufficient condition for the onset of synchronization, which depends on the network topology, the dynamics at the nodes, the coupling strength and the noise diffusion. Our result explicitly shows that certain noise diffusion processes can drive an unsynchronized network towards synchronization. To illustrate the effectiveness of our result, we consider two applications: collective decision processes and synchronization of chaotic systems. In the former, we show that a sufficiently large noise can drive a population towards a common decision; in the latter, we show how common noise can synchronize a network of Lorenz chaotic systems.

  9. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data have been growing with the complexity of the models. This growth is creating a widening gap between the data being produced and the tools available to analyze large, high-dimensional data sets. With single-run data sets growing into the tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity for analyzing and comparing climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that use parallel computing techniques efficiently to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to handle the "heavy lifting" required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file I/O. The file I/O operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel I/O capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation, and in a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
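
    The reductions ParCAT parallelizes can be illustrated serially with xarray; the file and variable names below are hypothetical, and this is not ParCAT's C implementation, only the kind of operations it performs.

      import xarray as xr

      run_a = xr.open_dataset("run_a.nc")   # hypothetical model output files
      run_b = xr.open_dataset("run_b.nc")

      tas = run_a["tas"]                    # e.g. surface air temperature

      time_mean = tas.mean(dim="time")            # map of the temporal mean
      space_var = tas.var(dim=("lat", "lon"))     # variance over space per time step
      diff = run_a["tas"] - run_b["tas"]          # difference between two runs

      # Reduced summaries are written back out in a common format for other tools
      time_mean.to_netcdf("tas_time_mean.nc")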

  10. ESA Atmospheric Toolbox

    NASA Astrophysics Data System (ADS)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission, it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, the Copernicus Atmosphere Monitoring Service (CAMS), ground-based data, etc. The toolbox consists of three main components, called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in situ data, and ground-based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets: by appropriately chaining calls to the HARP command line tools, one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, the same data format/structure, and the same physical units. The toolkit comes with its own data format conventions, the HARP format, which is based on netCDF/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data, such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and analysis application for atmospheric data that can be used to visualize and analyze the data retrieved using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application; the Python interfaces for CODA and HARP are included, so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN allows you to directly visualize the ingested data. All components of the ESA Atmospheric Toolbox are open source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/
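
    A minimal sketch of HARP's ingest-and-harmonize step from Python, assuming the harp package's import_product entry point and its operations string; the input file name and the specific operations are illustrative, and the operation syntax may vary between HARP versions.

      import harp

      # Ingest a product into the common HARP format while applying a chain of
      # operations: a spatial filter plus derivation of the ozone column in DU
      product = harp.import_product(
          "S5P_L2_O3_20170401.nc",                 # hypothetical input product
          operations="latitude > 30 [degree_north]; "
                     "derive(O3_column_number_density [DU])",
      )
      print(product)                               # summary of harmonized variables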

  11. The IAGOS Information System

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Thouret, Valérie; Brissebrat, Guillaume

    2017-04-01

    IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Data Portal (http://www.iagos.org) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles, etc.). New added-value products are, or will soon be, available through the portal: back trajectories, origin of air masses, co-location with satellite data, etc. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, makes it possible to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the year IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network) and the ENVRI+ European project. Metadata about measurement traceability and quality will be made available, DOIs will be implemented, and interoperability with other European infrastructures will be set up through standardized web services.

  12. Ocean data management in OMP Data Service

    NASA Astrophysics Data System (ADS)

    Fleury, Laurence; André, François; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Ferré, Hélène; Mière, Arnaud

    2014-05-01

    The Observatoire Midi-Pyrénées Data Service (SEDOO) is a development team dedicated to environmental data management and to setting up dissemination applications, in the framework of intensive field campaigns and long-term observation networks. SEDOO has developed some applications dealing with ocean data only, but also generic databases that can store and distribute multidisciplinary datasets. SEDOO is in charge of in situ data management and the data portals for international and multidisciplinary programmes as large as African Monsoon Multidisciplinary Analyses (AMMA) and Mediterranean Integrated STudies at Regional And Local Scales (MISTRALS). The AMMA and MISTRALS databases are distributed, and the data portals provide access to datasets managed by other data centres (IPSL, CORIOLIS...) through interoperability protocols (OPeNDAP, XML requests...). AMMA and MISTRALS metadata (data descriptions) are standardized and comply with international standards (ISO 19115/19139; the INSPIRE European Directive; the Global Change Master Directory Thesaurus). Most of the AMMA and MISTRALS in situ ocean data sets are homogenized and inserted into a relational database, in order to enable accurate data selection and the download of different data sets in a shared format. Data selection criteria include location, period, physical property name, and physical property range; the data extraction procedure includes output format selection among CSV, NetCDF, NASA Ames... The AMMA database - http://database.amma-international.org/ - contains field campaign observations in the Guinea Gulf (EGEE 2005-2007) and the tropical Atlantic Ocean (AEROSE-II 2006...), as well as long-term monitoring data (PIRATA, ARGO...). Operational analyses (MERCATOR) and satellite products (TMI, SSMI...) are managed by the IPSL data centre and can be accessed too; they have been projected onto regular latitude-longitude grids and converted into the NetCDF format. The MISTRALS data portal - http://mistrals.sedoo.fr/ - provides access to ocean datasets produced by the contributing programmes: Hydrological cycle in the Mediterranean eXperiment (HyMeX), Chemistry-Aerosol Mediterranean eXperiment (ChArMEx), Marine Mediterranean eXperiment (MERMeX)... The programmes include many field campaigns from 2011 to 2015, collecting general and specific properties. Long-term monitoring networks, such as the Mediterranean Ocean Observing System on Environment (MOOSE) and the Mediterranean Eurocentre for Underwater Sciences and Technologies (MEUST-SE), contribute to the MISTRALS data portal as well. Relevant model outputs and satellite products managed by external data centres (IPSL, ENEA...) can also be accessed. SEDOO manages the SSS (Sea Surface Salinity) national observation service data: http://sss.sedoo.fr/. SSS aims at collecting, validating, archiving and distributing in situ SSS measurements derived from Voluntary Observing Ship programs. The SSS data user interface enables users to build multi-criteria data requests and download the relevant datasets. SEDOO also contributes to the SOLWARA project, which aims at understanding the oceanic circulation in the Coral Sea and the Solomon Sea and its role in both the climate system and ocean chemistry. The research programme includes in situ measurements, numerical modelling and compiled analyses of past data. The website http://thredds.sedoo.fr/solwara/ provides access to, visualization of, and download of SOLWARA gridded data and model simulations, using the associated THREDDS services (OPeNDAP, NCSS and WMS). In order to improve user-friendliness, the SSS and SOLWARA web interfaces are JEE applications built with the GWT Framework and share many modules.

  13. Community Intercomparison Suite (CIS) v1.4.0: a tool for intercomparing models and observations

    NASA Astrophysics Data System (ADS)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip

    2016-09-01

    The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool developed to allow straightforward intercomparison of remote sensing, in situ and model data. While there are a number of tools available for working with climate model data, the large diversity of sources (and formats) of remote sensing and in situ measurements necessitated a novel software solution. Developed by a professional software company, CIS supports a large number of gridded and ungridded data sources "out of the box", including climate model output in NetCDF or the UK Met Office pp file format, CloudSat, CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization), MODIS (MODerate resolution Imaging Spectroradiometer), Cloud and Aerosol CCI (Climate Change Initiative) level 2 satellite data, and a number of in situ aircraft and ground station data sets. The open-source architecture also supports user-defined plugins, allowing many other sources to be added easily. Many of the key operations required when comparing heterogeneous data sets are provided by CIS, including subsetting, aggregating, collocating and plotting the data. Output data are written to CF-compliant NetCDF files to ensure interoperability with other tools and systems. The latest documentation, including a user manual and installation instructions, can be found on our website (http://cistools.net). Here, we describe the need this tool fulfils, followed by descriptions of its main functionality (as at version 1.4.0) and of the plugin architecture which makes it unique in the field.

  14. Predicting missing links in complex networks based on common neighbors and distance

    PubMed Central

    Yang, Jinxuan; Zhang, Xiao-Dong

    2016-01-01

    Algorithms that use the common-neighbors metric to predict missing links in complex networks are very popular, but most do not account for missing links between nodes with no common neighbors. Reconstructing networks with these methods is therefore not accurate enough in some cases, especially when node pairs have few common neighbors. In this paper we propose a new algorithm based on common neighbors and distance to improve the accuracy of link prediction. The proposed algorithm is remarkably effective at predicting missing links between nodes with no common neighbors, and it performs better than most existing methods on a variety of real-world networks without increasing complexity. PMID:27905526
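
    A toy illustration with networkx of combining common-neighbor counts with graph distance; the scoring function below is a hypothetical stand-in for the paper's method, shown only to make the idea concrete.

      import networkx as nx

      def cn_distance_score(G, u, v):
          """Score a candidate link: more common neighbors and a shorter
          distance both argue for a missing edge (illustrative formula,
          not the authors' exact method)."""
          cn = len(list(nx.common_neighbors(G, u, v)))
          try:
              d = nx.shortest_path_length(G, u, v)
          except nx.NetworkXNoPath:
              return 0.0
          # The distance term still ranks pairs that share no neighbors
          return cn + 1.0 / d

      G = nx.karate_club_graph()
      candidates = [(u, v) for u in G for v in G
                    if u < v and not G.has_edge(u, v)]
      best = max(candidates, key=lambda e: cn_distance_score(G, *e))
      print(best, cn_distance_score(G, *best))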

  15. SeaView: bringing EarthCube to the Oceanographer

    NASA Astrophysics Data System (ADS)

    Stocks, K. I.; Diggs, S. C.; Arko, R. A.; Kinkade, D.; Shepherd, A.

    2016-12-01

    As new instrument types are developed, and new observational programs start, that support a growing community of "dry" oceanographers, the ability to find, access, and visualize existing data of interest becomes increasingly critical. Yet ocean data, when available, are held in multiple data facilities, in different formats, and accessible through different pathways. This creates practical problems with integrating and working across different data sets. The SeaView project is building connections between the rich data resources in five major oceanographic data facilities - BCO-DMO, CCHDO, OBIS, OOI, and R2R* - creating a federated set of thematic data collections that are organized around common characteristics (geographic location, time, expedition, program, data type, etc.) and published online in Web Accessible Folders using standard file formats such as ODV and NetCDF. The work includes not simply reformatting data, but identifying and, where possible, addressing interoperability challenges: which common identifiers for core concepts can connect data across repositories; which search terms, if added to the data repositories, would increase discoverability; the presence of duplicate data across repositories; etc. We will present the data collections available to date, including data from the OOI Pioneer Array region, and seek scientists' input on the data types and formats they prefer, the tools they use to analyze and visualize data, and their specific recommendations for future data collections to support oceanographic science. * Biological and Chemical Oceanography Data Management Office (BCO-DMO), CLIVAR and Carbon Hydrographic Data Office (CCHDO), International Ocean Biogeographic Information System (iOBIS), Ocean Observatories Initiative (OOI), and Rolling Deck to Repository (R2R) Program.

  16. Hyperparameterization of soil moisture statistical models for North America with Ensemble Learning Models (Elm)

    NASA Astrophysics Data System (ADS)

    Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.

    2017-12-01

    Hyperparameterization of statistical models, i.e. automated model scoring and selection via evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and the statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, GRIB, or GeoTIFF files; decomposition methods like PCA and manifold learning; and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). Elm uses the NSGA-2 multi-objective optimization algorithm to optimize the statistical preprocessing of the forcing data and improve goodness-of-fit for the statistical models (i.e. feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and it will show how NSGA-2 is being used to automate the selection of soil moisture forecast statistical models for North America.
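
    A minimal scikit-learn sketch of the randomized-search flavor of hyperparameterization described above, run on synthetic data standing in for NLDAS forcings; it illustrates the general technique, not Elm's own API or the NSGA-2 algorithm.

      import numpy as np
      from scipy.stats import randint, uniform
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import RandomizedSearchCV

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 6))       # stand-in meteorological forcings
      y = 0.7 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(0.0, 0.1, size=500)

      # Sample model structure and parameters at random, scoring each by CV error
      search = RandomizedSearchCV(
          RandomForestRegressor(random_state=0),
          param_distributions={
              "n_estimators": randint(50, 300),
              "max_depth": randint(2, 12),
              "max_features": uniform(0.3, 0.7),
          },
          n_iter=20, cv=3, scoring="neg_mean_squared_error", random_state=0,
      )
      search.fit(X, y)
      print(search.best_params_, -search.best_score_)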

  17. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided through all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), which can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost-effective but also shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
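
    A minimal sketch of the drop-in compatibility idea using h5pyd, the h5py-compatible Python client for HSDS; the domain path and endpoint below are hypothetical.

      import h5pyd  # same call signatures as h5py, but HTTP requests under the hood

      # Open a server-side "file" (an HSDS domain) instead of a local HDF5 file
      f = h5pyd.File("/shared/nasa/merra2/t2m.h5", "r",
                     endpoint="http://hsds.example.org:5101")   # hypothetical

      dset = f["t2m"]                    # datasets behave like h5py datasets
      print(dset.shape, dset.dtype)
      slab = dset[0, 100:200, 100:200]   # only the requested slab crosses the wire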

  18. FastQuery: A Parallel Indexing System for Scientific Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chou, Jerry; Wu, Kesheng; Prabhat,

    2011-07-29

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit can significantly improve accesses to these datasets by augmenting the user data with indexes and other secondary information. However, a challenge is that the indexes assume the relational data model but the scientific data generally follows the array data model. To match the two data models, we design a generic mapping mechanism and implement an efficient input and output interface for reading and writing the data and their corresponding indexes. To take advantage of the emerging many-core architectures, we also develop a parallel strategy for indexing using threading technology. This approach complements our on-going MPI-based parallelization efforts. We demonstrate the flexibility of our software by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using data from a particle accelerator model and a global climate model. We also conducted a detailed performance study using these scientific datasets. The results show that FastQuery speeds up the query time by a factor of 2.5x to 50x, and it reduces the indexing time by a factor of 16 on 24 cores.

  19. Cloud-Enabled Climate Analytics-as-a-Service using Reanalysis data: A case study.

    NASA Astrophysics Data System (ADS)

    Nadeau, D.; Duffy, D.; Schnase, J. L.; McInerney, M.; Tamkin, G.; Potter, G. L.; Thompson, J. H.

    2014-12-01

    The NASA Center for Climate Simulation (NCCS) maintains advanced data capabilities and facilities that allow researchers to access the enormous volume of data generated by weather and climate models. The NASA Climate Model Data Service (CDS) and the NCCS are merging their efforts to provide Climate Analytics-as-a-Service for the comparative study of the major reanalysis projects: ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, JMA JRA-25, and JRA-55. These reanalyses have been repackaged into the netCDF4 file format following the CMIP5 Climate and Forecast (CF) metadata convention prior to being sequenced into the Hadoop Distributed File System (HDFS). A small set of operations that represent a common starting point in many analysis workflows was then created: min, max, sum, count, variance and average. In this example, reanalysis data exploration was performed using Hadoop MapReduce, and accessibility was achieved using the Climate Data Service (CDS) application programming interface (API) created at NCCS, which provides a uniform treatment of large amounts of data. In this case study, we limited our exploration to two variables, temperature and precipitation, using three operations (min, max and avg) and 30 years of reanalysis data for three regions of the world: global, polar, and subtropical.

  20. Technical note: Harmonizing met-ocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Camossi, E.

    2015-11-01

    Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free and easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using the NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
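
    Client-side access in this kind of setup can be as simple as opening the server's OPeNDAP endpoint; the sketch below uses the netCDF4 Python package (the URL and variable name are hypothetical, and a netCDF4 build with DAP support is assumed).

        # Read a slab from a THREDDS-served virtual aggregation via OPeNDAP.
        from netCDF4 import Dataset

        url = "http://example.org/thredds/dodsC/model/hindcast_agg"
        with Dataset(url) as nc:
            # Subsetting happens server side: only the requested slab travels.
            sst = nc.variables["temp"][-1, -1, :, :]  # last time step, top layer
            print(sst.shape)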

  1. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions from different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes; therefore, identification of common coexpression networks or modules may reveal the molecular mechanisms of complex diseases or the relationships between biological processes. However, there has been no quantitative comparison method for coexpression networks, and methods developed for other network types cannot be applied to them. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use them to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for the quantitative comparison of coexpression networks and performed experiments using known coexpression networks. We showed the validity of the two measures and, from the experiments, derived threshold values for identifying similar coexpression network pairs. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and identified similar Huntington's disease-aging module pairs. These modules are related to brain development, cell death, and immune response, suggesting that up-regulated cell signalling related to cell death and immune/inflammation response may be a common molecular mechanism in the pathophysiology of HD and normal brain aging in the frontal cortex.

  2. SGP and TWP (Manus) Ice Cloud Vertical Velocities

    DOE Data Explorer

    Kalesse, Heike

    2013-06-27

    Daily NetCDF files of ice-cloud dynamics observed at the ARM sites at SGP (Jan 1997-Dec 2010) and Manus (Jul 1999-Dec 2010). The files include variables at different time resolutions (10 s, 20 min, 1 hr). Profiles of radar reflectivity factor (dbz), Doppler velocity (vel), retrieved vertical air motion (V_air), and reflectivity-weighted particle terminal fall velocity (V_ter) are given at 10 s, 20 min, and 1 hr resolution. Retrieved V_air and V_ter follow radar notation, so positive values indicate downward motion. Lower-level clouds are removed; however, a multi-layer flag is included.

  3. Figure5

    EPA Pesticide Factsheets

    This is an R script that allows the reproduction of Figure 5. The script includes links to the large NetCDF files that the figure accesses for O3, CO, wind speed, radiation, and PBL height, and it pulls the time series for each variable at a number of cities (specified by latitude-longitude). This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres 120(23): 12,259-12,280 (2015).
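
    The record's script is written in R; the Python sketch below (with hypothetical file and variable names) shows the same core operation of pulling a time series at the grid cell nearest a city.

        # Extract a time series at the grid point nearest a given lat-lon.
        import numpy as np
        from netCDF4 import Dataset

        def city_timeseries(path, var, city_lat, city_lon):
            with Dataset(path) as nc:
                lat = nc.variables["lat"][:]
                lon = nc.variables["lon"][:]
                i = np.abs(lat - city_lat).argmin()  # nearest-neighbour indices
                j = np.abs(lon - city_lon).argmin()
                return nc.variables[var][:, i, j]    # assumed dims: (time, lat, lon)

        o3_nyc = city_timeseries("cmaq_o3.nc", "O3", 40.71, -74.01)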

  4. SACR ADVance 3-D Cartesian Cloud Cover (SACR-ADV-3D3C) product

    DOE Data Explorer

    Meng Wang, Tami Toto, Eugene Clothiaux, Katia Lamer, Mariko Oue

    2017-03-08

    SACR-ADV-3D3C remaps the output of SACRCORR for cross-wind range-height indicator (CW-RHI) scans to a Cartesian grid and reports a reflectivity CFAD and a best-estimate domain-averaged cloud fraction. The final output is a single NetCDF file containing all the aforementioned corrected radar moments remapped onto a 3-D Cartesian grid, the SACR reflectivity CFAD, a profile of best-estimate cloud fraction, a profile of maximum observable x-domain size (xmax), a profile of time-to-horizontal-distance estimates, and a profile of minimum observable reflectivity (dBZmin).

  5. Toward a Common Vision in Library Networking. Proceedings of the Library of Congress Network Advisory Committee Meeting (Washington, D.C., December 9-11, 1985). Network Planning Paper No. 13.

    ERIC Educational Resources Information Center

    Harriman, Sigrid G., Ed.

    The December 1985 program session of the Library of Congress Network Advisory Committee (NAC) focused on determining the effectiveness of networking, identifying a common vision or goal, and developing a strategy to accomplish that goal. The program session included remarks on the role of the regional networks in national networking by Louella V.…

  6. Gridded global surface ozone metrics for atmospheric chemistry model evaluation

    NASA Astrophysics Data System (ADS)

    Sofen, E. D.; Bowdalo, D.; Evans, M. J.; Apadula, F.; Bonasoni, P.; Cupeiro, M.; Ellul, R.; Galbally, I. E.; Girgzdiene, R.; Luppo, S.; Mimouni, M.; Nahas, A. C.; Saliba, M.; Tørseth, K.

    2016-02-01

    The concentration of ozone at the Earth's surface is measured at many locations across the globe for the purposes of air quality monitoring and atmospheric chemistry research. We have brought together all publicly available surface ozone observations from online databases from the modern era to build a consistent data set for the evaluation of chemical transport and chemistry-climate (Earth System) models for projects such as the Chemistry-Climate Model Initiative and Aer-Chem-MIP. From a total data set of approximately 6600 sites and 500 million hourly observations from 1971-2015, approximately 2200 sites and 200 million hourly observations pass screening as high-quality sites in regionally representative locations that are appropriate for use in global model evaluation. There is generally good data volume from the start of air quality monitoring networks in 1990 through 2013. Ozone observations are biased heavily toward North America and Europe with sparse coverage over the rest of the globe. This data set is made available for the purposes of model evaluation as a set of gridded metrics intended to describe the distribution of ozone concentrations on monthly and annual timescales. Metrics include the moments of the distribution, percentiles, the maximum daily 8-hour average (MDA8), the sum of daily maximum 8-h means over 35 ppb (SOMO35), accumulated ozone exposure above a threshold of 40 ppb (AOT40), and metrics related to air quality regulatory thresholds. Gridded data sets are stored as netCDF-4 files and are available to download from the British Atmospheric Data Centre (doi: 10.5285/08fbe63d-fa6d-4a7a-b952-5932e3ab0452). We provide recommendations to the ozone measurement community regarding improving metadata reporting to simplify ongoing and future efforts in working with ozone data from disparate networks in a consistent manner.
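
    A short sketch of two of the metrics named above, MDA8 and SOMO35, computed from an hourly series; the data here are synthetic, and operational definitions add data-completeness rules that are only approximated by the min_periods setting.

        # MDA8 and SOMO35 from hourly ozone, illustrated with synthetic data.
        import numpy as np
        import pandas as pd

        hours = pd.date_range("2013-01-01", periods=24 * 365, freq="h")
        o3 = pd.Series(30 + 15 * np.random.rand(len(hours)), index=hours)  # ppb

        rolling8 = o3.rolling(window=8, min_periods=6).mean()  # 8-h running means
        mda8 = rolling8.resample("D").max()                    # daily maximum 8-h average
        somo35 = (mda8 - 35).clip(lower=0).sum()               # sum of MDA8 excesses over 35 ppb
        print(mda8.describe(), somo35)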

  7. Gridded global surface ozone metrics for atmospheric chemistry model evaluation

    NASA Astrophysics Data System (ADS)

    Sofen, E. D.; Bowdalo, D.; Evans, M. J.; Apadula, F.; Bonasoni, P.; Cupeiro, M.; Ellul, R.; Galbally, I. E.; Girgzdiene, R.; Luppo, S.; Mimouni, M.; Nahas, A. C.; Saliba, M.; Tørseth, K.; Wmo Gaw, Epa Aqs, Epa Castnet, Capmon, Naps, Airbase, Emep, Eanet Ozone Datasets, All Other Contributors To

    2015-07-01

    The concentration of ozone at the Earth's surface is measured at many locations across the globe for the purposes of air quality monitoring and atmospheric chemistry research. We have brought together all publicly available surface ozone observations from online databases from the modern era to build a consistent dataset for the evaluation of chemical transport and chemistry-climate (Earth System) models for projects such as the Chemistry-Climate Model Initiative and Aer-Chem-MIP. From a total dataset of approximately 6600 sites and 500 million hourly observations from 1971-2015, approximately 2200 sites and 200 million hourly observations pass screening as high-quality sites in regional background locations that are appropriate for use in global model evaluation. There is generally good data volume from the start of air quality monitoring networks in 1990 through 2013. Ozone observations are biased heavily toward North America and Europe with sparse coverage over the rest of the globe. This dataset is made available for the purposes of model evaluation as a set of gridded metrics intended to describe the distribution of ozone concentrations on monthly and annual timescales. Metrics include the moments of the distribution, percentiles, maximum daily eight-hour average (MDA8), SOMO35, AOT40, and metrics related to air quality regulatory thresholds. Gridded datasets are stored as netCDF-4 files and are available to download from the British Atmospheric Data Centre (doi:10.5285/08fbe63d-fa6d-4a7a-b952-5932e3ab0452). We provide recommendations to the ozone measurement community regarding improving metadata reporting to simplify ongoing and future efforts in working with ozone data from disparate networks in a consistent manner.

  8. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and rapid evolution over time. Because visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications cannot handle the load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver the results to clients over the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our platform is built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular gridded data stored in NetCDF format. Three data access methods are supported: accessing remote datasets served by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the access method, all visualization tasks are completed on the server side to reduce the workload on clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.

  9. Development of an Operational TS Dataset Production System for the Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Kim, Sung Dae; Park, Hyuk Min; Kim, Young Ho; Park, Kwang Soon

    2017-04-01

    An operational TS (temperature and salinity) dataset production system was developed to provide near-real-time data to the data assimilation system periodically. It collects the latest 15 days of TS data for the northwestern Pacific (20°N-55°N, 110°E-150°E), applies QC tests to the archived data, and supplies the results to numerical prediction models at KIOST (Korea Institute of Ocean Science and Technology). The latest real-time TS data are collected from the Argo GDAC and the GTSPP data server every week. Argo data are downloaded from the /latest_data directory of the Argo GDAC. Because many duplicates occur when profile data are extracted from all Argo NetCDF files, a database system is used to avoid duplication: the metadata (float ID, location, observation date and time, etc.) of all Argo floats are stored in the database, and a Matlab program was developed to manipulate these records, check for duplicates, and exclude duplicated data. GTSPP data are downloaded from the /realtime directory of the GTSPP data service, and the latest non-Argo data are extracted from the original files. Another Matlab program inspects all collected data using 10 QC tests and produces the final dataset used by the assimilation system. Three regional range tests that inspect annual, seasonal, and monthly variations are included in the QC procedures. A C program was developed to provide the regional ranges to data managers; it calculates upper and lower limits of temperature and salinity at depths from 0 to 1550 m. The final TS dataset contains the latest 15 days of TS data in NetCDF format. It is updated every week and transmitted to numerical modellers at KIOST for operational use.
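
    A minimal sketch of the regional range test described above, using GTSPP-style quality flags; the limits shown are hypothetical, whereas the operational system derives them per region, season, and depth.

        # Range test returning a GTSPP-style QC flag (1 = good, 4 = bad).
        def range_test(value, lower, upper):
            return 1 if lower <= value <= upper else 4

        # Hypothetical monthly temperature envelope at one depth level.
        flags = [range_test(t, 2.0, 28.0) for t in (11.3, 35.0, 8.9)]
        print(flags)  # [1, 4, 1]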

  10. Surveying hospital network structure in New York State: how are they structured?

    PubMed

    Nauenberg, E; Brewer, C S

    2000-01-01

    We determine the most common network structures in New York state. The taxonomy employed uses three structural dimensions: integration, complexity, and risk-sharing between organizations. Based on a survey conducted in 1996, the most common type of network (26.4 percent) had medium levels of integration, medium or high levels of complexity, and some risk-sharing. Also common were networks with low levels of integration, low levels of complexity, and no risk-sharing (22.1 percent).

  11. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses

    PubMed Central

    Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
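
    A sketch of the trial-resampling idea with synthetic data: bootstrap over trials to put confidence intervals on correlation-based edges. The paper's actual coupling statistics and aggregate network measures differ; this only illustrates the resampling scheme.

        # Bootstrap over trials for uncertainty in functional network edges.
        import numpy as np

        rng = np.random.default_rng(0)
        n_trials, n_sensors, n_samples = 100, 8, 256
        trials = rng.standard_normal((n_trials, n_sensors, n_samples))

        def edge_corr(x):
            """Average across trials, then correlate sensors over time."""
            return np.corrcoef(x.mean(axis=0))

        boot = np.empty((500, n_sensors, n_sensors))
        for b in range(boot.shape[0]):
            pick = rng.integers(0, n_trials, n_trials)   # resample trials
            boot[b] = edge_corr(trials[pick])

        lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)  # 95% CI per edge
        significant = (lo > 0) | (hi < 0)  # edges whose CI excludes zero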

  12. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty, both in the functional network edges and the corresponding aggregate measures of network topology, are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here, appropriate for static and dynamic network inference and different statistical measures of coupling, permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.

  13. Construction of multi-scale consistent brain networks: methods and applications.

    PubMed

    Ge, Bao; Tian, Yin; Hu, Xintao; Chen, Hanbo; Zhu, Dajiang; Zhang, Tuo; Han, Junwei; Guo, Lei; Liu, Tianming

    2015-01-01

    Mapping human brain networks provides a basis for studying brain function and dysfunction, and thus has gained significant interest in recent years. However, modeling human brain networks still faces several challenges including constructing networks at multiple spatial scales and finding common corresponding networks across individuals. As a consequence, many previous methods were designed for a single resolution or scale of brain network, though the brain networks are multi-scale in nature. To address this problem, this paper presents a novel approach to constructing multi-scale common structural brain networks from DTI data via an improved multi-scale spectral clustering applied on our recently developed and validated DICCCOLs (Dense Individualized and Common Connectivity-based Cortical Landmarks). Since the DICCCOL landmarks possess intrinsic structural correspondences across individuals and populations, we employed the multi-scale spectral clustering algorithm to group the DICCCOL landmarks and their connections into sub-networks, meanwhile preserving the intrinsically-established correspondences across multiple scales. Experimental results demonstrated that the proposed method can generate multi-scale consistent and common structural brain networks across subjects, and its reproducibility has been verified by multiple independent datasets. As an application, these multi-scale networks were used to guide the clustering of multi-scale fiber bundles and to compare the fiber integrity in schizophrenia and healthy controls. In general, our methods offer a novel and effective framework for brain network modeling and tract-based analysis of DTI data.

  14. Distributed data discovery, access and visualization services to Improve Data Interoperability across different data holdings

    NASA Astrophysics Data System (ADS)

    Palanisamy, G.; Krassovski, M.; Devarakonda, R.; Santhana Vannan, S.

    2012-12-01

    The current climate debate is highlighting the importance of free, open, and authoritative sources of high quality climate data that are available for peer review and for collaborative purposes. It is increasingly important to allow various organizations around the world to share climate data in an open manner, and to enable them to perform dynamic processing of climate data. This advanced access to data can be enabled via Web-based services, using common "community agreed" standards, without organizations having to change the internal structure used to describe their data. The modern scientific community has become diverse and increasingly complex in nature. To meet the demands of such a diverse user community, the modern data supplier has to provide data and related information through searchable, data- and process-oriented tools. This can be accomplished by setting up an online, Web-based system with a relational database as a back end. The following common features of web data access/search systems will be outlined in the proposed presentation: flexible data discovery; data in commonly used formats (e.g., CSV, NetCDF); metadata preparation in standard formats (FGDC, ISO 19115, EML, DIF, etc.); data subsetting capabilities and the ability to narrow down to individual data elements; standards-based data access protocols and mechanisms (SOAP, REST, OPeNDAP, OGC, etc.); and integration of services across different data systems (discovery to access, visualization, and subsetting). This presentation will also include specific examples of the integration of various data systems developed by Oak Ridge National Laboratory's Climate Change Science Institute, and their ability to communicate with each other to enable better data interoperability and data integration. References: [1] Devarakonda, Ranjeet, and Harold Shanafield. "Drupal: Collaborative framework for science research." Collaboration Technologies and Systems (CTS), 2011 International Conference on. IEEE, 2011. [2] Devarakonda, R., Shrestha, B., Palanisamy, G., Hook, L. A., Killeffer, T. S., Boden, T. A., et al. (2014). The new online metadata editor for generating structured metadata. Oak Ridge National Laboratory (ORNL).

  15. New Solutions for Enabling Discovery of User-Centric Virtual Data Products in NASA's Common Metadata Repository

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Gilman, J.; Baynes, K.; Shum, D.

    2015-12-01

    This talk introduces a new NASA Earth Observing System Data and Information System (EOSDIS) capability to automatically generate and maintain derived Virtual Product information, allowing DAACs and Data Providers to create tailored and more discoverable variations of their products. After this talk the audience will be aware of the new EOSDIS Virtual Product capability, applications of it, and how to take advantage of it. Much of the data made available in EOSDIS is organized for generation and archival rather than for discovery and use. The EOSDIS Common Metadata Repository (CMR) is launching a new capability providing automated generation and maintenance of user-oriented Virtual Product information. DAACs can easily surface variations on established data products tailored to specific use cases and users, leveraging DAAC-exposed services such as custom ordering or access services like OPeNDAP for on-demand product generation and distribution. Virtual Data Products enjoy support for spatial and temporal information, keyword discovery, and association with imagery, and are fully discoverable by tools such as NASA Earthdata Search, Worldview, and Reverb. Virtual Product generation has applicability across many use cases: describing derived products such as Surface Kinetic Temperature information (AST_08) from source products (ASTER L1A); providing streamlined access to data products (e.g. AIRS) containing many (>800) data variables covering an enormous variety of physical measurements; attaching additional EOSDIS offerings such as Visual Metadata, external services, and documentation metadata; publishing alternate formats for a product (e.g. netCDF for HDF products) with the actual conversion happening on request; publishing granules to be modified by on-the-fly services, like GES-DISC's Data Quality Screening Service; and publishing "bundled" products where granules from one product correspond to granules from one or more other related products.

  16. Extending Climate Analytics as a Service to the Earth System Grid Federation Progress Report on the Reanalysis Ensemble Service

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2016-12-01

    We are extending climate analytics-as-a-service, including: (1) a high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets, using advanced technologies like Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files; (2) a Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections, accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; and (3) an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements to the Reanalysis Ensemble Service, including: a CDSlib Python library that supports full temporal, spatial, and grid-based resolution services; a new reanalysis collections reference model to enable operator design and implementation; an enhanced library of sample queries to demonstrate and develop use case scenarios; extended operators that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, and trend, climatology, and anomaly computations; full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses; a prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management; prototyped uncertainty quantification services that combine ensemble products with comparative observational products; convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extraction of trends, climatologies, and anomalies; and the ability to compute and visualize multiple-reanalysis intercomparisons.

  17. Spatiotemporal Filtering Using Principal Component Analysis and Karhunen-Loeve Expansion Approaches for Regional GPS Network Analysis

    NASA Technical Reports Server (NTRS)

    Dong, D.; Fang, P.; Bock, F.; Webb, F.; Prawirondirdjo, L.; Kedar, S.; Jamason, P.

    2006-01-01

    Spatial filtering is an effective way to improve the precision of coordinate time series for regional GPS networks by reducing so-called common mode errors, thereby providing better resolution for detecting weak or transient deformation signals. The commonly used approach to regional filtering assumes that the common mode error is spatially uniform, which is a good approximation for networks of hundreds of kilometers extent, but breaks down as the spatial extent increases. A more rigorous approach should remove the assumption of spatially uniform distribution and let the data themselves reveal the spatial distribution of the common mode error. The principal component analysis (PCA) and the Karhunen-Loeve expansion (KLE) both decompose network time series into a set of temporally varying modes and their spatial responses. Therefore they provide a mathematical framework to perform spatiotemporal filtering. We apply the combination of PCA and KLE to daily station coordinate time series of the Southern California Integrated GPS Network (SCIGN) for the period 2000 to 2004. We demonstrate that spatially and temporally correlated common mode errors are the dominant error source in daily GPS solutions. The spatial characteristics of the common mode errors are close to uniform for all east, north, and vertical components, which implies a very long wavelength source for the common mode errors, compared to the spatial extent of the GPS network in southern California. Furthermore, the common mode errors exhibit temporally nonrandom patterns.
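
    A compact illustration of the PCA step with synthetic residuals: the leading principal component captures the network-wide common mode, which is then removed from every station. The paper's KLE refinement of the spatial response is omitted here.

        # PCA-based common-mode filtering of network coordinate residuals.
        import numpy as np

        rng = np.random.default_rng(1)
        n_days, n_stations = 1500, 40
        common = 0.003 * rng.standard_normal((n_days, 1))  # shared daily error
        X = common + 0.001 * rng.standard_normal((n_days, n_stations))

        X0 = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X0, full_matrices=False)
        mode1 = np.outer(U[:, 0] * s[0], Vt[0])  # first principal component
        filtered = X0 - mode1                    # residuals with common mode removed
        print("variance explained by mode 1:", s[0]**2 / (s**2).sum())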

  18. ESMPy and OpenClimateGIS: Python Interfaces for High Performance Grid Remapping and Geospatial Dataset Manipulation

    NASA Astrophysics Data System (ADS)

    O'Kuinghttons, Ryan; Koziol, Benjamin; Oehmke, Robert; DeLuca, Cecelia; Theurich, Gerhard; Li, Peggy; Jacob, Joseph

    2016-04-01

    The Earth System Modeling Framework (ESMF) Python interface (ESMPy) supports analysis and visualization in Earth system modeling codes by providing access to a variety of tools for data manipulation. ESMPy started as a Python interface to the ESMF grid remapping package, which provides mature, robust, high-performance, and scalable grid remapping between 2D and 3D logically rectangular and unstructured grids and sets of unconnected data. ESMPy now also interfaces with OpenClimateGIS (OCGIS), a package that performs subsetting, reformatting, and computational operations on climate datasets. ESMPy exposes a subset of ESMF grid remapping utilities, including bilinear, finite element patch recovery, first-order conservative, and nearest neighbor grid remapping methods. There are also options to ignore unmapped destination points, mask points on source and destination grids, and provide grid structure in the polar regions. Grid remapping on the sphere takes place in 3D Cartesian space, so the pole problem is not an issue as it can be with other grid remapping software. Remapping can be done between any combination of 2D and 3D logically rectangular and unstructured grids with overlapping domains. Grid pairs where one side of the regridding is represented by an appropriate set of unconnected data points, as is commonly found with observational data streams, are also supported. There is a developing interoperability layer between ESMPy and OCGIS. OCGIS is a pure-Python, open-source package designed for geospatial manipulation, subsetting, and computation on climate datasets stored in local NetCDF files or accessible remotely via the OPeNDAP protocol. Interfacing with OCGIS has brought GIS-like functionality to ESMPy (e.g. subsetting, coordinate transformations) as well as additional file output formats (e.g. CSV, ESRI Shapefile). ESMPy is distinguished by its strong emphasis on open source, community governance, and distributed development. The user base has grown quickly, and the package is integrating with several other software tools and frameworks, including the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), Iris, PyFerret, cf-python, and the Community Surface Dynamics Modeling System (CSDMS). ESMPy's minimum requirements are Python 2.6, NumPy 1.6.1, and an ESMF installation. Optional dependencies include NetCDF and the OCGIS-related dependencies GDAL, Shapely, and Fiona. ESMPy is regression tested nightly and supported on Darwin, Linux, and Cray systems with the GNU compiler suite and MPI communications. OCGIS is supported on Linux and also undergoes nightly regression testing. Both packages are installable from Anaconda channels. Upcoming development plans for ESMPy involve a higher-order conservative grid remapping method. Future OCGIS development will focus on mesh and location stream interoperability and streamlined access to ESMPy's MPI implementation.
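
    For readers unfamiliar with grid remapping, the sketch below shows the bare concept of bilinear remapping between two rectangular lat-lon grids using SciPy. This is not the ESMPy API; ESMPy additionally handles spherical geometry, masking, conservative methods, and parallel execution.

        # Conceptual bilinear remapping between two lat-lon grids (not ESMPy).
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        src_lat = np.linspace(-90.0, 90.0, 181)
        src_lon = np.linspace(0.0, 359.0, 360)
        field = np.cos(np.deg2rad(src_lat))[:, None] * np.ones((181, 360))

        interp = RegularGridInterpolator((src_lat, src_lon), field, method="linear")

        dst_lat, dst_lon = np.meshgrid(np.linspace(-89, 89, 90),
                                       np.linspace(0.5, 358.5, 180), indexing="ij")
        pts = np.stack([dst_lat.ravel(), dst_lon.ravel()], axis=-1)
        remapped = interp(pts).reshape(dst_lat.shape)
        print(remapped.shape)  # (90, 180) destination grid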

  19. A User's Guide to the Tsunami Datasets at NOAA's National Data Buoy Center

    NASA Astrophysics Data System (ADS)

    Bouchard, R. H.; O'Neil, K.; Grissom, K.; Garcia, M.; Bernard, L. J.; Kern, K. J.

    2013-12-01

    The National Data Buoy Center (NDBC) has maintained and operated the National Oceanic and Atmospheric Administration's (NOAA) tsunameter network since 2003. The tsunameters employ the NOAA-developed Deep-ocean Assessment and Reporting of Tsunamis (DART) technology, which measures pressure and temperature every 15 seconds on the ocean floor and transforms them into equivalent water-column height observations. A complex series of subsampled observations is transmitted acoustically in real time to a moored buoy or marine autonomous vehicle (MAV) at the ocean surface. The surface platform uses its satellite communications to relay the observations to NDBC. NDBC places the observations onto the Global Telecommunication System (GTS) for relay to NOAA's Tsunami Warning Centers (TWCs) in Hawai'i and Alaska and to the international community; the observations reach the TWCs from the ocean floor in less than three minutes. NDBC can retrieve limited amounts of the 15-s measurements from the instrumentation on the ocean floor using the technology's two-way communications. NDBC recovers the full-resolution 15-s measurements about every 2 years and forwards the datasets and metadata to the National Geophysical Data Center for permanent archive. Meanwhile, NDBC retains the real-time observations on its website. The type of real-time observation depends on the operating mode of the tsunameter. NDBC provides the observations in a variety of traditional and innovative methods and formats that include descriptors of the operating mode. Datasets, organized by station, are available from the NDBC website as text files and from the NDBC THREDDS server in netCDF format. The website provides alerts and lists of events that allow users to focus on the information relevant for tsunami hazard analysis. In addition, NDBC developed a basic web service to query station information and observations to support the Short-term Inundation Forecasting for Tsunamis (SIFT) model. NDBC and NOAA's Integrated Ocean Observing System have fielded the innovative Sensor Observation Service (SOS), which allows users to access observations by station or by groups of stations that have been organized into Features of Interest, such as the 2011 Honshu Tsunami. The user can elect to receive the SOS observations in several different formats, such as Sensor Web Enablement (SWE) or delimiter-separated values. Recently, NDBC's coastal and offshore buoys provided meteorological observations used in analyzing possible meteotsunamis on the U.S. East Coast; however, many of these observations are some distance away from the tsunameters. In a demonstration project, NDBC has added sensors to a tsunameter's surface buoy and a MAV to support program requirements for meteorological observations. All these observations are available from NDBC's website in text files, netCDF, and SOS. To aid users in obtaining information relevant to their applications, the presentation documents in detail the characteristics of the different types of real-time observations and the availability and organization of the resulting datasets at NDBC.

  20. Identifying influential user communities on the social network

    NASA Astrophysics Data System (ADS)

    Hu, Weishu; Gong, Zhiguo; Hou U, Leong; Guo, Jingzhi

    2015-10-01

    Nowadays, social network services are widely used in electronic commerce systems. Users on a social network develop different relationships based on their common interests and activities. To promote business, it is interesting to explore the hidden relationships among users that develop on the social network: such knowledge can be used to locate target users for different advertisements and to provide effective product recommendations. In this paper, we define and study a novel community detection problem: discovering the hidden community structure in large social networks based on common interests. We observe that users typically pay more attention to users who share similar interests, which enables partitioning users into different communities according to their common interests. We propose two algorithms to detect influential communities using common interests in large social networks efficiently and effectively. We conduct an experimental evaluation using a dataset from Epinions, which demonstrates that our method achieves a 4-11.8% accuracy improvement over the state-of-the-art method.

  1. Technical Note: Harmonizing met-ocean model data via standard web services within small research groups

    USGS Publications Warehouse

    Signell, Richard; Camossi, E.

    2016-01-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.

  2. Technical note: Harmonising metocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.; Camossi, Elena

    2016-05-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.

  3. Predicting new drug indications from network analysis

    NASA Astrophysics Data System (ADS)

    Mohd Ali, Yousoff Effendy; Kwa, Kiam Heong; Ratnavelu, Kurunathan

    This work adapts centrality measures commonly used in social network analysis to identify drugs with better positions in the drug-side effect network and the drug-indication network for the purpose of drug repositioning. Our basic hypothesis is that drugs with similar phenotypic profiles, such as side effects, may also share similar therapeutic properties based on related mechanisms of action, and vice versa. The networks were constructed from Side Effect Resource (SIDER) 4.1, which contains 1430 unique drugs with side effects and 1437 unique drugs with indications. Within the giant components of these networks, drugs were ranked by their centrality scores, whereby 18 prominent drugs from the drug-side effect network and 15 prominent drugs from the drug-indication network were identified. Indications and side effects of prominent drugs were deduced from the profiles of their neighbors in the networks and compared to existing clinical studies, while an optimum similarity threshold among drugs was sought. The threshold can then be used to predict indications and side effects of all drugs. Similarities between drugs were measured by the extent to which they share phenotypic profiles and neighbors. To improve the likelihood of accurate predictions, only profiles such as side effects of common or very common frequencies were considered. In summary, our work offers an alternative approach to drug repositioning using centrality measures commonly used for analyzing social networks.
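
    A toy sketch of the centrality-ranking step using networkx; the edge list is illustrative, not SIDER data. Drugs that bridge otherwise separate phenotype clusters receive the highest scores.

        # Rank drugs by betweenness centrality in a drug-side-effect graph.
        import networkx as nx

        edges = [("drugA", "nausea"), ("drugA", "rash"), ("drugB", "nausea"),
                 ("drugB", "headache"), ("drugC", "rash"), ("drugC", "headache")]
        G = nx.Graph(edges)

        centrality = nx.betweenness_centrality(G)
        drugs = {d for d, _ in edges}
        ranked = sorted(drugs, key=centrality.get, reverse=True)
        print(ranked)  # prominent drugs occupy bridging positions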

  4. Common cold outbreaks: A network theory approach

    NASA Astrophysics Data System (ADS)

    Vishkaie, Faranak Rajabi; Bakouie, Fatemeh; Gharibzadeh, Shahriar

    2014-11-01

    In this study, we first evaluated the structure of the social encounter network through which respiratory diseases can spread. We considered the common cold and recorded a sample of a human population and the actual encounters between its members. Our results show that the encounter network exhibits a high degree of clustering. In the second step, we evaluated the dynamics of disease spread with an SIR model by assigning a function to each node of the structural network. The rate of disease spread was observed to be inversely correlated with the characteristic path length; shortcuts therefore play a significant role in increasing the spread rate. We conclude that the dynamics of the social encounter network stand between the random and lattice extremes of the network spectrum. Although in this study we used the period of the common cold for the network dynamics, similar approaches may be useful for other airborne diseases such as SARS.
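
    A minimal discrete-time SIR simulation on a clustered network with shortcuts, in the spirit of the study; the Watts-Strogatz graph and the rates are illustrative stand-ins, not the recorded encounter data.

        # SIR dynamics on a small-world contact network.
        import random
        import networkx as nx

        random.seed(0)
        G = nx.watts_strogatz_graph(n=200, k=6, p=0.05)  # clustered, with shortcuts
        beta, gamma = 0.2, 0.1  # per-step infection / recovery probabilities

        state = {v: "S" for v in G}
        state[0] = "I"  # index case
        for step in range(100):
            nxt = dict(state)
            for v in G:
                if state[v] == "I":
                    if random.random() < gamma:
                        nxt[v] = "R"
                    for u in G.neighbors(v):
                        if state[u] == "S" and random.random() < beta:
                            nxt[u] = "I"
            state = nxt
        print(sum(s == "R" for s in state.values()), "recovered after 100 steps")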

  5. Network morphospace

    PubMed Central

    Avena-Koenigsberger, Andrea; Goñi, Joaquín; Solé, Ricard; Sporns, Olaf

    2015-01-01

    The structure of complex networks has attracted much attention in recent years. It has been noted that many real-world examples of networked systems share a set of common architectural features. This raises important questions about their origin, for example whether such network attributes reflect common design principles or constraints imposed by selectional forces that have shaped the evolution of network topology. Is it possible to place the many patterns and forms of complex networks into a common space that reveals their relations, and what are the main rules and driving forces that determine which positions in such a space are occupied by systems that have actually evolved? We suggest that these questions can be addressed by combining concepts from two currently relatively unconnected fields. One is theoretical morphology, which has conceptualized the relations between morphological traits defined by mathematical models of biological form. The second is network science, which provides numerous quantitative tools to measure and classify different patterns of local and global network architecture across disparate types of systems. Here, we explore a new theoretical concept that lies at the intersection between both fields, the ‘network morphospace’. Defined by axes that represent specific network traits, each point within such a space represents a location occupied by networks that share a set of common ‘morphological’ characteristics related to aspects of their connectivity. Mapping a network morphospace reveals the extent to which the space is filled by existing networks, thus allowing a distinction between actual and impossible designs and highlighting the generative potential of rules and constraints that pervade the evolution of complex systems. PMID:25540237

  6. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons such as inherent heterogeneity of environmental datasets, big dataset volume, complexity of data models used, syntactic and semantic differences that complicate creation and use of unified terminology, the development of environmental geodata access, processing and visualization services as well as client applications turns out to be quite a sophisticated task. According to general INSPIRE requirements to data visualization geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems as integrated geoportal applications are developed based on the SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report a complex web mapping system including GIS web client and corresponding OGC services for working with geospatial (NetCDF, PostGIS) dataset archive is presented. There are three basic tiers of the GIS web client in it: 1. Tier of geospatial metadata retrieved from central MySQL repository and represented in JSON format 2. Tier of JavaScript objects implementing methods handling: --- NetCDF metadata --- Task XML object for configuring user calculations, input and output formats --- OGC WMS/WFS cartographical services 3. Graphical user interface (GUI) tier representing JavaScript objects realizing web application business logic Metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc). The middleware tier of JavaScript objects implementing methods for handling geospatial metadata, task XML object, and WMS/WFS cartographical services interconnects metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of the calculation task running on the remote servers as well as working with WMS/WFS cartographical services including: obtaining the list of available layers, visualizing layers on the map, exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. Graphical user interface tier is based on the bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). GUI provides two basic capabilities for the end user: configuring the task XML object functionality and cartographical information visualizing. The web interface developed is similar to the interface of such popular desktop GIS applications, as uDIG, QuantumGIS etc. Web mapping system developed has shown its effectiveness in the process of solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.

  7. Integrating sea floor observatory data: the EMSO data infrastructure

    NASA Astrophysics Data System (ADS)

    Huber, Robert; Azzarone, Adriano; Carval, Thierry; Doumaz, Fawzi; Giovanetti, Gabriele; Marinaro, Giuditta; Rolin, Jean-Francois; Beranzoli, Laura; Waldmann, Christoph

    2013-04-01

    The European research infrastructure EMSO is a European network of fixed-point, deep-seafloor and water-column observatories deployed at key sites of the European continental margin and the Arctic. It aims to provide the technological and scientific framework for investigating environmental processes related to the interaction between the geosphere, biosphere, and hydrosphere, and for sustainable management through long-term monitoring with real-time data transmission. Since 2006, EMSO has been on the ESFRI (European Strategy Forum on Research Infrastructures) roadmap, and it entered its construction phase in 2012. Within this framework, EMSO is contributing to large infrastructure integration projects such as ENVRI and COOPEUS. The EMSO infrastructure is geographically distributed across key sites in European waters, spanning from the Arctic through the Atlantic and Mediterranean Sea to the Black Sea. It presently consists of thirteen sites identified by the scientific community according to their importance with respect to marine ecosystems, climate change, and marine geohazards. The data infrastructure for EMSO is being designed as a distributed system. Presently, EMSO data collected during experiments at each EMSO site are locally stored and organized in catalogues or relational databases run by the responsible regional EMSO nodes. Three major institutions and their data centers currently offer access to EMSO data: PANGAEA, INGV and IFREMER. In continuation of the IT activities performed during EMSO's twin project ESONET, EMSO is now implementing the ESONET data architecture within an operational EMSO data infrastructure. EMSO aims to be compliant with relevant marine initiatives such as MyOceans, EUROSITES, EuroARGO, SEADATANET and EMODNET, and to meet the requirements of international and interdisciplinary projects such as COOPEUS, ENVRI, EUDAT and iCORDI. A major focus is therefore set on standardization and interoperability of the EMSO data infrastructure. Besides common standards for metadata exchange such as OpenSearch and OAI-PMH, EMSO has chosen to implement core standards of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) suite, such as Catalogue Service for the Web (CSW), Sensor Observation Service (SOS) and Observations and Measurements (O&M). Further, strong integration efforts are currently under way to harmonize data formats (e.g. NetCDF) as well as the ontologies and terminologies used. The presentation will also inform users about the discovery and visualization procedures for the EMSO data presently available.

  8. Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node

    NASA Astrophysics Data System (ADS)

    Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten

    2016-04-01

    The Earth System Grid Federation (ESGF) aims to provide access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. ESGF users may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will continuously increase over the next 5 years. IPSL holds replicas of output from different global and regional climate models, observations, and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. To let scientists perform analyses of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). The PyWPS implementation of the Web Processing Service standard from the Open Geospatial Consortium (OGC), in the framework of the birdhouse software, is used. Processes can be run remotely through a web-based WPS client or a command-line tool. All calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they are downloaded and cached by a WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives, or NetCDF files. We present the architecture of WPS at IPSL along with processes for evaluating model performance, on-site diagnostics, and post-analysis processing of model output, e.g. regridding/interpolation/aggregation; ocgis (OpenClimateGIS)-based polygon subsetting of the data; average seasonal cycle, multimodel mean, and multimodel mean bias; calculation of climate indices with the icclim library (CERFACS); and atmospheric modes of variability. To evaluate the performance of any new model once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node, following the needs of the scientific community.
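
    From the client side, such an endpoint can be exercised with OWSLib's WPS client, sketched below; the URL, process identifier, and inputs are hypothetical placeholders, not the actual IPSL deployment.

        # Discover and invoke a WPS process with OWSLib (names hypothetical).
        from owslib.wps import WebProcessingService

        wps = WebProcessingService("http://example.org/wps")
        wps.getcapabilities()
        for p in wps.processes:
            print(p.identifier, "-", p.title)  # list the offered processes

        execution = wps.execute(
            "subset_polygon",  # hypothetical process identifier
            inputs=[("dataset", "tas_MERRA2"),
                    ("region", "POLYGON((0 40, 10 40, 10 50, 0 50, 0 40))")])
        print(execution.status)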

  9. Common neighbour structure and similarity intensity in complex networks

    NASA Astrophysics Data System (ADS)

    Hou, Lei; Liu, Kecheng

    2017-10-01

    Complex systems as networks always exhibit strong regularities, implying underlying mechanisms governing their evolution. In addition to the degree preference, similarity has been argued to be another driver for networks. Assuming a network is randomly organised without similarity preference, the present paper studies the expected number of common neighbours between vertices. A symmetrical similarity index is accordingly developed by removing this expected number from the observed common neighbours. The developed index can describe not only the similarities between vertices but also the dissimilarities. We further apply the proposed index to measure the influence of similarity on the wiring patterns of networks. Fifteen empirical networks as well as artificial networks are examined in terms of similarity intensity and degree heterogeneity. Results on real networks indicate that social networks are strongly governed by similarity as well as degree preference, while biological networks and infrastructure networks show no apparent similarity governance. In particular, classical network models, such as the Barabási-Albert model, the Erdös-Rényi model and the ring lattice, cannot well describe social networks in terms of degree heterogeneity and similarity intensity. The findings may shed some light on the modelling and link prediction of different classes of networks.
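
    A sketch of the comparison the index is built on: the observed common neighbours of a vertex pair versus the count expected under random wiring, here estimated empirically by degree-preserving rewiring rather than the paper's closed-form expectation.

        # Observed vs. null-model common neighbours for one vertex pair.
        import networkx as nx

        G = nx.karate_club_graph()
        u, v = 0, 33
        obs = len(list(nx.common_neighbors(G, u, v)))

        null_counts = []
        for seed in range(200):
            R = G.copy()  # degree-preserving randomisation via edge swaps
            nx.double_edge_swap(R, nswap=10 * R.number_of_edges(),
                                max_tries=10**5, seed=seed)
            null_counts.append(len(list(nx.common_neighbors(R, u, v))))

        expected = sum(null_counts) / len(null_counts)
        print("observed:", obs, "expected:", expected, "excess:", obs - expected)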

  10. Common quandaries and their practical solutions in Bayesian network modeling

    Treesearch

    Bruce G. Marcot

    2017-01-01

    Use and popularity of Bayesian network (BN) modeling has greatly expanded in recent years, but many common problems remain. Here, I summarize key problems in BN model construction and interpretation, along with suggested practical solutions. Problems in BN model construction include parameterizing probability values, variable definition, complex network structures,...

  11. Asynchronous reference frame agreement in a quantum network

    NASA Astrophysics Data System (ADS)

    Islam, Tanvirul; Wehner, Stephanie

    2016-03-01

    An efficient implementation of many multiparty protocols for quantum networks requires that all the nodes in the network share a common reference frame. Establishing such a reference frame from scratch is especially challenging in an asynchronous network where network links might have arbitrary delays and the nodes do not share synchronised clocks. In this work, we study the problem of establishing a common reference frame in an asynchronous network of n nodes of which at most t are affected by arbitrary unknown error, and the identities of the faulty nodes are not known. We present a protocol that allows all the correctly functioning nodes to agree on a common reference frame as long as the network graph is complete and not more than t < n/4 nodes are faulty. As the protocol is asynchronous, it can be used with some assumptions to synchronise clocks over a network. Also, the protocol has the appealing property that it allows any existing two-node asynchronous protocol for reference frame agreement to be lifted to a robust protocol for an asynchronous quantum network.

  12. Navigable networks as Nash equilibria of navigation games.

    PubMed

    Gulyás, András; Bíró, József J; Kőrösi, Attila; Rétvári, Gábor; Krioukov, Dmitri

    2015-07-03

    Common sense suggests that networks are not random mazes of purposeless connections, but that these connections are organized so that networks can perform their functions well. One function common to many networks is targeted transport or navigation. Here, using game theory, we show that minimalistic networks designed to maximize the navigation efficiency at minimal cost share basic structural properties with real networks. These idealistic networks are Nash equilibria of a network construction game whose purpose is to find an optimal trade-off between the network cost and navigability. We show that these skeletons are present in the Internet, metabolic, English word, US airport, Hungarian road networks, and in a structural network of the human brain. The knowledge of these skeletons allows one to identify the minimal number of edges, by altering which one can efficiently improve or paralyse navigation in the network.

  13. Network Analysis Reveals a Common Host-Pathogen Interaction Pattern in Arabidopsis Immune Responses.

    PubMed

    Li, Hong; Zhou, Yuan; Zhang, Ziding

    2017-01-01

    Many plant pathogens secrete virulence effectors into host cells to target important proteins in the host cellular network. However, the dynamic interactions between effectors and the host cellular network have not been fully understood. Here, an integrative network analysis was conducted by combining the Arabidopsis thaliana protein-protein interaction network, known targets of Pseudomonas syringae and Hyaloperonospora arabidopsidis effectors, and gene expression profiles in the immune response. In particular, we focused on the characteristic network topology of the effector targets and differentially expressed genes (DEGs). We found that effectors tended to manipulate key network positions with higher betweenness centrality. The effector targets, especially those that are common targets of an individual effector, tended to be clustered together in the network. Moreover, the distances between the effector targets and DEGs increased over time during infection. In line with this observation, pathogen-susceptible mutants tended to have more DEGs surrounding the effector targets compared with resistant mutants. Our results suggest a common plant-pathogen interaction pattern at the cellular network level, where pathogens employ a potent local-impact mode to interfere with key positions in the host network, and the plant organizes an in-depth defense by sequentially activating genes distal to the effector targets.
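
    To make the betweenness-centrality comparison concrete, here is a small hedged sketch on a toy graph with a hypothetical target list (not the Arabidopsis data): it contrasts the mean betweenness of the target set with the network-wide mean.

        import networkx as nx

        G = nx.barabasi_albert_graph(200, 3, seed=1)  # stand-in for a PPI network
        targets = [0, 1, 2, 5]                        # hypothetical effector targets

        bc = nx.betweenness_centrality(G)
        target_mean = sum(bc[n] for n in targets) / len(targets)
        overall_mean = sum(bc.values()) / len(bc)
        print(f"targets: {target_mean:.4f}  network: {overall_mean:.4f}")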

  14. Customer-oriented Data Formats and Services for Global Land Data Assimilation System (GLDAS) Products at the NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Fang, H.; Kato, H.; Rodell, M.; Teng, W. L.; Vollmer, B. E.

    2008-12-01

    The Global Land Data Assimilation System (GLDAS) has been generating a series of land surface state (e.g., soil moisture and surface temperature) and flux (e.g., evaporation and sensible heat flux) products, simulated by four land surface models (CLM, Mosaic, Noah and VIC). These products are now accessible at the Hydrology Data and Information Services Center (HDISC), a component of the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Current GLDAS data hosted at HDISC include a set of 1.0° data products, covering 1979 to the present, from the four models and a 0.25° data product, covering 2000 to the present, from the Noah model. In addition to basic anonymous ftp data downloading, users can avail themselves of several advanced data search and downloading services, such as Mirador and OPeNDAP. Mirador is a Google-based search tool that provides keyword searching and on-the-fly spatial and parameter subsetting of selected data. OPeNDAP (Open-source Project for a Network Data Access Protocol) enables remote OPeNDAP clients to access OPeNDAP-served data regardless of local storage format. Additional data services to be available in the near future from HDISC include (1) an on-the-fly converter of GLDAS data to NetCDF and binary formats; (2) temporal aggregation of GLDAS files; and (3) Giovanni, an online visualization and analysis tool that provides a simple way to visualize, analyze, and access vast amounts of data without having to download the data.
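
    OPeNDAP access of the kind offered by HDISC can be scripted directly against the netCDF API, with subsetting performed on the server. A hedged sketch follows; the URL and variable name are placeholders (the real GES DISC endpoint requires Earthdata login), and a netCDF4 build with DAP support is assumed.

        from netCDF4 import Dataset

        # Placeholder OPeNDAP URL and hypothetical GLDAS variable name.
        url = 'https://example.gesdisc.nasa.gov/opendap/GLDAS_NOAH025_M.nc'
        ds = Dataset(url)                  # netCDF4 opens OPeNDAP URLs like files
        soil = ds.variables['SoilMoi0_10cm_inst']
        patch = soil[0, 100:140, 200:260]  # only this slice crosses the network
        print(patch.shape)
        ds.close()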

  15. A new CM SAF Solar Surface Radiation Climate Data Set derived from Meteosat Satellite Observations

    NASA Astrophysics Data System (ADS)

    Trentmann, J.; Mueller, R. W.; Pfeifroth, U.; Träger-Chatterjee, C.; Cremer, R.

    2014-12-01

    The incoming surface solar radiation has been defined as an essential climate variable by GCOS. It is mandatory to monitor this part of the earth's energy balance, and thus gain insight into the state and variability of the climate system. In addition, data sets of the surface solar radiation have received increased attention over recent years as an important source of information for the planning of solar energy applications. The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) derives surface solar radiation from geostationary and polar-orbiting satellite instruments. While CM SAF focuses on the generation of high-quality long-term climate data records, data are also provided operationally with a short latency of 8 weeks. Here we present SARAH (Solar Surface Radiation Dataset - Heliosat), the new CM SAF solar surface radiation data set based on Meteosat satellite observations. SARAH provides instantaneous, daily- and monthly-averaged data of the effective cloud albedo (CAL), the direct normal irradiance (DNI) and the solar irradiance (SIS) from 1983 to 2013 for the full view of the Meteosat satellite (i.e., Europe, Africa, parts of South America, and the Atlantic Ocean). The data sets are generated with a high spatial resolution of 0.05 deg, allowing for detailed regional studies, and are available in NetCDF format at no cost and without restrictions at www.cmsaf.eu. We provide an overview of the data sets, including a validation against reference measurements from the BSRN and GEBA surface station networks.

  16. Navigable networks as Nash equilibria of navigation games

    PubMed Central

    Gulyás, András; Bíró, József J.; Kőrösi, Attila; Rétvári, Gábor; Krioukov, Dmitri

    2015-01-01

    Common sense suggests that networks are not random mazes of purposeless connections, but that these connections are organized so that networks can perform their functions well. One function common to many networks is targeted transport or navigation. Here, using game theory, we show that minimalistic networks designed to maximize the navigation efficiency at minimal cost share basic structural properties with real networks. These idealistic networks are Nash equilibria of a network construction game whose purpose is to find an optimal trade-off between the network cost and navigability. We show that these skeletons are present in the Internet, metabolic, English word, US airport, Hungarian road networks, and in a structural network of the human brain. The knowledge of these skeletons allows one to identify the minimal number of edges, by altering which one can efficiently improve or paralyse navigation in the network. PMID:26138277

  17. GOME/ERS-2: New Homogeneous Level 1B Data from an Old Instrument

    NASA Astrophysics Data System (ADS)

    Slijkhuis, S.; Aberle, B.; Coldewey-Egbers, M.; Loyola, D.; Dehn, A.; Fehr, T.

    2015-11-01

    In the framework of ESA's "GOME Evolution Project", a reprocessing will be made of the entire 16-year GOME Level 1 dataset. The GOME Evolution Project further includes the generation of a new GOME water vapour product and a public outreach programme. In this paper we describe the reprocessing of the Level 1 data, carried out with the latest version of the GOME Data Processor at DLR. The change most visible to the user will be the new product format in NetCDF, plus supporting documentation (ATBD and PUM). Full-mission reprocessed L1b data are expected to be released in the 4th quarter of 2015.

  18. Web-based CERES Clouds QC Property Viewing Tool

    NASA Astrophysics Data System (ADS)

    Smith, R. A.; Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Minnis, P.

    2014-12-01

    This presentation will display the capabilities of a web-based CERES cloud property viewer. Terra data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Users can manipulate images to narrow the map boundaries, adjust color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool. A laptop will hopefully be available to allow conference attendees to try navigating the tool.
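
    For groups considering moving their data into NetCDF for use with such a viewer, a minimal hedged sketch of writing a gridded field with the netCDF4 Python library follows; the dimension and variable names are illustrative, since the viewer's exact conventions are not specified here.

        import numpy as np
        from netCDF4 import Dataset

        with Dataset('cloud_props.nc', 'w') as ds:
            ds.createDimension('lat', 180)
            ds.createDimension('lon', 360)
            lat = ds.createVariable('lat', 'f4', ('lat',))
            lon = ds.createVariable('lon', 'f4', ('lon',))
            cf = ds.createVariable('cloud_fraction', 'f4', ('lat', 'lon'))
            lat[:] = np.arange(-89.5, 90.0, 1.0)
            lon[:] = np.arange(-179.5, 180.0, 1.0)
            cf[:] = np.random.rand(180, 360)  # placeholder gridded field
            cf.units = '1'
            cf.long_name = 'cloud fraction'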

  19. Co-authorship network analysis in health research: method and potential use.

    PubMed

    Fonseca, Bruna de Paula Fonseca E; Sampaio, Ricardo Barros; Fonseca, Marcus Vinicius de Araújo; Zicker, Fabio

    2016-04-30

    Scientific collaboration networks are a hallmark of contemporary academic research. Researchers are no longer independent players, but members of teams that bring together complementary skills and multidisciplinary approaches around common goals. Social network analysis and co-authorship networks are increasingly used as powerful tools to assess collaboration trends and to identify leading scientists and organizations. The analysis reveals the social structure of the networks by identifying actors and their connections. This article reviews the method and potential applications of co-authorship network analysis in health. The basic steps for conducting co-authorship studies in health research are described and common network metrics are presented. The application of the method is exemplified by an overview of the global research network for Chikungunya virus vaccines.
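
    The basic construction step, building a weighted co-authorship graph from author lists and computing common metrics, can be sketched with networkx; the author data below are toy values.

        import itertools
        import networkx as nx

        papers = [  # each paper is its list of authors (toy data)
            ['Silva A', 'Costa B', 'Oliveira C'],
            ['Silva A', 'Costa B'],
            ['Oliveira C', 'Pereira D'],
        ]

        G = nx.Graph()
        for authors in papers:
            for a, b in itertools.combinations(sorted(authors), 2):
                if G.has_edge(a, b):
                    G[a][b]['weight'] += 1  # weight counts co-authored papers
                else:
                    G.add_edge(a, b, weight=1)

        # Degree centrality identifies leading collaborators; density summarizes cohesion.
        print(nx.degree_centrality(G))
        print(nx.density(G))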

  20. Networked Governance in Three Policy Areas with Implications for the Common Core State Standards Initiative

    ERIC Educational Resources Information Center

    Manna, Paul

    2010-01-01

    Policy makers and researchers now recognize that designing effective institutions to govern policy networks is a major challenge of the 21st Century. Presently, the Common Core State Standards Initiative resembles an emerging network of organizations united around the goal of developing clear and challenging academic expectations for students in…

  1. Common solutions for power, communication and robustness in operations of large measurement networks within Research Infrastructures

    NASA Astrophysics Data System (ADS)

    Huber, Robert; Beranzoli, Laura; Fiebig, Markus; Gilbert, Olivier; Laj, Paolo; Mazzola, Mauro; Paris, Jean-Daniel; Pedersen, Helle; Stocker, Markus; Vitale, Vito; Waldmann, Christoph

    2017-04-01

    European Environmental Research Infrastructures (RIs) frequently comprise in situ observatories, from large-scale networks of platforms or sites to local networks of various sensors. Network operation is usually a cumbersome aspect of these RIs, facing specific technological problems related to operation in remote areas, maintenance of the network, transmission of observation values, etc. Robust inter-connection within and across these networks is still in its infancy, and the burden increases with the remoteness of the station, the harshness of environmental conditions, and the unavailability of classic communication systems, which is a common feature here. Although existing RIs have developed ad hoc solutions to overcome specific problems and innovative technologies are becoming available, no common approach yet exists. Within the European project ENVRIplus, a dedicated work package aims to stimulate common network operation technologies and approaches in terms of power supply and storage, robustness, and data transmission. Major objectives of this task are to review existing technologies and RI requirements, propose innovative solutions and evaluate the standardization potential prior to wider deployment across networks. Focus areas within these efforts are improving energy production and storage units, testing the robustness of RI equipment under extreme conditions, and methodologies for robust data transmission. We will introduce current project activities, which are coordinated at various levels including the engineering as well as the data management perspective, and explain how environmental RIs can benefit from the developments.

  2. An information model for a virtual private optical network (OVPN) using virtual routers (VRs)

    NASA Astrophysics Data System (ADS)

    Vo, Viet Minh Nhat

    2002-05-01

    This paper describes a virtual private optical network (Optical VPN - OVPN) architecture based on virtual routers (VRs). It improves on architectures suggested for virtual private networks by using virtual routers with optical networks. What is new in this architecture are the changes necessary to adapt to the devices and protocols used in optical networks. This paper also presents information models for the OVPN at the architecture level and at the service level. These are extensions of DEN (Directory Enabled Networks) and CIM (Common Information Model) for OVPNs using VRs. The goal is to propose a common management model using policies.

  3. Sharing, Privacy and Trust in Our Networked World. A Report to the OCLC Membership

    ERIC Educational Resources Information Center

    Storey, Tom, Ed.

    2007-01-01

    The practice of using a social network to establish and enhance relationships based on some common ground--shared interests, related skills, or a common geographic location--is as old as human societies, but social networking has flourished due to the ease of connecting on the Web. This OCLC membership report explores this web of social…

  4. NASA Integrated Space Communications Network

    NASA Technical Reports Server (NTRS)

    Tai, Wallace; Wright, Nate; Prior, Mike; Bhasin, Kul

    2012-01-01

    The NASA Integrated Network for Space Communications and Navigation (SCaN) has been in the definition phase since 2010. It is intended to integrate NASA's three existing network elements, i.e., the Space Network, Near Earth Network, and Deep Space Network, into a single network. In addition to the technical merits, the primary purpose of the Integrated Network is to achieve a level of operating cost efficiency significantly higher than today's. Salient features of the Integrated Network include (a) a central system element that performs service management functions and user mission interfaces for service requests; (b) a set of common service execution equipment deployed at all stations that provides return, forward, and radiometric data processing and delivery capabilities; (c) network monitor and control operations for the entire integrated network conducted remotely and centrally at a prime-shift site, rotating among three sites globally (a follow-the-sun approach); and (d) common network monitor and control software deployed at all three network elements that supports the follow-the-sun operations.

  5. Revealing how network structure affects accuracy of link prediction

    NASA Astrophysics Data System (ADS)

    Yang, Jin-Xuan; Zhang, Xiao-Dong

    2017-08-01

    Link prediction plays an important role in network reconstruction and network evolution. How the network structure affects the accuracy of link prediction is an interesting problem. In this paper we use common neighbors and the Gini coefficient to reveal the relation between them, which can provide a good reference for choosing a suitable link prediction algorithm according to the network structure. Moreover, the statistical analysis reveals correlations between the common neighbors index, the Gini coefficient index and other indices describing the network structure, such as Laplacian eigenvalues, clustering coefficient, degree heterogeneity, and assortativity of the network. Furthermore, a new method to predict missing links is proposed. The experimental results show that the proposed algorithm yields better prediction accuracy and robustness to the network structure than currently used methods for a variety of real-world networks.
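
    Two of the ingredients named here are easy to compute directly: a common-neighbour score for candidate links and the Gini coefficient of the degree distribution as a heterogeneity measure. A hedged sketch (the paper's full method is not reproduced):

        import networkx as nx
        import numpy as np

        def gini(values):
            """Gini coefficient of a value distribution (0 = perfectly even)."""
            x = np.sort(np.asarray(values, dtype=float))
            n = len(x)
            return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

        G = nx.karate_club_graph()
        print('degree Gini:', round(gini([d for _, d in G.degree()]), 3))

        # Common-neighbour link prediction: score all non-edges, show the top 5.
        scores = [(u, v, len(list(nx.common_neighbors(G, u, v))))
                  for u, v in nx.non_edges(G)]
        print(sorted(scores, key=lambda t: -t[2])[:5])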

  6. NDEx: A Community Resource for Sharing and Publishing of Biological Networks.

    PubMed

    Pillich, Rudolf T; Chen, Jing; Rynkov, Vladimir; Welker, David; Pratt, Dexter

    2017-01-01

    Networks are a powerful and flexible paradigm that facilitate communication and computation about interactions of any type, whether social, economic, or biological. NDEx, the Network Data Exchange, is an online commons to enable new modes of collaboration and publication using biological networks. NDEx creates an access point and interface to a broad range of networks, whether they express molecular interactions, curated relationships from literature, or the outputs of systematic analysis of big data. Research organizations can use NDEx as a distribution channel for networks they generate or curate. Developers of bioinformatic applications can store and query NDEx networks via a common programmatic interface. NDEx can also facilitate the integration of networks as data in electronic publications, thus making a step toward an ecosystem in which networks bearing data, hypotheses, and findings flow seamlessly between scientists.
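
    The programmatic interface can be exercised from Python with the ndex2 client package. The sketch below is an assumption-laden illustration: the UUID is a placeholder, and the call names should be verified against the current ndex2 documentation.

        import ndex2

        # Placeholder UUID; any public NDEx network UUID would work here.
        uuid = '00000000-0000-0000-0000-000000000000'
        net = ndex2.create_nice_cx_from_server(server='public.ndexbio.org',
                                               uuid=uuid)
        print(net.get_name(), len(net.get_nodes()))  # name and node count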

  7. The Earth Information Exchange: A Portal for Earth Science From the ESIP Federation

    NASA Astrophysics Data System (ADS)

    Wertz, R.; Hutchinson, C.; Hardin, D.

    2006-12-01

    The Federation of Earth Science Information Partners is a unique consortium of more than 90 organizations that collect, interpret and develop applications for remotely sensed Earth observation information. Included in the ESIP network are NASA, NOAA and USGS data centers, research universities, government research laboratories, supercomputer facilities, education resource providers, information technology innovators, nonprofit organizations and commercial enterprises. The consortium's work is dedicated to providing the most up-to-date, science-based information to researchers and decision-makers who are working to understand and address the environmental, economic and social challenges facing our planet. By increasing the use and usability of Earth observation data and linking it with decision-making tools, the Federation partners leverage the value of these important data resources for the betterment of society and our planet. To further the dissemination of Earth science data, the Federation is developing the Earth Information Exchange (EIE). The EIE is a portal that will provide access to the vast information holdings of the member organizations in one web-based location and will provide a robust marketplace in which the products and services needed to use and understand this information can be readily acquired. Since the Federation membership includes the federal government's Earth observing data centers, we believe that the impact of the EIE on Earth science research and education and on environmental policy making will be profound. In the EIE, Earth observation data, products and services are organized by the societal benefit categories defined by the international working group developing the Global Earth Observation System of Systems (GEOSS). The quality of the information is ensured in each of the Exchange's issue areas by maintaining working groups of issue-area researchers and practitioners who serve as stewards for their respective communities. The current working groups focus on the issues of Air Quality, Coastal Management, Disaster Management, Ecological Forecasting, Public Health, and Water Management. Initially, the Exchange will be linked to USGS's Geospatial One Stop portal, NASA's Earth Science Gateway, the Global Change Master Directory (GCMD) and the EOS ClearingHOuse (ECHO). The Earth Information Exchange will be an integrated system of distributed components that work together to expedite the process of Earth science and to increase the effective application of its results to benefit the public. Specifically, the EIE is designed to provide a comprehensive inventory of Earth observation metadata organized by GEOSS and other commonly used issue-area categories; to provide researchers, educators and policy makers with ready access to metadata over the web via URLs; to provide researchers with access to data in common scientific data formats such as netCDF and HDF-EOS and common scientific data models such as swath, point and grid; to provide policy makers and others with an e-commerce marketplace where advanced data products (analysis tools, models, simulations, decision support products) can be found and acquired; and to provide researchers, educators and policy makers with a broad inventory of the human resources associated with the Federation and its partners.

  8. An Innovative Open Data-driven Approach for Improved Interpretation of Coverage Data at NASA JPL's PO.DAAC

    NASA Astrophysics Data System (ADS)

    McGibbney, L. J.; Armstrong, E. M.

    2016-12-01

    Figuratively speaking, scientific datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth sciences, the most notable general examples include the HDF family, NetCDF, etc., with other formats such as GRIB being used pervasively within specific domains such as the oceanographic, atmospheric and meteorological sciences. Such file formats contain coverage data, e.g. a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly around visualization and interactive use within web clients), is that working with these data is still not straightforward, due to size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g. the OPeNDAP protocol). Therefore alternative, Web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way which fits the linked-data publication paradigm, hence lowering the barrier to interpretation by consumers via mobile devices, client applications, etc., as well as for data producers, who can build next-generation Web-friendly Web services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data, embedded within SD landing pages as well as via semantic data repositories. We are currently evaluating how the use of CoverageJSON within SD landing pages responds to the long-standing acknowledgement that SD producers do not currently optimize the content of their SD landing pages for crawlability by commercial search engines.

  9. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC) the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA) the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other things, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.

  10. Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Riechert, Maik; Blower, Jon; Griffiths, Guy

    2016-04-01

    Coverage data, typically big in volume, assign values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Relevant factors here are the processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways of working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies like JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, next to having simple rendered images available using standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. Development has also focused on making it a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web, together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including reclassification of a land cover dataset client-side within the browser, with the ability for the user to influence the reclassification result, by making use of the above technologies.
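
    To give a sense of the format, below is a minimal grid coverage assembled in Python and serialized to JSON. It is a hedged, simplified reading of the public CoverageJSON draft; field names follow that draft, and all values are illustrative.

        import json

        coverage = {
            "type": "Coverage",
            "domain": {
                "type": "Domain",
                "domainType": "Grid",
                "axes": {
                    "x": {"values": [-10.0, -9.95, -9.9]},
                    "y": {"values": [50.0, 50.05]},
                    "t": {"values": ["2013-01-01T00:00:00Z"]},
                },
            },
            "parameters": {
                "SST": {
                    "type": "Parameter",
                    "unit": {"symbol": "K"},
                    "observedProperty": {"label": {"en": "Sea surface temperature"}},
                }
            },
            "ranges": {
                "SST": {
                    "type": "NdArray",
                    "dataType": "float",
                    "axisNames": ["t", "y", "x"],
                    "shape": [1, 2, 3],
                    "values": [285.1, 285.3, 285.2, 284.9, 285.0, 285.4],
                }
            },
        }
        print(json.dumps(coverage, indent=2)[:200])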

  11. Rescue, Archival and Discovery of Tsunami Events on Marigrams

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; Wright, L. M.; Stroker, K. J.; Sweeney, A.; Lancaster, M.

    2017-12-01

    The Big Earth Data Initiative made possible the reformatting of paper marigram records on which were recorded measurements of the 1946, 1952, 1960, and 1964 tsunamis generated in the Pacific Ocean. The data contained within each record were determined to be invaluable for tsunami researchers and operational agencies with a responsibility for issuing warnings during a tsunami event. All marigrams were carefully digitized, and metadata were generated, to form numerical datasets that provide the tsunami and other research and application-driven communities with quality data. Data were then packaged as CF-compliant NetCDF data files and submitted to the NOAA Centers for Environmental Information for long-term stewardship, archival, and public discovery of both the original scanned images and the data in digital NetCDF and CSV formats. PNG plots of each time series were generated and included with the data packages to provide a visual representation of the numerical data sets. ISO-compliant metadata were compiled for the collection at the event level, and individual DOIs were minted for each of the four events included in this project. The procedure followed to reformat each record in this four-event subset of the larger NCEI scanned marigram inventory is presented and discussed. The practical use of these data is presented to highlight that even infrequent measurements of tsunamis hold information that may help constrain earthquake rupture area, provide estimates of earthquake co-seismic slip distribution, identify subsidence or uplift, and significantly increase the holdings of in situ data available for tsunami model validation. These same data may also prove valuable to the broader global tide community for validation and further development of tide models and for investigation into the stability of tidal harmonic constants. Data reformatted as part of this project are PARR compliant and meet the requirements for data management, discoverability, accessibility, documentation, readability, and data preservation and stewardship as per the Big Earth Data Initiative.
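
    Writing a digitized marigram series as CF-compliant NetCDF can be sketched with the netCDF4 Python library. The layout below is illustrative only; the actual NCEI archive conventions, attribute choices and CF standard names should be checked against the published files.

        from datetime import datetime, timedelta
        import numpy as np
        from netCDF4 import Dataset, date2num

        with Dataset('marigram_1964_example.nc', 'w') as ds:
            ds.Conventions = 'CF-1.6'
            ds.title = 'Digitized marigram, 1964 tsunami (illustrative)'
            ds.createDimension('time', None)
            t = ds.createVariable('time', 'f8', ('time',))
            t.units = 'seconds since 1964-03-28 00:00:00'
            t.standard_name = 'time'
            h = ds.createVariable('sea_surface_height', 'f4', ('time',))
            h.units = 'm'
            # Assumed CF standard name; verify against the CF standard-name table.
            h.standard_name = 'sea_surface_height_above_mean_sea_level'
            times = [datetime(1964, 3, 28) + timedelta(minutes=m) for m in range(4)]
            t[:] = date2num(times, t.units)
            h[:] = np.array([0.02, 0.15, -0.10, 0.08])  # toy digitized heights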

  12. SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data

    NASA Astrophysics Data System (ADS)

    Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework that extends Apache Spark for scaling scientific computations. Apache Spark improves on the map-reduce implementation in Apache Hadoop for parallel computing on a cluster by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show the usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries, and we evaluate the performance of various matrix libraries, such as Nd4j and Breeze, in distributed pipelines. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, and parallel ingest and partitioning (sharding) of A-Train satellite observations from model grids. These solutions are encompassed in SciSpark, an open-source software framework for distributed computing on scientific data.
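
    SciSpark's sRDD itself is implemented in Scala, but the partitioning idea can be sketched in PySpark: distribute a list of time-partitioned NetCDF files across the cluster and load each into an array. The file names and the variable name below are illustrative.

        from pyspark import SparkContext
        from netCDF4 import Dataset
        import numpy as np

        def load_var(path, varname='precipitation'):
            """Read one time-partitioned NetCDF file into a NumPy array."""
            with Dataset(path) as ds:
                return path, np.array(ds.variables[varname][:])

        sc = SparkContext(appName='netcdf-partitions')
        paths = ['merg_2006091100.nc', 'merg_2006091101.nc']  # illustrative files
        grids = sc.parallelize(paths).map(load_var)           # one record per file
        # Example distributed operation: per-file spatial mean.
        print(grids.mapValues(lambda a: float(a.mean())).collect())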

  13. The C. elegans Connectome Consists of Homogenous Circuits with Defined Functional Roles

    PubMed Central

    Azulay, Aharon; Zaslaver, Alon

    2016-01-01

    A major goal of systems neuroscience is to decipher the structure-function relationship in neural networks. Here we study network functionality in light of the common-neighbor-rule (CNR) in which a pair of neurons is more likely to be connected the more common neighbors it shares. Focusing on the fully-mapped neural network of C. elegans worms, we establish that the CNR is an emerging property in this connectome. Moreover, sets of common neighbors form homogenous structures that appear in defined layers of the network. Simulations of signal propagation reveal their potential functional roles: signal amplification and short-term memory at the sensory/inter-neuron layer, and synchronized activity at the motoneuron layer supporting coordinated movement. A coarse-grained view of the neural network based on homogenous connected sets alone reveals a simple modular network architecture that is intuitive to understand. These findings provide a novel framework for analyzing larger, more complex, connectomes once these become available. PMID:27606684

  14. A comparative study of 11 local health department organizational networks.

    PubMed

    Merrill, Jacqueline; Keeling, Jonathan W; Carley, Kathleen M

    2010-01-01

    Although the nation's local health departments (LHDs) share a common mission, variability in administrative structures is a barrier to identifying common, optimal management strategies. There is a gap in understanding what unifying features LHDs share as organizations that could be leveraged systematically for achieving high performance. To explore sources of commonality and variability in a range of LHDs by comparing intraorganizational networks. We used organizational network analysis to document relationships between employees, tasks, knowledge, and resources within LHDs, which may exist regardless of formal administrative structure. A national sample of 11 LHDs from seven states that differed in size, geographic location, and governance. Relational network data were collected via an on-line survey of all employees in 11 LHDs. A total of 1062 out of 1239 employees responded (84% response rate). Network measurements were compared using coefficient of variation. Measurements were correlated with scores from the National Public Health Performance Assessment and with LHD demographics. Rankings of tasks, knowledge, and resources were correlated across pairs of LHDs. We found that 11 LHDs exhibited compound organizational structures in which centralized hierarchies were coupled with distributed networks at the point of service. Local health departments were distinguished from random networks by a pattern of high centralization and clustering. Network measurements were positively associated with performance for 3 of 10 essential services (r > 0.65). Patterns in the measurements suggest how LHDs adapt to the population served. Shared network patterns across LHDs suggest where common organizational management strategies are feasible. This evidence supports national efforts to promote uniform standards for service delivery to diverse populations.

  15. Systematic identification of an integrative network module during senescence from time-series gene expression.

    PubMed

    Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul

    2017-03-15

    Cellular senescence irreversibly arrests the growth of human diploid cells. In addition, recent studies have indicated that senescence is a multi-step evolving process related to important complex biological processes. Most studies analyzed only the genes and their functions representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism, inferred from the affected genes and their interactions, underlying the senescence process. We suggest a novel computational approach to identify an integrative network which profiles an underlying genotypic signature from time-series gene expression data. Relatively perturbed genes were selected for each time point based on a proposed scoring measure termed the perturbation score. The selected genes were then integrated with protein-protein interactions to construct a network specific to each time point. From these networks, the edges conserved across time points were extracted to form the common network, and a statistical test was performed to demonstrate that this network could explain the phenotypic alteration. As a result, it was confirmed that the difference in the average perturbation scores of the common network between the two time points could explain the phenotypic alteration. We also performed functional enrichment on the common network and identified a high association with the phenotypic alteration. Remarkably, we observed that the identified cell-cycle-specific common network plays an important role in replicative senescence as a key regulator. Heretofore, network analysis of time-series gene expression data has focused on how topological structure changes from one time point to the next. Conversely, we focused on the structure that is conserved while its context changes over time, and showed that it can explain the phenotypic changes. We expect that the proposed method will help to elucidate biological mechanisms unrevealed by existing approaches.
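
    The conserved-edge step can be sketched with networkx: build a network per time point and keep only the edges present at both. The gene names below are toy values, not the study's data.

        import networkx as nx

        # Hypothetical time-point-specific networks of perturbed genes
        # overlaid on protein-protein interactions.
        net_t1 = nx.Graph([('TP53', 'MDM2'), ('CDKN1A', 'CDK2'), ('RB1', 'E2F1')])
        net_t2 = nx.Graph([('TP53', 'MDM2'), ('RB1', 'E2F1'), ('CCNE1', 'CDK2')])

        edges_t1 = {frozenset(e) for e in net_t1.edges()}
        edges_t2 = {frozenset(e) for e in net_t2.edges()}
        common = nx.Graph(tuple(e) for e in edges_t1 & edges_t2)
        print(sorted(tuple(sorted(e)) for e in common.edges()))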

  16. Development of web-GIS system for analysis of georeferenced geophysical data

    NASA Astrophysics Data System (ADS)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.; Bogomolov, V. Y.; Genina, E.; Martynova, Y.; Shulgina, T. M.

    2012-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size, which can reach tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated web-GIS information-computational system for analysis of georeferenced climatological and meteorological data has been created. The information-computational system consists of four basic parts: a computational kernel developed using GNU Data Language (GDL); a set of PHP controllers run within a specialized web portal; JavaScript class libraries for development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology; and an archive of geophysical datasets. The computational kernel comprises a number of dedicated modules for querying and extraction of data, mathematical and statistical data analysis, visualization, and preparation of output files in geoTIFF and netCDF formats containing the processing results. The specialized web portal consists of an Apache web server, OGC-standards-compliant Geoserver software, which is used as a base for presenting cartographic information over the Web, and a set of PHP controllers implementing the web mapping application logic and governing the computational kernel. The JavaScript libraries for graphical user interface development are based on the GeoExt library, combining the ExtJS framework and OpenLayers software. The archive of geophysical data consists of a number of structured environmental datasets represented by data files in netCDF, HDF, GRIB and ESRI Shapefile formats. Available for processing by the system are two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE Water Resources Project Reanalysis, the DWD Global Precipitation Climatology Centre's data, the GMAO Modern Era-Retrospective Analysis for Research and Applications, meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others. The system is already involved in the scientific research process. In particular, it was recently used successfully for analysis of Siberian climate changes and their regional impact. The web-GIS information-computational system for geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified web interface in a common graphical web browser. This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #07.514.114044), projects IV.31.1.5, IV.31.2.7, RFBR grants #10-07-00547a, #11-05-01190a, and integrated project SB RAS #131.

  17. Australia's TERN: Advancing Ecosystem Data Management in Australia

    NASA Astrophysics Data System (ADS)

    Phinn, S. R.; Christensen, R.; Guru, S.

    2013-12-01

    Globally, there is a consistent movement towards more open, collaborative and transparent science, where the publication and citation of data is considered standard practice. Australia's Terrestrial Ecosystem Research Network (TERN) is a national research infrastructure investment designed to support the ecosystem science community through all stages of the data lifecycle. TERN has developed and implemented a comprehensive network of 'hard' and 'soft' infrastructure that enables Australia's ecosystem scientists to collect, publish, store, share, discover and re-use data in ways not previously possible. The aim of this poster is to demonstrate how TERN has successfully delivered infrastructure that is enabling a significant cultural and practical shift in Australia's ecosystem science community towards consistent approaches for data collection, metadata, data licensing, and data publishing. TERN enables multiple disciplines within the ecosystem sciences to more effectively and efficiently collect, store and publish their data. A critical part of TERN's approach has been to build on existing data collection activities, networks and skilled people to enable further coordination and collaboration in building each data collection facility and coordinating data publishing. Data collection in TERN is through discipline-based facilities covering long-term collection of: (1) systematic plot-based measurements of vegetation structure, composition and faunal biodiversity; (2) instrumented towers making systematic measurements of solar, water and gas fluxes; and (3) satellite and airborne maps of biophysical properties of vegetation, soils and the atmosphere. Several other facilities collect and integrate environmental data to produce national products for fauna and vegetation surveys, soils and coastal data, as well as integrated or synthesised products for modelling applications. Data management, publishing and sharing in TERN are implemented through a tailored data licensing framework suitable for ecosystem data, national standards for metadata, a DOI-minting service, and context-appropriate data repositories and portals. The TERN data infrastructure is based on a loosely coupled 'network of networks'. Overall, the data formats used across the TERN facilities range from NetCDF and comma-separated values to descriptive documents. Metadata standards include ISO 19115, Ecological Metadata Language and rich, semantically enabled contextual information. Data services range from Web Mapping Service and Web Feature Service to OPeNDAP, file servers and KNB Metacat. These approaches enable each data collection facility to maintain its discipline-based data collection and storage protocols. TERN facility metadata are harvested regularly for the central TERN Data Discovery Portal and converted to a national standard format. This approach enables centralised discovery, access, and re-use of data simply and effectively, while maintaining disciplinary diversity. Effort is still required to support the cultural shift towards acceptance of effective data management, publication, sharing and re-use as standard practice. To this end, TERN's future activities will be directed to supporting this transformation and undertaking 'education' to enable ecosystem scientists to take full advantage of TERN's infrastructure, providing training and guidance for best-practice data management.

  18. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

    Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data are accessed in standard formats (NetCDF, HDF, etc.) in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. Client packages in Python, Scala, or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data reside, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.

  19. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.

  20. Design Principles of Regulatory Networks: Searching for the Molecular Algorithms of the Cell

    PubMed Central

    Lim, Wendell A.; Lee, Connie M.; Tang, Chao

    2013-01-01

    A challenge in biology is to understand how complex molecular networks in the cell execute sophisticated regulatory functions. Here we explore the idea that there are common and general principles that link network structures to biological functions, principles that constrain the design solutions that evolution can converge upon for accomplishing a given cellular task. We describe approaches for classifying networks based on abstract architectures and functions, rather than on the specific molecular components of the networks. For any common regulatory task, can we define the space of all possible molecular solutions? Such inverse approaches might ultimately allow the assembly of a design table of core molecular algorithms that could serve as a guide for building synthetic networks and modulating disease networks. PMID:23352241

  1. Solid-state current transformer

    NASA Technical Reports Server (NTRS)

    Farnsworth, D. L. (Inventor)

    1976-01-01

    A signal transformation network which is uniquely characterized to exhibit a very low input impedance while maintaining a linear transfer characteristic when driven from a voltage source and when quiescently biased in the low microampere current range is described. In its simplest form, it consists of a tightly coupled two transistor network in which a common emitter input stage is interconnected directly with an emitter follower stage to provide virtually 100 percent negative feedback to the base input of the common emitter stage. Bias to the network is supplied via the common tie point of the common emitter stage collector terminal and the emitter follower base stage terminal by a regulated constant current source, and the output of the circuit is taken from the collector of the emitter follower stage.

  2. Social Networks and High Healthcare Utilization: Building Resilience Through Analysis

    DTIC Science & Technology

    2016-09-01

    of Social Network Analysis Patients Developing targeted intervention programs based on the individual’s needs may potentially help improve the...network structure is found in the patterns of interconnection that develop between nodes. It is this linking through common nodes, “the AB link shares...transitivity is responsible for the clustering of nodes that form “communities” of people based on geography, common interests, or other group

  3. Distributed Common Ground System-Navy Increment 2 (DCGS-N Inc 2)

    DTIC Science & Technology

    2016-03-01

    15 minutes Enter and be Managed in the Network: Reference SvcV-7, Consolidated Afloat Networks and Enterprise Services (CANES) CDD, DCGS-N Inc 2...Red, White, Gray Data and Tracks to Command and Control System. Continuous Stream from SCI Common Intelligence Picture to General Service (GENSER...AIS - Automatic Information System AOC - Air Operations Command CANES - Consolidated Afloat Networks and Enterprise Services CID - Center for

  4. Figure12

    EPA Pesticide Factsheets

    NCL script: cmaq_ensemble_isam_4panels_subdomain.ncl. NetCDF input file for the NCL script, containing ensemble means and standard deviations of ISAM SO4 and O3 contributions from IPM: test.nc. Plot (ps): maps_isam_mean_std_lasthour_ipm_so4_o3_east.ps. Plot (pdf): maps_isam_mean_std_lasthour_ipm_so4_o3_east.pdf. Plot (ncgm): maps_isam_mean_std_lasthour_ipm_so4_o3_east.ncgm. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres, American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).

  5. Development of a Multilayer MODIS IST-Albedo Product of Greenland

    NASA Technical Reports Server (NTRS)

    Hall, D. K.; Comiso, J. C.; Cullather, R. I.; Digirolamo, N. E.; Nowicki, S. M.; Medley, B. C.

    2017-01-01

    A new multilayer Moderate Resolution Imaging Spectroradiometer (MODIS) IST-albedo product of Greenland was developed to meet the needs of the ice sheet modeling community. The multiple layers of the product enable the relationship between ice surface temperature (IST) and albedo to be evaluated easily. Surface temperature is a fundamental input for dynamical ice sheet models because it is a component of the ice sheet radiation budget and mass balance. Albedo influences the absorption of incoming solar radiation. The daily product will combine the existing standard MODIS Collection-6 ice surface temperature, derived melt maps, snow albedo and water vapor products. The new product is available in a polar stereographic projection in NetCDF format. The product will ultimately extend from March 2000 through the end of 2017.

  6. 47 CFR 64.2003 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... RULES RELATING TO COMMON CARRIERS Customer Proprietary Network Information § 64.2003 Definitions. (a... service. (g) Customer proprietary network information (CPNI). The term “customer proprietary network...

  7. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data include static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.

  8. 47 CFR 11.20 - State Relay Network.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false State Relay Network. 11.20 Section 11.20... Network. This network is composed of State Relay (SR) sources, leased common carrier communications facilities or any other available communication facilities. The network distributes State EAS messages...

  9. 47 CFR 11.20 - State Relay Network.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false State Relay Network. 11.20 Section 11.20... Network. This network is composed of State Relay (SR) sources, leased common carrier communications facilities or any other available communication facilities. The network distributes State EAS messages...

  10. A Framework for Integrating Multiple Biological Networks to Predict MicroRNA-Disease Associations.

    PubMed

    Peng, Wei; Lan, Wei; Yu, Zeng; Wang, Jianxin; Pan, Yi

    2017-03-01

    MicroRNAs have close relationships with human diseases, so identifying disease-related microRNAs plays an important role in disease diagnosis, prognosis and therapy. However, designing an effective computational method that makes good use of various biological resources and correctly predicts the associations between microRNAs and diseases remains a major challenge. Previous researchers have pointed out that there are complex relationships among microRNAs, diseases and environment factors. There are inter-relationships between microRNAs, diseases or environment factors based on their functional similarity, phenotype similarity, chemical structure similarity and so on. There are also intra-relationships between microRNAs and diseases, microRNAs and environment factors, and diseases and environment factors. Moreover, functionally similar microRNAs tend to associate with common diseases and common environment factors, and diseases with similar phenotypes are likely caused by common microRNAs and common environment factors. In this work, we propose a framework, named ThrRWMDE, that integrates these complex relationships to predict microRNA-disease associations. In this framework, a microRNA similarity network (MFN), a disease similarity network (DSN) and an environmental factor similarity network (ESN) are constructed according to certain biological properties. An unbalanced three-network random walk algorithm is then implemented on the three networks to gather information from neighbors in the corresponding networks. This algorithm not only flexibly draws information from different levels of neighbors according to the topological and structural differences of the three networks, but also, as it runs, transfers functional information from one network to another according to the associations between nodes in different networks. Experimental results show that our method achieves better prediction performance than other state-of-the-art methods.
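
    The core primitive here, a random walk with restart on a single similarity network, can be sketched as follows; ThrRWMDE couples three such walks and passes probability mass between networks, which this toy version does not attempt (the matrix and seed are illustrative assumptions):

        # Random walk with restart (RWR) on one similarity network -- the
        # building block of multi-network walk methods. Toy data only.
        import numpy as np

        def rwr(W, seed, restart=0.5, tol=1e-8):
            """Steady-state visiting probabilities from a seed node."""
            P = W / W.sum(axis=0, keepdims=True)      # column-normalized transitions
            p0 = np.zeros(W.shape[0])
            p0[seed] = 1.0
            p = p0.copy()
            while True:
                p_next = (1 - restart) * (P @ p) + restart * p0
                if np.abs(p_next - p).sum() < tol:
                    return p_next
                p = p_next

        W = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)      # toy similarity matrix
        print(rwr(W, seed=0))                          # ranking of candidate nodes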

  11. A Comparative Study of 11 Local Health Department Organizational Networks

    PubMed Central

    Merrill, Jacqueline; Keeling, Jonathan W.; Carley, Kathleen M.

    2013-01-01

    Context Although the nation’s local health departments (LHDs) share a common mission, variability in administrative structures is a barrier to identifying common, optimal management strategies. There is a gap in understanding what unifying features LHDs share as organizations that could be leveraged systematically for achieving high performance. Objective To explore sources of commonality and variability in a range of LHDs by comparing intraorganizational networks. Intervention We used organizational network analysis to document relationships between employees, tasks, knowledge, and resources within LHDs, which may exist regardless of formal administrative structure. Setting A national sample of 11 LHDs from seven states that differed in size, geographic location, and governance. Participants Relational network data were collected via an on-line survey of all employees in 11 LHDs. A total of 1,062 out of 1,239 employees responded (84% response rate). Outcome Measures Network measurements were compared using coefficient of variation. Measurements were correlated with scores from the National Public Health Performance Assessment and with LHD demographics. Rankings of tasks, knowledge, and resources were correlated across pairs of LHDs. Results We found that 11 LHDs exhibited compound organizational structures in which centralized hierarchies were coupled with distributed networks at the point of service. Local health departments were distinguished from random networks by a pattern of high centralization and clustering. Network measurements were positively associated with performance for 3 of 10 essential services (r > 0.65). Patterns in the measurements suggest how LHDs adapt to the population served. Conclusions Shared network patterns across LHDs suggest where common organizational management strategies are feasible. This evidence supports national efforts to promote uniform standards for service delivery to diverse populations. PMID:20445462

  12. 47 CFR 54.519 - State telecommunications networks.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false State telecommunications networks. 54.519 Section 54.519 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... telecommunications networks. (a) Telecommunications services. State telecommunications networks may secure discounts...

  13. 47 CFR 54.519 - State telecommunications networks.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 3 2012-10-01 2012-10-01 false State telecommunications networks. 54.519 Section 54.519 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... telecommunications networks. (a) Telecommunications services. State telecommunications networks may secure discounts...

  14. 47 CFR 54.519 - State telecommunications networks.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false State telecommunications networks. 54.519 Section 54.519 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... telecommunications networks. (a) Telecommunications services. State telecommunications networks may secure discounts...

  15. Quantum secured gigabit optical access networks

    PubMed Central

    Fröhlich, Bernd; Dynes, James F.; Lucamarini, Marco; Sharpe, Andrew W.; Tam, Simon W.-B.; Yuan, Zhiliang; Shields, Andrew J.

    2015-01-01

    Optical access networks connect multiple endpoints to a common network node via shared fibre infrastructure. They will play a vital role to scale up the number of users in quantum key distribution (QKD) networks. However, the presence of power splitters in the commonly used passive network architecture makes successful transmission of weak quantum signals challenging. This is especially true if QKD and data signals are multiplexed in the passive network. The splitter introduces an imbalance between quantum signal and Raman noise, which can prevent the recovery of the quantum signal completely. Here we introduce a method to overcome this limitation and demonstrate coexistence of multi-user QKD and full power data traffic from a gigabit passive optical network (GPON) for the first time. The dual feeder implementation is compatible with standard GPON architectures and can support up to 128 users, highlighting that quantum protected GPON networks could be commonplace in the future. PMID:26656307

  16. Integrating Genetic and Functional Genomic Data to Elucidate Common Disease Traits

    NASA Astrophysics Data System (ADS)

    Schadt, Eric

    2005-03-01

    The reconstruction of genetic networks in mammalian systems is one of the primary goals in biological research, especially as such reconstructions relate to elucidating not only common, polygenic human diseases, but living systems more generally. Here I present a statistical procedure for inferring causal relationships between gene expression traits and more classic clinical traits, including complex disease traits. This procedure has been generalized to the gene network reconstruction problem, where naturally occurring genetic variations in segregating mouse populations are used as a source of perturbations to elucidate tissue-specific gene networks. Differences in the extent of genetic control between genders and among four different tissues are highlighted. I also demonstrate that the networks derived from expression data in segregating mouse populations using the novel network reconstruction algorithm are able to capture causal associations between genes that result in increased predictive power, compared to more classically reconstructed networks derived from the same data. This approach to causal inference in large segregating mouse populations over multiple tissues not only elucidates fundamental aspects of transcriptional control, it also allows for the objective identification of key drivers of common human diseases.

  17. Genome-Wide Networks of Amino Acid Covariances Are Common among Viruses

    PubMed Central

    Donlin, Maureen J.; Szeto, Brandon; Gohara, David W.; Aurora, Rajeev

    2012-01-01

    Coordinated variation among positions in amino acid sequence alignments can reveal genetic dependencies at noncontiguous positions, but methods to assess these interactions are incompletely developed. Previously, we found genome-wide networks of covarying residue positions in the hepatitis C virus genome (R. Aurora, M. J. Donlin, N. A. Cannon, and J. E. Tavis, J. Clin. Invest. 119:225–236, 2009). Here, we asked whether such networks are present in a diverse set of viruses and, if so, what they may imply about viral biology. Viral sequences were obtained for 16 viruses in 13 species from 9 families. The entire viral coding potential for each virus was aligned, all possible amino acid covariances were identified using the observed-minus-expected-squared algorithm at a false-discovery rate of ≤1%, and networks of covariances were assessed using standard methods. Covariances that spanned the viral coding potential were common in all viruses. In all cases, the covariances formed a single network that contained essentially all of the covariances. The hepatitis C virus networks had hub-and-spoke topologies, but all other networks had random topologies with an unusually large number of highly connected nodes. These results indicate that genome-wide networks of genetic associations and the coordinated evolution they imply are very common in viral genomes, that the networks rarely have the hub-and-spoke topology that dominates other biological networks, and that network topologies can vary substantially even within a given viral group. Five examples with hepatitis B virus and poliovirus are presented to illustrate how covariance network analysis can lead to inferences about viral biology. PMID:22238298

  18. Interpretation of medical imaging data with a mobile application: a mobile digital imaging processing environment.

    PubMed

    Lin, Meng Kuan; Nicolini, Oliver; Waxenegger, Harald; Galloway, Graham J; Ullmann, Jeremy F P; Janke, Andrew L

    2013-01-01

    Digital Imaging Processing (DIP) requires data extraction and output from a visualization tool to be consistent. Data handling and transmission between the server and a user is a systematic process in service interpretation. The use of integrated medical services for management and viewing of imaging data in combination with a mobile visualization tool can be greatly facilitated by data analysis and interpretation. This paper presents an integrated mobile application and DIP service, called M-DIP. The objective of the system is to (1) automate the direct data tiling, conversion, pre-tiling of brain images from Medical Imaging NetCDF (MINC), Neuroimaging Informatics Technology Initiative (NIFTI) to RAW formats; (2) speed up querying of imaging measurements; and (3) display high-level images in three dimensions in real-world coordinates. In addition, M-DIP provides the ability to work on a mobile or tablet device without any software installation using web-based protocols. M-DIP implements three levels of architecture with a relational middle-layer database, a stand-alone DIP server, and a mobile application logic middle level realizing user interpretation for direct querying and communication. This imaging software has the ability to display biological imaging data at multiple zoom levels and to increase its quality to meet users' expectations. Interpretation of bioimaging data is facilitated by an interface analogous to online mapping services using real-world coordinate browsing. This allows mobile devices to display multiple datasets simultaneously from a remote site. M-DIP can be used as a measurement repository that can be accessed by any network environment, such as a portable mobile or tablet device. In addition, this system, in combination with mobile applications, establishes a virtualization tool in the neuroinformatics field to speed interpretation services.

  19. Interpretation of Medical Imaging Data with a Mobile Application: A Mobile Digital Imaging Processing Environment

    PubMed Central

    Lin, Meng Kuan; Nicolini, Oliver; Waxenegger, Harald; Galloway, Graham J.; Ullmann, Jeremy F. P.; Janke, Andrew L.

    2013-01-01

    Digital Imaging Processing (DIP) requires data extraction and output from a visualization tool to be consistent. Data handling and transmission between the server and a user is a systematic process in service interpretation. The use of integrated medical services for management and viewing of imaging data in combination with a mobile visualization tool can be greatly facilitated by data analysis and interpretation. This paper presents an integrated mobile application and DIP service, called M-DIP. The objective of the system is to (1) automate the direct data tiling, conversion, pre-tiling of brain images from Medical Imaging NetCDF (MINC), Neuroimaging Informatics Technology Initiative (NIFTI) to RAW formats; (2) speed up querying of imaging measurements; and (3) display high-level images in three dimensions in real-world coordinates. In addition, M-DIP provides the ability to work on a mobile or tablet device without any software installation using web-based protocols. M-DIP implements three levels of architecture with a relational middle-layer database, a stand-alone DIP server, and a mobile application logic middle level realizing user interpretation for direct querying and communication. This imaging software has the ability to display biological imaging data at multiple zoom levels and to increase its quality to meet users’ expectations. Interpretation of bioimaging data is facilitated by an interface analogous to online mapping services using real-world coordinate browsing. This allows mobile devices to display multiple datasets simultaneously from a remote site. M-DIP can be used as a measurement repository that can be accessed by any network environment, such as a portable mobile or tablet device. In addition, this system, in combination with mobile applications, establishes a virtualization tool in the neuroinformatics field to speed interpretation services. PMID:23847587

  20. The multiscale backbone of the human phenotype network based on biological pathways.

    PubMed

    Darabos, Christian; White, Marquitta J; Graham, Britney E; Leung, Derek N; Williams, Scott M; Moore, Jason H

    2014-01-25

    Networks are commonly used to represent and analyze large and complex systems of interacting elements. In systems biology, human disease networks show interactions between disorders sharing common genetic background. We built a pathway-based human phenotype network (PHPN) of over 800 physical attributes, diseases, and behavioral traits, based on about 2,300 genes and 1,200 biological pathways. Using GWAS phenotype-to-gene associations and pathway data from Reactome, we connect human traits based on common patterns of human biological pathways, detecting more pleiotropic effects and expanding previous studies from a gene-centric approach to one of shared cell processes. The resulting network has a heavily right-skewed degree distribution, placing it in the scale-free region of the spectrum of network topologies. We extract the multi-scale information backbone of the PHPN based on the local densities of the network, discarding weak connections. Using a standard community detection algorithm, we construct phenotype modules of similar traits without applying expert biological knowledge; these modules can be assimilated to disease classes. However, we are able to classify phenotypes according to shared biology, and not arbitrary disease classes. We present examples of expected clinical connections identified by the PHPN as proof of principle. We also unveil a previously uncharacterized connection between phenotype modules and discuss potential mechanistic connections that are obvious only in retrospect. The PHPN shows tremendous potential to become a useful tool both in unveiling the common biology of diseases and in the elaboration of diagnoses and treatments.
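
    The backbone-extraction step can be illustrated with a disparity-filter-style significance test of the kind commonly used for multiscale backbones; this is a hedged sketch on a toy weighted graph, not the PHPN pipeline itself:

        # Keep an edge when its normalized weight is statistically significant
        # given the local strength and degree of an endpoint (Serrano-style
        # disparity filter; alpha is the significance threshold).
        import networkx as nx

        def backbone(G, alpha=0.3):
            B = nx.Graph()
            for u in G:
                k = G.degree(u)
                if k <= 1:
                    continue                                # leaves carry no evidence
                s = sum(d["weight"] for _, _, d in G.edges(u, data=True))
                for _, v, d in G.edges(u, data=True):
                    pij = d["weight"] / s                   # normalized edge weight
                    if (1 - pij) ** (k - 1) < alpha:        # disparity-filter p-value
                        B.add_edge(u, v, weight=d["weight"])
            return B

        G = nx.Graph()
        G.add_weighted_edges_from([("a", "b", 5.0), ("a", "c", 0.1),
                                   ("a", "d", 0.1), ("b", "c", 3.0)])
        print(sorted(backbone(G).edges()))                  # only strong edges survive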

  1. 47 CFR 32.6532 - Network administration expense.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Network administration expense. 32.6532 Section 32.6532 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... Network administration expense. This account shall include costs incurred in network administration. This...

  2. 47 CFR 32.6532 - Network administration expense.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Network administration expense. 32.6532 Section 32.6532 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... Network administration expense. This account shall include costs incurred in network administration. This...

  3. Network of listed companies based on common shareholders and the prediction of market volatility

    NASA Astrophysics Data System (ADS)

    Li, Jie; Ren, Da; Feng, Xu; Zhang, Yongjie

    2016-11-01

    In this paper, we build a network of listed companies in the Chinese stock market based on common shareholding data from 2003 to 2013. We analyze the evolution of topological characteristics of the network (e.g., average degree, diameter, average path length and clustering coefficient) with respect to the time sequence. Additionally, we consider the economic implications of topological characteristic changes on market volatility and use them to make future predictions. Our study finds that the network diameter significantly predicts volatility. After adding control variables used in traditional financial studies (volume, turnover and previous volatility), network topology still significantly influences volatility and improves the predictive ability of the model.
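
    The topological characteristics named above are standard graph metrics; a small networkx sketch computes them for one snapshot (the edge list is a toy stand-in for the common-shareholding network):

        import networkx as nx

        G = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")])  # toy snapshot
        print(2 * G.number_of_edges() / G.number_of_nodes())  # average degree
        print(nx.diameter(G))                                 # the paper's volatility predictor
        print(nx.average_shortest_path_length(G))             # average path length
        print(nx.average_clustering(G))                       # clustering coefficient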

  4. Integrated Ocean Profile Data Delivery for Operations and Climate Research

    NASA Astrophysics Data System (ADS)

    Sun, C. L.; Soreide, N. N.

    2006-12-01

    An end-to-end data and information system for delivering integrated real-time and historical datasets is presented in this paper. The purposes of this paper are: (1) to illustrate the procedures of quality controlling and loading ocean profile data into the U.S. National Oceanographic Data Center (NODC) ocean database and (2) to facilitate the development and provision of a wide variety of useful data, analyses, and information products for operations and climate research. The NODC currently focuses on acquiring, processing, and distributing ocean profile data collected by two operational global ocean observing systems: the Argo Profiling Network and the Global Temperature-Salinity Profile Program (GTSPP). The two data streams contain upper ocean temperature and salinity data mainly from profiling floats and expendable bathythermographs (XBTs), but also from conductivity-temperature-depth (CTD) casts and bottles. Argo has used resources from some 23 countries to make unprecedented in-situ observations of the global ocean. All Argo data are publicly available in near real-time via the Global Telecommunications System (GTS) and, in scientifically quality-controlled form, with a few months' delay. The NODC operates the Global Argo Data Repository for long-term archiving of Argo data and serves the data in the NODC version of the Argo netCDF format and in tab-delimited spreadsheet text format to the public through the NODC Web site at http://www.nodc.noaa.gov/argo/. The GTSPP is a cooperative international program. It maintains a global ocean T-S resource with data that are both up-to-date and of the highest quality possible. Both real-time data transmitted over the GTS and delayed-mode data received from contributing countries are acquired and quality controlled by the Marine Environmental Data Service, Canada, and eventually incorporated into a continuously managed database maintained by the NODC. Information and data are made publicly available at http://www.nodc.noaa.gov/GTSPP/. Web-based tools are developed to allow users on the Web to query and subset the data by parameter, location, time, and other attributes such as instrument type and quality flags. Desktop applications with capabilities for exploring data from real-time data streams and integrating the data streams with archives across the Internet are available for users who have a high-bandwidth Internet connection. Alternatively, users without high-speed network access can order CD/DVD-ROMs from the NODC that contain the integrated dataset and then use software over a potentially low-bandwidth network connection to periodically update the CD/DVD-ROM-based archive with new data.
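
    For users working with the downloaded products, a minimal sketch of reading one Argo-style profile file with Python's netCDF4 library (the file name is a placeholder; TEMP and PRES follow common Argo naming conventions but are assumptions about any particular download):

        from netCDF4 import Dataset

        with Dataset("argo_profile.nc") as nc:
            pres = nc.variables["PRES"][:]   # pressure in dbar, a proxy for depth
            temp = nc.variables["TEMP"][:]   # in-situ temperature, degrees C
            print(pres.shape, temp.shape)    # per-profile array dimensions
            print(float(pres.min()), float(pres.max()),
                  float(temp.min()), float(temp.max()))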

  5. Implementing a Data Quality Strategy to Simplify Access to Data

    NASA Astrophysics Data System (ADS)

    Druken, K. A.; Trenham, C. E.; Evans, B. J. K.; Richards, C. J.; Wang, J.; Wyborn, L. A.

    2016-12-01

    To ensure seamless programmatic access for data analysis (including machine learning), standardization of both data and services is vital. At the Australian National Computational Infrastructure (NCI) we have developed a Data Quality Strategy (DQS) that currently provides processes for: (1) the consistency of data structures in the underlying High Performance Data (HPD) platform; (2) quality control through compliance with recognized community standards; and (3) data quality assurance through demonstrated functionality across common platforms, tools and services. NCI hosts one of Australia's largest repositories (10+ PBytes) of research data collections spanning datasets from climate, coasts, oceans and geophysics through to astronomy, bioinformatics and the social sciences. A key challenge is the application of community-agreed data standards to the broad set of Earth systems and environmental data that are being used. Within these disciplines, data span a wide range of gridded, ungridded (i.e., line surveys, point clouds), and raster image types, as well as diverse coordinate reference projections and resolutions. By implementing our DQS we have seen progressive improvement in the quality of the datasets across the different subject domains, and through this, the ease by which the users can programmatically access the data, either in situ or via web services. As part of its quality control procedures, NCI has developed a compliance checker based upon existing domain standards. The DQS also includes extensive Functionality Testing, which includes readability by commonly used libraries (e.g., netCDF, HDF, GDAL); accessibility by data servers (e.g., THREDDS, Hyrax, GeoServer); validation against scientific analysis and programming platforms (e.g., Python, Matlab, QGIS); and visualization tools (e.g., ParaView, NASA Web World Wind). These tests ensure smooth interoperability between products and services as well as expose unforeseen requirements and dependencies. The results provide an important component of quality control within the DQS as well as clarifying the requirement for any extensions to the relevant standards that help support the uptake of data by broader international communities.
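
    A toy probe in the spirit of the compliance and functionality checks described above (not NCI's actual checker), assuming Python's netCDF4 library:

        from netCDF4 import Dataset

        def quick_check(path):
            """Can a common library open the file, and does it declare CF conventions?"""
            try:
                with Dataset(path) as nc:
                    conv = getattr(nc, "Conventions", None)   # e.g. "CF-1.6"
                    return {"readable": True,
                            "cf_declared": conv is not None and "CF" in conv,
                            "conventions": conv}
            except OSError as err:
                return {"readable": False, "error": str(err)}

        print(quick_check("sample.nc"))   # placeholder file name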

  6. NASA Integrated Network Monitor and Control Software Architecture

    NASA Technical Reports Server (NTRS)

    Shames, Peter; Anderson, Michael; Kowal, Steve; Levesque, Michael; Sindiy, Oleg; Donahue, Kenneth; Barnes, Patrick

    2012-01-01

    The National Aeronautics and Space Administration (NASA) Space Communications and Navigation office (SCaN) has commissioned a series of trade studies to define a new architecture intended to integrate the three existing networks that it operates, the Deep Space Network (DSN), Space Network (SN), and Near Earth Network (NEN), into one integrated network that offers users a set of common, standardized, services and interfaces. The integrated monitor and control architecture utilizes common software and common operator interfaces that can be deployed at all three network elements. This software uses state-of-the-art concepts such as a pool of re-programmable equipment that acts like a configurable software radio, distributed hierarchical control, and centralized management of the whole SCaN integrated network. For this trade space study a model-based approach using SysML was adopted to describe and analyze several possible options for the integrated network monitor and control architecture. This model was used to refine the design and to drive the costing of the four different software options. This trade study modeled the three existing self-standing network elements at the point of departure, and then described how to integrate them using variations of new and existing monitor and control system components for the different proposed deployments under consideration. This paper will describe the trade space explored, the selected system architecture, the modeling and trade study methods, and some observations on useful approaches to implementing such model-based trade-space representation and analysis.

  7. Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks

    PubMed Central

    2014-01-01

    Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method, for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226

  8. 47 CFR 68.201 - Connection to the public switched telephone network.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... network. 68.201 Section 68.201 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) CONNECTION OF TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Terminal Equipment Approval Procedures § 68.201 Connection to the public switched telephone network. Terminal equipment may...

  9. 47 CFR 68.201 - Connection to the public switched telephone network.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... network. 68.201 Section 68.201 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) CONNECTION OF TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Terminal Equipment Approval Procedures § 68.201 Connection to the public switched telephone network. Terminal equipment may...

  10. Networking between community health programs: a case study outlining the effectiveness, barriers and enablers

    PubMed Central

    2012-01-01

    Background In India, since the 1990s, there has been a burgeoning of NGOs involved in providing primary health care. This has resulted in a complex NGO-Government interface which is difficult for lone NGOs to navigate. The Uttarakhand Cluster, India, links such small community health programs together to build NGO capacity, increase visibility and better link to the government schemes and the formal healthcare system. This research, undertaken between 1998 and 2011, aims to examine barriers and facilitators to such linking, or clustering, and the effectiveness of this clustering approach. Methods Interviews, indicator surveys and participant observation were used to document the process and explore the enablers, the barriers and the effectiveness of networks improving community health. Results The analysis revealed that when activating, framing, mobilising and synthesizing the Uttarakhand Cluster, key brokers and network players were important in bridging between organisations. The ties (or relationships) that held the cluster together included homophily around common faith, common friendships and geographical location and common mission. Self interest whereby members sought funds, visibility, credibility, increased capacity and access to trainings was also a commonly identified motivating factor for networking. Barriers to network synthesizing included lack of funding, poor communication, limited time and lack of human resources. Risk aversion and mistrust remained significant barriers to overcome for such a network. Conclusions In conclusion, specific enabling factors allowed the clustering approach to be effective at increasing access to resources, creating collaborative opportunities and increasing visibility, credibility and confidence of the cluster members. These findings add to knowledge regarding social network formation and collaboration, and such knowledge will assist in the conceptualisation, formation and success of potential health networks in India and other developing world countries. PMID:22812627

  11. Networking between community health programs: a case study outlining the effectiveness, barriers and enablers.

    PubMed

    Grills, Nathan J; Robinson, Priscilla; Phillip, Maneesh

    2012-07-19

    In India, since the 1990s, there has been a burgeoning of NGOs involved in providing primary health care. This has resulted in a complex NGO-Government interface which is difficult for lone NGOs to navigate. The Uttarakhand Cluster, India, links such small community health programs together to build NGO capacity, increase visibility and better link to the government schemes and the formal healthcare system. This research, undertaken between 1998 and 2011, aims to examine barriers and facilitators to such linking, or clustering, and the effectiveness of this clustering approach. Interviews, indicator surveys and participant observation were used to document the process and explore the enablers, the barriers and the effectiveness of networks improving community health. The analysis revealed that when activating, framing, mobilising and synthesizing the Uttarakhand Cluster, key brokers and network players were important in bridging between organisations. The ties (or relationships) that held the cluster together included homophily around common faith, common friendships and geographical location and common mission. Self interest whereby members sought funds, visibility, credibility, increased capacity and access to trainings was also a commonly identified motivating factor for networking. Barriers to network synthesizing included lack of funding, poor communication, limited time and lack of human resources. Risk aversion and mistrust remained significant barriers to overcome for such a network. In conclusion, specific enabling factors allowed the clustering approach to be effective at increasing access to resources, creating collaborative opportunities and increasing visibility, credibility and confidence of the cluster members. These findings add to knowledge regarding social network formation and collaboration, and such knowledge will assist in the conceptualisation, formation and success of potential health networks in India and other developing world countries.

  12. Event-based simulation of networks with pulse delayed coupling

    NASA Astrophysics Data System (ADS)

    Klinshov, Vladimir; Nekorkin, Vladimir

    2017-10-01

    Pulse-mediated interactions are common in networks of different nature. Here we develop a general framework for simulation of networks with pulse delayed coupling. We introduce the discrete map governing the dynamics of such networks and describe the computation algorithm for its numerical simulation.
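
    A minimal event-driven sketch in this spirit: pulse-coupled phase oscillators with a fixed propagation delay, advanced from event to event through a priority queue (the parameters and all-to-all topology are illustrative assumptions, not the authors' model):

        import heapq

        N, eps, delay, t_end = 3, 0.05, 0.1, 5.0
        phase = [0.0, 0.3, 0.7]       # phases grow at unit rate and fire at 1
        t = 0.0
        events = []                   # min-heap of (arrival_time, target_node)

        def advance(dt):
            """Move time forward; all phases grow linearly."""
            global t
            for i in range(N):
                phase[i] += dt
            t += dt

        while t < t_end:
            next_fire = min(1.0 - p for p in phase)
            if events and events[0][0] - t < next_fire:
                ta, j = heapq.heappop(events)            # a delayed pulse arrives first
                advance(ta - t)
                phase[j] = min(1.0, phase[j] + eps)      # excitatory kick
            else:
                advance(next_fire)                       # an oscillator reaches threshold
            for i in range(N):
                if phase[i] >= 1.0 - 1e-12:              # fire: reset, emit delayed pulses
                    phase[i] = 0.0
                    print(f"t={t:.3f}  oscillator {i} fires")
                    for j in range(N):
                        if j != i:
                            heapq.heappush(events, (t + delay, j))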

  13. Visual NNet: An Educational ANN's Simulation Environment Reusing Matlab Neural Networks Toolbox

    ERIC Educational Resources Information Center

    Garcia-Roselló, Emilio; González-Dacosta, Jacinto; Lado, Maria J.; Méndez, Arturo J.; Garcia Pérez-Schofield, Baltasar; Ferrer, Fátima

    2011-01-01

    Artificial Neural Networks (ANN's) are nowadays a common subject in different curricula of graduate and postgraduate studies. Due to the complex algorithms involved and the dynamic nature of ANN's, simulation software has been commonly used to teach this subject. This software has usually been developed specifically for learning purposes, because…

  14. Laptop Use in University Common Spaces

    ERIC Educational Resources Information Center

    Wolff, Bill

    2006-01-01

    Anecdotal evidence existed about the many students who use their laptops and the wireless network in university common spaces, but little was known about how, where, and why students use laptops on campus, and less was known about students' awareness of university wireless network policies and security. This article discusses the results of a…

  15. Ramses-GPU: Second-order MUSCL-Hancock finite volume fluid solver

    NASA Astrophysics Data System (ADS)

    Kestener, Pierre

    2017-10-01

    RamsesGPU is a reimplementation of RAMSES (ascl:1011.007) which drops the adaptive mesh refinement (AMR) features to optimize 3D uniform grid algorithms for modern graphics processing units (GPUs), providing an efficient software package for astrophysics applications that do not need AMR features but do require a very large number of integration time steps. RamsesGPU provides a very efficient C++/CUDA/MPI implementation of a second-order MUSCL-Hancock finite volume fluid solver for compressible hydrodynamics, as well as a magnetohydrodynamics solver based on the constrained transport technique. Other useful modules include static gravity, dissipative terms (viscosity, resistivity), and a forcing source term for turbulence studies; special care was taken to enhance parallel input/output performance by using state-of-the-art libraries such as HDF5 and Parallel-NetCDF.

  16. Efficiently Serving HDF5 Products via OPeNDAP

    NASA Technical Reports Server (NTRS)

    Yang, Kent

    2017-01-01

    Hyrax OPeNDAP services are widely used by Earth Science data centers in NASA, NOAA and other organizations to serve end users. In this talk, we will present some key features added to the HDF5 Hyrax OPeNDAP handler that can help data centers better serve HDF5/netCDF-4 data products. Among these new features, we will focus on the following: (1) DAP4 support; (2) memory cache and disk cache support that can reduce service access time; and (3) an enhancement that makes swath-like HDF5 products visualizable by CF client tools. We will also discuss in depth the role of the HDF5 handler in the recent study of the Hyrax service in the cloud environment.
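
    For context, a minimal client-side sketch of reading from a Hyrax endpoint with netCDF4-python, which accepts OPeNDAP URLs when built with DAP support (the URL and variable name are placeholders, not a real service):

        from netCDF4 import Dataset

        url = "https://example.gov/opendap/granule.nc"   # hypothetical DAP endpoint
        with Dataset(url) as nc:                         # subsetting happens server-side
            sst = nc.variables["sst"][0, :10, :10]       # request only a small slab
            print(sst.shape)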

  17. Information networks in the stock market based on the distance of the multi-attribute dimensions between listed companies

    NASA Astrophysics Data System (ADS)

    Liu, Qian; Li, Huajiao; Liu, Xueyong; Jiang, Meihui

    2018-04-01

    In the stock market, there are widespread information connections between economic agents. Listed companies can obtain mutual information about investment decisions from common shareholders, and the extent of sharing information often determines the relationships between listed companies. Because different shareholder compositions and investment shares lead to different formations of the company's governance mechanisms, we map the investment relationships between shareholders to the multi-attribute dimensional spaces of the listed companies (each shareholder investment in a company is a company dimension). Then, we construct the listed company's information network based on co-shareholder relationships. The weights for the edges in the information network are measured with the Euclidean distance between the listed companies in the multi-attribute dimension space. We define two indices to analyze the information network's features. We conduct an empirical study that analyzes Chinese listed companies' information networks. The results from the analysis show that with the diversification and decentralization of shareholder investments, almost all Chinese listed companies exchanged information through common shareholder relationships, and there is a gradual reduction in information sharing capacity between listed companies that have common shareholders. This network analysis has benefits for risk management and portfolio investments.
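
    A small sketch of the edge-weight construction described above, assuming a toy company-by-shareholder investment matrix (both the data and the exact weighting rule are illustrative, not the paper's):

        import numpy as np
        from scipy.spatial.distance import pdist, squareform

        # rows = listed companies, columns = shareholders, entries = investment share
        X = np.array([[0.10, 0.00, 0.25],
                      [0.08, 0.30, 0.00],
                      [0.00, 0.28, 0.22]])
        D = squareform(pdist(X, metric="euclidean"))          # pairwise distances
        held = (X > 0).astype(int)
        shared = held @ held.T                                # common-shareholder counts
        W = np.where((shared > 0) & (D > 0), D, 0.0)          # weight only co-held pairs
        print(W)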

  18. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine.
We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.
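
    A minimal sketch of the granule-aggregation step described above: walking a list of HTTP URLs that point to on-line netCDF files and collecting one variable across them (the URLs and the CF-style variable name are placeholders, not SciFlo's actual services):

        import urllib.request
        from netCDF4 import Dataset

        urls = ["https://example.gov/data/granule_001.nc",
                "https://example.gov/data/granule_002.nc"]

        series = []
        for i, url in enumerate(urls):
            local = f"granule_{i}.nc"
            urllib.request.urlretrieve(url, local)                  # fetch granule
            with Dataset(local) as nc:
                series.append(nc.variables["air_temperature"][:])   # assumed variable name
        print(len(series), "granules collected")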

  19. Why common carrier and network neutrality principles apply to the Nationwide Health Information Network (NWHIN).

    PubMed

    Gaynor, Mark; Lenert, Leslie; Wilson, Kristin D; Bradner, Scott

    2014-01-01

    The Office of the National Coordinator will be defining the architecture of the Nationwide Health Information Network (NWHIN) together with the proposed HealtheWay public/private partnership as a development and funding strategy. There are a number of open questions--for example, what is the best way to realize the benefits of health information exchange? How valuable are regional health information organizations in comparison with a more direct approach? What is the role of the carriers in delivering this service? The NWHIN is to exist for the public good, and thus shares many traits of the common law notion of 'common carriage' or 'public calling,' the modern term for which is network neutrality. Recent policy debates in Congress and resulting potential regulation have implications for key stakeholders within healthcare that use or provide services, and for those who exchange information. To date, there has been little policy debate or discussion about the implications of a neutral NWHIN. This paper frames the discussion for future policy debate in healthcare by providing a brief education and summary of the modern version of common carriage, of the key stakeholder positions in healthcare, and of the potential implications of the network neutrality debate within healthcare.

  20. Why common carrier and network neutrality principles apply to the Nationwide Health Information Network (NWHIN)

    PubMed Central

    Gaynor, Mark; Lenert, Leslie; Wilson, Kristin D; Bradner, Scott

    2014-01-01

    The Office of the National Coordinator will be defining the architecture of the Nationwide Health Information Network (NWHIN) together with the proposed HealtheWay public/private partnership as a development and funding strategy. There are a number of open questions—for example, what is the best way to realize the benefits of health information exchange? How valuable are regional health information organizations in comparison with a more direct approach? What is the role of the carriers in delivering this service? The NWHIN is to exist for the public good, and thus shares many traits of the common law notion of ‘common carriage’ or ‘public calling,’ the modern term for which is network neutrality. Recent policy debates in Congress and resulting potential regulation have implications for key stakeholders within healthcare that use or provide services, and for those who exchange information. To date, there has been little policy debate or discussion about the implications of a neutral NWHIN. This paper frames the discussion for future policy debate in healthcare by providing a brief education and summary of the modern version of common carriage, of the key stakeholder positions in healthcare, and of the potential implications of the network neutrality debate within healthcare. PMID:23837992

  1. Analysis of Clinical and Dermoscopic Features for Basal Cell Carcinoma Neural Network Classification

    PubMed Central

    Cheng, Beibei; Stanley, R. Joe; Stoecker, William V; Stricklin, Sherea M.; Hinton, Kristen A.; Nguyen, Thanh K.; Rader, Ryan K.; Rabinovitz, Harold S.; Oliviero, Margaret; Moss, Randy H.

    2012-01-01

    Background Basal cell carcinoma (BCC) is the most commonly diagnosed cancer in the United States. In this research, we examine four different feature categories used for diagnostic decisions, including patient personal profile (patient age, gender, etc.), general exam (lesion size and location), common dermoscopic (blue-gray ovoids, leaf-structure dirt trails, etc.), and specific dermoscopic lesion (white/pink areas, semitranslucency, etc.). Specific dermoscopic features are more restricted versions of the common dermoscopic features. Methods Combinations of the four feature categories are analyzed over a data set of 700 lesions, with 350 BCCs and 350 benign lesions, for lesion discrimination using neural network-based techniques, including Evolving Artificial Neural Networks and Evolving Artificial Neural Network Ensembles. Results Experiment results based on ten-fold cross validation for training and testing the different neural network-based techniques yielded an area under the receiver operating characteristic curve as high as 0.981 when all features were combined. The common dermoscopic lesion features generally yielded higher discrimination results than other individual feature categories. Conclusions Experimental results show that combining clinical and image information provides enhanced lesion discrimination capability over either information source separately. This research highlights the potential of data fusion as a model for the diagnostic process. PMID:22724561

  2. Network geometry inference using common neighbors

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Fragkiskos; Aldecoa, Rodrigo; Krioukov, Dmitri

    2015-08-01

    We introduce and explore a method for inferring hidden geometric coordinates of nodes in complex networks based on the number of common neighbors between the nodes. We compare this approach to the HyperMap method, which is based only on the connections (and disconnections) between the nodes, i.e., on the links that the nodes have (or do not have). We find that for high degree nodes, the common-neighbors approach yields a more accurate inference than the link-based method, unless heuristic periodic adjustments (or "correction steps") are used in the latter. The common-neighbors approach is computationally intensive, requiring O(t^4) running time to map a network of t nodes, versus O(t^3) in the link-based method. But we also develop a hybrid method with O(t^3) running time, which combines the common-neighbors and link-based approaches, and we explore a heuristic that reduces its running time further to O(t^2), without significant reduction in the mapping accuracy. We apply this method to the autonomous systems (ASs) Internet, and we reveal how soft communities of ASs evolve over time in the similarity space. We further demonstrate the method's predictive power by forecasting future links between ASs. Taken altogether, our results advance our understanding of how to efficiently and accurately map real networks to their latent geometric spaces, which is an important necessary step toward understanding the laws that govern the dynamics of nodes in these spaces, and the fine-grained dynamics of network connections.
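
    The basic ingredient of the method, the common-neighbor count for every node pair, can be obtained from the square of the adjacency matrix; a toy sketch (the full mapping algorithm of the paper is not reproduced here):

        import numpy as np

        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 1],
                      [1, 1, 0, 1],
                      [0, 1, 1, 0]])
        CN = A @ A                  # CN[i, j] = number of common neighbors of i and j
        np.fill_diagonal(CN, 0)     # the diagonal holds degrees, not pair counts
        print(CN)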

  3. Link prediction with node clustering coefficient

    NASA Astrophysics Data System (ADS)

    Wu, Zhihao; Lin, Youfang; Wang, Jing; Gregory, Steve

    2016-06-01

    Predicting missing links in incomplete complex networks efficiently and accurately is still a challenging problem. The recently proposed Cannistraci-Alanis-Ravasi (CAR) index shows the power of local link/triangle information in improving link-prediction accuracy. Inspired by the idea of employing local link/triangle information, we propose a new similarity index with more local structure information. In our method, local link/triangle structure information is conveyed directly by the clustering coefficient of common neighbors. The clustering coefficient is effective in estimating the contribution of a common neighbor because it employs the links existing between the neighbors of that common neighbor, and these links occupy the same structural position as the candidate link to this common neighbor. In our experiments, three estimators (precision, AUP and AUC) are used to evaluate the accuracy of link prediction algorithms. Experimental results on ten tested networks drawn from various fields show that our new index is more effective in predicting missing links than the CAR index, especially for networks with low correlation between the number of common neighbors and the number of links between common neighbors.
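
    A hedged sketch of a clustering-coefficient-weighted common-neighbour score in the spirit of the proposed index (not the authors' exact formulation): each common neighbour of a candidate pair contributes its node clustering coefficient.

        import networkx as nx

        def cc_common_neighbor_score(G, x, y):
            cc = nx.clustering(G)   # node clustering coefficients
            return sum(cc[z] for z in nx.common_neighbors(G, x, y))

        G = nx.karate_club_graph()                 # standard toy network
        print(cc_common_neighbor_score(G, 0, 33))  # score for one candidate link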

  4. The Deep Space Network in the Common Platform Era: A Prototype Implementation at DSS-13

    NASA Technical Reports Server (NTRS)

    Davarian, F.

    2013-01-01

    To enhance NASA's Deep Space Network (DSN), an effort is underway to improve network performance and simplify its operation and maintenance. This endeavor, known as the "Common Platform," has both short- and long-term objectives. The long-term work has not begun yet; however, the activity to realize the short-term goals has started. There are three goals for the long-term objective: (1) convert the DSN into a digital network where signals are digitized at the output of the down converters at the antennas and are distributed via a digital IF switch to the processing platforms; (2) employ a set of common hardware for signal processing applications, e.g., telemetry, tracking, radio science and Very Long Baseline Interferometry (VLBI); and (3) minimize in-house developments in favor of purchasing commercial off-the-shelf (COTS) equipment. The short-term goal is to develop a prototype of the above at NASA's experimental station known as DSS-13. This station consists of a 34-m beam waveguide antenna with cryogenically cooled amplifiers capable of handling deep space research frequencies at S-, X-, and Ka-bands. Without the effort at DSS-13, the implementation of the long-term goal can potentially be risky because embarking on the modification of an operational network without prior preparations can, among other things, result in unwanted service interruptions. Not only are there technical challenges to address; full network implementation of the Common Platform concept also carries significant cost uncertainties. Therefore, a limited implementation at DSS-13 will contribute to risk reduction. The benefits of employing common platforms for the DSN are lower cost and improved operations resulting from ease of maintenance and a reduced number of spare parts. Increased flexibility for the user is another potential benefit. This paper will present the plans for DSS-13 implementation. It will discuss key issues such as the Common Platform architecture, choice of COTS equipment, and the standard for the radio frequency (RF) to digital interface.

  5. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... elements. 51.311 Section 51.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  6. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... elements. 51.311 Section 51.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  7. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... elements. 51.311 Section 51.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  8. Gray matter alterations in chronic pain: A network-oriented meta-analytic approach

    PubMed Central

    Cauda, Franco; Palermo, Sara; Costa, Tommaso; Torta, Riccardo; Duca, Sergio; Vercelli, Ugo; Geminiani, Giuliano; Torta, Diana M.E.

    2014-01-01

    Several studies have attempted to characterize morphological brain changes due to chronic pain. Although it has repeatedly been suggested that longstanding pain induces gray matter modifications, there is still some controversy surrounding the direction of the change (increase or decrease in gray matter) and the role of psychological and psychiatric comorbidities. In this study, we propose a novel, network-oriented, meta-analytic approach to characterize morphological changes in chronic pain. We used network decomposition to investigate whether different kinds of chronic pain are associated with a common or specific set of altered networks. Representational similarity techniques, network decomposition and model-based clustering were employed: i) to verify the presence of a core set of brain areas commonly modified by chronic pain; ii) to investigate the involvement of these areas in a large-scale network perspective; iii) to study the relationships between the altered networks; and iv) to find out whether chronic pain targets clusters of areas. Our results showed that chronic pain causes both core and pathology-specific gray matter alterations in large-scale networks. Common alterations were observed in the prefrontal regions, in the anterior insula, cingulate cortex, basal ganglia, thalamus, periaqueductal gray, post- and pre-central gyri and inferior parietal lobule. We observed that the salience and attentional networks were targeted in a very similar way by different chronic pain pathologies. Conversely, alterations in the sensorimotor and attention circuits were differentially targeted by chronic pain pathologies. Moreover, model-based clustering revealed that chronic pain, in line with some neurodegenerative diseases, selectively targets some large-scale brain networks. Altogether these findings indicate that chronic pain can be better conceived and studied in a network perspective. PMID:24936419

  9. Optimization of neural network architecture using genetic programming improves detection and modeling of gene-gene interactions in studies of human diseases

    PubMed Central

    Ritchie, Marylyn D; White, Bill C; Parker, Joel S; Hahn, Lance W; Moore, Jason H

    2003-01-01

    Background Appropriate definition of neural network architecture prior to data analysis is crucial for successful data mining. This can be challenging when the underlying model of the data is unknown. The goal of this study was to determine whether optimizing neural network architecture using genetic programming as a machine learning strategy would improve the ability of neural networks to model and detect nonlinear interactions among genes in studies of common human diseases. Results Using simulated data, we show that a genetic programming optimized neural network approach is able to model gene-gene interactions as well as a traditional back propagation neural network. Furthermore, the genetic programming optimized neural network is better than the traditional back propagation neural network approach in terms of predictive ability and power to detect gene-gene interactions when non-functional polymorphisms are present. Conclusion This study suggests that a machine learning strategy for optimizing neural network architecture may be preferable to traditional trial-and-error approaches for the identification and characterization of gene-gene interactions in common, complex human diseases. PMID:12846935

  10. Balancing the popularity bias of object similarities for personalised recommendation

    NASA Astrophysics Data System (ADS)

    Hou, Lei; Pan, Xue; Liu, Kecheng

    2018-03-01

    Network-based similarity measures have found wide application in recommendation algorithms and have made significant contributions to uncovering users' potential interests. However, existing measures are generally biased in terms of popularity, in that popular objects tend to have more common neighbours with others and are thus considered more similar to others. Such popularity bias in similarity quantification results in biased recommendations, with either poor accuracy or poor diversity. Based on a bipartite network model of the user-object interactions, this paper first calculates the expected number of common neighbours of two objects with given popularities in random networks. A Balanced Common Neighbour similarity index is accordingly developed by removing the random-driven common neighbours, estimated as this expected number, from the total number. Recommendation experiments on three data sets show that balancing the popularity bias to a certain degree can significantly improve the recommendations' accuracy and diversity simultaneously.
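
    As a toy illustration of the balancing idea, the sketch below scores two objects in a user-object bipartite network by subtracting the common neighbours expected under random linking (approximated here as k_a * k_b / N for N users; the paper's exact expectation may differ) from the observed count. The function name and the data are hypothetical.

        # Popularity-balanced common-neighbour similarity: a minimal sketch.
        from collections import defaultdict

        def balanced_common_neighbours(ratings, a, b):
            """ratings: iterable of (user, object) pairs; a, b: object ids."""
            users_of = defaultdict(set)
            all_users = set()
            for user, obj in ratings:
                users_of[obj].add(user)
                all_users.add(user)
            observed = len(users_of[a] & users_of[b])
            # Expected overlap if the k_a and k_b collectors were random users.
            expected = len(users_of[a]) * len(users_of[b]) / len(all_users)
            return observed - expected  # > 0 means more overlap than chance

        ratings = [("u1", "o1"), ("u1", "o2"), ("u2", "o1"),
                   ("u2", "o2"), ("u3", "o1")]
        print(balanced_common_neighbours(ratings, "o1", "o2"))  # prints 0.0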

  11. Guiding District Implementation of Common Core State Standards: Innovation Configuration Maps

    ERIC Educational Resources Information Center

    Roy, Patricia; Killion, Joellen

    2011-01-01

    Leadership Networks are regional and content-specific networks focused on the preparation of college- and career-ready students. Each network includes teacher leaders, school administrators, central office staff, regional cooperatives, and institutes of higher education. Network members work collaboratively to focus their efforts on regional needs…

  12. Security Aspects of an Enterprise-Wide Network Architecture.

    ERIC Educational Resources Information Center

    Loew, Robert; Stengel, Ingo; Bleimann, Udo; McDonald, Aidan

    1999-01-01

    Presents an overview of two projects that concern local area networks and the common point between networks as they relate to network security. Discusses security architectures based on firewall components, packet filters, application gateways, security-management components, an intranet solution, user registration by Web form, and requests for…

  13. 47 CFR 51.327 - Notice of network changes: Content of notice.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Notice of network changes: Content of notice. 51.327 Section 51.327 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER... Notice of network changes: Content of notice. (a) Public notice of planned network changes must, at a...

  14. 47 CFR 68.110 - Compatibility of the public switched telephone network and terminal equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... network and terminal equipment. 68.110 Section 68.110 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) CONNECTION OF TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions on Use of Terminal Equipment § 68.110 Compatibility of the public switched telephone network and...

  15. 47 CFR 51.327 - Notice of network changes: Content of notice.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Notice of network changes: Content of notice. 51.327 Section 51.327 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER... Notice of network changes: Content of notice. (a) Public notice of planned network changes must, at a...

  16. 47 CFR 68.110 - Compatibility of the public switched telephone network and terminal equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... network and terminal equipment. 68.110 Section 68.110 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) CONNECTION OF TERMINAL EQUIPMENT TO THE TELEPHONE NETWORK Conditions on Use of Terminal Equipment § 68.110 Compatibility of the public switched telephone network and...

  17. Fungal nutrient allocation in common mycorrhizal networks is regulated by the carbon source strength of individual host plants

    USDA-ARS?s Scientific Manuscript database

    • The common mycorrhizal networks (CMN) of arbuscular mycorrhizal (AM) fungi in the soil provide multiple host plants with nutrients, but the mechanisms by which the nutrient transport to individual host plants within one CMN is controlled, are currently unknown. • We followed by radioactive and st...

  18. Rapid decay in the relative efficiency of quarantine to halt epidemics in networks

    NASA Astrophysics Data System (ADS)

    Strona, Giovanni; Castellano, Claudio

    2018-02-01

    Several recent studies have tackled the issue of optimal network immunization by providing efficient criteria to identify key nodes to be removed in order to break apart a network, thus preventing the occurrence of extensive epidemic outbreaks. Yet, although the efficiency of those criteria has been demonstrated also in empirical networks, preventive immunization is rarely applied to real-world scenarios, where the usual approach is the a posteriori attempt to contain epidemic outbreaks using quarantine measures. Here we compare the efficiency of prevention with that of quarantine in terms of the tradeoff between the number of removed and saved nodes on both synthetic and empirical topologies. We show how, consistent with common sense, but contrary to common practice, in many cases preventing is better than curing: depending on network structure, rescuing an infected network by quarantine could become inefficient soon after the first infection.

  19. Social network analysis for program implementation.

    PubMed

    Valente, Thomas W; Palinkas, Lawrence A; Czaja, Sara; Chu, Kar-Hai; Brown, C Hendricks

    2015-01-01

    This paper introduces the use of social network analysis theory and tools for implementation research. The social network perspective is useful for understanding, monitoring, influencing, or evaluating the implementation process when programs, policies, practices, or principles are designed and scaled up or adapted to different settings. We briefly describe common barriers to implementation success and relate them to the social networks of implementation stakeholders. We introduce a few simple measures commonly used in social network analysis and discuss how these measures can be used in program implementation. Using the four stage model of program implementation (exploration, adoption, implementation, and sustainment) proposed by Aarons and colleagues [1] and our experience in developing multi-sector partnerships involving community leaders, organizations, practitioners, and researchers, we show how network measures can be used at each stage to monitor, intervene, and improve the implementation process. Examples are provided to illustrate these concepts. We conclude with expected benefits and challenges associated with this approach.
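
    For readers who want to try the kinds of measures the paper discusses, the following sketch computes a few standard ones with the networkx Python package on a made-up partnership network; the node names are purely illustrative.

        # Simple social-network-analysis measures on a toy partnership network.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("community_leader", "clinic"), ("clinic", "researcher"),
            ("researcher", "university"), ("clinic", "school"),
            ("school", "community_leader"),
        ])

        print("density:", nx.density(G))                      # tie saturation
        print("degree centrality:", nx.degree_centrality(G))  # local activity
        print("betweenness:", nx.betweenness_centrality(G))   # brokerage roles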

  20. Social Network Analysis for Program Implementation

    PubMed Central

    Valente, Thomas W.; Palinkas, Lawrence A.; Czaja, Sara; Chu, Kar-Hai; Brown, C. Hendricks

    2015-01-01

    This paper introduces the use of social network analysis theory and tools for implementation research. The social network perspective is useful for understanding, monitoring, influencing, or evaluating the implementation process when programs, policies, practices, or principles are designed and scaled up or adapted to different settings. We briefly describe common barriers to implementation success and relate them to the social networks of implementation stakeholders. We introduce a few simple measures commonly used in social network analysis and discuss how these measures can be used in program implementation. Using the four stage model of program implementation (exploration, adoption, implementation, and sustainment) proposed by Aarons and colleagues [1] and our experience in developing multi-sector partnerships involving community leaders, organizations, practitioners, and researchers, we show how network measures can be used at each stage to monitor, intervene, and improve the implementation process. Examples are provided to illustrate these concepts. We conclude with expected benefits and challenges associated with this approach. PMID:26110842

  1. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolution. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS), which combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of stored data include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements ranging from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  2. Psyplot: Visualizing rectangular and triangular Climate Model Data with Python

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp

    2016-04-01

    The development and use of climate models often requires the visualization of geo-referenced data. Creating visualizations should be fast, attractive, flexible, easily applicable and easily reproducible. There is a wide range of software tools available for visualizing raster data, but they often are inaccessible to many users (e.g. because they are difficult to use in a script or have low flexibility). In order to facilitate easy visualization of geo-referenced data, we developed a new framework called "psyplot," which can aid earth system scientists with their daily work. It is purely written in the programming language Python and primarily built upon the python packages matplotlib, cartopy and xray. The package can visualize data stored on the hard disk (e.g. NetCDF, GeoTIFF, any other file format supported by the xray package), or directly from the memory or Climate Data Operators (CDOs). Furthermore, data can be visualized on a rectangular grid (following or not following the CF Conventions) and on a triangular grid (following the CF or UGRID Conventions). Psyplot visualizes 2D scalar and vector fields, enabling the user to easily manage and format multiple plots at the same time, and to export the plots into all common picture formats and movies covered by the matplotlib package. The package can currently be used in an interactive python session or in python scripts, and will soon be developed for use with a graphical user interface (GUI). Finally, the psyplot framework enables flexible configuration, allows easy integration into other scripts that uses matplotlib, and provides a flexible foundation for further development.
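
    The kind of two-line quick-look workflow that psyplot automates can be sketched directly with xarray and matplotlib (two of the packages it builds on). The file name "demo.nc" and the variable name "t2m" below are hypothetical, and dimensions of (time, lat, lon) are assumed.

        # Quick-look plot of a geo-referenced NetCDF field, a minimal sketch.
        import xarray as xr
        import matplotlib.pyplot as plt

        ds = xr.open_dataset("demo.nc")      # any CF-style NetCDF file
        ds["t2m"].isel(time=0).plot()        # render the first time step
        plt.savefig("t2m_map.png", dpi=150)  # export to a common picture format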

  3. WegenerNet climate station network region Feldbach/Austria: From local measurements to weather and climate data products at 1 km-scale resolution

    NASA Astrophysics Data System (ADS)

    Kabas, T.; Leuprecht, A.; Bichler, C.; Kirchengast, G.

    2010-12-01

    South-eastern Austria experiences a rich variety of weather and climate patterns. For this reason, the county of Feldbach was selected by the Wegener Center as a focus area for a pioneering observation experiment at very high resolution: the WegenerNet climate station network (in brief, WegenerNet) comprises 151 meteorological stations within an area of about 20 km × 15 km (~ 1.4 km × 1.4 km station grid). All stations measure the main parameters temperature, humidity and precipitation at 5-minute sampling. Selected further stations include measurements of wind speed and direction, complemented by soil parameters as well as air pressure and net radiation. The collected data are integrated into an automatic processing system including data transfer, quality control, product generation, and visualization. Each station is equipped with an internet-attached data logger, and the measurements are transferred as binary files via GPRS to the WegenerNet server at 1-hour intervals. The incoming raw data files of measured parameters, as well as several operating values of the data logger, are stored in a relational database (PostgreSQL). Next, the raw data pass the Quality Control System (QCS), in which the data are checked for technical and physical plausibility (e.g., sensor specifications, temporal and spatial variability). Taking the data quality (quality flag) into account, the Data Product Generator (DPG) produces weather and climate data products on various temporal scales (from 5 min to annual) for single stations and regular grids. Gridded data are derived by vertical scaling and squared inverse distance interpolation (1 km × 1 km and 0.01° × 0.01° grids). Both subsystems (QCS and DPG) are implemented in Python. For application purposes, the resulting data products are available via the bilingual (German/English) WegenerNet data portal (www.wegenernet.org). At this time, the main interface is still online in a system in which MapServer is used to import spatial data through its database interface and to generate images in static geographic formats. However, a Java applet is additionally needed to display these images on the user's local host. Furthermore, station data are visualized as time series using PHP. Since February 2010, the visualization of gridded data products has been a first step towards a new data portal based on OpenLayers. In this GIS framework, all geographic information (e.g., OpenStreetMap) is displayed with MapServer. Furthermore, the visualization of all meteorological parameters is generated on the fly by a Python CGI script and transparently overlaid on the maps. Hence, station data and gridded data are visualized and further prepared for download in common data formats (csv, NetCDF). In conclusion, measured data and generated data products are provided with a data latency of less than 1-2 hours in standard operation (near real time). Following an introduction of the processing system along the lines above, resulting data products are presented online at the WegenerNet data portal.
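
    The gridding step described above can be illustrated with a minimal squared-inverse-distance interpolation in Python/NumPy; the vertical-scaling step is omitted here, and the station coordinates and values are made up.

        # Squared inverse distance weighting (IDW, power 2): a minimal sketch.
        import numpy as np

        def idw2(stations_xy, values, grid_xy, eps=1e-12):
            """stations_xy: (n,2); values: (n,); grid_xy: (m,2) -> (m,)."""
            d2 = ((grid_xy[:, None, :] - stations_xy[None, :, :]) ** 2).sum(-1)
            w = 1.0 / (d2 + eps)             # 1/d^2 weights; eps avoids 0-div
            return (w * values).sum(1) / w.sum(1)

        stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
        temps = np.array([10.0, 12.0, 11.0])
        grid = np.array([[0.5, 0.5]])
        print(idw2(stations, temps, grid))   # weighted mean of nearby stations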

  4. EMODNet Hydrography - Seabed Mapping - Developing a higher resolution digital bathymetry for the European seas

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Moussat, Eric

    2013-04-01

    In December 2007 the European Parliament and Council adopted the Marine Strategy Framework Directive (MSFD), which aims to achieve environmentally healthy marine waters by 2020. This Directive includes an initiative for an overarching European Marine Observation and Data Network (EMODNet). The EMODNet Hydrography - Seabed Mapping projects made good progress in developing the EMODNet Hydrography portal to provide overview of and access to available bathymetric survey datasets and to generate a harmonised digital bathymetry for Europe's sea basins. By the end of 2012, more than 8400 bathymetric survey datasets, managed by 14 data centres from 9 countries and originating from 118 institutes, had been gathered and populated in the EMODNet Hydrography Data Discovery and Access service, adopting SeaDataNet standards. These datasets have been used as input for analysing and generating the EMODNet digital terrain model (DTM), so far for the following sea basins: • the Greater North Sea, including the Kattegat • the English Channel and Celtic Seas • Western and Central Mediterranean Sea and Ionian Sea • Bay of Biscay, Iberian coast and North-East Atlantic • Adriatic Sea • Aegean - Levantine Sea (Eastern Mediterranean) • Azores - Madeira EEZ. The Hydrography Viewing service gives users wide functionality for viewing and downloading the EMODNet digital bathymetry: • water depth in gridded form on a DTM grid of a quarter of a minute of longitude and latitude • option to view QC parameters of individual DTM cells and references to source data • option to download DTM tiles in different formats: ESRI ASCII, XYZ, CSV, NetCDF (CF), GeoTiff and SD for Fledermaus 3D viewer software • option for users to create their Personal Layer and to upload multibeam survey ASCII datasets for automatic processing into personal DTMs following the EMODNet standards. The NetCDF (CF) DTM files are fit for use in a special 3D viewer software package based on the existing open-source NASA World Wind Java SDK application. It was developed in the frame of the EU Geo-Seas project (another sibling of SeaDataNet, for marine geological and geophysical data) and is freely available. The 3D viewer also supports the ingestion of WMS overlay maps. The EMODNet consortium is actively seeking cooperation with Hydrographic Offices, research institutes, authorities and private organisations for additional data sets (single and multibeam surveys, sounding tracks, composite products) to contribute to an even better geographical coverage. These datasets will be used for upgrading and extending the EMODNet regional Digital Terrain Models (DTMs). The datasets themselves are not distributed but are described in the metadata service, giving clear information about the background survey data used for the DTM, their access restrictions, originators and distributors, and facilitating requests by users to originators. This way the portal provides originators of bathymetric data sets an attractive shop window for promoting their data sets to potential users, without losing control. The EMODNet Hydrography Consortium consists of MARIS (NL), ATLIS (NL), IFREMER (FR), SHOM (FR), IEO (ES), GSI (IE), NERC-NOCS (UK), OGS (IT), HCMR (GR), and UNEP/GRID-Arendal (NO) with associate partners CNR-ISMAR (IT), OGS-RIMA (IT), IHPT (PT), and LNEG (PT). Website: http://www.emodnet-hydrography.eu

  5. A local structure model for network analysis

    DOE PAGES

    Casleton, Emily; Nordman, Daniel; Kaiser, Mark

    2017-04-01

    The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.
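
    The edge-level conditional formulation can be illustrated with a toy Gibbs sampler: each possible edge is resampled from a conditional probability that depends only on neighbouring edges (those sharing an endpoint). The logistic form and the parameter values below are illustrative assumptions, not the paper's exact specification.

        # Toy Gibbs sampler for a local-structure-style edge model.
        import itertools
        import math
        import random

        n, alpha, beta = 8, -2.0, 0.4          # illustrative parameters
        pairs = list(itertools.combinations(range(n), 2))
        state = {e: 0 for e in pairs}          # 0/1 edge indicators

        def neighbours(e):
            # Edges that share an endpoint with e form its local neighbourhood.
            return [f for f in pairs if f != e and set(f) & set(e)]

        random.seed(0)
        for sweep in range(200):
            for e in pairs:
                s = sum(state[f] for f in neighbours(e))
                p = 1.0 / (1.0 + math.exp(-(alpha + beta * s)))
                state[e] = 1 if random.random() < p else 0

        print("edges present:", sum(state.values()), "of", len(pairs))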

  6. A local structure model for network analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casleton, Emily; Nordman, Daniel; Kaiser, Mark

    The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.

  7. Common and distinct brain networks underlying verbal and visual creativity.

    PubMed

    Zhu, Wenfeng; Chen, Qunlin; Xia, Lingxiang; Beaty, Roger E; Yang, Wenjing; Tian, Fang; Sun, Jiangzhou; Cao, Guikang; Zhang, Qinglin; Chen, Xu; Qiu, Jiang

    2017-04-01

    Creativity is imperative to the progression of human civilization, prosperity, and well-being. Past creativity research has tended to emphasize the default mode network (DMN) or the frontoparietal network (FPN) somewhat exclusively. However, little is known about how these networks interact to contribute to creativity and whether common or distinct brain networks are responsible for visual and verbal creativity. Here, we use functional connectivity analysis of resting-state functional magnetic resonance imaging data to investigate visual and verbal creativity-related regions and networks in 282 healthy subjects. We found that functional connectivity within the bilateral superior parietal cortex of the FPN was negatively associated with visual and verbal creativity. The strength of connectivity between the DMN and FPN was positively related to both creative domains. Visual creativity was negatively correlated with functional connectivity within the precuneus of the pDMN and right middle frontal gyrus of the FPN, and verbal creativity was negatively correlated with functional connectivity within the medial prefrontal cortex of the aDMN. Critically, the FPN mediated the relationship between the aDMN and verbal creativity, and it also mediated the relationship between the pDMN and visual creativity. Taken together, decreased within-network connectivity of the FPN and DMN may allow for flexible between-network coupling in the highly creative brain. These findings provide indirect evidence for the cooperative role of the default and executive control networks in creativity, extending past research by revealing common and distinct brain systems underlying verbal and visual creative cognition. Hum Brain Mapp 38:2094-2111, 2017. © 2017 Wiley Periodicals, Inc.

  8. SAR Altimetry Processing on Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD

    NASA Astrophysics Data System (ADS)

    Dinardo, Salvatore; Lucas, Bruno; Benveniste, Jerome

    2015-12-01

    The scope of this work is to present the new ESA service (SARvatore) for the exploitation of CryoSat-2 data, designed and developed entirely by the Altimetry Team at ESA-ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The G-POD service SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation) for CryoSat-2 is a web platform that provides the capability to process CryoSat-2 SAR/SARIN data on-line and on-demand, from L1a (FBR) data products up to SAR/SARIN Level-2 geophysical data products. The processor makes use of the G-POD (Grid-Processing On Demand) distributed computing platform to deliver the output data products in a timely manner. These output data products are generated in standard NetCDF format (using the CF Convention) and are compatible with BRAT (Basic Radar Altimetry Toolbox) and other NetCDF tools. Using the G-POD graphic interface, it is easy to select the geographical area of interest along with the time frame of interest, based on the CryoSat-2 SAR/SARIN FBR data products available in the service's catalogue. After task submission, users can follow the status of the processing task in real time. The processor prototype is versatile in the sense that users can customize and adapt the processing according to their specific requirements by setting a list of configurable options. The processing service is meant to be used for research and development experiments, to support development contracts by comparing deliverables against ESA products, for on-site demonstrations and training in courses and workshops, for cross-comparison against third-party products (for instance, CLS/CNES CPP products), for preparation for the Sentinel-3 Topographic mission, for producing data and graphics for publications, etc. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to extend this processing capacity over the coastal zone, inland water and land, in view of maximizing the exploitation of the upcoming Sentinel-3 Topographic mission over all surfaces. The service is open and free of charge.

  9. The PEcAn Project: Accessible Tools for On-demand Ecosystem Modeling

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Kooper, R.; LeBauer, D.; Desai, A. R.; Mantooth, J.; Dietze, M.

    2014-12-01

    Ecosystem models play a critical role in understanding the terrestrial biosphere and forecasting changes in the carbon cycle; however, current forecasts have considerable uncertainty. The amount of data being collected and produced is increasing on a daily basis as we enter the "big data" era, but only a fraction of these data is being used to constrain models. Until we can improve the problems of model accessibility and model-data communication, none of these resources can be used to their full potential. The Predictive Ecosystem Analyzer (PEcAn) is an ecoinformatics toolbox and a set of workflows that wrap around an ecosystem model and manage the flow of information in and out of regional-scale terrestrial biosphere models (TBMs). Here we present new modules developed in PEcAn to manage the processing of meteorological data, one of the primary driver dependencies for ecosystem models. The module downloads, reads, extracts, and converts meteorological observations to the Unidata Climate and Forecast (CF) NetCDF community standard, a convention used for most climate forecast and weather models. The module also automates the conversion from NetCDF to model-specific formats, including basic merging, gap-filling, and downscaling procedures. PEcAn currently supports tower-based micrometeorological observations at Ameriflux and FluxNET sites, site-level CSV-formatted data, and regional and global reanalysis products such as the North American Regional Reanalysis and CRU-NCEP. The workflow is easily extensible to additional products and processing algorithms. These meteorological workflows have been coupled with the PEcAn web interface and now allow anyone to run multiple ecosystem models for any location on the Earth by simply clicking on an intuitive Google-map based interface. This will allow users to more readily compare models to observations at those sites, leading to better calibration and validation. Current work is extending these workflows to also process field, remotely sensed, and historical observations of vegetation composition and structure. The processing of heterogeneous met and veg data within PEcAn is made possible using the Brown Dog cyberinfrastructure tools for unstructured data.
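
    The conversion step can be sketched in a few lines with the xarray package: wrap a site-level time series in CF-style variable and file metadata and write NetCDF. The variable and attribute choices below follow common CF usage, and the data are placeholders; PEcAn's actual converters live in its own modules.

        # Writing a CF-style NetCDF met record: a minimal sketch.
        import numpy as np
        import pandas as pd
        import xarray as xr

        times = pd.date_range("2014-01-01", periods=24, freq="h")
        tair = 270.0 + 5.0 * np.random.rand(24)   # placeholder observations

        ds = xr.Dataset(
            {"air_temperature": ("time", tair,
                                 {"units": "K",
                                  "standard_name": "air_temperature"})},
            coords={"time": times},
            attrs={"Conventions": "CF-1.6",
                   "title": "toy site-level met record"},
        )
        ds.to_netcdf("site_met.nc")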

  10. Usability and Interoperability Improvements for an EASE-Grid 2.0 Passive Microwave Data Product Using CF Conventions

    NASA Astrophysics Data System (ADS)

    Hardman, M.; Brodzik, M. J.; Long, D. G.

    2017-12-01

    Beginning in 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Historical versions of the gridded passive microwave data sets were produced as flat binary files described in human-readable documentation. This format is error-prone and makes it difficult to reliably include all processing and provenance. Funded by NASA MEaSUREs, we have completely reprocessed the gridded data record that includes SMMR, SSM/I-SSMIS and AMSR-E. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) files are self-describing. Our approach to the new data set was to create netCDF4 files that use standard metadata conventions and best practices to incorporate file-level, machine- and human-readable contents, geolocation, processing and provenance metadata. We followed the flexible and adaptable Climate and Forecast (CF-1.6) Conventions with respect to their coordinate conventions and map projection parameters. Additionally, we made use of the Attribute Conventions for Dataset Discovery (ACDD-1.3), which provide file-level conventions with spatio-temporal bounds that enable indexing software to search for coverage. Our CETB files also include temporal coverage and spatial resolution in the file-level metadata for human readability. We made use of the JPL CF/ACDD Compliance Checker to guide this work. We tested our file format with real software, for example, the netCDF Operators (NCO), command-line power tools for fine-grained spatio-temporal subsetting and concatenation of files. The GDAL tools understand the CF metadata and produce fully compliant GeoTIFF files from our data. ArcMap can then reproject the GeoTIFF files on-the-fly and work with other geolocated data such as coastlines, with no special work required. We expect this combination of standards and well-tested interoperability to significantly improve the usability of this important ESDR for the Earth Science community.
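
    A sketch of the kind of file-level CF/ACDD metadata described above, written with the netCDF4 Python package; the file name and all attribute values are placeholders.

        # Placing CF/ACDD discovery metadata at the file level: a sketch.
        from netCDF4 import Dataset

        nc = Dataset("cetb_style.nc", "w", format="NETCDF4")
        nc.Conventions = "CF-1.6, ACDD-1.3"
        nc.title = "Example gridded brightness temperature file"
        nc.summary = "Toy file showing discovery metadata placement."
        nc.geospatial_lat_min, nc.geospatial_lat_max = -90.0, 90.0
        nc.geospatial_lon_min, nc.geospatial_lon_max = -180.0, 180.0
        nc.time_coverage_start = "1978-10-25T00:00:00Z"
        nc.time_coverage_end = "1978-10-25T12:00:00Z"
        nc.history = "created by a metadata example script"
        nc.close()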

  11. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs, together with sophisticated Scalable Vector Graphics (SVG) features, in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics • To develop an extensible library that can combine data from multiple sources and render them using multiple backends • To build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.

  12. Improving Metadata Compliance for Earth Science Data Records

    NASA Astrophysics Data System (ADS)

    Armstrong, E. M.; Chang, O.; Foster, D.

    2014-12-01

    One of the recurring challenges of creating earth science data records is to ensure a consistent level of metadata compliance at the granule level, where important details of contents, provenance, producer, and data references are necessary to obtain a sufficient level of understanding. These details are important not just for individual data consumers but also for autonomous software systems. Two of the most popular metadata standards at the granule level are the Climate and Forecast (CF) Metadata Conventions and the Attribute Conventions for Dataset Discovery (ACDD). Many data producers have implemented one or both of these models, including the Group for High Resolution Sea Surface Temperature (GHRSST) for their global SST products and the Ocean Biology Processing Group for NASA ocean color and SST products. While both the CF and ACDD models contain various levels of metadata richness, the actual "required" attributes are quite small in number. Metadata at the granule level become much more useful when recommended or optional attributes are implemented that document spatial and temporal ranges, lineage and provenance, sources, keywords, and references, etc. In this presentation we report on a new open-source tool to check the compliance of netCDF and HDF5 granules with the CF and ACDD metadata models. The tool, written in Python, was originally implemented to support metadata compliance for netCDF records as part of NOAA's Integrated Ocean Observing System. It outputs standardized scoring for metadata compliance with both CF and ACDD, produces an objective summary weight, and can be run on remote records via OPeNDAP calls. The tool was originally command-line only; we have extended it to provide a user-friendly web interface. Reports on metadata testing are grouped in hierarchies that make it easier to track flaws and inconsistencies in the record. We have also extended it to support explicit metadata structures and semantic syntax for the GHRSST project that can be easily adapted to other satellite missions as well. Overall, we hope this tool will provide the community with a useful mechanism to improve metadata quality and consistency at the granule level by providing objective scoring and assessment, as well as encourage data producers to improve metadata quality and quantity.
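
    As a toy illustration of the kind of granule-level check such a tool performs, the sketch below scans a NetCDF file for a few ACDD discovery attributes and reports a simple coverage score; the attribute list is a small subset chosen for illustration, the file path is hypothetical, and the real checker is far more thorough.

        # A toy ACDD attribute-coverage check (not the actual checker).
        from netCDF4 import Dataset

        RECOMMENDED = ["title", "summary", "keywords", "time_coverage_start",
                       "geospatial_lat_min", "geospatial_lon_min", "history"]

        def score_acdd(path):
            with Dataset(path) as nc:
                present = [a for a in RECOMMENDED if hasattr(nc, a)]
            missing = sorted(set(RECOMMENDED) - set(present))
            return len(present) / len(RECOMMENDED), missing

        score, missing = score_acdd("cetb_style.nc")
        print(f"ACDD attribute coverage: {score:.0%}; missing: {missing}")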

  13. SAR Altimetry Processing on Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Dinardo, S.; Lucas, B.

    2014-12-01

    The scope of this work is to show the new ESA service (SARvatore) for the exploitation of CryoSat-2 data and upcoming Sentinel-3 data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER. The G-POD (Grid-Processing On Demand) service SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation) for CryoSat-2 is a web platform that provides the capability to process CryoSat-2 SAR data on-line and on demand, starting from L1a (FBR) data up to SAR Level-2 geophysical data products. The service is based on the SARvatore processor prototype. The output data products are generated in standard NetCDF format (using the CF Convention) and are compatible with BRAT (Basic Radar Altimetry Toolbox) and its successor, the upcoming Sentinel-3 Altimetry Toolbox, as well as other NetCDF tools. Using the G-POD graphic interface, it is possible to easily select the geographical area of interest along with the time of interest. As of August 2014, the service allows the user to select data for most of 2013 and part of 2014, with no geographical restriction on these data. It is expected that before Fall 2014 the whole mission data set (when available) will be at the disposal of the users. The processor prototype is versatile in the sense that users can customize and adapt the processing according to their specific requirements by setting a list of configurable options. The processing service is meant to be used for research and development purposes, supporting development contracts, on-site demonstrations and training for selected users, cross-comparison against third-party products, preparation for the Sentinel-3 mission, publications, etc. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to extend this processing capacity over coastal zones, inland waters and land, in view of maximizing the exploitation of the upcoming Sentinel-3 Topographic mission over all surfaces.

  14. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, making iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) decrease the time to compute comparison statistics and plots from minutes to seconds; (2) allow for interactive exploration of time-series properties over seasons and years; (3) decrease the time for satellite data ingestion into RCMES to hours; (4) allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) move RCMES into a near-real-time decision-making platform. We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory and disk usage.
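
    The underlying map-reduce pattern can be sketched with plain PySpark: distribute the time steps of a NetCDF variable across workers and reduce to a global mean. The file and variable names are hypothetical, and SciSpark itself adds partitioned scientific arrays on top of this basic idea.

        # Map-reduce over NetCDF time slices with PySpark: a minimal sketch.
        import numpy as np
        from netCDF4 import Dataset
        from pyspark import SparkContext

        sc = SparkContext(appName="netcdf-mean")

        def slice_stats(t):
            # Each worker opens the (shared) file and reads one time slice.
            with Dataset("tas_monthly.nc") as nc:
                data = np.asarray(nc.variables["tas"][t, :, :], dtype=float)
            return data.sum(), data.size

        with Dataset("tas_monthly.nc") as nc:
            nsteps = len(nc.dimensions["time"])

        total, count = (sc.parallelize(range(nsteps))
                          .map(slice_stats)
                          .reduce(lambda a, b: (a[0] + b[0], a[1] + b[1])))
        print("global mean:", total / count)
        sc.stop()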

  15. A web portal for accessing, viewing and comparing in situ observations, EO products and model output data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hamre, Torill; Lygre, Kjetil

    2014-05-01

    The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for the Environment and Security (GMES) programme. A key feature of the system is its ability to compare data from different datasets, including an option to upload one's own netCDF files. The user can, for example, search an in situ database for different variables (like temperature, salinity, different elements, light, specific plankton types or rate measurements) with different criteria (bounding box, date/time, depth, Longhurst region, cruise/transect) and compare the data with model data. The user can choose model data or Earth observation data from a list, or upload his/her own netCDF files to use in the comparison. The data can be visualized on a map, as graphs and plots (e.g., time series and property-property plots), or downloaded in various formats. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators for a range of stakeholders, such as scientists, policy makers and environmental managers. We have implemented a web-based GIS (Geographical Information System) and want to demonstrate its use. The tool is designed for a wide range of users: novice users, who want a simple way to get basic information about the current state of the marine planktonic ecosystem by utilizing predefined queries and comparisons with models; intermediate-level users, who want to explore the database on their own and customize the predefined setups; and advanced users, who want to perform complex queries and inventory searches and compare the data in their own way or with their own models.

  16. Formation of Common Investment Networks by Project Establishment between Agents

    NASA Astrophysics Data System (ADS)

    Navarro-Barrientos, Jesús Emeterio

    We present an investment model integrated with trust and reputation mechanisms where agents interact with each other to establish investment projects. We investigate the establishment of investment projects, the influence of the interaction between agents in the evolution of the distribution of wealth as well as the formation of common investment networks and some of their properties. Simulation results show that the wealth distribution presents a power law in its tail. Also, it is shown that the trust and reputation mechanism proposed leads to the establishment of networks among agents, presenting some of the typical characteristics of real-life networks like a high clustering coefficient and short average path length.
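
    The two small-world diagnostics mentioned above are easy to compute with the networkx Python package; the graph below is a stand-in, since the paper's networks emerge from the agent simulation itself.

        # Clustering coefficient and average path length on a stand-in graph.
        import networkx as nx

        G = nx.connected_watts_strogatz_graph(200, 6, 0.1)
        print("average clustering:", nx.average_clustering(G))
        print("average shortest path:", nx.average_shortest_path_length(G))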

  17. Modular thought in the circuit analysis

    NASA Astrophysics Data System (ADS)

    Wang, Feng

    2018-04-01

    Modular thinking provides a way to simplify a complex problem by treating parts of it as a whole, turning complicated problems into simpler ones. The study of circuits poses a similar problem: the complex connections between components make the solution of a whole circuit problem appear more complex than it is, although the connections between components actually follow rules. This article mainly discusses the application of modular thinking to the study of circuits. First, it introduces the definition of a two-terminal network and the concept of equivalent conversion of two-terminal networks; it then summarizes modular approaches to common source-resistance hybrid networks and to networks containing controlled sources, and lists common modules with an analysis of typical examples.
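
    A tiny illustration of the modular idea for the simplest case: a resistor sub-network is treated as one two-terminal module and reduced to a single equivalent value. The helper functions and values are illustrative only; controlled-source networks need the fuller treatment the article describes.

        # Reducing series/parallel resistor modules to one equivalent value.
        def series(*rs):
            return sum(rs)

        def parallel(*rs):
            return 1.0 / sum(1.0 / r for r in rs)

        # A 100-ohm resistor in series with two parallel 200-ohm branches:
        print(series(100.0, parallel(200.0, 200.0)))  # -> 200.0 ohms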

  18. Changing knowledge perspective in a changing world: The Adriatic multidisciplinary TDS approach

    NASA Astrophysics Data System (ADS)

    Bergamasco, Andrea; Carniel, Sandro; Nativi, Stefano; Signell, Richard P.; Benetazzo, Alvise; Falcieri, Francesco M.; Bonaldo, Davide; Minuzzo, Tiziano; Sclavo, Mauro

    2013-04-01

    The use and exploitation of the marine environment has increased markedly in recent years, calling for a better description, monitoring and understanding of its behavior. However, marine scientists and managers often spend too much time accessing and reformatting data instead of focusing on discovering new knowledge from the processes observed and the data acquired. There is therefore a need to make our approach to data mining more efficient, especially in a world where rapid climate change imposes rapid choices. In this context, it is mandatory to explore ways of making large amounts of distributed data usable in an efficient and easy way, an effort that requires standardized data protocols, web services and standards-based tools. Following the US-IOOS approach, which has been adopted in many oceanographic and meteorological sectors, we present a CNR experience in the direction of setting up a national Italian IOOS framework (at the moment confined to the Adriatic Sea environment), using the THREDDS (THematic Real-time Environmental Distributed Data Services) Data Server (TDS). A TDS is a middleware designed to fill the gap between data providers and data users; it provides services allowing data users to find the data sets pertaining to their scientific needs and to access, visualize and use them in an easy way, without the need to download files to the local workspace. To achieve this, the data providers must make their data available in a standard form that the TDS understands, with sufficient metadata so that the data can be read and searched in a standard way. The TDS core is the NetCDF-Java library implementing a Common Data Model (CDM), as developed by Unidata (http://www.unidata.ucar.edu), allowing access to "array-based" scientific data. Climate and Forecast (CF) compliant NetCDF files can be read directly with no modification, while non-compliant files can be modified to meet appropriate metadata requirements. Once standardized in the CDM, the TDS makes datasets available through a series of web services such as OPeNDAP or the Open Geospatial Consortium Web Coverage Service (WCS), allowing data users to easily obtain small subsets from large datasets and to quickly visualize their content using tools such as GODIVA2 or the Integrated Data Viewer (IDV). In addition, an ISO metadata service is available through the TDS that can be harvested by catalogue broker services (e.g., GI-cat) to enable distributed search across federated data servers. Examples of TDS datasets describing oceanographic processes (currents, waves, sediments, ...) will be described and discussed; some examples can be accessed directly at the Venice site http://tds.ve.ismar.cnr.it:8080/thredds/catalog.html (Bergamasco et al., 2012), also within the framework of the RITMARE Project. References Bergamasco A., Benetazzo A., Carniel S., Falcieri F., Minuzzo T., Signell R.P. and M. Sclavo, 2012. From interoperability to knowledge discovery using large model datasets in the marine environment: the THREDDS Data Server example. Advances in Oceanography and Limnology, 3(1), 41-50. DOI:10.1080/19475721.2012.669637
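
    The subset-without-download access pattern described above can be sketched with the xarray Python package against a TDS OPeNDAP endpoint; the host below is taken from the abstract, but the dataset path and variable name are hypothetical.

        # Reading a subset straight from a THREDDS OPeNDAP endpoint.
        import xarray as xr

        url = "http://tds.ve.ismar.cnr.it:8080/thredds/dodsC/some/dataset.nc"
        ds = xr.open_dataset(url)            # no local file download needed
        subset = ds["sea_water_temperature"].isel(time=-1)  # last time step
        print(subset.mean().values)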

  19. Sharing from Scratch: How To Network CD-ROM.

    ERIC Educational Resources Information Center

    Doering, David

    1998-01-01

    Examines common CD-ROM networking architectures: via existing operating systems (OS), thin server towers, and dedicated servers. Discusses digital video disc (DVD) and non-CD/DVD optical storage solutions and presents case studies of networks that work. (PEN)

  20. Assessing and comparing relationships between urban environmental stewardship networks and land cover in Baltimore and Seattle

    Treesearch

    Michele Romolini; J. Morgan Grove; Dexter H. Locke

    2013-01-01

    Implementation of urban sustainability policies often requires collaborations between organizations across sectors. Indeed, it is commonly agreed that governance by environmental networks is preferred to individual organizations acting alone. Yet research shows that network structures vary widely, and that these variations can impact network effectiveness. However,...

  1. Predicting links based on knowledge dissemination in complex network

    NASA Astrophysics Data System (ADS)

    Zhou, Wen; Jia, Yifan

    2017-04-01

    Link prediction is the task of mining missing links in networks or predicting the next vertex pair to be connected by a link. Many link prediction methods have been inspired by the evolutionary processes of networks. In this paper, a new mechanism for the formation of complex networks called knowledge dissemination (KD) is proposed, under the assumption that knowledge disseminates along the paths of a network. Accordingly, a new link prediction method, knowledge dissemination based link prediction (KDLP), is proposed to test KD. KDLP characterizes vertex similarity based on knowledge quantity (KQ), which measures the importance of a vertex through its H-index. Extensive numerical simulations on six real-world networks demonstrate that KDLP is a strong link prediction method which achieves higher prediction accuracy than four well-known similarity measures, including common neighbors, the local path index, average commute time and the matrix forest index. Furthermore, based on the common conclusion that an excellent link prediction method reveals a good evolving mechanism, the experimental results suggest that KD is a plausible network evolution mechanism for the formation of complex networks.
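
    The H-index-style "knowledge quantity" idea can be sketched as follows: a vertex has KQ h if at least h of its neighbours have degree at least h. The way KDLP combines KQ into a similarity score in the paper is more involved; here, purely for illustration, the common neighbours' KQ values are simply summed.

        # H-index-based node importance and a toy KDLP-like pair score.
        import networkx as nx

        def h_index(values):
            vals = sorted(values, reverse=True)
            h = 0
            for i, v in enumerate(vals, start=1):
                if v >= i:
                    h = i
            return h

        def kq(G, u):
            # Knowledge quantity: H-index over the neighbours' degrees.
            return h_index([G.degree(v) for v in G.neighbors(u)])

        def kdlp_like_score(G, u, v):
            return sum(kq(G, w) for w in nx.common_neighbors(G, u, v))

        G = nx.karate_club_graph()
        print(kdlp_like_score(G, 0, 33))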

  2. The most common friend first immunization

    NASA Astrophysics Data System (ADS)

    Nian, Fu-Zhong; Hu, Cha-Sheng

    2016-12-01

    In this paper, a standard susceptible-infected-recovered-susceptible (SIRS) epidemic model based on the Watts-Strogatz (WS) small-world network model and the Barabási-Albert (BA) scale-free network model is established, and a new immunization scheme — "most common friend first immunization" — is proposed, in which the node that is the most common friend is immunized first, as a second layer of protection for the complex network. The propagation behavior of three different immunization schemes — random immunization, high-risk immunization, and most common friend first immunization — is studied. At the same time, the dynamic behaviors are also studied on the WS small-world and the BA scale-free networks. Moreover, the analytical and simulation results indicate that the immunization effect of most common friend first immunization is better than that of random immunization, but slightly worse than that of high-risk immunization. However, high-risk immunization still has some limitations. For example, it is difficult to define accurately who a direct neighbor in real life is. Compared with the traditional immunization strategies and their shortcomings, most common friend first immunization is effective and consistent with the actual situation. Project supported by the National Natural Science Foundation of China (Grant No. 61263019), the Program for International Science and Technology Cooperation Projects of Gansu Province, China (Grant No. 144WCGA166), and the Program for Longyuan Young Innovation Talents and the Doctoral Foundation of Lanzhou University of Technology, China.
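
    A minimal experiment in this spirit can be run with networkx: immunize a budget of nodes under different strategies and compare average outbreak sizes. Here "most common friend first" is read, purely illustratively, as immunizing the nodes that most often appear as a common neighbour of linked pairs; the spreading model below is a one-step-infectious SIR toy, not the paper's SIRS dynamics.

        # Comparing immunization strategies on a scale-free graph: a sketch.
        import random
        from collections import Counter
        import networkx as nx

        def outbreak_size(G, immune, beta=0.2):
            susceptible = set(G) - set(immune)
            seed = random.choice(sorted(susceptible))
            infected, ever = {seed}, {seed}
            while infected:
                new = set()
                for u in infected:
                    for v in G.neighbors(u):
                        if v in susceptible and v not in ever \
                           and random.random() < beta:
                            new.add(v)
                ever |= new
                infected = new        # each case is infectious for one step
            return len(ever)

        def most_common_friends(G, k):
            # Nodes that most often are a common neighbour of an edge's ends.
            c = Counter(w for u, v in G.edges
                          for w in nx.common_neighbors(G, u, v))
            return [n for n, _ in c.most_common(k)]

        random.seed(0)
        G = nx.barabasi_albert_graph(1000, 3)
        k = 50
        strategies = {
            "random": random.sample(sorted(G), k),
            "high-degree": [n for n, _ in
                            sorted(G.degree, key=lambda kv: -kv[1])[:k]],
            "common-friend": most_common_friends(G, k),
        }
        for name, immune in strategies.items():
            sizes = [outbreak_size(G, immune) for _ in range(20)]
            print(name, sum(sizes) / len(sizes))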

  3. 47 CFR 32.6110 - Network support expenses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Network support expenses. 32.6110 Section 32.6110 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6110 Network...

  4. 47 CFR 32.6530 - Network operations expense.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Network operations expense. 32.6530 Section 32.6530 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6530 Network...

  5. 47 CFR 32.6110 - Network support expenses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Network support expenses. 32.6110 Section 32.6110 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6110 Network...

  6. 47 CFR 32.6530 - Network operations expense.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Network operations expense. 32.6530 Section 32.6530 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Expense Accounts § 32.6530 Network...

  7. NDEx 2.0: A Clearinghouse for Research on Cancer Pathways.

    PubMed

    Pratt, Dexter; Chen, Jing; Pillich, Rudolf; Rynkov, Vladimir; Gary, Aaron; Demchak, Barry; Ideker, Trey

    2017-11-01

    We present NDEx 2.0, the latest release of the Network Data Exchange (NDEx) online data commons (www.ndexbio.org) and the ways in which it can be used to (i) improve the quality and abundance of biological networks relevant to the cancer research community; (ii) provide a medium for collaboration involving networks; and (iii) facilitate the review and dissemination of networks. We describe innovations addressing the challenges of an online data commons: scalability, data integration, data standardization, control of content and format by authors, and decentralized mechanisms for review. The practical use of NDEx is presented in the context of a novel strategy to foster network-oriented communities of interest in cancer research by adapting methods from academic publishing and social media. Cancer Res; 77(21); e58-61. ©2017 AACR.

  8. Efficient quantum transmission in multiple-source networks.

    PubMed

    Luo, Ming-Xing; Xu, Gang; Chen, Xiu-Bo; Yang, Yi-Xian; Wang, Xiaojun

    2014-04-02

    A difficult problem in quantum network communications is how to efficiently transmit quantum information over large-scale networks with common channels. We propose a solution by developing a quantum encoding approach. Different quantum states are encoded into a coherent superposition state using quantum linear optics. The transmission congestion in the common channel may be avoided by transmitting the superposition state. For further decoding and continued transmission, special phase transformations are applied to incoming quantum states using phase shifters such that decoders can distinguish outgoing quantum states. These phase shifters may be precisely controlled using classical chaos synchronization via additional classical channels. Based on this design and on the reduction of the multiple-source network under the assumption of restricted maximum flow, an optimal scheme is proposed for the specially quantized multiple-source network. In comparison with previous schemes, our scheme can greatly increase the transmission efficiency.

  9. Dedifferentiation Does Not Account for Hyperconnectivity after Traumatic Brain Injury.

    PubMed

    Bernier, Rachel Anne; Roy, Arnab; Venkatesan, Umesh Meyyappan; Grossner, Emily C; Brenner, Einat K; Hillary, Frank Gerard

    2017-01-01

    Changes in functional network connectivity following traumatic brain injury (TBI) have received increasing attention in the recent neuroimaging literature. This study sought to understand how disrupted systems adapt to injury during resting and goal-directed brain states. Hyperconnectivity has been a common finding, and dedifferentiation (or loss of segregation of networks) is one possible explanation for this finding. We hypothesized that individuals with TBI would show dedifferentiation of networks (as noted in other clinical populations) and that these effects would be associated with cognitive dysfunction. Graph theory was implemented to examine functional connectivity during periods of task and rest in 19 individuals with moderate/severe TBI and 14 healthy controls (HCs). Using a functional brain atlas derived from 83 functional imaging studies, graph theory was used to examine network dynamics and determine whether dedifferentiation accounts for changes in connectivity. Regions of interest were assigned to one of three groups: task-positive, default mode, or other networks. Relationships between these metrics were then compared with performance on neuropsychological tests. Hyperconnectivity in TBI was most commonly observed as increased within-network connectivity. Network strengths within networks that showed differences between TBI and HCs were correlated with performance on five neuropsychological tests typically sensitive to deficits commonly reported in TBI. Hyperconnectivity within the default mode network (DMN) during task was associated with better performance on Digit Span Backward, a measure of working memory [R²(18) = 0.28, p = 0.02]. In other words, increased differentiation of networks during task was associated with better working memory. Hyperconnectivity within the task-positive network during rest was not associated with behavior. Negative correlation weights were not associated with behavior. The primary hypothesis that hyperconnectivity occurs through increased segregation of networks, rather than dedifferentiation, was not supported. Instead, enhanced connectivity post-injury was observed within networks. Results suggest that the relationship between increased connectivity and cognitive functioning may be both state (rest or task) and network dependent. High-cost network hubs were identical for both rest and task, and cost was negatively associated with performance on measures of psychomotor speed and set-shifting.

  10. Naming Game on Networks: Let Everyone be Both Speaker and Hearer

    PubMed Central

    Gao, Yuan; Chen, Guanrong; Chan, Rosa H. M.

    2014-01-01

    To investigate how consensus is reached on a large self-organized peer-to-peer network, we extended the naming game model commonly used in language and communication to the Naming Game in Groups (NGG). Differing from other existing naming game models, in NGG everyone in the population (network) can be both speaker and hearer simultaneously, which more closely resembles real-life scenarios. Moreover, NGG allows the transmission (communication) of multiple words (opinions) for multiple intra-group consensuses. Communications among indirectly connected nodes are also enabled in NGG. We simulated and analyzed the consensus process in some typical network topologies, including random-graph networks, small-world networks, and scale-free networks, to better understand how global convergence (consensus) can be reached on one common word. The results are interpreted in terms of group negotiation on a peer-to-peer network, showing that global consensus in the population can be reached more rapidly when more opinions are permitted within each group or when the negotiating groups in the population are larger in size. The novel features and properties introduced by our model demonstrate its applicability to investigating general consensus problems on peer-to-peer networks. PMID:25143140
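
    The mechanics of the underlying naming game are easy to make concrete. The toy simulation below implements the classic pairwise variant on a fully connected population (not the grouped, simultaneous speaker-hearer NGG extension of the paper); all parameters are arbitrary:

    ```python
    import random

    random.seed(1)
    n_agents, n_steps = 50, 20000
    vocab = [[] for _ in range(n_agents)]   # each agent's word inventory
    next_word = 0

    for step in range(n_steps):
        speaker, hearer = random.sample(range(n_agents), 2)
        if not vocab[speaker]:              # speaker invents a word if needed
            next_word += 1
            vocab[speaker].append(next_word)
        word = random.choice(vocab[speaker])
        if word in vocab[hearer]:           # success: both collapse to the word
            vocab[speaker] = [word]
            vocab[hearer] = [word]
        else:                               # failure: hearer learns the word
            vocab[hearer].append(word)
        if all(len(v) == 1 for v in vocab) and len({v[0] for v in vocab}) == 1:
            print(f"global consensus reached after {step + 1} interactions")
            break
    ```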

  11. Putting age-related task activation into large-scale brain networks: A meta-analysis of 114 fMRI studies on healthy aging.

    PubMed

    Li, Hui-Jie; Hou, Xiao-Hui; Liu, Han-Hui; Yue, Chun-Lin; Lu, Guang-Ming; Zuo, Xi-Nian

    2015-10-01

    Normal aging is associated with cognitive decline and underlying brain dysfunction. Previous studies have paid less attention to brain network changes at the systems level. Our goal was to examine these age-related changes in fMRI-derived activation using a common network parcellation of human brain function, offering a systems-neuroscience perspective on healthy aging. We conducted a series of meta-analyses on a total of 114 studies that included 2035 older adults and 1845 young adults. Voxels showing significant age-related changes in activation were then overlaid onto seven commonly referenced neuronal networks. Older adults present moderate cognitive decline in behavioral performance during fMRI scanning, hypo-activating the visual network and hyper-activating both the frontoparietal control and default mode networks. The degree of increased activation in the frontoparietal network was associated with behavioral performance in older adults. Age-related changes in activation present different network patterns across cognitive domains. The systems neuroscience approach used here may be useful for elucidating the underlying network mechanisms of various brain plasticity processes during healthy aging. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. On the Wiener Polarity Index of Lattice Networks.

    PubMed

    Chen, Lin; Li, Tao; Liu, Jinfeng; Shi, Yongtang; Wang, Hua

    2016-01-01

    Network structures are everywhere, including but not limited to applications in the biological, physical, and social sciences, information technology, and optimization. Network robustness is of crucial importance in all such applications. Research on this topic relies on finding a suitable measure and using it to quantify network robustness. A number of distance-based graph invariants, also known as topological indices, have recently been incorporated as descriptors of complex networks. Among them, the Wiener-type indices are the best known and most commonly used such descriptors. As one of the fundamental variants of the original Wiener index, the Wiener polarity index was introduced long ago and is known to be related to the clustering coefficient of networks. In this paper, we consider the value of the Wiener polarity index of lattice networks, a common network structure known for its simplicity and symmetry. We first present a simple general formula for computing the Wiener polarity index of any graph. Using this formula, together with the symmetric and recursive topology of lattice networks, we provide explicit formulas for the Wiener polarity index of the square, hexagonal, triangular, and 3^3 ⋅ 4^2 lattices. We also comment on potential future research topics.
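
    Since the Wiener polarity index counts unordered vertex pairs at shortest-path distance exactly 3, the general formula mentioned above can be illustrated directly. A minimal sketch using networkx, with a 5 x 5 square lattice standing in for the lattice families treated in the paper:

    ```python
    import networkx as nx

    def wiener_polarity_index(G):
        """Number of unordered vertex pairs at shortest-path distance exactly 3."""
        count = 0
        for _, dists in nx.all_pairs_shortest_path_length(G, cutoff=3):
            count += sum(1 for d in dists.values() if d == 3)
        return count // 2   # each pair was counted from both endpoints

    G = nx.grid_2d_graph(5, 5)   # 5 x 5 square lattice
    print(wiener_polarity_index(G))
    ```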

  13. Naming Game on Networks: Let Everyone be Both Speaker and Hearer

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Chen, Guanrong; Chan, Rosa H. M.

    2014-08-01

    To investigate how consensus is reached on a large self-organized peer-to-peer network, we extended the naming game model commonly used in language and communication to the Naming Game in Groups (NGG). Differing from other existing naming game models, in NGG everyone in the population (network) can be both speaker and hearer simultaneously, which more closely resembles real-life scenarios. Moreover, NGG allows the transmission (communication) of multiple words (opinions) for multiple intra-group consensuses. Communications among indirectly connected nodes are also enabled in NGG. We simulated and analyzed the consensus process in some typical network topologies, including random-graph networks, small-world networks, and scale-free networks, to better understand how global convergence (consensus) can be reached on one common word. The results are interpreted in terms of group negotiation on a peer-to-peer network, showing that global consensus in the population can be reached more rapidly when more opinions are permitted within each group or when the negotiating groups in the population are larger in size. The novel features and properties introduced by our model demonstrate its applicability to investigating general consensus problems on peer-to-peer networks.

  14. Multiscale Hy3S: hybrid stochastic simulation for supercomputers.

    PubMed

    Salis, Howard; Sotiropoulos, Vassilios; Kaznessis, Yiannis N

    2006-02-24

    Stochastic simulation has become a useful tool to both study natural biological systems and design new synthetic ones. By capturing the intrinsic molecular fluctuations of "small" systems, these simulations produce a more accurate picture of single cell dynamics, including interesting phenomena missed by deterministic methods, such as noise-induced oscillations and transitions between stable states. However, the computational cost of the original stochastic simulation algorithm can be high, motivating the use of hybrid stochastic methods. Hybrid stochastic methods partition the system into multiple subsets and describe each subset as a different representation, such as a jump Markov, Poisson, continuous Markov, or deterministic process. By applying valid approximations and self-consistently merging disparate descriptions, a method can be considerably faster, while retaining accuracy. In this paper, we describe Hy3S, a collection of multiscale simulation programs. Building on our previous work on developing novel hybrid stochastic algorithms, we have created the Hy3S software package to enable scientists and engineers to both study and design extremely large well-mixed biological systems with many thousands of reactions and chemical species. We have added adaptive stochastic numerical integrators to permit the robust simulation of dynamically stiff biological systems. In addition, Hy3S has many useful features, including embarrassingly parallelized simulations with MPI; special discrete events, such as transcription and translation elongation and cell division; mid-simulation perturbations in both the number of molecules of species and reaction kinetic parameters; combinatorial variation of both initial conditions and kinetic parameters to enable sensitivity analysis; use of the NetCDF optimized binary format to quickly read and write large datasets; and a simple graphical user interface, written in Matlab, to help users create biological systems and analyze data. We demonstrate the accuracy and efficiency of Hy3S with examples, including a large-scale system benchmark and a complex bistable biochemical network with positive feedback. The software itself is open-sourced under the GPL license and is modular, allowing users to modify it for their own purposes. Hy3S is a powerful suite of simulation programs for simulating the stochastic dynamics of networks of biochemical reactions. Its first public version enables computational biologists to more efficiently investigate the dynamics of realistic biological systems.
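
    The jump-Markov description that Hy3S and other hybrid methods build on is the Gillespie stochastic simulation algorithm (SSA). The sketch below is a generic SSA for a simple birth-death process, not Hy3S code; the rate constants are arbitrary:

    ```python
    import random

    random.seed(0)
    k_birth, k_death = 10.0, 0.1    # arbitrary rate constants
    x, t, t_end = 0, 0.0, 100.0     # molecule count, time, horizon

    while t < t_end:
        a_birth = k_birth            # propensity of the birth reaction
        a_death = k_death * x        # propensity of the death reaction
        a_total = a_birth + a_death
        t += random.expovariate(a_total)        # exponential waiting time
        if random.random() * a_total < a_birth:
            x += 1                              # birth reaction fires
        else:
            x -= 1                              # death reaction fires
    print(f"molecule count at t = {t:.1f}: {x}")  # fluctuates near 100
    ```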

  15. Data Access Tools And Services At The Goddard Distributed Active Archive Center (GDAAC)

    NASA Technical Reports Server (NTRS)

    Pham, Long; Eng, Eunice; Sweatman, Paul

    2003-01-01

    As one of the largest providers of Earth Science data from the Earth Observing System, the GDAAC provides the latest data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Atmospheric Infrared Sounder (AIRS), and Solar Radiation and Climate Experiment (SORCE) data products via the GDAAC's data pool (50 TB of disk cache). In order to make this huge volume of data more accessible to the public and science communities, the GDAAC offers multiple data access tools and services: the Open Source Project for Network Data Access Protocol (OPeNDAP), the Grid Analysis and Display System (GrADS/DODS) server (GDS), the Live Access Server (LAS), the OpenGIS Web Map Server (WMS), and Near Archive Data Mining (NADM). The objective is to assist users in electronically retrieving a smaller, usable portion of data for further analysis. The OPeNDAP server, formerly known as the Distributed Oceanographic Data System (DODS), allows the user to retrieve data without worrying about the data format. OPeNDAP is capable of server-side subsetting of HDF, HDF-EOS, netCDF, JGOFS, ASCII, DSP, FITS, and binary data formats. The GrADS/DODS server is capable of serving the same data formats as OPeNDAP. GDS has the additional feature of server-side analysis: users can analyze data on the server, thereby decreasing the computational load on their client systems. The LAS is a flexible server that allows users to graphically visualize data on the fly, request different file formats, and compare variables from distributed locations. Users of LAS have the option of using other available graphics viewers such as IDL, Matlab, or GrADS. WMS is based on OPeNDAP and serves geospatial information; it supports the OpenGIS protocol to provide data in GIS-friendly formats for analysis and visualization. NADM is another means of access to the GDAAC's data pool. NADM gives users the capability to use a browser to upload their C, FORTRAN, or IDL algorithms, test the algorithms, and mine data in the data pool. With NADM, the GDAAC provides an environment physically close to the data source. NADM will benefit users with data mining or data-reduction algorithms by reducing large volumes of data before transmission over the network to the user.
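
    Server-side subsetting through OPeNDAP can be sketched with the netCDF4 Python library, which resolves OPeNDAP URLs when built with DAP support. The URL and variable name below are placeholders, not a real GDAAC endpoint:

    ```python
    from netCDF4 import Dataset

    # Placeholder OPeNDAP URL and variable name, for illustration only.
    url = "https://example.gov/opendap/sample_dataset.nc"
    with Dataset(url) as ds:             # netCDF4 opens OPeNDAP URLs directly
        sst = ds.variables["sst"]        # hypothetical variable
        subset = sst[0, 10:20, 10:20]    # only this slab crosses the network
        print(subset.shape)
    ```

    The slicing syntax is the point: indexing the remote variable triggers a constrained request, so only the selected portion of the array is transferred.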

  16. The MMI Device Ontology: Enabling Sensor Integration

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Galbraith, N.; Morris, R. A.; Bermudez, L. E.; Graybeal, J.; Arko, R. A.; Mmi Device Ontology Working Group

    2010-12-01

    The Marine Metadata Interoperability (MMI) project has developed an ontology for devices to describe sensors and sensor networks. This ontology is implemented in the W3C Web Ontology Language (OWL) and provides an extensible conceptual model and controlled vocabularies for describing heterogeneous instrument types, with different data characteristics, and their attributes. It can help users populate metadata records for sensors; associate devices with their platforms, deployments, measurement capabilities and restrictions; aid in discovery of sensor data, both historic and real-time; and improve the interoperability of observational oceanographic data sets. We developed the MMI Device Ontology following a community-based approach. By building on and integrating other models and ontologies from related disciplines, we sought to facilitate semantic interoperability while avoiding duplication. Key concepts and insights from various communities, including the Open Geospatial Consortium (e.g., the SensorML and Observations and Measurements specifications), Semantic Web for Earth and Environmental Terminology (SWEET), and the W3C Semantic Sensor Network Incubator Group, have significantly enriched the development of the ontology. Individuals ranging from instrument designers, science data producers and consumers to ontology specialists and other technologists contributed to the work. Applications of the MMI Device Ontology are underway for several community use cases. These include vessel-mounted multibeam mapping sonars for the Rolling Deck to Repository (R2R) program and description of diverse instruments on deepwater Ocean Reference Stations for the OceanSITES program. These trials involve creation of records completely describing instruments, either by individual instances or by manufacturer and model. Individual terms in the MMI Device Ontology can be referenced with their corresponding Uniform Resource Identifiers (URIs) in sensor-related metadata specifications (e.g., SensorML, NetCDF). These identifiers can be resolved through a web browser, or by other client applications via HTTP against the MMI Ontology Registry and Repository (ORR), where the ontology is maintained. SPARQL-based query capabilities, which are enhanced with reasoning, along with several supported output formats, allow the effective interaction of diverse client applications with the semantic information associated with the device ontology. In this presentation we describe the process for the development of the MMI Device Ontology and illustrate extensions and applications that demonstrate the benefits of adopting this semantic approach, including example queries involving inference. We also highlight the issues encountered and future work.
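
    Resolvable term URIs plus a SPARQL endpoint make programmatic queries straightforward. A hypothetical sketch using the SPARQLWrapper library; the endpoint URL and the device-class IRI are placeholders, not the actual MMI ORR services:

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Placeholder endpoint and class IRI, for illustration only.
    endpoint = SPARQLWrapper("https://example.org/sparql")
    endpoint.setQuery("""
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?device ?label WHERE {
            ?device rdfs:subClassOf <http://example.org/device#Sensor> ;
                    rdfs:label ?label .
        }
        LIMIT 10
    """)
    endpoint.setReturnFormat(JSON)
    results = endpoint.query().convert()
    for row in results["results"]["bindings"]:
        print(row["device"]["value"], "-", row["label"]["value"])
    ```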

  17. 47 CFR 36.213 - Network access services revenues.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Network access services revenues. 36.213 Section 36.213 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... Income Accounts Operating Revenues § 36.213 Network access services revenues. (a) End User Revenue...

  18. 47 CFR 36.213 - Network access services revenues.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Network access services revenues. 36.213 Section 36.213 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES... Income Accounts Operating Revenues § 36.213 Network access services revenues. (a) End User Revenue...

  19. Platonic Relationships among Resistors

    ERIC Educational Resources Information Center

    Allen, Bradley; Liu, Tongtian

    2015-01-01

    Calculating the effective resistance of an electrical network is a common problem in introductory physics courses. Such calculations are typically restricted to two-dimensional networks, though even such networks can become increasingly complex, leading to several studies on their properties. Furthermore, several authors have used advanced…
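
    One standard route to such calculations, for three-dimensional networks as well as two-dimensional ones, is the Moore-Penrose pseudoinverse of the graph Laplacian: for a unit-resistor network, the effective resistance between nodes a and b is R_ab = L+_aa + L+_bb - 2 L+_ab. A sketch for a cube of unit resistors (one of the Platonic networks of the title), whose corner-to-opposite-corner resistance is known to be 5/6:

    ```python
    import numpy as np

    # Cube graph: 8 corners, 12 unit-resistor edges.
    edges = [(0, 1), (0, 2), (0, 4), (1, 3), (1, 5), (2, 3),
             (2, 6), (3, 7), (4, 5), (4, 6), (5, 7), (6, 7)]
    L = np.zeros((8, 8))                 # graph Laplacian (unit conductances)
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1

    Lp = np.linalg.pinv(L)               # Moore-Penrose pseudoinverse
    a, b = 0, 7                          # opposite corners of the cube
    R = Lp[a, a] + Lp[b, b] - 2 * Lp[a, b]
    print(R)                             # ~0.8333, i.e. 5/6 ohm
    ```

    The pseudoinverse is needed because the Laplacian of a connected graph is singular (its rows sum to zero); any grounding convention gives the same pairwise resistances.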

  20. 47 CFR 27.1307 - Spectrum use in the network.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....1307 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES MISCELLANEOUS WIRELESS COMMUNICATIONS SERVICES 700 MHz Public/Private Partnership § 27.1307 Spectrum use in the network. (a) Spectrum use. The shared wireless broadband network developed by the 700 MHz Public/Private...

  1. Web-based CERES Clouds QC Property Viewing Tool

    NASA Astrophysics Data System (ADS)

    Smith, R. A.

    2015-12-01

    Churngwei Chu, Rita Smith, Sunny Sun-Mack, Yan Chen, and Elizabeth Heckert (Science Systems and Applications, Inc., Hampton, Virginia) and Patrick Minnis (NASA Langley Research Center, Hampton, Virginia). This presentation will display the capabilities of a web-based CERES cloud property viewer. Aqua/Terra/NPP data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Images can be manipulated by the user to narrow the boundaries of the map as well as color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool.

  2. Atmospheric data access for the geospatial user community

    NASA Astrophysics Data System (ADS)

    van de Vegte, John; Som de Cerff, Wim-Jan; van den Oord, Gijsbertus H. J.; Sluiter, Raymond; van der Neut, Ian A.; Plieger, Maarten; van Hees, Richard M.; de Jeu, Richard A. M.; Schaepman, Michael E.; Hoogerwerf, Marc R.; Groot, Nikée E.; Domenico, Ben; Nativi, Stefano; Wilhelmi, Olga V.

    2007-10-01

    Historically, the atmospheric and meteorological communities have been separate worlds with their own data formats and tools for data handling, making data sharing difficult and cumbersome. On the other hand, these information sources are becoming increasingly interesting outside these communities because of the continuously improving spatial and temporal resolution of, e.g., model and satellite data, and because of the interest in historical datasets. New user communities that use geographically based datasets in a cross-domain manner are emerging. This development is supported by the progress made in Geographical Information System (GIS) software. Current GIS software is not yet ready for the wealth of atmospheric data, although the faint outlines of a new generation of software are already visible: support for HDF and NetCDF and an increasing understanding of temporal issues are only a few of the hints.

  3. 76 FR 51271 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...

  4. DDN New User Guide. Revision.

    DTIC Science & Technology

    1992-10-01

    2.1 Network Overview ... 2.2 Network Access Methods ... to TAC and Mini-TAC users, such as common error messages, TAC commands, and instructions for performing file transfers. Section 5, Network Use ... originally known as Interface Message Processors, or IMPs. THE DEFENSE DATA NETWORK, DRAFT NIC 60001, October 1992 ... messages do not necessarily take the same

  5. Emergence of Soft Communities from Geometric Preferential Attachment

    PubMed Central

    Zuev, Konstantin; Boguñá, Marián; Bianconi, Ginestra; Krioukov, Dmitri

    2015-01-01

    All real networks are different, but many have some structural properties in common. There seems to be no consensus on what the most common properties are, but scale-free degree distributions, strong clustering, and community structure are frequently mentioned without question. Surprisingly, there exists no simple generative mechanism explaining all the three properties at once in growing networks. Here we show how latent network geometry coupled with preferential attachment of nodes to this geometry fills this gap. We call this mechanism geometric preferential attachment (GPA), and validate it against the Internet. GPA gives rise to soft communities that provide a different perspective on the community structure in networks. The connections between GPA and cosmological models, including inflation, are also discussed. PMID:25923110

  6. Toward the automated generation of genome-scale metabolic networks in the SEED.

    PubMed

    DeJongh, Matthew; Formsma, Kevin; Boillot, Paul; Gould, John; Rycenga, Matthew; Best, Aaron

    2007-04-26

    Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative genome annotation and analysis. Our method sets the stage for the automated generation of substantially complete metabolic networks for over 400 complete genome sequences currently in the SEED. With each genome that is processed using our tools, the database of common components grows to cover more of the diversity of metabolic pathways. This increases the likelihood that components of reaction networks for subsequently processed genomes can be retrieved from the database, rather than assembled and verified manually.

  7. Characterizing mutation-expression network relationships in multiple cancers.

    PubMed

    Ghazanfar, Shila; Yang, Jean Yee Hwa

    2016-08-01

    Data made available through large cancer consortia like The Cancer Genome Atlas make for a rich source of information to be studied across and between cancers. In recent years, network approaches have been applied to such data in uncovering the complex interrelationships between mutational and expression profiles, but lack direct testing for expression changes via mutation. In this pan-cancer study we analyze mutation and gene expression information in an integrative manner by considering the networks generated by testing for differences in expression in direct association with specific mutations. We relate our findings among the 19 cancers examined to identify commonalities and differences as well as their characteristics. Using somatic mutation and gene expression information across 19 cancers, we generated mutation-expression networks per cancer. On evaluation we found that our generated networks were significantly enriched for known cancer-related genes, such as skin cutaneous melanoma (p<0.01 using Network of Cancer Genes 4.0). Our framework identified that while different cancers contained commonly mutated genes, there was little concordance between associated gene expression changes among cancers. Comparison between cancers showed a greater overlap of network nodes for cancers with higher overall non-silent mutation load, compared to those with a lower overall non-silent mutation load. This study offers a framework that explores network information through co-analysis of somatic mutations and gene expression profiles. Our pan-cancer application of this approach suggests that while mutations are frequently common among cancer types, the impact they have on the surrounding networks via gene expression changes varies. Despite this finding, there are some cancers for which mutation-associated network behaviour appears to be similar: suggesting a potential framework for uncovering related cancers for which similar therapeutic strategies may be applicable. Our framework for understanding relationships among cancers has been integrated into an interactive R Shiny application, PAn Cancer Mutation Expression Networks (PACMEN), containing dynamic and static network visualization of the mutation-expression networks. PACMEN also features tools for further examination of network topology characteristics among cancers. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Functional connectivity of hippocampal and prefrontal networks during episodic and spatial memory based on real-world environments.

    PubMed

    Robin, Jessica; Hirshhorn, Marnie; Rosenbaum, R Shayna; Winocur, Gordon; Moscovitch, Morris; Grady, Cheryl L

    2015-01-01

    Several recent studies have compared episodic and spatial memory in neuroimaging paradigms in order to understand better the contribution of the hippocampus to each of these tasks. In the present study, we build on previous findings showing common neural activation in default network areas during episodic and spatial memory tasks based on familiar, real-world environments (Hirshhorn et al. (2012) Neuropsychologia 50:3094-3106). Following previous demonstrations of the presence of functionally connected sub-networks within the default network, we performed seed-based functional connectivity analyses to determine how, depending on the task, the hippocampus and prefrontal cortex differentially couple with one another and with distinct whole-brain networks. We found evidence for a medial prefrontal-parietal network and a medial temporal lobe network, which were functionally connected to the prefrontal and hippocampal seeds, respectively, regardless of the nature of the memory task. However, these two networks were functionally connected with one another during the episodic memory task, but not during spatial memory tasks. Replicating previous reports of fractionation of the default network into stable sub-networks, this study also shows how these sub-networks may flexibly couple and uncouple with one another based on task demands. These findings support the hypothesis that episodic memory and spatial memory share a common medial temporal lobe-based neural substrate, with episodic memory recruiting additional prefrontal sub-networks. © 2014 Wiley Periodicals, Inc.

  9. Ada Run Time Support Environments and a common APSE Interface Set. [Ada Programming Support Environment

    NASA Technical Reports Server (NTRS)

    Mckay, C. W.; Bown, R. L.

    1985-01-01

    The paper discusses the importance of linking Ada Run Time Support Environments to the Common Ada Programming Support Environment (APSE) Interface Set (CAIS). A non-stop network operating systems scenario is presented to serve as a forum for identifying the important issues. The network operating system exemplifies the issues involved in the NASA Space Station data management system.

  10. The Climate Change Education Partnership Alliance: Building a Network for Effective Collaboration and Impact (Invited)

    NASA Astrophysics Data System (ADS)

    Scowcroft, G.

    2013-12-01

    The mission of the Climate Change Education Partnership Alliance (The Alliance), funded by the National Science Foundation (NSF), is to advance exemplary climate change education through research and innovative partnerships. Through six unique regional projects, The Alliance is reaching wide and diverse audiences across the U.S., while linking groups and institutions that might not otherwise be connected by a common focus on climate change education. The goals for The Alliance include building collaborations between projects and institutions, sharing effective practices, and leveraging resources to create a community in which the whole is greater than the sum of its parts. To foster these goals, NSF has funded a central hub, the Alliance Office. Currently, the Alliance Office is building the infrastructure necessary to support activities and communication between the projects. Successful networks need objectives for their interactions and a common vision held by the partners. In the first national meeting of The Alliance members, held in June 2013, the foundation was laid to begin this work. The Alliance now has a common mission and vision to guide the next four years of activities. An initial 'mapping' of the network has identified the scope and diversity of the network, how members are connected, current boundaries of the network, network strengths and weaknesses, and network needs. This information will serve as a baseline as the network develops. The Alliance has also identified the need for key 'working groups' which provide an opportunity for members to work across the projects on common goals. As The Alliance evolves, building blocks identified by the field of network science will be used to forge a strong and successful collaborative enterprise. Infrastructure is being established to support widespread engagement; social ties are being fostered through face-to-face meetings and monthly teleconferences; time is provided to build and share knowledge; the sharing of new and diverse perspectives is encouraged; and resources will be leveraged across and beyond the projects. This presentation will provide an overview of The Alliance activities, lessons learned thus far, and plans for the future.

  11. Developing an Effective Plan for Smart Sanctions: A Network Analysis Approach

    DTIC Science & Technology

    2012-10-31

    data and a network model that realistically simulates the Iranian nuclear development program. We then utilize several network analysis techniques...the Iran Watch (iranwatch.org) watchdog website. Using this data, which at first glance seems obtuse and unwieldy, we constructed network models in... model is created, nodes were evaluated using several measures of centrality. The team then analyzed this network utilizing four of the most common
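
    The four most common centrality measures in such analyses are typically degree, betweenness, closeness, and eigenvector centrality. Since the network data described above is not public, the sketch below runs them on a built-in example graph using networkx:

    ```python
    import networkx as nx

    G = nx.karate_club_graph()           # built-in stand-in network
    measures = {
        "degree": nx.degree_centrality(G),
        "betweenness": nx.betweenness_centrality(G),
        "closeness": nx.closeness_centrality(G),
        "eigenvector": nx.eigenvector_centrality(G),
    }
    for name, scores in measures.items():
        top = max(scores, key=scores.get)
        print(f"{name:12s} top node: {top} (score {scores[top]:.3f})")
    ```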

  12. A common brain network among state, trait, and pathological anxiety from whole-brain functional connectivity.

    PubMed

    Takagi, Yu; Sakai, Yuki; Abe, Yoshinari; Nishida, Seiji; Harrison, Ben J; Martínez-Zalacaín, Ignacio; Soriano-Mas, Carles; Narumoto, Jin; Tanaka, Saori C

    2018-05-15

    Anxiety is one of the most common mental states of humans. Although it drives us to avoid frightening situations and to achieve our goals, it may also impose significant suffering and burden if it becomes extreme. Because we experience anxiety in a variety of forms, previous studies have investigated the neural substrates of anxiety in a variety of ways. These studies revealed that individuals with high state, trait, or pathological anxiety showed altered neural substrates. However, no studies have directly investigated whether the different dimensions of anxiety share a common neural substrate, despite its theoretical and practical importance. Here, we investigated a brain network of anxiety shared by different dimensions of anxiety in a unified analytical framework using functional magnetic resonance imaging (fMRI). We analyzed different datasets on a single scale, defined by an anxiety-related brain network derived from the whole brain. We first conducted an anxiety provocation task with healthy participants who tended to feel anxiety related to obsessive-compulsive disorder (OCD) in their daily life. We found a common state anxiety brain network across participants (1585 trials obtained from 10 participants). Then, using resting-state fMRI in combination with the participants' behavioral trait anxiety scale scores (879 participants from the Human Connectome Project), we demonstrated that trait anxiety shared the same brain network as state anxiety. Furthermore, the brain network common to state and trait anxiety could detect patients with OCD, which is characterized by pathological anxiety-driven behaviors (174 participants from multi-site datasets). Our findings provide direct evidence that different dimensions of anxiety have a substantial biological inter-relationship. Our results also provide a biologically defined dimension of anxiety, which may promote further investigation of various human characteristics, including psychiatric disorders, from the perspective of anxiety. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Policies for implementing network firewalls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C.D.

    1994-05-01

    Corporate networks are frequently protected by "firewalls" or gateway systems that control access to/from other networks, e.g., the Internet, in order to reduce the network's vulnerability to hackers and other unauthorized access. Firewalls typically limit access to particular network nodes and application protocols, and they often perform special authentication and authorization functions. One of the difficult issues associated with network firewalls is determining which applications should be permitted through the firewall. For example, many networks permit the exchange of electronic mail with the outside but do not permit file access to be initiated by outside users, as this might allow outside users to access sensitive data or to surreptitiously modify data or programs (e.g., to install Trojan Horse software). However, if access through firewalls is severely restricted, legitimate network users may find it difficult or impossible to collaborate with outside users and to share data. Some of the most serious issues regarding firewalls involve setting policies for firewalls with the goal of achieving an acceptable balance between the need for greater functionality and the associated risks. Two common firewall implementation techniques, screening routers and application gateways, are discussed below, followed by some common policies implemented by network firewalls.
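
    The screening-router policy idea can be made concrete as an allowlist over (protocol, destination port) pairs, mirroring the mail-in, file-access-out example above. Real screening routers enforce access-control lists in the router itself; this Python sketch only illustrates the policy logic:

    ```python
    # Allowlist of (protocol, destination port) pairs; everything else is denied.
    ALLOW = {
        ("tcp", 25),                 # inbound electronic mail (SMTP)
        ("tcp", 53), ("udp", 53),    # DNS
    }

    def permit(packet):
        """Return True if the packet matches the allowlist."""
        return (packet["proto"], packet["dport"]) in ALLOW

    packets = [
        {"proto": "tcp", "dport": 25},     # mail: permitted
        {"proto": "tcp", "dport": 2049},   # NFS file access: denied
    ]
    for p in packets:
        print(p, "->", "permit" if permit(p) else "deny")
    ```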

  14. Local Area Networks: Part I.

    ERIC Educational Resources Information Center

    Dessy, Raymond E.

    1982-01-01

    Local area networks are common communication conduits allowing various terminals, computers, discs, printers, and other electronic devices to intercommunicate over short distances. Discusses the vocabulary of such networks including RS-232C point-to-point and IEEE-488 multidrop protocols; error detection; message packets; multiplexing; star, ring,…

  15. Using Inspiration from Synaptic Plasticity Rules to Optimize Traffic Flow in Distributed Engineered Networks.

    PubMed

    Suen, Jonathan Y; Navlakha, Saket

    2017-05-01

    Controlling the flow and routing of data is a fundamental problem in many distributed networks, including transportation systems, integrated circuits, and the Internet. In the brain, synaptic plasticity rules have been discovered that regulate network activity in response to environmental inputs, enabling circuits to be stable yet flexible. Here, we develop a new neuro-inspired model for network flow control that depends only on modifying edge weights in an activity-dependent manner. We show how two fundamental plasticity rules, long-term potentiation and long-term depression, can be cast as a distributed gradient descent algorithm for regulating traffic flow in engineered networks. We then characterize, both by simulation and analytically, how different forms of edge-weight-update rules affect network routing efficiency and robustness. We find a close correspondence between certain classes of synaptic weight update rules derived experimentally in the brain and rules commonly used in engineering, suggesting principles common to both.
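
    A toy version of such an activity-dependent update rule: strengthen an edge when its observed traffic is under capacity (potentiation-like) and weaken it when congested (depression-like). The constants and the three-edge network below are invented for illustration and are not the paper's actual gradient-descent rules:

    ```python
    import random

    random.seed(0)
    capacity = 1.0
    weights = {"A-B": 0.5, "B-C": 0.5, "A-C": 0.5}   # routing edge weights

    for step in range(100):
        for edge in weights:
            load = random.uniform(0.0, 1.5)          # observed traffic on edge
            if load > capacity:
                weights[edge] *= 0.9                 # congested: LTD-like decrease
            else:
                weights[edge] += 0.01                # spare capacity: LTP-like increase
    print(weights)
    ```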

  16. Transcriptome profiling analysis reveals biomarkers in colon cancer samples of various differentiation

    PubMed Central

    Yu, Tonghu; Zhang, Huaping; Qi, Hong

    2018-01-01

    The aim of the present study was to investigate more colon cancer-related genes in different stages. Gene expression profile E-GEOD-62932 was extracted for differentially expressed gene (DEG) screening. Series test of cluster analysis was used to obtain significant trending models. Based on the Gene Ontology and Kyoto Encyclopedia of Genes and Genomes databases, functional and pathway enrichment analysis were processed and a pathway relation network was constructed. Gene co-expression network and gene signal network were constructed for common DEGs. The DEGs with the same trend were clustered and in total, 16 clusters with statistical significance were obtained. The screened DEGs were enriched into small molecule metabolic process and metabolic pathways. The pathway relation network was constructed with 57 nodes. A total of 328 common DEGs were obtained. Gene signal network was constructed with 71 nodes. Gene co-expression network was constructed with 161 nodes and 211 edges. ABCD3, CPT2, AGL and JAM2 are potential biomarkers for the diagnosis of colon cancer. PMID:29928385

  17. Forecasting PM10 in metropolitan areas: Efficacy of neural networks.

    PubMed

    Fernando, H J S; Mammarella, M C; Grandoni, G; Fedele, P; Di Marco, R; Dimitrova, R; Hyde, P

    2012-04-01

    Deterministic photochemical air quality models are commonly used for regulatory management and planning of urban airsheds. These models are complex, computer intensive, and hence are prohibitively expensive for routine air quality predictions. Stochastic methods are becoming increasingly popular as an alternative, which relegate decision making to artificial intelligence based on Neural Networks that are made of artificial neurons or 'nodes' capable of 'learning through training' via historic data. A Neural Network was used to predict particulate matter concentration at a regulatory monitoring site in Phoenix, Arizona; its development, efficacy as a predictive tool and performance vis-à-vis a commonly used regulatory photochemical model are described in this paper. It is concluded that Neural Networks are much easier, quicker and economical to implement without compromising the accuracy of predictions. Neural Networks can be used to develop rapid air quality warning systems based on a network of automated monitoring stations. Copyright © 2011 Elsevier Ltd. All rights reserved.
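
    A minimal sketch of such a predictor, using scikit-learn's multilayer perceptron on synthetic data; the feature set (yesterday's PM10, wind speed, temperature) and the data-generating relation are assumptions for illustration, not the study's actual inputs:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n = 500
    X = rng.uniform(0.0, 1.0, size=(n, 3))   # [pm10_yesterday, wind, temp], scaled
    y = 80 * X[:, 0] - 30 * X[:, 1] + 10 * X[:, 2] + rng.normal(0, 5, n)  # fake PM10

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X[:400], y[:400])               # "learning through training" on history
    print("held-out R^2:", model.score(X[400:], y[400:]))
    ```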

  18. A meta-analysis of public microarray data identifies biological regulatory networks in Parkinson's disease.

    PubMed

    Su, Lining; Wang, Chunjie; Zheng, Chenqing; Wei, Huiping; Song, Xiaoqing

    2018-04-13

    Parkinson's disease (PD) is a long-term degenerative disease that is caused by environmental and genetic factors. The networks of genes and their regulators that control the progression and development of PD require further elucidation. We examine common differentially expressed genes (DEGs) from several PD blood and substantia nigra (SN) microarray datasets by meta-analysis. Further we screen the PD-specific genes from common DEGs using GCBI. Next, we used a series of bioinformatics software to analyze the miRNAs, lncRNAs and SNPs associated with the common PD-specific genes, and then identify the mTF-miRNA-gene-gTF network. Our results identified 36 common DEGs in PD blood studies and 17 common DEGs in PD SN studies, and five of the genes were previously known to be associated with PD. Further study of the regulatory miRNAs associated with the common PD-specific genes revealed 14 PD-specific miRNAs in our study. Analysis of the mTF-miRNA-gene-gTF network about PD-specific genes revealed two feed-forward loops: one involving the SPRK2 gene, hsa-miR-19a-3p and SPI1, and the second involving the SPRK2 gene, hsa-miR-17-3p and SPI. The long non-coding RNA (lncRNA)-mediated regulatory network identified lncRNAs associated with PD-specific genes and PD-specific miRNAs. Moreover, single nucleotide polymorphism (SNP) analysis of the PD-specific genes identified two significant SNPs, and SNP analysis of the neurodegenerative disease-specific genes identified seven significant SNPs. Most of these SNPs are present in the 3'-untranslated region of genes and are controlled by several miRNAs. Our study identified a total of 53 common DEGs in PD patients compared with healthy controls in blood and brain datasets and five of these genes were previously linked with PD. Regulatory network analysis identified PD-specific miRNAs, associated long non-coding RNA and feed-forward loops, which contribute to our understanding of the mechanisms underlying PD. The SNPs identified in our study can determine whether a genetic variant is associated with PD. Overall, these findings will help guide our study of the complex molecular mechanism of PD.

  19. Phylogeny of metabolic networks: a spectral graph theoretical approach.

    PubMed

    Deyasi, Krishanu; Banerjee, Anirban; Deb, Bony

    2015-10-01

    Many methods have been developed for finding the commonalities between different organisms in order to study their phylogeny. The structure of metabolic networks also reveals valuable insights into the metabolic capacity of species as well as into the habitats where they have evolved. We constructed metabolic networks of 79 fully sequenced organisms and compared their architectures. We used the spectral density of the normalized Laplacian matrix to compare the structure of the networks. The eigenvalues of this matrix reflect not only the global architecture of a network but also the local topologies that are produced by different graph evolutionary processes like motif duplication or joining. A divergence measure on spectral densities is used to quantify the distances between various metabolic networks, and a split network is constructed to analyse the phylogeny from these distances. In our analysis, we focused on species that belong to different classes but appear more closely related to each other in the phylogeny. We explored whether they have evolved under similar environmental conditions or have similar life histories. With this focus, we have obtained interesting insights into the phylogenetic commonality between different organisms.
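
    The comparison pipeline can be sketched in a few lines: compute the normalized Laplacian eigenvalues (which lie in [0, 2]), bin them into a spectral density, and take a divergence between densities. The Jensen-Shannon divergence and the random test graphs below are illustrative choices, not necessarily the paper's exact measure:

    ```python
    import numpy as np
    import networkx as nx

    def spectral_density(G, bins=20):
        """Binned spectral density of the normalized Laplacian."""
        eig = nx.normalized_laplacian_spectrum(G)   # eigenvalues in [0, 2]
        hist, _ = np.histogram(eig, bins=bins, range=(0.0, 2.0))
        return hist / hist.sum()

    def js_divergence(p, q, eps=1e-12):
        """Jensen-Shannon divergence between two discrete densities."""
        p, q = p + eps, q + eps
        m = 0.5 * (p + q)
        kl = lambda a, b: np.sum(a * np.log(a / b))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    G1 = nx.erdos_renyi_graph(100, 0.05, seed=1)
    G2 = nx.barabasi_albert_graph(100, 3, seed=1)
    print(js_divergence(spectral_density(G1), spectral_density(G2)))
    ```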

  20. Efficient Quantum Transmission in Multiple-Source Networks

    PubMed Central

    Luo, Ming-Xing; Xu, Gang; Chen, Xiu-Bo; Yang, Yi-Xian; Wang, Xiaojun

    2014-01-01

    A difficult problem in quantum network communications is how to transmit quantum information efficiently over large-scale networks with common channels. We propose a solution by developing a quantum encoding approach. Different quantum states are encoded into a coherent superposition state using quantum linear optics. Transmission congestion in the common channel may be avoided by transmitting the superposition state. For further decoding and continued transmission, special phase transformations are applied to incoming quantum states using phase shifters so that decoders can distinguish outgoing quantum states. These phase shifters may be precisely controlled using classical chaos synchronization via additional classical channels. Based on this design, and on reducing the multiple-source network under the assumption of restricted maximum flow, an optimal scheme is proposed for the specially quantized multiple-source network. In comparison with previous schemes, ours can greatly increase transmission efficiency. PMID:24691590

  1. Gossip algorithms in quantum networks

    NASA Astrophysics Data System (ADS)

    Siomau, Michael

    2017-01-01

    The term "gossip algorithms" commonly describes protocols for unreliable information dissemination in natural networks that are not optimally designed for efficient communication between network entities. We consider the application of gossip algorithms to quantum networks and show that any quantum network can be updated to an optimal configuration with local operations and classical communication. This allows the dissemination of quantum information to be sped up, in the best case exponentially. Irrespective of the initial configuration of the quantum network, the update requires at most a polynomial number of local operations and classical communication.
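
    For contrast with the quantum setting, a minimal classical push-gossip simulation: in each round, every informed node forwards the message to one random neighbour. The graph model and parameters are arbitrary:

    ```python
    import random
    import networkx as nx

    random.seed(0)
    G = nx.erdos_renyi_graph(100, 0.08, seed=0)
    informed, rounds = {0}, 0

    while len(informed) < G.number_of_nodes() and rounds < 1000:
        for node in list(informed):
            neighbours = list(G.neighbors(node))
            if neighbours:                       # push to one random neighbour
                informed.add(random.choice(neighbours))
        rounds += 1
    print(f"{len(informed)} of {G.number_of_nodes()} nodes informed "
          f"after {rounds} rounds")
    ```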

  2. Common modulation of limbic network activation underlies musical emotions as they unfold.

    PubMed

    Singer, Neomi; Jacoby, Nori; Lin, Tamar; Raz, Gal; Shpigelman, Lavi; Gilam, Gadi; Granot, Roni Y; Hendler, Talma

    2016-11-01

    Music is a powerful means for communicating emotions among individuals. Here we reveal that this continuous stream of affective information is commonly represented in the brains of different listeners and that particular musical attributes mediate this link. We examined participants' brain responses to two naturalistic musical pieces using functional magnetic resonance imaging (fMRI). Following scanning, as participants listened to the musical pieces for a second time, they continuously indicated their emotional experience on scales of valence and arousal. These continuous reports were used, along with a detailed annotation of the musical features, to predict a novel index of Dynamic Common Activation (DCA) derived from ten large-scale data-driven functional networks. We found an association between the unfolding music-induced emotionality and the DCA modulation within a vast network of limbic regions. The limbic-DCA modulation further corresponded with continuous changes in two temporal musical features: beat-strength and tempo. Remarkably, this "collective limbic sensitivity" to temporal features was found to mediate the link between limbic-DCA and the reported emotionality. An additional association with the emotional experience was found in a left fronto-parietal network, but only among a sub-group of participants with a high level of musical experience (>5 years). These findings may indicate two processing levels underlying the unfolding of common music emotionality: (1) a widely shared core-affective process that is confined to a limbic network and mediated by temporal regularities in music, and (2) an experience-based process that is rooted in a left fronto-parietal network and may involve functioning of the 'mirror-neuron system'. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Arbuscular-mycorrhizal networks inhibit Eucalyptus tetrodonta seedlings in rain forest soil microcosms.

    PubMed

    Janos, David P; Scott, John; Aristizábal, Catalina; Bowman, David M J S

    2013-01-01

    Eucalyptus tetrodonta, a co-dominant tree species of tropical, northern Australian savannas, does not invade adjacent monsoon rain forest unless the forest is burnt intensely. Such facilitation by fire of seedling establishment is known as the "ashbed effect." Because the ashbed effect might involve disruption of common mycorrhizal networks, we hypothesized that in the absence of fire, intact rain forest arbuscular mycorrhizal (AM) networks inhibit E. tetrodonta seedlings. Although arbuscular mycorrhizas predominate in the rain forest, common tree species of the northern Australian savannas (including adult E. tetrodonta) host ectomycorrhizas. To test our hypothesis, we grew E. tetrodonta and Ceiba pentandra (an AM-responsive species used to confirm treatments) separately in microcosms of ambient or methyl-bromide fumigated rain forest soil with or without severing potential mycorrhizal fungus connections to an AM nurse plant, Litsea glutinosa. As expected, C. pentandra formed mycorrhizas in all treatments but had the most root colonization and grew fastest in ambient soil. E. tetrodonta seedlings also formed AM in all treatments, but severing hyphae in fumigated soil produced the least colonization and the best growth. Three of ten E. tetrodonta seedlings in ambient soil with intact network hyphae died. Because foliar chlorosis was symptomatic of iron deficiency, after 130 days we began to fertilize half the E. tetrodonta seedlings in ambient soil with an iron solution. Iron fertilization completely remedied chlorosis and stimulated leaf growth. Our microcosm results suggest that in intact rain forest, common AM networks mediate belowground competition and AM fungi may exacerbate iron deficiency, thereby enhancing resistance to E. tetrodonta invasion. Common AM networks-previously unrecognized as contributors to the ashbed effect-probably help to maintain the rain forest-savanna boundary.

  4. NASA Integrated Network COOP

    NASA Technical Reports Server (NTRS)

    Anderson, Michael L.; Wright, Nathaniel; Tai, Wallace

    2012-01-01

    Natural disasters, terrorist attacks, civil unrest, and other events have the potential of disrupting mission-essential operations in any space communications network. NASA's Space Communications and Navigation office (SCaN) is in the process of studying options for integrating the three existing NASA network elements, the Deep Space Network, the Near Earth Network, and the Space Network, into a single integrated network with common services and interfaces. The need to maintain Continuity of Operations (COOP) after a disastrous event has a direct impact on the future network design and operations concepts. The SCaN Integrated Network will provide support to a variety of user missions. The missions have diverse requirements and include anything from earth based platforms to planetary missions and rovers. It is presumed that an integrated network, with common interfaces and processes, provides an inherent advantage to COOP in that multiple elements and networks can provide cross-support in a seamless manner. The results of trade studies support this assumption but also show that centralization as a means of achieving integration can result in single points of failure that must be mitigated. The cost to provide this mitigation can be substantial. In support of this effort, the team evaluated the current approaches to COOP, developed multiple potential approaches to COOP in a future integrated network, evaluated the interdependencies of the various approaches to the various network control and operations options, and did a best value assessment of the options. The paper will describe the trade space, the study methods, and results of the study.

  5. Use of Network Inference to Elucidate Common and Chemical-specific Effects on Steroidogenesis

    EPA Science Inventory

    Microarray data is a key source for modeling gene regulatory interactions. Regulatory network models based on multiple datasets are potentially more robust and can provide greater confidence. In this study, we used network modeling on microarray data generated by exposing the fat...

  6. Cohort Differences in Received Social Support in Later Life: The Role of Network Type.

    PubMed

    Suanet, Bianca; Antonucci, Toni C

    2017-07-01

    The objective is to assess cohort differences in received emotional and instrumental support in relation to network types. The main guiding hypothesis is that, owing to the increased salience of non-kin with recent social change, those in friend-focused and diverse network types receive more support in later birth cohorts than in earlier birth cohorts. Data from the Longitudinal Aging Study Amsterdam are employed. We investigate cohort differences in total received emotional and instrumental support in a series of linear regression models comparing birth cohorts aged 55-64, 65-74, 75-84, and 85-94 across three time periods (1992, 2002, and 2012). Four network types (friend, family, restricted, and diverse) are identified. Friend-focused networks are more common in later birth cohorts, restrictive networks less common. Those in friend-focused networks in later cohorts report receiving more emotional and instrumental support. No cohort differences in received support are evident for diverse networks. The increased salience of non-kin is reflected in an increase in received emotional and instrumental support in friend-focused networks in later birth cohorts. The preponderance of non-kin in networks should not be perceived as a deficit model for social relationships, as restrictive networks are declining across birth cohorts. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. From Many to Many More: Instant Interoperability Through the Integrated Ocean Observing System Data Assembly Center

    NASA Astrophysics Data System (ADS)

    Burnett, W.; Bouchard, R.; Hervey, R.; Crout, R.; Luke, R.

    2008-12-01

    As the Integrated Ocean Observing System (IOOS) Data Assembly Center (DAC), NOAA's National Data Buoy Center (NDBC) collects data from many ocean observing systems, quality controls the data, and distributes them nationally and internationally. The DAC capabilities provide instant interoperability of any ocean observatory with the national and international agencies responsible for critical forecasts and warnings and with the national media. This interoperability is an important milestone in an observing system's designation as an operational system. Data collection begins with NDBC's own observing systems - Meteorological and Oceanographic Buoys and Coastal Stations, the Tropical Atmosphere Ocean Array, and the NOAA tsunameter network. Leveraging the data management functions that support NDBC systems, the DAC can support data partners including ocean observations from IOOS Regional Observing Systems, the meteorological observations from the National Water Level Observing Network, meteorological and oceanographic observations from the National Estuarine Research Reserve System, the Integrated Coral Observing Network, merchant ship observations from the Voluntary Observing Ship program, and ocean current measurements from oil and gas platforms in the Gulf of Mexico and from Coastal HF Radars. The DAC monitors and quality controls IOOS Partner data, alerting the data provider to outages and quality discrepancies. After performing automated and manual quality control procedures, the DAC prepares the observations for distribution. The primary means of data distribution is in standard World Meteorological Organization alphanumeric coded messages distributed via the Global Telecommunications System, NOAAPort, and Family of Services. Observing systems provide their data via ftp to an NDBC server using a simple XML format. The DAC also posts data in real time to the NDBC webpages in columnar text format and data plots that maritime interests (e.g., surfing, fishing, boating) widely use. The webpage text feeds the Dial-A-Buoy capability that reads the latest data from webpages and the latest NWS forecast for the station to a user via telephone. The DAC also operates a DODS/OPeNDAP server to provide data in netCDF. Recently the DAC implemented the NOAA IOOS Data Integration Framework, which facilitates the exchange of data between IOOS Regional Observing Systems by standardizing data exchange formats and incorporating the metadata needed for the correct application of the data. The DAC has become an OceanSITES Global Data Assembly Center - part of the Initial Global Observing System for Climate. Supported by the NOAA IOOS Program, the DAC provides round-the-clock monitoring, quality control, and data distribution to ensure that its IOOS Partners can conduct operations that meet the NOAA definition of sustained, systematic, reliable, and robust mission activities with an institutional commitment to deliver appropriate, cost-effective products and services.

  8. NDEx - the Network Data Exchange, A Network Commons for Biologists | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.

  9. Network survivability performance (computer diskette)

    NASA Astrophysics Data System (ADS)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0 MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks, including services. It responds to the need for a common understanding of, and assessment techniques for, network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to meet user expectations for network survivability.

  10. Ground states of partially connected binary neural networks

    NASA Technical Reports Server (NTRS)

    Baram, Yoram

    1990-01-01

    Neural networks defined by outer products of vectors over (-1, 0, 1) are considered. Patterns over (-1, 0, 1) define by their outer products partially connected neural networks consisting of internally strongly connected, externally weakly connected subnetworks. Subpatterns over (-1, 1) define subnetworks, and their combinations that agree in the common bits define permissible words. It is shown that the permissible words are locally stable states of the network, provided that each of the subnetworks stores mutually orthogonal subwords, or, at most, two subwords. It is also shown that when each of the subnetworks stores two mutually orthogonal binary subwords at most, the permissible words, defined as the combinations of the subwords (one corresponding to each subnetwork), that agree in their common bits are the unique ground states of the associated energy function.
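
    A minimal numerical illustration of the outer-product construction described above, assuming a standard sign-threshold update rule; the specific patterns are invented for the example and the check covers only local stability of one permissible word.

        # Illustrative sketch (not the paper's full construction): a Hebbian
        # network built from outer products of patterns over (-1, 0, 1), with a
        # check that a combined word is a fixed point of the sign update.
        import numpy as np

        patterns = np.array([
            [ 1, -1,  1, -1,  0,  0],   # subpattern active on the first subnetwork
            [ 0,  0,  0,  0,  1, -1],   # subpattern active on the second subnetwork
        ])

        W = sum(np.outer(p, p) for p in patterns).astype(float)
        np.fill_diagonal(W, 0.0)                  # no self-connections

        word = np.array([1, -1, 1, -1, 1, -1])    # combination of the two subpatterns
        updated = np.sign(W @ word)
        print(np.array_equal(updated, word))      # True: the word is locally stable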

  11. On the Wiener Polarity Index of Lattice Networks

    PubMed Central

    Chen, Lin; Li, Tao; Liu, Jinfeng; Shi, Yongtang; Wang, Hua

    2016-01-01

    Network structures are everywhere, including but not limited to applications in the biological, physical, and social sciences, information technology, and optimization. Network robustness is of crucial importance in all such applications. Research on this topic relies on finding a suitable measure and using this measure to quantify network robustness. A number of distance-based graph invariants, also known as topological indices, have recently been incorporated as descriptors of complex networks. Among them the Wiener-type indices are the best known and most commonly used such descriptors. As one of the fundamental variants of the original Wiener index, the Wiener polarity index was introduced long ago and is known to be related to the clustering coefficient of networks. In this paper, we consider the value of the Wiener polarity index of lattice networks, a common network structure known for its simplicity and symmetry. We first present a simple general formula for computing the Wiener polarity index of any graph. Using this formula, together with the symmetric and recursive topology of lattice networks, we provide explicit formulas for the Wiener polarity index of the square lattices, the hexagonal lattices, the triangular lattices, and the 3^3 · 4^2 lattices. We also comment on potential future research topics. PMID:27930705
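
    The Wiener polarity index of a graph is the number of unordered vertex pairs at shortest-path distance exactly 3. A brute-force check of that definition on a small lattice patch might look like the sketch below (the paper's closed formulas are not reproduced here).

        # Direct implementation of the distance-based definition; useful only
        # for verifying closed formulas on small finite lattice patches.
        import networkx as nx

        def wiener_polarity(G: nx.Graph) -> int:
            dist = dict(nx.all_pairs_shortest_path_length(G, cutoff=3))
            nodes = list(G)
            return sum(
                1
                for i, u in enumerate(nodes)
                for v in nodes[i + 1:]
                if dist[u].get(v) == 3
            )

        # Example: a 6x6 grid graph, a finite patch of the square lattice.
        G = nx.grid_2d_graph(6, 6)
        print(wiener_polarity(G))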

  12. Common arbuscular mycorrhizal networks amplify competition for phosphorus between seedlings and established plants.

    PubMed

    Merrild, Marie P; Ambus, Per; Rosendahl, Søren; Jakobsen, Iver

    2013-10-01

    Common mycorrhizal networks (CMNs) influence competition between plants, but reports regarding their precise effect are conflicting. We studied CMN effects on phosphorus (P) uptake and growth of seedlings as influenced by various disruptions of network components. Tomato (Solanum lycopersicum) seedlings grew into established networks of Rhizophagus irregularis and cucumber (Cucumis sativus) in two experiments. One experiment studied seedling uptake of (32)P in the network in response to cutting of cucumber shoots; the other analysed seedling uptake of P and nitrogen (N) in the presence of intact or severed arbuscular mycorrhizal fungus networks and at two soil P concentrations. Pre-established and intact networks suppressed growth of tomato seedlings. Cutting of cucumber shoots mitigated P deficiency symptoms of seedlings, which gained access to P in the extraradical mycelium and thereby showed improved growth. Solitary seedlings growing in a network patch that had been severed from the CMN also grew much better than the corresponding CMN seedlings. Interspecific and size-asymmetric competition between plants may be amplified rather than relaxed by CMNs, which transfer P to large plants providing most of the carbon and render small plants P deficient. It is likely that grazing or senescence of the large plants will alleviate the network-induced suppression of seedling growth. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.

  13. Brief Internet and NREN Glossary: Part II (M-Z).

    ERIC Educational Resources Information Center

    Machovec, George S.

    1993-01-01

    Presents the second and final part of a selected glossary of terms commonly used in discussions relating to the Internet and the National Research and Education Network (NREN). Highlights include various network names; organizations; acronyms; user interfaces; network research testbeds; various protocols; remote login; and Wide Area Information…

  14. Wireless Local Area Networks: The Next Evolutionary Step.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2001-01-01

    The Institute of Electrical and Electronics Engineers recently approved a high-speed wireless standard that enables devices from different manufacturers to communicate through a common backbone, making wireless local area networks more feasible in schools. Schools can now use wireless access points and network cards to provide flexible…

  15. Community-Based Research Networks: Development and Lessons Learned in an Emerging Field.

    ERIC Educational Resources Information Center

    Stoecker, Randy; Ambler, Susan H.; Cutforth, Nick; Donohue, Patrick; Dougherty, Dan; Marullo, Sam; Nelson, Kris S.; Stutts, Nancy B.

    2003-01-01

    Compares seven multi-institutional community-based research networks in Appalachia; Colorado; District of Columbia; Minneapolis-St. Paul; Philadelphia; Richmond, Virginia; and Trenton, New Jersey. After reviewing the histories of the networks, conducts a comparative SWOT analysis, showing their common and unique strengths, weaknesses,…

  16. Viable Global Networked Learning. JSRI Occasional Paper No. 23. Latino Studies Series.

    ERIC Educational Resources Information Center

    Arias, Armando A., Jr.

    This paper discusses an innovative paradigm for looking at computer mediated/networked teaching, learning, and research known as BESTNET (Binational English and Spanish Telecommunications Network). BESTNET is functionally defined as an international community of universities and institutions linked by common educational goals and processes,…

  17. Perspectives on the developing common experiment across the 18 sites within the long term agroecosystem research network

    Treesearch

    Kris Havstad

    2016-01-01

    The USDA Agricultural Research Service (ARS) established a Long Term Agroecosystem Research Network (LTAR) across 10 of its research locations, including some of its large watershed facilities, in 2012 and expanded that network to 18 locations in 2014.

  18. The Role of the Australian Open Learning Information Network.

    ERIC Educational Resources Information Center

    Bishop, Robin; And Others

    Three documents are presented which describe the Australian Open Learning Information Network (AOLIN)--a national, independent, and self-supporting network of educational researchers with a common interest in the use of information technology for open and distance education--and discuss two evaluative studies undertaken by the organization. The…

  19. Analysis of electrical tomography sensitive field based on multi-terminal network and electric field

    NASA Astrophysics Data System (ADS)

    He, Yongbo; Su, Xingguo; Xu, Meng; Wang, Huaxiang

    2010-08-01

    Electrical tomography (ET) aims to reconstruct the conductivity/permittivity distribution of a field of interest non-intrusively from boundary voltage/current measurements. The sensor is usually regarded as an electric field, and the finite element method (FEM) is commonly used to calculate the sensitivity matrix and to optimize the sensor architecture. However, since only lumped circuit parameters can be measured by the data acquisition electronics, it is meaningful to treat the sensor as a multi-terminal network. Two types of multi-terminal network, with common-node and common-loop topologies, are introduced. Obtaining more independent measurements and producing a more uniform current distribution are the two main ways to mitigate the inherent ill-posedness of the problem. By exploring the relationships among the network matrices, a general formula is proposed for the first time to calculate the number of independent measurements. Additionally, the sensitivity distribution is analyzed with FEM. As a result, the quasi-opposite mode, an optimal single-source excitation mode, which has the advantages of a more uniform sensitivity distribution and more independent measurements, is proposed.

  20. Relative impacts of environmental variation and evolutionary history on the nestedness and modularity of tree–herbivore networks

    PubMed Central

    Robinson, Kathryn M; Hauzy, Céline; Loeuille, Nicolas; Albrectsen, Benedicte R

    2015-01-01

    Nestedness and modularity are measures of ecological networks whose causative effects are little understood. We analyzed antagonistic plant–herbivore bipartite networks using common gardens in two contrasting environments comprised of aspen trees with differing evolutionary histories of defence against herbivores. These networks were tightly connected owing to a high level of specialization of arthropod herbivores that spend a large proportion of the life cycle on aspen. The gardens were separated by ten degrees of latitude with resultant differences in abiotic conditions. We evaluated network metrics and reported similar connectance between gardens but greater numbers of links per species in the northern common garden. Interaction matrices revealed clear nestedness, indicating subsetting of the bipartite interactions into specialist divisions, in both the environmental and evolutionary aspen groups, although nestedness values were only significant in the northern garden. Variation in plant vulnerability, measured as the frequency of herbivore specialization in the aspen population, was significantly partitioned by environment (common garden) but not by evolutionary origin of the aspens. Significant values of modularity were observed in all network matrices. Trait-matching indicated that growth traits, leaf morphology, and phenolic metabolites affected modular structure in both the garden and evolutionary groups, whereas extra-floral nectaries had little influence. Further examination of module configuration revealed that plant vulnerability explained considerable variance in web structure. The contrasting conditions between the two gardens resulted in bottom-up effects of the environment, which most strongly influenced the overall network architecture, however, the aspen groups with dissimilar evolutionary history also showed contrasting degrees of nestedness and modularity. Our research therefore shows that, while evolution does affect the structure of aspen–herbivore bipartite networks, the role of environmental variations is a dominant constraint. PMID:26306175

  1. Imagining the future: The core episodic simulation network dissociates as a function of timecourse and the amount of simulated information

    PubMed Central

    Thakral, Preston P.; Benoit, Roland G.; Schacter, Daniel L.

    2017-01-01

    Neuroimaging data indicate that episodic memory (i.e., remembering specific past experiences) and episodic simulation (i.e., imagining specific future experiences) are associated with enhanced activity in a common set of neural regions, often referred to as the core network. This network comprises the hippocampus, parahippocampal cortex, lateral and medial parietal cortex, lateral temporal cortex, and medial prefrontal cortex. Evidence for a core network has been taken as support for the idea that episodic memory and episodic simulation are supported by common processes. Much remains to be learned about how specific core network regions contribute to specific aspects of episodic simulation. Prior neuroimaging studies of episodic memory indicate that certain regions within the core network are differentially sensitive to the amount of information recollected (e.g., the left lateral parietal cortex). In addition, certain core network regions dissociate as a function of their timecourse of engagement during episodic memory (e.g., transient activity in the posterior hippocampus and sustained activity in the left lateral parietal cortex). In the current study, we assessed whether similar dissociations could be observed during episodic simulation. We found that the left lateral parietal cortex modulates as a function of the amount of simulated details. Of particular interest, while the hippocampus was insensitive to the amount of simulated details, we observed a temporal dissociation within the hippocampus: transient activity occurred in relatively posterior portions of the hippocampus and sustained activity occurred in anterior portions. Because the posterior hippocampal and lateral parietal findings parallel those observed previously during episodic memory, the present results add to the evidence that episodic memory and episodic simulation are supported by common processes. Critically, the present study also provides evidence that regions within the core network support dissociable processes. PMID:28324695

  2. Latent geometry of bipartite networks

    NASA Astrophysics Data System (ADS)

    Kitsak, Maksim; Papadopoulos, Fragkiskos; Krioukov, Dmitri

    2017-03-01

    Despite the abundance of bipartite networked systems, their organizing principles are less studied compared to unipartite networks. Bipartite networks are often analyzed after projecting them onto one of the two sets of nodes. As a result of the projection, nodes of the same set are linked together if they have at least one neighbor in common in the bipartite network. Even though these projections allow one to study bipartite networks using tools developed for unipartite networks, one-mode projections lead to significant loss of information and artificial inflation of the projected network with fully connected subgraphs. Here we pursue a different approach for analyzing bipartite systems that is based on the observation that such systems have a latent metric structure: network nodes are points in a latent metric space, while connections are more likely to form between nodes separated by shorter distances. This approach has been developed for unipartite networks, and relatively little is known about its applicability to bipartite systems. Here, we fully analyze a simple latent-geometric model of bipartite networks and show that this model explains the peculiar structural properties of many real bipartite systems, including the distributions of common neighbors and bipartite clustering. We also analyze the geometric information loss in one-mode projections in this model and propose an efficient method to infer the latent pairwise distances between nodes. Uncovering the latent geometry underlying real bipartite networks can find applications in diverse domains, ranging from constructing efficient recommender systems to understanding cell metabolism.
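
    As a rough illustration of the latent-geometry idea, not the paper's actual hyperbolic model, the toy sketch below places the two node sets on a ring and connects pairs with probability decaying in their latent distance; the ring geometry and exponential kernel are assumptions made for the example.

        # Toy latent-geometric bipartite model: 1-D ring latent space,
        # exponential connection kernel. Parameters are arbitrary.
        import numpy as np

        rng = np.random.default_rng(0)
        n_top, n_bottom, scale = 50, 80, 0.05

        top = rng.uniform(0, 1, n_top)        # latent coordinates of one node set
        bottom = rng.uniform(0, 1, n_bottom)  # latent coordinates of the other

        # Ring distance between every top-bottom pair.
        d = np.abs(top[:, None] - bottom[None, :])
        d = np.minimum(d, 1 - d)

        # Shorter latent distance -> higher connection probability.
        adj = rng.random((n_top, n_bottom)) < np.exp(-d / scale)
        print("edges:", int(adj.sum()))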

  3. Developing a common framework for evaluating the implementation of genomic medicine interventions in clinical care: the IGNITE Network's Common Measures Working Group.

    PubMed

    Orlando, Lori A; Sperber, Nina R; Voils, Corrine; Nichols, Marshall; Myers, Rachel A; Wu, R Ryanne; Rakhra-Burris, Tejinder; Levy, Kenneth D; Levy, Mia; Pollin, Toni I; Guan, Yue; Horowitz, Carol R; Ramos, Michelle; Kimmel, Stephen E; McDonough, Caitrin W; Madden, Ebony B; Damschroder, Laura J

    2018-06-01

    Purpose: Implementation research provides a structure for evaluating the clinical integration of genomic medicine interventions. This paper describes the Implementing Genomics in Practice (IGNITE) Network's efforts to promote (i) a broader understanding of genomic medicine implementation research and (ii) the sharing of knowledge generated in the network. Methods: To facilitate this goal, the IGNITE Network Common Measures Working Group (CMG) members adopted the Consolidated Framework for Implementation Research (CFIR) to guide its approach to identifying constructs and measures relevant to evaluating genomic medicine as a whole, standardizing data collection across projects, and combining data in a centralized resource for cross-network analyses. Results: The CMG identified 10 high-priority CFIR constructs as important for genomic medicine. Of those, eight did not have standardized measurement instruments. Therefore, we developed four survey tools to address this gap. In addition, we identified seven high-priority constructs related to patients, families, and communities that did not map to CFIR constructs. Both sets of constructs were combined to create a draft genomic medicine implementation model. Conclusion: We developed processes to identify constructs deemed valuable for genomic medicine implementation and codified them in a model. These resources are freely available to facilitate knowledge generation and sharing across the field.

  4. LOGIC OF CONTROLLED THRESHOLD DEVICES.

    DTIC Science & Technology

    The synthesis of threshold logic circuits from several points of view is presented. The first approach is applicable to resistor-transistor networks...in which the outputs are tied to a common collector resistor. In general, fewer threshold logic gates than NOR gates connected to a common collector...network to realize a specified function such that the failure of any but the output gate can be compensated for by a change in the threshold level (and

  5. Dual-phase evolution in complex adaptive systems

    PubMed Central

    Paperin, Greg; Green, David G.; Sadedin, Suzanne

    2011-01-01

    Understanding the origins of complexity is a key challenge in many sciences. Although networks are known to underlie most systems, showing how they contribute to well-known phenomena remains an issue. Here, we show that recurrent phase transitions in network connectivity underlie emergent phenomena in many systems. We identify properties that are typical of systems in different connectivity phases, as well as characteristics commonly associated with the phase transitions. We synthesize these common features into a common framework, which we term dual-phase evolution (DPE). Using this framework, we review the literature from several disciplines to show that recurrent connectivity phase transitions underlie the complex properties of many biological, physical and human systems. We argue that the DPE framework helps to explain many complex phenomena, including perpetual novelty, modularity, scale-free networks and criticality. Our review concludes with a discussion of the way DPE relates to other frameworks, in particular, self-organized criticality and the adaptive cycle. PMID:21247947

  6. Dual-phase evolution in complex adaptive systems.

    PubMed

    Paperin, Greg; Green, David G; Sadedin, Suzanne

    2011-05-06

    Understanding the origins of complexity is a key challenge in many sciences. Although networks are known to underlie most systems, showing how they contribute to well-known phenomena remains an issue. Here, we show that recurrent phase transitions in network connectivity underlie emergent phenomena in many systems. We identify properties that are typical of systems in different connectivity phases, as well as characteristics commonly associated with the phase transitions. We synthesize these common features into a common framework, which we term dual-phase evolution (DPE). Using this framework, we review the literature from several disciplines to show that recurrent connectivity phase transitions underlie the complex properties of many biological, physical and human systems. We argue that the DPE framework helps to explain many complex phenomena, including perpetual novelty, modularity, scale-free networks and criticality. Our review concludes with a discussion of the way DPE relates to other frameworks, in particular, self-organized criticality and the adaptive cycle.

  7. Social structure of Facebook networks

    NASA Astrophysics Data System (ADS)

    Traud, Amanda L.; Mucha, Peter J.; Porter, Mason A.

    2012-08-01

    We study the social structure of Facebook "friendship" networks at one hundred American colleges and universities at a single point in time, and we examine the roles of user attributes (gender, class year, major, high school, and residence) at these institutions. We investigate the influence of common attributes at the dyad level in terms of assortativity coefficients and regression models. We then examine larger-scale groupings by detecting communities algorithmically and comparing them to network partitions based on user characteristics. We thereby examine the relative importance of different characteristics at different institutions, finding for example that common high school is more important to the social organization of large institutions and that the importance of common major varies significantly between institutions. Our calculations illustrate how microscopic and macroscopic perspectives give complementary insights on the social organization at universities and suggest future studies to investigate such phenomena further.

  8. Controlled Vocabulary Service Application for Environmental Data Store

    NASA Astrophysics Data System (ADS)

    Ji, P.; Piasecki, M.; Lovell, R.

    2013-12-01

    In this paper we present a controlled vocabulary service application for the Environmental Data Store (EDS). The purpose of this application is to help researchers and investigators archive, manage, share, search, and retrieve data efficiently in EDS. The Simple Knowledge Organization System (SKOS) is used in the application to represent the controlled vocabularies coming from EDS. The controlled vocabularies of EDS are created by collecting, comparing, choosing, and merging controlled vocabularies, taxonomies, and ontologies widely used and recognized in the geoscience/environmental informatics community, such as the Environment Ontology (EnvO), the Semantic Web for Earth and Environmental Terminology (SWEET) ontology, the CUAHSI Hydrologic Ontology and ODM Controlled Vocabulary, the National Environmental Methods Index (NEMI), National Water Information System (NWIS) codes, the EPSG Geodetic Parameter Data Set, WQX domain values, etc. TemaTres, an open-source, web-based thesaurus management package, is employed and extended to create and manage the controlled vocabularies of EDS in the application. TemaTresView and VisualVocabulary, which work well with TemaTres, are also integrated in the application to provide tree and graphical views of the structure of the vocabularies. The Open Source Edition of Virtuoso Universal Server is set up to provide a web interface for making SPARQL queries against the controlled vocabularies hosted on the Environmental Data Store. Replicas of some key vocabularies commonly used in the community are also maintained as part of the application, such as the General Multilingual Environmental Thesaurus (GEMET) and the NetCDF Climate and Forecast (CF) Standard Names. The application has now been deployed as an elementary, experimental prototype that provides management, search, and download of the EDS controlled vocabularies under the SKOS framework.
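
    As a hedged sketch of how a client might query such a Virtuoso SPARQL endpoint for SKOS concepts, the snippet below uses plain HTTP; the endpoint URL is a hypothetical placeholder, while the SKOS prefix and JSON results structure follow the W3C standards.

        # Query SKOS concept labels from a SPARQL endpoint over HTTP.
        # Virtuoso exposes a standard /sparql interface; the host is invented.
        import requests

        endpoint = "http://eds.example.edu/sparql"   # hypothetical EDS endpoint
        query = """
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label WHERE {
          ?concept a skos:Concept ;
                   skos:prefLabel ?label .
        } LIMIT 20
        """

        resp = requests.get(
            endpoint,
            params={"query": query, "format": "application/sparql-results+json"},
            timeout=30,
        )
        for row in resp.json()["results"]["bindings"]:
            print(row["concept"]["value"], "->", row["label"]["value"])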

  9. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high-performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows that combine common analysis operations close to the massive data stores at NASA. The data is accessed in standard formats (NetCDF, HDF, etc.) in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow to efficiently process huge datasets within limited memory spaces at interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
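
    A hedged sketch of what a WPS-style request to such a service could look like over HTTP, following the generic OGC WPS 1.0 key-value pattern; the host, operation identifier, and input encoding below are illustrative placeholders, not EDAS's documented API.

        # Generic WPS 1.0 key-value Execute request; all identifiers invented.
        import requests

        endpoint = "https://edas.example.nasa.gov/wps"   # hypothetical host
        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "average",                  # hypothetical operation name
            "datainputs": "[variable=tas;domain=d0]", # hypothetical input encoding
        }
        resp = requests.get(endpoint, params=params, timeout=60)
        print(resp.status_code)
        print(resp.text[:500])   # WPS responses are XML status/result documents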

  10. Increasing the value of geospatial informatics with open approaches for Big Data

    NASA Astrophysics Data System (ADS)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest (remote-sensed data processing; data stream processing); Prepare and Structure (SQL and NoSQL databases; data linking; feature identification); Analytics and Visualization (spatial-temporal analytics; machine learning; data exploration); and Modeling and Prediction (integrated environmental models; urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open cloud computing: avoid vendor lock-in through API interoperability and application portability. Open source extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo, and investigate parallelization strategies for N-dimensional spatial data. Geospatial data representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS), using geospatial encodings like NetCDF and GeoPackage. Big linked geodata: use linked data methods scaled to big geodata. Analysis-ready data: support "download as last resort" and "analytics as a service", and promote elements common to "datacubes."

  11. Tackling the 2nd V: Big Data, Variety and the Need for Representation Consistency

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K. S.

    2016-12-01

    While Big Data technologies are transforming our ability to analyze ever larger volumes of Earth science data, practical constraints continue to limit our ability to compare data across datasets from different sources in an efficient and robust manner. Within a single data collection, invariants such as file format, grid type, and spatial resolution greatly simplify many types of analysis (often implicitly). However, when analysis combines data across multiple data collections, researchers are generally required to implement data transformations (i.e., "data preparation") to provide appropriate invariants. These transformations include changing file formats, ingesting into a database, and/or regridding to a common spatial representation, and they can be performed either once, statically, or each time the data is accessed. At the very least, this process is inefficient from the perspective of the community, as each team selects its own representation and privately implements the appropriate transformations. No doubt there are disadvantages to any "universal" representation, but we posit that major benefits would be obtained if a suitably flexible spatial representation could be standardized along with tools for transforming to/from that representation. We regard this as part of the historic trend in data publishing. Early datasets used ad hoc formats and lacked metadata. As better tools evolved, published data began to use standardized formats (e.g., HDF and netCDF) with attached metadata. We propose that the modern need to perform analysis across data sets should drive a new generation of tools that support a standardized spatial representation. More specifically, we propose the hierarchical triangular mesh (HTM) as a suitable "generic" representation that permits standard transformations to/from the native representations in use today, along with tools to convert/regrid existing datasets onto that representation.

  12. AWS-Glacier As A Storage Foundation For AWS-EC2 Hosted Scientific Data Services

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Potter, N.

    2016-12-01

    Using AWS Glacier as a base-level data store for a scientific data service presents new challenges for web-accessible data services, along with their software clients and human operators. All meaningful Glacier transactions take at least 4 hours to complete. This is in contrast to the various web APIs for data, such as WMS, WFS, WCS, DAP2, and NetCDF tools, which were all written on the premise that the response will be (nearly) immediate. Only DAP4 and WPS contain an explicit asynchronous component in their respective protocols that allows for "return later" behaviors. We were able to put Hyrax (a DAP4 server) in front of Glacier-held resources, but there were significant issues. Any kind of probing of the datasets happens at the cost of the Glacier retrieval period, 4 hours. A couple of crucial things fall out of this. First, the service must cache metadata, including coordinate map arrays, so that a client has enough information available in the "immediate" time frame to make decisions about what to ask for from the dataset. This type of request planning is important because a data access request will take 4 hours to complete unless the data resource has been cached. Second, clients need to change their behavior when accessing datasets in an asynchronous system, even if the metadata is cached. Commonly, client applications request a number of data components from a DAP2 service in the course of "discovering" the dataset. This may not be a well-supported model of interaction with Glacier or any other high-latency data store.
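
    The asynchronous retrieval pattern described above can be sketched with the boto3 Glacier client; the vault and archive identifiers are placeholders.

        # Initiate-then-poll retrieval against Glacier with boto3. A job takes
        # hours to stage the archive, so the two halves run far apart in time.
        import boto3

        glacier = boto3.client("glacier")

        # Kick off a retrieval job; Glacier stages the archive asynchronously.
        job = glacier.initiate_job(
            accountId="-",                    # "-" means the caller's own account
            vaultName="my-science-vault",     # placeholder
            jobParameters={"Type": "archive-retrieval",
                           "ArchiveId": "EXAMPLE-ARCHIVE-ID"},
        )
        job_id = job["jobId"]

        # Hours later: poll for completion and fetch the bytes once staged.
        status = glacier.describe_job(accountId="-",
                                      vaultName="my-science-vault", jobId=job_id)
        if status["Completed"]:
            out = glacier.get_job_output(accountId="-",
                                         vaultName="my-science-vault", jobId=job_id)
            data = out["body"].read()
            print(len(data), "bytes retrieved")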

  13. Environmental Data Store (EDS): A multi-node Data Storage Facility for diverse sets of Geoscience Data

    NASA Astrophysics Data System (ADS)

    Piasecki, M.; Ji, P.

    2014-12-01

    Geoscience data come in many flavors determined by the type of data: continuous data on a grid or mesh, or discrete data collected at a point, either as one-time samples or as a stream coming off sensors; they can also encompass digital files of any type, such as text files, WORD or EXCEL documents, or audio and video files. We present a storage facility comprised of six nodes, each specialized to host a certain data type: grid-based data (netCDF on a THREDDS server), GIS data (shapefiles using GeoServer), point time-series data (CUAHSI ODM), sample data (EDBS), and any digital data (RAMADDA), plus a server for remote sensing data and its products. While there is overlap in data type storage capabilities (rasters can go into several of these nodes), we prefer to use dedicated storage facilities that a) are freeware, b) have a good degree of maturity, and c) have shown their utility for storing a certain type. In addition, this arrangement places these commonly used software stacks and storage solutions side by side, allowing us to develop interoperability strategies. We have used a DRUPAL-based system to handle user registration and authentication, and we also use the system for data submission and data search. In support of this system we developed an extensive controlled vocabulary that amalgamates various CVs used in the geoscience community in order to achieve as high a degree of recognition as possible, such as the CF conventions, CUAHSI CVs, NASA (GCMD), EPA and USGS taxonomies, and GEMET, in addition to ontological representations such as SWEET.

  14. Characterizing the evolution of climate networks

    NASA Astrophysics Data System (ADS)

    Tupikina, L.; Rehfeld, K.; Molkenthin, N.; Stolbova, V.; Marwan, N.; Kurths, J.

    2014-06-01

    Complex network theory has been successfully applied to understand the structural and functional topology of many dynamical systems from nature, society, and technology. Many properties of these systems change over time, and, consequently, networks reconstructed from them will, too. However, although static and temporally changing networks have been studied extensively, methods to quantify their robustness as they evolve in time are lacking. In this paper we develop a theory to investigate how networks change over time, based on the quantitative analysis of dissimilarities in network structure. Our main result is the common component evolution function (CCEF), which characterizes network development over time. To test our approach we apply it to several model systems: Erdős-Rényi networks, analytically derived flow-based networks, and transient simulations from the START model, for which we control the change of single parameters over time. We then construct annual climate networks from NCEP/NCAR reanalysis data for the Asian monsoon domain for the period 1970-2011 CE and use the CCEF to characterize the temporal evolution of this region. While this real-world CCEF displays a high degree of network persistence over large time lags, there are distinct time periods when common links break down. The phasing of these events coincides with years of strong El Niño/Southern Oscillation phenomena, confirming previous studies. The proposed method can be applied to any type of evolving network where the link set, but not the node set, is changing, and may be particularly useful for characterizing nonstationary evolving systems using complex networks.
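
    A simplified stand-in for the idea of tracking common links across network snapshots, not the paper's exact CCEF definition, is the shared-edge fraction between two snapshots:

        # Fraction of links two network snapshots share (Jaccard over edge sets).
        # Illustrative only; the CCEF itself is defined in the paper.
        import networkx as nx

        def common_link_fraction(G1: nx.Graph, G2: nx.Graph) -> float:
            e1 = set(map(frozenset, G1.edges()))
            e2 = set(map(frozenset, G2.edges()))
            union = e1 | e2
            return len(e1 & e2) / len(union) if union else 1.0

        G_a = nx.erdos_renyi_graph(100, 0.05, seed=1)   # stand-ins for annual
        G_b = nx.erdos_renyi_graph(100, 0.05, seed=2)   # climate networks
        print(common_link_fraction(G_a, G_b))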

  15. Interworking evolution of mobile satellite and terrestrial networks

    NASA Technical Reports Server (NTRS)

    Matyas, R.; Kelleher, P.; Moller, P.; Jones, T.

    1993-01-01

    There is considerable interest among mobile satellite service providers in interworking with terrestrial networks to provide a universal global network. With such interworking, subscribers may be provided a common set of services such as those planned for the Public Switched Telephone Network (PSTN), the Integrated Services Digital Network (ISDN), and future Intelligent Networks (IN's). This paper first reviews issues in satellite interworking. Next the status and interworking plans of terrestrial mobile communications service providers are examined with early examples of mobile satellite interworking including a discussion of the anticipated evolution towards full interworking between mobile satellite and both fixed and mobile terrestrial networks.

  16. Proceedings of a Conference on Telecommunication Technologies, Networking and Libraries

    NASA Astrophysics Data System (ADS)

    Knight, N. K.

    1981-12-01

    Current and developing technologies for digital transmission of image data likely to have an impact on the operations of libraries and information centers, or to provide support for information networking, are reviewed. Topics reviewed include slow scan television, teleconferencing, and videodisc technology; standards development for computer network interconnection through hardware and software, particularly packet-switched networks; computer network protocols for library and information service applications; the structure of a national bibliographic telecommunications network; and the major policy issues involved in the regulation or deregulation of the common communications carrier industry.

  17. Satellite Level 3 & 4 Data Subsetting at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Huwe, Paul; Su, Jian; Loeser, Carlee; Ostrenga, Dana; Rui, Hualan; Vollmer, Bruce

    2017-01-01

    Earth Science data are available in many file formats (NetCDF, HDF, GRB, etc.) and in a wide range of sizes, from kilobytes to gigabytes. These properties can challenge users who are unfamiliar with the formats or who want only a small region of interest (ROI) from a specific dataset. At NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), we have developed and implemented a multipurpose subset service to ease user access to Earth Science data. Our Level 3 & 4 Regridder can subset across multiple parameters (spatially, temporally, by level, and by variable) and offers additional beneficial features (temporal means, regridding to target grids, and file conversion to other data formats). In this presentation, we will demonstrate how users can use this service to access only the data they need, in the form they require.
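
    The same kinds of operations the Regridder performs server-side can be approximated locally with xarray; in the sketch below the file, variable, and coordinate names are placeholders.

        # Local analogue of spatial/temporal/variable subsetting plus a
        # temporal mean, using xarray. All names are placeholders.
        import xarray as xr

        ds = xr.open_dataset("example_level3_product.nc")        # placeholder file

        subset = ds["precipitation"].sel(                        # variable subsetting
            lat=slice(25, 50), lon=slice(-125, -65),             # spatial ROI
            time=slice("2016-01-01", "2016-12-31"),              # temporal subsetting
        )
        monthly_mean = subset.resample(time="MS").mean()         # temporal means
        monthly_mean.to_netcdf("subset_monthly_mean.nc")         # file conversion/output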

  18. HYDRA: Hyperspectral Data Research Application

    NASA Astrophysics Data System (ADS)

    Rink, T.; Whittaker, T.

    2005-12-01

    HYDRA is a freely available, easy-to-install tool for visualization and analysis of large local or remote hyperspectral and multispectral datasets. HYDRA is implemented on top of the open-source VisAD Java library via Jython, the Java implementation of the user-friendly Python programming language. VisAD provides data integration through its generalized data model, user-display interaction, and display rendering. Jython has an easy-to-read, concise, scripting-like syntax that eases software development. HYDRA allows sharing of large datasets through its support of the OpenDAP and OpenADDE server-client protocols. Users can explore and interrogate data, and subset it in physical and/or spectral space to isolate key areas of interest for further analysis without having to download an entire dataset. It also has an extensible data input architecture to recognize new instruments and understand different local file formats; currently NetCDF and HDF4 are supported.

  19. Satellite Level 3 & 4 Data Subsetting at NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Huwe, P.; Su, J.; Loeser, C. F.; Ostrenga, D.; Rui, H.; Vollmer, B.

    2017-12-01

    Earth Science data are available in many file formats (NetCDF, HDF, GRB, etc.) and in a wide range of sizes, from kilobytes to gigabytes. These properties can challenge users who are unfamiliar with the formats or who want only a small region of interest (ROI) from a specific dataset. At NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), we have developed and implemented a multipurpose subset service to ease user access to Earth Science data. Our Level 3 & 4 Regridder can subset across multiple parameters (spatially, temporally, by level, and by variable) and offers additional beneficial features (temporal means, regridding to target grids, and file conversion to other data formats). In this presentation, we will demonstrate how users can use this service to access only the data they need, in the form they require.

  20. Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud

    NASA Astrophysics Data System (ADS)

    Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.

    2016-12-01

    We present an account of our experience building an ecosystem for the analysis of big atmospheric datasets. Using modern technologies we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems, such as Hadoop MapReduce, Spark, and Dask, in order to find the one best suited for analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable and which can scale to accommodate changes in demand. We make this platform readily accessible using browser-based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which can analyse very large amounts of data using cutting-edge big-data technology.
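
    A minimal sketch of this notebook-style workflow, assuming hypothetical file paths and variable names, using xarray with Dask-backed chunks:

        # Open a large NetCDF collection lazily and compute in parallel.
        # The glob pattern and variable name are placeholders.
        import xarray as xr

        ds = xr.open_mfdataset(
            "archive/temperature_*.nc",      # placeholder glob over many files
            chunks={"time": 365},            # each chunk becomes a Dask task
            combine="by_coords",
        )
        climatology = ds["t2m"].groupby("time.month").mean("time")  # lazy graph
        result = climatology.compute()       # executes in parallel on the workers
        print(result)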

  1. Large-scale integrative network-based analysis identifies common pathways disrupted by copy number alterations across cancers

    PubMed Central

    2013-01-01

    Background: Many large-scale studies analyzed high-throughput genomic data to identify altered pathways essential to the development and progression of specific types of cancer. However, no previous study has been extended to provide a comprehensive analysis of pathways disrupted by copy number alterations across different human cancers. Towards this goal, we propose a network-based method to integrate copy number alteration data with human protein-protein interaction networks and pathway databases to identify pathways that are commonly disrupted in many different types of cancer. Results: We applied our approach to a data set of 2,172 cancer patients across 16 different types of cancer, and discovered a set of commonly disrupted pathways, which are likely essential for tumor formation in the majority of the cancers. We also identified pathways that are disrupted only in specific cancer types, providing molecular markers for different human cancers. Analysis with independent microarray gene expression datasets confirms that the commonly disrupted pathways can be used to identify patient subgroups with significantly different survival outcomes. We also provide a network view of the disrupted pathways to explain how copy number alterations affect pathways that regulate cell growth, cycle, and differentiation for tumorigenesis. Conclusions: In this work, we demonstrated that network-based integrative analysis can help to identify pathways disrupted by copy number alterations across 16 types of human cancers, which are not readily identifiable by conventional overrepresentation-based and other pathway-based methods. All the results and source code are available at http://compbio.cs.umn.edu/NetPathID/. PMID:23822816

  2. Social Software: Participants' Experience Using Social Networking for Learning

    ERIC Educational Resources Information Center

    Batchelder, Cecil W.

    2010-01-01

    Social networking tools used in learning provides instructional design with tools for transformative change in education. This study focused on defining the meanings and essences of social networking through the lived common experiences of 7 college students. The problem of the study was a lack of learner voice in understanding the value of social…

  3. An International Knowledge Building Network for Sustainable Curriculum and Pedagogical Innovation

    ERIC Educational Resources Information Center

    Laferrière, Thérèse; Law, Nancy; Montané, Mireia

    2012-01-01

    This paper presents the results of the first phase (2007-2009) of a design experiment, the Knowledge Building International Project (KBIP), in which K-12 teachers from several countries collaborate as a loosely coupled network of networks with a common goal--to implement technology-supported knowledge building jointly across their classrooms.…

  4. Advantages of Social Network Analysis in Educational Research

    ERIC Educational Resources Information Center

    Ushakov, K. M.; Kukso, K. N.

    2015-01-01

    Currently one of the main tools for the large scale studies of schools is statistical analysis. Although it is the most common method and it offers greatest opportunities for analysis, there are other quantitative methods for studying schools, such as network analysis. We discuss the potential advantages that network analysis has for educational…

  5. Competitive seeds-selection in complex networks

    NASA Astrophysics Data System (ADS)

    Zhao, Jiuhua; Liu, Qipeng; Wang, Lin; Wang, Xiaofan

    2017-02-01

    This paper investigates a competitive diffusion model in which two competitors simultaneously select a set of nodes (seeds) in the network to influence. We focus on the problem of how to select these seeds such that, when the diffusion process terminates, a competitor obtains more support than its opponent. Instead of studying this problem in the game-theoretic framework as in existing work, in this paper we design several heuristic seed-selection strategies inspired by commonly used centrality measures: Betweenness Centrality (BC), Closeness Centrality (CC), Degree Centrality (DC), Eigenvector Centrality (EC), and K-shell Centrality (KS). We mainly compare three centrality-based strategies, which perform better in competition with the random selection strategy, through simulations on both real and artificial networks. Even though network structure varies across different networks, we find a common trend appearing in all of them. Roughly speaking, the BC-based strategy and the DC-based strategy are better than the CC-based strategy. Moreover, if a competitor adopts the CC-based strategy, then the BC-based strategy is a better strategy for its opponent than the DC-based strategy, and the superiority of the BC-based strategy decreases as the heterogeneity of the network decreases.
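
    In their simplest form, centrality-based selection heuristics like those compared above reduce to ranking nodes by a centrality score and taking the top k; a sketch with networkx follows (the test network and k are chosen arbitrarily).

        # Rank nodes by a centrality measure and take the top-k as seeds.
        import networkx as nx

        def top_k_seeds(G: nx.Graph, k: int, centrality_fn) -> list:
            scores = centrality_fn(G)
            return sorted(scores, key=scores.get, reverse=True)[:k]

        G = nx.barabasi_albert_graph(200, 3, seed=42)   # stand-in test network
        for name, fn in [
            ("BC", nx.betweenness_centrality),
            ("CC", nx.closeness_centrality),
            ("DC", nx.degree_centrality),
        ]:
            print(name, top_k_seeds(G, 5, fn))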

  6. Synchronization invariance under network structural transformations

    NASA Astrophysics Data System (ADS)

    Arola-Fernández, Lluís; Díaz-Guilera, Albert; Arenas, Alex

    2018-06-01

    Synchronization processes are ubiquitous despite the many connectivity patterns that complex systems can show. Usually, the emergence of synchrony is a macroscopic observable; however, the microscopic details of the system, such as the underlying network of interactions, are often partially or totally unknown. We already know that different interaction structures can give rise to a common functionality, understood as a common macroscopic observable. Building upon this fact, here we propose network transformations that keep the collective behavior of a large system of Kuramoto oscillators invariant. We derive a method based on information-theoretic principles that allows us to adjust the weights of the structural interactions to map random homogeneous in-degree networks into random heterogeneous networks and vice versa, keeping synchronization values invariant. The results of the proposed transformations reveal an interesting principle: heterogeneous networks can be mapped to homogeneous ones with local information, but the reverse process needs to exploit higher-order information. The formalism provides analytical insight for tackling real complex scenarios when dealing with uncertainty in the measurements of the underlying connectivity structure.
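
    For reference, a plain Kuramoto simulation measuring the global order parameter, the synchronization observable such transformations aim to preserve; this is a generic illustration with arbitrary parameters, not the paper's mapping procedure.

        # Euler integration of the Kuramoto model on a random coupling network,
        # followed by the usual order-parameter readout r in [0, 1].
        import numpy as np

        rng = np.random.default_rng(0)
        N, K, dt, steps = 100, 2.0, 0.01, 5000

        A = (rng.random((N, N)) < 0.1).astype(float)   # random coupling network
        A = np.triu(A, 1); A = A + A.T                 # symmetric, no self-loops
        omega = rng.normal(0, 1, N)                    # natural frequencies
        theta = rng.uniform(0, 2 * np.pi, N)

        for _ in range(steps):
            # d(theta_i)/dt = omega_i + K * sum_j A_ij sin(theta_j - theta_i)
            coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
            theta += dt * (omega + K * coupling)

        r = np.abs(np.exp(1j * theta).mean())          # global synchronization level
        print(f"order parameter r = {r:.3f}")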

  7. Injured Brains and Adaptive Networks: The Benefits and Costs of Hyperconnectivity.

    PubMed

    Hillary, Frank G; Grafman, Jordan H

    2017-05-01

    A common finding in human functional brain-imaging studies is that damage to neural systems paradoxically results in enhanced functional connectivity between network regions, a phenomenon commonly referred to as 'hyperconnectivity'. Here, we describe the various ways that hyperconnectivity operates to benefit a neural network following injury while simultaneously negotiating the trade-off between metabolic cost and communication efficiency. Hyperconnectivity may be optimally expressed by increasing connections through the most central and metabolically efficient regions (i.e., hubs). While adaptive in the short term, we propose that chronic hyperconnectivity may leave network hubs vulnerable to secondary pathological processes over the life span due to chronically elevated metabolic stress. We conclude by offering novel, testable hypotheses for advancing our understanding of the role of hyperconnectivity in systems-level brain plasticity in neurological disorders. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Stochastic Dynamical Model of a Growing Citation Network Based on a Self-Exciting Point Process

    NASA Astrophysics Data System (ADS)

    Golosovsky, Michael; Solomon, Sorin

    2012-08-01

    We put under experimental scrutiny the preferential attachment model that is commonly accepted as the generating mechanism of scale-free complex networks. To this end we chose a citation network of physics papers and traced the citation history of 40,195 papers published in one year. Contrary to common belief, we find that the citation dynamics of individual papers follows superlinear preferential attachment, with the exponent α = 1.25-1.3. Moreover, we show that the citation process cannot be described as a memoryless Markov chain, since there is a substantial correlation between the present and recent citation rates of a paper. Based on our findings we construct a stochastic growth model of the citation network, perform numerical simulations based on this model, and achieve an excellent agreement with the measured citation distributions.
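
    A toy growth simulation with attachment probability proportional to k^α illustrates the superlinear mechanism; the initial weights and parameters are invented for the example and this is not the paper's full model (which also includes memory effects).

        # Superlinear preferential attachment: new papers cite earlier ones
        # with probability proportional to (current weight)**alpha.
        import numpy as np

        rng = np.random.default_rng(1)
        alpha, n_papers, cites_per_paper = 1.3, 5000, 3
        citations = np.ones(n_papers)   # weight 1 so uncited papers can be chosen

        for t in range(10, n_papers):   # papers arrive one per step
            weights = citations[:t] ** alpha
            targets = rng.choice(t, size=cites_per_paper, replace=False,
                                 p=weights / weights.sum())
            citations[targets] += 1

        print("max citations:", int(citations.max() - 1))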

  9. A novel tracing method for the segmentation of cell wall networks.

    PubMed

    De Vylder, Jonas; Rooms, Filip; Dhondt, Stijn; Inze, Dirk; Philips, Wilfried

    2013-01-01

    Cell wall networks are a common subject of research in biology and are important for plant growth analysis, organ studies, etc. In order to automate the detection of individual cells in such cell wall networks, we propose a new segmentation algorithm. The proposed method is a network tracing algorithm that exploits prior knowledge of the network structure. The method is applicable to multiple microscopy modalities, such as fluorescence, as well as to images captured using non-invasive microscopes such as differential interference contrast (DIC) microscopes.

  10. Technologies for unattended network operations

    NASA Technical Reports Server (NTRS)

    Jaworski, Allan; Odubiyi, Jide; Holdridge, Mark; Zuzek, John

    1991-01-01

    The necessary network management functions for a telecommunications, navigation and information management (TNIM) system in the framework of an extension of the ISO model for communications network management are described. Various technologies that could substantially reduce the need for TNIM network management, automate manpower intensive functions, and deal with synchronization and control at interplanetary distances are presented. Specific technologies addressed include the use of the ISO Common Management Interface Protocol, distributed artificial intelligence for network synchronization and fault management, and fault-tolerant systems engineering.

  11. Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Milos Manic

    The planned large-scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures, multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate normal network behavior models to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.

  12. Emergence of communities and diversity in social networks

    PubMed Central

    Han, Xiao; Cao, Shinan; Shen, Zhesi; Zhang, Boyu; Wang, Wen-Xu; Cressman, Ross

    2017-01-01

    Communities are common in complex networks and play a significant role in the functioning of social, biological, economic, and technological systems. Despite widespread interest in detecting community structures in complex networks and exploring the effect of communities on collective dynamics, a deep understanding of the emergence and prevalence of communities in social networks is still lacking. Addressing this fundamental problem is of paramount importance in understanding, predicting, and controlling a variety of collective behaviors in society. An elusive question is how communities with common internal properties arise in social networks with great individual diversity. Here, we answer this question using the ultimatum game, which has been a paradigm for characterizing altruism and fairness. We experimentally show that stable local communities with different internal agreements emerge spontaneously and induce social diversity into networks, which is in sharp contrast to populations with random interactions. Diverse communities and social norms come from the interaction between responders with inherent heterogeneous demands and rational proposers via local connections, where the former eventually become the community leaders. This result indicates that networks are significant in the emergence and stabilization of communities and social diversity. Our experimental results also provide valuable information about strategies for developing network models and theories of evolutionary games and social dynamics. PMID:28235785

  13. Emergence of communities and diversity in social networks.

    PubMed

    Han, Xiao; Cao, Shinan; Shen, Zhesi; Zhang, Boyu; Wang, Wen-Xu; Cressman, Ross; Stanley, H Eugene

    2017-03-14

    Communities are common in complex networks and play a significant role in the functioning of social, biological, economic, and technological systems. Despite widespread interest in detecting community structures in complex networks and exploring the effect of communities on collective dynamics, a deep understanding of the emergence and prevalence of communities in social networks is still lacking. Addressing this fundamental problem is of paramount importance in understanding, predicting, and controlling a variety of collective behaviors in society. An elusive question is how communities with common internal properties arise in social networks with great individual diversity. Here, we answer this question using the ultimatum game, which has been a paradigm for characterizing altruism and fairness. We experimentally show that stable local communities with different internal agreements emerge spontaneously and induce social diversity into networks, which is in sharp contrast to populations with random interactions. Diverse communities and social norms come from the interaction between responders with inherent heterogeneous demands and rational proposers via local connections, where the former eventually become the community leaders. This result indicates that networks are significant in the emergence and stabilization of communities and social diversity. Our experimental results also provide valuable information about strategies for developing network models and theories of evolutionary games and social dynamics.

  14. Applying NGS Data to Find Evolutionary Network Biomarkers from the Early and Late Stages of Hepatocellular Carcinoma

    PubMed Central

    Wu, Chia-Chou; Lin, Chih-Lung; Chen, Ting-Shou

    2015-01-01

    Hepatocellular carcinoma (HCC) is the major type of liver tumor (~80%), besides hepatoblastomas, angiosarcomas, and cholangiocarcinomas. In this study, we used a systems biology approach to construct protein-protein interaction networks (PPINs) for early-stage and late-stage liver cancer. By comparing the networks of these two stages, we found that the two networks showed some common mechanisms and some significantly different mechanisms. To obtain the differential network structures between cancer and noncancer PPINs, we constructed cancer and noncancer PPIN network structures for the two stages of liver cancer by a systems biology method using NGS data from cancer cells and adjacent noncancer cells. Using carcinogenesis relevance values (CRVs), we identified 43 and 80 significant proteins and their PPINs (network markers) for early-stage and late-stage liver cancer, respectively. To investigate the evolution of network biomarkers in the carcinogenesis process, a primary pathway analysis showed that common pathways of the early and late stages were those related to ordinary cancer mechanisms. A pathway specific to the early stage was the mismatch repair pathway, while pathways specific to the late stage were the spliceosome pathway, lysine degradation pathway, and progesterone-mediated oocyte maturation pathway. This study provides a new direction for cancer-targeted therapies at different stages. PMID:26366411

  15. The EuroSITES network: Integrating and enhancing fixed-point open ocean observatories around Europe

    NASA Astrophysics Data System (ADS)

    Lampitt, Richard S.; Larkin, Kate E.; EuroSITES Consortium

    2010-05-01

    EuroSITES is a 3-year (2008-2011) EU collaborative project (3.5 MEuro) with the objective to integrate and enhance the nine existing open ocean fixed-point observatories around Europe (www.eurosites.info). These observatories are primarily composed of full-depth moorings and make multidisciplinary in situ observations within the water column as the European contribution to the global array OceanSITES (www.oceansites.org). In the first 18 months, all 9 observatories have been active and integration has been significant through the maintenance and enhancement of observatory hardware. Highlights include the enhancement of observatories with sensors to measure O2, pCO2, chlorophyll, and nitrate in near real-time from the upper 1000 m. In addition, some seafloor missions are also actively supported. These include seafloor platforms currently deployed in the Mediterranean, one for tsunami detection and one to monitor fluid flow related to seismic activity and slope stability. Upcoming seafloor science missions in 2010 include monitoring benthic biological communities and associated biogeochemistry as indicators of climate change in both the Northeast Atlantic and Mediterranean. EuroSITES also promotes the development of innovative sensors and samplers in order to advance the capability to measure climate-relevant properties of the ocean. These include further developing current technologies for autonomous long-term monitoring of oxygen consumption in the mesopelagic, pH, and mesozooplankton abundance. Many of these science missions are directly related to complementary activities in other European projects such as EPOCA, HYPOX and ESONET. In 2010 a direct collaboration including in situ field work will take place between ESONET and EuroSITES. The demonstration mission MODOO (funded by ESONET) will be implemented in 2010 at the EuroSITES PAP observatory. Field work will include deployment of a seafloor lander system with various sensors which will send data to shore in real time via the EuroSITES water column infrastructure. EuroSITES data management is led by NOCS, UK, with CORIOLIS, France, as one of the Global Data Assembly Centres (GDACs) for both EuroSITES and OceanSITES. EuroSITES maintains the OceanSITES and GEO philosophy of open access to data in near real-time. With a common data policy and standardised data formats (OceanSITES NetCDF), EuroSITES is increasing the potential users of in situ ocean datasets and the societal benefit of these data. For instance, CORIOLIS is central to the ever increasing contribution of EuroSITES as an upstream data provider to the GMES project MyOcean (both real-time and delayed-mode data). Outreach and knowledge transfer of EuroSITES activities and results are also a key component of the project, with a dedicated outreach website, Fact Sheet, cruise diaries and educational tools being developed in the first 18 months. In 2010 a film will be released to represent the network and this will be distributed to a wide audience through the European network of aquaria and at other outreach events. In addition, the EuroSITES project and its relevance to global ocean observation initiatives continues to be actively promoted at both scientific and non-specialist meetings and events. By the end of EuroSITES in April 2011, the 9 core ocean observatories will be well integrated. Each observatory will have enhanced infrastructure to include both physical and biogeochemical sensors. Science missions in the ocean interior and seafloor/subseafloor will have progressed European ocean observational capability significantly. Collaborations will have taken place or will be at an advanced stage of planning with related European and international projects including the ESONET FP6 NoE and the NSF-funded Ocean Observatories Initiative (OOI) (400M over 5 years). EuroSITES will continue to develop its contribution to the ocean component of the Group on Earth Observations (GEO) through task AR-09-03c 'Global Ocean Observing Systems' and related societal benefit areas.
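
    Since EuroSITES standardizes on OceanSITES NetCDF, a reader of such data might start with something like the following Python sketch; the file name is hypothetical, and the variable names (TIME, TEMP) follow common OceanSITES conventions but should be checked against the actual file.

```python
# Sketch of reading a mooring time series from an OceanSITES-style NetCDF
# file. The file name is a placeholder; the variable names (TIME, TEMP)
# follow common OceanSITES conventions but may differ per file.
from netCDF4 import Dataset, num2date

with Dataset("OS_PAP_example.nc") as nc:          # hypothetical file name
    time_var = nc.variables["TIME"]
    times = num2date(time_var[:], units=time_var.units)
    temp = nc.variables["TEMP"][:]                # e.g. (time, depth) array
    print(nc.title if "title" in nc.ncattrs() else "no title")
    print(times[0], temp.shape)
```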

  16. Network survivability performance

    NASA Astrophysics Data System (ADS)

    1993-11-01

    This technical report has been developed to address the survivability of telecommunications networks, including services. It responds to the need for a common understanding of, and assessment techniques for, network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunications networks to meet user expectations for network survivability, and a foundation for continuing industry activities in the subject area. This report focuses on the survivability of both public and private networks and covers a wide range of users. Two frameworks are established: one for quantifying and categorizing service outages, and one for classifying network survivability techniques and measures. The performance of the network survivability techniques is considered; however, recommended objectives are not established for network survivability performance.

  17. Scientific networking in disciplines

    NASA Astrophysics Data System (ADS)

    Chang, Ching-Ray; Marks, Ann; Dawson, Silvina Ponce

    2013-03-01

    Scientific networking occurs at various levels. There are regional and worldwide professional organizations that link together national physical societies (IUPAP, EPS, AAPPS, FeLaSoFi), providing a platform to exchange ideas and advance common agendas. National and international agencies have special lines of funding for scientific collaboration between groups of various countries. Some of these lines are targeted at improving science education at all levels. There are then personal networks that link people with common interests or who know each other for any reason. The International Conferences on Women in Physics have provided a unique opportunity for female physicists from all over the world to start a network of interactions that can involve all sorts of collaborative efforts. In the three-session workshop organized at ICWIP11, we discussed the various issues that the worldwide scientific community faces. In this paper we summarize the main ideas that emerged during the meeting and provide the list of recommendations that were made to start and maintain an active network of female physicists and to foster scientific collaboration regionally and internationally.

  18. Scientific collaboration and team science: a social network analysis of the centers for population health and health disparities.

    PubMed

    Okamoto, Janet

    2015-03-01

    The past decade has seen dramatic shifts in the way that scientific research is conducted as networks, consortia, and large research centers are funded as transdisciplinary, team-based enterprises to tackle complex scientific questions. Key investigators (N = 167) involved in ten health disparities research centers completed a baseline social network and collaboration readiness survey. Collaborative ties existed primarily between investigators from the same center, with just 7% of ties occurring across different centers. Grants and work groups were the most common types of ties between investigators, with shared presentations the most common tie across different centers. Transdisciplinary research orientation was associated with network position and reciprocity. Center directors/leaders were significantly more likely to form ties with investigators in other roles, such as statisticians and trainees. Understanding research collaboration networks can help to more effectively design and manage future team-based research, as well as to pinpoint potential issues and support continuous evaluation of existing efforts.
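
    The kinds of statistics reported above (share of cross-center ties, reciprocity, network position) are straightforward to compute with networkx; the toy edge list and center assignments below are invented for illustration.

```python
# Sketch of the network statistics reported above, computed with networkx
# on toy data: share of ties crossing centers, reciprocity, and a simple
# position measure. Edge list and center assignments are invented.
import networkx as nx

center = {"a": 1, "b": 1, "c": 2, "d": 2, "e": 3}
G = nx.DiGraph([("a", "b"), ("b", "a"), ("a", "c"),
                ("c", "d"), ("d", "e"), ("e", "d")])

cross = sum(1 for u, v in G.edges() if center[u] != center[v])
print("cross-center share:", cross / G.number_of_edges())
print("reciprocity:", nx.reciprocity(G))
print("in-degree centrality:", nx.in_degree_centrality(G))
```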

  19. Common mycorrhizal networks and their effect on the bargaining power of the fungal partner in the arbuscular mycorrhizal symbiosis.

    PubMed

    Bücking, Heike; Mensah, Jerry A; Fellbaum, Carl R

    2016-01-01

    Arbuscular mycorrhizal (AM) fungi form mutualistic interactions with the majority of land plants, including some of the most important crop species. The fungus takes up nutrients from the soil, and transfers these nutrients to the mycorrhizal interface in the root, where these nutrients are exchanged against carbon from the host. AM fungi form extensive hyphal networks in the soil and connect with their network multiple host plants. These common mycorrhizal networks (CMNs) play a critical role in the long-distance transport of nutrients through soil ecosystems and allow the exchange of signals between the interconnected plants. CMNs affect the survival, fitness, and competitiveness of the fungal and plant species that interact via these networks, but how the resource transport within these CMNs is controlled is largely unknown. We discuss the significance of CMNs for plant communities and for the bargaining power of the fungal partner in the AM symbiosis.

  20. Inference of hierarchical regulatory network of estrogen-dependent breast cancer through ChIP-based data

    PubMed Central

    2010-01-01

    Background Global profiling of in vivo protein-DNA interactions using ChIP-based technologies has evolved rapidly in recent years. Although many genome-wide studies have identified thousands of ERα binding sites and have revealed the associated transcription factor (TF) partners, such as AP1, FOXA1 and CEBP, little is known about ERα-associated hierarchical transcriptional regulatory networks. Results In this study, we applied computational approaches to analyze three publicly available ChIP-based datasets: ChIP-seq, ChIP-PET and ChIP-chip, and to investigate the hierarchical regulatory network for ERα and ERα-partner TF regulation in estrogen-dependent breast cancer MCF7 cells. Sixteen common TFs and two common new TF partners (RORA and PITX2) were found among the ChIP-seq, ChIP-chip and ChIP-PET datasets. The regulatory networks were constructed by scanning the ChIP-peak regions with TF-specific position weight matrices (PWMs). A permutation test was performed to test the reliability of each connection of the network. We then used the DREM software to perform gene ontology function analysis on the common genes. We found that FOS, PITX2, RORA and FOXA1 were involved in the up-regulated genes. We also conducted ERα and Pol-II ChIP-seq experiments in tamoxifen-resistant MCF7 cells (denoted as MCF7-T in this study) and compared the differences between MCF7 and MCF7-T cells. The results showed very little overlap between these two cell lines in terms of targeted genes (21.2% of common genes) and targeted TFs (25% of common TFs). The significant dissimilarity may indicate totally different transcriptional regulatory mechanisms between these two cancer cell lines. Conclusions Our study uncovers new estrogen-mediated regulatory networks by mining three ChIP-based datasets in MCF7 cells and ChIP-seq data in MCF7-T cells. We compared the different ChIP-based technologies as well as different breast cancer cells. Our computational analytical approach may guide biologists to further study the underlying mechanisms in breast cancer cells or other human diseases. PMID:21167036
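
    The two network-construction steps named in the abstract, PWM scanning of ChIP-peak regions and a permutation test on each connection, can be sketched as follows; the 4-position PWM, the example peak sequence, and the shuffle-based null are simplified assumptions, not the authors' exact procedure.

```python
# Sketch of the two steps named above: scoring a ChIP-peak sequence with a
# TF position weight matrix (PWM), then assessing the best hit with a
# permutation test that shuffles the sequence. The 4x4 PWM is invented.
import random
import numpy as np

random.seed(0)
BASES = {"A": 0, "C": 1, "G": 2, "T": 3}
pwm = np.log2(np.array([[0.7, 0.1, 0.1, 0.1],    # position 1: prefers A
                        [0.1, 0.7, 0.1, 0.1],    # position 2: prefers C
                        [0.1, 0.1, 0.7, 0.1],    # position 3: prefers G
                        [0.1, 0.1, 0.1, 0.7]])   # position 4: prefers T
               / 0.25)                            # log-odds vs uniform background

def best_score(seq):
    """Best log-odds PWM score over all windows of the sequence."""
    w = pwm.shape[0]
    return max(sum(pwm[i, BASES[seq[j + i]]] for i in range(w))
               for j in range(len(seq) - w + 1))

peak = "TTACGTTTACGT"
observed = best_score(peak)
null = [best_score("".join(random.sample(peak, len(peak)))) for _ in range(1000)]
pval = sum(s >= observed for s in null) / len(null)
print(round(observed, 2), "p ~", pval)
```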

  1. A novel multi-network approach reveals tissue-specific cellular modulators of fibrosis in systemic sclerosis.

    PubMed

    Taroni, Jaclyn N; Greene, Casey S; Martyanov, Viktor; Wood, Tammara A; Christmann, Romy B; Farber, Harrison W; Lafyatis, Robert A; Denton, Christopher P; Hinchcliff, Monique E; Pioli, Patricia A; Mahoney, J Matthew; Whitfield, Michael L

    2017-03-23

    Systemic sclerosis (SSc) is a multi-organ autoimmune disease characterized by skin fibrosis. Internal organ involvement is heterogeneous. It is unknown whether disease mechanisms are common across all affected tissues or if each manifestation has a distinct underlying pathology. We used consensus clustering to compare gene expression profiles of biopsies from four SSc-affected tissues (skin, lung, esophagus, and peripheral blood) from patients with SSc, and the related conditions pulmonary fibrosis (PF) and pulmonary arterial hypertension, and derived a consensus disease-associated signature across all tissues. We used this signature to query tissue-specific functional genomic networks. We performed novel network analyses to contrast the skin and lung microenvironments and to assess the functional role of the inflammatory and fibrotic genes in each organ. Lastly, we tested the expression of macrophage activation state-associated gene sets for enrichment in skin and lung using a Wilcoxon rank sum test. We identified a common pathogenic gene expression signature, an immune-fibrotic axis, indicative of pro-fibrotic macrophages (MØs) in multiple tissues (skin, lung, esophagus, and peripheral blood mononuclear cells) affected by SSc. While the co-expression of these genes is common to all tissues, the functional consequences of this upregulation differ by organ. We used this disease-associated signature to query tissue-specific functional genomic networks to identify common and tissue-specific pathologies of SSc and related conditions. In contrast to skin, in the lung-specific functional network we identify a distinct lung-resident MØ signature associated with lipid stimulation and alternative activation. In keeping with our network results, we find distinct MØ alternative activation transcriptional programs in SSc-associated PF lung and in the skin of patients with an "inflammatory" SSc gene expression signature. Our results suggest that the innate immune system is central to SSc disease processes but that subtle distinctions exist between tissues. Our approach provides a framework for examining molecular signatures of disease in fibrosis and autoimmune diseases and for leveraging publicly available data to understand common and tissue-specific disease processes in complex human diseases.
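
    The final enrichment step is a standard Wilcoxon rank-sum comparison; a minimal sketch with invented expression values and a hypothetical gene set follows.

```python
# Sketch of the enrichment test described above: compare expression values
# of a macrophage-activation gene set against all other genes with a
# Wilcoxon rank-sum test. All expression values here are invented.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
expression = dict(zip((f"g{i}" for i in range(500)), rng.normal(0, 1, 500)))
gene_set = [f"g{i}" for i in range(25)]            # hypothetical MØ gene set
for g in gene_set:
    expression[g] += 1.0                            # simulate upregulation

in_set = [expression[g] for g in gene_set]
background = [v for g, v in expression.items() if g not in set(gene_set)]
stat, p = ranksums(in_set, background)
print(f"rank-sum statistic {stat:.2f}, p-value {p:.2e}")
```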

  2. Network cosmology.

    PubMed

    Krioukov, Dmitri; Kitsak, Maksim; Sinkovits, Robert S; Rideout, David; Meyer, David; Boguñá, Marián

    2012-01-01

    Prediction and control of the dynamics of complex networks is a central problem in network science. Structural and dynamical similarities of different real networks suggest that some universal laws might accurately describe the dynamics of these networks, albeit the nature and common origin of such laws remain elusive. Here we show that the causal network representing the large-scale structure of spacetime in our accelerating universe is a power-law graph with strong clustering, similar to many complex networks such as the Internet, social, or biological networks. We prove that this structural similarity is a consequence of the asymptotic equivalence between the large-scale growth dynamics of complex networks and causal networks. This equivalence suggests that unexpectedly similar laws govern the dynamics of complex networks and spacetime in the universe, with implications to network science and cosmology.
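
    The structural fingerprint reported above, a power-law degree distribution combined with strong clustering, is easy to reproduce on a synthetic graph; the sketch below uses the Holme-Kim model in networkx as a stand-in, not the paper's causal-set construction.

```python
# Sketch of the structural claim above: a graph combining a power-law
# degree distribution with strong clustering, generated here with the
# Holme-Kim model in networkx (a stand-in, not a causal network).
import networkx as nx

G = nx.powerlaw_cluster_graph(n=2000, m=3, p=0.5, seed=42)
print("average clustering:", round(nx.average_clustering(G), 3))
degrees = sorted((d for _, d in G.degree()), reverse=True)
print("top degrees (heavy tail):", degrees[:5])
```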

  3. Network Cosmology

    PubMed Central

    Krioukov, Dmitri; Kitsak, Maksim; Sinkovits, Robert S.; Rideout, David; Meyer, David; Boguñá, Marián

    2012-01-01

    Prediction and control of the dynamics of complex networks is a central problem in network science. Structural and dynamical similarities of different real networks suggest that some universal laws might accurately describe the dynamics of these networks, albeit the nature and common origin of such laws remain elusive. Here we show that the causal network representing the large-scale structure of spacetime in our accelerating universe is a power-law graph with strong clustering, similar to many complex networks such as the Internet, social, or biological networks. We prove that this structural similarity is a consequence of the asymptotic equivalence between the large-scale growth dynamics of complex networks and causal networks. This equivalence suggests that unexpectedly similar laws govern the dynamics of complex networks and spacetime in the universe, with implications to network science and cosmology. PMID:23162688

  4. Defining Tolerance: Impacts of Delay and Disruption when Managing Challenged Networks

    NASA Technical Reports Server (NTRS)

    Birrane, Edward J. III; Burleigh, Scott C.; Cerf, Vint

    2011-01-01

    Challenged networks exhibit irregularities in their communication performance stemming from node mobility, power constraints, and impacts from the operating environment. These irregularities manifest as high signal propagation delay and frequent link disruption. Understanding those limits of link disruption and propagation delay beyond which core networking features fail is an ongoing area of research. Various wireless networking communities propose tools and techniques that address these phenomena. Emerging standardization activities within the Internet Research Task Force (IRTF) and the Consultative Committee for Space Data Systems (CCSDS) look to build upon both this experience and scalability analysis. Successful research in this area is predicated upon identifying enablers for common communication functions (notably node discovery, duplex communication, state caching, and link negotiation) and how increased disruptions and delays affect their feasibility within the network. Networks that make fewer assumptions relating to these enablers provide more universal service. Specifically, reliance on node discovery and link negotiation results in network-specific operational concepts rather than scalable technical solutions. Fundamental to this debate are the definitions, assumptions, operational concepts, and anticipated scaling of these networks. This paper presents the commonalities and differences between delay and disruption tolerance, including support protocols and critical enablers. We present where and how these tolerances differ. We propose a set of use cases that must be accommodated by any standardized delay-tolerant network and discuss the implication of these on existing tool development.

  5. Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation

    NASA Astrophysics Data System (ADS)

    Lu, B.; Piasecki, M.

    2008-12-01

    This paper aims to present the development of an integrated hydrological model which involves functionalities of digital watershed processing, online data retrieval, hydrologic simulation and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step in developing this system, the physics-based distributed hydrologic model PIHM (Penn State Integrated Hydrologic Model) is wrapped into the OpenMI (Open Modeling Interface and Environment) environment so as to seamlessly interact with OpenMI-compliant meteorological models. The graphical user interface is being developed from the open GIS application MapWindow, which permits functionality expansion through the addition of plug-ins. Modules required to set up through the GUI workboard include those for retrieving meteorological data from existing databases or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial conditions and boundary conditions. They are connected to the OpenMI-compliant PIHM to simulate rainfall-runoff processes, and a module is included for automatically displaying output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are either stored in an observation database (OD) following the schema of the Observation Data Model (ODM) in the case of time-series support, or in a grid-based storage facility, which may be a format like netCDF or a grid-based database schema. Specific development steps include the creation of a bridge to overcome the interoperability issue between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model. This module is responsible for delineating the watershed and stream network using digital elevation models. Visualizing and editing geospatial data is achieved through MapWinGIS, an ActiveX control developed by the MapWindow team. After application to a real watershed, the performance of the model can be tested with the post-event analysis module.
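
    For the grid-based netCDF storage option mentioned above, a minimal writing sketch looks like the following; the dimension and variable names (time, lat, lon, precip) are illustrative, not a fixed schema from the paper.

```python
# Sketch of the grid-based storage option mentioned above: writing a
# retrieved meteorological field into a NetCDF file. Dimension and variable
# names (time, lat, lon, precip) are illustrative, not a fixed schema.
import numpy as np
from netCDF4 import Dataset

precip = np.random.rand(4, 10, 12).astype("f4")   # (time, lat, lon) field

with Dataset("forcing.nc", "w") as nc:
    nc.createDimension("time", None)               # unlimited time axis
    nc.createDimension("lat", 10)
    nc.createDimension("lon", 12)
    t = nc.createVariable("time", "f8", ("time",))
    t.units = "hours since 2008-01-01 00:00:00"
    t[0:4] = np.arange(4)
    p = nc.createVariable("precip", "f4", ("time", "lat", "lon"))
    p.units = "mm/h"
    p[0:4] = precip
```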

  6. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
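
    A client of the layered WMT services would follow the usual REST/JSON pattern sketched below; the host and endpoint paths are hypothetical placeholders, since the abstract only states that each layer exposes a web service API returning JSON-encoded metadata.

```python
# Sketch of a client for the layered WMT web services described above.
# The host and endpoint paths here are hypothetical placeholders; the
# abstract states only that each layer exposes a web service API returning
# JSON-encoded component and model metadata.
import json
import urllib.request

BASE = "https://example.edu/wmt"                   # placeholder host

def get_json(path):
    """Fetch and decode a JSON response from a service endpoint."""
    with urllib.request.urlopen(f"{BASE}/{path}") as resp:
        return json.load(resp)

components = get_json("components/list")           # hypothetical endpoint
for comp in components:
    # Metadata would describe exchange items, ports, and input parameters.
    print(comp.get("id"), comp.get("provides", []))
```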

  7. NASA Tech Briefs, December 2008

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Topics covered include: Crew Activity Analyzer; Distributing Data to Hand-Held Devices in a Wireless Network; Reducing Surface Clutter in Cloud Profiling Radar Data; MODIS Atmospheric Data Handler; Multibeam Altimeter Navigation Update Using Faceted Shape Model; Spaceborne Hybrid-FPGA System for Processing FTIR Data; FPGA Coprocessor for Accelerated Classification of Images; SiC JFET Transistor Circuit Model for Extreme Temperature Range; TDR Using Autocorrelation and Varying-Duration Pulses; Update on Development of SiC Multi-Chip Power Modules; Radio Ranging System for Guidance of Approaching Spacecraft; Electromagnetically Clean Solar Arrays; Improved Short-Circuit Protection for Power Cells in Series; Logic Gates Made of N-Channel JFETs and Epitaxial Resistors; Communication Limits Due to Photon-Detector Jitter; System for Removing Pollutants from Incinerator Exhaust; Sealing and External Sterilization of a Sample Container; Converting EOS Data from HDF-EOS to netCDF; HDF-EOS 2 and HDF-EOS 5 Compatibility Library; HDF-EOS Web Server; HDF-EOS 5 Validator; XML DTD and Schemas for HDF-EOS; Converting from XML to HDF-EOS; Simulating Attitudes and Trajectories of Multiple Spacecraft; Specialized Color Function for Display of Signed Data; Delivering Alert Messages to Members of a Work Force; Delivering Images for Mars Rover Science Planning; Oxide Fiber Cathode Materials for Rechargeable Lithium Cells; Electrocatalytic Reduction of Carbon Dioxide to Methane; Heterogeneous Superconducting Low-Noise Sensing Coils; Progress toward Making Epoxy/Carbon-Nanotube Composites; Predicting Properties of Unidirectional-Nanofiber Composites; Deployable Crew Quarters; Nonventing, Regenerable, Lightweight Heat Absorber; Miniature High-Force, Long-Stroke SMA Linear Actuators; "Bootstrap" Configuration for Multistage Pulse-Tube Coolers; Reducing Liquid Loss during Ullage Venting in Microgravity; Ka-Band Transponder for Deep-Space Radio Science; Replication of Space-Shuttle Computers in FPGAs and ASICs; Demisable Reaction-Wheel Assembly; Spatial and Temporal Low-Dimensional Models for Fluid Flow; Advanced Land Imager Assessment System; Range Imaging without Moving Parts.

  8. Building the Petascale National Environmental Research Interoperability Data Platform (NERDIP): Minimizing the 'Trough of Disillusionment' and Accelerating Pathways to the 'Plateau of Productivity'

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.

    2015-12-01

    The National Computational Infrastructure (NCI) at the Australian National University (ANU) has evolved to become Australia's peak computing centre for national computational and data-intensive Earth system science. More recently, NCI collocated 10 petabytes of data from 34 major national and international environmental, climate, earth system, geophysics and astronomy collections to create the National Environmental Research Interoperability Data Platform (NERDIP). Spatial scales of the collections range from global to local ultra-high resolution, whilst sizes range from 3 PB down to a few GB. The data is highly connected to both NCI HPC and cloud resources via low-latency internal networks with massive bandwidth. Now that the collections are collocated on a single data platform, the 'Hype' and expectations around potential use cases for the NERDIP are high. Not unexpectedly, issues are emerging around access, licensing, ownership, and incompatible data standards. Many communities are standardised within their domain, but achieving true interdisciplinary science will require all communities to move towards open interoperable data formats such as NetCDF4/HDF5. This transition will impact software using proprietary or non-open standards. But before we reach the 'Plateau of Productivity', there needs to be greater 'Enlightenment' of users to encourage them to realise that this unprecedented Earth system science platform provides a rich mine of opportunities for discovery and innovation for a diverse range of both domain-specific and interdisciplinary investigations, including climate and weather research, impact analysis, environment, remote sensing and geophysics. Developing new and innovative interdisciplinary use cases will guide those architecting the system, help minimise the amplitude of the 'Trough of Disillusionment', and ensure greater productivity and uptake of the collections that make NERDIP unique in the next generation of data-intensive science.

  9. The Basic Radar Altimetry Toolbox for Sentinel 3 Users

    NASA Astrophysics Data System (ADS)

    Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme

    2013-04-01

    The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales). The latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats, most commonly through the Graphical User Interface (BratGui). This GUI is a front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. The BratDisplay (graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files; it is distributed with another ESA toolbox (GUT) as the visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to SARAL and soon Sentinel-3), quick data visualization/export and simple computation on the data fields. BRAT can be used for importing data and having a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui involving combinations of data fields that the user can save for later reuse, or using the already-embedded formulas that include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, showing its applications in different fields such as oceanography, cryosphere, geodesy, and hydrology, among others. Included are also "data use cases", with step-by-step examples, on how to use the toolbox in different contexts. The upcoming release will focus on the Sentinel-3 Surface Topography Mission, which builds on the successful heritage of ERS, Envisat and CryoSat. The first of the two Sentinel-3 satellites is expected to be launched in 2014. It will carry a dual-frequency (Ku- and C-band) advanced synthetic aperture radar altimeter and will provide along-track measurements at a resolution of ~300 m in SAR mode. Sentinel-3 will provide precise measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The future version will provide, among other enhancements, support for reading the upcoming Sentinel-3 datasets and specific "use cases" for SAR altimetry in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. The BRAT software is distributed under the GNU GPL open-source license and can be obtained, along with all the documentation (including the tutorial), from the website: http://earth.esa.int/brat

  10. How risky are social networking sites? A comparison of places online where youth sexual solicitation and harassment occurs.

    PubMed

    Ybarra, Michele L; Mitchell, Kimberly J

    2008-02-01

    Recently, public attention has focused on the possibility that social networking sites such as MySpace and Facebook are being widely used to sexually solicit underage youth, consequently increasing their vulnerability to sexual victimization. Beyond anecdotal accounts, however, whether victimization is more commonly reported in social networking sites is unknown. The Growing up With Media Survey is a national cross-sectional online survey of 1588 youth. Participants were 10- to 15-year-old youth who have used the Internet at least once in the last 6 months. The main outcome measures were unwanted sexual solicitation on the Internet, defined as unwanted requests to talk about sex, provide personal sexual information, and do something sexual, and Internet harassment, defined as rude or mean comments, or spreading of rumors. Fifteen percent of all of the youth reported an unwanted sexual solicitation online in the last year; 4% reported an incident on a social networking site specifically. Thirty-three percent reported an online harassment in the last year; 9% reported an incident on a social networking site specifically. Among targeted youth, solicitations were more commonly reported via instant messaging (43%) and in chat rooms (32%), and harassment was more commonly reported in instant messaging (55%) than through social networking sites (27% and 28%, respectively). Broad claims of victimization risk, at least defined as unwanted sexual solicitation or harassment, associated with social networking sites do not seem justified. Prevention efforts may have a greater impact if they focus on the psychosocial problems of youth instead of a specific Internet application, including funding for online youth outreach programs, school antibullying programs, and online mental health services.

  11. Effective Utilization of Resources and Infrastructure for a Spaceport Network Architecture

    NASA Technical Reports Server (NTRS)

    Gill, Tracy; Larson, Wiley; Mueller, Robert; Roberson, Luke

    2012-01-01

    Providing routine, affordable access to a variety of orbital and deep space destinations requires an intricate network of ground, planetary surface, and space-based spaceports like those on Earth (land and sea), in various Earth orbits, and on other extraterrestrial surfaces. Advancements in technology and international collaboration are critical to establish a spaceport network that satisfies the requirements for private and government research, exploration, and commercial objectives. Technologies, interfaces, assembly techniques, and protocols must be adapted to enable mission critical capabilities and interoperability throughout the spaceport network. The conceptual space mission architecture must address the full range of required spaceport services, from managing propellants for a variety of spacecraft to governance structure. In order to accomplish affordability and sustainability goals, the network architecture must consider deriving propellants from in situ planetary resources to the maximum extent possible. Water on the Moon and Mars, Mars' atmospheric CO2, and O2 extracted from lunar regolith are examples of in situ resources that could be used to generate propellants for various spacecraft, orbital stages and trajectories, and the commodities to support habitation and human operations at these destinations. The ability to use in-space fuel depots containing in situ derived propellants would drastically reduce the mass required to launch long-duration or deep space missions from Earth's gravity well. Advances in transformative technologies and common capabilities, interfaces, umbilicals, commodities, protocols, and agreements will facilitate a cost-effective, safe, reliable infrastructure for a versatile network of Earth- and extraterrestrial spaceports. Defining a common infrastructure on Earth, planetary surfaces, and in space, as well as deriving propellants from in situ planetary resources to construct in-space propellant depots to serve the spaceport network, will reduce exploration costs through standardization and commonality of infrastructure and a reduction in the number and types of interfaces and commodities.

  12. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measures are quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC and computer-supported collaborative learning research. It argues that measuring…

  13. Prediction of missing common genes for disease pairs using network based module separation on incomplete human interactome.

    PubMed

    Akram, Pakeeza; Liao, Li

    2017-12-06

    Identification of common genes associated with comorbid diseases can be critical in understanding their pathobiological mechanism. This work presents a novel method to predict missing common genes associated with a disease pair. Searching for missing common genes is formulated as an optimization problem that minimizes the network-based module separation between two subgraphs produced by mapping disease-associated genes onto the interactome. Using cross-validation on more than 600 disease pairs, our method achieves a significantly higher average receiver operating characteristic (ROC) score of 0.95, compared to a baseline ROC score of 0.60 using randomized data. Predicting missing common genes aims to complete the gene set associated with a comorbid disease pair for a better understanding of biological intervention. It will also be useful for gene-targeted therapeutics related to comorbid diseases. This method can be further considered for prediction of missing edges to complete the subgraph associated with a disease pair.
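
    One commonly used form of network-based module separation is s_AB = <d_AB> - (<d_AA> + <d_BB>)/2, where <d_XY> averages each gene's shortest-path distance to the nearest gene of the other set; the sketch below computes it on a toy interactome, though the paper's exact objective may differ.

```python
# Sketch of a network-based module separation measure of the kind the
# abstract optimizes, following the commonly used form
# s_AB = <d_AB> - (<d_AA> + <d_BB>) / 2. The interactome stand-in and the
# two "disease gene" sets are toy examples.
import networkx as nx

G = nx.karate_club_graph()                  # stand-in for the interactome
A, B = {0, 1, 2, 3}, {33, 32, 30, 8}        # toy disease gene sets

def d_between(A, B):
    """Mean distance of each gene to the nearest gene of the other set."""
    dists = [min(nx.shortest_path_length(G, a, b) for b in B) for a in A]
    dists += [min(nx.shortest_path_length(G, b, a) for a in A) for b in B]
    return sum(dists) / len(dists)

def d_within(S):
    """Mean distance of each gene to its nearest same-set neighbor."""
    return sum(min(nx.shortest_path_length(G, s, t) for t in S if t != s)
               for s in S) / len(S)

s_ab = d_between(A, B) - (d_within(A) + d_within(B)) / 2
print("module separation s_AB =", round(s_ab, 3))
```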

  14. Network for minimizing current imbalances in a faradaic battery

    DOEpatents

    Wozniak, Walter; Haskins, Harold J.

    1994-01-01

    A circuit for connecting a faradaic battery with circuitry for monitoring the condition of the battery includes a plurality of voltage divider networks providing battery voltage monitoring nodes and includes compensating resistors connected with the networks to maintain uniform discharge currents through the cells of the battery. The circuit also provides a reduced common mode voltage requirement for the monitoring circuitry by referencing the divider networks to one-half the battery voltage.

  15. Connecting Land-Based Networks to Ships

    DTIC Science & Technology

    2012-09-01

    LAN Local Area Network; LEO Low Earth Orbit; LOS Line Of Sight; MAC Media Access Control; MANET Mobile Ad Hoc Networking; ME Mobile...orbit – LEO). The Iridium satellite communication system is an example of LEO, while Inmarsat uses MEO and GEO. GEO satellites are the most common type...August 2012, http://www.cosmote.gr/cosmoportal/page/T25/section/Cover/loc/en_U.S. [41] WIND, "Network Coverage map," August 2012, http

  16. Towards Trust-based Cognitive Networks: A Survey of Trust Management for Mobile Ad Hoc Networks

    DTIC Science & Technology

    2009-06-01

    of trust. First, social trust refers to properties derived from social relationships. Examples of social networks are strong social ... relationships such as colleagues or relatives or loose social relationships such as school alumni or friends with common interests [44]. Social trust may...also use social relationships in evaluating the trust metric among group members by employing the concept of social networks. Yu et al. [44] define

  17. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

    One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information from tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel-beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF-format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine, available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from: http://cars.uchicago.edu/software/idl/tomography.html http://sites.google.com/site/voltools/
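
    Because '.volume' files are plain NetCDF, they can also be inspected outside IDL; a Python sketch follows, where the file name and the variable name "VOLUME" are assumptions to be checked against the file's actual variable list.

```python
# Sketch of reading a reconstructed 16-bit '.volume' file (NetCDF format)
# into a NumPy array for custom processing. The file name and the variable
# name "VOLUME" are assumptions; inspect nc.variables to find the real 3-D array.
import numpy as np
from netCDF4 import Dataset

with Dataset("sample.volume") as nc:               # hypothetical file name
    print("variables:", list(nc.variables))
    vol = np.asarray(nc.variables["VOLUME"][:], dtype=np.int16)

print(vol.shape, vol.min(), vol.max())
# An intensity histogram, as provided by 'vol_tools':
hist, edges = np.histogram(vol, bins=256)
```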

  18. GIS Services, Visualization Products, and Interoperability at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.

    2007-12-01

    The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as Model, Satellite and Radar. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be OGC-compliant specifications) also provide access.
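
    The THREDDS subsetting workflow described above maps naturally onto an OPeNDAP client; in the Python sketch below the URL, variable, and coordinate names are hypothetical placeholders.

```python
# Sketch of the THREDDS subsetting workflow described above: open a dataset
# over OPeNDAP and subset by region, time, and variable before download.
# The URL and the variable/coordinate names are hypothetical placeholders.
import xarray as xr

url = "https://example.gov/thredds/dodsC/model/analysis.nc"  # placeholder
ds = xr.open_dataset(url)                  # lazy: no data pulled yet
subset = ds["air_temperature"].sel(
    lat=slice(25, 50), lon=slice(-125, -65),
    time=slice("2007-01-01", "2007-01-31"))
subset.to_netcdf("conus_jan2007.nc")       # fetches only the subset
```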

  19. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real-time is a task that pushes the boundaries of computing hardware and software. But integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real-time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and THREDDS Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.

  20. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning-based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (e.g., using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dimensional array libraries (Scala Breeze, Java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems. The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
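
    The parallel-ingest pattern described above, partitioning a time-sorted URL list so each node pulls its own arrays, can be sketched in pyspark as follows; the file paths and the variable name "precip" are illustrative assumptions, not SciSpark's sRDD implementation.

```python
# Sketch of the parallel-ingest pattern described above using pyspark:
# a time-sorted list of file paths is partitioned across the cluster, each
# worker loads its arrays, and simple statistics are reduced back. The
# paths and the variable name "precip" are illustrative assumptions.
from pyspark import SparkContext
import numpy as np
from netCDF4 import Dataset

sc = SparkContext(appName="sciSparkSketch")
urls = [f"/data/model_{t:03d}.nc" for t in range(100)]  # hypothetical files

def load_mean(path):
    """Load one file's variable node-locally and return (mean, count)."""
    with Dataset(path) as nc:
        arr = np.asarray(nc.variables["precip"][:])
    return (arr.mean(), 1)

# Partition the URL list, load arrays in parallel, reduce to a global mean.
total, count = (sc.parallelize(urls, numSlices=10)
                  .map(load_mean)
                  .reduce(lambda a, b: (a[0] + b[0], a[1] + b[1])))
print("global mean of per-file means:", total / count)
sc.stop()
```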

  1. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research

    PubMed Central

    Yeung, Ka Yee

    2016-01-01

    Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface. PMID:27045593

  2. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    PubMed

    Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2016-01-01

    Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.

  3. Physical Heterogeneity and Aquatic Community Function in River Networks

    EPA Science Inventory

    The geomorphological character of a river network provides the template upon which evolution acts to create unique biological communities. Deciphering commonly observed patterns and processes within riverine landscapes resulting from the interplay between physical and biological...

  4. A physical sciences network characterization of non-tumorigenic and metastatic cells

    PubMed Central

    Agus, David B.; Alexander, Jenolyn F.; Arap, Wadih; Ashili, Shashanka; Aslan, Joseph E.; Austin, Robert H.; Backman, Vadim; Bethel, Kelly J.; Bonneau, Richard; Chen, Wei-Chiang; Chen-Tanyolac, Chira; Choi, Nathan C.; Curley, Steven A.; Dallas, Matthew; Damania, Dhwanil; Davies, Paul C. W.; Decuzzi, Paolo; Dickinson, Laura; Estevez-Salmeron, Luis; Estrella, Veronica; Ferrari, Mauro; Fischbach, Claudia; Foo, Jasmine; Fraley, Stephanie I.; Frantz, Christian; Fuhrmann, Alexander; Gascard, Philippe; Gatenby, Robert A.; Geng, Yue; Gerecht, Sharon; Gillies, Robert J.; Godin, Biana; Grady, William M.; Greenfield, Alex; Hemphill, Courtney; Hempstead, Barbara L.; Hielscher, Abigail; Hillis, W. Daniel; Holland, Eric C.; Ibrahim-Hashim, Arig; Jacks, Tyler; Johnson, Roger H.; Joo, Ahyoung; Katz, Jonathan E.; Kelbauskas, Laimonas; Kesselman, Carl; King, Michael R.; Konstantopoulos, Konstantinos; Kraning-Rush, Casey M.; Kuhn, Peter; Kung, Kevin; Kwee, Brian; Lakins, Johnathon N.; Lambert, Guillaume; Liao, David; Licht, Jonathan D.; Liphardt, Jan T.; Liu, Liyu; Lloyd, Mark C.; Lyubimova, Anna; Mallick, Parag; Marko, John; McCarty, Owen J. T.; Meldrum, Deirdre R.; Michor, Franziska; Mumenthaler, Shannon M.; Nandakumar, Vivek; O’Halloran, Thomas V.; Oh, Steve; Pasqualini, Renata; Paszek, Matthew J.; Philips, Kevin G.; Poultney, Christopher S.; Rana, Kuldeepsinh; Reinhart-King, Cynthia A.; Ros, Robert; Semenza, Gregg L.; Senechal, Patti; Shuler, Michael L.; Srinivasan, Srimeenakshi; Staunton, Jack R.; Stypula, Yolanda; Subramanian, Hariharan; Tlsty, Thea D.; Tormoen, Garth W.; Tseng, Yiider; van Oudenaarden, Alexander; Verbridge, Scott S.; Wan, Jenny C.; Weaver, Valerie M.; Widom, Jonathan; Will, Christine; Wirtz, Denis; Wojtkowiak, Jonathan; Wu, Pei-Hsun

    2013-01-01

    To investigate the transition from non-cancerous to metastatic from a physical sciences perspective, the Physical Sciences–Oncology Centers (PS-OC) Network performed molecular and biophysical comparative studies of the non-tumorigenic MCF-10A and metastatic MDA-MB-231 breast epithelial cell lines, commonly used as models of cancer metastasis. Experiments were performed in 20 laboratories from 12 PS-OCs. Each laboratory was supplied with identical aliquots and common reagents and culture protocols. Analyses of these measurements revealed dramatic differences in their mechanics, migration, adhesion, oxygen response, and proteomic profiles. Model-based multi-omics approaches identified key differences between these cells' regulatory networks involved in morphology and survival. These results provide a multifaceted description of cellular parameters of two widely used cell lines and demonstrate the value of the PS-OC Network approach for integration of diverse experimental observations to elucidate the phenotypes associated with cancer metastasis. PMID:23618955

  5. A physical sciences network characterization of non-tumorigenic and metastatic cells.

    PubMed

    Agus, David B; Alexander, Jenolyn F; Arap, Wadih; Ashili, Shashanka; Aslan, Joseph E; Austin, Robert H; Backman, Vadim; Bethel, Kelly J; Bonneau, Richard; Chen, Wei-Chiang; Chen-Tanyolac, Chira; Choi, Nathan C; Curley, Steven A; Dallas, Matthew; Damania, Dhwanil; Davies, Paul C W; Decuzzi, Paolo; Dickinson, Laura; Estevez-Salmeron, Luis; Estrella, Veronica; Ferrari, Mauro; Fischbach, Claudia; Foo, Jasmine; Fraley, Stephanie I; Frantz, Christian; Fuhrmann, Alexander; Gascard, Philippe; Gatenby, Robert A; Geng, Yue; Gerecht, Sharon; Gillies, Robert J; Godin, Biana; Grady, William M; Greenfield, Alex; Hemphill, Courtney; Hempstead, Barbara L; Hielscher, Abigail; Hillis, W Daniel; Holland, Eric C; Ibrahim-Hashim, Arig; Jacks, Tyler; Johnson, Roger H; Joo, Ahyoung; Katz, Jonathan E; Kelbauskas, Laimonas; Kesselman, Carl; King, Michael R; Konstantopoulos, Konstantinos; Kraning-Rush, Casey M; Kuhn, Peter; Kung, Kevin; Kwee, Brian; Lakins, Johnathon N; Lambert, Guillaume; Liao, David; Licht, Jonathan D; Liphardt, Jan T; Liu, Liyu; Lloyd, Mark C; Lyubimova, Anna; Mallick, Parag; Marko, John; McCarty, Owen J T; Meldrum, Deirdre R; Michor, Franziska; Mumenthaler, Shannon M; Nandakumar, Vivek; O'Halloran, Thomas V; Oh, Steve; Pasqualini, Renata; Paszek, Matthew J; Philips, Kevin G; Poultney, Christopher S; Rana, Kuldeepsinh; Reinhart-King, Cynthia A; Ros, Robert; Semenza, Gregg L; Senechal, Patti; Shuler, Michael L; Srinivasan, Srimeenakshi; Staunton, Jack R; Stypula, Yolanda; Subramanian, Hariharan; Tlsty, Thea D; Tormoen, Garth W; Tseng, Yiider; van Oudenaarden, Alexander; Verbridge, Scott S; Wan, Jenny C; Weaver, Valerie M; Widom, Jonathan; Will, Christine; Wirtz, Denis; Wojtkowiak, Jonathan; Wu, Pei-Hsun

    2013-01-01

    To investigate the transition from non-cancerous to metastatic from a physical sciences perspective, the Physical Sciences-Oncology Centers (PS-OC) Network performed molecular and biophysical comparative studies of the non-tumorigenic MCF-10A and metastatic MDA-MB-231 breast epithelial cell lines, commonly used as models of cancer metastasis. Experiments were performed in 20 laboratories from 12 PS-OCs. Each laboratory was supplied with identical aliquots and common reagents and culture protocols. Analyses of these measurements revealed dramatic differences in their mechanics, migration, adhesion, oxygen response, and proteomic profiles. Model-based multi-omics approaches identified key differences between these cells' regulatory networks involved in morphology and survival. These results provide a multifaceted description of cellular parameters of two widely used cell lines and demonstrate the value of the PS-OC Network approach for integration of diverse experimental observations to elucidate the phenotypes associated with cancer metastasis.

  6. Enhanced Detectability of Community Structure in Multilayer Networks through Layer Aggregation.

    PubMed

    Taylor, Dane; Shai, Saray; Stanley, Natalie; Mucha, Peter J

    2016-06-03

    Many systems are naturally represented by a multilayer network in which edges exist in multiple layers that encode different, but potentially related, types of interactions, and it is important to understand limitations on the detectability of community structure in these networks. Using random matrix theory, we analyze detectability limitations for multilayer (specifically, multiplex) stochastic block models (SBMs) in which L layers are derived from a common SBM. We study the effect of layer aggregation on detectability for several aggregation methods, including summation of the layers' adjacency matrices, for which we show the detectability limit vanishes as O(L^{-1/2}) with an increasing number of layers, L. Importantly, we find a similar scaling behavior when the summation is thresholded at an optimal value, providing insight into the common, but not well understood, practice of thresholding pairwise-interaction data to obtain sparse network representations.
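
    The aggregation schemes above are easy to experiment with numerically. The following Python sketch is a minimal illustration with made-up block sizes, edge probabilities, and threshold (none taken from the paper): it draws L layers from one common two-block SBM and forms both the summed and the thresholded aggregate adjacency matrices.

      import numpy as np

      def sample_sbm(block_sizes, p_in, p_out, rng):
          # Sample one undirected adjacency matrix from a planted-partition SBM.
          labels = np.repeat(np.arange(len(block_sizes)), block_sizes)
          probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
          n = labels.size
          upper = np.triu(rng.random((n, n)) < probs, k=1)
          return (upper | upper.T).astype(int)

      rng = np.random.default_rng(0)
      L = 16  # number of layers derived from the common SBM
      layers = [sample_sbm([50, 50], 0.10, 0.05, rng) for _ in range(L)]

      summed = np.sum(layers, axis=0)          # summation of the layers' adjacency matrices
      thresholded = (summed >= 2).astype(int)  # thresholding the summed multiplex at a cutoff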

  7. The effect of social networks and social support on common mental disorders following specific life events.

    PubMed

    Maulik, P K; Eaton, W W; Bradshaw, C P

    2010-08-01

    This study examined the association between life events and common mental disorders while accounting for social networks and social supports. Participants included 1920 adults in the Baltimore Epidemiologic Catchment Area Cohort who were interviewed in 1993-1996, of whom 1071 were re-interviewed in 2004-2005. Generalized estimating equations were used to analyze the data. Social support from friends, spouse or relatives was associated with significantly reduced odds of panic disorder and psychological distress, after experiencing specific life events. Social networks or social support had no significant stress-buffering effect. Social networks and social support had almost no direct or buffering effect on major depressive disorder, and no effect on generalized anxiety disorder and alcohol abuse or dependence disorder. The significant association between social support and psychological distress, rather than diagnosable mental disorders, highlights the importance of social support, especially when the severity of a mental health related problem is low.

  8. Distributed task coding throughout the multiple demand network of the human frontal-insular cortex.

    PubMed

    Stiers, Peter; Mennes, Maarten; Sunaert, Stefan

    2010-08-01

    The large variety of tasks that humans can perform is governed by a small number of key frontal-insular regions that are commonly active during task performance. Little is known about how this network distinguishes different tasks. We report on fMRI data in twelve participants while they performed four cognitive tasks. Of 20 commonly active frontal-insular regions in each hemisphere, five showed a BOLD response increase with increased task demands, regardless of the task. Although active in all tasks, each task invoked a unique response pattern across the voxels in each area that proved reliable in split-half multi-voxel correlation analysis. Consequently, voxels differed in their preference for one or more of the tasks. Voxel-based functional connectivity analyses revealed that same preference voxels distributed across all areas of the network constituted functional sub-networks that characterized the task being executed. Copyright 2010 Elsevier Inc. All rights reserved.

  9. Digital photocontrol of the network of live excitable cells

    NASA Astrophysics Data System (ADS)

    Erofeev, I. S.; Magome, N.; Agladze, K. I.

    2011-11-01

    Recent developments in tissue engineering techniques allow one to create and maintain, almost indefinitely, networks of excitable cells with a desired architecture. We coupled a network of live excitable cardiac cells with a common computer by sensitizing them to light, projecting a light pattern on the layer of cells, and monitoring excitation with the aid of fluorescent probes (optical mapping). As the sensitizing substance we used azobenzene trimethylammonium bromide (AzoTAB). This substance undergoes cis-trans photoisomerization; the trans-isomer of AzoTAB inhibits excitation in the cardiac cells, while the cis-isomer does not. AzoTAB-mediated sensitization thus allows reversible and dynamic control of the excitation waves through the entire cardiomyocyte network, either uniformly or in a preferred spatial pattern. Technically, this was achieved by coupling a common digital projector with a macroview microscope and using computer graphics software to create the projected pattern of conducting pathways. This approach allows real-time interactive photocontrol of heart tissue.

  10. A physical sciences network characterization of non-tumorigenic and metastatic cells

    NASA Astrophysics Data System (ADS)

    Physical Sciences-Oncology Centers Network; Agus, David B.; Alexander, Jenolyn F.; Arap, Wadih; Ashili, Shashanka; Aslan, Joseph E.; Austin, Robert H.; Backman, Vadim; Bethel, Kelly J.; Bonneau, Richard; Chen, Wei-Chiang; Chen-Tanyolac, Chira; Choi, Nathan C.; Curley, Steven A.; Dallas, Matthew; Damania, Dhwanil; Davies, Paul C. W.; Decuzzi, Paolo; Dickinson, Laura; Estevez-Salmeron, Luis; Estrella, Veronica; Ferrari, Mauro; Fischbach, Claudia; Foo, Jasmine; Fraley, Stephanie I.; Frantz, Christian; Fuhrmann, Alexander; Gascard, Philippe; Gatenby, Robert A.; Geng, Yue; Gerecht, Sharon; Gillies, Robert J.; Godin, Biana; Grady, William M.; Greenfield, Alex; Hemphill, Courtney; Hempstead, Barbara L.; Hielscher, Abigail; Hillis, W. Daniel; Holland, Eric C.; Ibrahim-Hashim, Arig; Jacks, Tyler; Johnson, Roger H.; Joo, Ahyoung; Katz, Jonathan E.; Kelbauskas, Laimonas; Kesselman, Carl; King, Michael R.; Konstantopoulos, Konstantinos; Kraning-Rush, Casey M.; Kuhn, Peter; Kung, Kevin; Kwee, Brian; Lakins, Johnathon N.; Lambert, Guillaume; Liao, David; Licht, Jonathan D.; Liphardt, Jan T.; Liu, Liyu; Lloyd, Mark C.; Lyubimova, Anna; Mallick, Parag; Marko, John; McCarty, Owen J. T.; Meldrum, Deirdre R.; Michor, Franziska; Mumenthaler, Shannon M.; Nandakumar, Vivek; O'Halloran, Thomas V.; Oh, Steve; Pasqualini, Renata; Paszek, Matthew J.; Philips, Kevin G.; Poultney, Christopher S.; Rana, Kuldeepsinh; Reinhart-King, Cynthia A.; Ros, Robert; Semenza, Gregg L.; Senechal, Patti; Shuler, Michael L.; Srinivasan, Srimeenakshi; Staunton, Jack R.; Stypula, Yolanda; Subramanian, Hariharan; Tlsty, Thea D.; Tormoen, Garth W.; Tseng, Yiider; van Oudenaarden, Alexander; Verbridge, Scott S.; Wan, Jenny C.; Weaver, Valerie M.; Widom, Jonathan; Will, Christine; Wirtz, Denis; Wojtkowiak, Jonathan; Wu, Pei-Hsun

    2013-04-01

    To investigate the transition from non-cancerous to metastatic from a physical sciences perspective, the Physical Sciences-Oncology Centers (PS-OC) Network performed molecular and biophysical comparative studies of the non-tumorigenic MCF-10A and metastatic MDA-MB-231 breast epithelial cell lines, commonly used as models of cancer metastasis. Experiments were performed in 20 laboratories from 12 PS-OCs. Each laboratory was supplied with identical aliquots and common reagents and culture protocols. Analyses of these measurements revealed dramatic differences in their mechanics, migration, adhesion, oxygen response, and proteomic profiles. Model-based multi-omics approaches identified key differences between these cells' regulatory networks involved in morphology and survival. These results provide a multifaceted description of cellular parameters of two widely used cell lines and demonstrate the value of the PS-OC Network approach for integration of diverse experimental observations to elucidate the phenotypes associated with cancer metastasis.

  11. A strategic outlook for coordination of ground-based measurement networks of atmospheric state variables and atmospheric composition

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Thorne, P.; Braathen, G.; De Maziere, M.; Thompson, A. M.; Kurylo, M. J., III

    2016-12-01

    There are a number of ground-based global observing networks that collectively aim to make key measurements of atmospheric state variables and atmospheric chemical composition. These networks include, but are not limited to:

        NDACC: Network for the Detection of Atmospheric Composition Change
        GUAN: GCOS Upper Air Network
        GRUAN: GCOS Reference Upper Air Network
        EARLINET: the European Aerosol Research Lidar Network
        GAW: Global Atmosphere Watch
        SHADOZ: Southern Hemisphere ADditional OZonesondes
        TCCON: Total Carbon Column Observing Network
        BSRN: Baseline Surface Radiation Network

    While each network brings unique capabilities to the global observing system, there are many instances where the activities and capabilities of the networks overlap. These commonalities across multiple networks can confound funding agencies when allocating scarce financial resources. Overlaps between networks may also result in some duplication of effort and a resultant sub-optimal use of funding resources for the global observing system. While some degree of overlap is useful for quality assurance, it is essential to identify the degree to which one network can take on a specific responsibility on behalf of all other networks to avoid unnecessary duplication, to identify where expertise in any one network may serve other networks, and to develop a long-term strategy for the evolution of these networks that clarifies to funding agencies where new investment is required. This presentation will briefly summarise the key characteristics of each network listed above, adopt a matrix approach to identify commonalities and, in particular, where there may be a danger of duplication of effort, and where gaps between the networks may be compromising the services that these networks are expected to collectively deliver to the global atmospheric and climate science research communities. The presentation will also examine where sharing of data and tools between networks may result in a more efficient delivery of records of essential climate variables to the global research community. There are aspects of underpinning research that are needed across all of these networks, such as laboratory spectroscopy, that often do not receive the attention they deserve. The presentation will also seek to identify where that underpinning research is lacking.

  12. Revisiting the variation of clustering coefficient of biological networks suggests new modular structure.

    PubMed

    Hao, Dapeng; Ren, Cong; Li, Chuanxing

    2012-05-01

    A central idea in biology is the hierarchical organization of cellular processes. A commonly used method to identify the hierarchical modular organization of a network relies on detecting a global signature known as the variation of the clustering coefficient (so-called modularity scaling). Although several studies have suggested other possible origins of this signature, it is still widely used to identify hierarchical modularity, especially in the analysis of biological networks. Therefore, a further and systematic investigation of this signature for different types of biological networks is necessary. We analyzed a variety of biological networks and found that the commonly used signature of hierarchical modularity is actually a reflection of spoke-like topology, suggesting a different view of network architecture. We proved that the existence of super-hubs is the reason why the clustering coefficient of a node follows a particular scaling law with degree k in metabolic networks. To study the modularity of biological networks, we systematically investigated the relationship between repulsion of hubs and variation of the clustering coefficient. We provided direct evidence that repulsion between hubs is the underlying origin of the variation of the clustering coefficient, and found that for biological networks having no anti-correlation between hubs, such as gene co-expression networks, the clustering coefficient shows no dependence on degree. Here we have shown that the variation of the clustering coefficient is neither sufficient nor exclusive evidence for a network to be hierarchical. Our results suggest the existence of spoke-like modules, as opposed to the "deterministic model" of hierarchical modularity, and suggest the need to reconsider the organizational principle of biological hierarchy.
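
    Since the signature in question is simply the dependence of the mean clustering coefficient on degree, it can be inspected directly. The sketch below is a minimal Python illustration assuming networkx and an illustrative scale-free graph rather than a real metabolic or co-expression network.

      import networkx as nx
      from collections import defaultdict

      # Illustrative scale-free graph; in practice this would be a metabolic
      # or co-expression network loaded from data.
      G = nx.barabasi_albert_graph(2000, 3, seed=1)

      by_degree = defaultdict(list)
      for node, c in nx.clustering(G).items():
          by_degree[G.degree(node)].append(c)

      # Mean clustering coefficient per degree class; a systematic decrease
      # of C(k) with k is the "modularity scaling" signature discussed above.
      ck = {k: sum(v) / len(v) for k, v in sorted(by_degree.items())}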

  13. NMESys: An expert system for network fault detection

    NASA Technical Reports Server (NTRS)

    Nelson, Peter C.; Warpinski, Janet

    1991-01-01

    The problem of network management is becoming an increasingly difficult and challenging task. It is very common today to find heterogeneous networks consisting of many different types of computers, operating systems, and protocols. The complexity of implementing a network with this many components is difficult enough, while the maintenance of such a network is an even larger problem. We describe a prototype network management expert system, NMESys, implemented in the C Language Integrated Production System (CLIPS). NMESys concentrates on solving some of the critical problems encountered in managing a large network. The major goal of NMESys is to provide a network operator with an expert system tool to quickly and accurately detect hard and potential failures and to minimize or eliminate user downtime in a large network.

  14. Arbuscular-Mycorrhizal Networks Inhibit Eucalyptus tetrodonta Seedlings in Rain Forest Soil Microcosms

    PubMed Central

    Janos, David P.; Scott, John; Aristizábal, Catalina; Bowman, David M. J. S.

    2013-01-01

    Eucalyptus tetrodonta, a co-dominant tree species of tropical, northern Australian savannas, does not invade adjacent monsoon rain forest unless the forest is burnt intensely. Such facilitation by fire of seedling establishment is known as the "ashbed effect." Because the ashbed effect might involve disruption of common mycorrhizal networks, we hypothesized that in the absence of fire, intact rain forest arbuscular mycorrhizal (AM) networks inhibit E. tetrodonta seedlings. Although arbuscular mycorrhizas predominate in the rain forest, common tree species of the northern Australian savannas (including adult E. tetrodonta) host ectomycorrhizas. To test our hypothesis, we grew E. tetrodonta and Ceiba pentandra (an AM-responsive species used to confirm treatments) separately in microcosms of ambient or methyl-bromide fumigated rain forest soil with or without severing potential mycorrhizal fungus connections to an AM nurse plant, Litsea glutinosa. As expected, C. pentandra formed mycorrhizas in all treatments but had the most root colonization and grew fastest in ambient soil. E. tetrodonta seedlings also formed AM in all treatments, but severing hyphae in fumigated soil produced the least colonization and the best growth. Three of ten E. tetrodonta seedlings in ambient soil with intact network hyphae died. Because foliar chlorosis was symptomatic of iron deficiency, after 130 days we began to fertilize half the E. tetrodonta seedlings in ambient soil with an iron solution. Iron fertilization completely remedied chlorosis and stimulated leaf growth. Our microcosm results suggest that in intact rain forest, common AM networks mediate belowground competition and AM fungi may exacerbate iron deficiency, thereby enhancing resistance to E. tetrodonta invasion. Common AM networks–previously unrecognized as contributors to the ashbed effect–probably help to maintain the rain forest–savanna boundary. PMID:23460899

  15. Studying emotion theories through connectivity analysis: Evidence from generalized psychophysiological interactions and graph theory.

    PubMed

    Huang, Yun-An; Jastorff, Jan; Van den Stock, Jan; Van de Vliet, Laura; Dupont, Patrick; Vandenbulcke, Mathieu

    2018-05-15

    Psychological construction models of emotion state that emotions are variable concepts constructed by fundamental psychological processes, whereas according to basic emotion theory, emotions cannot be divided into more fundamental units and each basic emotion is represented by a unique and innate neural circuitry. In a previous study, we found evidence for the psychological construction account by showing that several brain regions were commonly activated when perceiving different emotions (i.e. a general emotion network). Moreover, this set of brain regions included areas associated with core affect, conceptualization and executive control, as predicted by psychological construction models. Here we investigate directed functional brain connectivity in the same dataset to address two questions: 1) is there a common pathway within the general emotion network for the perception of different emotions and 2) if so, does this common pathway contain information to distinguish between different emotions? We used generalized psychophysiological interactions and information flow indices to examine the connectivity within the general emotion network. The results revealed a general emotion pathway that connects neural nodes involved in core affect, conceptualization, language and executive control. Perception of different emotions could not be accurately classified based on the connectivity patterns from the nodes of the general emotion pathway. Successful classification was achieved when connections outside the general emotion pathway were included. We propose that the general emotion pathway functions as a common pathway within the general emotion network and is involved in shared basic psychological processes across emotions. However, additional connections within the general emotion network are required to classify different emotions, consistent with a constructionist account. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. The Genomic Data Commons Launches

    Cancer.gov

    The NCI Genomic Data Commons is a next generation knowledge network that enables the access, analysis, and submission of cancer genomic data. The GDC facilitates data sharing and promotes precision medicine in oncology.

  17. Competing edge networks

    NASA Astrophysics Data System (ADS)

    Parsons, Mark; Grindrod, Peter

    2012-06-01

    We introduce a model for a pair of nonlinear evolving networks, defined over a common set of vertices, subject to edgewise competition. Each network may grow new edges spontaneously or through triad closure. Both networks inhibit the other's growth and encourage the other's demise. These nonlinear stochastic competition equations yield to a mean field analysis resulting in a nonlinear deterministic system. There may be multiple equilibria; and bifurcations of different types are shown to occur within a reduced parameter space. This situation models competitive communication networks such as BlackBerry Messenger displacing SMS; or instant messaging displacing emails.
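
    As a toy illustration of the mean-field picture, the sketch below integrates a hypothetical two-variable competition system for the edge densities of the two networks; the equations and parameters are illustrative stand-ins, not the deterministic system derived in the paper.

      def step(x, y, dt, gx, gy, inhibit):
          # One Euler step of a hypothetical mean-field competition between
          # the edge densities x and y of the two networks.
          dx = gx * x * (1.0 - x) - inhibit * x * y
          dy = gy * y * (1.0 - y) - inhibit * x * y
          return x + dt * dx, y + dt * dy

      x, y = 0.4, 0.5
      for _ in range(100_000):
          x, y = step(x, y, 1e-3, 1.0, 1.2, 2.0)
      print(x, y)  # different parameters or initial states select different equilibria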

  18. A security architecture for health information networks.

    PubMed

    Kailar, Rajashekar; Muralidhar, Vinod

    2007-10-11

    Health information network security needs to balance exacting security controls with practicality and ease of implementation in today's healthcare enterprise. Recent work on 'nationwide health information network' architectures has sought to share highly confidential data over insecure networks such as the Internet. Using basic patterns of health network data flow and trust models to support secure communication between network nodes, we abstract network security requirements to a core set to enable secure inter-network data sharing. We propose a minimum set of security controls that can be implemented without needing major new technologies, yet still realize the network security and privacy goals of confidentiality, integrity and availability. This framework combines a set of technology mechanisms with environmental controls, and is shown to be sufficient to counter commonly encountered network security threats adequately.

  19. The Private Lives of Minerals: Social Network Analysis Applied to Mineralogy and Petrology

    NASA Astrophysics Data System (ADS)

    Hazen, R. M.; Morrison, S. M.; Fox, P. A.; Golden, J. J.; Downs, R. T.; Eleish, A.; Prabhu, A.; Li, C.; Liu, C.

    2016-12-01

    Comprehensive databases of mineral species (rruff.info/ima) and their geographic localities and co-existing mineral assemblages (mindat.org) reveal patterns of mineral association and distribution that mimic social networks, as commonly applied to such varied topics as social media interactions, the spread of disease, terrorism networks, and research collaborations. Applying social network analysis (SNA) to common assemblages of rock-forming igneous and regional metamorphic mineral species, we find patterns of cohesion, segregation, density, and cliques that are similar to those of human social networks. These patterns highlight classic trends in lithologic evolution and are illustrated with sociograms, in which mineral species are the "nodes" and co-existing species form "links." Filters based on chemistry, age, structural group, and other parameters highlight visually both familiar and new aspects of mineralogy and petrology. We quantify sociograms with SNA metrics, including connectivity (based on the frequency of co-occurrence of mineral pairs), homophily (the extent to which co-existing mineral species share compositional and other characteristics), network closure (based on the degree of network interconnectivity), and segmentation (as revealed by isolated "cliques" of mineral species). Exploitation of large and growing mineral data resources with SNA offers promising avenues for discovering previously hidden trends in mineral diversity-distribution systematics, as well as providing new pedagogical approaches to teaching mineralogy and petrology.
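
    The underlying construction is straightforward to reproduce. The sketch below, using networkx and hypothetical locality records standing in for mindat.org data, builds a co-occurrence network in which mineral species are nodes and edge weights count the localities where two species co-exist.

      import networkx as nx
      from itertools import combinations

      # Hypothetical locality -> mineral assemblage records.
      assemblages = [
          {"quartz", "albite", "muscovite"},
          {"quartz", "muscovite", "biotite"},
          {"quartz", "albite", "biotite", "garnet"},
      ]

      G = nx.Graph()
      for minerals in assemblages:
          for a, b in combinations(sorted(minerals), 2):
              # Edge weight counts localities hosting both species, i.e. the
              # frequency-of-co-occurrence connectivity described above.
              w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
              G.add_edge(a, b, weight=w + 1)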

  20. Network-Based Comparative Analysis of Arabidopsis Immune Responses to Golovinomyces orontii and Botrytis cinerea Infections.

    PubMed

    Jiang, Zhenhong; Dong, Xiaobao; Zhang, Ziding

    2016-01-11

    A comprehensive exploration of common and specific plant responses to biotrophs and necrotrophs is necessary for a better understanding of plant immunity. Here, we compared the Arabidopsis defense responses evoked by the biotrophic fungus Golovinomyces orontii and the necrotrophic fungus Botrytis cinerea through integrative network analysis. Two time-course transcriptional datasets were integrated with an Arabidopsis protein-protein interaction (PPI) network to construct a G. orontii conditional PPI sub-network (gCPIN) and a B. cinerea conditional PPI sub-network (bCPIN). We found that hubs in gCPIN and bCPIN played important roles in disease resistance. Hubs in bCPIN evolved faster than hubs in gCPIN, indicating the different selection pressures imposed on plants by different pathogens. By analyzing the common network from gCPIN and bCPIN, we identified two network components in which the genes were heavily involved in defense and development, respectively. The co-expression relationships between interacting proteins connecting the two components were different under G. orontii and B. cinerea infection conditions. Closer inspection revealed that auxin-related genes were overrepresented in the interactions connecting these two components, suggesting a critical role of auxin signaling in regulating the different co-expression relationships. Our work may provide new insights into plant defense responses against pathogens with different lifestyles.

  1. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    NASA Astrophysics Data System (ADS)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  2. Regional Educational Laboratory Electronic Network Phase 2 System

    NASA Technical Reports Server (NTRS)

    Cradler, John

    1995-01-01

    The Far West Laboratory, in collaboration with the other regional educational laboratories, is establishing a regionally coordinated telecommunication network to electronically interconnect each of the ten regional laboratories with educators and education stakeholders from the school to the state level. For the national distributed information database, each lab is working with mid-level networks to establish a common interface for networking throughout the country and to include topics of importance to education reform, such as assessment and technology planning.

  3. Using Reputation Systems and Non-Deterministic Routing to Secure Wireless Sensor Networks

    PubMed Central

    Moya, José M.; Vallejo, Juan Carlos; Fraga, David; Araujo, Álvaro; Villanueva, Daniel; de Goyeneche, Juan-Mariano

    2009-01-01

    Security in wireless sensor networks is difficult to achieve because of the resource limitations of the sensor nodes. We propose a trust-based decision framework for wireless sensor networks coupled with a non-deterministic routing protocol. Both provide a mechanism to effectively detect and confine common attacks, and, unlike previous approaches, allow bad reputation feedback to the network. This approach has been extensively simulated, obtaining good results, even for unrealistically complex attack scenarios. PMID:22412345

  4. Design Issues for Traffic Management for the ATM UBR + Service for TCP Over Satellite Networks

    NASA Technical Reports Server (NTRS)

    Jain, Raj

    1999-01-01

    This project was a comprehensive research program for developing techniques to improve the performance of Internet protocols over Asynchronous Transfer Mode (ATM) based satellite networks. Among the service categories provided by ATM networks, the most commonly used category for data traffic is the unspecified bit rate (UBR) service. UBR allows sources to send data into the network without any feedback control. The project resulted in numerous ATM Forum contributions and papers.

  5. Spatio-Temporal Neural Networks for Vision, Reasoning and Rapid Decision Making

    DTIC Science & Technology

    1994-08-31

    The abstract text for this DTIC record is garbled in the source scan; the legible fragments reference the project title and ONR grant number (N00014-93-1-1149), long-term knowledge base (LTKB) facts, and the application of optical chaos to temporal pattern search.

  6. Predicting Positive and Negative Relationships in Large Social Networks.

    PubMed

    Wang, Guan-Nan; Gao, Hui; Chen, Lian; Mensah, Dennis N A; Fu, Yan

    2015-01-01

    In a social network, users hold and express positive and negative attitudes (e.g. support/opposition) towards other users. Those attitudes exhibit a kind of binary relationship among the users, which plays an important role in social network analysis. However, some of those binary relationships are likely to be latent as the scale of the social network increases. Predicting such latent binary relationships has recently begun to draw researchers' attention. In this paper, we propose a machine learning algorithm for predicting positive and negative relationships in social networks, inspired by structural balance theory and social status theory. More specifically, we show that when two users in the network have fewer common neighbors, the prediction accuracy of the relationship between them deteriorates. Accordingly, in the training phase, we propose a segment-based training framework to divide the training data into two subsets according to the number of common neighbors between users, and build a prediction model for each subset based on support vector machine (SVM). Moreover, to deal with large-scale social network data, we employ a sampling strategy that selects a small amount of training data while maintaining high prediction accuracy. We compare our algorithm with traditional algorithms and adaptive boosting of them. Experimental results on typical data sets show that our algorithm can deal with large social networks and consistently outperforms other methods.
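
    A minimal sketch of the segment-based idea follows, assuming scikit-learn and entirely synthetic stand-in data (pairwise features, sign labels, and common-neighbor counts): training pairs are split at a common-neighbor threshold and one SVM is fit per segment.

      import numpy as np
      from sklearn.svm import SVC

      # Synthetic stand-ins: features X, sign labels y (+1 support, -1
      # opposition), and a common-neighbor count cn for each user pair.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 8))
      y = rng.choice([-1, 1], size=1000)
      cn = rng.integers(0, 10, size=1000)

      threshold = 3  # split pairs into few- vs many-common-neighbor subsets
      few, many = cn < threshold, cn >= threshold

      # One SVM per segment, echoing the segment-based training framework.
      model_few = SVC(kernel="rbf").fit(X[few], y[few])
      model_many = SVC(kernel="rbf").fit(X[many], y[many])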

  7. GMPLS-based control plane for optical networks: early implementation experience

    NASA Astrophysics Data System (ADS)

    Liu, Hang; Pendarakis, Dimitrios; Komaee, Nooshin; Saha, Debanjan

    2002-07-01

    Generalized Multi-Protocol Label Switching (GMPLS) extends MPLS signaling and Internet routing protocols to provide a scalable, interoperable, distributed control plane, which is applicable to multiple network technologies such as optical cross connects (OXCs), photonic switches, IP routers, ATM switches, SONET and DWDM systems. It is intended to facilitate automatic service provisioning and dynamic neighbor and topology discovery across multi-vendor intelligent transport networks, as well as their clients. Efforts to standardize such a distributed common control plane have reached various stages in several bodies such as the IETF, ITU and OIF. This paper describes the design considerations and architecture of a GMPLS-based control plane that we have prototyped for core optical networks. Functional components of GMPLS signaling and routing are integrated in this architecture with an application layer controller module. Various requirements, including bandwidth, network protection and survivability, traffic engineering, and optimal utilization of network resources, are taken into consideration during path computation and provisioning. Initial experiments with our prototype demonstrate the feasibility and main benefits of GMPLS as a distributed control plane for core optical networks. In addition to such feasibility results, actual adoption and deployment of GMPLS as a common control plane for intelligent transport networks will depend on the successful completion of relevant standardization activities, extensive interoperability testing, as well as the strengthening of appropriate business drivers.

  8. Multilevel regularized regression for simultaneous taxa selection and network construction with metagenomic count data.

    PubMed

    Liu, Zhenqiu; Sun, Fengzhu; Braun, Jonathan; McGovern, Dermot P B; Piantadosi, Steven

    2015-04-01

    Identifying disease-associated taxa and constructing networks for bacterial interactions are two important tasks usually studied separately. In reality, differentiation of disease-associated taxa and correlation among taxa may affect each other. One genus can be differentiated because it is highly correlated with another highly differentiated one. In addition, network structures may vary under different clinical conditions. Permutation tests are commonly used to detect differences between networks in distinct phenotypes, and they are time-consuming. In this manuscript, we propose a multilevel regularized regression method to simultaneously identify taxa and construct networks. We also extend the framework to allow construction of a common network and a differentiated network together. An efficient algorithm with dual formulation is developed to deal with the large-scale n ≪ m problem with a large number of taxa (m) and a small number of samples (n). The proposed method is regularized with a general Lp (p ∈ [0, 2]) penalty and models the effects of taxa abundance differentiation and correlation jointly. We demonstrate that it can identify both true and biologically significant genera and network structures. Software MLRR in MATLAB is available at http://biostatistics.csmc.edu/mlrr/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
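
    As a rough illustration of penalized taxa selection, the sketch below applies an L1 penalty, the p = 1 member of the general Lp family, to synthetic counts with scikit-learn; it is a simplified stand-in for MLRR, which additionally performs network construction jointly.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Synthetic metagenomic count matrix with n samples << m taxa,
      # plus binary disease labels.
      rng = np.random.default_rng(0)
      X = np.log1p(rng.poisson(2.0, size=(40, 500)).astype(float))
      y = rng.integers(0, 2, size=40)

      # Sparse L1-penalized fit; nonzero coefficients mark the selected taxa.
      clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
      selected = np.flatnonzero(clf.coef_[0])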

  9. Comparison of Point Matching Techniques for Road Network Matching

    NASA Astrophysics Data System (ADS)

    Hackeloeer, A.; Klasing, K.; Krisp, J. M.; Meng, L.

    2013-05-01

    Map conflation investigates the unique identification of geographical entities across different maps depicting the same geographic region. It involves a matching process which aims to find commonalities between geographic features. A specific subdomain of conflation called Road Network Matching establishes correspondences between road networks of different maps on multiple layers of abstraction, ranging from elementary point locations to high-level structures such as road segments or even subgraphs derived from the induced graph of a road network. The process of identifying points located on different maps by means of geometric, topological and semantic information is called point matching. This paper provides an overview of various techniques for point matching, which is a fundamental requirement for subsequent matching steps focusing on complex high-level entities in geospatial networks. Common point matching approaches as well as certain combinations of these are described, classified and evaluated. Furthermore, a novel similarity metric called the Exact Angular Index is introduced, which considers both topological and geometrical aspects. The results offer a basis for further research on a bottom-up matching process for complex map features, which must rely upon findings derived from suitable point matching algorithms. In the context of Road Network Matching, reliable point matches provide an immediate starting point for finding matches between line segments describing the geometry and topology of road networks, which may in turn be used for performing a structural high-level matching on the network level.
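
    For concreteness, the sketch below implements one of the simplest point matching baselines, greedy nearest-neighbour assignment with a distance cutoff; the Exact Angular Index introduced in the paper adds topological and angular information that this purely geometric baseline ignores.

      import numpy as np

      def match_points(a, b, max_dist):
          # Greedy nearest-neighbour matching of point locations between two
          # maps; a generic proximity baseline, not the paper's method.
          matches, used = [], set()
          for i, p in enumerate(a):
              d = np.linalg.norm(b - p, axis=1)
              j = int(np.argmin(d))
              if d[j] <= max_dist and j not in used:
                  matches.append((i, j))
                  used.add(j)
          return matches

      map_a = np.array([[0.0, 0.0], [10.0, 5.0]])
      map_b = np.array([[0.4, -0.2], [10.3, 5.1], [50.0, 50.0]])
      print(match_points(map_a, map_b, max_dist=1.0))  # [(0, 0), (1, 1)]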

  10. Insights into the Ecology and Evolution of Polyploid Plants through Network Analysis.

    PubMed

    Gallagher, Joseph P; Grover, Corrinne E; Hu, Guanjing; Wendel, Jonathan F

    2016-06-01

    Polyploidy is a widespread phenomenon throughout eukaryotes, with important ecological and evolutionary consequences. Although genes operate as components of complex pathways and networks, polyploid changes in genes and gene expression have typically been evaluated as either individual genes or as a part of broad-scale analyses. Network analysis has been fruitful in associating genomic and other 'omic'-based changes with phenotype for many systems. In polyploid species, network analysis has the potential not only to facilitate a better understanding of the complex 'omic' underpinnings of phenotypic and ecological traits common to polyploidy, but also to provide novel insight into the interaction among duplicated genes and genomes. This adds perspective to the global patterns of expression (and other 'omic') change that accompany polyploidy and to the patterns of recruitment and/or loss of genes following polyploidization. While network analysis in polyploid species faces challenges common to other analyses of duplicated genomes, present technologies combined with thoughtful experimental design provide a powerful system to explore polyploid evolution. Here, we demonstrate the utility and potential of network analysis to questions pertaining to polyploidy with an example involving evolution of the transgressively superior cotton fibres found in polyploid Gossypium hirsutum. By combining network analysis with prior knowledge, we provide further insights into the role of profilins in fibre domestication and exemplify the potential for network analysis in polyploid species. © 2016 John Wiley & Sons Ltd.

  11. Validation of Networks Derived from Snowball Sampling of Municipal Science Education Actors

    ERIC Educational Resources Information Center

    von der Fehr, Ane; Sølberg, Jan; Bruun, Jesper

    2018-01-01

    Social network analysis (SNA) has been used in many educational studies in the past decade, but what these studies have in common is that the populations in question in most cases are defined and known to the researchers studying the networks. Snowball sampling is an SNA methodology most often used to study hidden populations, for example, groups…

  12. Enhancing the Delivery of Supplemental Nutrition Assistance Program Education through Geographic Information Systems

    ERIC Educational Resources Information Center

    Stone, Matthew

    2011-01-01

    The Network for a Healthy California (Network) employs a Geographic Information System (GIS) to identify the target audience and plan program activities because GIS is a powerful tool for assisting in data integration and planning. This paper describes common uses of GIS by Network contractors as well as demonstrating the possibilities of GIS as a…

  13. Graph theory findings in the pathophysiology of temporal lobe epilepsy

    PubMed Central

    Chiang, Sharon; Haneef, Zulfi

    2014-01-01

    Temporal lobe epilepsy (TLE) is the most common form of adult epilepsy. Accumulating evidence has shown that TLE is a disorder of abnormal epileptogenic networks, rather than focal sources. Graph theory allows for a network-based representation of TLE brain networks, and has potential to illuminate characteristics of brain topology conducive to TLE pathophysiology, including seizure initiation and spread. We review basic concepts which we believe will prove helpful in interpreting results rapidly emerging from graph theory research in TLE. In addition, we summarize the current state of graph theory findings in TLE as they pertain to its pathophysiology. Several common findings have emerged from the many modalities which have been used to study TLE using graph theory, including structural MRI, diffusion tensor imaging, surface EEG, intracranial EEG, magnetoencephalography, functional MRI, cell cultures, simulated models, and mouse models: increased regularity of the interictal network configuration, altered local segregation and global integration of the TLE network, and network reorganization of temporal lobe and limbic structures. As different modalities provide different views of the same phenomenon, future studies integrating data from multiple modalities are needed to clarify findings and contribute to the formation of a coherent theory on the pathophysiology of TLE. PMID:24831083

  14. Common neighbours and the local-community-paradigm for topological link prediction in bipartite networks

    NASA Astrophysics Data System (ADS)

    Daminelli, Simone; Thomas, Josephine Maria; Durán, Claudio; Vittorio Cannistraci, Carlo

    2015-11-01

    Bipartite networks are powerful descriptions of complex systems characterized by two different classes of nodes and connections allowed only across but not within the two classes. Unveiling physical principles, building theories and suggesting physical models to predict bipartite links such as product-consumer connections in recommendation systems or drug-target interactions in molecular networks can provide priceless information to improve e-commerce or to accelerate pharmaceutical research. The prediction of nonobserved connections starting from those already present in the topology of a network is known as the link-prediction problem. It represents an important subject both in many-body interaction theory in physics and in new algorithms for applied tools in computer science. The rationale is that the existing connectivity structure of a network can suggest where new connections can appear with higher likelihood in an evolving network, or where nonobserved connections are missing in a partially known network. Surprisingly, current complex network theory presents a theoretical bottleneck: a general framework for local-based link prediction directly in the bipartite domain is missing. Here, we overcome this theoretical obstacle and present a formal definition of the common neighbour index and the local-community-paradigm (LCP) for bipartite networks. As a consequence, we are able to introduce the first node-neighbourhood-based and LCP-based models for topological link prediction that utilize the bipartite domain. We performed link prediction evaluations in several networks of different sizes and of disparate origin, including technological, social and biological systems. Our models significantly improve topological prediction in many bipartite networks because they exploit local physical driving-forces that participate in the formation and organization of many real-world bipartite networks. Furthermore, we present a local-based formalism that allows neighbourhood-based link prediction to be implemented intuitively and entirely in the bipartite domain.
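
    A minimal flavour of neighbourhood-based scoring in the bipartite domain is sketched below: counting length-3 paths between a candidate cross-class pair. This is a simplified stand-in for the common-neighbour and LCP indices defined in the paper, not their exact formulation.

      import networkx as nx

      def bipartite_cn_score(G, u, v):
          # Count paths u-b-a-v of length 3, i.e. edges running between the
          # neighbourhoods N(v) and N(u) of the candidate pair (u, v).
          nu = set(G[u])
          return sum(1 for a in G[v] for b in G[a] if b in nu)

      # Toy drug-target style bipartite graph.
      B = nx.Graph([("d1", "t1"), ("d1", "t2"), ("d2", "t2"), ("d2", "t3")])
      print(bipartite_cn_score(B, "d1", "t3"))  # 1, via the path d1-t2-d2-t3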

  15. State Support: A Prerequisite for Global Health Network Effectiveness Comment on "Four Challenges that Global Health Networks Face".

    PubMed

    Marten, Robert; Smith, Richard D

    2017-07-24

    Shiffman recently summarized lessons for network effectiveness from an impressive collection of case studies. However, in common with most global health governance analysis in recent years, Shiffman underplays the important role of states in these global networks. Because states decide and sign international agreements, often provide the resourcing, and are responsible for implementing initiatives, all of which contribute to the prioritization of certain issues over others, state recognition and support is a prerequisite to enabling and determining global health networks' success. The role of states deserves greater attention, analysis and consideration. We reflect upon the underappreciated role of the state within the current discourse on global health. We present the tobacco case study to illustrate the decisive role of states in determining progress for global health networks, and highlight how states use a legitimacy loop to gain legitimacy from and provide legitimacy to global health networks. Moving forward in assessing global health networks' effectiveness, further investigating state support as a determinant of success will be critical. Understanding how global health networks and states interact and evolve to shape and support their respective interests should be a focus for future research. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  16. Common developmental genome deprogramming in schizophrenia - Role of Integrative Nuclear FGFR1 Signaling (INFS).

    PubMed

    Narla, S T; Lee, Y-W; Benson, C A; Sarder, P; Brennand, K J; Stachowiak, E K; Stachowiak, M K

    2017-07-01

    The watershed hypothesis of schizophrenia asserts that over 200 different mutations dysregulate distinct pathways that converge on an unspecified common mechanism(s) that controls disease ontogeny. Consistent with this hypothesis, our RNA-sequencing of neuron committed cells (NCCs) differentiated from established iPSCs of 4 schizophrenia patients and 4 control subjects uncovered a dysregulated transcriptome of 1349 mRNAs common to all patients. The data reveal a global dysregulation of the developmental genome, deconstruction of coordinated mRNA networks, and the formation of aberrant, new coordinated mRNA networks indicating a concerted action of the responsible factor(s). Sequencing of miRNA transcriptomes demonstrated an overexpression of 16 miRNAs and deconstruction of interactive miRNA-mRNA networks in schizophrenia NCCs. ChIP-seq revealed that the nuclear (n) form of FGFR1, a pan-ontogenic regulator, is overexpressed in schizophrenia NCCs and overtargets dysregulated mRNA and miRNA genes. The nFGFR1 targeted 54% of all human gene promoters and 84.4% of schizophrenia dysregulated genes. The upregulated genes reside within major developmental pathways that control neurogenesis and neuron formation, whereas downregulated genes are involved in oligodendrogenesis. Our results indicate (i) an early (preneuronal) genomic etiology of schizophrenia, (ii) dysregulated genes and new coordinated gene networks are common to unrelated cases of schizophrenia, (iii) gene dysregulations are accompanied by increased nFGFR1-genome interactions, and (iv) modeling of increased nFGFR1 by overexpression of nFGFR1 led to up- or downregulation of selected genes, as observed in schizophrenia NCCs. Together our results designate nFGFR1 signaling as a potential common dysregulated mechanism in the investigated patients and a potential therapeutic target in schizophrenia. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Scaling and correlations in three bus-transport networks of China

    NASA Astrophysics Data System (ADS)

    Xu, Xinping; Hu, Junhui; Liu, Feng; Liu, Lianshou

    2007-01-01

    We report the statistical properties of three bus-transport networks (BTN) in three different cities of China. These networks are composed of a set of bus lines and the stations serviced by them. Network properties, including the degree distribution, clustering and average path length, are studied under different definitions of network topology. We explore scaling laws and correlations that may govern intrinsic features of such networks. In addition, we create a weighted network representation for BTN with lines mapped to nodes and the number of common stations to weights between lines. In such a representation, the distributions of degree, strength and weight are investigated. A linear behavior between strength and degree, s(k)∼k, is also observed.
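
    The weighted line-network representation is simple to construct. The sketch below, with hypothetical line-to-station records, links two lines with a weight equal to their number of common stations and computes node strengths for comparison against degree.

      import networkx as nx
      from itertools import combinations

      # Hypothetical bus lines mapped to the stations they service.
      lines = {
          "L1": {"s1", "s2", "s3", "s4"},
          "L2": {"s3", "s4", "s5"},
          "L3": {"s1", "s5", "s6"},
      }

      G = nx.Graph()
      for (a, sa), (b, sb) in combinations(lines.items(), 2):
          shared = len(sa & sb)
          if shared:
              G.add_edge(a, b, weight=shared)  # weight = number of common stations

      # Node strength (sum of incident weights) versus degree; the abstract
      # reports an approximately linear relation s(k) ~ k.
      strength = {n: G.degree(n, weight="weight") for n in G}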

  18. Network Expands Links on DNA Variants and Cancer Risk

    Cancer.gov

    Researchers with the NCI-supported GAME-ON initiative and OncoArray Network are publishing studies identifying dozens of new genetic variants associated with the risk for developing some of the most common cancers, as this Cancer Currents blog post reports.

  19. The Semi-Planned LAN: Prototyping a Local Area Network.

    ERIC Educational Resources Information Center

    True, John F.; Rosenwald, Judah

    1986-01-01

    Five administrative user departments at San Francisco State University discovered that they had common requirements for office automation and data manipulation that could be addressed with microcomputers. The results of a local area network project are presented. (Author/MLW)

  20. Landscape contexts and commonalities: building the LTAR network

    USDA-ARS?s Scientific Manuscript database

    The United States Department of Agriculture has established a Long-Term Agroecosystem Research Network to provide a coordinated framework to, “Enable understanding and forecasting of regional landscape capacities to provide agricultural commodities and ecosystem services under changing conditions.” ...

  1. Multidimensional Homophily in Friendship Networks

    PubMed Central

    Block, Per; Grund, Thomas

    2014-01-01

    Homophily – the tendency for individuals to associate with similar others – is one of the most persistent findings in social network analysis. Its importance is established along the lines of a multitude of sociologically relevant dimensions, e.g. sex, ethnicity and social class. Existing research, however, mostly focuses on one dimension at a time. But people are inherently multidimensional, have many attributes and are members of multiple groups. In this article, we explore such multidimensionality further in the context of network dynamics. Are friendship ties increasingly likely to emerge and persist when individuals have an increasing number of attributes in common? We analyze eleven friendship networks of adolescents, draw on stochastic actor-oriented network models and focus on the interaction of established homophily effects. Our results indicate that main effects for homophily on various dimensions are positive. At the same time, the interaction of these homophily effects is negative. There seems to be a diminishing effect for having more than one attribute in common. We conclude that studies of homophily and friendship formation need to address such multidimensionality further. PMID:25525503

  2. Allocation of spectral and spatial modes in multidimensional metro-access optical networks

    NASA Astrophysics Data System (ADS)

    Gao, Wenbo; Cvijetic, Milorad

    2018-04-01

    Introduction of spatial division multiplexing (SDM) has added a new dimension to efforts to increase optical fiber channel capacity. At the same time, it can also be explored as an advanced optical networking tool. In this paper, we have investigated the allocation of resources to end-users in a multidimensional networking structure with a plurality of spectral and spatial modes actively deployed in different networking segments. This presents a more comprehensive method as compared to the common practice where the segments of an optical network are analyzed independently, since the interaction between network hierarchies is included in the consideration. We explored the possible transparency from the metro/core network to the optical access network, analyzed the potential bottlenecks from the network architecture perspective, and identified an optimized network structure. In our considerations, the viability of optical grooming through the entire hierarchical all-optical network is investigated by evaluating the effective utilization and spectral efficiency of the network architecture.

  3. A Security Architecture for Health Information Networks

    PubMed Central

    Kailar, Rajashekar

    2007-01-01

    Health information network security needs to balance exacting security controls with practicality and ease of implementation in today's healthcare enterprise. Recent work on 'nationwide health information network' architectures has sought to share highly confidential data over insecure networks such as the Internet. Using basic patterns of health network data flow and trust models to support secure communication between network nodes, we abstract network security requirements to a core set to enable secure inter-network data sharing. We propose a minimum set of security controls that can be implemented without needing major new technologies, yet still realize the network security and privacy goals of confidentiality, integrity and availability. This framework combines a set of technology mechanisms with environmental controls, and is shown to be sufficient to counter commonly encountered network security threats adequately. PMID:18693862

  4. Global Mapping of the Yeast Genetic Interaction Network

    NASA Astrophysics Data System (ADS)

    Tong, Amy Hin Yan; Lesage, Guillaume; Bader, Gary D.; Ding, Huiming; Xu, Hong; Xin, Xiaofeng; Young, James; Berriz, Gabriel F.; Brost, Renee L.; Chang, Michael; Chen, YiQun; Cheng, Xin; Chua, Gordon; Friesen, Helena; Goldberg, Debra S.; Haynes, Jennifer; Humphries, Christine; He, Grace; Hussein, Shamiza; Ke, Lizhu; Krogan, Nevan; Li, Zhijian; Levinson, Joshua N.; Lu, Hong; Ménard, Patrice; Munyana, Christella; Parsons, Ainslie B.; Ryan, Owen; Tonikian, Raffi; Roberts, Tania; Sdicu, Anne-Marie; Shapiro, Jesse; Sheikh, Bilal; Suter, Bernhard; Wong, Sharyl L.; Zhang, Lan V.; Zhu, Hongwei; Burd, Christopher G.; Munro, Sean; Sander, Chris; Rine, Jasper; Greenblatt, Jack; Peter, Matthias; Bretscher, Anthony; Bell, Graham; Roth, Frederick P.; Brown, Grant W.; Andrews, Brenda; Bussey, Howard; Boone, Charles

    2004-02-01

    A genetic interaction network containing ~1000 genes and ~4000 interactions was mapped by crossing mutations in 132 different query genes into a set of ~4700 viable yeast gene-deletion mutants and scoring the double-mutant progeny for fitness defects. Network connectivity was predictive of function because interactions often occurred among functionally related genes, and similar patterns of interactions tended to identify components of the same pathway. The genetic network exhibited dense local neighborhoods; therefore, the position of a gene on a partially mapped network is predictive of other genetic interactions. Because digenic interactions are common in yeast, similar networks may underlie the complex genetics associated with inherited phenotypes in other organisms.

  5. Specialized Common Carriers: Long Distance Alternatives for Military Installations.

    DTIC Science & Technology

    1984-03-01

    This report provides military installation managers with a basic knowledge of how Specialized Common Carriers entered the telecommunications market, what services Specialized Common Carriers offer, and how to obtain these services. Legible table-of-contents fragments cover AT&T's services as the market standard, SCC switched voice network services, and dial access services.

  6. From brain to earth and climate systems: small-world interaction networks or not?

    PubMed

    Bialonski, Stephan; Horstmann, Marie-Therese; Lehnertz, Klaus

    2010-03-01

    We consider recent reports on small-world topologies of interaction networks derived from the dynamics of spatially extended systems that are investigated in diverse scientific fields such as neurosciences, geophysics, or meteorology. With numerical simulations that mimic typical experimental situations, we have identified an important constraint when characterizing such networks: indications of a small-world topology can be expected solely due to the spatial sampling of the system along with the commonly used time series analysis based approaches to network characterization.

  7. Resource Management In Peer-To-Peer Networks: A Nadse Approach

    NASA Astrophysics Data System (ADS)

    Patel, R. B.; Garg, Vishal

    2011-12-01

    This article presents a common solution to Peer-to-Peer (P2P) network problems and distributed computing with the help of the "Neighbor Assisted Distributed and Scalable Environment" (NADSE). NADSE supports both device and code mobility. In this article we focus mainly on the NADSE-based resource management technique and on how information dissemination and searching are sped up by using the NADSE service provider node in a large network. Results show that the performance of the NADSE network is better than that of Gnutella and Freenet.

  8. The Role of State Library Agencies in the Evolving National Information Network. Proceedings of the Joint Meeting of the Library of Congress Network Advisory Committee and the Chief Officers of State Library Agencies (Washington, D.C., April 27-29, 1992). Network Planning Paper No. 23.

    ERIC Educational Resources Information Center

    Library of Congress, Washington, DC. Network Development and MARC Standards Office.

    The papers in this proceedings describe similarities and differences in state libraries and examine the state library role in local, regional, and national network development and in the dissemination of information to various client segments. The papers are: (1) "The Commonalities of State Library Agencies" (Barrat Wilkins); (2)…

  9. Design and Construction of a High-speed Network Connecting All the Protein Crystallography Beamlines at the Photon Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsugaki, Naohiro; Yamada, Yusuke; Igarashi, Noriyuki

    2007-01-19

    A private network, physically separated from the facility network, was designed and constructed to cover all four protein crystallography beamlines at the Photon Factory (PF) and the Structural Biology Research Center (SBRC). Connecting all the beamlines in the same network allows simple authentication and a common working environment for a user who uses multiple beamlines. Gigabit Ethernet wire speed was achieved for communication among the beamlines and SBRC buildings.

  10. Developer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-21

    NREL's Developer Network, developer.nrel.gov, provides data that users can access for their own analyses and for mobile and web applications. Developers retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving web services, such as key management, authentication, analytics, reporting, documentation standards, and throttling, in a common architecture, while allowing the web services and APIs themselves to be maintained and managed independently.
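
    As a sketch of the access pattern described above, the snippet below queries a Developer Network style REST endpoint with an API key passed as a query parameter. It is a minimal illustration, assuming the Python requests library; the route and parameter names are hypothetical, not documented NREL endpoints.

        import requests

        API_KEY = "DEMO_KEY"  # placeholder; real keys are issued via the key management service

        def fetch(endpoint, **params):
            """GET a JSON resource, passing the API key as a query parameter (assumed convention)."""
            params["api_key"] = API_KEY
            resp = requests.get("https://developer.nrel.gov/api/" + endpoint, params=params)
            resp.raise_for_status()  # authentication or throttling failures surface as HTTP errors
            return resp.json()

        # Hypothetical usage; the route below is illustrative only.
        # data = fetch("example/v1/data.json", location="Denver, CO")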

  11. Fuzz Testing of Industrial Network Protocols in Programmable Logic Controllers

    DTIC Science & Technology

    2017-12-01

    Programmable logic controllers (PLCs) are vital components in these cyber-physical systems. The industrial network protocols used to communicate between nodes in a control network... an Allen-Bradley/Rockwell Automation (AB/RA) MicroLogix 1100 PLC through its implementation of the EtherNet/IP, Common Industrial Protocol (CIP), and Programmable Controller Communication Commands (PCCC) protocols. This research also examines whether cross-generational vulnerabilities exist in the more advanced AB/RA...

  12. New Uses for the VLBI Network

    NASA Technical Reports Server (NTRS)

    Clark, Tom

    2000-01-01

    This paper suggests some potential new uses for the existing VLBI network. It seems that every VLBI group in the world faces some common problems: We do not have enough money to operate! We do not have enough money to make improvements!! In this contribution I discuss several possibilities for new business that might help to support the network stations without causing serious impacts on the primary VLBI programs.

  13. WiFi in Schools, Electromagnetic Fields and Cell Phones: Alberta Health Fact Sheet

    ERIC Educational Resources Information Center

    Alberta Education, 2012

    2012-01-01

    Wireless devices and the networks that support them are becoming more common in Alberta schools. WiFi is a wireless networking technology that allows computers and other devices to communicate over a wireless signal. Typically the signal is carried by radio waves over an area of up to 100 meters. Through the implementation of a WiFi network,…

  14. Supervised Learning in CINets

    DTIC Science & Technology

    2011-07-01

    The supervised learning process is compared to that of Artificial Neural Network (ANN), fuzzy logic rule set, and Bayesian network approaches... of both fuzzy logic systems and Artificial Neural Networks (ANNs). Like fuzzy logic systems, the CINet technique allows the use of human-intuitive... fuzzy rule systems [3]. CINets also maintain features common to both fuzzy systems and ANNs. The technique can be shown to possess the property...

  15. The Relative Ineffectiveness of Criminal Network Disruption

    PubMed Central

    Duijn, Paul A. C.; Kashirin, Victor; Sloot, Peter M. A.

    2014-01-01

    Researchers, policymakers and law enforcement agencies across the globe struggle to find effective strategies to control criminal networks. The effectiveness of disruption strategies is known to depend on both network topology and network resilience. However, as these criminal networks operate in secrecy, data-driven knowledge concerning the effectiveness of different criminal network disruption strategies is very limited. By combining computational modeling and social network analysis with unique criminal network intelligence data from the Dutch Police, we discovered, in contrast to common belief, that criminal networks might even become ‘stronger’ after targeted attacks. On the other hand, increased efficiency within criminal networks decreases their internal security, thus offering opportunities for law enforcement agencies to target these networks more deliberately. Our results emphasize the importance of criminal network interventions at an early stage, before the network gets a chance to (re-)organize to maximum resilience. In the end, disruption strategies force criminal networks to become more exposed, which causes successful network disruption to become a long-term effort. PMID:24577374

  16. The relative ineffectiveness of criminal network disruption.

    PubMed

    Duijn, Paul A C; Kashirin, Victor; Sloot, Peter M A

    2014-02-28

    Researchers, policymakers and law enforcement agencies across the globe struggle to find effective strategies to control criminal networks. The effectiveness of disruption strategies is known to depend on both network topology and network resilience. However, as these criminal networks operate in secrecy, data-driven knowledge concerning the effectiveness of different criminal network disruption strategies is very limited. By combining computational modeling and social network analysis with unique criminal network intelligence data from the Dutch Police, we discovered, in contrast to common belief, that criminal networks might even become 'stronger' after targeted attacks. On the other hand, increased efficiency within criminal networks decreases their internal security, thus offering opportunities for law enforcement agencies to target these networks more deliberately. Our results emphasize the importance of criminal network interventions at an early stage, before the network gets a chance to (re-)organize to maximum resilience. In the end, disruption strategies force criminal networks to become more exposed, which causes successful network disruption to become a long-term effort.

  17. SeaDataNet - Pan-European infrastructure for marine and ocean data management: Unified access to distributed data sets (www.seadatanet.org)

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Maudire, Gilbert

    2010-05-01

    SeaDataNet is a leading infrastructure in Europe for marine and ocean data management. It is actively operating and further developing a pan-European infrastructure for managing, indexing and providing access to ocean and marine data sets and data products, acquired via research cruises and other observational activities, in situ and remote sensing. The basis of SeaDataNet is interconnecting 40 National Oceanographic Data Centres (NODCs) and marine data centres from 35 countries around European seas into a distributed network of data resources with common standards for metadata, vocabularies, data transport formats, quality control methods and flags, and access. Most of the NODCs operate and/or are developing national networks to other institutes in their countries to ensure national coverage and long-term stewardship of available data sets. The majority of data managed by SeaDataNet partners concerns physical oceanography, marine chemistry, hydrography, and a substantial volume of marine biology, geology and geophysics. These data are partly owned by the partner institutes themselves and for a major part also owned by other organizations from their countries. The SeaDataNet infrastructure is implemented with support of the EU via the EU FP6 SeaDataNet project to provide a pan-European data management system adapted both to the fragmented observation system and to the users' need for integrated access to data, metadata, products and services. The SeaDataNet project has a duration of 5 years and started in 2006, but builds upon earlier data management infrastructure projects undertaken over a period of 20 years by an expanding network of oceanographic data centres from the countries around all European seas. Its predecessor project, Sea-Search, had a strict focus on metadata. SeaDataNet maintains significant interest in the further development of the metadata infrastructure, extending its services with the provision of easy data access and generic data products. Version 1 of its infrastructure upgrade was launched in April 2008 and is now well underway to include all 40 data centres at the V1 level. It comprises the network of 40 interconnected data centres (NODCs) and a central SeaDataNet portal. V1 provides users a unified and transparent overview of the metadata and controlled access to the large collections of data sets that are managed at these data centres.
    The SeaDataNet V1 infrastructure comprises the following middleware services:
    • Discovery services = Metadata directories and User interfaces
    • Vocabulary services = Common vocabularies and Governance
    • Security services = Authentication, Authorization & Accounting
    • Delivery services = Requesting and Downloading of data sets
    • Viewing services = Mapping of metadata
    • Monitoring services = Statistics on system usage and performance and Registration of data requests and transactions
    • Maintenance services = Entry and updating of metadata by data centres
    Good progress is also being made with extending the SeaDataNet infrastructure with V2 services:
    • Viewing services = Quick views and Visualisation of data and data products
    • Product services = Generic and standard products
    • Exchange services = Transformation of SeaDataNet portal CDI output to INSPIRE compliance
    As a basis for the V1 services, common standards have been defined for metadata and data formats, common vocabularies, quality flags, and quality control methods, based on international standards such as ISO 19115, OGC, NetCDF (CF) and ODV, on best practices from IOC and ICES, and following INSPIRE developments. An important objective of the SeaDataNet V1 infrastructure is to provide transparent access to the distributed data sets via a unique user interface and download service. In the SeaDataNet V1 architecture, the Common Data Index (CDI) V1 metadata service provides the link between discovery and delivery of data sets. The CDI user interface enables users to have a detailed insight into the availability and geographical distribution of marine data archived at the connected data centres. It provides sufficient information to allow the user to assess the data relevance. Moreover, the CDI user interface provides the means for downloading data sets in common formats via a transaction mechanism. The SeaDataNet portal provides registered users access to these distributed data sets via the CDI V1 Directory and a shopping basket mechanism. This allows registered users to locate data of interest and submit their data requests. The requests are forwarded automatically from the portal to the relevant SeaDataNet data centres. This process is controlled via the Request Status Manager (RSM) Web Service at the portal and a Download Manager (DM) Java software module implemented at each of the data centres. The RSM also enables registered users to check regularly the status of their requests and download data sets after access has been granted. Data centres can follow all transactions for their data sets online and can handle requests which require their consent. The actual delivery of data sets is done between the user and the selected data centre. Very good progress is being made with connecting all SeaDataNet data centres and their data sets to the CDI V1 system. At present the CDI V1 system provides users functionality to discover and download more than 500,000 data sets, a number which is steadily increasing. The SeaDataNet architecture provides a coherent system of the various V1 services and inclusion of the V2 services. For the implementation, a range of technical components have been defined and developed. These make use of recent web technologies and also comprise Java components, to provide multi-platform support and syntactic interoperability. To facilitate sharing of resources and interoperability, SeaDataNet has adopted the technology of SOAP Web services for various communication tasks.
    The SeaDataNet architecture has been designed as a multi-disciplinary system from the beginning. It is able to support a wide variety of data types and to serve several sector communities. SeaDataNet is willing to share its technologies and expertise, to spread and expand its approach, and to build bridges to other well-established infrastructures in the marine domain. Therefore SeaDataNet has developed a strategy of seeking active cooperation on a national scale with other data-holding organisations via its NODC networks, and on an international scale with other European and international data management initiatives and networks. This is done with the objective of achieving a wider coverage of data sources and an overall interoperability between data infrastructures in the marine and ocean domains. Recent examples include the EU FP7 projects Geo-Seas for geology and geophysical data sets, UpgradeBlackSeaScene for a Black Sea data management infrastructure, and CaspInfo for a Caspian Sea data management infrastructure, as well as the EU EMODNET pilot projects for hydrographic, chemical and biological data sets. All these projects are adopting the SeaDataNet standards and extending its services. Active cooperation also takes place with EuroGOOS and MyOcean in the domain of real-time and delayed-mode metocean monitoring data. SeaDataNet Partners: IFREMER (France), MARIS (Netherlands), HCMR/HNODC (Greece), ULg (Belgium), OGS (Italy), NERC/BODC (UK), BSH/DOD (Germany), SMHI (Sweden), IEO (Spain), RIHMI/WDC (Russia), IOC (International), ENEA (Italy), INGV (Italy), METU (Turkey), CLS (France), AWI (Germany), IMR (Norway), NERI (Denmark), ICES (International), EC-DG JRC (International), MI (Ireland), IHPT (Portugal), RIKZ (Netherlands), RBINS/MUMM (Belgium), VLIZ (Belgium), MRI (Iceland), FIMR (Finland), IMGW (Poland), MSI (Estonia), IAE/UL (Latvia), CMR (Lithuania), SIO/RAS (Russia), MHI/DMIST (Ukraine), IO/BAS (Bulgaria), NIMRD (Romania), TSU (Georgia), INRH (Morocco), IOF (Croatia), PUT (Albania), NIB (Slovenia), UoM (Malta), OC/UCY (Cyprus), IOLR (Israel), NCSR/NCMS (Lebanon), CNR-ISAC (Italy), ISMAL (Algeria), INSTM (Tunisia)
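
    Since NetCDF (CF) is among the transport formats adopted above, a minimal sketch of inspecting such a file with the Python netCDF4 package follows; the file name is a placeholder, not an actual SeaDataNet product.

        from netCDF4 import Dataset

        # Placeholder file name standing in for a data set downloaded via the CDI service.
        with Dataset("seadatanet_profile.nc") as nc:
            print(nc.ncattrs())  # global attributes, e.g. CF conventions metadata
            for name, var in nc.variables.items():
                print(name, var.dimensions, getattr(var, "units", "(no units)"))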

  18. 47 CFR 32.2341 - Large private branch exchanges.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 32.2341 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES..., Intrabuilding Network Cable. (c) The cost of outside plant, whether or not on private property, used with intrabuilding, network cable shall be charged to the appropriate outside plant accounts. (d)-(e) [Reserved] (f...

  19. Visuospatial Processing in Children with Neurofibromatosis Type 1

    ERIC Educational Resources Information Center

    Clements-Stephens, Amy M.; Rimrodt, Sheryl L.; Gaur, Pooja; Cutting, Laurie E.

    2008-01-01

    Neuroimaging studies investigating the neural network of visuospatial processing have revealed a right hemisphere network of activation including inferior parietal lobe, dorsolateral prefrontal cortex, and extrastriate regions. Impaired visuospatial processing, indicated by the Judgment of Line Orientation (JLO), is commonly seen in individuals…

  20. TOWARDS AN AUTOMATED TOOL FOR CHANNEL-NETWORK CHARACTERIZATIONS, MODELING, AND ASSESSMENT

    EPA Science Inventory

    Detailed characterization of channel networks for hydrologic and geomorphic models has traditionally been a difficult and expensive proposition, and lack of information has thus been a common limitation of modeling efforts. With the advent of datasets derived from high-resolutio...

  1. International Access to Bibliographic Data: MARC and MARC-Related Activities.

    ERIC Educational Resources Information Center

    Hopkinson, Alan

    1984-01-01

    This review of international information exchange formats focuses on the international standard ISO 2709 and MARC formats, "UNISIST Reference Manual," UNIMARC (Universal MARC format), the Common Communications Format, centralized network formats (International Serials Data System, MINISIS, regional), International MARC network study…

  2. Efficient computation of parameter sensitivities of discrete stochastic chemical reaction networks.

    PubMed

    Rathinam, Muruhan; Sheppard, Patrick W; Khammash, Mustafa

    2010-01-21

    Parametric sensitivity of biochemical networks is an indispensable tool for studying system robustness properties, estimating network parameters, and identifying targets for drug therapy. For discrete stochastic representations of biochemical networks where Monte Carlo methods are commonly used, sensitivity analysis can be particularly challenging, as accurate finite difference computations of sensitivity require a large number of simulations for both nominal and perturbed values of the parameters. In this paper we introduce the common random number (CRN) method in conjunction with Gillespie's stochastic simulation algorithm, which exploits positive correlations obtained by using CRNs for nominal and perturbed parameters. We also propose a new method called the common reaction path (CRP) method, which uses CRNs together with the random time change representation of discrete state Markov processes due to Kurtz to estimate the sensitivity via a finite difference approximation applied to coupled reaction paths that emerge naturally in this representation. While both methods reduce the variance of the estimator significantly compared to independent random number finite difference implementations, numerical evidence suggests that the CRP method achieves a greater variance reduction. We also provide some theoretical basis for the superior performance of CRP. The improved accuracy of these methods allows for much more efficient sensitivity estimation. In two example systems reported in this work, speedup factors greater than 300 and 10,000 are demonstrated.
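
    To make the common random number idea concrete, here is a minimal sketch, assuming a simple birth-death process (birth at rate k, death at rate gamma*x) rather than the systems studied in the paper: the nominal and perturbed Gillespie runs are driven from the same seed, so their paired difference estimates the sensitivity with far lower variance than independent runs.

        import numpy as np

        def ssa_birth_death(k, gamma, x0, t_end, rng):
            """Gillespie SSA: birth at rate k, death at rate gamma * x."""
            t, x = 0.0, x0
            while True:
                a_birth, a_death = k, gamma * x
                a_total = a_birth + a_death
                t += rng.exponential(1.0 / a_total)  # time to next reaction
                if t > t_end:
                    return x
                x += 1 if rng.random() * a_total < a_birth else -1

        def sensitivity(k, dk, n_paths, common=True):
            """Finite-difference estimate of d E[X(T)] / dk and its standard error."""
            diffs = []
            for i in range(n_paths):
                rng_plus = np.random.default_rng(i)   # CRN: reuse the same seed below
                rng_nom = np.random.default_rng(i if common else 10_000 + i)
                x_plus = ssa_birth_death(k + dk, 0.1, 0, 10.0, rng_plus)
                x_nom = ssa_birth_death(k, 0.1, 0, 10.0, rng_nom)
                diffs.append((x_plus - x_nom) / dk)
            return np.mean(diffs), np.std(diffs) / np.sqrt(n_paths)

    Comparing sensitivity(1.0, 0.05, 1000, common=True) against common=False illustrates the variance reduction the abstract describes: correlated paths cancel the shared noise, leaving mainly the parameter effect.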

  3. Identification of Linkages between EDCs in Personal Care Products and Breast Cancer through Data Integration Combined with Gene Network Analysis.

    PubMed

    Jeong, Hyeri; Kim, Jongwoon; Kim, Youngjun

    2017-09-30

    Approximately 1000 chemicals have been reported to possibly have endocrine disrupting effects, some of which are used in consumer products, such as personal care products (PCPs) and cosmetics. We conducted data integration combined with gene network analysis to: (i) identify causal molecular mechanisms between endocrine disrupting chemicals (EDCs) used in PCPs and breast cancer; and (ii) screen candidate EDCs associated with breast cancer. Among EDCs used in PCPs, four EDCs having correlation with breast cancer were selected, and we curated 27 common interacting genes between those EDCs and breast cancer to perform the gene network analysis. Based on the gene network analysis, ESR1, TP53, NCOA1, AKT1, and BCL6 were found to be key genes to demonstrate the molecular mechanisms of EDCs in the development of breast cancer. Using GeneMANIA, we additionally predicted 20 genes which could interact with the 27 common genes. In total, 47 genes combining the common and predicted genes were functionally grouped with the gene ontology and KEGG pathway terms. With those genes, we finally screened candidate EDCs for their potential to increase breast cancer risk. This study highlights that our approach can provide insights to understand mechanisms of breast cancer and identify potential EDCs which are in association with breast cancer.

  4. Topological relationships between brain and social networks.

    PubMed

    Sakata, Shuzo; Yamamori, Tetsuo

    2007-01-01

    Brains are complex networks. Previously, we revealed that specific connected structures are either significantly abundant or rare in cortical networks. However, it remains unknown whether systems from other disciplines have similar architectures to brains. By applying network-theoretical methods, here we show topological similarities between brain and social networks. We found that the statistical relevance of specific tied structures differs between social "friendship" and "disliking" networks, suggesting relation-type-specific topology of social networks. Surprisingly, overrepresented connected structures in brain networks are more similar to those in the friendship networks than to those in other networks. We found that balanced and imbalanced reciprocal connections between nodes are significantly abundant and rare, respectively, whereas these results are unpredictable by simply counting mutual connections. We interpret these results as evidence of positive selection of balanced mutuality between nodes. These results also imply the existence of underlying common principles behind the organization of brain and social networks.

  5. Exploration of the integration of care for persons with a traumatic brain injury using social network analysis methodology.

    PubMed

    Lamontagne, Marie-Eve

    2013-01-01

    Integration is a popular strategy for increasing the quality of care within systems of care. However, there is no common language, approach or tool allowing for a valid description, comparison and evaluation of integrated care. Social network analysis could be a viable methodology to provide an objective picture of integrated networks. The aim of this study was to illustrate the use of social network analysis in the context of systems of care for traumatic brain injury. We surveyed members of a network using a validated questionnaire to determine the links between them, and we determined the density, centrality, multiplexity, and quality of the links reported. The network was moderately dense (0.6), the most prevalent link was knowledge, and four organisations belonging to a consortium were central to the network. Social network analysis allowed us to create a graphic representation of the network and is a useful methodology for objectively characterising integrated networks.
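
    As an illustration of the measures named above, the toy sketch below computes density and degree centrality with the Python networkx package; the organisations and links are invented, not the study's data.

        import networkx as nx

        # Invented inter-organisation ties (e.g. knowledge-sharing links).
        G = nx.Graph([("RehabCentre", "Hospital"), ("RehabCentre", "CommunityOrg"),
                      ("Hospital", "CommunityOrg"), ("Hospital", "Insurer")])

        print(nx.density(G))            # fraction of possible ties that are present
        print(nx.degree_centrality(G))  # identifies the most connected organisations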

  6. Methylphenidate Modulates Functional Network Connectivity to Enhance Attention

    PubMed Central

    Zhang, Sheng; Hsu, Wei-Ting; Scheinost, Dustin; Finn, Emily S.; Shen, Xilin; Constable, R. Todd; Li, Chiang-Shan R.; Chun, Marvin M.

    2016-01-01

    Recent work has demonstrated that human whole-brain functional connectivity patterns measured with fMRI contain information about cognitive abilities, including sustained attention. To derive behavioral predictions from connectivity patterns, our group developed a connectome-based predictive modeling (CPM) approach (Finn et al., 2015; Rosenberg et al., 2016). Previously using CPM, we defined a high-attention network, comprising connections positively correlated with performance on a sustained attention task, and a low-attention network, comprising connections negatively correlated with performance. Validating the networks as generalizable biomarkers of attention, models based on network strength at rest predicted attention-deficit/hyperactivity disorder (ADHD) symptoms in an independent group of individuals (Rosenberg et al., 2016). To investigate whether these networks play a causal role in attention, here we examined their strength in healthy adults given methylphenidate (Ritalin), a common ADHD treatment, compared with unmedicated controls. As predicted, individuals given methylphenidate showed patterns of connectivity associated with better sustained attention: higher high-attention and lower low-attention network strength than controls. There was significant overlap between the high-attention network and a network with greater strength in the methylphenidate group, and between the low-attention network and a network with greater strength in the control group. Network strength also predicted behavior on a stop-signal task, such that participants with higher go response rates showed higher high-attention and lower low-attention network strength. These results suggest that methylphenidate acts by modulating functional brain networks related to sustained attention, and that changing whole-brain connectivity patterns may help improve attention. SIGNIFICANCE STATEMENT Recent work identified a promising neuromarker of sustained attention based on whole-brain functional connectivity networks. To investigate the causal role of these networks in attention, we examined their response to a dose of methylphenidate, a common and effective treatment for attention-deficit/hyperactivity disorder, in healthy adults. As predicted, individuals on methylphenidate showed connectivity signatures of better sustained attention: higher high-attention and lower low-attention network strength than controls. These results suggest that methylphenidate acts by modulating strength in functional brain networks related to attention, and that changing whole-brain connectivity patterns may improve attention. PMID:27629707
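
    A minimal sketch of the network-strength summary that CPM uses, under stated assumptions: conn is a symmetric functional connectivity matrix and the boolean mask stands in for the high-attention edge set (the 268-node size mirrors the commonly used Shen atlas, but all values here are random inventions).

        import numpy as np

        def network_strength(conn, mask):
            """Sum connectivity over the selected edges, upper triangle only."""
            iu = np.triu_indices_from(conn, k=1)
            return conn[iu][mask[iu]].sum()

        rng = np.random.default_rng(0)
        conn = rng.standard_normal((268, 268))
        conn = (conn + conn.T) / 2                  # symmetrize the toy matrix
        high_mask = rng.random((268, 268)) < 0.01   # invented stand-in for high-attention edges
        print(network_strength(conn, high_mask))

    In the actual approach, the masks come from edges whose strength correlates with sustained attention in training data, and the resulting strengths are entered into a predictive model of behavior.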

  7. Methylphenidate Modulates Functional Network Connectivity to Enhance Attention.

    PubMed

    Rosenberg, Monica D; Zhang, Sheng; Hsu, Wei-Ting; Scheinost, Dustin; Finn, Emily S; Shen, Xilin; Constable, R Todd; Li, Chiang-Shan R; Chun, Marvin M

    2016-09-14

    Recent work has demonstrated that human whole-brain functional connectivity patterns measured with fMRI contain information about cognitive abilities, including sustained attention. To derive behavioral predictions from connectivity patterns, our group developed a connectome-based predictive modeling (CPM) approach (Finn et al., 2015; Rosenberg et al., 2016). Previously using CPM, we defined a high-attention network, comprising connections positively correlated with performance on a sustained attention task, and a low-attention network, comprising connections negatively correlated with performance. Validating the networks as generalizable biomarkers of attention, models based on network strength at rest predicted attention-deficit/hyperactivity disorder (ADHD) symptoms in an independent group of individuals (Rosenberg et al., 2016). To investigate whether these networks play a causal role in attention, here we examined their strength in healthy adults given methylphenidate (Ritalin), a common ADHD treatment, compared with unmedicated controls. As predicted, individuals given methylphenidate showed patterns of connectivity associated with better sustained attention: higher high-attention and lower low-attention network strength than controls. There was significant overlap between the high-attention network and a network with greater strength in the methylphenidate group, and between the low-attention network and a network with greater strength in the control group. Network strength also predicted behavior on a stop-signal task, such that participants with higher go response rates showed higher high-attention and lower low-attention network strength. These results suggest that methylphenidate acts by modulating functional brain networks related to sustained attention, and that changing whole-brain connectivity patterns may help improve attention. Recent work identified a promising neuromarker of sustained attention based on whole-brain functional connectivity networks. To investigate the causal role of these networks in attention, we examined their response to a dose of methylphenidate, a common and effective treatment for attention-deficit/hyperactivity disorder, in healthy adults. As predicted, individuals on methylphenidate showed connectivity signatures of better sustained attention: higher high-attention and lower low-attention network strength than controls. These results suggest that methylphenidate acts by modulating strength in functional brain networks related to attention, and that changing whole-brain connectivity patterns may improve attention.

  8. On the stochastic dissemination of faults in an admissible network

    NASA Technical Reports Server (NTRS)

    Kyrala, A.

    1987-01-01

    The dynamic distribution of faults in a general type network is discussed. The starting point is a uniquely branched network in which each pair of nodes is connected by a single branch. Mathematical expressions for the uniquely branched network transition matrix are derived to show that sufficient stationarity exists to ensure the validity of the use of the Markov Chain model to analyze networks. In addition the conditions for the use of Semi-Markov models are discussed. General mathematical expressions are derived in an examination of branch redundancy techniques commonly used to increase reliability.
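
    A minimal sketch of the Markov chain view described above, with an invented three-state transition matrix for a single component; branch redundancy can be read as lowering the one-step probabilities of entering the failed state.

        import numpy as np

        # Rows/columns: 0 = healthy, 1 = faulty, 2 = failed (absorbing); values invented.
        P = np.array([[0.95, 0.04, 0.01],
                      [0.10, 0.80, 0.10],
                      [0.00, 0.00, 1.00]])

        dist = np.array([1.0, 0.0, 0.0])  # start in the healthy state
        for _ in range(50):
            dist = dist @ P               # one-step evolution of the state distribution
        print(dist)                       # probability mass over states after 50 steps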

  9. The price of complexity in financial networks

    NASA Astrophysics Data System (ADS)

    Battiston, Stefano; Caldarelli, Guido; May, Robert M.; Roukny, Tarik; Stiglitz, Joseph E.

    2016-09-01

    Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises.

  10. Graphical user interface for wireless sensor networks simulator

    NASA Astrophysics Data System (ADS)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They can be suited to many applications, from military through environment monitoring, healthcare, home automation, and others. Those networks, when working in a dynamic, ad-hoc model, need effective protocols which must differ from common computer network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator which is dedicated to WSN studies, especially the evaluation of routing and data link protocols.

  11. The price of complexity in financial networks.

    PubMed

    Battiston, Stefano; Caldarelli, Guido; May, Robert M; Roukny, Tarik; Stiglitz, Joseph E

    2016-09-06

    Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises.

  12. Identification of Common Neural Circuit Disruptions in Cognitive Control Across Psychiatric Disorders.

    PubMed

    McTeague, Lisa M; Huemer, Julia; Carreon, David M; Jiang, Ying; Eickhoff, Simon B; Etkin, Amit

    2017-07-01

    Cognitive deficits are a common feature of psychiatric disorders. The authors investigated the nature of disruptions in neural circuitry underlying cognitive control capacities across psychiatric disorders through a transdiagnostic neuroimaging meta-analysis. A PubMed search was conducted for whole-brain functional neuroimaging articles published through June 2015 that compared activation in patients with axis I disorders and matched healthy control participants during cognitive control tasks. Tasks that probed performance or conflict monitoring, response inhibition or selection, set shifting, verbal fluency, and recognition or working memory were included. Activation likelihood estimation meta-analyses were conducted on peak voxel coordinates. The 283 experiments submitted to meta-analysis included 5,728 control participants and 5,493 patients with various disorders (schizophrenia, bipolar or unipolar depression, anxiety disorders, and substance use disorders). Transdiagnostically abnormal activation was evident in the left prefrontal cortex as well as the anterior insula, the right ventrolateral prefrontal cortex, the right intraparietal sulcus, and the midcingulate/presupplementary motor area. Disruption was also observed in a more anterior cluster in the dorsal cingulate cortex, which overlapped with a network of structural perturbation that the authors previously reported in a transdiagnostic meta-analysis of gray matter volume. These findings demonstrate a common pattern of disruption across major psychiatric disorders that parallels the "multiple-demand network" observed in intact cognition. This network interfaces with the anterior-cingulo-insular or "salience network" demonstrated to be transdiagnostically vulnerable to gray matter reduction. Thus, networks intrinsic to adaptive, flexible cognition are vulnerable to broad-spectrum psychopathology. Dysfunction in these networks may reflect an intermediate transdiagnostic phenotype, which could be leveraged to advance therapeutics.

  13. Role of Hydrophobic Clusters and Long-Range Contact Networks in the Folding of (α/β)8 Barrel Proteins

    PubMed Central

    Selvaraj, S.; Gromiha, M. Michael

    2003-01-01

    Analysis of the three-dimensional structures of (α/β)8 barrel proteins sheds light on the factors that are responsible for directing and maintaining their common fold. In this work, hydrophobically enriched clusters are identified in 92% of the considered (α/β)8 barrel proteins. The residue segments with hydrophobic clusters have high thermal stability. Further, these clusters are formed and stabilized through long-range interactions. Specifically, a network of long-range contacts connects adjacent β-strands of the (α/β)8 barrel domain and the hydrophobic clusters. The implications of hydrophobic clusters and long-range networks in providing a feasible common mechanism for the folding of (α/β)8 barrel proteins are proposed. PMID:12609894

  14. Molecular Regulation of Lumen Morphogenesis Review

    PubMed Central

    Datta, Anirban; Bryant, David M.; Mostov, Keith E.

    2013-01-01

    The asymmetric polarization of cells allows specialized functions to be performed at discrete subcellular locales. Spatiotemporal coordination of polarization between groups of cells allowed the evolution of metazoa. For instance, coordinated apical-basal polarization of epithelial and endothelial cells allows transport of nutrients and metabolites across cell barriers and tissue microenvironments. The defining feature of such tissues is the presence of a central, interconnected luminal network. Although tubular networks are present in seemingly different organ systems, such as the kidney, lung, and blood vessels, common underlying principles govern their formation. Recent studies using in vivo and in vitro models of lumen formation have shed new light on the molecular networks regulating this fundamental process. We here discuss progress in understanding common design principles underpinning de novo lumen formation and expansion. PMID:21300279

  15. Correlational Neural Networks.

    PubMed

    Chandar, Sarath; Khapra, Mitesh M; Larochelle, Hugo; Ravindran, Balaraman

    2016-02-01

    Common representation learning (CRL), wherein different descriptions (or views) of the data are embedded in a common subspace, has been receiving a lot of attention recently. Two popular paradigms here are canonical correlation analysis (CCA)-based approaches and autoencoder (AE)-based approaches. CCA-based approaches learn a joint representation by maximizing correlation of the views when projected to the common subspace. AE-based methods learn a common representation by minimizing the error of reconstructing the two views. Each of these approaches has its own advantages and disadvantages. For example, while CCA-based approaches outperform AE-based approaches for the task of transfer learning, they are not as scalable as the latter. In this work, we propose an AE-based approach, correlational neural network (CorrNet), that explicitly maximizes correlation among the views when projected to the common subspace. Through a series of experiments, we demonstrate that the proposed CorrNet is better than AE and CCA with respect to its ability to learn correlated common representations. We employ CorrNet for several cross-language tasks and show that the representations learned using it perform better than the ones learned using other state-of-the-art approaches.
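
    A compact sketch of the CorrNet objective as summarized above, written with PyTorch: reconstruct both views from the joint code and from each view alone, while subtracting a correlation term between the two projections. Layer sizes, activations, and the loss weight are illustrative assumptions, not the paper's configuration.

        import torch
        import torch.nn as nn

        class CorrNet(nn.Module):
            def __init__(self, d1, d2, k):
                super().__init__()
                self.enc1, self.enc2 = nn.Linear(d1, k), nn.Linear(d2, k)
                self.dec1, self.dec2 = nn.Linear(k, d1), nn.Linear(k, d2)

            def loss(self, x, y, lam=0.02):
                zx, zy = self.enc1(x), self.enc2(y)    # pre-activations of each view
                hx, hy = torch.sigmoid(zx), torch.sigmoid(zy)
                h_joint = torch.sigmoid(zx + zy)       # code computed from both views

                def recon(h):  # error of reconstructing BOTH views from one code
                    return ((self.dec1(h) - x) ** 2).mean() + ((self.dec2(h) - y) ** 2).mean()

                # Correlation of the two projections, computed per code dimension.
                hx_c, hy_c = hx - hx.mean(0), hy - hy.mean(0)
                corr = (hx_c * hy_c).sum(0) / (hx_c.norm(dim=0) * hy_c.norm(dim=0) + 1e-8)
                # Minimize reconstruction error while maximizing correlation.
                return recon(h_joint) + recon(hx) + recon(hy) - lam * corr.sum()

    With only one view available at test time, encoding it and decoding the missing view through the shared code is what makes the common subspace useful for cross-language transfer.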

  16. Experiments in Neural-Network Control of a Free-Flying Space Robot

    NASA Technical Reports Server (NTRS)

    Wilson, Edward

    1995-01-01

    Four important generic issues are identified and addressed in some depth in this thesis as part of the development of an adaptive neural network based control system for an experimental free flying space robot prototype. The first issue concerns the importance of true system level design of the control system. A new hybrid strategy is developed here, in depth, for the beneficial integration of neural networks into the total control system. A second important issue in neural network control concerns incorporating a priori knowledge into the neural network. In many applications, it is possible to get a reasonably accurate controller using conventional means. If this prior information is used purposefully to provide a starting point for the optimizing capabilities of the neural network, it can provide much faster initial learning. In a step towards addressing this issue, a new generic Fully Connected Architecture (FCA) is developed for use with backpropagation. A third issue is that neural networks are commonly trained using a gradient based optimization method such as backpropagation; but many real world systems have Discrete Valued Functions (DVFs) that do not permit gradient based optimization. One example is the on-off thrusters that are common on spacecraft. A new technique is developed here that now extends backpropagation learning for use with DVFs. The fourth issue is that the speed of adaptation is often a limiting factor in the implementation of a neural network control system. This issue has been strongly resolved in the research by drawing on the above new contributions.

  17. The importance of social networks on smoking: perspectives of women who quit smoking during pregnancy.

    PubMed

    Nguyen, Stephanie N; Von Kohorn, Isabelle; Schulman-Green, Dena; Colson, Eve R

    2012-08-01

    While up to 45% of women quit smoking during pregnancy, nearly 80% return to smoking within a year after delivery. Interventions to prevent relapse have had limited success. The study objective was to understand what influences return to smoking after pregnancy among women who quit smoking during pregnancy, with a focus on the role of social networks. We conducted in-depth, semi-structured interviews during the postpartum hospital stay with women who quit smoking while pregnant. Over 300 pages of transcripts were analyzed using qualitative methods to identify common themes. Respondents [n = 24] were predominately white (63%), had at least some college education (54%) and a mean age of 26 years (range = 18-36). When reflecting on the experience of being a smoker who quit smoking during pregnancy, all participants emphasized the importance of their relationships with other smokers and the changes in these relationships that ensued once they quit smoking. Three common themes were: (1) being enmeshed in social networks with prominent smoking norms (2) being tempted to smoke by members of their social networks, and (3) changing relationships with the smokers in their social networks as a result of their non-smoking status. We found that women who quit smoking during pregnancy found themselves confronted by a change in their social network since most of those in their social network were smokers. For this reason, smoking cessation interventions may be most successful if they help women consider restructuring or reframing their social network.

  18. Identifying influential directors in the United States corporate governance network

    NASA Astrophysics Data System (ADS)

    Huang, Xuqing; Vodenska, Irena; Wang, Fengzhong; Havlin, Shlomo; Stanley, H. Eugene

    2011-10-01

    The influence of directors has been one of the most engaging topics recently, but surprisingly little research has been done to quantitatively evaluate the influence and power of directors. We analyze the structure of the US corporate governance network for the 11-year period 1996-2006 based on director data from the Investor Responsibility Research Center director database, and we develop a centrality measure named the influence factor to estimate the influence of directors quantitatively. The US corporate governance network is a network of directors with nodes representing directors and links between two directors representing their service on common company boards. We assume that information flows in the network through information-sharing processes among linked directors. The influence factor assigned to a director is based on the level of information that a director obtains from the entire network. We find that, contrary to commonly accepted belief that directors of large companies, measured by market capitalization, are the most powerful, in some instances, the directors who are influential do not necessarily serve on boards of large companies. By applying our influence factor method to identify the influential people contained in the lists created by popular magazines such as Fortune, Networking World, and Treasury and Risk Management, we find that the influence factor method is consistently either the best or one of the two best methods in identifying powerful people compared to other general centrality measures that are used to denote the significance of a node in complex network theory.
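
    The abstract does not spell out the influence-factor formula, so the sketch below uses Katz centrality from networkx as a generic information-flow proxy on an invented board-interlock graph; it is an analogue for illustration, not the authors' measure.

        import networkx as nx

        # Directors are linked when they serve on a common board (invented data).
        G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

        # Katz centrality likewise scores nodes by how much of the network's
        # information can reach them along walks of any length.
        scores = nx.katz_centrality(G, alpha=0.1)
        print(sorted(scores.items(), key=lambda kv: -kv[1]))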

  19. Identifying influential directors in the United States corporate governance network.

    PubMed

    Huang, Xuqing; Vodenska, Irena; Wang, Fengzhong; Havlin, Shlomo; Stanley, H Eugene

    2011-10-01

    The influence of directors has been one of the most engaging topics recently, but surprisingly little research has been done to quantitatively evaluate the influence and power of directors. We analyze the structure of the US corporate governance network for the 11-year period 1996-2006 based on director data from the Investor Responsibility Research Center director database, and we develop a centrality measure named the influence factor to estimate the influence of directors quantitatively. The US corporate governance network is a network of directors with nodes representing directors and links between two directors representing their service on common company boards. We assume that information flows in the network through information-sharing processes among linked directors. The influence factor assigned to a director is based on the level of information that a director obtains from the entire network. We find that, contrary to commonly accepted belief that directors of large companies, measured by market capitalization, are the most powerful, in some instances, the directors who are influential do not necessarily serve on boards of large companies. By applying our influence factor method to identify the influential people contained in the lists created by popular magazines such as Fortune, Networking World, and Treasury and Risk Management, we find that the influence factor method is consistently either the best or one of the two best methods in identifying powerful people compared to other general centrality measures that are used to denote the significance of a node in complex network theory.

  20. Autonomic Intelligent Cyber Sensor to Support Industrial Control Network Awareness

    DOE PAGES

    Vollmer, Todd; Manic, Milos; Linda, Ondrej

    2013-06-01

    The proliferation of digital devices in a networked industrial ecosystem, along with an exponential growth in complexity and scope, has resulted in elevated security concerns and management complexity issues. This paper describes a novel architecture utilizing concepts of Autonomic computing and a SOAP-based IF-MAP external communication layer to create a network security sensor. This approach simplifies integration of legacy software and supports a secure, scalable, self-managed framework. The contribution of this paper is two-fold: 1) a flexible two-level communication layer based on Autonomic computing and Service Oriented Architecture is detailed, and 2) three complementary modules that dynamically reconfigure in response to a changing environment are presented. One module utilizes clustering and fuzzy logic to monitor traffic for abnormal behavior. Another module passively monitors network traffic and deploys deceptive virtual network hosts. These components of the sensor system were implemented in C++ and PERL and utilize a common internal D-Bus communication mechanism. A proof-of-concept prototype was deployed on a mixed-use test network showing possible real-world applicability. In testing, 45 of the 46 network-attached devices were recognized, and 10 of the 12 emulated devices were created with specific operating system and port configurations. Additionally, the anomaly detection algorithm achieved a 99.9% recognition rate. All output from the modules was correctly distributed using the common communication structure.
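
    As an illustrative stand-in for the clustering-based traffic monitoring module (the sensor itself combines clustering with fuzzy logic), the sketch below fits plain k-means to traffic feature vectors and flags points far from every centroid; the features and threshold are invented.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        normal_traffic = rng.normal(0.0, 1.0, size=(500, 3))  # e.g. packet rate, size, port entropy
        km = KMeans(n_clusters=3, n_init=10).fit(normal_traffic)

        def is_anomalous(x, threshold=3.0):
            """Flag a feature vector whose distance to the nearest cluster centre is large."""
            d = np.linalg.norm(km.cluster_centers_ - x, axis=1).min()
            return d > threshold

        print(is_anomalous(np.array([8.0, 8.0, 8.0])))  # far from normal clusters -> True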
