Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Service has been developed. XML Web Service is a new distributed processing system built on standard Internet technologies. Through the seamless remote method invocation that XML Web Service provides, users can obtain the latest disease-code master information from rich desktop applications or Internet web sites that refer to this service. PMID:14728364
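As an illustration only, a minimal Python sketch of invoking such an XML web service over HTTP; the endpoint URL, operation name, and disease code are hypothetical, since the abstract does not give the service's interface.

    import urllib.request

    # Hypothetical endpoint and payload; the actual service WSDL is not given above.
    ENDPOINT = "http://example.org/byomei/service"
    envelope = """<?xml version="1.0"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <getDiseaseByCode xmlns="urn:example:disease-master">
          <code>8830052</code>
        </getDiseaseByCode>
      </soap:Body>
    </soap:Envelope>"""

    req = urllib.request.Request(
        ENDPOINT,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))   # XML response with the master record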
Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander
2008-04-01
We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, the agreement of log P and TPSA between the two packages was compared for this set of compounds. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
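As an illustration of the pattern, not ChemStar's actual Java RMI code: the following Python sketch farms property calculations out to worker processes, using RDKit as a stand-in for the Marvin and JOELib toolkits named above.

    from concurrent.futures import ProcessPoolExecutor
    from rdkit import Chem
    from rdkit.Chem import Descriptors

    # A tiny compound list stands in for the ~18 million PubChem structures.
    SMILES = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]

    def props(smiles: str):
        """Compute log P and TPSA for one compound."""
        mol = Chem.MolFromSmiles(smiles)
        return smiles, Descriptors.MolLogP(mol), Descriptors.TPSA(mol)

    if __name__ == "__main__":
        # Workers play the role of the distributed client machines.
        with ProcessPoolExecutor() as pool:
            for smi, logp, tpsa in pool.map(props, SMILES):
                print(f"{smi}: logP={logp:.2f} TPSA={tpsa:.1f}")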
Invocation oriented architecture for agile code and agile data
NASA Astrophysics Data System (ADS)
Verma, Dinesh; Chan, Kevin; Leung, Kin; Gkelias, Athanasios
2017-05-01
In order to address the unique requirements of sensor information fusion in a tactical coalition environment, we propose a new architecture based on the concept of invocations. An invocation is a combination of a piece of software code and a piece of data, both managed using techniques from Information Centric Networking. This paper discusses the limitations of current approaches, presents the design of the invocation oriented architecture, illustrates how it works with an example scenario, and provides reasons for its suitability in a coalition environment.
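A minimal sketch of the invocation abstraction, assuming content-based naming in the spirit of Information Centric Networking; the class and its fields are illustrative, not the paper's design.

    import hashlib
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Invocation:
        """A bundle of code and data, named by its content (illustrative sketch)."""
        code: bytes    # serialized function or program
        data: bytes    # input payload

        def name(self) -> str:
            # Content-based name, as used in Information Centric Networking
            return hashlib.sha256(self.code + self.data).hexdigest()

    inv = Invocation(code=b"def fuse(xs): return sum(xs)", data=b"[1, 2, 3]")
    print(inv.name()[:16])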
Web-Based Distributed Simulation of Aeronautical Propulsion System
NASA Technical Reports Server (NTRS)
Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac
2001-01-01
An application was developed to allow users to run and view the Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple Information Power Grid (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines, and IPG/Globus for job scheduling and remote process invocation. Web server scripting was performed by JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.
NASA Technical Reports Server (NTRS)
Sundermier, Amy (Inventor)
2002-01-01
A method for acquiring and assembling software components at execution time into a client program, where the components may be acquired from remote networked servers is disclosed. The acquired components are assembled according to knowledge represented within one or more acquired mediating components. A mediating component implements knowledge of an object model. A mediating component uses its implemented object model knowledge, acquired component class information and polymorphism to assemble components into an interacting program at execution time. The interactions or abstract relationships between components in the object model may be implemented by the mediating component as direct invocations or indirect events or software bus exchanges. The acquired components may establish communications with remote servers. The acquired components may also present a user interface representing data to be exchanged with the remote servers. The mediating components may be assembled into layers, allowing arbitrarily complex programs to be constructed at execution time.
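A toy Python analogue of a mediating component, assuming components are acquired by module and class name and instantiated polymorphically at execution time; the object-model table and the stand-in classes are hypothetical.

    import importlib

    # Hypothetical object-model knowledge: which classes play which roles and
    # where to find them. A real mediating component would acquire these from
    # remote networked servers rather than the standard library.
    OBJECT_MODEL = {
        "source": ("collections", "OrderedDict"),  # stand-in component classes
        "sink":   ("collections", "Counter"),
    }

    def mediate(model):
        """Acquire component classes by name and assemble them at execution time."""
        components = {}
        for role, (module_name, class_name) in model.items():
            cls = getattr(importlib.import_module(module_name), class_name)
            components[role] = cls()   # polymorphic instantiation via the class object
        return components

    print(mediate(OBJECT_MODEL))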
Advances in a distributed approach for ocean model data interoperability
Signell, Richard P.; Snowden, Derrick P.
2014-01-01
An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
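For illustration, a minimal Python sketch of this kind of standards-based remote access, using the netCDF4 library's OPeNDAP support; the server URL and variable name below are assumptions, not a real IOOS endpoint.

    from netCDF4 import Dataset

    # Hypothetical OPeNDAP endpoint for a CF-compliant ocean model aggregation.
    URL = "http://example.org/thredds/dodsC/ocean_model/aggregation.nc"

    ds = Dataset(URL)                      # opens the remote dataset; no file download
    temp = ds.variables["temp"]            # variable name is an assumption
    print(temp.dimensions, temp.shape)     # CF metadata travels with the variable
    tile = temp[0, 0, 100:110, 200:210]    # only this hyperslab crosses the network
    ds.close()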
Users' Manual and Installation Guide for the EverVIEW Slice and Dice Tool (Version 1.0 Beta)
Roszell, Dustin; Conzelmann, Craig; Chimmula, Sumani; Chandrasekaran, Anuradha; Hunnicut, Christina
2009-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need arose for additional tools to view and manipulate NetCDF datasets, specifically to create subsets of large NetCDF files. To address this need, we created the EverVIEW Slice and Dice Tool to allow users to create subsets of grid-based NetCDF files. The major functions of this tool are (1) to subset NetCDF files both spatially and temporally; (2) to view the NetCDF data in table form; and (3) to export filtered data to a comma-separated value file format.
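As a rough sketch of the same subset-and-export workflow in Python (the tool's internals are not described here; file, variable, and coordinate names are hypothetical):

    import csv
    import numpy as np
    from netCDF4 import Dataset

    src = Dataset("everglades_stage.nc")
    var = src.variables["stage"]                       # dims assumed (time, y, x)
    lat = src.variables["lat"][:]
    lon = src.variables["lon"][:]

    # Spatial and temporal subset: first 10 time steps, one lat/lon window.
    rows = np.where((lat >= 25.0) & (lat <= 26.0))[0]
    cols = np.where((lon >= -81.0) & (lon <= -80.0))[0]
    block = var[0:10, rows.min():rows.max() + 1, cols.min():cols.max() + 1]

    # Export the filtered values to a comma-separated value file.
    with open("subset.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["time_index", "row", "col", "value"])
        for t, r, c in np.ndindex(block.shape):
            w.writerow([t, r, c, float(block[t, r, c])])
    src.close()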
Data Publishing and Sharing Via the THREDDS Data Repository
NASA Astrophysics Data System (ADS)
Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.
2007-12-01
The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied via some intellectual challenge, but often located in different organizations and locations. A critical component to these endeavors is publishing and sharing of content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fasion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregrated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client initiated, on demand, location transparent storage for data of any type that can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to: * securely store and "own" data files and associated metadata * upload files via HTTP and gridftp * upload a collection of data as single file * modify and restructure repository contents * incorporate metadata provided by the user * generate additional metadata programmatically * edit individual metadata elements The TDR can exist separately from a TDS, serving content via HTTP. Also, it can work in conjunction with the TDS, which includes functionality to provide: * access to data in a variety of formats via -- OPeNDAP -- OGC Web Coverage Service (for gridded datasets) -- bulk HTTP file transfer * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats * serving of very large volume datasets, such as NEXRAD radar * aggregation into virtual datasets * subsetting via OPeNDAP and NetCDF Subsetting services This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, James
The Atmospheric Radiation Measurement (ARM) Program's standard data format is NetCDF 3 (Network Common Data Form). The objective of this tutorial is to provide a basic introduction to NetCDF with an emphasis on aspects of the ARM application of NetCDF. The goal is to provide basic instructions for reading and visualizing ARM NetCDF data with the expectation that these examples can then be applied to more complex applications.
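In that spirit, a basic Python example of reading and plotting a time series from an ARM-style NetCDF file; the file name and the temp_mean variable follow common ARM naming conventions but are assumptions here.

    import matplotlib.pyplot as plt
    from netCDF4 import Dataset

    ds = Dataset("sgpmetE13.b1.20230101.000000.nc")   # hypothetical ARM file name
    time = ds.variables["time"][:]                    # seconds offset, per ARM convention
    temp = ds.variables["temp_mean"][:]               # hypothetical measurement variable

    plt.plot(time, temp)
    plt.xlabel("time (s since base_time)")
    plt.ylabel("temp_mean (" + ds.variables["temp_mean"].units + ")")
    plt.show()
    ds.close()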
Public-domain-software solution to data-access problems for numerical modelers
Jenter, Harry; Signell, Richard
1992-01-01
Unidata's network Common Data Form, netCDF, provides users with an efficient set of software for scientific data storage, retrieval, and manipulation. The netCDF file format is machine-independent, direct-access, self-describing, and in the public domain, thereby alleviating many problems associated with accessing output from large hydrodynamic models. NetCDF has programming interfaces in both the Fortran and C computer languages, with an interface to C++ planned for release in the future. NetCDF also has an abstract data type that relieves users from understanding details of the binary file structure; data are written and retrieved by an intuitive, user-supplied name rather than by file position. Users are aided further by Unidata's inclusion of the Common Data Language, CDL, a printable text equivalent of the contents of a netCDF file. Unidata provides numerous operators and utilities for processing netCDF files. In addition, a number of public-domain and proprietary netCDF utilities from other sources are available at this time or will be available later this year. The U.S. Geological Survey has produced and is producing a number of public-domain netCDF utilities.
Visualizing NetCDF Files by Using the EverVIEW Data Viewer
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.
American Cosmology and the Rhetoric of Inaugural Prayer
ERIC Educational Resources Information Center
Medhurst, Martin J.
1977-01-01
Examines the invocation delivered by Bishop William P. Cannon at the Carter inauguration and contends that the invocation departs from previous inaugural prayer rhetorical forms and may indicate serious implications for both religious and political persuaders. (MH)
Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC
NASA Astrophysics Data System (ADS)
Percivall, George; Simonis, Ingo
2016-06-01
The necessity of open standards for effective sharing and use of remote sensing continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data / solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon to be approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research, and industrial organizations, continues to advance the state of geospatial standards for full use of photogrammetry and remote sensing.
Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.
Implementing Network Common Data Form (netCDF) for the 3DWF Model
2016-02-01
This report describes the implementation of the Network Common Data Form (netCDF) in the 3DWF model, covering the requirement for netCDF in the 3DWF model, the Weather Research and Forecasting (WRF) model domain and results, the extraction of variables from netCDF-formatted WRF data files needed for the 3DWF model's wind field initialization, and the conversion of the 3DWF's results into netCDF format.
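A minimal Python sketch of the extraction step, using standard WRF output variable names (U10, V10, T2); the 3DWF-specific variable list in the report may differ.

    from netCDF4 import Dataset

    wrf = Dataset("wrfout_d01_2016-02-01_00:00:00")   # placeholder file name
    u10 = wrf.variables["U10"][:]    # 10 m wind components (m/s)
    v10 = wrf.variables["V10"][:]
    t2 = wrf.variables["T2"][:]      # 2 m temperature (K)
    print(u10.shape, v10.shape, t2.shape)
    wrf.close()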
Collaborative Sharing of Multidimensional Space-time Data Using HydroShare
NASA Astrophysics Data System (ADS)
Gan, T.; Tarboton, D. G.; Horsburgh, J. S.; Dash, P. K.; Idaszak, R.; Yi, H.; Blanton, B.
2015-12-01
HydroShare is a collaborative environment being developed for sharing hydrological data and models. It includes the capability to upload data in many formats as resources that can be shared. The HydroShare data model for resources uses a specific format for the representation of each type of data and specifies metadata common to all resource types as well as metadata unique to specific resource types. The Network Common Data Form (NetCDF) was chosen as the format for multidimensional space-time data in HydroShare. NetCDF is widely used in hydrological and other geoscience modeling because it contains self-describing metadata and supports the creation of array-oriented datasets that may include three spatial dimensions, a time dimension, and other user-defined dimensions. For example, NetCDF may be used to represent precipitation or surface air temperature fields that have two dimensions in space and one dimension in time. This presentation will illustrate how NetCDF files are used in HydroShare. When a NetCDF file is loaded into HydroShare, header information is extracted using the "ncdump" utility. Python functions, developed for the Django web framework on which HydroShare is based, extract science metadata present in the NetCDF file, saving the user from having to enter it. Where the file follows the Climate and Forecast (CF) convention and Attribute Convention for Dataset Discovery (ACDD) standards, metadata is thus automatically populated. Users also have the ability to add metadata to the resource that may not have been present in the original NetCDF file. HydroShare's metadata editing functionality then writes this science metadata back into the NetCDF file to maintain consistency between the science metadata in HydroShare and the metadata in the NetCDF file. This further helps researchers easily add metadata information following the CF and ACDD conventions. Additional data inspection and subsetting functions were developed, taking advantage of Python and command line libraries for working with NetCDF files. We describe the design and implementation of these features and illustrate how NetCDF files from a modeling application may be curated in HydroShare and thus enhance the reproducibility of the associated research. We also discuss future development planned for multidimensional space-time data in HydroShare.
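A simplified Python analogue of this harvesting and write-back cycle, assuming ACDD attribute names and a placeholder file; HydroShare's actual Django implementation is more involved.

    import subprocess
    from netCDF4 import Dataset

    # Header extraction with the standard Unidata "ncdump" utility, as described above.
    header = subprocess.run(["ncdump", "-h", "example_multidim.nc"],
                            capture_output=True, text=True, check=True).stdout

    # Harvest ACDD-style global attributes; any given file may omit them.
    ds = Dataset("example_multidim.nc")
    science_metadata = {a: ds.getncattr(a)
                        for a in ("title", "summary", "keywords", "creator_name")
                        if a in ds.ncattrs()}
    ds.close()
    print(science_metadata)

    # Write edited metadata back, keeping the file and the catalog consistent.
    ds = Dataset("example_multidim.nc", "a")
    ds.setncattr("summary", "Edited summary written back into the NetCDF file")
    ds.close()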
NetCDF4/HDF5 and Linked Data in the Real World - Enriching Geoscientific Metadata without Bloat
NASA Astrophysics Data System (ADS)
Ip, Alex; Car, Nicholas; Druken, Kelsey; Poudjom-Djomani, Yvette; Butcher, Stirling; Evans, Ben; Wyborn, Lesley
2017-04-01
NetCDF4 has become the dominant generic format for many forms of geoscientific data, leveraging (and constraining) the versatile HDF5 container format, while providing metadata conventions for interoperability. However, the encapsulation of detailed metadata within each file can lead to metadata "bloat", and difficulty in maintaining consistency where metadata is replicated to multiple locations. Complex conceptual relationships are also difficult to represent in simple key-value netCDF metadata. Linked Data provides a practical mechanism to address these issues by associating the netCDF files and their internal variables with complex metadata stored in Semantic Web vocabularies and ontologies, while complying with and complementing existing metadata conventions. One of the stated objectives of the netCDF4/HDF5 formats is that they should be self-describing: containing metadata sufficient for cataloguing and using the data. However, this objective can be regarded as only partially-met where details of conventions and definitions are maintained externally to the data files. For example, one of the most widely used netCDF community standards, the Climate and Forecasting (CF) Metadata Convention, maintains standard vocabularies for a broad range of disciplines across the geosciences, but this metadata is currently neither readily discoverable nor machine-readable. We have previously implemented useful Linked Data and netCDF tooling (ncskos) that associates netCDF files, and individual variables within those files, with concepts in vocabularies formulated using the Simple Knowledge Organization System (SKOS) ontology. NetCDF files contain Uniform Resource Identifier (URI) links to terms represented as SKOS Concepts, rather than plain-text representations of those terms, so we can use simple, standardised web queries to collect and use rich metadata for the terms from any Linked Data-presented SKOS vocabulary. Geoscience Australia (GA) manages a large volume of diverse geoscientific data, much of which is being translated from proprietary formats to netCDF at NCI Australia. This data is made available through the NCI National Environmental Research Data Interoperability Platform (NERDIP) for programmatic access and interdisciplinary analysis. The netCDF files contain both scientific data variables (e.g. gravity, magnetic or radiometric values), but also domain-specific operational values (e.g. specific instrument parameters) best described fully in formal vocabularies. Our ncskos codebase provides access to multiple stores of detailed external metadata in a standardised fashion. Geophysical datasets are generated from a "survey" event, and GA maintains corporate databases of all surveys and their associated metadata. It is impractical to replicate the full source survey metadata into each netCDF dataset so, instead, we link the netCDF files to survey metadata using public Linked Data URIs. These URIs link to Survey class objects which we model as a subclass of Activity objects as defined by the PROV Ontology, and we provide URI resolution for them via a custom Linked Data API which draws current survey metadata from GA's in-house databases. We have demonstrated that Linked Data is a practical way to associate netCDF data with detailed, external metadata. This allows us to ensure that catalogued metadata is kept consistent with metadata points-of-truth, and we can infer complex conceptual relationships not possible with netCDF key-value attributes alone.
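A minimal sketch of the linking pattern described above: storing a resolvable URI in a variable attribute in place of plain text. The file, variable, attribute name, and URI are illustrative, not the actual ncskos convention.

    from netCDF4 import Dataset

    # Illustrative only: attribute name and URI are placeholders, not ncskos syntax.
    ds = Dataset("survey_gravity.nc", "a")
    var = ds.variables["gravity_anomaly"]    # hypothetical variable
    var.setncattr("concept_uri",
                  "http://pid.geoscience.gov.au/def/voc/example-concept")
    ds.close()
    # A client can then dereference the URI with an ordinary HTTP GET, requesting
    # an RDF serialization (e.g. Accept: text/turtle) to obtain the rich metadata.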
Community Intercomparison Suite (CIS) v1.4.0: a tool for intercomparing models and observations
NASA Astrophysics Data System (ADS)
Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip
2016-09-01
The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in situ and model data. While there are a number of tools available for working with climate model data, the large diversity of sources (and formats) of remote sensing and in situ measurements necessitated a novel software solution. Developed by a professional software company, CIS supports a large number of gridded and ungridded data sources "out-of-the-box", including climate model output in NetCDF or the UK Met Office pp file format, CloudSat, CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization), MODIS (MODerate resolution Imaging Spectroradiometer), Cloud and Aerosol CCI (Climate Change Initiative) level 2 satellite data and a number of in situ aircraft and ground station data sets. The open-source architecture also supports user-defined plugins to allow many other sources to be easily added. Many of the key operations required when comparing heterogeneous data sets are provided by CIS, including subsetting, aggregating, collocating and plotting the data. Output data are written to CF-compliant NetCDF files to ensure interoperability with other tools and systems. The latest documentation, including a user manual and installation instructions, can be found on our website (http://cistools.net). Here, we describe the need which this tool fulfils, followed by descriptions of its main functionality (as at version 1.4.0) and plugin architecture which make it unique in the field.
NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data
Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.
2005-01-01
NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, JAVA, VISUAL FORTRAN, VISUAL BASIC, VISUAL C++, and DELPHI, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) The data conversion part is designed to convert binary raw data to and from NetCDF data. It can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) The visualization part is designed for displaying grid map series (playing forward or backward) with a simple map legend, and displaying temporal trend curves for data on individual map pixels; and 3) The modeling interface is designed for environmental model development, by which a set of integrated NetCDF functions is provided for processing NetCDF data. To demonstrate that NCWin can easily extend the functions of some current GIS software and Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint for showing NetCDF map animations are given.
Linking netCDF Data with the Semantic Web - Enhancing Data Discovery Across Domains
NASA Astrophysics Data System (ADS)
Biard, J. C.; Yu, J.; Hedley, M.; Cox, S. J. D.; Leadbetter, A.; Car, N. J.; Druken, K. A.; Nativi, S.; Davis, E.
2016-12-01
Geophysical data communities are publishing large quantities of data across a wide variety of scientific domains which are overlapping more and more. Whilst netCDF is a common format for many of these communities, it is only one of a large number of data storage and transfer formats. One of the major challenges ahead is finding ways to leverage these diverse data sets to advance our understanding of complex problems. We describe a methodology for incorporating Resource Description Framework (RDF) triples into netCDF files called netCDF-LD (netCDF Linked Data). NetCDF-LD explicitly connects the contents of netCDF files - both data and metadata, with external web-based resources, including vocabularies, standards definitions, and data collections, and through them, a whole host of related information. This approach also preserves and enhances the self describing essence of the netCDF format and its metadata, whilst addressing the challenge of integrating various conventions into files. We present a case study illustrating how reasoning over RDF graphs can empower researchers to discover datasets across domain boundaries.
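For illustration, a small Python sketch using the rdflib package to build triples that connect a netCDF variable to external vocabulary terms; the namespaces and predicates are examples, not the netCDF-LD specification.

    from rdflib import Graph, Literal, Namespace, URIRef

    g = Graph()
    EX = Namespace("http://example.org/dataset/")            # hypothetical dataset namespace
    SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")
    DCT = Namespace("http://purl.org/dc/terms/")

    # Link a netCDF variable to an external vocabulary concept (example URIs).
    g.add((EX["air_temp"], SKOS["exactMatch"],
           URIRef("http://vocab.example.org/standard_name/air_temperature")))
    g.add((EX["air_temp"], DCT["description"],
           Literal("2 m air temperature variable from model output")))

    print(g.serialize(format="turtle"))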
netCDF Operators for Rapid Analysis of Measured and Modeled Swath-like Data
NASA Astrophysics Data System (ADS)
Zender, C. S.
2015-12-01
Swath-like data (hereafter SLD) are defined by non-rectangular and/or time-varying spatial grids in which one or more coordinates are multi-dimensional. It is often challenging and time-consuming to work with SLD, including all Level 2 satellite-retrieved data, non-rectangular subsets of Level 3 data, and model data on curvilinear grids. Researchers and data centers want user-friendly, fast, and powerful methods to specify, extract, serve, manipulate, and thus analyze, SLD. To meet these needs, large research-oriented agencies and modeling centers such as NASA, DOE, and NOAA increasingly employ the netCDF Operators (NCO), an open-source scientific data analysis software package applicable to netCDF and HDF data. NCO includes extensive, fast, parallelized regridding features to facilitate analysis and intercomparison of SLD and model data. The remote sensing, weather and climate modeling, and analysis communities face similar problems in handling SLD, including how to easily: 1. Specify and mask irregular regions such as ocean basins and political boundaries in SLD (and rectangular) grids. 2. Bin, interpolate, average, or re-map SLD to regular grids. 3. Derive secondary data from given quality levels of SLD. These common tasks require a data extraction and analysis toolkit that is SLD-friendly and, like NCO, familiar in all these communities. With NCO, users can 1. Quickly project SLD onto the most useful regular grids for intercomparison. 2. Access sophisticated statistical and regridding functions that are robust to missing data and allow easy specification of quality control metrics. These capabilities improve interoperability and software reuse, and, because they apply to SLD, minimize transmission, storage, and handling of unwanted data. While SLD analysis still poses many challenges compared to regularly gridded, rectangular data, the custom analysis scripts SLD once required are now shorter, more powerful, and user-friendly.
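For example, extracting a coordinate-range subset with NCO's ncks operator can be scripted as follows; the file names are placeholders, while the -d dimension,min,max syntax is NCO's real hyperslab option.

    import subprocess

    # Subset a lat/lon window from a netCDF file using the NCO "ncks" operator.
    subprocess.run(
        ["ncks", "-d", "lat,30.0,40.0", "-d", "lon,-120.0,-100.0",
         "swath_in.nc", "subset_out.nc"],
        check=True,
    )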
NASA Astrophysics Data System (ADS)
Ward-Garrison, C.; May, R.; Davis, E.; Arms, S. C.
2016-12-01
NetCDF is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. The Climate and Forecasting (CF) metadata conventions for netCDF foster the ability to work with netCDF files in general and useful ways. These conventions include metadata attributes for physical units, standard names, and spatial coordinate systems. While these conventions have been successful in easing the use of working with netCDF-formatted output from climate and forecast models, their use for point-based observation data has been less so. Unidata has prototyped using the discrete sampling geometry (DSG) CF conventions to serve, using the THREDDS Data Server, the real-time point observation data flowing across the Internet Data Distribution (IDD). These data originate in text format reports for individual stations (e.g. METAR surface data or TEMP upper air data) and are converted and stored in netCDF files in real-time. This work discusses the experiences and challenges of using the current CF DSG conventions for storing such real-time data. We also test how parts of netCDF's extended data model can address these challenges, in order to inform decisions for a future version of CF (CF 2.0) that would take advantage of features of the netCDF enhanced data model.
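A minimal Python sketch of a CF discrete-sampling-geometry timeSeries file in the orthogonal multidimensional representation; station identifiers, coordinates, and values are made up.

    import numpy as np
    from netCDF4 import Dataset

    ds = Dataset("metar_obs.nc", "w")
    ds.Conventions = "CF-1.6"
    ds.featureType = "timeSeries"                 # CF DSG feature type
    ds.createDimension("station", 2)
    ds.createDimension("time", 4)

    sid = ds.createVariable("station_id", str, ("station",))
    sid.cf_role = "timeseries_id"                 # identifies each feature
    lat = ds.createVariable("lat", "f4", ("station",))
    lon = ds.createVariable("lon", "f4", ("station",))
    t = ds.createVariable("time", "f8", ("time",))
    t.units = "hours since 2016-01-01 00:00:00"
    temp = ds.createVariable("air_temperature", "f4", ("station", "time"))
    temp.standard_name = "air_temperature"
    temp.units = "K"
    temp.coordinates = "lat lon"

    sid[0], sid[1] = "KDEN", "KBOS"
    lat[:] = [39.86, 42.36]
    lon[:] = [-104.67, -71.01]
    t[:] = np.arange(4)
    temp[:] = 273.15 + np.random.rand(2, 4)
    ds.close()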
Facilitating preemptive hardware system design using partial reconfiguration techniques.
Dondo Gazzano, Julio; Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos
2014-01-01
In FPGA-based control system design, partial reconfiguration is especially well suited to implement preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Besides, an asynchronous event can demand immediate attention and then force launching a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation of the reconfiguration process is demanded. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and thus the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration. PMID:24672292
Transparent process migration: Design alternatives and the Sprite implementation
NASA Technical Reports Server (NTRS)
Douglis, Fred; Ousterhout, John
1991-01-01
The Sprite operating system allows executing processes to be moved between hosts at any time. We use this process migration mechanism to offload work onto idle machines, and also to evict migrated processes when idle workstations are reclaimed by their owners. Sprite's migration mechanism provides a high degree of transparency both for migrated processes and for users. Idle machines are identified, and eviction is invoked, automatically by daemon processes. On Sprite it takes up to a few hundred milliseconds on SPARCstation 1 workstations to perform a remote exec, while evictions typically occur in a few seconds. The pmake program uses remote invocation to invoke tasks concurrently. Compilations commonly obtain speedup factors in the range of three to six; they are limited primarily by contention for centralized resources such as file servers. CPU-bound tasks such as simulations can make more effective use of idle hosts, obtaining as much as eight-fold speedup over a period of hours. Process migration has been in regular service for over two years.
Distributed nuclear medicine applications using World Wide Web and Java technology.
Knoll, P; Höll, K; Mirzaei, S; Koriska, K; Köhn, H
2000-01-01
At present, medical applications applying World Wide Web (WWW) technology are mainly used to view static images and to retrieve some information. The Java platform is a relatively new approach to computing, especially designed for network computing and distributed applications, which enables interactive connection between user and information via the WWW. The Java 2 Software Development Kit (SDK), including the Java2D API, Java Remote Method Invocation (RMI) technology, Object Serialization, and the Java Advanced Imaging (JAI) extension, was used to achieve a robust, platform-independent and network-centric solution. Medical image processing software based on this technology is presented, and the adequate performance capability of Java is demonstrated by an iterative reconstruction algorithm for single photon emission computerized tomography (SPECT).
Cache Locality Optimization for Recursive Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lifflander, Jonathan; Krishnamoorthy, Sriram
We present an approach to optimize the cache locality for recursive programs by dynamically splicing--recursively interleaving--the execution of distinct function invocations. By utilizing data effect annotations, we identify concurrency and data reuse opportunities across function invocations and interleave them to reduce reuse distance. We present algorithms that efficiently track effects in recursive programs, detect interference and dependencies, and interleave execution of function invocations using user-level (non-kernel) lightweight threads. To enable multi-core execution, a program is parallelized using a nested fork/join programming model. Our cache optimization strategy is designed to work in the context of a random work stealing scheduler. We present an implementation using the MIT Cilk framework that demonstrates significant improvements in sequential and parallel performance, competitive with a state-of-the-art compile-time optimizer for loop programs and a domain-specific optimizer for stencil programs.
Automatic Invocation Linking for Collaborative Web-Based Corpora
NASA Astrophysics Data System (ADS)
Gardner, James; Krowne, Aaron; Xiong, Li
Collaborative online encyclopedias or knowledge bases such as Wikipedia and PlanetMath are becoming increasingly popular because of their open access, comprehensive and interlinked content, rapid and continual updates, and community interactivity. To understand a particular concept in these knowledge bases, a reader needs to learn about related and underlying concepts. In this chapter, we introduce the problem of invocation linking for collaborative encyclopedia or knowledge bases, review the state of the art for invocation linking including the popular linking system of Wikipedia, discuss the problems and challenges of automatic linking, and present the NNexus approach, an abstraction and generalization of the automatic linking system used by PlanetMath.org. The chapter emphasizes both research problems and practical design issues through discussion of real world scenarios and hence is suitable for both researchers in web intelligence and practitioners looking to adopt the techniques. Below is a brief outline of the chapter.
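A naive Python sketch of automatic invocation linking: link the first occurrence of each known concept label to its entry URL. The concept map is hypothetical, and NNexus's actual linking policy (ontology-aware, per-corpus) is considerably more sophisticated.

    import re

    # Hypothetical concept-to-entry map; a real system derives this from the corpus.
    CONCEPT_MAP = {
        "metric space": "https://planetmath.org/metricspace",
        "topology": "https://planetmath.org/topology",
    }

    def autolink(text: str) -> str:
        """Link the first occurrence of each concept label, longest labels first."""
        for label, url in sorted(CONCEPT_MAP.items(), key=lambda kv: -len(kv[0])):
            pattern = re.compile(r"\b" + re.escape(label) + r"\b", re.IGNORECASE)
            text = pattern.sub(
                lambda m: '<a href="' + url + '">' + m.group(0) + "</a>",
                text, count=1)
        return text

    print(autolink("A metric space induces a topology on its underlying set."))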
Performance of the Heavy Flavor Tracker (HFT) detector in the STAR experiment at RHIC
NASA Astrophysics Data System (ADS)
Alruwaili, Manal
With the growing technology, the number of processors is becoming massive. Current supercomputer processing will be available on desktops in the next decade. For mass-scale application software development on the massive parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massive parallel computing and distributed memory models while retaining user-friendliness. Currently available object oriented languages for massive parallel computing, such as Chapel, X10 and UPC++, exploit distributed computing, data parallel computing and thread-parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) any extension for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; and 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object migration, and object cloning; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work concurrently on different elements of distributed data using remote method invocations. I present the new constructs, their grammar, and their behavior.
A Study of NetCDF as an Approach for High Performance Medical Image Storage
NASA Astrophysics Data System (ADS)
Magnus, Marcone; Coelho Prado, Thiago; von Wangenhein, Aldo; de Macedo, Douglas D. J.; Dantas, M. A. R.
2012-02-01
The spread of telemedicine systems increases every day, and systems and PACS based on DICOM images have become common. This rise reflects the need to develop new storage systems that are more efficient and have lower computational costs. With this in mind, this article discusses a study of the NetCDF data format as the basic platform for storage of DICOM images. The case study compares an ordinary database, HDF5, and NetCDF for storing the medical images. Empirical results, using a real set of images, indicate that the time to retrieve large-scale images from NetCDF has a higher latency compared to the other two methods. In addition, the latency is proportional to the file size, which represents a drawback for a telemedicine system that is characterized by a large number of large image files.
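A simple Python sketch of the kind of measurement behind such a comparison: write an image series into a NetCDF file and time its retrieval. Array sizes, names, and compression settings are arbitrary choices, not those of the study.

    import time
    import numpy as np
    from netCDF4 import Dataset

    # A made-up 16-bit image series standing in for DICOM pixel data.
    frames = np.random.randint(0, 4096, size=(32, 512, 512), dtype=np.uint16)

    ds = Dataset("images.nc", "w")
    ds.createDimension("frame", frames.shape[0])
    ds.createDimension("row", frames.shape[1])
    ds.createDimension("col", frames.shape[2])
    var = ds.createVariable("pixel_data", "u2", ("frame", "row", "col"), zlib=True)
    var[:] = frames
    ds.close()

    start = time.perf_counter()
    with Dataset("images.nc") as ds:
        _ = ds.variables["pixel_data"][:]     # full retrieval
    print(f"retrieval latency: {time.perf_counter() - start:.3f} s")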
Displaying Composite and Archived Soundings in the Advanced Weather Interactive Processing System
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III; Volkmer, Matthew R.; Blottman, Peter F.; Sharp, David W.
2008-01-01
In a previous task, the Applied Meteorology Unit (AMU) developed spatial and temporal climatologies of lightning occurrence based on eight atmospheric flow regimes. The AMU created climatological, or composite, soundings of wind speed and direction, temperature, and dew point temperature at four rawinsonde observation stations at Jacksonville, Tampa, Miami, and Cape Canaveral Air Force Station, for each of the eight flow regimes. The composite soundings were delivered to the National Weather Service (NWS) Melbourne (MLB) office for display using the National version of the Skew-T Hodograph Analysis and Research Program (NSHARP) software. The NWS MLB requested the AMU make the composite soundings available for display in the Advanced Weather Interactive Processing System (AWIPS), so they could be overlaid on current observed soundings. This will allow the forecasters to compare the current state of the atmosphere with climatology. This presentation describes how the AMU converted the composite soundings from NSHARP Archive format to Network Common Data Form (NetCDF) format, so that the soundings could be displayed in AWIPS. NetCDF is a set of data formats, programming interfaces, and software libraries used to read and write scientific data files. In AWIPS, each meteorological data type, such as soundings or surface observations, has a unique NetCDF format. Each format is described by a NetCDF template file. Although NetCDF files are in binary format, they can be converted to a text format called network Common data form Description Language (CDL). A software utility called ncgen is used to create a NetCDF file from a CDL file, while the ncdump utility is used to create a CDL file from a NetCDF file. AWIPS receives soundings in Binary Universal Form for the Representation of Meteorological data (BUFR) format (http://dss.ucar.edu/docs/formats/bufr/), and then decodes them into NetCDF format. Only two sounding files are generated in AWIPS per day. One file contains all of the soundings received worldwide between 0000 UTC and 1200 UTC, and the other includes all soundings between 1200 UTC and 0000 UTC. In order to add the composite soundings into AWIPS, a procedure was created to configure, or localize, AWIPS. This involved modifying and creating several configuration text files. A unique four-character site identifier was created for each of the 32 soundings so each could be viewed separately. The first three characters were based on the site identifier of the observed sounding, while the last character was based on the flow regime. While researching the localization process for soundings, the AMU discovered a method of archiving soundings so old soundings would not get purged automatically by AWIPS. This method could provide an alternative way of localizing AWIPS for composite soundings. In addition, this would allow forecasters to use archived soundings in AWIPS for case studies. A test sounding file in NetCDF format was written in order to verify the correct format for soundings in AWIPS. After the file was viewed successfully in AWIPS, the AMU wrote a software program in the Tool Command Language/Toolkit (Tcl/Tk) to convert the 32 composite soundings from NSHARP Archive to CDL format. The ncgen utility was then used to convert the CDL file to a NetCDF file. The NetCDF file could then be read and displayed in AWIPS.
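The CDL-to-NetCDF conversion step described above can be scripted around the standard Unidata utilities; a minimal Python wrapper (file names are placeholders):

    import subprocess

    # CDL text -> binary NetCDF, using the real Unidata "ncgen" utility.
    subprocess.run(["ncgen", "-o", "composite_sounding.nc", "composite_sounding.cdl"],
                   check=True)

    # The reverse direction (NetCDF -> CDL) uses "ncdump".
    cdl = subprocess.run(["ncdump", "composite_sounding.nc"],
                         capture_output=True, text=True, check=True).stdout
    print(cdl[:200])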
NASA Astrophysics Data System (ADS)
Li, J.; Zhang, T.; Huang, Q.; Liu, Q.
2014-12-01
Today's climate datasets are featured with large volume, a high degree of spatiotemporal complexity, and fast evolution over time. As visualizing large volumes of distributed climate data is computationally intensive, traditional desktop-based visualization applications fail to handle the computational intensity. Recently, scientists have developed remote visualization techniques to address the computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built based on ParaView, which is one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support the deployment of the platform. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access methods are supported: accessing remote datasets provided by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed at the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computation limitation of desktop-based visualization applications.
Moving from HDF4 to HDF5/netCDF-4
NASA Technical Reports Server (NTRS)
Pourmal, Elena; Yang, Kent; Lee, Joe
2017-01-01
In this presentation, we will go over the major differences between the two file formats and libraries, and will talk about the HDF5 features that users should consider when designing new products in HDF5/netCDF-4. We will also discuss the h4h5tools toolkit that can facilitate conversion of data in existing HDF4 files to HDF5 and netCDF-4, and we will engage the participants in a discussion of how The HDF Group can help with the transition and adoption of HDF5 and netCDF-4.
Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems
NASA Astrophysics Data System (ADS)
Igaki, Hiroshi; Nakamura, Masahide
This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
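A toy Python detector for the first kind of FI, assuming hypothetical service definitions: an appliance interaction is flagged when two services invoke methods that set the same appliance property to conflicting values.

    from itertools import product

    # Each service is a sequence of invocations: (appliance, method, (property, value)).
    # The service definitions below are made up for illustration.
    SERVICES = {
        "AirConditioning": [("aircon", "set_power", ("power", "on"))],
        "EnergySaving":    [("aircon", "set_power", ("power", "off"))],
    }

    def appliance_interactions(services):
        """Return (service1, service2, appliance, property) conflict tuples."""
        items = [(name, inv) for name, seq in services.items() for inv in seq]
        conflicts = []
        for (s1, i1), (s2, i2) in product(items, items):
            if s1 < s2 and i1[0] == i2[0]:            # same appliance
                (p1, v1), (p2, v2) = i1[2], i2[2]
                if p1 == p2 and v1 != v2:             # same property, conflicting values
                    conflicts.append((s1, s2, i1[0], p1))
        return conflicts

    print(appliance_interactions(SERVICES))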
Xu, Jingping; Lightsom, Fran; Noble, Marlene A.; Denham, Charles
2002-01-01
During the past several years, the sediment transport group in the Coastal and Marine Geology Program (CMGP) of the U.S. Geological Survey has made major revisions to its methodology of processing, analyzing, and maintaining the variety of oceanographic time-series data. First, CMGP completed the transition of its oceanographic time-series database to a self-documenting NetCDF (Rew et al., 1997) data format. Second, CMGP's oceanographic data variety and complexity have been greatly expanded from traditional 2-dimensional, single-point time-series measurements (e.g., electromagnetic current meters, transmissometers) to more advanced 3-dimensional and profiling time-series measurements, due to many new acquisitions of modern instruments such as the Acoustic Doppler Current Profiler (RDI, 1996), Acoustic Doppler Velocimeter, Pulse-Coherent Acoustic Doppler Profiler (SonTek, 2001), and Acoustic Backscatter Sensor (Aquatec). In order to accommodate the NetCDF format of data from the new instruments, a software package for processing, analyzing, and visualizing time-series oceanographic data was developed. It is named CMGTooL. The CMGTooL package contains two basic components: a user-friendly GUI for NetCDF file analysis, processing and manipulation; and a data analyzing program library. Most of the routines in the library are stand-alone programs suitable for batch processing. CMGTooL is written in the MATLAB computing language (The Mathworks, 1997); therefore, users must have MATLAB installed on their computer in order to use this software package. In addition, MATLAB's Signal Processing Toolbox is required by some of CMGTooL's routines. Like most MATLAB programs, all CMGTooL codes are compatible with different computing platforms including PC, MAC, and UNIX machines (note: CMGTooL has been tested on platforms that run MATLAB 5.2 (Release 10) or lower versions; some of the commands related to MAC may not be compatible with later releases of MATLAB). The GUI and some of the library routines call low-level NetCDF file I/O, variable and attribute functions. These NetCDF-exclusive functions are supported by a MATLAB toolbox named NetCDF, created by Dr. Charles Denham. This toolbox has to be installed in order to use the CMGTooL GUI. The CMGTooL GUI calls several routines that were initially developed by others. The authors would like to acknowledge the following scientists for their ideas and codes: Dr. Rich Signell (USGS), Dr. Chris Sherwood (USGS), and Dr. Bob Beardsley (WHOI). Many special terms that carry special meanings in either MATLAB or the NetCDF Toolbox are used in this manual. Users are encouraged to read the documents of MATLAB and NetCDF for reference.
Web Program for Development of GUIs for Cluster Computers
NASA Technical Reports Server (NTRS)
Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward
2003-01-01
WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.
NetCDF-U - Uncertainty conventions for netCDF datasets
NASA Astrophysics Data System (ADS)
Bigagli, Lorenzo; Nativi, Stefano; Domenico, Ben
2013-04-01
To facilitate the automated processing of uncertain data (e.g. uncertainty propagation in modeling applications), we have proposed a set of conventions for expressing uncertainty information within the netCDF data model and format: the NetCDF Uncertainty Conventions (NetCDF-U). From a theoretical perspective, it can be said that no dataset is a perfect representation of the reality it purports to represent. Inevitably, errors arise from the observation process, including the sensor system and subsequent processing, differences in scales of phenomena and the spatial support of the observation mechanism, and lack of knowledge about the detailed conversion between the measured quantity and the target variable. This means that, in principle, all data should be treated as uncertain. The most natural representation of an uncertain quantity is in terms of random variables, with a probabilistic approach. However, it must be acknowledged that almost all existing data resources are not treated in this way. Most datasets come simply as a series of values, often without any uncertainty information. If uncertainty information is present, then it is typically within the metadata, as a data quality element. This is typically a global (dataset-wide) representation of uncertainty, often derived through some form of validation process. Typically, it is a statistical measure of spread, for example the standard deviation of the residuals. The introduction of a mechanism by which such descriptions of uncertainty can be integrated into existing geospatial applications is considered a practical step towards a more accurate modeling of our uncertain understanding of any natural process. Given the generality and flexibility of the netCDF data model, conventions on naming, syntax, and semantics have been adopted by several communities of practice, as a means of improving data interoperability. Some of the existing conventions include provisions on uncertain elements and concepts, but, to our knowledge, no general convention on the encoding of uncertainty has been proposed to date. In particular, the netCDF Climate and Forecast Conventions (NetCDF-CF), a de-facto standard for a large amount of data in Fluid Earth Sciences, mention the issue and provide limited support for uncertainty representation. NetCDF-U is designed to be fully compatible with NetCDF-CF, where possible adopting the same mechanisms (e.g. using the same attribute names with compatible semantics). The rationale for this is that a probabilistic description of scientific quantities is a crosscutting aspect, which may be modularized (note that a netCDF dataset may be compliant with more than one convention). The scope of NetCDF-U is to extend and qualify the netCDF classic data model (also known as netCDF3), to capture the uncertainty related to geospatial information encoded in that format. In the future, a netCDF4 approach for uncertainty encoding will be investigated. The NetCDF-U Conventions have the following rationale:
• Compatibility with netCDF-CF Conventions 1.5.
• Human-readability of conforming datasets' structure.
• Minimal difference between certain/agnostic and uncertain representations of data (e.g. with respect to dataset structure).
NetCDF-U is based on a generic mechanism for annotating netCDF data variables with probability theory semantics. The Uncertainty Markup Language (UncertML) 2.0 is used as a controlled conceptual model and vocabulary for NetCDF-U annotations.
The proposed mechanism anticipates generalized support for semantic annotations in netCDF. NetCDF-U defines syntactical conventions for encoding samples, summary statistics, and distributions, along with mechanisms for expressing dependency relationships among variables. The conventions were accepted as an Open Geospatial Consortium (OGC) Discussion Paper (OGC 11-163); related discussions are conducted on a public forum hosted by the OGC. NetCDF-U may have implications for future work directed at communicating geospatial data provenance and uncertainty in contexts other than netCDF. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
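To make the annotation mechanism concrete, the following minimal Python sketch writes a NetCDF-U-style file; the attribute names (ref, ancillary_variables) and UncertML URIs follow our reading of OGC 11-163 and should be treated as illustrative rather than normative:

import numpy as np
from netCDF4 import Dataset

# Attribute names and UncertML URIs are illustrative, not normative.
ds = Dataset("uncertain_temp.nc", "w", format="NETCDF3_CLASSIC")
ds.Conventions = "CF-1.5 NetCDF-U-1.0"   # a dataset may follow several conventions
ds.createDimension("time", 4)

mean = ds.createVariable("temperature", "f4", ("time",))
mean.ref = "http://www.uncertml.org/statistics/mean"   # UncertML concept URI
mean.ancillary_variables = "temperature_variance"      # CF-compatible link

spread = ds.createVariable("temperature_variance", "f4", ("time",))
spread.ref = "http://www.uncertml.org/statistics/variance"

mean[:] = np.array([280.1, 281.0, 279.6, 280.4], dtype="f4")
spread[:] = np.array([0.25, 0.30, 0.22, 0.27], dtype="f4")
ds.close()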
NASA Astrophysics Data System (ADS)
Meertens, C.; Wier, S.; Ahern, T.; Casey, R.; Weertman, B.; Laughbon, C.
2008-12-01
UNAVCO and the IRIS DMC are data service partners for seismic visualization, particularly for hypocentral data and tomography. UNAVCO provides the GEON Integrated Data Viewer (IDV), an extension of the Unidata IDV: a free, interactive, research-level software display and analysis tool for data in 3D (latitude, longitude, depth) and 4D (with time), located on or inside the Earth. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data in the context of new remote and shared data sources. The GEON IDV supports data access from HTTP and FTP servers, OPeNDAP servers, THREDDS catalogs, RSS feeds, and WMS (web map) servers. The IRIS DMC (Data Management Center) has developed web services providing earthquake hypocentral data and seismic tomography model grids. These services can be called by the GEON IDV to access data at IRIS without copying files. The IRIS Earthquake Browser (IEB) is a web-based query tool for hypocentral data. The IEB combines the DMC's large database of more than 1,900,000 earthquakes with the Google Maps web interface. With the IEB, users can quickly find earthquakes in any region of the globe, select them by region, time, depth, and magnitude, and then import this information into the GEON Integrated Data Viewer, where the hypocenters may be visualized. The IEB gives the IDV a URL to the selected data. The IDV then shows the data as maps or 3D displays, with interactive control of vertical scale, area, and map projection, and with symbol size and color controlled by magnitude or depth. The IDV can show progressive time animation of, for example, aftershocks filling a source region. The IRIS Tomoserver converts seismic tomography model output grids to NetCDF for use in the IDV. The Tomoserver accepts a tomographic model file as input from a user and provides an equivalent NetCDF file as output. The service supports NA04, S3D, A1D and CUB input file formats, contributed by their respective creators. The NetCDF file is saved to a location on an IRIS server that can be referenced with a URL, which is provided to the user. The user can download the data from IRIS, or copy the URL directly into the IDV, which will then access the data at IRIS. The Tomoserver conversion software was developed by Instrumental Software Technologies, Inc. Use cases with the GEON IDV and IRIS DMC data services will be shown.
Aspect-Oriented Programming is Quantification and Implicit Invocation
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)
2001-01-01
We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are meta-AOP: they are sufficiently expressive to allow straightforward programming of an AOP system within them.
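As a rough illustration of quantification plus implicit invocation (not the authors' formal model), the following Python sketch registers an assertion over all functions whose names match a pattern; the base function carries no local notation naming the advice beyond the generic weaving step:

import fnmatch

_aspects = []   # (name_pattern, advice) pairs: the quantified assertions

def around(pattern):
    def register(advice):
        _aspects.append((pattern, advice))
        return advice
    return register

def weave(fn):
    # In a full AOP system weaving is global and implicit; wrapping each
    # base function explicitly here keeps the sketch short.
    def wrapped(*args, **kwargs):
        for pattern, advice in _aspects:
            if fnmatch.fnmatch(fn.__name__, pattern):
                advice(fn.__name__, args)
        return fn(*args, **kwargs)
    return wrapped

@around("get_*")                 # quantifies over all matching functions
def trace(name, args):
    print(f"calling {name}{args}")

@weave
def get_data(key):               # no local notation mentioning 'trace'
    return {"key": key}

get_data("x")                    # prints "calling get_data('x',)" first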
A Prototype Web-based system for GOES-R Space Weather Data
NASA Astrophysics Data System (ADS)
Sundaravel, A.; Wilkinson, D. C.
2010-12-01
The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. The data from the satellite will be widely used by scientists for space weather modeling and predictions. This project investigates how these datasets can be made available to scientists on the Web and how to assist them in their research. We are developing a prototype web-based system that allows users to browse, search and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) formats. NetCDF is a self-describing data format that contains both the metadata information and the data, stored in an array-oriented fashion. The web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which the data is available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions which can be used by other applications to fetch data and use it for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used to develop the tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services. Grails is an open source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed software tools to generate NetCDF and CSV data files, as well as tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
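A minimal sketch of the NetCDF-to-CSV translation described above; the project's tools are written in Java, so this Python version is purely illustrative, and the file and parameter names are hypothetical:

import csv
from netCDF4 import Dataset, num2date

# File and parameter names are hypothetical; a 1D time series is assumed.
with Dataset("goesr_proxy_xrs.nc") as ds, \
        open("goesr_proxy_xrs.csv", "w", newline="") as out:
    t = ds.variables["time"]
    flux = ds.variables["xrs_flux"]
    times = num2date(t[:], units=t.units)
    writer = csv.writer(out)
    writer.writerow(["time", "xrs_flux"])
    for when, value in zip(times, flux[:]):
        writer.writerow([when.isoformat(), float(value)])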
A NetCDF version of the two-dimensional energy balance model based on the full multigrid algorithm
NASA Astrophysics Data System (ADS)
Zhuang, Kelin; North, Gerald R.; Stevens, Mark J.
A NetCDF version of the two-dimensional energy balance model based on the full multigrid method in Fortran is introduced for both pedagogical and research purposes. Based on the land-sea-ice distribution, orbital elements, greenhouse gas concentrations, and albedo, the code calculates the global seasonal surface temperature. A step-by-step guide with examples is provided for practice.
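As an example of the kind of exercise the guide targets, a short Python sketch reads the model's NetCDF output and computes an area-weighted global mean per seasonal time step; the file and variable names here are hypothetical:

import numpy as np
from netCDF4 import Dataset

# File and variable names are hypothetical; the model's actual output
# layout may differ.
with Dataset("ebm_output.nc") as ds:
    temp = ds.variables["tsurf"][:]      # assumed shape (time, lat, lon)
    lat = ds.variables["lat"][:]

# Area weighting by cos(latitude) on a regular lat-lon grid.
w = np.cos(np.deg2rad(lat))[None, :, None]
global_mean = (temp * w).sum(axis=(1, 2)) / (w.sum() * temp.shape[2])
print(global_mean)                       # one value per seasonal time step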
NASA Astrophysics Data System (ADS)
Hassell, David; Gregory, Jonathan; Blower, Jon; Lawrence, Bryan N.; Taylor, Karl E.
2017-12-01
The CF (Climate and Forecast) metadata conventions are designed to promote the creation, processing, and sharing of climate and forecasting data using Network Common Data Form (netCDF) files and libraries. The CF conventions provide a description of the physical meaning of data and of their spatial and temporal properties, but they depend on the netCDF file encoding which can currently only be fully understood and interpreted by someone familiar with the rules and relationships specified in the conventions documentation. To aid in development of CF-compliant software and to capture with a minimal set of elements all of the information contained in the CF conventions, we propose a formal data model for CF which is independent of netCDF and describes all possible CF-compliant data. Because such data will often be analysed and visualised using software based on other data models, we compare our CF data model with the ISO 19123 coverage model, the Open Geospatial Consortium CF netCDF standard, and the Unidata Common Data Model. To demonstrate that this CF data model can in fact be implemented, we present cf-python, a Python software library that conforms to the model and can manipulate any CF-compliant dataset.
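A small usage sketch of cf-python (cf.read, subspace and cf.write follow the library's documented interface; the file name is hypothetical):

import cf

fields = cf.read("air_temperature.nc")     # a list of CF field constructs
f = fields[0]
print(f)                                   # CF-style summary of the field
tropics = f.subspace(latitude=cf.wi(-30, 30))   # subset by coordinate value
cf.write(tropics, "air_temperature_tropics.nc")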
Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF
NASA Astrophysics Data System (ADS)
Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.
2014-12-01
Climate observations and model simulations are collecting and generating vast amounts of climate data, and these data are accumulating at an ever-increasing speed. Effectively managing and analyzing these data is essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention for dealing with the Big Data challenge, and the maturity of Infrastructure as a Service (IaaS) cloud computing further accelerates the adoption of Hadoop in solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system to enable parallel big data processing with MapReduce, as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing the data, converting it into other file formats, or loading it into HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.
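The following Python sketch is only a single-machine stand-in for the proposed middleware, but it illustrates the second design point: map tasks read time slices of a shared NetCDF file directly from a POSIX file system, with no format conversion or load into HDFS, and a reduce step merges the partial results (file and variable names are hypothetical):

import numpy as np
from netCDF4 import Dataset
from multiprocessing import Pool

FILE, VAR = "climate_5d.nc", "tas"     # hypothetical shared file and variable

def partial_sum(bounds):
    start, stop = bounds
    with Dataset(FILE) as ds:          # each map task opens the shared file
        chunk = ds.variables[VAR][start:stop]
    return chunk.sum(axis=0), stop - start

if __name__ == "__main__":
    with Dataset(FILE) as ds:
        nt = len(ds.dimensions["time"])
    splits = [(i, min(i + 100, nt)) for i in range(0, nt, 100)]
    with Pool() as pool:
        partials = pool.map(partial_sum, splits)     # "map" phase
    total = sum(s for s, _ in partials)              # "reduce" phase
    print(total / sum(n for _, n in partials))       # per-cell time mean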
Autoplot: a Browser for Science Data on the Web
NASA Astrophysics Data System (ADS)
Faden, J.; Weigel, R. S.; West, E. E.; Merka, J.
2008-12-01
Autoplot (www.autoplot.org) is software for plotting data from many different sources and in many different file formats. Data from CDF, CEF, FITS, NetCDF, and OpenDAP can be plotted, along with many other sources such as ASCII tables and Excel spreadsheets. This is done by adapting these various data formats and APIs into a common data model that borrows from the netCDF and CDF data models. Autoplot uses a web browser metaphor to simplify use. The user specifies a parameter URL, for example a CDF file accessible via http with a parameter name appended, and the file resource is downloaded and the parameter is rendered in a scientifically meaningful way. When data span multiple files, the user can use a file name template in the URL to aggregate (combine) a set of remote files, so the problem of aggregating data across file boundaries is handled on the client side, allowing simple web servers to be used. The das2 graphics library provides rich controls for exploring the data. Scripting is supported through Python, providing not just programmatic control but also the ability to calculate new parameters in a language that will look familiar to IDL and Matlab users. Autoplot is Java-based software and will run on most computers without a burdensome installation process. It can also be used as an applet or as a servlet that serves static images. Autoplot was developed as part of the Virtual Radiation Belt Observatory (ViRBO) project, and is also being used for the Virtual Magnetospheric Observatory (VMO). It is expected that this flexible, general-purpose plotting tool will be useful for allowing a data provider to add instant visualization capabilities to a directory of files, and for general use in the Virtual Observatory environment.
HYDRA: Hyperspectral Data Research Application
NASA Astrophysics Data System (ADS)
Rink, T.; Whittaker, T.
2005-12-01
HYDRA is a freely available, easy to install tool for visualization and analysis of large local or remote hyper/multi-spectral datasets. HYDRA is implemented on top of the open source VisAD Java library via Jython, the Java implementation of the user-friendly Python programming language. VisAD provides data integration through its generalized data model, as well as user-display interaction and display rendering. Jython has an easy-to-read, concise, scripting-like syntax which eases software development. HYDRA allows sharing of large datasets through its support of the OpenDAP and OpenADDE server-client protocols. Users can explore and interrogate data, and subset in physical and/or spectral space to isolate key areas of interest for further analysis, without having to download an entire dataset. It also has an extensible data input architecture to recognize new instruments and understand different local file formats; currently NetCDF and HDF4 are supported.
Interfacing with Legacy using Remote Method Invocation
NASA Technical Reports Server (NTRS)
Howard, Scott M.
1998-01-01
The assignment described was enough to make a neophyte Java developer bolt for the door: provide a remote method for use by an applet which invokes a native method that wraps a function in an existing legacy library. The purpose of the remote method is to return an instance of a class object whose contents reflect the data structure returned by the legacy function. While embroiled in implementation, I would have spent time wading through a JNI user group archive as well, but I couldn't seem to locate one. Subsequently, I decided to document my findings in order to assist others. Before we start on the class design, let's look at what the existing legacy code does. The C function to be called, Get_Legacy_Data, consists of two steps: an ASCII file is read from the local disk and its contents are parsed into a Legacy_Type structure whose address is passed as an argument by the caller. The legacy code was compiled into a shared object library, legacy.so, using the IRIX 6.2 compiler and then loaded onto the Web server, a Silicon Graphics Indy station running the IRIX 6.4 operating system. As far as the class design is concerned, the first thing required is a class to act as a template for the data structure returned by the legacy function. This class, JLegacy, declares a series of public instance variables which correspond to the members of Legacy_Type and provides a parameterless constructor. This constructor is never called, not even by the native method which allocates the object for return to the remote method. Next, the remote interface declaration for the remote object must be defined. In order for JLegacyRO to implement getJLegacy, JLegacyRO must interface with the existing legacy code through a native method, getn. getn is declared in the JLegacyRO class but implemented in C, just like the legacy code. getn returns a JLegacy instance and is declared static since its implementation is the same for all instances of the JLegacyRO class.
Summary of ADTT Website Functionality and Features
NASA Technical Reports Server (NTRS)
Hawke, Veronica; Duong, Trang; Liang, Lawrence; Gage, Peter; Lawrence, Scott (Technical Monitor)
2001-01-01
This report summarizes development of the ADTT web-based design environment by the ELORET team in 2000. The Advanced Design Technology Testbed had been in development for several years, with demonstration applications restricted to aerodynamic analyses of subsonic aircraft. The key changes achieved this year were improvements in Web-based accessibility, evaluation of collaborative visualization, remote invocation of geometry updates and performance analysis, and application to aerospace system analysis. Significant effort was also devoted to post-processing of data, chiefly through comparison of similar data for alternative vehicle concepts. Such comparison is an essential requirement for designers to make informed choices between alternatives. The next section of this report provides more discussion of the goals for ADTT development. Section 3 provides screen shots from a sample session in the ADTT environment, including Login and navigation to the project of interest, data inspection, analysis execution and output evaluation. The following section provides discussion of implementation details and recommendations for future development of the software and information technologies that provide the key functionality of the ADTT system. Section 5 discusses the integration architecture for the system, which links machines running different operating systems and provides unified access to data stored in distributed locations. Security is a significant issue for this system, especially for remote access to NAS machines, so Section 6 discusses several architectural considerations with respect to security. Additional details of some aspects of ADTT development are included in Appendices.
Enabling Flexible and Continuous Capability Invocation in Mobile Prosumer Environments
Alcarria, Ramon; Robles, Tomas; Morales, Augusto; López-de-Ipiña, Diego; Aguilera, Unai
2012-01-01
Mobile prosumer environments require the communication with heterogeneous devices during the execution of mobile services. These environments integrate sensors, actuators and smart devices, whose availability continuously changes. The aim of this paper is to design a reference architecture for implementing a model for continuous service execution and access to capabilities, i.e., the functionalities provided by these devices. The defined architecture follows a set of software engineering patterns and includes some communication paradigms to cope with the heterogeneity of sensors, actuators, controllers and other devices in the environment. In addition, we stress the importance of the flexibility in capability invocation by allowing the communication middleware to select the access technology and change the communication paradigm when dealing with smart devices, and by describing and evaluating two algorithms for resource access management. PMID:23012526
Development of a prototype multi-processing interactive software invocation system
NASA Technical Reports Server (NTRS)
Berman, W. J.
1983-01-01
The Interactive Software Invocation System (NASA-ISIS) was first transported to the M68000 microcomputer and then rewritten in the programming language Path Pascal. Path Pascal is a significantly enhanced derivative of Pascal, allowing concurrent algorithms to be expressed using the simple and elegant concept of Path Expressions. The primary result of this contract was to verify the viability of Path Pascal as a systems development language. The NASA-ISIS implementation is a prototype of a large, interactive system in Path Pascal. As such, it is an excellent demonstration of the feasibility of using Path Pascal to write even more extensive systems. It is hoped that future efforts will build upon this research and, ultimately, that a full Path Pascal/ISIS Operating System (PPIOS) might be developed.
Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe
NASA Astrophysics Data System (ADS)
Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun
2013-04-01
The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. Interrogating and analyzing these large datasets in real time pushes the boundaries of computing hardware and software. Moreover, the integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap: it allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued development of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation; at the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and the THREDDS Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.
2017-08-01
This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF) [Network Common Data Form (NetCDF). UCAR/Unidata Program Center, Boulder, CO. Available at: http://www.unidata.ucar.edu/software/netcdf. Accessed on 6/20…]. […] emissions scenarios diverge from each other regarding fossil fuel use, technology, and other socioeconomic factors. As a result, the estimated emissions for each […]
Demonstrating NaradaBrokering as a Middleware Fabric for Grid-based Remote Visualization Services
NASA Astrophysics Data System (ADS)
Pallickara, S.; Erlebacher, G.; Yuen, D.; Fox, G.; Pierce, M.
2003-12-01
Remote Visualization Services (RVS) have tended to rely on approaches based on the client-server paradigm. Here we demonstrate our approach, based on a distributed brokering infrastructure, NaradaBrokering [1], which relies on distributed, asynchronous and loosely coupled interactions to meet the requirements and constraints of RVS. In our approach to RVS, services advertise their capabilities to the broker network, which manages these service advertisements. Among the services considered within our system are those that perform graphic transformations, those that mediate access to specialized datasets, and finally those that manage the execution of specified tasks. There can be multiple instances of each of these services, and the system ensures that the load for a given service is distributed efficiently over these service instances. We will demonstrate implementations of the concepts that we outlined in the oral presentation. This involves two or more visualization servers interacting asynchronously with multiple clients through NaradaBrokering. The communicating entities may exchange SOAP [2] (Simple Object Access Protocol) messages. SOAP is a lightweight protocol for the exchange of information in a decentralized, distributed environment. It is an XML-based protocol that consists of three parts: an envelope that describes what is in a message and how to process it, rules for expressing instances of application-defined data types, and a convention for representing remote-invocation-related operations. Furthermore, we will also demonstrate how clients can retrieve their results after prolonged disconnects or after any failures that might have taken place. The entities, services and clients alike, are not limited by the geographical distances that separate them. We are planning to test this system in the context of trans-Atlantic links separating the interacting entities. [1] The NaradaBrokering Project: http://www.naradabrokering.org [2] Newcomer, E., 2002, Understanding Web Services: XML, WSDL, SOAP, and UDDI, Addison Wesley Professional.
Emerald: an object-based language for distributed programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, N.C.
1987-01-01
Distributed systems have become more common; however, constructing distributed applications remains a very difficult task. Numerous operating systems and programming languages have been proposed that attempt to simplify the programming of distributed applications. Here a programming language called Emerald is presented that simplifies distributed programming by extending the concepts of object-based languages to the distributed environment. Emerald supports a single model of computation: the object. Emerald objects include private entities such as integers and Booleans, as well as shared, distributed entities such as compilers, directories, and entire file systems. Emerald objects may move between machines in the system, but object invocation is location independent. The uniform semantic model used for describing all Emerald objects makes the construction of distributed applications in Emerald much simpler than in systems where the differences in implementation between local and remote entities are visible in the language semantics. Emerald incorporates a type system that deals only with the specification of objects, ignoring differences in implementation. Thus, two different implementations of the same abstraction may be freely mixed.
NASA Astrophysics Data System (ADS)
Schweitzer, R. H.
2001-05-01
The Climate Diagnostics Center maintains a collection of gridded climate data primarily for use by local researchers. Because this data is available on fast digital storage and because it has been converted to netCDF using a standard metadata convention (called COARDS), we recognize that this data collection is also useful to the community at large. At CDC we try to use technology and metadata standards to reduce our costs associated with making these data available to the public. The World Wide Web has been an excellent technology platform for meeting that goal. Specifically, we have developed Web-based user interfaces that allow users to search, plot and download subsets from the data collection. We have also been exploring use of the Pacific Marine Environmental Laboratory's Live Access Server (LAS) as an engine for this task. This would result in further savings by allowing us to concentrate on customizing the LAS where needed, rather than developing and maintaining our own system. One such customization currently under development is the use of Java Servlets and JavaServer Pages in conjunction with a metadata database to produce a hierarchical user interface to LAS. In addition to these Web-based user interfaces, all of our data are available via the Distributed Oceanographic Data System (DODS). This allows other sites using LAS and individuals using DODS-enabled clients to use our data as if it were a local file. All of these technology systems are driven by metadata. When we began to create netCDF files, we collaborated with several other agencies to develop a netCDF convention (COARDS) for metadata. At CDC we have extended that convention to incorporate additional metadata elements to make the netCDF files as self-describing as possible. Part of the local metadata is a set of controlled names for the variable, the level in the atmosphere or ocean, the statistic, and the data set of each netCDF file. To allow searching and easy reorganization of these metadata, we loaded the metadata from the netCDF files into a MySQL database. The combination of the MySQL database and the controlled names makes it possible to automate the construction of user interfaces and standard-format metadata descriptions, such as the Federal Geographic Data Committee (FGDC) and Directory Interchange Format (DIF). These standard descriptions also include an association between our controlled names and standard keywords such as those developed by the Global Change Master Directory (GCMD). This talk will give an overview of each of these technologies and metadata standards as they apply to work at the Climate Diagnostics Center, and will discuss the pros and cons of each approach along with areas for future development.
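A hedged sketch of the metadata-harvesting step: netCDF attributes are read and loaded into a relational table (sqlite3 stands in here for the MySQL database used at CDC, and the controlled-name attributes are hypothetical):

import sqlite3
from netCDF4 import Dataset

# sqlite3 stands in for MySQL; the attribute names queried with getattr()
# are hypothetical controlled names.
conn = sqlite3.connect("metadata.db")
conn.execute("""CREATE TABLE IF NOT EXISTS nc_meta
                (path TEXT, variable TEXT, level TEXT,
                 statistic TEXT, dataset TEXT)""")

path = "air.mon.mean.nc"
with Dataset(path) as ds:
    for name, var in ds.variables.items():
        conn.execute("INSERT INTO nc_meta VALUES (?, ?, ?, ?, ?)",
                     (path, name,
                      getattr(var, "level_desc", ""),
                      getattr(var, "statistic", ""),
                      getattr(ds, "dataset_title", "")))
conn.commit()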
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Xing, Z.
2007-12-01
The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via the Simple Object Access Protocol (SOAP) or REST (one-line) URLs, as well as Grid computing standards (WS-* and Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (a tree of operators). The SciFlo client and server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible through OpenGIS Consortium (OGC) Web Mapping Servers and Web Coverage Servers (WMS/WCS), and through Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and for automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from FTP sites; invoke remote analysis operators available as SOAP services (with interfaces described by WSDL documents); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDF-EOS, and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine.
We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.
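A minimal sketch of the one-line-URL access pattern mentioned above: the netCDF4 Python library can open an OPeNDAP endpoint directly, so no local copy is needed (the URL and parameter name are hypothetical):

from netCDF4 import Dataset

# Hypothetical OPeNDAP endpoint and retrieved-parameter name.
url = "http://example.gov/opendap/airs/granule_2007.nc"
with Dataset(url) as ds:               # opened remotely, no local download
    h2o = ds.variables["H2O_MMR"]
    print(h2o.shape, getattr(h2o, "units", "unknown"))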
NASA Astrophysics Data System (ADS)
Ludwig, R.; Mauser, W.; Niemeyer, S.; Colgan, A.; Stolz, R.; Escher-Vetter, H.; Kuhn, M.; Reichstein, M.; Tenhunen, J.; Kraus, A.; Ludwig, M.; Barth, M.; Hennicker, R.
The GLOWA initiative (Global Change of the Water Cycle), funded by the German Federal Ministry of Education and Research (BMBF), has been established to address the manifold consequences of Global Change on regional water resources in a variety of catchment areas with different natural and cultural characteristics. Within this framework, the GLOWA-Danube project is dealing with the Upper Danube watershed as a representative mesoscale test site (~75,000 km²) for mountain-foreland regions in the temperate mid-latitudes. The principal objective is to identify, examine and develop new techniques of coupled distributed modelling for the integration of natural and socio-economic sciences. The transdisciplinary research in GLOWA-Danube is developing an integrated decision support system, called DANUBIA, to investigate the sustainability of future water use. GLOWA-Danube, which is scheduled for a total run-time of eight years to operationally implement and establish DANUBIA, comprises a university-based network of experts with water-related competence in the fields of engineering, natural and social sciences. Co-operation with a network of stakeholders in water resources management of the Upper Danube catchment ensures that practical issues and future problems in the water sector of the region can be addressed. In order to synthesize a common understanding between the project partners, a standardized notation of parameters and functions and a platform-independent structure of computational methods and interfaces has been established, making use of the Unified Modelling Language (UML), an industry standard for the structuring and co-ordination of large software development projects [Booch et al., The Unified Modelling Language User Guide, Addison-Wesley, Reading, 1999]. DANUBIA is object-oriented, spatially distributed and raster-based at its core. It applies the concept of “proxels” (process pixels) as its basic objects, which have different dimensions depending on the viewing scale and connect to their environment through fluxes. The present paper describes the hydrological viewpoint of GLOWA-Danube, its approach to model coupling and network-based communication, and its object-oriented techniques for simulating physical processes and interactions at the land surface. The mechanisms and technologies applied to communicate data and model parameters across the typical discipline borders are demonstrated from the perspective of the Landsurface object. It comprises the capabilities of interdependent expert models for energy exchange at various surface types, snowmelt, soil water movement, runoff formation and plant growth in a distributed Java-based modelling environment using remote method invocation [Pitt et al., Java.rmi: The Remote Method Invocation Guide, Addison Wesley Professional, Reading, 2001, p. 320]. The text summarizes the GLOWA-Danube concept and shows the state of an implemented DANUBIA prototype after completion of the first project year (2001).
Cultural Trauma and Christian Identity in the Late Medieval Heroic Epic, The Siege of Jerusalem.
DeMarco, Patricia A
2015-01-01
This essay examines scenes of violence in the late medieval poem The Siege of Jerusalem in order to reveal the ways in which trauma is used as the grounds upon which Christian/Jewish difference is established. In particular, I argue that this poem serves as an example of a widespread element in Christian chivalric identity, namely the need to manage the repetitive invocation of Christ's crucifixion (ritually repeated through liturgical and poetic invocation) as a means of asserting both the bodily and psychic integrity of the Christian subject in contrast to the violently abjected figure of the Jewish body. The failure of The Siege protagonist, Wespasian, to navigate the cultural trauma of the crucifixion is contrasted to the successful management of trauma by the martial hero, Tancred, in Tasso's epic, Gerusalemme Liberata, illustrating the range of imaginative possibilities for understanding trauma in pre-modern war literature.
A Mediator-Based Approach to Resolving Interface Heterogeneity of Web Services
NASA Astrophysics Data System (ADS)
Leitner, Philipp; Rosenberg, Florian; Michlmayr, Anton; Huber, Andreas; Dustdar, Schahram
In theory, service-oriented architectures are based on the idea of increasing flexibility in the selection of internal and external business partners using loosely-coupled services. However, in practice this flexibility is limited by the fact that partners need not only to provide the same service, but to do so via virtually the same interface in order to actually be easily interchangeable. Invocation-level mediation may be used to overcome this issue: by using mediation, interface differences can be resolved transparently at runtime. In this chapter we discuss the basic ideas of mediation, with a focus on interface-level mediation. We show how interface mediation is integrated into our dynamic Web service invocation framework DAIOS, and present three different mediation strategies: one based on structural message similarity, one based on semantically annotated WSDL, and one which is embedded into the VRESCo SOA runtime, a larger research project with explicit support for service mediation.
Sharing electronic structure and crystallographic data with ETSF_IO
NASA Astrophysics Data System (ADS)
Caliste, D.; Pouillon, Y.; Verstraete, M. J.; Olevano, V.; Gonze, X.
2008-11-01
We present a library of routines whose main goal is to read and write exchangeable files (NetCDF file format) storing electronic structure and crystallographic information. It is based on the specification agreed inside the European Theoretical Spectroscopy Facility (ETSF); accordingly, this library is nicknamed ETSF_IO. The purpose of this article is to give both an overview of the ETSF_IO library and a closer look at its usage. ETSF_IO is designed to be robust and easy to use, close to Fortran read and write routines. To facilitate its adoption, a complete documentation of the input and output arguments of the routines is available in the package, as well as six tutorials explaining in detail various possible uses of the library routines.
Catalogue identifier: AEBG_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBG_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU Lesser General Public License
No. of lines in distributed program, including test data, etc.: 63 156
No. of bytes in distributed program, including test data, etc.: 363 390
Distribution format: tar.gz
Programming language: Fortran 95
Computer: All systems with a Fortran 95 compiler
Operating system: All systems with a Fortran 95 compiler
Classification: 7.3, 8
External routines: NetCDF, http://www.unidata.ucar.edu/software/netcdf
Nature of problem: Store and exchange electronic structure data and crystallographic data independently of the computational platform, language and generating software.
Solution method: Implement a library based both on the NetCDF file format and an open specification (http://etsf.eu/index.php?page=standardization).
Collaboration tools and techniques for large model datasets
Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.
2008-01-01
In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. The approach is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ships or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters; no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
Secure web-based invocation of large-scale plasma simulation codes
NASA Astrophysics Data System (ADS)
Dimitrov, D. A.; Busby, R.; Exby, J.; Bruhwiler, D. L.; Cary, J. R.
2004-12-01
We present our design and initial implementation of a web-based system for running, both in parallel and serial, Particle-In-Cell (PIC) codes for plasma simulations with automatic post processing and generation of visual diagnostics.
Comparing NetCDF and SciDB on managing and querying 5D hydrologic dataset
NASA Astrophysics Data System (ADS)
Liu, Haicheng; Xiao, Xiao
2016-11-01
Efficiently extracting information from high-dimensional hydro-meteorological modelling datasets requires smart solutions. Traditional methods are mostly file-based; files can be edited and accessed handily, but their contiguous storage structure causes efficiency problems. Databases have been proposed as an alternative, offering advantages such as native functionality for manipulating multidimensional (MD) arrays, smart caching strategies and scalability. In this research, NetCDF file-based solutions and the multidimensional array database management system (DBMS) SciDB, which applies a chunked storage structure, are benchmarked to determine the best solution for storing and querying a large 5D hydrologic modelling dataset. The effect of data storage configurations, including chunk size, dimension order and compression, on query performance is explored. Results indicate that the dimension order used to organize the storage of 5D data has a significant influence on query performance if the chunk size is very large, but the effect becomes insignificant when the chunk size is properly set. Compression in SciDB mostly has a negative influence on query performance. Caching is an advantage but may be influenced by the execution of different query processes. On the whole, the NetCDF solution without compression is generally more efficient than the SciDB DBMS.
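On the NetCDF side, chunked storage is configured when a variable is created; the following netCDF4-python sketch shows a 5D variable whose chunk shape is matched to a typical query (dimension names, sizes and the chunk shape are hypothetical):

import numpy as np
from netCDF4 import Dataset

# Dimension names, sizes and the chunk shape below are hypothetical.
ds = Dataset("hydro_5d.nc", "w")
for name, size in [("run", 10), ("time", 240), ("level", 20),
                   ("lat", 100), ("lon", 100)]:
    ds.createDimension(name, size)

# Chunk shape matched to a typical query: one run, one day of hourly
# steps, all levels, a small spatial tile.
sm = ds.createVariable("soil_moisture", "f4",
                       ("run", "time", "level", "lat", "lon"),
                       chunksizes=(1, 24, 20, 50, 50))
sm[0, :24] = np.random.rand(24, 20, 100, 100).astype("f4")
ds.close()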
Hydratools, a MATLAB® based data processing package for Sontek Hydra data
Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.
2005-01-01
The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats, and working with multiple data formats is cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving make these data more easily and routinely accessible, locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment, in order to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other things, parallel data analysis, metadata management, virtual file system tasks, map generation, time series analysis, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release, tested with OpenNebula, is available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data
NASA Astrophysics Data System (ADS)
Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.
2015-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework that extends Apache™ Spark for scaling scientific computations. Apache Spark improves on the map-reduce implementation in Apache™ Hadoop for parallel computing on a cluster by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show the usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries, and we evaluate the performance of various matrix libraries, such as Nd4j™ and Breeze™, in distributed pipelines. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, and parallel ingest and partitioning (sharding) of A-Train satellite observations and model grids. These solutions are encompassed in SciSpark, an open-source software framework for distributed computing on scientific data.
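SciSpark itself is written in Scala; as a rough illustration of the partitioning idea only, the following PySpark sketch distributes NetCDF granules across a cluster and computes a per-granule statistic in parallel (the file list and variable name are hypothetical):

from pyspark import SparkContext
from netCDF4 import Dataset
import numpy as np

# Granule names and the variable are hypothetical; on a real cluster the
# files would live on shared or distributed storage visible to workers.
sc = SparkContext(appName="netcdf-partition-sketch")
files = ["merg_2006091100.nc", "merg_2006091101.nc"]

def load_grid(path):
    with Dataset(path) as ds:
        return np.array(ds.variables["Tb"][:])   # brightness temperature

grids = sc.parallelize(files).map(load_grid)      # one partition per granule
print(grids.map(lambda g: float(g.min())).collect())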
NASA Astrophysics Data System (ADS)
Niemeijer, Sander
2017-04-01
The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and analysis application for atmospheric data and can be used to visualize and analyze the data that you retrieve using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN will allow you to directly visualize the ingested data. All components from the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/
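A short sketch of the HARP Python interface described above (harp.import_product and harp.export_product follow the HARP documentation; the input file and the operations string are illustrative):

import harp

# Input file and operations string are illustrative only.
product = harp.import_product(
    "S5P_L2_NO2_20180101.nc",
    operations="keep(latitude,longitude,tropospheric_NO2_column_number_density);"
               "derive(tropospheric_NO2_column_number_density [Pmolec/cm2])")
harp.export_product(product, "no2_harp.nc")   # written in the HARP netCDF format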
NASA Astrophysics Data System (ADS)
Ansari, S.; Del Greco, S.
2006-12-01
In February 2005, 61 countries around the world agreed on a 10-year plan to work towards building open systems for sharing geospatial data and services across different platforms worldwide. This system is known as the Global Earth Observation System of Systems (GEOSS). The objective of GEOSS focuses on easy access to environmental data and interoperability across different systems, allowing participating countries to measure the "pulse" of the planet in an effort to advance society. In support of GEOSS goals, NOAA's National Climatic Data Center (NCDC) has developed radar visualization and data exporter tools in an open systems environment. The NCDC Weather Radar Toolkit (WRT) loads Weather Surveillance Radar 1988 Doppler (WSR-88D) volume scan (S-band) data, known as Level-II, and derived products, known as Level-III, into an Open Geospatial Consortium (OGC) compliant environment. The application is written entirely in Java and will run on any Java-supported platform including Windows, Macintosh and Linux/Unix. The application is launched via Java Web Start and runs on the client machine while accessing these data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT allows the data to be manipulated to create custom mosaics, composites and precipitation estimates. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. By decoding the various radar formats into the NetCDF Common Data Model, the exported NetCDF data becomes interoperable with existing software packages including the THREDDS Data Server and the Integrated Data Viewer (IDV). The NCDC recently partnered with NOAA's National Severe Storms Lab (NSSL) to decode Sigmet C-band Doppler radar data, giving the NCDC Viewer/Data Exporter the functionality to read C-band data. This supports a bilateral agreement between the United States and Canada for data sharing and interoperability between the US WSR-88D and Environment Canada radar networks. In addition, the NCDC partnered with the University of Oklahoma to develop decoders to read a test bed of distributed X-band radars funded through the Collaborative Adaptive Sensing of the Atmosphere (CASA) project. The NCDC is also archiving the National Mosaic and Next Generation QPE (Q2) products from NSSL, which provide products such as three-dimensional reflectivity, composite reflectivity and precipitation estimates at a 1 km resolution. These three sources of radar data are also supported in the WRT.
[Further Distinctions between Magic, Reality, Religion, and Fiction. Commentaries].
ERIC Educational Resources Information Center
Boyer, Pascal; Taylor, Marjorie; Harris, Paul L.; Chandler, Michael; Johnson, Carl N.
1997-01-01
Contains the following commentaries: "Further Distinctions between Magic, Reality, Religion, and Fiction"; "The Role of Creative Control and Culture in Children's Fantasy/Reality Judgments"; "The Last of the Magicians? Children, Scientists, and the Invocation of Hidden Causal Powers"; "Rescuing Magical Thinking…
Wagener, Johannes; Spjuth, Ola; Willighagen, Egon L; Wikberg, Jarl ES
2009-01-01
Background: Life sciences make heavy use of the web for both data provision and analysis. However, the increasing amount of available data and the diversity of analysis tools call for machine-accessible interfaces in order to be effective. HTTP-based Web service technologies, like the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST) services, are today the most common technologies for this in bioinformatics. However, these methods have severe drawbacks, including lack of discoverability and the inability of services to send status notifications. Several complementary workarounds have been proposed, but the results are ad-hoc solutions of varying quality that can be difficult to use. Results: We present a novel approach based on the open standard Extensible Messaging and Presence Protocol (XMPP), consisting of an extension (IO Data) comprising discovery, asynchronous invocation, and definition of data types in the service. That XMPP cloud services are capable of asynchronous communication implies that clients do not have to poll repetitively for status; the service sends the results back to the client upon completion. Implementations for Bioclipse and Taverna are presented, as are various XMPP cloud services in bio- and cheminformatics. Conclusion: XMPP with its extensions is a powerful protocol for cloud services that demonstrates several advantages over traditional HTTP-based Web services: 1) services are discoverable without the need of an external registry, 2) asynchronous invocation eliminates the need for ad-hoc solutions like polling, and 3) input and output types defined in the service allow for generation of clients on the fly without the need of an external semantics description. The many advantages over existing technologies make XMPP a highly interesting candidate for next-generation online services in bioinformatics. PMID:19732427
Wagener, Johannes; Spjuth, Ola; Willighagen, Egon L; Wikberg, Jarl E S
2009-09-04
The life sciences make heavy use of the web for both data provision and analysis. However, the increasing amount of available data and the diversity of analysis tools call for machine-accessible interfaces in order to be effective. HTTP-based Web service technologies, like the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST) services, are today the most common technologies for this in bioinformatics. However, these methods have severe drawbacks, including lack of discoverability and the inability for services to send status notifications. Several complementary workarounds have been proposed, but the results are ad-hoc solutions of varying quality that can be difficult to use. We present a novel approach based on the open standard Extensible Messaging and Presence Protocol (XMPP), consisting of an extension (IO Data) that comprises discovery, asynchronous invocation, and definition of data types in the service. Because XMPP cloud services are capable of asynchronous communication, clients do not have to poll repetitively for status; the service sends the results back to the client upon completion. Implementations for Bioclipse and Taverna are presented, as are various XMPP cloud services in bio- and cheminformatics. XMPP with its extensions is a powerful protocol for cloud services that demonstrates several advantages over traditional HTTP-based Web services: 1) services are discoverable without the need for an external registry, 2) asynchronous invocation eliminates the need for ad-hoc solutions like polling, and 3) input and output types defined in the service allow for generation of clients on the fly without the need for an external semantic description. The many advantages over existing technologies make XMPP a highly interesting candidate for next-generation online services in bioinformatics.
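The polling-versus-push contrast described above is easy to see in code. The sketch below is purely conceptual Python: the `service` object and all of its methods are hypothetical placeholders standing in for an IO Data-style client library, not an actual XMPP API.

```python
# Conceptual contrast between HTTP-style polling and XMPP-style push.
# `service` and its methods are hypothetical placeholders for illustration.
import time

def run_with_polling(service, job_input):
    job = service.submit(job_input)        # hypothetical submit call
    while not service.is_done(job):        # client must poll repeatedly
        time.sleep(5)
    return service.fetch_result(job)

def run_asynchronously(service, job_input):
    results = []
    # With asynchronous messaging the service pushes the result back on
    # completion; the client registers a callback instead of polling.
    service.submit(job_input, on_complete=results.append)
    return results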
A Software Architecture for Intelligent Synthesis Environments
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA's Intelligent Synthesis Environment (ISE) program is a grand attempt to develop a system to transform the way complex artifacts are engineered. This paper discusses a "middleware" architecture for enabling the development of ISE. Desirable elements of such an Intelligent Synthesis Architecture (ISA) include remote invocation; plug-and-play applications; scripting of applications; management of design artifacts, tools, and artifact and tool attributes; common system services; system management; and systematic enforcement of policies. This paper argues that the ISA should extend conventional distributed object technology (DOT) such as CORBA and Product Data Managers with flexible repositories of product and tool annotations and "plug-and-play" mechanisms for inserting "ility" or orthogonal concerns into the system. I describe the Object Infrastructure Framework, an Aspect Oriented Programming (AOP) environment for developing distributed systems that provides "ility" insertion and enables consistent annotation maintenance. This technology can be used to enforce policies such as maintaining the annotations of artifacts, particularly their provenance and access-control rules; performing automatic datatype transformations between representations; supplying alternative servers of the same service; reporting on the status of jobs and the system; conveying privileges throughout an application; supporting long-lived transactions; maintaining version consistency; and providing software redundancy and mobility.
75 FR 50773 - Invocation of Sunken Military Craft Act
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-17
... approximately 2450 ft of water near position: 32-58.0 N 118-10.10 W. This location now serves as the gravesite... under the Sunken Military Craft Act (10 U.S.C. 113 note; Pub. L. 108-375, Sections 1401-1408) (``the Act...
Fast Multiscale Algorithms for Information Representation and Fusion
2011-07-01
We are also developing convenient command-line invocation tools in addition to the previously developed APIs. Various real-world data sets... This knowledge is important in geolocation applications where knowing whether a received signal is line-of-sight or not is necessary for the
NASA Astrophysics Data System (ADS)
Tronconi, C.; Forneris, V.; Santoleri, R.
2009-04-01
CNR-ISAC-GOS is responsible for the Mediterranean Sea satellite operational system in the framework of the MOON Partnership. This observing system acquires satellite data and produces near-real-time, delayed-time and re-analysis Ocean Colour and Sea Surface Temperature products covering the Mediterranean and Black Seas and regional basins. In the framework of several projects (MERSEA, PRIMI, Adricosm Star, SeaDataNet, MyOcean, ECOOP), GOS is producing climatological/satellite datasets based on optimal interpolation and specific regional algorithms for chlorophyll, updated in near real time and in delayed mode. GOS has built: • an informatics infrastructure for data repository and delivery based on THREDDS technology; the datasets are generated in NetCDF format, compliant with both the CF convention and the international satellite-oceanographic specification prescribed by GHRSST (for SST), and all data produced are made available to users through a THREDDS server catalog; • a LAS, installed in order to exploit the potential of NetCDF data and OPeNDAP URLs, which provides flexible access to geo-referenced scientific data; • a Grid environment based on Globus Technologies (GT4) connecting more than one institute; in particular, exploiting the CNR and ESA clusters makes it possible to reprocess 12 years of chlorophyll data in less than one month (estimated processing time on a single-core PC: 9 months). In the poster we will give an overview of: • the features of the THREDDS catalogs, pointing out the powerful characteristics of this new middleware that has replaced the "old" OPeNDAP server; • the importance of adopting a common format (such as NetCDF) for data exchange; • the tools (e.g., LAS) connected with THREDDS and the NetCDF format; • the Grid infrastructure at ISAC. We will also present specific basin-scale High Resolution products and Ultra High Resolution regional/coastal products available in these catalogs.
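For context, this is the kind of access a THREDDS/OPeNDAP catalog enables: a client opens a remote dataset by URL and transfers only the slices it requests. A minimal sketch using the Python netCDF4 package, assuming a hypothetical catalog URL and variable name and a netCDF4 build with OPeNDAP (DAP) support:

```python
# Open a remote dataset through its OPeNDAP endpoint (hypothetical URL).
from netCDF4 import Dataset

url = "http://example.org/thredds/dodsC/mediterranean/chl_daily.nc"
with Dataset(url) as nc:
    chl = nc.variables["chlorophyll"]    # hypothetical variable name
    print(chl.shape, chl.units)
    # Only the requested slice is transferred over the network.
    subset = chl[0, 100:200, 100:200]
```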
"One-Stop Shopping" for Ocean Remote-Sensing and Model Data
NASA Technical Reports Server (NTRS)
Li, P. Peggy; Vu, Quoc; Chao, Yi; Li, Zhi-Jin; Choi, Jei-Kook
2006-01-01
OurOcean Portal 2.0 (http://ourocean.jpl.nasa.gov) is a software system designed to enable users to easily gain access to ocean observation data, both remote-sensing and in-situ, to configure and run an ocean model with observation data assimilated on a remote computer, and to visualize both the observation data and the model outputs. At present, the observation data and models focus on the California coastal regions and Prince William Sound in Alaska. This system can be used to perform both real-time and retrospective analyses of remote-sensing data and model outputs. OurOcean Portal 2.0 incorporates state-of-the-art information technologies (IT) such as the MySQL database, Java web servers (Apache/Tomcat), the Live Access Server (LAS), interactive graphics with Java applets on the client side and MatLab/GMT on the server side, and distributed computing. OurOcean currently serves over 20 real-time or historical ocean data products. The data are served as pre-generated plots or in their native data format. For some of the datasets, users can choose different plotting parameters and produce customized graphics. OurOcean also serves 3D ocean model outputs generated by ROMS (Regional Ocean Model System) using LAS. The Live Access Server (LAS) software, developed by the Pacific Marine Environmental Laboratory (PMEL) of the National Oceanic and Atmospheric Administration (NOAA), is a configurable web-server program designed to provide flexible access to geo-referenced scientific data. The model output can be viewed as plots in horizontal slices, depth profiles or time sequences, or can be downloaded as raw data in different data formats, such as NetCDF, ASCII, Binary, etc. The interactive visualization is provided by the graphics package Ferret, also developed by PMEL. In addition, OurOcean allows users with minimal computing resources to configure and run an ocean model with data assimilation on a remote computer. Users may select the forcing input, the data to be assimilated, the simulation period, and the output variables, and submit the model to run on a backend parallel computer. When the run is complete, the output is added to the LAS server.
Prayers and Extracurricular Activities in Public Schools.
ERIC Educational Resources Information Center
Bjorklun, Eugene C.
1989-01-01
Examines the constitutionality of public school personnel organizing prayers at extracurricular events and of using ceremonial prayers, invocations, and benedictions at school activities. Reviews court litigation and Supreme Court decisions that use the Establishment Clause and the Lemon test to determine legality. Finds, in most cases, that prayer at…
47 CFR 76.109 - Requirements for invocation of protection.
Code of Federal Regulations, 2014 CFR
2014-10-01
... SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Network Non-duplication Protection, Syndicated... entered on or after August 18, 1988, must contain the following words: “the licensee [or substitute name... provided in the FCC's syndicated exclusivity rules’].” Contracts entered into prior to August 18, 1988...
47 CFR 76.109 - Requirements for invocation of protection.
Code of Federal Regulations, 2012 CFR
2012-10-01
... SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Network Non-duplication Protection, Syndicated... entered on or after August 18, 1988, must contain the following words: “the licensee [or substitute name... provided in the FCC's syndicated exclusivity rules’].” Contracts entered into prior to August 18, 1988...
47 CFR 76.109 - Requirements for invocation of protection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Network Non-duplication Protection, Syndicated... entered on or after August 18, 1988, must contain the following words: “the licensee [or substitute name... provided in the FCC's syndicated exclusivity rules’].” Contracts entered into prior to August 18, 1988...
47 CFR 76.109 - Requirements for invocation of protection.
Code of Federal Regulations, 2011 CFR
2011-10-01
... SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Network Non-duplication Protection, Syndicated... entered on or after August 18, 1988, must contain the following words: “the licensee [or substitute name... provided in the FCC's syndicated exclusivity rules’].” Contracts entered into prior to August 18, 1988...
47 CFR 76.109 - Requirements for invocation of protection.
Code of Federal Regulations, 2013 CFR
2013-10-01
... SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Network Non-duplication Protection, Syndicated... entered on or after August 18, 1988, must contain the following words: “the licensee [or substitute name... provided in the FCC's syndicated exclusivity rules’].” Contracts entered into prior to August 18, 1988...
Investigating the feasibility of Visualising Complex Space Weather Data in a CAVE
NASA Astrophysics Data System (ADS)
Loughlin, S.; Habash Krause, L.
2013-12-01
The purpose of this study was to investigate the feasibility of visualising complex space weather data in a Cave Automatic Virtual Environment (CAVE). Space weather increasingly causes disruptions on Earth, such as power outages and disrupted communication with satellites. We wanted to display space weather data within the CAVE because the data from instruments, models and simulations are typically too complex to understand on their own, especially when they span seven dimensions. To accomplish this, a VTK-to-NetCDF converter was created. NetCDF is a science data format that stores array-oriented scientific data. The format is maintained by the University Corporation for Atmospheric Research and is used extensively by the atmospheric and space communities.
An Archetypal Phenomenology of "Skholé"
ERIC Educational Resources Information Center
Kennedy, David
2017-01-01
In this essay David Kennedy argues that children represent one vanguard of an emergent shift in Western subjectivity, and that adult-child dialogue, especially in the context of schooling, is a key locus for the epistemological change that implies. Following Herbert Marcuse's invocation of a "new sensibility," Kennedy argues that the…
32 CFR 151.4 - Procedures and responsibilities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... country for personnel assigned to foreign areas. (c) Designated commanding officer. Formal invocation of... geographical areas for which a unified command exists, the commander shall designate within each country the “Commanding Officer” referred to in the Senate Resolution (§ 151.6). (2) In areas where a unified command does...
47 CFR 76.124 - Requirements for invocation of protection.
Code of Federal Regulations, 2011 CFR
2011-10-01
... station licensee or distributor of syndicated programming to be eligible to invoke the provisions of § 76... against duplication of programming imported under the Statutory Copyright License, as provided in § 76.122... foregoing language plus a clear and specific reference to the licensee's authority to exercise exclusivity...
47 CFR 76.124 - Requirements for invocation of protection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... station licensee or distributor of syndicated programming to be eligible to invoke the provisions of § 76... against duplication of programming imported under the Statutory Copyright License, as provided in § 76.122... foregoing language plus a clear and specific reference to the licensee's authority to exercise exclusivity...
ERIC Educational Resources Information Center
O'Neill, Arthur
2014-01-01
The author states that in earlier pieces (O'Neill, 2002, 2010, 2012), he chewed on and tried to digest newspaper advertisements made by universities. Byproducts did not come out smelling like roses: universities are scarcely able to present themselves without boasting, crass displays of salesmanship, and brazen invocations of virtue. This article…
Web mapping system for complex processing and visualization of environmental geospatial datasets
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor
2016-04-01
Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and may be considered complexes of interconnected software tools for working with geospatial data. In this report a complex web mapping system is presented, including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive. The GIS web client has three basic tiers: 1) a tier of geospatial metadata, retrieved from a central MySQL repository and represented in JSON format; 2) a tier of JavaScript objects implementing methods for handling NetCDF metadata, the task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographical services; 3) a graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects, implementing methods for handling geospatial metadata, the task XML object, and WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include such procedures as downloading and updating JSON metadata, launching and tracking calculation tasks running on remote servers, and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities to the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of such popular desktop GIS applications as uDig, QuantumGIS, etc. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
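The layer-listing and map-export interactions that the GUI tier performs against the WMS service can be sketched with the Python OWSLib package (an assumption for illustration; the system itself drives WMS from JavaScript via OpenLayers). The server URL and layer name are hypothetical:

```python
# Query a WMS endpoint for its layers and export one layer as a PNG.
from owslib.wms import WebMapService

wms = WebMapService("http://example.org/geoserver/wms", version="1.1.1")
print(list(wms.contents))                      # available layer names
img = wms.getmap(layers=["temperature_mean"],  # hypothetical layer
                 styles=[""],
                 srs="EPSG:4326",
                 bbox=(60.0, 40.0, 120.0, 80.0),
                 size=(800, 600),
                 format="image/png")
with open("layer.png", "wb") as f:
    f.write(img.read())                        # save the rendered map tile
```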
MK3TOOLS & NetCDF - storing VLBI data in a machine independent array oriented data format
NASA Astrophysics Data System (ADS)
Hobiger, T.; Koyama, Y.; Kondo, T.
2007-07-01
At the beginning of 2002 the International VLBI Service (IVS) agreed to introduce a Platform-Independent VLBI EXchange format (PIVEX), which permits the exchange of observational data and stimulates research across different analysis groups. Unfortunately, PIVEX has never been implemented, and many analysis software packages still depend on prior processing (e.g., ambiguity resolution and computation of ionosphere corrections) done by CALC/SOLVE. Thus MK3TOOLS, which handles MK3 databases without requiring CALC/SOLVE to be installed, has been developed. It uses the NetCDF format to store the data, and since interfaces exist for a variety of programming languages (Fortran, C/C++, Java, Perl, Python) it can be easily incorporated into existing and upcoming analysis software packages.
NASA Astrophysics Data System (ADS)
Gipson, John
2011-07-01
I describe the proposed data structure for storing, archiving and processing VLBI data. In this scheme, most VLBI data are stored in NetCDF files. NetCDF has the advantage that interfaces exist for most common computer languages, including Fortran, Fortran-90, C, C++, Perl, etc., and for the most common operating systems, including Linux, Windows and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data, and also allows for extending the types of data used, e.g., source maps. I discuss the use of the new format in calc/solve and other VLBI analysis packages. I also discuss plans for transitioning to the new structure.
"Listen Then, Or, Rather, Answer": Contemporary Challenges to Socratic Education
ERIC Educational Resources Information Center
Fullam, Jordan
2015-01-01
The popularity of Jacques Rancière in recent work in educational philosophy has rejuvenated discussion of the merits and weaknesses of Socratic education, both in Plato's dialogues and in invocations of Socrates in contemporary educational practice. In this essay Jordan Fullam explores the implications of this trend through comparing…
1984-01-01
Culex (Melanoconion) invocator Pazos, with a redescription of adults and illustration of male genitalia (Diptera: Culicidae). Mosq. Syst. 10(2):239-245. [Remainder of the record is scan residue from a Medical Entomology Project report; the recoverable table-of-contents entries are: a. Genus Anopheles; b. Genus Aedes.]
Barack Obama, the Exodus Tradition, and the Joshua Generation
ERIC Educational Resources Information Center
Murphy, John M.
2011-01-01
This essay explores Barack Obama's invocation of the Exodus during his 2008 presidential campaign. It argues that Obama's turn to Exodus, his rare embodiment of Joshua, and his renewal of the American covenant nicely addressed major rhetorical problems that he faced. Of equal importance, his campaign oratory opens an important line of inquiry into the…
Collegiality Matters: Massachusetts Public Higher Education Librarians' Perspective
ERIC Educational Resources Information Center
Freedman, Shin
2012-01-01
It is no secret that collegiality matters in academe, regardless of the size and type of institution. Invocations of collegiality arise whenever promotion, reappointment, and tenure are at stake. This paper aims to examine the perceptions and issues surrounding collegiality in the academic library setting. The data, based on the survey results of the…
"Suited to Their Needs": White Innocence as a Vestige of Segregation
ERIC Educational Resources Information Center
Orozco, Richard; Jaime Diaz, Jesus
2016-01-01
Discourses that supported de jure segregated schools often invoked White innocence in the form of altruistic motivations. These same invocations are found in more contemporary school policy discourses. The authors of this article argue, drawing on the concept of the intertextuality of discourse, that contemporary schooling policies exist as…
Invocations, Benedictions, and Freedom of Speech in Public Schools.
ERIC Educational Resources Information Center
Harris, Phillip H.
1991-01-01
The Supreme Court, in an upcoming case, "Lee v. Weisman," will rule on whether prayer may be offered out loud at a public school graduation program. Argues that past court decisions have privileged the Establishment Clause of the First Amendment over the Free Speech Clause of the same amendment. (57 references) (MLF)
Neoteny, Dialogic Education and an Emergent Psychoculture: Notes on Theory and Practice
ERIC Educational Resources Information Center
Kennedy, David
2014-01-01
This article argues that children represent one vanguard of an emergent shift in Western subjectivity, and that adult-child dialogue, especially in the context of schooling, is a key locus for the epistemological change that implies. Following Herbert Marcuse's invocation of a "new sensibility", the author argues that the…
jORCA: easily integrating bioinformatics Web Services.
Martín-Requena, Victoria; Ríos, Javier; García, Maximiliano; Ramírez, Sergio; Trelles, Oswaldo
2010-02-15
Web services technology is becoming the option of choice to deploy bioinformatics tools that are universally available. One of the major strengths of this approach is that it supports machine-to-machine interoperability over a network. However, a weakness of this approach is that various Web Services differ in their definition and invocation protocols, as well as their communication and data formats, and this presents a barrier to service interoperability. jORCA is a desktop client aimed at facilitating seamless integration of Web Services. It does so by making a uniform representation of the different web resources, supporting scalable service discovery, and automatically composing workflows. Usability is at the top of the jORCA agenda; thus it is a highly customizable and extensible application that accommodates a broad range of user skills, featuring double-click invocation of services in conjunction with advanced execution control, on-the-fly data standardization, extensible viewer plug-ins, drag-and-drop editing capabilities, and a file-based style of browsing and organizing favourite tools. By making the integration of bioinformatics Web Services easier, jORCA supports a wider range of users.
The PEcAn Project: Accessible Tools for On-demand Ecosystem Modeling
NASA Astrophysics Data System (ADS)
Cowdery, E.; Kooper, R.; LeBauer, D.; Desai, A. R.; Mantooth, J.; Dietze, M.
2014-12-01
Ecosystem models play a critical role in understanding the terrestrial biosphere and forecasting changes in the carbon cycle, yet current forecasts carry considerable uncertainty. The amount of data being collected and produced is increasing on a daily basis as we enter the "big data" era, but only a fraction of these data is being used to constrain models. Until we can improve the problems of model accessibility and model-data communication, none of these resources can be used to their full potential. The Predictive Ecosystem Analyzer (PEcAn) is an ecoinformatics toolbox and a set of workflows that wrap around an ecosystem model and manage the flow of information in and out of regional-scale terrestrial biosphere models (TBMs). Here we present new modules developed in PEcAn to manage the processing of meteorological data, one of the primary driver dependencies for ecosystem models. The module downloads, reads, extracts, and converts meteorological observations to the Unidata Climate and Forecast (CF) NetCDF community standard, a convention used for most climate forecast and weather models. The module also automates the conversion from NetCDF to model-specific formats, including basic merging, gap-filling, and downscaling procedures. PEcAn currently supports tower-based micrometeorological observations at Ameriflux and FluxNET sites, site-level CSV-formatted data, and regional and global reanalysis products such as the North American Regional Reanalysis and CRU-NCEP. The workflow is easily extensible to additional products and processing algorithms. These meteorological workflows have been coupled with the PEcAn web interface and now allow anyone to run multiple ecosystem models for any location on Earth by simply clicking on an intuitive Google Maps-based interface. This will allow users to more readily compare models to observations at those sites, leading to better calibration and validation. Current work is extending these workflows to also process field, remotely sensed, and historical observations of vegetation composition and structure. The processing of heterogeneous met and veg data within PEcAn is made possible by the Brown Dog cyberinfrastructure tools for unstructured data.
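The CF convention the module targets is largely a matter of dimension names, units strings, and reference-time encodings. A minimal sketch of a CF-style met driver file; the file and variable names are hypothetical, but the time-units form shown is standard CF:

```python
# Write a small CF-style meteorological driver file (hypothetical names).
import numpy as np
from netCDF4 import Dataset

with Dataset("site_met.nc", "w") as nc:
    nc.createDimension("time", None)                  # unlimited time axis
    t = nc.createVariable("time", "f8", ("time",))
    t.units = "hours since 2014-01-01 00:00:00"       # CF reference-time form
    t.calendar = "standard"
    tair = nc.createVariable("air_temperature", "f4", ("time",))
    tair.units = "K"                                  # CF canonical units
    t[:] = np.arange(24)
    tair[:] = 273.15 + 5.0 * np.random.rand(24)       # illustrative values
```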
A data delivery system for IMOS, the Australian Integrated Marine Observing System
NASA Astrophysics Data System (ADS)
Proctor, R.; Roberts, K.; Ward, B. J.
2010-09-01
The Integrated Marine Observing System (IMOS, www.imos.org.au), an AUD 150 m 7-year project (2007-2013), is a distributed set of equipment and data-information services which, among many applications, collectively contribute to meeting the needs of marine climate research in Australia. The observing system provides data in the open oceans around Australia out to a few thousand kilometres, as well as the coastal oceans, through 11 facilities which effectively observe and measure the 4-dimensional ocean variability and the physical and biological response of coastal and shelf seas around Australia. Through a national science rationale, IMOS is organized as five regional nodes (Western Australia - WAIMOS, South Australia - SAIMOS, Tasmania - TASIMOS, New South Wales - NSWIMOS and Queensland - QIMOS) surrounded by an oceanic node (Blue Water and Climate). Operationally, IMOS is organized as 11 facilities (Argo Australia, Ships of Opportunity, Southern Ocean Automated Time Series Observations, Australian National Facility for Ocean Gliders, Autonomous Underwater Vehicle Facility, Australian National Mooring Network, Australian Coastal Ocean Radar Network, Australian Acoustic Tagging and Monitoring System, Facility for Automated Intelligent Monitoring of Marine Systems, eMarine Information Infrastructure and Satellite Remote Sensing) delivering data. IMOS data are freely available to the public. The data, a combination of near real-time and delayed mode, are made available to researchers through the electronic Marine Information Infrastructure (eMII). eMII utilises the Australian Academic Research Network (AARNet) to support a distributed database on OPeNDAP/THREDDS servers hosted by regional computing centres. IMOS instruments are described through the OGC specification SensorML, and wherever possible data are in CF-compliant netCDF format. Metadata, conforming to the ISO 19115 standard, are automatically harvested from the netCDF files, and the metadata records are catalogued in the OGC GeoNetwork Metadata Entry and Search Tool (MEST). Data discovery, access and download occur via web services through the IMOS Ocean Portal (http://imos.aodn.org.au), and tools for the display and integration of near real-time data are in development.
NASA Astrophysics Data System (ADS)
Hardman, M.; Brodzik, M. J.; Long, D. G.
2017-12-01
Beginning in 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Historical versions of the gridded passive microwave data sets were produced as flat binary files described in human-readable documentation. This format is error-prone and makes it difficult to reliably include all processing and provenance information. Funded by NASA MEaSUREs, we have completely reprocessed the gridded data record, which includes SMMR, SSM/I-SSMIS and AMSR-E. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) files are self-describing. Our approach to the new data set was to create netCDF4 files that use standard metadata conventions and best practices to incorporate file-level, machine- and human-readable contents, geolocation, processing and provenance metadata. We followed the flexible and adaptable Climate and Forecast (CF-1.6) conventions with respect to coordinate conventions and map projection parameters. Additionally, we made use of the Attribute Conventions for Dataset Discovery (ACDD-1.3), which provide file-level conventions with spatio-temporal bounds that enable indexing software to search for coverage. Our CETB files also include temporal coverage and spatial resolution in the file-level metadata for human readability. We used the JPL CF/ACDD compliance checker to guide this work. We tested our file format with real software: for example, the netCDF Command-line Operators (NCO) power tools give unlimited control over spatio-temporal subsetting and concatenation of files. The GDAL tools understand the CF metadata and produce fully compliant GeoTIFF files from our data. ArcMap can then reproject the GeoTIFF files on the fly and work with other geolocated data, such as coastlines, with no special work required. We expect this combination of standards and well-tested interoperability to significantly improve the usability of this important ESDR for the Earth science community.
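The ACDD file-level pattern described above amounts to a set of global attributes. A short sketch with the Python netCDF4 package; the attribute values are illustrative, not taken from the actual CETB product, and the file is assumed to already exist:

```python
# Attach ACDD-style discovery metadata as global attributes (illustrative values).
from netCDF4 import Dataset

with Dataset("cetb_example.nc", "a") as nc:   # assumes the file already exists
    nc.title = "Calibrated Enhanced-Resolution Brightness Temperatures (example)"
    nc.summary = "Gridded passive microwave brightness temperatures."
    nc.Conventions = "CF-1.6, ACDD-1.3"
    # Spatio-temporal bounds that enable search indexing:
    nc.geospatial_lat_min, nc.geospatial_lat_max = -90.0, 90.0
    nc.geospatial_lon_min, nc.geospatial_lon_max = -180.0, 180.0
    nc.time_coverage_start = "1978-10-25T00:00:00Z"
    nc.time_coverage_end = "1978-10-25T12:00:00Z"
```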
Improving Metadata Compliance for Earth Science Data Records
NASA Astrophysics Data System (ADS)
Armstrong, E. M.; Chang, O.; Foster, D.
2014-12-01
One of the recurring challenges of creating earth science data records is to ensure a consistent level of metadata compliance at the granule level, where important details of contents, provenance, producer, and data references are necessary to obtain a sufficient level of understanding. These details are important not just for individual data consumers but also for autonomous software systems. Two of the most popular metadata standards at the granule level are the Climate and Forecast (CF) Metadata Conventions and the Attribute Conventions for Dataset Discovery (ACDD). Many data producers have implemented one or both of these models, including the Group for High Resolution Sea Surface Temperature (GHRSST) for their global SST products and the Ocean Biology Processing Group for NASA ocean color and SST products. While both the CF and ACDD models contain various levels of metadata richness, the actual "required" attributes are quite small in number. Metadata at the granule level become much more useful when recommended or optional attributes are implemented that document spatial and temporal ranges, lineage and provenance, sources, keywords, references, etc. In this presentation we report on a new open source tool to check the compliance of netCDF and HDF5 granules with the CF and ACDD metadata models. The tool, written in Python, was originally implemented to support metadata compliance for netCDF records as part of NOAA's Integrated Ocean Observing System. It outputs standardized scoring for metadata compliance with both CF and ACDD, produces an objective summary weight, and can be run against remote records via OPeNDAP calls. Originally a command-line tool, it has been extended with a user-friendly web interface. Reports on metadata testing are grouped in hierarchies that make it easier to track flaws and inconsistencies in the record. We have also extended it to support explicit metadata structures and semantic syntax for the GHRSST project that can be easily adapted to other satellite missions as well. Overall, we hope this tool will provide the community with a useful mechanism to improve metadata quality and consistency at the granule level by providing objective scoring and assessment, as well as encourage data producers to improve metadata quality and quantity.
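A hand-rolled sketch of the same idea, not the tool's actual code: score a granule by the fraction of a few recommended ACDD attributes it carries. The attribute list is a small illustrative subset, and the file name is hypothetical:

```python
# Toy ACDD presence check: report a compliance fraction and missing attributes.
from netCDF4 import Dataset

ACDD_RECOMMENDED = ["title", "summary", "keywords", "time_coverage_start",
                    "time_coverage_end", "geospatial_lat_min", "geospatial_lat_max"]

def score_acdd(path):
    with Dataset(path) as nc:
        present = [a for a in ACDD_RECOMMENDED if hasattr(nc, a)]
        missing = sorted(set(ACDD_RECOMMENDED) - set(present))
    return len(present) / len(ACDD_RECOMMENDED), missing

score, missing = score_acdd("granule.nc")   # hypothetical granule
print(f"ACDD score: {score:.0%}; missing: {missing}")
```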
SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.
2014-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed; it therefore outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) decrease the time to compute comparison statistics and plots from minutes to seconds; (2) allow for interactive exploration of time-series properties over seasons and years; (3) decrease the time for satellite data ingestion into RCMES to hours; (4) allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) move RCMES towards a near-real-time decision-making platform. We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory and disk usage.
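The in-memory map-reduce pattern SciSpark builds on can be sketched in PySpark (SciSpark itself is Scala-based, so this illustrates the paradigm, not its API). The tiny "grids" here stand in for time steps of a model field:

```python
# Minimal PySpark sketch: compute a mean grid across time steps in memory.
from pyspark import SparkContext

sc = SparkContext(appName="grid-stats-sketch")
# Pretend each element is one time step of a model grid, flattened to floats.
grids = sc.parallelize([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
grids.cache()                      # keep partitions in memory for reuse
count = grids.count()
total = grids.reduce(lambda a, b: [x + y for x, y in zip(a, b)])
mean_grid = [v / count for v in total]
print(mean_grid)                   # [2.0, 3.0, 4.0]
sc.stop()
```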
Java RMI Software Technology for the Payload Planning System of the International Space Station
NASA Technical Reports Server (NTRS)
Bryant, Barrett R.
1999-01-01
The Payload Planning System supports experiment planning for the International Space Station. The planning process has a number of different aspects that need to be stored in a database, which is then used to generate reports on the planning process in a variety of formats. The process is currently structured as a 3-tier client/server software architecture comprising a Java applet at the front end, a Java server in the middle, and an Oracle database in the third tier. The system presently uses CGI, the Common Gateway Interface, to communicate between the user-interface and server tiers, and Active Data Objects (ADO) to communicate between the server and database tiers. This project investigated other methods and tools for performing the communication between the three tiers of the current system so that both system performance and software development time could be improved. We found that, for the hardware and software platforms on which PPS is required to run, the best solution is to use Java Remote Method Invocation (RMI) for communication between the client and server and SQLJ (Structured Query Language for Java) for server interaction with the database. Prototype implementations showed that RMI combined with SQLJ significantly improved performance and also greatly facilitated construction of the communication software.
NetCDF files of PBL height (m), shortwave radiation, 10 m wind speed from WRF, and ozone from CMAQ. The data are the standard deviations of these variables for each hour of the 4-day simulation. Figure 4 shows only one of the time periods: June 8, 2100 UTC. The NetCDF files have a time stamp (Times) that can be used to find this time in order to reproduce Figure 4. Also included is a data dictionary that describes the domain and all other attributes of the model simulation. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres 120(23): 12,259-12,280 (2015).
eWaterCycle visualisation: combining the strengths of NetCDF and Web Map Service (ncWMS)
NASA Astrophysics Data System (ADS)
Hut, R.; van Meersbergen, M.; Drost, N.; Van De Giesen, N.
2016-12-01
As a result of the eWaterCycle global hydrological forecast we have created Cesium-ncWMS, a web application based on ncWMS and Cesium. ncWMS is a server-side application capable of reading any NetCDF file written using the Climate and Forecast (CF) conventions and making the data available as a Web Map Service (WMS). ncWMS automatically determines the available variables in a file and creates maps colored according to the data and a user-selected color scale. Cesium is a JavaScript 3D virtual globe library. It uses WebGL for rendering, which makes it very fast, and it is capable of displaying a wide variety of data types such as vectors, 3D models, and 2D maps. The forecast results are automatically uploaded to our web server running ncWMS. In turn, the web application can be used to change the settings for color maps and displayed data. The server uses the settings provided by the web application, together with the data in NetCDF, to provide WMS image tiles, time-series data and legend graphics to the Cesium-ncWMS web application. The user can simultaneously zoom in to the very-high-resolution forecast results anywhere in the world and get time-series data for any point on the globe. The Cesium-ncWMS visualisation combines a global overview with locally relevant information in any browser. See the visualisation live at forecast.ewatercycle.org
Tool to assess contents of ARM surface meteorology network netCDF files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staudt, A.; Kwan, T.; Tichler, J.
The Atmospheric Radiation Measurement (ARM) Program, supported by the US Department of Energy, is a major program of atmospheric measurement and modeling designed to improve the understanding of processes and properties that affect atmospheric radiation, with a particular focus on the influence of clouds and the role of cloud radiative feedback in the climate system. The ARM Program will use three highly instrumented primary measurement sites. Deployment of instrumentation at the first site, located in the Southern Great Plains of the United States, began in May of 1992. The first phase of deployment at the second site in the Tropical Western Pacific is scheduled for late in 1995. The third site will be in the North Slope of Alaska and adjacent Arctic Ocean. To meet the scientific objectives of ARM, observations from the ARM sites are combined with data from other sources; these are called external data. Among these external data sets are surface meteorological observations from the Oklahoma Mesonet, a Kansas automated weather network, the Wind Profiler Demonstration Network (WPDN), and the National Weather Service (NWS) surface stations. Before combining these data with the Surface Meteorological Observations Station (SMOS) ARM data, it was necessary to assess the contents and quality of both the ARM and the external data sets. Since these data sets had previously been converted to netCDF format for use by the ARM Science Team, a tool was written to assess the contents of the netCDF files.
ERIC Educational Resources Information Center
Hernández, Laura E.
2016-01-01
Reformers today maintain the use of civil rights rhetoric when advocating for policies that address educational inequity. While continuing the legacy of earlier civil rights activists, the leaders invoking this rhetoric and the educational platforms they promote differ greatly from previous decades. Not only does this new crop of reformers differ…
What's so Bad about Being "Professorial"?
ERIC Educational Resources Information Center
Vaidhyanathan, Siva
2008-01-01
CNN commentator Bill Bennett's invocation of "professorial" was the latest among a string of comments about Barack Obama, who used to teach constitutional law at the University of Chicago. On September 13, the "New York Times" columnist Thomas L. Friedman wrote, "Obama may be a bit professorial, but at least he is trying to unite the country to…
Pragmatics of the Evil Eye in Egyptian Arabic.
ERIC Educational Resources Information Center
Mughazy, Mustafa A.
A study examined the different strategies used by speakers of Egyptian Arabic to ward off the potential effects of the evil eye, specifically strategies for responding to compliments perceived as invocations of evil, as they relate to the gender of the compliment's recipient and the social context in which the compliment takes place. Social…
Data Access Services that Make Remote Sensing Data Easier to Use
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2010-01-01
This slide presentation reviews some of the processes that NASA uses to make remote sensing data easy to use over the World Wide Web. This work involves considerable research into data formats, geolocation structures and quality indicators, often followed by coding a preprocessing program. Only then are the data usable within the analysis tool of choice. The Goddard Earth Sciences Data and Information Services Center is deploying a variety of data access services designed to dramatically shorten the time consumed in the data preparation step. On-the-fly conversion to the standard network Common Data Form (netCDF) format with Climate and Forecast (CF) conventions imposes a standard coordinate-system framework that makes data instantly readable through several tools, such as the Integrated Data Viewer, the Grid Analysis and Display System, Panoply and Ferret. A similar benefit is achieved by serving data through the Open-source Project for a Network Data Access Protocol (OPeNDAP), which also provides subsetting. The Data Quality Screening Service goes a step further, filtering out data points based on quality control flags, following science team recommendations or user-specified criteria. Further still is the Giovanni online analysis system, which goes beyond formatting and quality to provide visualization and basic statistics of the data. This general approach of automating the preparation steps has the important added benefit of enabling use of the data by non-human users (i.e., computer programs), which often make sub-optimal use of the available data due to the need to hard-code data preparation on the client side.
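The quality-screening step can be sketched in a few lines of NumPy; the arrays and the flag convention are illustrative, not the actual service implementation:

```python
# Mask out data points whose quality flag exceeds a chosen threshold.
import numpy as np

data = np.array([285.1, 290.4, 300.2, 275.8])
quality = np.array([0, 1, 3, 0])        # e.g. 0 = best, 3 = do not use
max_acceptable = 1

screened = np.where(quality <= max_acceptable, data, np.nan)
print(screened)                          # [285.1 290.4   nan 275.8]
```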
Situational Lightning Climatologies for Central Florida: Phase III
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III
2008-01-01
This report describes work done by the Applied Meteorology Unit (AMU) to add composite soundings to the Advanced Weather Interactive Processing System (AWIPS), allowing National Weather Service (NWS) forecasters to compare the current atmospheric state with climatology. In a previous phase, the AMU created composite soundings for four rawinsonde observation stations in Florida, for each of eight flow regimes. The composite soundings were delivered to the NWS Melbourne (MLB) office for display using the NSHARP software program. NWS MLB requested that the AMU make the composite soundings available for display in AWIPS. The AMU first created a procedure to customize AWIPS so composite soundings could be displayed. A unique four-character identifier was created for each of the 32 composite soundings. The AMU then wrote a Tool Command Language/Toolkit (Tcl/Tk) software program to convert the composite soundings from NSHARP to Network Common Data Form (NetCDF) format. The NetCDF files were then displayable by AWIPS.
Paskevich, Valerie F.
1992-01-01
The Branch of Atlantic Marine Geology has been involved in the collection, processing and digital mosaicking of high-, medium- and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. With the need to process side-scan data in the field, and with the increased power and reduced cost of workstations, a need was identified for an image-processing package on a UNIX-based computer system that could be used in the field as well as be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.
Scientific Data Storage for Cloud Computing
NASA Astrophysics Data System (ADS)
Readey, J.
2014-12-01
Traditionally, data storage for geophysical software systems has centered on file-based systems and libraries such as NetCDF and HDF5. In contrast, cloud infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost-effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We will also cover other advantages of this system, such as enhanced metadata search.
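A toy sketch of the proposed mapping: array chunks go to an object store keyed by chunk index, while a key-value service holds the small metadata needed to reassemble them. Plain Python dicts stand in for both cloud services here; a real system would use the S3/Azure/GCS APIs:

```python
# Toy object-store layout for array data: chunked blobs plus key-value metadata.
import numpy as np

object_store = {}    # stand-in for a cloud object store
kv_store = {}        # stand-in for a key-value/database service

def put_dataset(name, array, chunk_rows):
    kv_store[name] = {"shape": array.shape, "dtype": str(array.dtype),
                      "chunk_rows": chunk_rows}
    for i in range(0, array.shape[0], chunk_rows):
        object_store[f"{name}/chunk/{i}"] = array[i:i + chunk_rows].tobytes()

def get_rows(name, start, stop):
    meta = kv_store[name]
    c = meta["chunk_rows"]
    first = (start // c) * c                      # first chunk covering `start`
    blobs = [object_store[f"{name}/chunk/{i}"] for i in range(first, stop, c)]
    rows = np.frombuffer(b"".join(blobs), dtype=meta["dtype"])
    rows = rows.reshape(-1, *meta["shape"][1:])
    return rows[start - first:stop - first]       # trim to the requested rows

put_dataset("sst", np.arange(20.0).reshape(10, 2), chunk_rows=4)
print(get_rows("sst", 3, 7))                      # rows 3..6 only
```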
Retsas, Spyros
2015-02-01
This paper addresses the myths surrounding the birth and death of Asclepios, the popular healing god of the Greeks, and his place among the other deities of the Greek Pantheon. The enigmatic invocation of Asclepios in the final moments of Socrates, the Athenian philosopher condemned to take the hemlock, is also discussed.
Which products are available for subsetting?
Atmospheric Science Data Center
2014-12-08
... users to create smaller files (subsets) of the original data by selecting desired parameters, parameter criterion, or latitude and ... fluxes, where the net flux is constrained to the global heat storage in netCDF format. Single Scanner Footprint TOA/Surface Fluxes ...
CfRadial - CF NetCDF for Radar and Lidar Data in Polar Coordinates.
NASA Astrophysics Data System (ADS)
Dixon, M. J.; Lee, W. C.; Michelson, D.; Curtis, M.
2016-12-01
Since 1990, NCAR has supported over 20 different data formats for radar and lidar data in polar coordinates. Researchers, students and operational users spend unnecessary time handling a multitude of unique formats. CfRadial grew out of the need to simplify the use of these data and thereby to improve efficiency in research and operations. CfRadial adopts the well-known NetCDF framework, along with the Climate and Forecasting (CF) conventions such that data and metadata are accurately represented. Mobile platforms are also supported. The first major release, CfRadial version 1.1, occurred in February 2011, followed by minor updates. CfRadial has been adopted by NCAR as well as other agencies in the US and the UK. CfRadial development was boosted in 2015 through a two-year NSF EarthCube grant to improve CF in general. Version 1.4 was agreed upon in May 2016, adding explicit support for quality control fields and spectra. In Europe and Australia, EUMETNET OPERA's HDF5-based ODIM_H5 standard has been rapidly embraced as the modern standard for exchanging weather radar data for operations. ODIM_H5 exploits data groups, hierarchies, and built-in compression, characteristics that have been added to NetCDF4. A meeting of the WMO Task Team on Weather Radar Data Exchange (TT-WRDE) was held at NCAR in Boulder in July 2016, with a goal of identifying a single global standard for radar and lidar data in polar coordinates. CfRadial and ODIM_H5 were considered alongside the older and more rigid table-driven WMO BUFR and GRIB2 formats. TT-WRDE recommended that CfRadial 1.4 be merged with the sweep-oriented structure of ODIM_H5, making use of NetCDF groups, to produce a single format that will encompass the best ideas of both formats. That has led to the emergence of the CfRadial 2.0 standard. This format should meet the objectives of both the NSF EarthCube CF 2.0 initiative and the WMO TT-WRDE. It has the added benefit of improving data exchange between operational and research users, making operational data more readily available to researchers, and research algorithms more accessible to operational agencies.
A Parallel Vector Machine for the PM Programming Language
NASA Astrophysics Data System (ADS)
Bellerby, Tim
2016-04-01
PM is a new programming language which aims to make the writing of computational geoscience models on parallel hardware accessible to scientists who are not themselves expert parallel programmers. It is based around the concept of communicating operators: language constructs that enable variables local to a single invocation of a parallelised loop to be viewed as if they were arrays spanning the entire loop domain. This mechanism enables different loop invocations (which may or may not be executing on different processors) to exchange information in a manner that extends the successful Communicating Sequential Processes idiom from single messages to collective communication. Communicating operators avoid the additional synchronisation mechanisms, such as atomic variables, required when programming using the Partitioned Global Address Space (PGAS) paradigm. Using a single loop invocation as the fundamental unit of concurrency enables PM to uniformly represent different levels of parallelism from vector operations through shared memory systems to distributed grids. This paper describes an implementation of PM based on a vectorised virtual machine. On a single processor node, concurrent operations are implemented using masked vector operations. Virtual machine instructions operate on vectors of values and may be unmasked, masked using a Boolean field, or masked using an array of active vector cell locations. Conditional structures (such as if-then-else or while statement implementations) calculate and apply masks to the operations they control. A shift in mask representation from Boolean to location-list occurs when active locations become sufficiently sparse. Parallel loops unfold data structures (or vectors of data structures for nested loops) into vectors of values that may additionally be distributed over multiple computational nodes and then split into micro-threads compatible with the size of the local cache. Inter-node communication is accomplished using standard OpenMP and MPI. Performance analyses of the PM vector machine, demonstrating its scaling properties with respect to domain size and the number of processor nodes will be presented for a range of hardware configurations. The PM software and language definition are being made available under unrestrictive MIT and Creative Commons Attribution licenses respectively: www.pm-lang.org.
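The masked-execution idea is easy to see in NumPy terms: a conditional over the loop domain becomes a Boolean mask, and each branch is applied only at its active locations. This illustrates the mechanism, not PM itself (and `np.flatnonzero(mask)` would give the location-list form used when active cells are sparse):

```python
# Execute an if-then-else over a whole vector using complementary masks.
import numpy as np

x = np.array([-2.0, 3.0, -0.5, 4.0])
mask = x > 0                        # "if" condition over the loop domain
result = np.empty_like(x)
result[mask] = np.sqrt(x[mask])     # then-branch, applied under the mask
result[~mask] = 0.0                 # else-branch, complementary mask
print(result)                       # [0.     1.7320 0.     2.    ]
```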
NASA Astrophysics Data System (ADS)
Manzano Muñoz, Fernando; Pouliquen, Sylvie; Petit de la Villeon, Loic; Carval, Thierry; Loubrieu, Thomas; Wedhe, Henning; Sjur Ringheim, Lid; Hammarklint, Thomas; Tamm, Susanne; De Alfonso, Marta; Perivoliotis, Leonidas; Chalkiopoulos, Antonis; Marinova, Veselka; Tintore, Joaquin; Troupin, Charles
2016-04-01
Copernicus, previously known as GMES (Global Monitoring for Environment and Security), is the European programme for establishing a European capacity for Earth observation and monitoring. Copernicus aims to provide a sustainable service for ocean monitoring and forecasting, validated and commissioned by users. Since May 2015, the Copernicus Marine Environment Monitoring Service (CMEMS) has been working in operational mode through a contract with service commitments (the result being regular data provision). Within CMEMS, the In Situ Thematic Assembly Centre (INSTAC) distributed service integrates in situ data from different sources for operational oceanography needs. CMEMS INSTAC collects data from providers outside Copernicus (national and international networks) and carries out quality control in a homogeneous manner, to fit the needs of internal and external users. CMEMS INSTAC is organized in 7 regional Dissemination Units (DUs) built on the EuroGOOS ROOSes. Each DU aggregates data and metadata provided by a series of Production Units (PUs) acting as interfaces to providers. Homogeneity and standardization are key features of a coherent and efficient service. All DUs provide data in the OceanSITES NetCDF format 1.2 (based on NetCDF 3.6), which is CF-compliant, relies on SeaDataNet vocabularies, and is able to handle profile and time-series measurements. All products, both near-real-time (NRT) and multi-year (REP), are available online to every registered CMEMS user through an FTP service. On top of the FTP service, INSTAC products are available through Oceanotron, an open-source data server dedicated to the dissemination of marine observations. It provides services such as aggregation over spatio-temporal coordinates and observed parameters, and subsetting on observed parameters and metadata. The accuracy of the data is checked at various levels. Quality-control procedures are applied to validate the data, and correctness tests are applied to the metadata of each NetCDF file. The quality-control procedures include different routines for NRT and REP products. Key Performance Indicators (KPIs) are also used in Copernicus for monitoring purposes. They allow periodic monitoring of the availability, quantity and quality of the INSTAC data integrated in the NRT products. Statistical reports are generated on a quarterly and yearly basis to provide more visibility into the spatial and temporal coverage of the INSTAC NRT and REP products, as well as information on their quality. These reports are generated using Java and Python procedures developed within the INSTAC group. One of the most critical tasks for the DUs is to generate NetCDF files compliant with the agreed format. Many tools and programming libraries have been developed for that purpose, for instance the Unidata Java library, which provides NetCDF data management capabilities including creation, reading and modification. Some DUs have also developed regional data portals which offer useful information for users, including data charts, platform availability through interactive maps, KPIs and statistical figures, and direct access to the FTP service. The proposed presentation will detail the Copernicus in situ data service and the monitoring tools developed by the INSTAC group.
Challenges to Standardization: A Case Study Using Coastal and Deep-Ocean Water Level Data
NASA Astrophysics Data System (ADS)
Sweeney, A. D.; Stroker, K. J.; Mungov, G.; McLean, S. J.
2015-12-01
Sea levels recorded at coastal stations and inferred from deep-ocean pressure observations at the seafloor are submitted for archive in multiple data and metadata formats. These formats include two forms of schema-less XML and a custom binary format accompanied by metadata in a spreadsheet. The authors report on efforts to use existing standards to make these data more discoverable and more useful beyond their initial use in detecting tsunamis. An initial review of data formats for sea level data around the globe revealed heterogeneity in presentation and content. In the absence of a widely used domain-specific format, we adopted the general model for structuring data and metadata expressed by the Network Common Data Form (netCDF). netCDF has been endorsed by the Open Geospatial Consortium, has the advantage of small size when compared to an equivalent plain-text representation, and provides a standard way of embedding metadata in the same file. We followed the orthogonal time-series profile of the Climate and Forecast discrete sampling geometries as the convention for structuring the data and describing metadata relevant for use. We adhered to the Attribute Convention for Data Discovery for capturing metadata to support user search. Beyond making it possible to structure data and metadata in a standard way, netCDF is supported by multiple software tools that provide programmatic cataloging, access, subsetting, and transformation to other formats. We will describe our successes and failures in adhering to existing standards and provide requirements for either augmenting existing conventions or developing new ones. Some of these enhancements are specific to sea level data, while others are applicable to time-series data in general.
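A minimal sketch, using the netCDF4-python library, of the structure described: an orthogonal time-series file carrying CF discrete-sampling-geometry and ACDD attributes. The station name, coordinates, values, and the water_surface_height_above_reference_datum standard name are illustrative choices, not the authors' actual archive schema.

```python
import netCDF4
import numpy as np

ds = netCDF4.Dataset("water_level.nc", "w", format="NETCDF4_CLASSIC")
ds.Conventions = "CF-1.6, ACDD-1.3"
ds.featureType = "timeSeries"                    # CF discrete sampling geometry
ds.title = "Coastal water level observations"    # ACDD discovery attributes
ds.summary = "One-minute water levels at a single coastal station."

ds.createDimension("time", None)
ds.createDimension("name_strlen", 16)

station = ds.createVariable("station_name", "S1", ("name_strlen",))
station.cf_role = "timeseries_id"                # identifies the feature
station[:] = netCDF4.stringtochar(
    np.array(["STATION_A".ljust(16)], dtype="S16"))[0]

lat = ds.createVariable("lat", "f4")
lat.standard_name = "latitude"; lat.units = "degrees_north"
lat.assignValue(36.95)
lon = ds.createVariable("lon", "f4")
lon.standard_name = "longitude"; lon.units = "degrees_east"
lon.assignValue(-122.02)

time = ds.createVariable("time", "f8", ("time",))
time.standard_name = "time"
time.units = "seconds since 1970-01-01T00:00:00Z"

wl = ds.createVariable("water_level", "f4", ("time",), fill_value=-9999.0)
wl.standard_name = "water_surface_height_above_reference_datum"
wl.units = "m"
wl.coordinates = "time lat lon"

time[:] = np.arange(60) * 60.0                   # one hour of 1-minute samples
wl[:] = 1.5 + 0.2 * np.sin(np.arange(60) / 10.0) # placeholder values
ds.close()
```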
Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Arms, S. C.
2015-12-01
Field data obtained from dataloggers often take the form of comma separated value (CSV) ASCII text files. While ASCII-based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger-generated ASCII output into Climate and Forecast (CF) compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine-readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built in. One benefit of translating ASCII data into a machine-readable format that follows open community-driven standards is that the data can instantly take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
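The reverse path mentioned above, netCDF back to a spreadsheet-friendly format, can be sketched in a few lines with xarray; the file name is a hypothetical Rosetta output, not part of the tool itself.

```python
# Convert a CF discrete-sampling-geometry netCDF file back to CSV.
import xarray as xr

ds = xr.open_dataset("datalogger_output.nc")   # hypothetical Rosetta output
df = ds.to_dataframe()                         # one row per sample
df.to_csv("datalogger_output.csv")
```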
NASA Astrophysics Data System (ADS)
Smith, M. J.; Vardaro, M.; Crowley, M. F.; Glenn, S. M.; Schofield, O.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Fram, J. P.; Kerfoot, J.
2016-02-01
The Ocean Observatories Initiative (OOI), funded by the National Science Foundation, provides users with access to long-term datasets from a variety of oceanographic sensors. The Endurance Array in the Pacific Ocean consists of two separate lines off the coasts of Oregon and Washington. The Oregon line consists of 7 moorings, two cabled benthic experiment packages, and 6 underwater gliders. The Washington line comprises 6 moorings and 6 gliders. Each mooring is outfitted with a variety of instrument packages. The raw data from these instruments are sent to shore via satellite communication and, in some cases, via fiber optic cable. The raw data are then sent to the cyberinfrastructure (CI) group at Rutgers, where they are aggregated, parsed into thousands of different data streams, and integrated into a software package called uFrame. The OOI CI delivers the data to the general public via a web interface that outputs data into commonly used scientific data file formats such as JSON, netCDF, and CSV. The Rutgers data management team has developed a series of command-line Python tools that streamline data acquisition in order to facilitate the QA/QC review process. The first step in the process is querying the uFrame database for a list of all available platforms. From this list, a user can choose a specific platform and automatically download all available datasets from the specified platform. The downloaded dataset is plotted using a generalized Python netCDF plotting routine built on the matplotlib visualization library. This routine loads each netCDF file separately and outputs plots for each available parameter. These Python tools have been uploaded to a GitHub repository that is openly available to help facilitate OOI data access and visualization.
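A minimal sketch of a generalized netCDF plotting routine of the kind described: load one file and plot every time-dependent variable against time with matplotlib. The single-file layout and variable naming are assumptions for illustration, not the actual OOI/uFrame schema.

```python
import matplotlib
matplotlib.use("Agg")            # render to files; no display needed
import matplotlib.pyplot as plt
import netCDF4

def plot_all_parameters(path):
    with netCDF4.Dataset(path) as ds:
        time_var = ds.variables["time"]
        t = netCDF4.num2date(time_var[:], time_var.units)
        for name, var in ds.variables.items():
            # Plot only 1-D variables defined along the time dimension
            if name == "time" or var.dimensions != ("time",):
                continue
            fig, ax = plt.subplots()
            ax.plot(t, var[:], ".", markersize=2)
            ax.set_xlabel("time")
            ax.set_ylabel(f"{name} ({getattr(var, 'units', '')})")
            fig.autofmt_xdate()
            fig.savefig(f"{name}.png", dpi=100)
            plt.close(fig)

plot_all_parameters("deployment0001_platform-instrument.nc")  # hypothetical
```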
NASA Astrophysics Data System (ADS)
Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.
2007-12-01
The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as model, satellite, and radar data. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range, or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, the NOAA FTP server, or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations, and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS), and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be OGC-compliant specifications) also provide access.
SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.
2015-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning-based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (i.e., using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dimensional array libraries (scala breeze, java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems. The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
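A toy PySpark sketch of the parallel-ingest pattern described: each executor pulls its partition of arrays into memory from a time-sorted URL list, then a reduction computes a simple statistic. The URLs and variable name are placeholders, and this uses plain Spark RDDs rather than SciSpark's sRDD.

```python
from pyspark import SparkContext
import numpy as np

def load_grid(url, varname="ta"):
    # netCDF4 can open OPeNDAP URLs directly, e.g. "http://.../model.nc"
    import netCDF4
    with netCDF4.Dataset(url) as ds:
        return np.asarray(ds.variables[varname][:])

sc = SparkContext(appName="scispark-style-sketch")
urls = ["http://example.org/opendap/model_%04d.nc" % i for i in range(100)]
grids = sc.parallelize(urls, numSlices=10).map(load_grid)

# Per-granule means, then a global mean across the whole collection
means = grids.map(lambda a: float(a.mean())).collect()
print("global mean:", np.mean(means))
```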
Access to Inter-Organization Computer Networks.
1985-08-01
Keywords: management of computing and information systems; system management. The report examines what happens when two organizations interconnect their computer networks, including the necessary control mechanisms and message-based gateways that support non-real-time invocation of services (e.g., file and print servers, financial services). Subject areas include network operations (C.2.3), electronic mail (H.4.3), public policy issues (K.4.1), organizational impacts (K.4.3), and management of computing and information systems (K.6).
Rollout of Endeavour at Palmdale, California (Part 1 of 2)
NASA Technical Reports Server (NTRS)
1991-01-01
Footage shows the rollout ceremonies for Endeavour, including the display of colors, invocation, and speeches by Sam Iacobellis, Executive Vice-President and CEO of Rockwell International, Richard H. Truly, Administrator for NASA, and Senator Jake Garn (Utah). The tape ends during the speech by Senator Garn and continues on part two (Input Processing ID 2000152220, Document ID 20010010951). Endeavour rolls out to music provided by the band on-site.
Efficient NC Algorithms for Set Cover Applications to Learning and Geometry.
1989-05-01
required for one invocation of [L2]. The number of processors is $O(\sum_{e \in E} |e|^2 + n^2) = O(n \sum_{e \in E} |e| + n^2)$. Assuming the $n^2$ term is insignificant ... we get a good P. The number of processors for our deterministic selection procedure will be $O(\sum_{e \in E} |e|^2 + |V|^2)$, which is at most $n$ times the input size.
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned from experience with prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because the requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation, its rule-driven, plug-in design lends itself to other automated data-processing applications.
Distributed spatial information integration based on web service
NASA Astrophysics Data System (ADS)
Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng
2008-10-01
Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed, often heterogeneous, and independent from each other. As a result, many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web services. The method applies asynchronous invocation of web services and dynamic invocation of web services to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web services and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image is returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
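A sketch of the dispatcher pattern described: several WMS GetMap endpoints are invoked concurrently and the returned transparent images are composited into one map. The endpoint URLs are placeholders; the GetMap parameters follow the standard OGC WMS query interface.

```python
import concurrent.futures
import io
import requests
from PIL import Image

PARAMS = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "roads", "SRS": "EPSG:4326", "BBOX": "110,20,120,30",
    "WIDTH": "512", "HEIGHT": "512", "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}

def get_map(url):
    r = requests.get(url, params=PARAMS, timeout=30)
    r.raise_for_status()
    return Image.open(io.BytesIO(r.content)).convert("RGBA")

servers = ["http://gis1.example.org/wms", "http://gis2.example.org/wms"]
with concurrent.futures.ThreadPoolExecutor() as pool:
    images = list(pool.map(get_map, servers))   # all services run in parallel

composite = images[0]
for overlay in images[1:]:
    composite = Image.alpha_composite(composite, overlay)
composite.save("integrated_map.png")
```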
Improving the Accessibility and Use of NASA Earth Science Data
NASA Technical Reports Server (NTRS)
Tisdale, Matthew; Tisdale, Brian
2015-01-01
Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from the diverse end-user communities for geospatial tools to handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; use NetCDF, GRIB, and HDF raster data formats directly across applications; and publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.
The Comparison of Point Data Models for the Output of WRF Hydro Model in the IDV
NASA Astrophysics Data System (ADS)
Ho, Y.; Weber, J.
2017-12-01
WRF Hydro netCDF output files contain streamflow, flow depth, longitude, latitude, altitude and stream order values for each forecast point. However, the data are not CF compliant. The total number of forecast points for the US CONUS is approximately 2.7 million, which is a big challenge for any visualization and analysis tool. The IDV point cloud display shows point data as a set of points colored by parameter. This display is very efficient compared to a standard point type display for rendering a large number of points. The one problem we have is that data I/O can become a bottleneck when dealing with a large collection of point input files. In this presentation, we will experiment with different point data models and their APIs to access the same WRF Hydro model output. The results will help us construct a CF-compliant netCDF point data format for the community.
An open source Java web application to build self-contained Web GIS sites
NASA Astrophysics Data System (ADS)
Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.
2014-12-01
This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. This project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows the display of 4D data stored in the NetCDF file format (widely used for storing environmental model datasets). Some of the features available on the sites built with OWGIS are: multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS's main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.
Data Mining Web Services for Science Data Repositories
NASA Astrophysics Data System (ADS)
Graves, S.; Ramachandran, R.; Keiser, K.; Maskey, M.; Lynnes, C.; Pham, L.
2006-12-01
The maturation of web services standards and technologies sets the stage for a distributed "Service-Oriented Architecture" (SOA) for NASA's next generation science data processing. This architecture will allow members of the scientific community to create and combine persistent distributed data processing services and make them available to other users over the Internet. NASA has initiated a project to create a suite of specialized data mining web services designed specifically for science data. The project leverages the Algorithm Development and Mining (ADaM) toolkit as its basis. The ADaM toolkit is a robust, mature and freely available science data mining toolkit that is being used by several research organizations and educational institutions worldwide. These mining services will give the scientific community a powerful and versatile data mining capability that can be used to create higher order products such as thematic maps from current and future NASA satellite data records with methods that are not currently available. The package of mining and related services are being developed using Web Services standards so that community-based measurement processing systems can access and interoperate with them. These standards-based services allow users different options for utilizing them, from direct remote invocation by a client application to deployment of a Business Process Execution Language (BPEL) solutions package where a complex data mining workflow is exposed to others as a single service. The ability to deploy and operate these services at a data archive allows the data mining algorithms to be run where the data are stored, a more efficient scenario than moving large amounts of data over the network. This will be demonstrated in a scenario in which a user uses a remote Web-Service-enabled clustering algorithm to create cloud masks from satellite imagery at the Goddard Earth Sciences Data and Information Services Center (GES DISC).
Incorporating Brokers within Collaboration Environments
NASA Astrophysics Data System (ADS)
Rajasekar, A.; Moore, R.; de Torcy, A.
2013-12-01
A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients to the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assemble the required input files, automate the execution of the workflow, automatically track the provenance of the workflow, and share the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.
Gaustad, Krista; Hardin, Joseph
2015-12-14
The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.
Gaustad, Krista; Hardin, Joseph
2015-07-22
The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.
Gaustad, Krista; Hardin, Joseph
2015-07-22
The kasacr PCM process executed by the sacr3 binary reads in kasacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.
Joint Battlespace Infosphere: Information Management Within a C2 Enterprise
2005-06-01
In version 1.2, both MySQL and Oracle are supported as underlying implementations, with the XML metadata schema mapped into relational tables. Supporting services include identity servers, role-based access control, and policy representation; supported databases include Oracle, MySQL, TigerLogic, and Berkeley XML DB. Requests are converted to SQL for execution; invocations are then forwarded to the appropriate underlying IOR core components, which have the responsibility of issuing them.
Cuba: Multidimensional numerical integration library
NASA Astrophysics Data System (ADS)
Hahn, Thomas
2016-08-01
The Cuba library offers four independent routines for multidimensional numerical integration: Vegas, Suave, Divonne, and Cuhre. The four algorithms work by very different methods; all can integrate vector integrands, and all have very similar Fortran, C/C++, and Mathematica interfaces. Their invocation is very similar, making it easy to cross-check results by substituting one method for another. For further safeguarding, the output is supplemented by a chi-square probability which quantifies the reliability of the error estimate.
Stroke in ancient times: a reinterpretation of Psalms 137:5,6.
Resende, Luiz Antonio de Lima; Weber, Silke Anna Theresa; Bertotti, Marcelo Fernando Zeugner; Agapejev, Svetlana
2008-09-01
Stroke was probably first described in Psalms 136:5-6 of the Catholic Bible, and Psalms 137:5-6 of the Evangelical Bible. Based on the Portuguese, Spanish, English, German, Dutch, Russian, Greek, and original Hebrew Bible, the significance of this Psalm is the invocation of a punishment, of which the final result would be a stroke of the left middle cerebral artery, causing motor aphasia and right hemiparesis.
2009-09-01
Twelve civilians (7 men and 5 women) with no prior experience with the Robotic NCO simulation participated in this study of operators in a multitasking environment. Automation may be invoked by critical events, by a model of operator performance, or by a hybrid method which combines one or more of these different invocation techniques. Subject terms: design guidelines, robotics, simulation, unmanned systems, automation.
McMahon, B T; Domer, T M
1997-01-01
No federal or state lawmaker could have foreseen the nuances involved in the mutual implementations of the Americans with Disabilities Act, the Family and Medical Leave Act, and state workers' compensation statutes. These laws are compared and contrasted on a number of key issues. Readers are provided with a decision matrix to guide them and those they represent in the judicious invocation of the most beneficial statute for each issue.
Adaptive Automation and Cue Invocation: The Effect of Cue Timing on Operator Error
2013-05-01
Prospective memory errors involve memory for intended actions that are planned to be performed at some designated point in the future [20]. The RESCHU simulation [21] was used in this study; a Navy pilot who is familiar with supervisory control tasks designed the RESCHU task.
MAPI: towards the integrated exploitation of bioinformatics Web Services.
Ramirez, Sergio; Karlsson, Johan; Trelles, Oswaldo
2011-10-27
Bioinformatics is commonly featured as a well-assorted list of available web resources. Although diversity of services is positive in general, the proliferation of tools, and their dispersion and heterogeneity, complicate the integrated exploitation of such data processing capacity. To facilitate the construction of software clients and make integrated use of this variety of tools, we present a modular programmatic application interface (MAPI) that provides the necessary functionality for the uniform representation of Web Service metadata descriptors, including their management and the invocation protocols of the services they represent. This document describes the main functionality of the framework and how it can be used to facilitate the deployment of new software under a unified structure of bioinformatics Web Services. A notable feature of MAPI is the modular organization of the functionality into different modules associated with specific tasks. This means that only the modules needed by the client have to be installed, and that the module functionality can be extended without the need to re-write the software client. The potential utility and versatility of the software library have been demonstrated by the implementation of several currently available clients that cover different aspects of integrated data processing, ranging from service discovery to service invocation, with advanced features such as workflow composition and asynchronous service calls to multiple types of Web Services, including those registered in repositories (e.g. GRID-based, SOAP, BioMOBY, R-bioconductor, and others).
FLASH_SSF_Aqua-FM3-MODIS_Version3C
Atmospheric Science Data Center
2018-04-04
Data may be ordered via the CERES Order Tool (netCDF), subset via the CERES Search and Subset Tool (HDF4 & netCDF), or ordered through Earthdata Search; guide documents are available. Parameters include cloud layer area, cloud infrared emissivity, cloud base pressure, surface (radiative) flux, TOA flux, surface types, TOT filtered radiance, SW filtered radiance, and LW flux.
FLASH_SSF_Terra-FM1-MODIS_Version3C
Atmospheric Science Data Center
2018-04-04
Data may be ordered via the CERES Order Tool (netCDF), subset via the CERES Search and Subset Tool (HDF4 & netCDF), or ordered through Earthdata Search; guide documents are available. Parameters include cloud layer area, cloud infrared emissivity, cloud base pressure, surface (radiative) flux, TOA flux, surface types, TOT filtered radiance, SW filtered radiance, and LW flux.
NASA Technical Reports Server (NTRS)
Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent
2017-01-01
This study explored three candidate architectures with different types of objects and access paths for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance for each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF4 data in a cloud (web object store) environment, the target environment being the Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We describe the three architectures and the use cases, along with performance results and recommendations for further work.
IVS Working Group 4: VLBI Data Structures
NASA Astrophysics Data System (ADS)
Gipson, J.
2012-12-01
I present an overview of the "openDB format" for storing, archiving, and processing VLBI data. In this scheme, most VLBI data are stored in NetCDF files. NetCDF has the advantage that there are interfaces to most common computer languages, including Fortran, Fortran-90, C, C++, Perl, etc., and to the most common operating systems, including Linux, Windows, and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data. For example, it allows one to easily change subsets of the data used in the analysis, such as troposphere modeling, ionospheric calibration, editing, and ambiguity resolution. It also allows for extending the types of data used, e.g., source maps. I present a roadmap for the transition to this new format. The new format can already be used by VieVS and by the global mode of solve. Plans are in work for other software packages to be able to use the new format.
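A sketch of how a session organized this way might be opened: a plain-ASCII "wrapper" file whose lines point at the NetCDF files holding the actual observables. The wrapper syntax shown (one relative path per line, '#' comments) and the file names are simplifications for illustration, not the official specification.

```python
import os
import netCDF4

def open_session(wrapper_path):
    base = os.path.dirname(wrapper_path)
    datasets = {}
    with open(wrapper_path) as wrapper:
        for line in wrapper:
            line = line.strip()
            if not line or line.startswith("#"):
                continue                     # skip blanks and comments
            datasets[line] = netCDF4.Dataset(os.path.join(base, line))
    # Swapping one pointer (e.g. a hypothetical 'TroposphereModel.nc')
    # for another changes the analysis without touching the other files.
    return datasets

session = open_session("session_2012JAN02.wrp")   # hypothetical wrapper file
```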
'Best' Practices for Aggregating Subset Results from Archived Datasets
NASA Astrophysics Data System (ADS)
Baskin, W. E.; Perez, J.
2013-12-01
In response to the exponential growth in science data analysis and visualization capabilities, data centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Sciences Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files, as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed:
- Applications of NetCDF CF conventions to aggregated level 2 satellite subsets
- Data-provider-specific format requirements vs. generalized standards
- Organization of the file structure of aggregated NetCDF subset output
- Global attributes of individual subsetted files vs. aggregated results
- Specific applications and framework used for subsetting and delivering derivative data files
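As a rough sketch of the aggregation step discussed above, the snippet below concatenates per-granule subset files into a single deliverable and records the source granules in a global attribute so per-file provenance is not lost. The file names and the time concatenation dimension are placeholder assumptions.

```python
import glob
import xarray as xr

granules = sorted(glob.glob("subset_granule_*.nc"))   # hypothetical inputs
parts = [xr.open_dataset(g) for g in granules]
merged = xr.concat(parts, dim="time")                 # aggregate along time
merged.attrs["source_granules"] = ", ".join(granules) # keep per-file lineage
merged.attrs["Conventions"] = "CF-1.6"
merged.to_netcdf("aggregated_subset.nc")
```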
A Climate Statistics Tool and Data Repository
NASA Astrophysics Data System (ADS)
Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.
2017-12-01
Researchers at Argonne National Laboratory and collaborating organizations have generated regional-scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12 km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global-scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, and is stored in Network Common Data Form (NetCDF) files; the data volume is nearly 600 TB. A condensed 800 GB set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094) and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper will provide an overview of the project to generate the main and condensed data repositories, describe the Python tool and how to use it, present the GIS results of the computed examples, and discuss some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.
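A minimal sketch of one of the summary statistics described: counting days per grid cell with maximum temperature above 90°F across a stack of daily files. The file layout, the variable name (T2MAX), and Kelvin units are assumptions about the WRF output, not the project's actual schema.

```python
import glob
import numpy as np
import netCDF4

THRESHOLD_K = (90.0 - 32.0) * 5.0 / 9.0 + 273.15   # 90 F expressed in Kelvin

count = None
for path in sorted(glob.glob("wrf_daily_*.nc")):    # hypothetical file layout
    with netCDF4.Dataset(path) as ds:
        tmax = np.asarray(ds.variables["T2MAX"][:])  # shape (time, y, x)
        hot_days = (tmax > THRESHOLD_K).sum(axis=0)  # days over 90 F per cell
        count = hot_days if count is None else count + hot_days

np.save("days_over_90F.npy", count)   # grid ready to load as a GIS layer
```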
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander
2017-04-01
For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, realizing the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software. This part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part representing the Web GIS client, developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs), and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to the interfaces of such popular desktop GIS applications as uDig, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. According to general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers: • a tier of NetCDF metadata in JSON format; • a middleware tier of JavaScript objects implementing methods to work with NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services; • a graphical user interface tier of JavaScript objects realizing the general application business logic. The Web GIS supports the launching of computational processing services for solving tasks in the area of environmental monitoring, as well as the presentation of calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
Climate Prediction Center - NCEP Global Ocean Data Assimilation System:
Monthly NCEP Global Ocean Data Assimilation System products are available in NetCDF and other formats. These products from the National Centers for Environmental Prediction (NCEP) are a valuable community asset for monitoring different aspects of ocean climate. Related resources include the NOAA Ocean Climate Observation Program (OCO) and the Climate Test Bed.
Visualization of ocean forecast in BYTHOS
NASA Astrophysics Data System (ADS)
Zhuk, E.; Zodiatis, G.; Nikolaidis, A.; Stylianou, S.; Karaolia, A.
2016-08-01
The Cyprus Oceanography Center has been constantly searching for new ideas for developing and implementing innovative methods and new developments concerning the use of information systems in oceanography, to suit both the Center's monitoring and forecasting products. Within this scope, two major online data management and visualization systems have been developed and utilized: CYCOFOS and BYTHOS. The Cyprus Coastal Ocean Forecasting and Observing System (CYCOFOS) provides a variety of operational predictions, such as ultra-high, high and medium resolution ocean forecasts in the Levantine Basin, offshore and coastal sea state forecasts in the Mediterranean and Black Sea, tide forecasting in the Mediterranean, ocean remote sensing in the Eastern Mediterranean, and coastal and offshore monitoring. As a rich internet application, BYTHOS enables scientists to search, visualize and download oceanographic data online and in real time. The recent improvement of the BYTHOS system is its extension to the access and visualization of CYCOFOS data, with the ability to overlay forecast fields and observing data. The CYCOFOS data are stored on an OPeNDAP server in netCDF format. PHP and Python scripts were developed to search, process and visualize them, with data visualization achieved through MapServer. The BYTHOS forecast access interface allows users to search for the necessary forecast field by type, parameter, region, level and time. It also provides the opportunity to overlay different forecast and observing data, which can be used for comprehensive analysis of sea basin conditions.
Web Services as Building Blocks for an Open Coastal Observing System
NASA Astrophysics Data System (ADS)
Breitbach, G.; Krasemann, H.
2012-04-01
Coastal observing systems need to integrate different observing methods, such as remote sensing, in-situ measurements, and models, into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic seas). Data from satellite and radar remote sensing and measurements from buoys, stations and FerryBoxes form the observation part of COSYNA. These data are assimilated into models to create pre-operational forecasts. For discovering data, an OGC Web Feature Service (WFS) is used by the COSYNA data portal. This Web Feature Service holds not only the metadata necessary for finding data, but also the URLs of web services to view and download the data. To make data from different resources comparable, a common vocabulary is needed. For COSYNA, the standard names from the CF conventions are stored within the metadata whenever possible. For the metadata, an INSPIRE- and ISO 19115-compatible data format is used. The WFS is fed from the metadata system using database views. Actual data are stored in two different formats: in NetCDF files for gridded data and in an RDBMS for time-series-like data. The web services are mostly standards-based; the standards are mainly OGC standards. Maps are created from NetCDF files with the help of the ncWMS tool, whereas a self-developed Java servlet is used for maps of moving measurement platforms; in this case, download of data is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download. The OGC CSW is used for accessing extended metadata. The concept of data management in COSYNA will be presented; it is independent of the specific services used in COSYNA. This concept is parameter- and data-centric and might be useful for other observing systems.
Software reuse example and challenges at NSIDC
NASA Astrophysics Data System (ADS)
Billingsley, B. W.; Brodzik, M.; Collins, J. A.
2009-12-01
NSIDC has created a new data discovery and access system, Searchlight, to provide users with the data they want in the format they want. NSIDC Searchlight supports discovery and access to disparate data types with on-the-fly reprojection, regridding and reformatting. Architected both to reuse open source systems and to be reused itself, Searchlight reuses GDAL and Proj4 for manipulating data and format conversions, the netCDF Java library for creating netCDF output, MapServer and OpenLayers for defining spatial criteria, and the JTS Topology Suite (JTS) in conjunction with Hibernate Spatial for database interaction and rich OGC-compliant spatial objects. The application reuses popular Java and JavaScript libraries including Struts 2, Spring, JPA (Hibernate), SiteMesh, JFreeChart, jQuery and Dojo, together with a PostGIS/PostgreSQL database. Future reuse of Searchlight components is supported at varying architectural levels, ranging from the database and model components to web services. We present the tools, libraries and programs that Searchlight has reused. We describe the architecture of Searchlight and explain the strategies deployed for reusing existing software and how Searchlight is built for reuse. We will discuss NSIDC reuse of the Searchlight components to support rapid development of new data delivery systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosler, Peter
Stride Search provides a flexible tool for detecting storms or other extreme climate events in high-resolution climate data sets saved on uniform latitude-longitude grids in standard NetCDF format. Users provide the software a quantitative description of a meteorological event they are interested in; the software searches a data set for locations in space and time that meet the user's description. In its first stage, Stride Search performs a spatial search of the data set at each timestep by dividing a search domain into circular sectors of constant geodesic radius. Data from a netCDF file is read into memory for each circular search sector. If the data meet or exceed a set of storm identification criteria (defined by the user), a storm is recorded to a linked list. Finally, the linked list is examined, duplicate detections of the same storm are removed, and the results are written to an output file. The first stage's output file is read by a second program that builds storm tracks. Additional identification criteria may be applied at this stage to further classify storms. Storm tracks are the software's ultimate output, and routines are provided for formatting that output for various external software libraries for plotting and tabulating data.
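A sketch of the first-stage spatial search described: the domain is divided into circular sectors of constant geodesic radius, and a sector is flagged when its maximum wind exceeds a criterion. The grid layout, variable roles, and the wind-speed criterion are placeholders; in the real tool the identification criteria are user-defined.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def geodesic_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between one point and arrays of points."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlon = np.radians(lon2 - lon1)
    c = np.sin(p1) * np.sin(p2) + np.cos(p1) * np.cos(p2) * np.cos(dlon)
    return EARTH_RADIUS_KM * np.arccos(np.clip(c, -1.0, 1.0))

def search_timestep(lats, lons, wind, centers, radius_km=500.0, wind_min=33.0):
    """Return sector centers whose maximum wind meets the criterion."""
    glon, glat = np.meshgrid(lons, lats)      # grid matching wind's (lat, lon)
    detections = []
    for clat, clon in centers:
        in_sector = geodesic_km(clat, clon, glat, glon) <= radius_km
        if in_sector.any() and wind[in_sector].max() >= wind_min:
            detections.append((clat, clon, float(wind[in_sector].max())))
    return detections
```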
NaradaBrokering as Middleware Fabric for Grid-based Remote Visualization Services
NASA Astrophysics Data System (ADS)
Pallickara, S.; Erlebacher, G.; Yuen, D.; Fox, G.; Pierce, M.
2003-12-01
Remote Visualization Services (RVS) have tended to rely on approaches based on the client-server paradigm. The simplicity of these approaches is offset by problems such as single points of failure, scaling, and availability. Furthermore, as the complexity, scale and scope of the services hosted on this paradigm increase, the approach becomes increasingly unsuitable. We propose a scheme built on top of a distributed brokering infrastructure, NaradaBrokering, which comprises a distributed network of broker nodes. These broker nodes are organized in a cluster-based architecture that can scale to very large sizes. The broker network is resilient to broker failures and efficiently routes interactions to entities that expressed an interest in them. In our approach to RVS, services advertise their capabilities to the broker network, which manages these service advertisements. Among the services considered within our system are those that perform graphic transformations, those that mediate access to specialized datasets, and finally those that manage the execution of specified tasks. There could be multiple instances of each of these services, and the system ensures that the load for a given service is distributed efficiently over these service instances. Among the features provided in our approach are efficient discovery of services and asynchronous interactions between services and service requestors (which could themselves be other services). Entities need not be online during the execution of the service request. The system also ensures that entities can be notified about task executions, partial results and failures that might have taken place during service execution. The system also facilitates the specification of task overrides, and the distribution of execution results to alternate devices (which were not used to originally request service execution) and to multiple users. These RVS services could of course be either OGSA (Open Grid Services Architecture) based Grid services or traditional Web services. The brokering infrastructure will manage the service advertisements and the invocation of these services. This scheme ensures that the fundamental Grid computing concept is met: providing the computing capabilities of those that are willing to offer them to those that seek them. [1] The NaradaBrokering Project: http://www.naradabrokering.org
Functional description of the ISIS system
NASA Technical Reports Server (NTRS)
Berman, W. J.
1979-01-01
Development of software for avionic and aerospace applications (flight software) is influenced by a unique combination of factors which includes: (1) the length of the life cycle of each project; (2) the necessity for cooperation between the aerospace industry and NASA; (3) the need for flight software that is highly reliable; (4) the increasing complexity and size of flight software; and (5) the high quality of the programmers and the tightening of project budgets. The Interactive Software Invocation System (ISIS) described here is designed to overcome the problems created by this combination of factors.
1982-11-12
Figure 3 shows the KAPSE/Host interface: file I/O, program invocation, and other access and control services layered between the KAPSE and the host operating system, peripherals, and networks. The TRANSITORY flag is used to prevent permanent dependence on temporary windows created simply for focusing on a part of a structure. KAPSE/Tool interfaces are defined in terms of these low-level host-independent interfaces; in addition, the KAPSE/Host interface packages prevent the application from depending directly on the host.
NASA Technical Reports Server (NTRS)
Grantham, C.
1979-01-01
The Interactive Software Invocation System (ISIS), an interactive data management system, was developed to act as a buffer between the user and the host computer system. ISIS provides the user with a powerful system for developing software or systems in the interactive environment. The user is protected from the idiosyncrasies of the host computer system by such a complete range of capabilities that the user should have no need for direct access to the host computer. These capabilities are divided into four areas: desk-top calculator, data editor, file manager, and tool invoker.
Using OPeNDAP's Data-Services Framework to Lift Mash-Ups above Blind Dates
NASA Astrophysics Data System (ADS)
Gallagher, J. H. R.; Fulker, D. W.
2015-12-01
OPeNDAP's data-as-service framework (Hyrax) matches diverse sources with many end-user tools and contexts. Keys to its flexibility include:
- A data model embracing tabular data alongside n-dimensional arrays and other structures useful in geoinformatics.
- A REST-like protocol that supports, via suffix notation, a growing set of output forms (netCDF, XML, etc.) plus a query syntax for subsetting. Subsetting applies (via constraints on column values) to tabular data or (via constraints on indices or coordinates) to array-style data.
- A handler-style architecture that admits a growing set of input types. Community members may contribute handlers, making Hyrax effective as middleware, where N sources are mapped to M outputs with order N+M effort (not NxM).
Hyrax offers virtual aggregations of source data, enabling granularity aimed at users, not data collectors. OPeNDAP access libraries exist in multiple languages, including Python, Java, and C++. Recent enhancements are increasing this framework's interoperability (i.e., its mash-up potential). Extensions implemented as servlets, running adjacent to Hyrax, are enriching the forms of aggregation and enabling new protocols:
- User-specified aggregations, namely applying a query to (huge) lists of source granules and receiving one (large) table or zipped netCDF file.
- OGC (Open Geospatial Consortium) protocols, WMS and WCS.
- A Webification (W10n) protocol that returns JavaScript Object Notation (JSON).
Extensions to OPeNDAP's query language are reducing transfer volumes and enabling new forms of inspection. Advances underway include:
- Functions that, for triangular-mesh sources, return sub-meshes specified via geospatial bounding boxes.
- Functions that, for data from multiple satellite-borne sensors (with differing orbits), select observations based on coincidence.
- Calculations of means, histograms, etc. that greatly reduce output volumes.
- Paths for communities to contribute new server functions (in Python, e.g.) that data providers may incorporate into Hyrax via installation parameters.
One could say Hyrax itself is a mash-up, but we suggest it as an instrument for a mash-up artist's toolbox. This instrument can support mash-ups built on netCDF files, OGC protocols, JavaScript Web pages, and/or programs written in Python, Java, C or C++.
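A small sketch of the subsetting behavior described: DAP servers such as Hyrax expose URLs that netCDF4-python can open directly, and index-range constraints mean only the requested slab crosses the wire. The URL below points at the public OPeNDAP test server and its well-known sample dataset; it may change over time.

```python
import netCDF4

url = "http://test.opendap.org/opendap/data/nc/coads_climatology.nc"
with netCDF4.Dataset(url) as ds:                 # remote open via OPeNDAP
    sst = ds.variables["SST"][0, 0:30, 0:40]     # only this slab is transferred
    print(sst.shape, float(sst.mean()))
```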
NASA Astrophysics Data System (ADS)
Zender, Charles S.
2016-09-01
Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25-80 and 5-65 %, respectively, for single-precision values (the most common case for climate data) quantized to retain 1-5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1-2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing. Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
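A minimal NumPy sketch of the quantization idea described: alternately shaving and setting the least-significant mantissa bits of consecutive float32 values. The digits-to-bits conversion is approximate, and the production implementation (in the NCO tools) additionally protects missing values and special values; this is an illustration only.

```python
import math
import numpy as np

def bit_groom(a, digits=3):
    """Quantize a float32 array, retaining roughly `digits` decimal digits."""
    a = np.array(a, dtype=np.float32)            # work on a contiguous copy
    keep = math.ceil(digits * math.log2(10))     # mantissa bits to retain
    drop = 23 - keep                             # float32 has 23 mantissa bits
    if drop <= 0:
        return a                                 # nothing to quantize
    bits = a.reshape(-1).view(np.uint32)         # reinterpret as raw bits
    shave = np.uint32((0xFFFFFFFF << drop) & 0xFFFFFFFF)
    setm = np.uint32((1 << drop) - 1)
    bits[0::2] &= shave    # shave even elements (quantize toward zero)
    bits[1::2] |= setm     # set odd elements (quantize away from zero)
    return a

x = np.linspace(0.0, 1.0, 8).astype(np.float32)
print(bit_groom(x, digits=2))   # groomed values differ only in trailing bits
```

The alternation is what removes the low bias of plain bit shaving: averaged over an array, the downward-rounded and upward-rounded elements roughly cancel.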
NASA Astrophysics Data System (ADS)
Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.
2015-12-01
An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, includes the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e. columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6, but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident suggests that testing a candidate data product with one or more software packages written to accept the advertised conventions is a practice which improves interoperability. Differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data or both. The incident can also serve as a demonstration that data providers, distributors, and users can work together to improve data product quality and interoperability.
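The GDAL step that exposed the problem takes only a few lines to reproduce; a sketch with hypothetical file and variable names:

```python
from osgeo import gdal

# The NETCDF:"file":variable syntax selects one variable (subdataset) from a
# multi-variable netCDF-4/CF file. Names here are placeholders.
src = 'NETCDF:"seaice_conc.nc":ice_conc'

# gdal.Translate reads the CF grid-mapping (CRS) variables; if those variables
# are missing, as in the incident described above, the GeoTIFF comes out
# without proper georeferencing, exposing the conformance gap.
gdal.Translate("seaice_conc.tif", src, format="GTiff")

print(gdal.Info("seaice_conc.tif"))   # inspect the CRS GDAL actually recovered
```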
NASA Astrophysics Data System (ADS)
Mihajlovski, A.; Spinuso, A.; Plieger, M.; Som de Cerff, W.
2016-12-01
Modern climate analysis platforms provide generic and standardized ways of accessing data and processing services, typically supported by a wide range of OGC formats and interfaces. However, the problem of instrumentally tracing the lineage of the transformations occurring on a dataset and its provenance remains an open challenge. It requires standards-driven and interoperable solutions to facilitate understanding and sharing of self-describing data products, fostering collaboration among peers. The CLIPC portal provided us with a real use case in which instrumented provenance management is fundamental. CLIPC provides a single point of access for scientific information on climate change. The data about the physical environment which is used to inform climate change policy and adaptation measures comes from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses. This is made possible through the Copernicus Earth Observation Programme for Europe. With a backbone combining WPS and OPeNDAP services, CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses; 2. A climate impact tool kit to evaluate, rank and aggregate indicators. The climate impact tool kit is realised with the orchestration of a number of WPS that ingest, normalize and combine NetCDF files. The WPS allowing this specific computation are hosted by the climate4impact portal, which is a more generic climate data-access and processing service. In this context, guaranteeing validation and reproducibility of results is a clearly stated requirement to improve the quality of the results obtained by the combined analysis. The two core contributions made are the enabling of a provenance wrapper around WPS services and the enabling of provenance tracing within the NetCDF format, which adopts and extends the W3C's PROV model. To disseminate indicator data and create transformed data products, a standardized provenance, metadata and processing infrastructure is being researched for CLIPC. These efforts will lead towards the provision of tools for further web service processing development and optimisation, opening up possibilities to scale and to administer abstract user- and data-driven workflows.
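A minimal sketch of what recording one WPS execution as W3C PROV might look like, using the Python prov package; the names and namespace are illustrative, not CLIPC's actual vocabulary:

```python
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("clipc", "http://example.org/clipc#")   # placeholder namespace

src = doc.entity("clipc:tasmax_input.nc")         # input NetCDF granule
out = doc.entity("clipc:heatwave_indicator.nc")   # derived indicator product
run = doc.activity("clipc:wps_execute_001")       # one WPS process invocation

doc.used(run, src)            # the process read the input
doc.wasGeneratedBy(out, run)  # and generated the output
doc.wasDerivedFrom(out, src)  # dataset-to-dataset lineage

print(doc.get_provn())        # serialize as PROV-N for inspection
```

A provenance wrapper in the spirit described above would emit one such activity per WPS call and attach the serialized trace to the output NetCDF as metadata.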
NASA Astrophysics Data System (ADS)
Callahan, P. S.; Wilson, B. D.; Xing, Z.; Raskin, R. G.
2010-12-01
We have developed a web-based system to allow updating and subsetting of TOPEX data. The Altimeter Service will be operated by PODAAC along with its other oceanographic data offerings. The Service could easily be expanded to other mission data. An Altimeter Service is crucial to the improvement and expanded use of altimeter data. A service is necessary for altimetry because the result of most interest - sea surface height anomaly (SSHA) - is composed of several components that are updated individually and irregularly by specialized experts. This makes it difficult for projects to provide the most up-to-date products. Some components are the subject of ongoing research, so the ability for investigators to make products for comparison or sharing is important. The service will allow investigators/producers to get their component models or processing into widespread use much more quickly. For coastal altimetry, the ability to subset the data to the area of interest and insert specialized models (e.g., tides) or data processing results is crucial. A key part of the Altimeter Service is having data producers provide updated or local models and data. In order for this to succeed, producers need to register their products with the Altimeter Service and to provide the product in a form consistent with the service update methods. We will describe the capabilities of the web service and the methods for providing new components. Currently the Service provides TOPEX GDRs with Retracking (RGDRs) in a netCDF format that has been coordinated with Jason data. Users can add new orbits, tide models, gridded geophysical fields such as mean sea surface, and along-track corrections as they become available and are installed by PODAAC. The updated fields are inserted into the netCDF files while the previous values are retained for comparison. The Service will also generate SSH and SSHA. In addition, the Service showcases a feature that plots any variable from netCDF files. The research described here was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
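The update-while-retaining pattern is straightforward with the netCDF4 Python library; a sketch in which the file, variable names, and compute_updated_tide_model() are all hypothetical placeholders:

```python
from netCDF4 import Dataset

with Dataset("topex_rgdr_cycle231.nc", "a") as nc:   # open for in-place update
    old = nc.variables["tide_model"]

    # Preserve the previous values under a versioned name for comparison.
    prev = nc.createVariable("tide_model_prev", old.dtype, old.dimensions)
    prev[:] = old[:]
    prev.comment = "Superseded tide model retained for comparison"

    # Overwrite the active variable with the updated component model.
    old[:] = compute_updated_tide_model()   # placeholder for producer's update
    old.history = "Updated via Altimeter Service"
```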
A Context-Aware S-Health Service System for Drivers.
Chang, Jingkun; Yao, Wenbin; Li, Xiaoyong
2017-03-17
As a stressful and sensitive task, driving can be disturbed by various factors ranging from the health condition of the driver to the environmental variables of the vehicle. Continuous monitoring of driving hazards and provision of the most appropriate business services to meet actual needs can guarantee safe driving and make great use of the existing information resources and business services. However, there has been no in-depth research on the perception of a driver's health status or the provision of customized business services in various hazardous situations. In order to constantly monitor the health status of drivers and react to abnormal situations, this paper proposes a context-aware service system with a configurable architecture for the design and implementation of a smart health service system for safe driving, which can perceive a driver's health status and provide helpful services to the driver. To our knowledge, this is the first time that context-aware technology has been used to construct such a smart health service system for safe driving and implement it in practice. Additionally, an assessment model is proposed to mitigate the impact of acceptable abnormal status and, thus, reduce unnecessary invocation of the services. With regard to different assessed situations, business services can be invoked for the driver to adapt to hazardous situations according to the services configuration model, which can take full advantage of the existing information resources and business services. The evaluation results indicate that the alteration of the observed status within a valid time range T can be tolerated and the frequency of service invocation can be reduced.
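A minimal sketch of the tolerance idea; the valid time range T and the state handling here are assumptions for illustration, not the paper's actual assessment model:

```python
import time

T = 30.0                 # seconds an abnormal status is tolerated (assumed)
_abnormal_since = None   # start time of the current abnormal episode

def assess(abnormal, invoke_service, now=None):
    # Invoke the service only if an abnormal status persists longer than T.
    global _abnormal_since
    now = time.monotonic() if now is None else now
    if not abnormal:
        _abnormal_since = None          # episode over: reset the window
    elif _abnormal_since is None:
        _abnormal_since = now           # episode starts: tolerate for now
    elif now - _abnormal_since >= T:
        invoke_service()                # persisted past T: react
        _abnormal_since = now           # restart window to avoid re-invoking
```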
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.
2011-12-01
Under several NASA grants, we are generating multi-sensor merged atmospheric datasets to enable the detection of instrument biases and studies of climate trends over decades of data. For example, under a NASA MEASURES grant we are producing a water vapor climatology from the A-Train instruments, stratified by the CloudSat cloud classification for each geophysical scene. The generation and proper use of such multi-sensor climate data records (CDRs) requires a high level of openness, transparency, and traceability. To make the datasets self-documenting and provide access to full metadata and traceability, we have implemented a set of capabilities and services using known, interoperable protocols. These protocols include OpenSearch, OPeNDAP, Open Provenance Model, service & data casting technologies using Atom feeds, and REST-callable analysis workflows implemented as SciFlo (XML) documents. We advocate that our approach can serve as a blueprint for how to openly "document and serve" complex, multi-sensor CDRs with full traceability. The capabilities and services provided include: - Discovery of the collections by keyword search, exposed using the OpenSearch protocol; - Space/time query across the CDRs' granules and all of the input datasets via OpenSearch; - User-level configuration of the production workflows so that scientists can select additional physical variables from the A-Train to add to the next iteration of the merged datasets; - Efficient data merging using on-the-fly OPeNDAP variable slicing & spatial subsetting of data out of input netCDF and HDF files (without moving the entire files); - Self-documenting CDRs published in a highly usable netCDF4 format with groups used to organize the variables, CF-style attributes for each variable, numeric array compression, & links to OPM provenance; - Recording of processing provenance and data lineage into a queryable provenance trail in Open Provenance Model (OPM) format, auto-captured by the workflow engine; - Open publishing of all of the workflows used to generate products as machine-callable REST web services, using the capabilities of the SciFlo workflow engine; - Advertising of the metadata (e.g. physical variables provided, space/time bounding box, etc.) for our prepared datasets as "datacasts" using the Atom feed format; - Publishing of all datasets via our "DataDrop" service, which exploits the WebDAV protocol to enable scientists to access remote data directories as local files on their laptops; - Rich "web browse" of the CDRs with full metadata and the provenance trail one click away; - Advertising of all services as Google-discoverable "service casts" using the Atom format. The presentation will describe our use of the interoperable protocols and demonstrate the capabilities and service GUIs.
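The OPeNDAP slicing mentioned above can be illustrated in a few lines; the URL and variable name are hypothetical, and this assumes a netCDF4 library built with DAP support:

```python
from netCDF4 import Dataset

url = "http://example.org/opendap/a-train/water_vapor_2010.nc"   # placeholder
with Dataset(url) as nc:            # netCDF4 speaks DAP when given a URL
    h2o = nc.variables["h2o_vap_mmr"]
    # Request one time step and a small lat/lon box; the server does the
    # slicing, so only the selected hyperslab crosses the network.
    box = h2o[0, 100:120, 200:240]
    print(box.shape, box.mean())
```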
NASA Astrophysics Data System (ADS)
O'Kuinghttons, Ryan; Koziol, Benjamin; Oehmke, Robert; DeLuca, Cecelia; Theurich, Gerhard; Li, Peggy; Jacob, Joseph
2016-04-01
The Earth System Modeling Framework (ESMF) Python interface (ESMPy) supports analysis and visualization in Earth system modeling codes by providing access to a variety of tools for data manipulation. ESMPy started as a Python interface to the ESMF grid remapping package, which provides mature and robust high-performance and scalable grid remapping between 2D and 3D logically rectangular and unstructured grids and sets of unconnected data. ESMPy now also interfaces with OpenClimateGIS (OCGIS), a package that performs subsetting, reformatting, and computational operations on climate datasets. ESMPy exposes a subset of ESMF grid remapping utilities. This includes bilinear, finite element patch recovery, first-order conservative, and nearest neighbor grid remapping methods. There are also options to ignore unmapped destination points, mask points on source and destination grids, and provide grid structure in the polar regions. Grid remapping on the sphere takes place in 3D Cartesian space, so the pole problem is not an issue as it can be with other grid remapping software. Remapping can be done between any combination of 2D and 3D logically rectangular and unstructured grids with overlapping domains. Grid pairs where one side of the regridding is represented by an appropriate set of unconnected data points, as is commonly found with observational data streams, are also supported. There is a developing interoperability layer between ESMPy and OCGIS, a pure Python, open source package designed for geospatial manipulation, subsetting, and computation on climate datasets stored in local NetCDF files or accessible remotely via the OPeNDAP protocol. Interfacing with OCGIS has brought GIS-like functionality to ESMPy (e.g., subsetting, coordinate transformations) as well as additional file output formats (e.g., CSV, ESRI Shapefile). ESMPy is distinguished by its strong emphasis on open source, community governance, and distributed development. The user base has grown quickly, and the package is integrating with several other software tools and frameworks. These include the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), Iris, PyFerret, cf-python, and the Community Surface Dynamics Modeling System (CSDMS). ESMPy minimum requirements include Python 2.6, Numpy 1.6.1 and an ESMF installation. Optional dependencies include NetCDF and OCGIS-related dependencies: GDAL, Shapely, and Fiona. ESMPy is regression tested nightly, and supported on Darwin, Linux and Cray systems with the GNU compiler suite and MPI communications. OCGIS is supported on Linux, and also undergoes nightly regression testing. Both packages are installable from Anaconda channels. Upcoming development plans for ESMPy involve development of a higher order conservative grid remapping method. Future OCGIS development will focus on mesh and location stream interoperability and streamlined access to ESMPy's MPI implementation.
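A minimal bilinear remap patterned on the ESMPy API (the grid sizes and synthetic source field are arbitrary choices here, and call signatures may differ between ESMPy releases):

```python
import numpy as np
import ESMF  # the ESMPy package

def make_grid(nlon, nlat):
    # Global logically rectangular grid with cell-center coordinates in degrees.
    grid = ESMF.Grid(np.array([nlon, nlat]), staggerloc=ESMF.StaggerLoc.CENTER,
                     coord_sys=ESMF.CoordSys.SPH_DEG)
    lon, lat = grid.get_coords(0), grid.get_coords(1)
    lon[...] = np.linspace(-180.0, 180.0, nlon)[:, np.newaxis]
    lat[...] = np.linspace(-90.0, 90.0, nlat)[np.newaxis, :]
    return grid

src = ESMF.Field(make_grid(72, 36), name="src")
dst = ESMF.Field(make_grid(144, 72), name="dst")
src.data[...] = np.cos(np.radians(src.grid.get_coords(1)))   # synthetic field

regrid = ESMF.Regrid(src, dst, regrid_method=ESMF.RegridMethod.BILINEAR,
                     unmapped_action=ESMF.UnmappedAction.IGNORE)
dst = regrid(src, dst)
print(dst.data.shape)
```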
Standards for Advisement, Invocation, and Waiver of Counsel in Military Intelligence Interrogations
2004-04-01
NASA Astrophysics Data System (ADS)
Cherubin, S.; Agosta, G.
2018-01-01
We present LIBVERSIONINGCOMPILER, a C++ library designed to support the dynamic generation of multiple versions of the same compute kernel in an HPC scenario. It can be used to provide continuous optimization, code specialization based on the input data or on workload changes, or otherwise to dynamically adjust the application, without the burden of a full dynamic compiler. The library supports multiple underlying compilers but specifically targets the LLVM framework. We also provide examples of use, show the overhead of the library, and provide guidelines for its efficient use.
Francis Bacon's Valerius Terminus and the Voyage to the "Great Instauration".
Serjeantson, Richard
2017-01-01
Francis Bacon's earliest surviving natural philosophical treatise (composed circa 1603) bears the title Valerius Terminus of the Interpretation of Nature. This study, resting on fresh attention to the surviving authorial manuscript, has three goals. It begins by identifying a lost precursor work apparently entitled "Of Active Knowledge." It then examines the significance of the pseudonyms Bacon chose to introduce his ideas, considering especially his invocation of Erasmus's emblem, the Roman deity Terminus. Finally, it shows how the Valerius Terminus's global vision of contemporary knowledge ultimately helped shape the iconography of Bacon's published Instauratio magna.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2016-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office and the HDF Product Designer tool, with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.
NASA Astrophysics Data System (ADS)
Roach, Colin; Carlsson, Johan; Cary, John R.; Alexander, David A.
2002-11-01
The National Transport Code Collaboration (NTCC) has developed an array of software, including a data client/server. The data server, which is written in C++, serves local data (in the ITER Profile Database format) as well as remote data (by accessing one or several MDS+ servers). The client, a web-invocable Java applet, provides a uniform, intuitive, user-friendly, graphical interface to the data server. The uniformity of the interface relieves the user from the trouble of mastering the differences between different data formats and lets him/her focus on the essentials: plotting and viewing the data. The user runs the client by visiting a web page using any Java-capable Web browser. The client is automatically downloaded and run by the browser. A reference to the data server is then retrieved via the standard Web protocol (HTTP). The communication between the client and the server is then handled by the mature, industry-standard CORBA middleware. CORBA has bindings for all common languages and many high-quality implementations are available (both Open Source and commercial). The NTCC data server has been installed at the ITPA International Multi-tokamak Confinement Profile Database, which is hosted by the UKAEA at Culham Science Centre. The installation of the data server is protected by an Internet firewall. To make it accessible to clients outside the firewall, some modifications of the server were required. The working version of the ITPA confinement profile database is not open to the public. Authentication of legitimate users is done utilizing built-in Java security features to demand a password to download the client. We present an overview of the NTCC data client/server and some details of how the CORBA firewall-traversal issues were resolved and how the user authentication is implemented.
NASA Astrophysics Data System (ADS)
Arkebauer, T. J.; Walter-Shea, E. A.
2017-12-01
Vegetation indices, based on canopy spectral reflectance, are widely used to infer physical and biological characteristics of vegetation. Understanding the changes in remotely sensed signals as vegetation responds to its changing environment is essential for full assessment of canopy structure and function. Canopy-level reflectance has been measured at Nebraska AmeriFlux sites US-Ne1, US-Ne2 and US-Ne3 for most years since flux measurements were initiated in 2001. Tower-mounted spectral sensors provided 10-minute averaged reflectance (in PAR and NIR spectral regions) every half hour through the growing season for maize and soybean. Canopy reflectance varied over diurnal and seasonal time periods which led to variations in vegetation indices. One source of variation is due to the interaction of incident solar radiant energy with canopy structure (e.g., reflectance varies with changes in solar zenith angle and direct beam fraction, vegetative fraction, and leaf angle distribution). Another source of variation results from changes in canopy function (e.g., fluctuations in gross primary production and invocation of photoprotective mechanisms with plant stress). We present here a series of diurnal "patterns" of vegetation indices (including Normalized Difference Vegetation Index and Chlorophyll Index) for maize and soybean under mostly clear sky conditions. We demonstrate that diurnal patterns change as the LAI of the canopy changes through the course of the growing season in a somewhat predictable pattern from plant emergence (low vegetative cover) through peak green LAI (full vegetation cover). However, there are changes in the diurnal pattern that we have yet to fully understand; this variation in pattern may indicate variation in canopy function. Initially, we have explored the pattern changes qualitatively and are currently developing more quantitative approaches.
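For reference, the index computation itself is simple; a sketch assuming half-hourly broadband PAR and NIR reflectances (the values are invented for illustration):

```python
import numpy as np

rho_par = np.array([0.050, 0.040, 0.035, 0.040, 0.050])   # PAR-band reflectance
rho_nir = np.array([0.30, 0.38, 0.42, 0.38, 0.31])        # NIR-band reflectance

# Broadband NDVI-like index from the two tower radiometer bands.
ndvi = (rho_nir - rho_par) / (rho_nir + rho_par)
print(np.round(ndvi, 3))   # one clear-sky diurnal course of the index
```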
System on Mobile Devices Middleware: Thinking beyond Basic Phones and PDAs
NASA Astrophysics Data System (ADS)
Prasad, Sushil K.
Several classes of emerging applications, spanning domains such as medical informatics, homeland security, mobile commerce, and scientific applications, are collaborative, and a significant portion of these will harness the capabilities of both the stable and mobile infrastructures (the “mobile grid”). Currently, it is possible to develop a collaborative application running on a collection of heterogeneous, possibly mobile, devices, each potentially hosting data stores, using existing middleware technologies such as JXTA, BREW, Compact .NET and J2ME. However, they require too many ad-hoc techniques as well as cumbersome and time-consuming programming. Our System on Mobile Devices (SyD) middleware, on the other hand, has a modular architecture that makes such application development very systematic and streamlined. The architecture supports transactions over mobile data stores, with a range of remote group invocation options and embedded interdependencies among such data store objects. The architecture further provides a persistent uniform object view, group transaction with Quality of Service (QoS) specifications, and XML vocabulary for inter-device communication. I will present the basic SyD concepts, introduce the architecture and the design of the SyD middleware and its components. We will discuss the basic performance figures of SyD components and a few SyD applications on PDAs. SyD platform has led to developments in distributed web service coordination and workflow technologies, which we will briefly discuss. There is a vital need to develop methodologies and systems to empower common users, such as computational scientists, for rapid development of such applications. Our BondFlow system enables rapid configuration and execution of workflows over web services. The small footprint of the system enables them to reside on Java-enabled handheld devices.
Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas
2016-04-01
Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes due to systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods as well as a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new software package, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. The mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper.
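mpiWrapper itself is C++; here is a sketch of the manager/worker pattern it implements, with mpi4py instead of C++, separate ranks rather than threads, and invented command lines:

```python
import subprocess
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()   # run with at least 2 ranks
TAG_TASK = 1

if rank == 0:                                    # manager rank
    tasks = [f"analyze_structure input_{i}.pdb" for i in range(100)]  # invented
    status = MPI.Status()
    for _ in range(len(tasks) + size - 1):       # one reply per worker message
        comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        reply = tasks.pop() if tasks else None   # None tells the worker to stop
        comm.send(reply, dest=status.Get_source(), tag=TAG_TASK)
else:                                            # worker ranks
    comm.send(None, dest=0)                      # announce readiness
    while True:
        cmd = comm.recv(source=0, tag=TAG_TASK)
        if cmd is None:
            break                                # no more subtasks
        subprocess.run(cmd, shell=True)          # launch the serial program
        comm.send(None, dest=0)                  # report done, request next
```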
NASA Astrophysics Data System (ADS)
Thompson, John
2015-04-01
As the Physical Review Focused Collection demonstrates, recent frontiers in physics education research include systematic investigations at the upper division. As part of a collaborative project, we have examined student understanding of several topics in upper-division thermal and statistical physics. A fruitful context for research is the Boltzmann factor in statistical mechanics: the standard derivation involves several physically justified mathematical steps as well as the invocation of a Taylor series expansion. We have investigated student understanding of the physical significance of the Boltzmann factor as well as its utility in various circumstances, and identified various lines of student reasoning related to the use of the Boltzmann factor. Results from written data as well as teaching interviews suggest that many students do not use the Boltzmann factor when answering questions related to probability in applicable physical situations, even after lecture instruction. We designed an inquiry-based tutorial activity to guide students through a derivation of the Boltzmann factor and to encourage deep connections between the physical quantities involved and the mathematics. Observations of students working through the tutorial suggest that many students at this level can recognize and interpret Taylor series expansions, but they often lack fluency in creating and using Taylor series appropriately, despite previous exposure in both calculus and physics courses. Our findings also suggest that tutorial participation not only increases the prevalence of relevant invocation of the Boltzmann factor, but also helps students gain an appreciation of the physical implications and meaning of the mathematical formalism behind the formula. Supported in part by NSF Grants DUE-0817282, DUE-0837214, and DUE-1323426.
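For reference, the derivation the tutorial targets compresses to three steps (standard notation assumed: reservoir entropy $S_R$, temperature $T$, Boltzmann constant $k_B$):

```latex
\begin{align*}
P(E_i) &\propto \Omega_R(E_{\mathrm{tot}} - E_i)
        = \exp\!\left[\frac{S_R(E_{\mathrm{tot}} - E_i)}{k_B}\right] \\
S_R(E_{\mathrm{tot}} - E_i) &\approx S_R(E_{\mathrm{tot}})
        - E_i \left.\frac{\partial S_R}{\partial E}\right|_{E_{\mathrm{tot}}}
        \qquad \text{(Taylor expansion, } E_i \ll E_{\mathrm{tot}}\text{)} \\
\frac{\partial S_R}{\partial E} = \frac{1}{T}
        &\;\Longrightarrow\;
        P(E_i) \propto e^{-E_i/(k_B T)}
\end{align*}
```

The Taylor expansion in the second line is precisely the step the abstract identifies as a hurdle for students.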
Wave data processing toolbox manual
Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul
2006-01-01
Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP™), the Sontek Argonaut, and the Nortek Aquadopp™ Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox consolidates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with public-domain, freely-available software. Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data was collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyber infrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increase access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open standards based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues, and these results will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches required to improve the application of climate science research results to urban climate resilience.
ClimateNet: A Machine Learning dataset for Climate Science Research
NASA Astrophysics Data System (ADS)
Prabhat, M.; Biard, J.; Ganguly, S.; Ames, S.; Kashinath, K.; Kim, S. K.; Kahou, S.; Maharaj, T.; Beckham, C.; O'Brien, T. A.; Wehner, M. F.; Williams, D. N.; Kunkel, K.; Collins, W. D.
2017-12-01
Deep Learning techniques have revolutionized commercial applications in computer vision, speech recognition and control systems. The key to all of these developments was the creation of a curated, labeled dataset, ImageNet, which enabled multiple research groups around the world to develop methods, benchmark performance and compete with each other. The success of Deep Learning can be largely attributed to the broad availability of this dataset. Our empirical investigations have revealed that Deep Learning is similarly poised to benefit the task of pattern detection in climate science. Unfortunately, labeled datasets, a key prerequisite for training, are hard to find. Individual research groups are typically interested in specialized weather patterns, making it hard to unify and share datasets across groups and institutions. In this work, we propose ClimateNet: a labeled dataset that provides labeled instances of extreme weather patterns, as well as associated raw fields in model and observational output. We develop a schema in NetCDF to enumerate weather pattern classes/types, store bounding boxes, and pixel-masks. We are also working on a TensorFlow implementation to natively import such NetCDF datasets, and are providing a reference convolutional architecture for binary classification tasks. Our hope is that researchers in climate science, as well as ML/DL, will be able to use (and extend) ClimateNet to make rapid progress in the application of Deep Learning for climate science research.
OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats
NASA Astrophysics Data System (ADS)
Erickson, T. A.; Koziol, B. W.; Rood, R. B.
2011-12-01
The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, grib) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
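A sketch of the request pattern, patterned on the OCGIS Python API that underlies the project (the URI, variable, and bounding box are placeholders, and argument names may vary across OCGIS versions):

```python
import ocgis

# Point at a remote climate dataset exposed via OPeNDAP (placeholder URI).
rd = ocgis.RequestDataset(
    uri="http://example.org/opendap/tas_mon_CCSM4.nc",
    variable="tas",
)

# Clip to a lon/lat box of interest and translate to a vector GIS format.
ops = ocgis.OcgOperations(
    dataset=rd,
    geom=[-105.5, 39.5, -104.5, 40.5],   # west, south, east, north
    output_format="shp",
)
print(ops.execute())   # path to the files written on disk
```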
Culture-specific delusions. Sense and nonsense in cultural context.
Gaines, A D
1995-06-01
It can be said that a definition of delusions requires the invocation of cultural understandings, standards of acceptability, as well as conceptions of reality and the forces that animate it. For these reasons, the determination of delusional or normative ideation can only be effected properly within particular cultural contexts. The cross-cultural record suggests that it is difficult to separate the delusional from the cultural; a belief that is patterned and culturally specific is, by definition, a cultural, not a delusional, belief. One must rely upon particular, relevant local cultural understandings to ascertain when the bounds of culture have been transgressed and meaning has given way to unshareable nonsense.
2001-09-27
KENNEDY SPACE CENTER, Fla. -- Maria Lopez-Tellado (center) and Rey N. Diaz (right) display the plaques they received at the annual Hispanic Heritage Month Celebration, held at the Kurt Debus Conference Facility at KSC. The two were recognized for their efforts as chairs of the event, which featured a luncheon and comments by Deputy Center Director Jim Jennings and Miguel Rodriquez, chief, Integration Office, of the Joint Performance Management Office. Joseph Tellado (left), International Space Station/Payload Processing, led the pledge of allegiance and invocation. The Merritt Island High School ROTC provided the color guard. The event was sponsored by the Hispanic Employment Program Working Group at KSC.
Unleashing Geophysics Data with Modern Formats and Services
NASA Astrophysics Data System (ADS)
Ip, Alex; Brodie, Ross C.; Druken, Kelsey; Bastrakova, Irina; Evans, Ben; Kemp, Carina; Richardson, Murray; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2016-04-01
Geoscience Australia (GA) is the national steward of large volumes of geophysical data extending over the entire Australasian region and spanning many decades. The volume and variety of data which must be managed, coupled with the increasing need to support machine-to-machine data access, mean that the old "click-and-ship" model delivering data as downloadable files for local analysis is rapidly becoming unviable - a "big data" problem not unique to geophysics. The Australian Government, through the Research Data Services (RDS) Project, recently funded the Australian National Computational Infrastructure (NCI) to organize a wide range of Earth Systems data from diverse collections including geoscience, geophysics, environment, climate, weather, and water resources onto a single High Performance Data (HPD) Node. This platform, which now contains over 10 petabytes of data, is called the National Environmental Research Data Interoperability Platform (NERDIP), and is designed to facilitate broad user access, maximise reuse, and enable integration. GA has contributed several hundred terabytes of geophysical data to the NERDIP. Historically, geophysical datasets have been stored in a range of formats, with metadata of varying quality and accessibility, and without standardised vocabularies. This has made it extremely difficult to aggregate original data from multiple surveys (particularly un-gridded geophysics point/line data) into standard formats suited to High Performance Computing (HPC) environments. To address this, it was decided to use the NERDIP-preferred Hierarchical Data Format (HDF) 5, which is a proven, standard, open, self-describing and high-performance format supported by extensive software tools, libraries and data services. The Network Common Data Form (NetCDF) 4 API facilitates the use of data in HDF5, whilst the NetCDF Climate & Forecasting conventions (NetCDF-CF) further constrain NetCDF4/HDF5 data so as to provide greater inherent interoperability. The first geophysical data collection selected for transformation by GA was Airborne ElectroMagnetics (AEM) data which was held in proprietary-format files, with associated ISO 19115 metadata held in a separate relational database. Existing NetCDF-CF metadata profiles were enhanced to cover AEM and other geophysical data types, and work is underway to formalise the new geophysics vocabulary as a proposed extension to the Climate & Forecasting conventions. The richness and flexibility of HDF5's internal indexing mechanisms has allowed lossless restructuring of the AEM data for efficient storage, subsetting and access via either the NetCDF4/HDF5 APIs or Open-source Project for a Network Data Access Protocol (OPeNDAP) data services. This approach not only supports large-scale HPC processing, but also interactive access to a wide range of geophysical data in user-friendly environments such as iPython notebooks and more sophisticated cloud-enabled portals such as the Virtual Geophysics Laboratory (VGL). As multidimensional AEM datasets are relatively complex compared to other geophysical data types, the general approach employed in this project for modernizing AEM data is likely to be applicable to other geophysics data types. When combined with the use of standards-based data services and APIs, a coordinated, systematic modernisation will result in vastly improved accessibility to, and usability of, geophysical data in a wide range of computational environments both within and beyond the geophysics community.
NASA Astrophysics Data System (ADS)
Fang, H.; Kato, H.; Rodell, M.; Teng, W. L.; Vollmer, B. E.
2008-12-01
The Global Land Data Assimilation System (GLDAS) has been generating a series of land surface state (e.g., soil moisture and surface temperature) and flux (e.g., evaporation and sensible heat flux) products, simulated by four land surface models (CLM, Mosaic, Noah and VIC). These products are now accessible at the Hydrology Data and Information Services Center (HDISC), a component of the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Current GLDAS data hosted at HDISC include a set of 1.0° data products, covering 1979 to the present, from the four models and a 0.25° data product, covering 2000 to the present, from the Noah model. In addition to the basic anonymous ftp data downloading, users can avail themselves of several advanced data search and downloading services, such as Mirador and OPeNDAP. Mirador is a Google-based search tool that provides keywords searching, on-the-fly spatial and parameter subsetting of selected data. OPeNDAP (Open-source Project for a Network Data Access Protocol) enables remote OPeNDAP clients to access OPeNDAP served data regardless of local storage format. Additional data services to be available in the near future from HDISC include (1) on-the-fly converter of GLDAS to NetCDF and binary data formats; (2) temporal aggregation of GLDAS files; and (3) Giovanni, an online visualization and analysis tool that provides a simple way to visualize, analyze, and access vast amounts of data without having to download the data.
NASA Reverb: Standards-Driven Earth Science Data and Service Discovery
NASA Astrophysics Data System (ADS)
Cechini, M. F.; Mitchell, A.; Pilone, D.
2011-12-01
NASA's Earth Observing System Data and Information System (EOSDIS) is a core capability in NASA's Earth Science Data Systems Program. NASA's EOS ClearingHOuse (ECHO) is a metadata catalog for the EOSDIS, providing a centralized catalog of data products and registry of related data services. Working closely with the EOSDIS community, the ECHO team identified a need to develop the next generation EOS data and service discovery tool. This development effort relied on the following principles: + Metadata Driven User Interface - Users should be presented with data and service discovery capabilities based on dynamic processing of metadata describing the targeted data. + Integrated Data & Service Discovery - Users should be able to discover data and associated data services that facilitate their research objectives. + Leverage Common Standards - Users should be able to discover and invoke services that utilize common interface standards. Metadata plays a vital role facilitating data discovery and access. As data providers enhance their metadata, more advanced search capabilities become available, enriching a user's search experience. Maturing metadata formats such as ISO 19115 provide the necessary depth of metadata that facilitates advanced data discovery capabilities. Data discovery and access is not limited to simply the retrieval of data granules, but is growing into the more complex discovery of data services. These services include, but are not limited to, services facilitating additional data discovery, subsetting, reformatting, and re-projecting. The discovery and invocation of these data services is made significantly simpler through the use of consistent and interoperable standards. By adopting a common standard, standard-specific adapters can be developed to communicate with multiple services implementing a specific protocol. The emergence of metadata standards such as ISO 19119 plays a similarly important role in service discovery as the 19115 standard does for data. After a yearlong design, development, and testing process, the ECHO team successfully released "Reverb - The Next Generation Earth Science Discovery Tool." Reverb relies heavily on the information contained in dataset and granule metadata, such as ISO 19115, to provide a dynamic experience to users based on identified search facet values extracted from science metadata. Such an approach allows users to perform cross-dataset correlation and searches, discovering additional data that they may not previously have been aware of. In addition to data discovery, Reverb users may discover services associated with their data of interest. When services utilize supported standards and/or protocols, Reverb can facilitate the invocation of both synchronous and asynchronous data processing services. This greatly enhances a user's ability to discover data of interest and accomplish their research goals. Extrapolating from the current movement towards interoperable standards and an increase in available services, data service invocation and chaining will become a natural part of data discovery. Reverb is one example of a discovery tool that provides a mechanism for transforming the earth science data discovery paradigm.
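To illustrate the kind of request such a discovery tool issues, here is a sketch of an OpenSearch granule query; the endpoint and parameter names are placeholders patterned on common OpenSearch usage, not Reverb's actual API:

```python
import requests

resp = requests.get(
    "https://example.org/opensearch/granules.atom",   # placeholder endpoint
    params={
        "keyword": "sea surface temperature",
        "boundingBox": "-180,-90,180,90",             # west,south,east,north
        "startTime": "2011-01-01T00:00:00Z",
        "endTime": "2011-12-31T23:59:59Z",
        "numberOfResults": 20,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.text[:500])   # Atom feed of matching granules and service links
```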
NASA Technical Reports Server (NTRS)
Habermann, Ted; Jelenak, Aleksander; Lee, Joe; Yang, Kent; Gallagher, James; Potter, Nathan
2017-01-01
As part of the overall effort to understand implications of migrating ESDIS data and services to the cloud we are testing several common OPeNDAP and HDF use cases against three architectures for general performance and cost characteristics. The architectures include retrieving entire files, retrieving datasets using HTTP range gets, and retrieving elements of datasets (chunks) with HTTP range gets. We will describe these architectures and discuss our approach to estimating cost.
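A sketch of the chunk-retrieval architecture, with an invented object URL and offsets; a real client would take the offset and length from the HDF5 chunk index rather than hard-coding them:

```python
import requests

url = "https://example-bucket.s3.amazonaws.com/granule.h5"   # placeholder
offset, nbytes = 4096, 65536    # chunk location within the file (assumed)

resp = requests.get(url,
                    headers={"Range": f"bytes={offset}-{offset + nbytes - 1}"})
assert resp.status_code == 206  # 206 Partial Content: server honored the range
chunk = resp.content            # raw (possibly compressed) HDF5 chunk bytes
print(len(chunk), "bytes retrieved without downloading the whole file")
```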
Obtaining and processing Daymet data using Python and ArcGIS
Bohms, Stefanie
2013-01-01
This set of scripts was developed to automate the process of downloading and mosaicking daily Daymet data to a user-defined extent using ArcGIS and the Python programming language. The three steps are downloading the needed Daymet tiles for the study area extent, converting the netCDF files to a TIFF raster format, and mosaicking those rasters into one file. The set of scripts is intended for all levels of experience with the Python programming language and requires no scripting by the user.
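A rough equivalent of the conversion and mosaicking steps using the GDAL Python bindings rather than arcpy (tile IDs, file names, and the variable are assumed; the download step is omitted):

```python
from osgeo import gdal

tiles = ["11207", "11208"]   # hypothetical Daymet tile IDs for the study area
tifs = []
for t in tiles:
    # Convert one netCDF tile/variable to GeoTIFF via GDAL's subdataset syntax.
    src = f'NETCDF:"daymet_v3_prcp_2012_{t}.nc":prcp'
    tif = f"prcp_{t}.tif"
    gdal.Translate(tif, src, format="GTiff")
    tifs.append(tif)

# Mosaic the converted tiles through a virtual raster, then write one file.
vrt = gdal.BuildVRT("prcp_mosaic.vrt", tifs)
gdal.Translate("prcp_mosaic.tif", vrt, format="GTiff")
vrt = None   # close/flush the VRT dataset
```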
Development of web-GIS system for analysis of georeferenced geophysical data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.; Bogomolov, V. Y.; Genina, E.; Martynova, Y.; Shulgina, T. M.
2012-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecast of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size, which may reach tens of terabytes for a single dataset, present-day studies in the area of climate and environmental change require special software support. A dedicated web-GIS information-computational system for analysis of georeferenced climatological and meteorological data has been created. The information-computational system consists of four basic parts: a computational kernel developed using GNU Data Language (GDL); a set of PHP controllers run within a specialized web portal; JavaScript class libraries for development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology; and an archive of geophysical datasets. The computational kernel comprises a number of dedicated modules for querying and extraction of data, mathematical and statistical data analysis, visualization, and preparation of output files in geoTIFF and netCDF formats containing processing results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing web-mapping application logic and governing the computational kernel. The JavaScript libraries for graphical user interface development are based on the GeoExt library, combining the ExtJS framework and OpenLayers software. The archive of geophysical data consists of a number of structured environmental datasets represented by data files in netCDF, HDF, GRIB, and ESRI Shapefile formats. Datasets available for processing by the system include: two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE Water Resources Project Reanalysis, the DWD Global Precipitation Climatology Centre's data, the GMAO Modern Era-Retrospective analysis for Research and Applications, meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others. The system is already in use in scientific research; in particular, it was recently used successfully for analysis of Siberian climate changes and their impacts in the region. The web-GIS information-computational system for geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified web interface in a common graphical web browser. This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #07.514.114044), projects IV.31.1.5, IV.31.2.7, RFBR grants #10-07-00547a, #11-05-01190a, and integrated project SB RAS #131.
Data Container Study for Handling Array-based Data Using Rasdaman, Hive, Spark, and MongoDB
NASA Astrophysics Data System (ADS)
Xu, M.; Hu, F.; Yu, M.; Scheele, C.; Liu, K.; Huang, Q.; Yang, C. P.; Little, M. M.
2016-12-01
Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions in supporting big Earth observation data, we propose to investigate and compare four popular data container solutions: Rasdaman, Hive, Spark, and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e., dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these four data containers in terms of data discovery and access. The computing resources (e.g., CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) Rasdaman has the best performance for queries on statistical and operational functions, and supports the NetCDF data format better than HDF; 2) Rasdaman clustering configuration is more complex than the others; 3) Hive performs better on single pixel extraction from multiple images; and 4) except for single pixel extractions, Spark performs better than Hive and its performance is close to Rasdaman's. A comprehensive report will detail the experimental results, and compare their pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.
Efforts to integrate CMIP metadata and standards into NOAA-GFDL's climate model workflow
NASA Astrophysics Data System (ADS)
Blanton, C.; Lee, M.; Mason, E. E.; Radhakrishnan, A.
2017-12-01
Modeling centers participating in CMIP6 run model simulations, publish requested model output (conforming to community data standards), and document models and simulations using ES-DOC. GFDL developed workflow software implementing some best practices to meet these metadata and documentation requirements. The CMIP6 Data Request defines the variables that should be archived for each experiment and specifies their spatial and temporal structure. We used the Data Request's dreqPy Python library to write GFDL model configuration files as an alternative to hand-crafted tables. There was also a largely successful effort to standardize variable names within the model to reduce the overhead of translating "GFDL to CMOR" variables at a later stage in the pipeline. The ES-DOC ecosystem provides tools and standards to create, publish, and view various types of community-defined CIM documents, most notably model and simulation documents. Although ES-DOC will automatically create simulation documents during publishing by harvesting NetCDF global attributes, the information must be collected, stored, and placed in the NetCDF files by the workflow. We propose to develop a GUI to collect the simulation document precursors. In addition, a new MIP for CMIP6, CPMIP (a comparison of the computational performance of climate models), is documented using machine and performance CIM documents. We used ES-DOC's pyesdoc Python library to automatically create these machine and performance documents. We hope that these and similar efforts will become permanent features of the GFDL workflow to facilitate future participation in CMIP-like activities.
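A minimal sketch of the dreqPy usage described above, based on its documented loadDreq() entry point; the collection name 'CMORvar' and the item attributes shown may vary between Data Request versions:

```python
# Sketch: list requested variables with the dreqPy library.
# Based on dreqPy's documented loadDreq() entry point; the exact
# collection names and item attributes may differ between versions.
from dreqPy import dreq

dq = dreq.loadDreq()
for v in dq.coll['CMORvar'].items[:10]:
    # Each item carries the variable label and its MIP table.
    print(v.label, v.mipTable)
```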
Fundamental limitation of a two-dimensional description of magnetic reconnection
NASA Astrophysics Data System (ADS)
Firpo, Marie-Christine
2014-10-01
For magnetic reconnection to be possible, the electrons have at some point to "get free from magnetic slavery," according to von Steiger's formulation. Stochasticity may be considered as one possible ingredient through which this may be realized in the magnetic reconnection process. It will be argued that non-ideal effects may be considered as a "hidden" way to introduce stochasticity. Then it will be shown that there exists a generic intrinsic stochasticity of magnetic field lines that does not require the invocation of non-ideal effects but cannot show up in effective two-dimensional models of magnetic reconnection. Possible implications will be discussed in the context of tokamak sawteeth, which form a laboratory prototype of magnetic reconnection.
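For context, the intrinsic-stochasticity argument is commonly phrased through the magnetic field-line equations, which in a generic three-dimensional field form a non-integrable Hamiltonian system; a standard sketch (added here, not taken from the abstract):

```latex
% Field lines are integral curves of B:
\frac{d\mathbf{x}}{ds} = \frac{\mathbf{B}(\mathbf{x})}{|\mathbf{B}(\mathbf{x})|}
% In flux coordinates (\psi, \theta, \varphi), with \varphi playing the
% role of time, the field-line equations take Hamiltonian form,
\frac{d\psi}{d\varphi} = -\frac{\partial H}{\partial\theta},
\qquad
\frac{d\theta}{d\varphi} = \frac{\partial H}{\partial\psi},
% a 1.5-degree-of-freedom system that is generically non-integrable in
% three dimensions, so field lines can be chaotic without non-ideal terms.
```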
Panarchy: theory and application
Allen, Craig R.; Angeler, David G.; Garmestani, Ahjond S.; Gunderson, Lance H.; Holling, Crawford S.
2014-01-01
The concept of panarchy provides a framework that characterizes complex systems of people and nature as dynamically organized and structured within and across scales of space and time. It has been more than a decade since the introduction of panarchy. Over this period, its invocation in peer-reviewed literature has been steadily increasing, but its use remains primarily descriptive and abstract. Here, we discuss the use of the concept in the literature to date, highlight where the concept may be useful, and discuss limitations to the broader applicability of panarchy theory for research in the ecological and social sciences. Finally, we forward a set of testable hypotheses to evaluate key propositions that follow from panarchy theory.
Effects of international law on migration policy and practice: the uses of hypocrisy.
Martin, D A
1989-01-01
Classical learning recognizes no role for international law in affecting migration policy and practice, but in modern times the salutary effects are increasing, although they remain modest. International law influences migration policy primarily through effective invocation of various forms of "soft law" in internal and international political forums. More limited prospects exist for beneficial changes enforced by international institutions and domestic courts. The article cautions against inflated expectations in the latter settings, however, particularly because overly ambitious claims can be counterproductive. It then offers a few predictions about near-term effects of international law, having to do with departures from a country, refugee law, and the integration of migrants into their new homelands.
Evidence for global processing of complex visual displays
NASA Technical Reports Server (NTRS)
Munson, Robert C.; Horst, Richard L.
1986-01-01
'Polar graphic' displays, in which changes in system status are represented by distortions in the form of a geometric figure, were presented to subjects, and reaction time (RT) to discriminate system status was recorded. Of interest was the extent to which reaction time showed evidence of global processing of these displays as the number of nodes and difficulty of discrimination were varied. When discrimination of system status was easy, RT showed no increase with increasing number of nodes, providing evidence of global processing. When discrimination was difficult, systematic differences in RT as a function of the number of nodes suggested the invocation of other (local) processes, although the data were not consistent with a node-by-node search process.
The extended curse: being a woman every day.
Berg, D H; Coutts, L B
1994-01-01
Inductive analysis of the portrayal of menstruating women in contemporary menstrual product advertisements revealed a profound shift in the meaning of feminine hygiene over the last few decades. The phrase sanitary protection has been replaced by a more euphemistic phrase, feminine hygiene, to refer to menstrual management products. What first appeared to be mere euphemistic substitution, on further analysis, was revealed to involve considerably more than semantics. In the case of panty liners, it was found that the marketing of these specialized products involves the invocation of negative definitions of femaleness and a concomitant subscription to altered definitions of femininity. These altered meanings, conveyed in recent menstrual product advertisements, have serious implications for contemporary women's self-images.
Development of Extended Content Standards for Biodiversity Data
NASA Astrophysics Data System (ADS)
Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu
2013-04-01
Interoperability in the field of biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and their supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and on standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring development of domain-dependent conceptual data models and extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been partly standardized, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind, making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of biodiversity observation data into 'families' of use cases and its supporting data schema, on where gaps, if any, in the availability of content standards have been identified, and on how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF), with a use-case dependent 'payload' embedded into the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions), across services. An explicit aim is to make the XML implementation specification re-usable as a DwC and a GML (SOS and WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, 10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy). (http://www.earthobservations.org/geobon_docs_20120227.shtml)
NASA Astrophysics Data System (ADS)
Stein, Olaf; Schultz, Martin G.; Rambadt, Michael; Saini, Rajveer; Hoffmann, Lars; Mallmann, Daniel
2017-04-01
Global model data of atmospheric composition produced by the Copernicus Atmospheric Monitoring Service (CAMS) has been collected since 2010 at FZ Jülich and serves as boundary conditions for Regional Air Quality (RAQ) modellers world-wide. RAQ models need time-resolved meteorological as well as chemical lateral boundary conditions for their individual model domains. While the meteorological data usually come from well-established global forecast systems, the chemical boundary conditions are not always well defined. In the past, many models used 'climatic' boundary conditions for the tracer concentrations, which can lead to significant concentration biases, particularly for tracers with longer lifetimes, which can be transported over long distances (e.g. over the whole northern hemisphere) with the mean wind. The Copernicus approach utilizes extensive near-realtime assimilation of atmospheric composition data observed from space, which gives additional reliability to the global modelling data and is well received by the RAQ communities. An existing Web Coverage Service (WCS) for sharing these individually tailored model results is currently being re-engineered to make use of a modern, scalable database technology in order to improve performance, enhance flexibility, and allow the operation of catalogue services. The new Jülich Atmospheric Data Distributions Server (JADDS) adheres to the Web Coverage Service WCS 2.0 standard as defined by the Open Geospatial Consortium (OGC). This enables user groups to flexibly define the datasets they need by selecting a subset of chemical species or by restricting the geographical boundaries or the length of the time series. The data is made available in the form of different catalogues stored locally on our server. In addition, the Jülich OWS Interface (JOIN) provides interoperable web services allowing for easy download and visualization of datasets delivered from WCS servers via the internet. We will present the prototype JADDS server and address the major issues identified when relocating large four-dimensional datasets into a RASDAMAN raster array database. So far, RASDAMAN support for data available in netCDF format is limited with respect to metadata related to variables and axes. For community-wide accepted solutions, selected data coverages shall result in downloadable netCDF files including metadata complying with the netCDF CF Metadata Conventions standard (http://cfconventions.org/). This can be achieved by adding custom metadata elements for RASDAMAN bands (model levels) on data ingestion. Furthermore, an optimization strategy for ingestion of several TB of 4D model output data will be outlined.
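As an illustration of the kind of request a WCS 2.0 server such as JADDS answers, a subsetting GetCoverage call can be issued over plain HTTP; the server URL, coverage id, and axis labels below are hypothetical:

```python
# Sketch: WCS 2.0 GetCoverage request with spatial/temporal subsetting.
# The server URL, coverage id, and axis labels are hypothetical.
import requests

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "CAMS_O3",
    "subset": [
        "Lat(30,70)",
        "Long(-20,40)",
        'ansi("2017-01-01T00:00:00Z","2017-01-02T00:00:00Z")',
    ],
    "format": "application/netcdf",
}
r = requests.get("https://example.org/jadds/ows", params=params)
with open("subset.nc", "wb") as f:
    f.write(r.content)
```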
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Valentin, Bernard; Koubarakis, Manolis; Nativi, Stefano
2013-04-01
Access to Earth Observation products remains far from straightforward for end users in most domains. Semantically-enabled search engines, generally accessible through web portals, have been developed. They allow searching for products by selecting application-specific terms and specifying basic geographical and temporal filtering criteria. Although this mostly suits the needs of the general public, the scientific communities require more advanced and controlled means to find products. Ranges of validity, traceability (e.g. origin, applied algorithms), accuracy, and uncertainty are concepts that are typically taken into account in research activities. The Prod-Trees (Enriching Earth Observation Ontology Services using Product Trees) project will enhance the CF-netCDF product format and vocabulary to allow storing metadata that better describe the products, and in particular EO products. The project will bring a standardized solution that permits annotating EO products in such a manner that official and third-party software libraries and tools will be able to search for products using advanced tags and controlled parameter names. Annotated EO products will be automatically supported by all compatible software. Because the entire product information will come from the annotations and the standards, there will be no need to integrate extra components and data structures that have not been standardized. In the course of the project, the most important and popular open-source software libraries and tools will be extended to support the proposed extensions of CF-netCDF. The results will be provided back to the respective owners and maintainers to ensure the best dissemination and adoption of the extended format. The project, funded by ESA, started in December 2012 and will end in May 2014. It is coordinated by Space Applications Services, and the Consortium includes CNR-IIA and the National and Kapodistrian University of Athens. The first activities included the elicitation of user requirements in order to identify gaps in the current CF and netCDF specifications for providing extended support for the discovery of EO data. To this end, a Validation Group has been established, including members from organizations actively using the netCDF and CF standards. A questionnaire has been prepared and submitted to the Validation Group; it was designed to be completed online, but also to guide interviews. The presentation will focus on the project objectives, the first achievements (with particular reference to the results of the requirements analysis), and future plans.
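A minimal sketch of the sort of product annotation envisaged, using the netCDF4 Python library; the attribute names below are invented for illustration and are not the Prod-Trees vocabulary:

```python
# Sketch: annotating an EO product file with discovery metadata.
# Attribute names here are illustrative only, not the Prod-Trees vocabulary.
from netCDF4 import Dataset

with Dataset("eo_product.nc", "a") as nc:
    nc.setncattr("product_validity_start", "2013-01-01T00:00:00Z")
    nc.setncattr("product_validity_end", "2013-01-31T23:59:59Z")
    nc.setncattr("processing_algorithm", "retrieval-v2.1")
    nc.setncattr("uncertainty_estimate", "0.05")
```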
Coupling West WRF to GSSHA with GSSHApy
NASA Astrophysics Data System (ADS)
Snow, A. D.
2017-12-01
The West WRF output is gridded NetCDF containing the forcing data required to run a GSSHA simulation. These data include precipitation, pressure, temperature, relative humidity, cloud cover, wind speed, and solar radiation. Tools to reproject, resample, and reformat the data for GSSHA have recently been added to the open source Python library GSSHApy (https://github.com/ci-water/gsshapy). These tools make it possible to run forecasts using the West WRF forcing data with GSSHA to produce both streamflow and lake level predictions.
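GSSHApy's actual interface is not shown in the abstract; as a generic illustration of the resample-and-reformat step, forcing fields can be regridded with xarray (file, variable, coordinate names, and grid parameters are all assumptions):

```python
# Generic sketch of resampling WRF-style forcing onto a model grid.
# This illustrates the kind of step GSSHApy automates; it is not
# GSSHApy's actual API. File, variable, and coordinate names and the
# target grid are assumptions.
import numpy as np
import xarray as xr

ds = xr.open_dataset("west_wrf_forcing.nc")  # e.g. precipitation "RAINNC"

# Target grid for the GSSHA domain (hypothetical extent/resolution).
target_lat = np.arange(32.0, 34.0, 0.01)
target_lon = np.arange(-118.0, -116.0, 0.01)

# Bilinear resampling onto the target grid, assuming 1-D lat/lon coords.
regridded = ds["RAINNC"].interp(lat=target_lat, lon=target_lon)
regridded.to_netcdf("gssha_precip.nc")
```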
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
NetCDF file of the SREF standard deviation of wind speed and direction that was used to inject variability in the FDDA input. Variable U_NDG_OLD contains the standard deviation of wind speed (m/s); variable V_NDG_OLD contains the standard deviation of wind direction (deg). This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort, including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing are attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including the module invocation hierarchy, a module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
Pelagic habitat visualization: the need for a third (and fourth) dimension: HabitatSpace
Beegle-Krause, C; Vance, Tiffany; Reusser, Debbie; Stuebe, David; Howlett, Eoin
2009-01-01
Habitat in open water is not simply a 2-D to 2.5-D surface such as the ocean bottom or the air-water interface. Rather, pelagic habitat is a 3-D volume of water that can change over time, leading us to the term habitat space. Visualization and analysis in 2-D are well supported with GIS tools, but a new tool was needed for visualization and analysis in four dimensions. Observational data (cruise profiles (x_o, y_o, z, t_o)), numerical circulation model fields (x, y, z, t), and trajectories (larval fish; 4-D lines) need to be merged together in a meaningful way for visualization and analysis. As a first step toward this new framework, UNIDATA's Integrated Data Viewer (IDV) has been used to create a set of tools for habitat analysis in 4-D. IDV was designed for 3-D+time geospatial data in the meteorological community. Its NetCDF Java libraries allow the tool to read many file formats, including remotely located data (e.g. data available via OPeNDAP). With this project, IDV has been adapted for use in delineating habitat space for multiple fish species in the ocean. The ability to define and visualize boundaries of a water mass which meets specific biologically relevant criteria (e.g., volume, connectedness, and inter-annual variability), based on model results and observational data, will allow managers to investigate the survival of individual year classes of commercially important fisheries. Better understanding of the survival of these year classes will lead to improved forecasting of fisheries recruitment.
Data Container Study for Handling array-based data using Hive, Spark, MongoDB, SciDB and Rasdaman
NASA Astrophysics Data System (ADS)
Xu, M.; Hu, F.; Yang, J.; Yu, M.; Yang, C. P.
2017-12-01
Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions in supporting big Earth observation data, we propose to investigate and compare five popular data container solutions: Rasdaman, Hive, Spark, SciDB and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e., dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these five data containers in terms of data discovery and access. The computing resources (e.g., CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) the popular data container clusters are able to handle large volumes of data, but their performance varies in different situations; meanwhile, there is a trade-off among data preprocessing, disk saving, query-time saving, and resource consumption; 2) ClimateSpark, MongoDB and SciDB perform the best among all the containers in all the query tests, and Hive performs the worst; 3) the studied data containers can be applied to other array-based datasets, such as high-resolution remote sensing data and model simulation data; and 4) Rasdaman's clustering configuration is more complex than the others'. A comprehensive report will detail the experimental results and compare the containers' pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.
NASA Technical Reports Server (NTRS)
Rui, Hualan; Vollmer, Bruce; Teng, Bill; Jasinski, Michael; Mocko, David; Loeser, Carlee; Kempler, Steven
2016-01-01
The National Climate Assessment-Land Data Assimilation System (NCA-LDAS) is an integrated terrestrial water analysis and one of NASA's contributions to the NCA of the United States. The NCA-LDAS has undergone extensive development, including multi-variate assimilation of remotely-sensed water states and anomalies as well as evaluation and verification studies, led by the Goddard Space Flight Center's Hydrological Sciences Laboratory (HSL). The resulting NCA-LDAS data have recently been released to the general public and include those from the Noah land-surface model (LSM) version 3.3 (Noah-3.3) and the Catchment LSM version Fortuna-2.5 (CLSM-F2.5). Standard LSM output variables, including soil moisture and temperature, surface fluxes, snow cover depth, groundwater, and runoff, are provided, as well as streamflow from a river routing system. The NCA-LDAS data are archived at and distributed by the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). The data can be accessed via HTTP, OPeNDAP, Mirador search and download, and NASA Earthdata Search. To further facilitate access and use, the NCA-LDAS data are integrated into NASA Giovanni, for quick visualization and analysis, and into the Data Rods system, for retrieval of time series over long time periods. The temporal and spatial resolutions of the NCA-LDAS data are, respectively, daily averages and 0.125x0.125 degree, covering North America (25N-53N; 125W-67W) and the period January 1979 to December 2015. The data files are in self-describing, machine-independent, CF-compliant netCDF-4 format.
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D.; Chu, C.; Mlynczak, P.
2014-12-01
The CERES project continues to provide the scientific community with a wide variety of satellite-derived data products. The flagship products include observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. These datasets encompass a wide range of temporal and spatial resolutions, suited to specific applications. We thus offer time resolutions that range from instantaneous to monthly means, with spatial resolutions that range from the 20-km footprint to global scales. The 14-year record is mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. CERES products are also used by the remote sensing community for climatological studies. In recent years, however, CERES products have been used by an even broader audience, including the green energy, health, and environmental research communities, among others. Because of that, the CERES project has implemented a now well-established web-oriented Ordering and Visualization Tool (OVT), which is well into its fifth year of development. To help facilitate comprehensive quality control of CERES products, the OVT team has introduced a series of specialized functions. These include 1- and 2-D histograms, anomaly computation, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and other specialized scientific application capabilities. Over time, products of increasingly high temporal and spatial resolution are being made available to the public through the CERES OVT. These high-resolution products require accessing the existing long-term archive, and thus the reading of many very large netCDF or HDF files, which poses a real challenge to the task of near-instantaneous visualization. An overview of the CERES OVT basic functions and QC capabilities, as well as future steps in expanding its capabilities, will be presented at the meeting.
Increasing the value of geospatial informatics with open approaches for Big Data
NASA Astrophysics Data System (ADS)
Percivall, G.; Bermudez, L. E.
2017-12-01
Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases. Collection and ingest: remotely sensed data processing; data stream processing. Prepare and structure: SQL and NoSQL databases; data linking; feature identification. Analytics and visualization: spatial-temporal analytics; machine learning; data exploration. Modeling and prediction: integrated environmental models; urban 4D models. Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open cloud computing: avoid vendor lock-in through API interoperability and application portability. Open source extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial data representations: schemas to improve processing and analysis using geospatial concepts (features, coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big linked geodata: use linked data methods scaled to big geodata. Analysis ready data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes."
Lin, Meng Kuan; Nicolini, Oliver; Waxenegger, Harald; Galloway, Graham J; Ullmann, Jeremy F P; Janke, Andrew L
2013-01-01
Digital Imaging Processing (DIP) requires data extraction and output from a visualization tool to be consistent. Data handling and transmission between the server and a user is a systematic process in service interpretation. The use of integrated medical services for management and viewing of imaging data in combination with a mobile visualization tool can be greatly facilitated by data analysis and interpretation. This paper presents an integrated mobile application and DIP service, called M-DIP. The objectives of the system are to (1) automate the direct data tiling, conversion, and pre-tiling of brain images from Medical Imaging NetCDF (MINC) and Neuroimaging Informatics Technology Initiative (NIFTI) to RAW formats; (2) speed up querying of imaging measurements; and (3) display high-level images in three dimensions in real-world coordinates. In addition, M-DIP provides the ability to work on a mobile or tablet device without any software installation, using web-based protocols. M-DIP implements three levels of architecture, with a relational middle-layer database, a stand-alone DIP server, and a mobile application logic middle level realizing user interpretation for direct querying and communication. This imaging software has the ability to display biological imaging data at multiple zoom levels and to increase its quality to meet users' expectations. Interpretation of bioimaging data is facilitated by an interface analogous to online mapping services using real-world coordinate browsing. This allows mobile devices to display multiple datasets simultaneously from a remote site. M-DIP can be used as a measurement repository that can be accessed from any network environment, such as a portable mobile or tablet device. In addition, this system in combination with mobile applications is establishing a virtualization tool in the neuroinformatics field to speed interpretation services.
PMID:23847587
Data Standardization for Carbon Cycle Modeling: Lessons Learned
NASA Astrophysics Data System (ADS)
Wei, Y.; Liu, S.; Cook, R. B.; Post, W. M.; Huntzinger, D. N.; Schwalm, C.; Schaefer, K. M.; Jacobson, A. R.; Michalak, A. M.
2012-12-01
Terrestrial biogeochemistry modeling is a crucial component of carbon cycle research and provides unique capabilities to understand terrestrial ecosystems. The Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) aims to identify key differences in model formulation that drive observed differences in model predictions of biospheric carbon exchange. To do so, the MsTMIP framework provides standardized prescribed environmental driver data and a standard model protocol to facilitate comparisons of modeling results from nearly 30 teams. Model performance is then evaluated against a variety of carbon-cycle related observations (remote sensing, atmospheric, and flux tower-based observations) using quantitative performance measures and metrics in an integrated evaluation framework. As part of this effort, we have harmonized highly diverse and heterogeneous environmental driver data, model outputs, and observational benchmark data sets to facilitate use and analysis by the MsTMIP team. In this presentation, we will describe the lessons learned from this data-intensive carbon cycle research. The data harmonization activity itself can be made more efficient with proper tools, version control, workflow management, and collaboration across the whole team. The adoption of on-demand and interoperable protocols (e.g. OPeNDAP and Open Geospatial Consortium standards) makes data visualization and distribution more flexible: users can customize and download data for a specific spatial extent and temporal period, and at different resolutions. The effort to properly organize data in an open and standard format (e.g. Climate and Forecast (CF) compliant netCDF) allows the data to be analysed by a dispersed set of researchers more efficiently, and maximizes the longevity and utilization of the data. The lessons learned from this specific experience can benefit efforts by the broader community to leverage diverse data resources more efficiently in scientific research.
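A minimal sketch of writing a CF-compliant netCDF file of the kind described above, using the netCDF4 Python library; dimension sizes, variable names, and values are invented for illustration:

```python
# Sketch: writing a small CF-compliant netCDF file with netCDF4-python.
# Dimension sizes, names, and values are illustrative only.
import numpy as np
from netCDF4 import Dataset

with Dataset("nee_example.nc", "w") as nc:
    nc.Conventions = "CF-1.6"
    nc.title = "Example carbon flux output"

    nc.createDimension("time", 12)
    nc.createDimension("lat", 180)
    nc.createDimension("lon", 360)

    time = nc.createVariable("time", "f8", ("time",))
    time.units = "days since 2000-01-01 00:00:00"
    time.calendar = "standard"
    time[:] = np.arange(12) * 30.0

    lat = nc.createVariable("lat", "f4", ("lat",))
    lat.units = "degrees_north"
    lat[:] = np.linspace(-89.5, 89.5, 180)

    lon = nc.createVariable("lon", "f4", ("lon",))
    lon.units = "degrees_east"
    lon[:] = np.linspace(-179.5, 179.5, 360)

    nee = nc.createVariable("nee", "f4", ("time", "lat", "lon"))
    nee.units = "kg m-2 s-1"
    nee.long_name = "net ecosystem exchange of carbon"
    nee[:] = np.zeros((12, 180, 360), dtype="f4")
```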
Comparing apples and oranges: the Community Intercomparison Suite
NASA Astrophysics Data System (ADS)
Schutgens, Nick; Stier, Philip; Kershaw, Philip; Pascoe, Stephen
2015-04-01
Visual representation and comparison of geoscientific datasets present a huge challenge due to the large variety of file formats and the diverse spatio-temporal sampling of data (be they observations or simulations). The Community Intercomparison Suite (CIS) attempts to greatly simplify these tasks by offering an intelligent but simple command line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller, more manageable datasets. Our philosophy is to remove, as far as possible, the need for the user to have specialist knowledge of the structure of a dataset. The colocation of observations with model data is as simple as: "cis col
NASA Astrophysics Data System (ADS)
Piasecki, M.; Ji, P.
2014-12-01
Geoscience data comes in many flavors determined by the type of data, such as continuous data on a grid or mesh, or discrete data collected at points either as one-time samples or as streams coming off sensors; it can also encompass digital files of any type, such as text files, WORD or EXCEL documents, or audio and video files. We present a storage facility that is comprised of 6 nodes, each specialized to host a certain data type: grid-based data (netCDF on a THREDDS server), GIS data (shapefiles using GeoServer), point time series data (CUAHSI ODM), sample data (EDBS), and any digital data (RAMADDA), plus a server for remote sensing data and its products. While there is overlap in data type storage capabilities (rasters can go into several of these nodes), we prefer to use dedicated storage facilities that a) are freeware, b) have a good degree of maturity, and c) have shown their utility for storing a certain type. In addition, this allows us to place these commonly used software stacks and storage solutions side-by-side to develop interoperability strategies. We have used a DRUPAL-based system to handle user registration and authentication, and also use that system for data submission and data search. In support of this system we developed an extensive controlled vocabulary that is an amalgamation of various CVs used in the geoscience community, in order to achieve as high a degree of recognition as possible; these include the CF conventions, the CUAHSI CVs, NASA (GCMD), EPA and USGS taxonomies, and GEMET, in addition to ontological representations such as SWEET.
[On the ancient and magical lesions in the sixteenth to eighteenth centuries].
Hach, W; Hach-Wunderle, V
2014-11-01
At the beginning of the Renaissance magical, witchcraft and demonological medicine still played a large role in the poor healing ability of chronic leg ulcers. This included the general administration of magical potions and topical application. An example of the manipulation of the whole body by the devil was the Abracadabra text from Johann Christoph Bitterkraut in the year 1677. The use of bewitched ointments was particularly propagated by Paracelsus in 1622; however, even as early as the beginning of the seventeenth century, the invocation of supernatural powers was slowly diminishing until at the beginning of the nineteenth century the medical schools on chronic leg ulcers could be cultivated at the universities and by specialized wound healers.
An overview of the Opus language and runtime system
NASA Technical Reports Server (NTRS)
Mehrotra, Piyush; Haines, Matthew
1994-01-01
We have recently introduced a new language, called Opus, which provides a set of Fortran language extensions that allow for integrated support of task and data parallelism. It also provides shared data abstractions (SDAs) as a method for communication and synchronization among these tasks. In this paper, we first provide a brief description of the language features and then focus on both the language-dependent and language-independent parts of the runtime system that support the language. The language-independent portion of the runtime system supports lightweight threads across multiple address spaces, and is built upon existing lightweight thread and communication systems. The language-dependent portion of the runtime system supports conditional invocation of SDA methods and distributed SDA argument handling.
Liberty has its responsibilities: holding non-vaccinators liable for the harm they do.
Caplan, Arthur
2013-12-01
David Ropiek, in his useful essay on how society should respond to the risks created by those who choose not to vaccinate themselves or their children, does a valuable job of identifying the enormous costs in money and health that non-vaccinators create. He also pinpoints the many factors that drive vaccine resistance, locating them not in a misunderstanding of the facts but in fears and negative emotions. It is important to pay attention to his message, since frequently those who want to reduce vaccine hesitation or outright non-vaccination put their faith in education and resort to an invocation of the facts about the value of vaccines, when it is fear and emotions that must be addressed.
A progress report on a NASA research program for embedded computer systems software
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Senn, E. H.; Will, R. W.; Straeter, T. A.
1979-01-01
The paper presents the results of the second stage of the Multipurpose User-oriented Software Technology (MUST) program. Four primary areas of activities are discussed: programming environment, HAL/S higher-order programming language support, the Integrated Verification and Testing System (IVTS), and distributed system language research. The software development environment is provided by the interactive software invocation system. The higher-order programming language (HOL) support chosen for consideration is HAL/S mainly because at the time it was one of the few HOLs with flight computer experience and it is the language used on the Shuttle program. The overall purpose of IVTS is to provide a 'user-friendly' software testing system which is highly modular, user controlled, and cooperative in nature.
Bondi, Mark W; Serody, Adam B; Chan, Agnes S; Eberson-Shumate, Sonja C; Delis, Dean C; Hansen, Lawrence A; Salmon, David P
2002-07-01
The Stroop Color-Word Test (SCWT; C. Golden, 1978) was examined in 59 patients with probable Alzheimer's disease (AD) and in 51 demographically comparable normal control (NC) participants. AD patients produced significantly larger Stroop interference effects than NC participants, and level of dementia severity significantly influenced SCWT performance. Principal-components analyses demonstrated a dissociation in the factor structure of the Stroop trials between NC participants and AD patients, suggesting that disruption of semantic knowledge and speeded verbal processing in AD may be a major contributor to impairment on the incongruent trial. Results of clinicopathologic correlations in an autopsy-confirmed AD subgroup further suggest the invocation of a broad network of integrated cortical regions and executive and language processes underlying successful SCWT performance.
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Kaber, David B.
2006-01-01
This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Basham, Bryan D.
1989-01-01
CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
Comparison of the F2 Structure Function in Iron as Measured by Charged Lepton and Neutrino Probes
NASA Astrophysics Data System (ADS)
Kalantarians, Narbe; Christy, Eric; Keppel, Cynthia
2017-09-01
World data for the F2 structure function in iron, as measured by multiple charged lepton and neutrino deep inelastic scattering experiments, are compared. At larger values of x, the data obtained from charged lepton and neutrino scattering are in remarkably good agreement with a simple invocation of the 18/5 rule, while in the shadowing/anti-shadowing transition region, where the Bjorken scaling variable x is less than 0.15, a discrepancy between the data from the different probes, well beyond the data uncertainties, is observed. The data are compared to theoretical calculations. Details and results of the data comparison will be presented, along with future plans.
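For reference (added here, not part of the abstract), the 18/5 rule follows from the mean squared quark charge of an isoscalar target, neglecting strange and heavier quark contributions; a standard sketch:

```latex
% Charged-lepton DIS weighs quarks by charge squared; for an isoscalar
% target with e_u = +2/3 and e_d = -1/3, the mean squared charge is
%   (e_u^2 + e_d^2)/2 = (4/9 + 1/9)/2 = 5/18,
% so, neglecting strange and heavier quarks,
F_2^{\nu N} \;\approx\; \frac{18}{5}\, F_2^{\ell N}
```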
Global Ocean Currents Database
NASA Astrophysics Data System (ADS)
Boyer, T.; Sun, L.
2016-02-01
NOAA's National Centers for Environmental Information has released an ocean currents database portal that aims 1) to integrate global ocean currents observations from a variety of instruments, with different resolution, accuracy, and response to spatial and temporal variability, into a uniform network common data form (NetCDF) format, and 2) to provide dedicated online data discovery and access to NCEI-hosted and distributed sources of ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform type and by spatial/temporal ranges of interest. The dedicated web application is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely-used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), which allow GOCD users to work with data files in their favorite analysis and visualization client software without downloading them to their local machines. The potential users of the ocean currents database include, but are not limited to, 1) ocean modelers for model skill assessments, 2) scientists and researchers studying the impact of ocean circulation on climate variability, 3) the ocean shipping industry for safe navigation and for finding optimal routes for ship fuel efficiency, 4) ocean resources managers planning optimal sites for waste and sewage dumping and for renewable hydro-kinetic energy, and 5) state and federal governments, which can use historical (analyzed) ocean circulations as an aid for search and rescue.
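Because the files are served via OPeNDAP, they can be opened remotely without a full download; a minimal sketch with the netCDF4 Python library (the URL and variable name are placeholders, not actual GOCD endpoints):

```python
# Sketch: opening an OPeNDAP-served dataset remotely with netCDF4.
# The URL and variable name are placeholders, not actual GOCD endpoints.
from netCDF4 import Dataset

url = "https://example.org/thredds/dodsC/gocd/currents_example.nc"
with Dataset(url) as nc:
    # Only the requested slice is transferred over the network.
    u = nc.variables["u_current"][0, :, :]
    print(u.shape)
```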
NASA Astrophysics Data System (ADS)
Meertens, C. M.; Murray, D.; McWhirter, J.
2004-12-01
Over the last five years, UNIDATA has developed an extensible and flexible software framework for analyzing and visualizing geoscience data and models. The Integrated Data Viewer (IDV), initially developed for visualization and analysis of atmospheric data, has broad interdisciplinary application across the geosciences including atmospheric, ocean, and most recently, earth sciences. As part of the NSF-funded GEON Information Technology Research project, UNAVCO has enhanced the IDV to display earthquakes, GPS velocity vectors, and plate boundary strain rates. These and other geophysical parameters can be viewed simultaneously with three-dimensional seismic tomography and mantle geodynamic model results. Disparate data sets of different formats, variables, geographical projections and scales can automatically be displayed in a common projection. The IDV is efficient and fully interactive allowing the user to create and vary 2D and 3D displays with contour plots, vertical and horizontal cross-sections, plan views, 3D isosurfaces, vector plots and streamlines, as well as point data symbols or numeric values. Data probes (values and graphs) can be used to explore the details of the data and models. The IDV is a freely available Java application using Java3D and VisAD and runs on most computers. UNIDATA provides easy-to-follow instructions for download, installation and operation of the IDV. The IDV primarily uses netCDF, a self-describing binary file format, to store multi-dimensional data, related metadata, and source information. The IDV is designed to work with OPeNDAP-equipped data servers that provide real-time observations and numerical models from distributed locations. Users can capture and share screens and animations, or exchange XML "bundles" that contain the state of the visualization and embedded links to remote data files. A real-time collaborative feature allows groups of users to remotely link IDV sessions via the Internet and simultaneously view and control the visualization. A Jython-based formulation facility allows computations on disparate data sets using simple formulas. Although the IDV is an advanced tool for research, its flexible architecture has also been exploited for educational purposes with the Virtual Geophysical Exploration Environment (VGEE) development. The VGEE demonstration added physical concept models to the IDV and curricula for atmospheric science education intended for the high school to graduate student levels.
Network-based Modeling of Mesoscale Catchments - The Hydrology Perspective of Glowa-danube
NASA Astrophysics Data System (ADS)
Ludwig, R.; Escher-Vetter, H.; Hennicker, R.; Mauser, W.; Niemeyer, S.; Reichstein, M.; Tenhunen, J.
Within the GLOWA initiative of the German Ministry for Research and Education (BMBF), the project GLOWA-Danube is funded to establish a transdisciplinary network-based decision support tool for water related issues in the Upper Danube watershed. It aims to develop and validate integration techniques, integrated models and integrated monitoring procedures and to implement them in the network-based Decision Support System DANUBIA. An accurate description of processes involved in energy, water and matter fluxes and turnovers requires an intense collaboration and exchange of water related expertise between different scientific disciplines. DANUBIA is conceived as a distributed expert network and is developed on the basis of re-usable, refineable, and documented sub-models. In order to synthesize a common understanding between the project partners, a standardized notation of parameters and functions and a platform-independent structure of computational methods and interfaces have been established using the Unified Modeling Language UML. DANUBIA is object-oriented, spatially distributed and raster-based at its core. It applies the concept of "proxels" (Process Pixels) as its basic object, which have different dimensions depending on the viewing scale and connect to their environment through fluxes. The presented study excerpts the hydrological viewpoint of GLOWA-Danube, its approach to model coupling and network-based communication (using Remote Method Invocation, RMI), the object-oriented technology to simulate physical processes and interactions at the land surface, and the methodology to treat the issue of spatial and temporal scaling in large, heterogeneous catchments. The mechanisms applied to communicate data and model parameters across the typical discipline borders will be demonstrated from the perspective of a land-surface object, which comprises the capabilities of interdependent expert models for snowmelt, soil water movement, runoff formation, plant growth and radiation balance in a distributed JAVA-based modeling environment. The coupling to the adjacent physical objects of atmosphere, groundwater and river network will also be addressed.
Tools and strategies for instrument monitoring, data mining and data access
NASA Astrophysics Data System (ADS)
van Hees, R. M., ,, Dr
2009-04-01
The ever growing size of the data sets produced by various satellite instruments creates a challenge in data management. Three main tasks were identified: instrument performance monitoring, data mining by users, and data deployment. In this presentation, I will discuss the three tasks and our solution. As a practical example to illustrate the problem and make the discussion less abstract, I will use Sciamachy on board the ESA satellite Envisat. Since the launch of Envisat in March 2002, Sciamachy has performed nearly a billion science measurements as well as daily calibration measurements. The total size of the data set (not including reprocessed data) is over 30 TB, distributed over 150,000 files. [Instrument Monitoring] Most instruments produce house-keeping data, which may include time, geo-location, temperature of different parts of the instrument, and instrument settings and configuration. In addition, many instruments perform calibration measurements. Instrument performance monitoring requires automated analysis of critical parameters for events, and the option to inspect the behavior of various parameters in time off-line. We chose to extract the necessary information from the Sciamachy data products and store everything in one file, where we separated house-keeping data from calibration measurements. Due to the large volume and the need for quick random access, the Hierarchical Data Format (HDF5) was the obvious choice. The HDF5 format is self-describing and designed to organize different types of data in one file. For example, one data set may contain the meta data of the calibration measurements (time, geo-location, instrument settings, and quality parameters such as the temperature of the instrument), while a second, large data set contains the actual measurements. The HDF5 high-level packet table API is ideal for tables that only grow (by appending rows), while the HDF5 table API is better suited for tables where rows need to be updated, inserted or replaced. In particular, the packet table API allows very compact storage of compound data sets and very fast read/write access. Details about this implementation and its pitfalls will be given in the presentation. [Data Mining] The ability to select relevant data is a capability that all data centers have to offer. The NL-SCIA-DC allows users to select data using several criteria, including time, geo-location, type of observation, and data quality. The results of a query are [i] the locations and names of relevant data products (files), [ii] a listing of the meta data of the relevant measurements, or [iii] a listing of the measurements themselves (level 2 or higher). For this application, we need the power of a relational database, the SQL language, and the availability of spatial functions. PostgreSQL, extended with postGIS support, turned out to be a good choice: common queries on tables with millions of rows can be executed within seconds. [Data Deployment] The dissemination of scientific data is often hampered by the use of many different formats to store the products, so time-consuming and inefficient conversions are needed to use data products of different origins. Within the Atmospheric Data Access for the Geospatial User Community (ADAGUC) project, we provide selected space borne atmospheric and land data sets in the same data format and with a consistent internal structure, so that users can easily use and combine data. The common format for storage is HDF5, but the netCDF-4 API is used to create the data sets.
The metadata and dataset attributes follow the netCDF Climate and Forecast (CF) conventions; in addition, metadata complying with the ISO 19115:2003 INSPIRE profile is added. The advantage of netCDF-4 is that the API is essentially equal to that of netCDF-3 (with a few extensions), while the data format is HDF5 (recognized by many scientific tools). The added metadata ensures product traceability. Details will be given in the presentation and in several posters.
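As a sketch of the append-only table pattern described above, here is a rough Python equivalent using h5py's resizable compound datasets rather than the HDF5 packet table API itself; the field names and values are invented:

```python
# Sketch: an append-only house-keeping table in HDF5 via h5py.
# This mimics the packet-table pattern with a resizable compound
# dataset; field names and values are invented for illustration.
import numpy as np
import h5py

hk_dtype = np.dtype([
    ("time", "f8"),        # measurement time (e.g., MJD)
    ("lat", "f4"),         # geo-location
    ("lon", "f4"),
    ("det_temp", "f4"),    # detector temperature (K)
])

with h5py.File("monitoring.h5", "a") as f:
    if "housekeeping" not in f:
        f.create_dataset("housekeeping", shape=(0,), maxshape=(None,),
                         dtype=hk_dtype, chunks=True)
    table = f["housekeeping"]

    # Append one new row by growing the dataset along axis 0.
    row = np.array([(58000.5, 52.1, 5.2, 210.3)], dtype=hk_dtype)
    table.resize(table.shape[0] + 1, axis=0)
    table[-1] = row[0]
```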
Common Patterns with End-to-end Interoperability for Data Access
NASA Astrophysics Data System (ADS)
Gallagher, J.; Potter, N.; Jones, M. B.
2010-12-01
At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done. However, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): on-the-fly type mapping and out-of-band communication. Type conversion takes place for both data and metadata, because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique provides a way for users to exert fine-grained control over the data request, a potentially useful capability, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access is integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C and Java libraries that provide functions for newly written software; the netCDF library, which enables existing applications to read from servers using an older interface; and simple file transfers. These options affect seamlessness in that they represent a tradeoff between new development (required by the first option) and cumbersome extra user actions (required by the last option). While the middle option, adding new functionality to an existing library (netCDF), is very appealing because practice has shown that it can be very effective over a wide range of clients, it is very hard to build such libraries, because correctly writing a new implementation of an existing API that preserves the original's exact semantics can be a daunting task. In the example discussed here, we developed a new module for Kepler using OPeNDAP's Java API. This provided a way to leverage internal optimizations for data organization in Kepler, and we felt that this outweighed the additional cost of new development and the need for users to learn how to use a new Kepler module. While common storage formats and open standards play an important role in data access, our work with the Kepler workflow system reinforces the experience that matching the data models of the data server (source) and user client (sink) and choosing the most appropriate integration strategy are critical to achieving interoperability.
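A hedged sketch of the constraint-based access pattern described above, using the pydap client library (the server URL and variable name are hypothetical; the work here was done with OPeNDAP's Java API, so this Python client is only an illustration of the idea):

```python
from pydap.client import open_url

# Open a remote OPeNDAP dataset; at this point only metadata crosses the
# network, which is what lets a client inspect types (syntactic metadata)
# before building a workflow around the data.
dataset = open_url("http://example.org/opendap/sst.nc")  # hypothetical URL

sst = dataset["sst"]            # lazy proxy; no data transferred yet
print(sst.shape, sst.dtype)     # a priori structure, as Kepler requires

# Slicing is translated by the library into an OPeNDAP constraint
# expression appended to the request URL, so only the requested subset
# is read from the server -- the "out-of-band" restriction described above.
subset = sst[0:10, 0:5]
```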
SGP and TWP (Manus) Ice Cloud Vertical Velocities
Kalesse, Heike
2013-06-27
Daily netCDF files of ice-cloud dynamics observed at the ARM sites at SGP (Jan 1997-Dec 2010) and Manus (Jul 1999-Dec 2010). The files include variables at different time resolutions (10 s, 20 min, 1 hr). Profiles of radar reflectivity factor (dbz) and Doppler velocity (vel), as well as retrieved vertical air motion (V_air) and reflectivity-weighted particle terminal fall velocity (V_ter), are given at 10 s, 20 min and 1 hr resolution. Retrieved V_air and V_ter follow radar notation, so positive values indicate downward motion. Lower-level clouds are removed; however, a multi-layer flag is included.
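A hedged sketch of reading one of these daily files with the netCDF4 Python library (the file name is hypothetical; the variable names dbz, vel, V_air and V_ter are those listed above, and the sign convention follows the radar notation stated in the description):

```python
from netCDF4 import Dataset
import numpy as np

with Dataset("sgp_ice_cloud_19970101.nc") as nc:  # hypothetical file name
    v_air = nc.variables["V_air"][:]   # retrieved vertical air motion
    v_ter = nc.variables["V_ter"][:]   # reflectivity-weighted fall speed
    dbz = nc.variables["dbz"][:]       # radar reflectivity factor

# Radar notation: positive velocities point downward, so updrafts are negative.
updraft_fraction = np.mean(v_air < 0)
print(f"fraction of samples with upward air motion: {updraft_fraction:.2f}")
```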
This is an R statistics package script that allows the reproduction of Figure 5. The script includes the links to the large NetCDF files that the figure accesses for O3, CO, wind speed, radiation and PBL height. It pulls the time series for each variable at a number of cities (specified by lat-lon). This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).
SACR ADVance 3-D Cartesian Cloud Cover (SACR-ADV-3D3C) product
Meng Wang, Tami Toto, Eugene Clothiaux, Katia Lamer, Mariko Oue
2017-03-08
SACR-ADV-3D3C remaps the outputs of SACRCORR for cross-wind range-height indicator (CW-RHI) scans to a Cartesian grid and reports a reflectivity CFAD and a best-estimate domain-averaged cloud fraction. The final output is a single NetCDF file containing all the aforementioned corrected radar moments remapped onto a 3-D Cartesian grid, the SACR reflectivity CFAD, a profile of best-estimate cloud fraction, a profile of maximum observable x-domain size (xmax), a profile of time-to-horizontal-distance estimates, and a profile of minimum observable reflectivity (dBZmin).
NASA Technical Reports Server (NTRS)
Callender, E. D.; Clarkson, T. B.; Frasier, C. E.
1980-01-01
The software design and documentation language (SDDL) is a general-purpose processor supporting a language for the description of any system, structure, concept, or procedure that may be presented from the viewpoint of a collection of hierarchical entities linked together by means of binary connections. The language comprises a set of rules of syntax, primitive construct classes (module, block, and module invocation), and language control directives. The result is a language with a fixed grammar, a variable alphabet and punctuation, and an extendable vocabulary. The application of SDDL to the detailed software design of the Command Data Subsystem for the Galileo Spacecraft is discussed. A set of constructs was developed and applied. These constructs are evaluated, and examples of their application are considered.
Improving generalized inverted index lock wait times
NASA Astrophysics Data System (ADS)
Borodin, A.; Mirvoda, S.; Porshnev, S.; Ponomareva, O.
2018-01-01
Concurrent operations on tree-like data structures are a cornerstone of any database system. They are intended to improve read/write performance and are usually implemented via some form of locking. Deadlock-free methods of concurrency control are known as tree locking protocols. These protocols provide basic operations (verbs) and an algorithm (the order of operation invocations) for applying them to any tree-like data structure. These algorithms operate on data managed by a storage engine, and storage engines differ considerably among RDBMS implementations. In this paper, we discuss a tree locking protocol implementation for the Generalized Inverted Index (GIN) applied to the multiversion concurrency control (MVCC) storage engine inside the PostgreSQL RDBMS. We then introduce improvements to the locking protocol and provide usage statistics from an evaluation of our improvement in a very-high-load environment at one of the world's largest IT companies.
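As a hedged, generic illustration of the kind of deadlock-free tree locking the abstract refers to, the sketch below shows textbook hand-over-hand lock coupling in Python; it is not the GIN-specific protocol from the paper, and the node structure and routing test are invented:

```python
import threading

class Node:
    def __init__(self, key, children=None):
        self.key = key
        self.children = children or []
        self.lock = threading.Lock()

def covers(child, key):
    # Placeholder routing test; a real index compares key ranges here.
    return True

def find(root, key):
    """Hand-over-hand (lock coupling) descent: lock the child before
    releasing the parent, so concurrent writers can never overtake a
    reader on its path and no cycle of lock waits (deadlock) can form."""
    root.lock.acquire()
    node = root
    while node.key != key:
        nxt = next((c for c in node.children if covers(c, key)), None)
        if nxt is None:          # reached a leaf without finding the key
            node.lock.release()
            return None
        nxt.lock.acquire()       # take the child's lock first ...
        node.lock.release()      # ... only then give up the parent's
        node = nxt
    node.lock.release()
    return node

root = Node(1, [Node(2), Node(3)])
print(find(root, 2).key)  # -> 2
```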
Jones, Robert P
2002-01-01
Liberals often view religion chiefly as "a problem" for democratic discourse in modern pluralistic societies and propose an allegedly neutral solution in the form of philosophical distinctions between "the right" and "the good" or populist invocations of a "right to choose." Drawing on cultural theory and ethnographic research among activists in the Oregon debates over the legalization of physician-assisted suicide, I demonstrate that liberal "neutrality" harbors its own cultural bias, flattens the complexity of public debates, and undermines liberalism's own commitments to equality. I conclude that the praiseworthy liberal goal of impartiality in policy decisions would best be met not by the inaccessible norm of neutrality but by a norm of inclusivity, which intentionally solicits multiple cultural perspectives.
Development of an Operational TS Dataset Production System for the Data Assimilation System
NASA Astrophysics Data System (ADS)
Kim, Sung Dae; Park, Hyuk Min; Kim, Young Ho; Park, Kwang Soon
2017-04-01
An operational TS (temperature and salinity) dataset production system was developed to provide near-real-time data to the data assimilation system periodically. It collects the latest 15 days' TS data for the northwestern Pacific area (20°N - 55°N, 110°E - 150°E), applies QC tests to the archived data, and supplies them to the numerical prediction models of KIOST (Korea Institute of Ocean Science and Technology). The latest real-time TS data are collected from the Argo GDAC and the GTSPP data server every week. Argo data are downloaded from the /latest_data directory of the Argo GDAC. Because many duplicates exist when all profile data are extracted from all Argo netCDF files, a database system is used to avoid duplication. All metadata (float ID, location, observation date and time, etc.) of all Argo floats are stored in the database, and a Matlab program was developed to manipulate the DB data, check for duplication, and exclude duplicated data. GTSPP data are downloaded from the /realtime directory of the GTSPP data service. The latest data, excluding Argo data, are extracted from the original data. Another Matlab program was coded to inspect all collected data using 10 QC tests and to produce the final dataset, which can be used by the assimilation system. Three regional range tests, inspecting annual, seasonal and monthly variations, are included in the QC procedures. A C program was developed to provide regional ranges to data managers; it can calculate the upper and lower limits of temperature and salinity at depths from 0 to 1550 m. The final TS dataset contains the latest 15 days' TS data in netCDF format. It is updated every week and transmitted to the numerical modelers of KIOST for operational use.
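A hedged sketch of a regional range test of the sort described above (the limits, bin width, variable names and flagging convention are all hypothetical; the operational system derives depth-dependent limits per region and season):

```python
import numpy as np

def regional_range_test(temp, depth, limits):
    """Flag temperature samples falling outside precomputed regional
    limits for their depth. `limits` maps a depth bin (m) to (lower,
    upper); returns 1 (pass) or 4 (fail), loosely following GTSPP-style
    quality flags."""
    flags = np.ones(len(temp), dtype=int)
    for i, (t, z) in enumerate(zip(temp, depth)):
        zbin = min(int(z // 100) * 100, 1500)  # 100 m bins down to 1550 m
        lo, hi = limits[zbin]
        if not (lo <= t <= hi):
            flags[i] = 4
    return flags

# Illustrative limits for a few depth bins (degrees C) -- not real values.
limits = {0: (-2.0, 35.0), 100: (-2.0, 30.0)}
limits.update({z: (-2.0, 25.0) for z in range(200, 1600, 100)})

flags = regional_range_test([12.3, 41.0], [50.0, 250.0], limits)
print(flags)  # -> [1 4]: the 41 C sample at 250 m fails the range test
```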
NASA Astrophysics Data System (ADS)
Brisc, Felicia; Vater, Stefan; Behrens, Joern
2016-04-01
We present the UGRID Reader, a visualization software component that implements the UGRID Conventions in ParaView. It currently supports the reading and visualization of 2D unstructured triangular, quadrilateral and mixed triangle/quadrilateral meshes, where the data can be defined per cell or per vertex. The Climate and Forecast Metadata Conventions (CF Conventions) have for many years been the standard framework for climate data written in netCDF format. While they allow storing unstructured data simply as data defined at a series of points, they do not currently address the topology of the underlying unstructured mesh. However, it is often necessary to have additional mesh topology information: is it a one-dimensional network, a 2D triangular mesh or a flexible mixed triangle/quadrilateral mesh, a 2D mesh with vertical layers, or a fully unstructured 3D mesh? The UGRID Conventions proposed by the UGRID Interoperability group attempt to fill this void by extending the CF Conventions with topology specifications. As the UGRID Conventions are increasingly popular with an important subset of the CF community, they warrant the development of a customized tool for the visualization and exploration of UGRID-conforming data. The implementation of the UGRID Reader follows the ParaView plugin architecture. This approach allowed us to tap into the powerful reading and rendering capabilities of ParaView, while keeping the reader easy to install. We aim for parallelism to be able to process large data sets. Furthermore, our current application of the reader is the visualization of higher-order simulation output, which demands a special representation of the data within a cell.
The BLAZE language: A parallel language for scientific programming
NASA Technical Reports Server (NTRS)
Mehrotra, P.; Vanrosendale, J.
1985-01-01
A Pascal-like scientific programming language, Blaze, is described. Blaze contains array arithmetic, forall loops, and APL-style accumulation operators, which allow natural expression of fine-grained parallelism. It also employs an applicative or functional procedure invocation mechanism, which makes it easy for compilers to extract coarse-grained parallelism using machine-specific program restructuring. Thus Blaze should allow one to achieve highly parallel execution on multiprocessor architectures, while still providing the user with conceptually sequential control flow. A central goal in the design of Blaze is portability across a broad range of parallel architectures. The multiple levels of parallelism present in Blaze code, in principle, allow a compiler to extract the types of parallelism appropriate for the given architecture while neglecting the remainder. The features of Blaze are described, and it is shown how this language would be used in typical scientific programming.
The "hour of pink twilight": lesbian poetics and queer encounters on the fin-de-siècle street.
Flint, Kate
2009-01-01
This essay examines the cultural representation of women's encounters on the fin-de-siècle street and, in particular, the uncertainties that clustered around the possibilities of mutual, or one-sided, same-sex desire accompanying such meetings. It argues, through an examination of lyric poetry, paintings, and short fiction, for the usefulness of twilight--a time of shadowy ambiguity--as a trope to suggest these uncertainties. More than this, it maintains that the lyric and the developing genre of the short story were modes ideally suited to an invocation of the fluid, the uncertain, and the unnamable. This argument is advanced through a close reading of Charlotte Mew's strange short story "Passed," which is read as representative of a transitional moment in lesbian literary history.
NASA Astrophysics Data System (ADS)
Lindholm, D. M.; Weigel, R. S.; Wilson, A.; Ware Dewolfe, A.
2009-12-01
Data analysis in the physical sciences is often plagued by the difficulty of acquiring the desired data. A great deal of work has been done in the area of metadata and data discovery; however, many such discoveries simply provide links that lead directly to a data file. Often these files are impractically large, containing more time samples or variables than desired, and are slow to access. Once these files are downloaded, format issues further complicate using the data. Some data servers have begun to address these problems by improving data virtualization and ease of use. However, these services often don't scale to large datasets. Also, the generic nature of the data models used by these servers, while providing greater flexibility, may complicate setting up such a service for data providers and omit semantics that would otherwise simplify use for clients, machine or human. The Time Series Data Server (TSDS) aims to address these problems within the limited, yet common, domain of time series data. With the simplifying assumption that all data products served are a function of time, the server can optimize data access based on time subsets, a common use case. The server also supports requests for specific variables, which can be of type scalar, structure, or sequence, as well as data types with higher-level semantics, such as "spectrum". The TSDS is implemented using Java Servlet technology and can be dropped into any servlet container and customized for a data provider's needs. The interface is based on OPeNDAP (http://opendap.org) and conforms to the Data Access Protocol (DAP) 2.0, a NASA standard (ESDS-RFC-004), which defines a simple HTTP request and response paradigm. Thus a TSDS server instance is a compliant OPeNDAP server that can be accessed by any OPeNDAP client or directly via RESTful web service requests. The TSDS reads the data that it serves into a common data model via the NetCDF Markup Language (NcML, http://www.unidata.ucar.edu/software/netcdf/ncml/), which enables dataset virtualization. An NcML file can expose a single file, a subset, or an aggregation of files as a single, logical dataset. With the appropriate NcML adapter, the TSDS can read data from its native format, eliminating the need for data providers to reformat their data and lowering the barrier for integration. Data can even be read via remote services, which is important for enabling VxOs to be truly virtual. The TSDS provides reading, writing, and filtering capabilities through a modular framework. A collection of standard modules is available, and customized modules are easy to create and integrate. This way the TSDS can read and write data in a variety of formats and apply filters to them in a manner customizable to meet the needs of both data providers and consumers. The TSDS server is currently in use serving solar irradiance data from the LASP Interactive Solar IRradiance Datacenter (LISIRD, http://lasp.colorado.edu/lisird/), and is being introduced into the space physics virtual observatory community. The TSDS software is Open Source and available at SourceForge.
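A hedged sketch of what a direct RESTful request to such a DAP 2.0 server looks like (the host and dataset path are hypothetical, as is the variable name; the bracketed index range is standard DAP constraint-expression syntax for requesting a time subset of a single variable):

```python
import requests

# Ask the server for an ASCII rendering of one variable, subset by index
# range, instead of downloading the whole dataset.
url = ("http://example.org/tsds/irradiance.asc"  # hypothetical TSDS instance
       "?irradiance[0:1:99]")                    # first 100 time samples only
resp = requests.get(url, timeout=30)
resp.raise_for_status()
print(resp.text[:200])  # header plus the first few comma-separated values
```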
The Basic Radar Altimetry Toolbox for Sentinel 3 Users
NASA Astrophysics Data System (ADS)
Lucas, Bruno; Rosmorduc, Vinca; Niemeijer, Sander; Bronner, Emilie; Dinardo, Salvatore; Benveniste, Jérôme
2013-04-01
The Basic Radar Altimetry Toolbox (BRAT) is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2006 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales). The latest version of the software, 3.1, was released in March 2012. The tools enable users to interact with the most common altimetry data formats, the most used entry point being the Graphical User Interface (BratGui). This GUI is a front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. The BratDisplay (graphic visualizer) can be launched from BratGui, or used as a stand-alone tool to visualize netCDF files; it is distributed with another ESA toolbox (GUT) as its visualizer. The most frequent uses of BRAT are teaching remote sensing, altimetry data reading (all missions from ERS-1 to SARAL, and soon Sentinel-3), quick data visualization/export, and simple computations on the data fields. BRAT can be used for importing data and having a quick look at its contents, with several different types of plotting available. One can also use it to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BratGui, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas (MSS, SSH, MSLA, editing of spurious data, etc.). The documentation collection includes the standard user manual explaining all the ways to interact with the set of software tools, but the most important item is the Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, showing its applications in different fields such as oceanography, the cryosphere, geodesy and hydrology, among others. Also included are "data use cases", with step-by-step examples of how to use the toolbox in different contexts. The upcoming release that is on the forge will focus on the Sentinel-3 Surface Topography Mission, which builds on the successful heritage of ERS, Envisat and CryoSat. The first of the two Sentinels is expected to be launched in 2014. It will carry a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter and will provide measurements at a resolution of ~300 m along track in SAR mode. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The future version will provide, among other enhancements, support for reading the upcoming Sentinel-3 datasets and specific "use cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. The BRAT software is distributed under the GNU GPL open-source license and can be obtained, along with all the documentation (including the tutorial), from the website: http://earth.esa.int/brat
WMT: The CSDMS Web Modeling Tool
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.
2015-12-01
The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server, where it is stored and from which a user can download it as a single compressed archive file.
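A hedged sketch of a client talking to such a layered REST API; every URL, endpoint and JSON field below is hypothetical, invented only to illustrate the pattern of fetching JSON-encoded component metadata and posting a run, and should not be read as the actual WMT API:

```python
import requests

BASE = "https://csdms.example.edu/wmt"  # hypothetical server

# The database layer serves component metadata as JSON, which a client
# uses to discover exchange items and input parameters before coupling.
comp = requests.get(f"{BASE}/components/hydrotrend").json()
print(comp["name"], [p["key"] for p in comp["parameters"]])

# Submit a saved model for execution on a registered execution server.
run = requests.post(f"{BASE}/runs", json={"model_id": 42,
                                          "host": "hpc.example.edu"})
print(run.json()["status"])
```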
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.
2015-12-01
The National Computational Infrastructure (NCI) at the Australian National University (ANU) has evolved to become Australia's peak computing centre for national computational and data-intensive Earth system science. More recently, NCI collocated 10 petabytes of 34 major national and international environmental, climate, earth system, geophysics and astronomy data collections to create the National Environmental Research Interoperability Data Platform (NERDIP). Spatial scales of the collections range from global to local ultra-high resolution, while sizes range from 3 PB down to a few GB. The data is highly connected to both NCI HPC and cloud resources via low-latency internal networks with massive bandwidth. Now that the collections are collocated on a single data platform, the 'Hype' and expectations around potential use cases for NERDIP are high, and not unexpected issues are emerging, such as access, licensing, ownership, and incompatible data standards. Many communities are standardised within their own domain, but achieving true interdisciplinary science will require all communities to move towards open, interoperable data formats such as netCDF-4/HDF5; this transition will impact software using proprietary or non-open standards. Before we reach the 'Plateau of Productivity', there needs to be greater 'Enlightenment' of users, encouraging them to realise that this unprecedented Earth system science platform provides a rich mine of opportunities for discovery and innovation across a diverse range of domain-specific and interdisciplinary investigations, including climate and weather research, impact analysis, environment, remote sensing and geophysics. New and innovative interdisciplinary use cases will guide those architecting the system, help minimise the amplitude of the 'Trough of Disillusionment', and ensure greater productivity and uptake of the collections that make NERDIP unique in the next generation of data-intensive science.
The Convergence of High Performance Computing and Large Scale Data Analytics
NASA Astrophysics Data System (ADS)
Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.
2015-12-01
As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
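A hedged sketch of the kind of spatiotemporal index described above; the table layout, field names and the mapping to HDFS chunk locations are all hypothetical, the point being that a small relational index can translate a query footprint into data locations before any heavy I/O starts:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE chunk_index (
                 variable TEXT, t0 REAL, t1 REAL,
                 lat0 REAL, lat1 REAL, lon0 REAL, lon1 REAL,
                 hdfs_path TEXT, byte_offset INTEGER)""")
con.execute("INSERT INTO chunk_index VALUES "
            "('T2M', 0, 24, -90, 0, -180, 0, '/merra/t2m_000.nc', 8192)")

# Map a query's space-time footprint to the file chunks that intersect it.
rows = con.execute("""SELECT hdfs_path, byte_offset FROM chunk_index
                      WHERE variable = ? AND t1 >= ? AND t0 <= ?
                        AND lat1 >= ? AND lat0 <= ?""",
                   ("T2M", 6, 12, -45, -30)).fetchall()
print(rows)  # -> [('/merra/t2m_000.nc', 8192)]
```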
Armstrong, Brandy N.; Warner, John C.; Voulgaris, George; List, Jeffrey H.; Thieler, E. Robert; Martini, Marinna A.; Montgomery, Ellyn T.
2011-01-01
This Open-File Report provides information collected for an oceanographic field study that occurred during January-May 2009 to investigate processes that control the sediment transport dynamics at Diamond Shoals, North Carolina. The objective of this report is to make the data available in digital form and to provide information to facilitate further analysis of the data. The report describes the background, experimental setup, equipment, and locations of the sensor deployments. The edited data are presented in time-series plots for rapid visualization of the data set, and in data files that are in the Network Common Data Format (netCDF). Supporting observational data are also included.
GOME/ERS-2: New Homogeneous Level 1B Data from an Old Instrument
NASA Astrophysics Data System (ADS)
Slijkhuis, S.; Aberle, B.; Coldewey-Egbers, M.; Loyola, D.; Dehn, A.; Fehr, T.
2015-11-01
In the framework of ESA's "GOME Evolution Project", a reprocessing will be made of the entire 16-year GOME Level 1 dataset. The GOME Evolution Project further includes the generation of a new GOME water vapour product, and a public outreach programme. In this paper we will describe the reprocessing of the Level 1 data, carried out with the latest version of the GOME Data Processor at DLR. The change most visible to the user will be the new product format in NetCDF, plus supporting documentation (ATBD and PUM). Full-mission reprocessed L1b data are expected to be released in the 4th quarter of 2015.
Web-based CERES Clouds QC Property Viewing Tool
NASA Astrophysics Data System (ADS)
Smith, R. A.; Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Minnis, P.
2014-12-01
This presentation will display the capabilities of a web-based CERES cloud property viewer; Terra data will be used for the examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Images can be manipulated by the user to narrow the boundaries of the map, adjust color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool. A laptop will hopefully be available to allow conference attendees to try navigating the tool.
Arguing from Nature: The role of `nature' in students' argumentations on a socio-scientific issue
NASA Astrophysics Data System (ADS)
Nielsen, Jan Alexis
2012-03-01
This paper explores how students invoked different conceptions of 'nature' in eight socio-scientific group discussions about human gene therapy. The paper illustrates and discusses how the students articulated nature and to what extent they elicited science factual content in the process. While the students in this study invoked nature at key places in a variety of dialectical contexts in the discussions, these invocations were often uncritical appeals and rarely involved science factual content. Even when an argument from nature was challenged, the author of that argument would often shift the sense of nature rather than elaborate upon the argumentation. It is argued that if students were properly introduced to the evaluative character of the term 'nature' it would not just be conducive to the quality of their argumentation, but also invite them to foreground science factual content at key places in their discussion.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled, time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible, lightweight component model that streamlines the integration of standalone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation; configuration, task, and data management; asynchronous event management; simulation monitoring; and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
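A hedged sketch of the thin Python wrapping-layer idea: the component interface shown here (init/step/finalize plus a framework-provided services object, with invented method names) is a hypothetical simplification for illustration, not the actual IPS specification.

```python
import subprocess

class SolverComponent:
    """Thin wrapper adapting a standalone executable to a framework's
    component interface (all interface and service names are illustrative)."""

    def __init__(self, services, config):
        self.services = services          # hypothetical framework services
        self.exe = config["executable"]   # path to the standalone code

    def init(self, t0):
        # Stage input files for the wrapped code via framework data services.
        self.services.stage_input_files(["plasma_state.nc"])

    def step(self, t):
        # Launch the unmodified standalone code; a real framework would
        # schedule this as a parallel task on the MPP allocation.
        subprocess.run([self.exe, f"--time={t}"], check=True)
        # Publish results through the common, file-based plasma state layer.
        self.services.update_plasma_state("plasma_state.nc")

    def finalize(self):
        self.services.archive_outputs()
```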
Culture at work: Family therapy and the culture concept in post-World War II America.
Weinstein, Deborah F
2004-01-01
During the 1950s and 1960s, the concept of culture had currency beyond the disciplinary boundaries of anthropology and sociology. This article takes up a clinical example of the invocation of the culture concept by examining how early family therapists such as Nathan Ackerman, Murray Bowen, and Don Jackson used culture as a category of analysis during the formative years of their new field. The culture concept played an integral role in the processes by which family therapists simultaneously defined the object of their research and treatment, the family, and built their new field. Their varied uses of culture also contained tensions and contradictions, most notably between universal and relativist views of family and psychopathology and between views of family therapy as a conservative force for maintaining the nuclear family or a progressive force for overcoming social inequality. Copyright 2004 Wiley Periodicals, Inc.
Understanding Aprun Use Patterns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Hwa-Chun Wendy
2009-05-06
On the Cray XT, aprun is the command to launch an application on a set of compute nodes reserved through the Application Level Placement Scheduler (ALPS). At the National Energy Research Scientific Computing Center (NERSC), interactive aprun is disabled; that is, invocations of aprun have to go through the batch system. Batch scripts can, and often do, contain several apruns, which either use subsets of the reserved nodes in parallel, or use all reserved nodes in consecutive apruns. In order to better understand how NERSC users run on the XT, it is necessary to associate aprun information with jobs. It is surprisingly more challenging than it sounds. In this paper, we describe those challenges and how we solved them to produce daily per-job reports for completed apruns. We also describe additional uses of the data, e.g. adjusting charging policy accordingly or associating node failures with jobs/users, and plans for enhancements.
Rudowski, R; Frostell, C; Gill, H
1989-09-01
KUSIVAR is an expert system for the mechanical ventilation of adult patients suffering from respiratory insufficiency. Its main objective is to provide guidance in respirator management. The knowledge base includes both qualitative, rule-based knowledge and quantitative knowledge expressed in the form of mathematical models (expert control), which is used for the prediction of arterial gas tensions and for optimization purposes. The system is data driven and uses a forward chaining mechanism for rule invocation. The interaction with the user will be performed in advisory, critiquing, semi-automatic and automatic modes. The system is at present in an advanced prototype stage. Prototyping is performed using KEE (Knowledge Engineering Environment) on a Sperry Explorer workstation. For further development and clinical use, the expert system will be downloaded to an advanced PC. The system is intended to support therapy with a Siemens-Elema Servoventilator 900 C.
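A hedged, minimal illustration of the data-driven forward chaining the abstract describes; the facts and rules below are invented toy examples, not KUSIVAR's actual knowledge base:

```python
# Each rule: (set of required facts, fact to assert when they all hold).
rules = [
    ({"PaO2_low", "FiO2_high"}, "oxygenation_poor"),
    ({"oxygenation_poor"}, "suggest_increase_PEEP"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are satisfied by the
    current facts until no new fact can be derived (data-driven control,
    in contrast to goal-driven backward chaining)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"PaO2_low", "FiO2_high"}, rules))
# -> includes 'oxygenation_poor' and 'suggest_increase_PEEP'
```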
Hardisty, Frank; Robinson, Anthony C.
2010-01-01
In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
Protecting Database Centric Web Services against SQL/XPath Injection Attacks
NASA Astrophysics Data System (ADS)
Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique
Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks, by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, showing to be 100% effective in stopping attacks, non-intrusive and very easy to use.
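The paper's own mechanism transparently intercepts and aborts suspicious service invocations; as a hedged, generic contrast, the sketch below shows the standard parameterized-query mitigation in Python, which removes the injection vector at the call site rather than detecting attacks at runtime (table and column names are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, password TEXT)")
con.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic SQL injection payload

# Vulnerable pattern: string concatenation lets the payload rewrite the
# query's logic (commented out on purpose).
# unsafe = f"SELECT * FROM users WHERE name = '{user_input}'"

# Safe pattern: placeholders keep the payload as data, never as SQL syntax.
rows = con.execute("SELECT * FROM users WHERE name = ?",
                   (user_input,)).fetchall()
print(rows)  # -> []: the payload matches no user instead of matching all
```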
On different types of uncertainties in the context of the precautionary principle.
Aven, Terje
2011-10-01
Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.
Clausewitz, nonlinearity, and the unpredictability of war
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyerchen, A.
Despite the frequent invocations of his name in recent years, especially during the Gulf War, there is something deeply perplexing about the work of Carl von Clausewitz (1780-1831). In particular, his unfinished magnum opus On War seems to offer a theory of war at the same time that it perversely denies many of the fundamental preconditions of theory as such - simplification, generalization and prediction, among others. The book continues to draw the attention of both soldiers and theorists of war, although soldiers often find the ideas of Clausewitz too philosophical to appear practical, while analysts usually find his thoughts too empirical to seem elegant. Members of both groups sense that there is too much truth in what he writes to ignore him. Yet, as the German historian Hans Rothfels has bluntly put it, Clausewitz is 'an author more quoted than actually read.' 84 refs.
The BLAZE language - A parallel language for scientific programming
NASA Technical Reports Server (NTRS)
Mehrotra, Piyush; Van Rosendale, John
1987-01-01
A Pascal-like scientific programming language, BLAZE, is described. BLAZE contains array arithmetic, forall loops, and APL-style accumulation operators, which allow natural expression of fine grained parallelism. It also employs an applicative or functional procedure invocation mechanism, which makes it easy for compilers to extract coarse grained parallelism using machine specific program restructuring. Thus BLAZE should allow one to achieve highly parallel execution on multiprocessor architectures, while still providing the user with conceptually sequential control flow. A central goal in the design of BLAZE is portability across a broad range of parallel architectures. The multiple levels of parallelism present in BLAZE code, in principle, allow a compiler to extract the types of parallelism appropriate for the given architecture while neglecting the remainder. The features of BLAZE are described and it is shown how this language would be used in typical scientific programming.
Automated Environment Generation for Software Model Checking
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.
2003-01-01
A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
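As a hedged, generic illustration of what a generated environment looks like (shown in Python for consistency with the other sketches here, although BEG targets Java; the component interface is invented, and the random sampling merely stands in for the exhaustive choice exploration a model checker performs):

```python
import random

class Component:
    """Stand-in for the system under analysis (invented interface)."""
    def open(self): print("open")
    def write(self, data): print("write", data)
    def close(self): print("close")

def environment_driver(component, steps=5, seed=0):
    """Abstract environment: nondeterministically exercises the public
    interface, approximating 'any possible client' of the component.
    A model checker explores every choice point; random sampling here
    only illustrates the shape of such a generated driver."""
    rng = random.Random(seed)
    actions = [component.open,
               lambda: component.write(rng.choice([0, 1])),  # abstracted values
               component.close]
    for _ in range(steps):
        rng.choice(actions)()

environment_driver(Component())
```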
Semantic Web Service Delivery in Healthcare Based on Functional and Non-Functional Properties.
Schweitzer, Marco; Gorfer, Thilo; Hörbst, Alexander
2017-01-01
In the past decades, much effort has been devoted to the trans-institutional exchange of healthcare data through electronic health records (EHRs) in order to obtain a lifelong, shared, accessible health record of a patient. Besides basic information exchange, there is a growing need for Information and Communication Technology (ICT) to support the use of the collected health data in an individual, case-specific, workflow-based manner. This paper presents results on how workflows can be used to process data from electronic health records, following a semantic web service approach that enables automatic discovery, composition and invocation of suitable web services. Based on this solution, the user (physician) can define his or her needs from a domain-specific perspective, while the ICT system fulfills those needs with modular web services. By also involving non-functional properties in the service selection, this approach is all the more suitable for the dynamic medical domain.
Ontology-aided Data Fusion (Invited)
NASA Astrophysics Data System (ADS)
Raskin, R.
2009-12-01
An ontology provides semantic descriptions that are analogous to those in a dictionary but are readable by both computers and humans. A dataset or service is semantically annotated when it is formally associated with elements of an ontology. The ESIP Federation Semantic Web Cluster has developed a set of ontologies to describe datatypes and data services that can be used to support automated data fusion. The service ontology includes descriptors of the service function, its inputs/outputs, and its invocation method. The datatype descriptors resemble typical metadata fields (data format, data model, data structure, originator, etc.) augmented with descriptions of the meaning of the data. These ontologies, in combination with the SWEET science ontology, enable registered data fusion services to be chained together and invoked in a way that is scientifically meaningful, based on machine understanding of the associated data and services. This presentation describes initial results and experiences in automated data fusion.
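A hedged sketch of semantic annotation with the rdflib Python library; the namespace URIs, class and property names below are illustrative placeholders, not the actual ESIP service/datatype ontologies or SWEET terms:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Hypothetical vocabularies standing in for the real service/datatype ontologies.
SVC = Namespace("http://example.org/service-ontology#")
DATA = Namespace("http://example.org/datatype-ontology#")

g = Graph()
regridder = URIRef("http://example.org/services/regrid")

# Annotate a service with its function, input/output types and invocation
# method, so a planner can decide whether it is chainable with a dataset.
g.add((regridder, RDF.type, SVC.DataFusionService))
g.add((regridder, SVC.hasInput, DATA.SeaSurfaceTemperatureGrid))
g.add((regridder, SVC.hasOutput, DATA.SeaSurfaceTemperatureGrid))
g.add((regridder, SVC.invocationMethod, Literal("HTTP GET")))

print(g.serialize(format="turtle"))
```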
An XML-based system for the flexible classification and retrieval of clinical practice guidelines.
Ganslandt, T.; Mueller, M. L.; Krieglstein, C. F.; Senninger, N.; Prokosch, H. U.
2002-01-01
Beneficial effects of clinical practice guidelines (CPGs) have not yet reached expectations due to limited routine adoption. Electronic distribution and reminder systems have the potential to overcome implementation barriers. Existing electronic CPG repositories like the National Guideline Clearinghouse (NGC) provide individual access but lack the standardized computer-readable interfaces necessary for automated guideline retrieval. The aim of this paper was to facilitate automated, context-based selection and presentation of CPGs. Using attributes from the NGC classification scheme, an XML-based metadata repository was successfully implemented, providing document storage, classification and retrieval functionality. Semi-automated extraction of attributes was implemented for the import of XML guideline documents using XPath. As an example, a hospital information system interface was implemented for diagnosis-based guideline invocation. Limitations of the implemented system are discussed and possible future work is outlined. Integration of standardized computer-readable search interfaces into existing CPG repositories is proposed. PMID:12463831
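A hedged sketch of XPath-based attribute extraction using Python's standard library; the element and attribute names in this toy document are invented, where the real system would use the NGC classification scheme's fields:

```python
import xml.etree.ElementTree as ET

# Invented guideline record loosely imitating a classified CPG document.
doc = """
<guideline id="cpg-001">
  <classification>
    <attribute name="disease">cholelithiasis</attribute>
    <attribute name="specialty">surgery</attribute>
  </classification>
  <title>Management of gallstone disease</title>
</guideline>
"""

root = ET.fromstring(doc)
# ElementTree supports a useful subset of XPath for extractions like these.
title = root.findtext("title")
attrs = {a.get("name"): a.text
         for a in root.findall(".//classification/attribute")}
print(title, attrs)  # context keys for diagnosis-based guideline retrieval
```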
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Armstrong, E. M.
2016-12-01
Figuratively speaking, Scientific Datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth sciences, the most notable general examples include the HDF family, netCDF, etc., with other formats such as GRIB being used pervasively within specific domains such as the oceanographic, atmospheric and meteorological sciences. Such file formats contain coverage data, i.e., a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly surrounding visualization and interactive use within web clients), is that this is still not a straightforward matter, due to size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g., the OPeNDAP protocol). Therefore alternative, Web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way that fits in with the linked data publication paradigm, hence lowering the barrier for interpretation by consumers via mobile devices, client applications, etc., as well as by data producers, who can build next-generation, Web-friendly web services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data, embedded within SD landing pages as well as via semantic data repositories. We are currently evaluating how the utilization of CoverageJSON within SD landing pages addresses the long-standing acknowledgement that SD producers have not been addressing content-based optimization of their SD landing pages for better crawlability by commercial search engines.
Workflow-Oriented Cyberinfrastructure for Sensor Data Analytics
NASA Astrophysics Data System (ADS)
Orcutt, J. A.; Rajasekar, A.; Moore, R. W.; Vernon, F.
2015-12-01
Sensor streams comprise an increasingly large part of Earth science data. Analytics based on sensor data require an easy way to perform operations such as acquisition, conversion to physical units, metadata linking, sensor fusion, analysis and visualization on distributed sensor streams. Furthermore, embedding real-time sensor data into scientific workflows is of growing interest. We have implemented a scalable networked architecture that can be used to dynamically access packets of data in a stream from multiple sensors, and perform synthesis and analysis across a distributed network. Our system is based on the integrated Rule Oriented Data System (irods.org), which accesses sensor data from the Antelope Real Time Data System (brtt.com) and provides virtualized access to collections of data streams. We integrate real-time data streaming from different sources, collected for different purposes, on different time and spatial scales, and sensed by different methods. iRODS, noted for its policy-oriented data management, brings to sensor processing features and facilities such as single sign-on, third-party access control lists (ACLs), location transparency, logical resource naming, and server-side modeling capabilities, while reducing the burden on sensor network operators. Rich integrated metadata support also makes it straightforward to discover data streams of interest and maintain data provenance. The workflow support in iRODS readily integrates sensor processing into any analytical pipeline. The system is developed as part of the NSF-funded Datanet Federation Consortium (datafed.org). APIs for selecting, opening, reaping and closing sensor streams are provided, along with other helper functions to associate metadata and convert sensor packets into NetCDF and JSON formats. Near-real-time sensor data, including seismic sensors, environmental sensors, LIDAR and video streams, are available through this interface. A system for archiving sensor data and metadata in NetCDF format has been implemented and will be demonstrated at AGU.
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Liao, Wei-keng
Computational science applications have been described as having one of seven motifs (the "seven dwarfs"), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to assist new users in learning to use the library.
Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions
NASA Astrophysics Data System (ADS)
Riechert, Maik; Blower, Jon; Griffiths, Guy
2016-04-01
Coverage data, typically big in data volume, assigns values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Factors that are relevant here are the processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways of working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies like JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, alongside having simple rendered images available using standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. The development also considered CoverageJSON as a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web, together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including reclassification of a land cover dataset client-side within the browser, with the ability for the user to influence the reclassification result, by making use of the above technologies.
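A minimal, hedged sketch of the overall shape of a CoverageJSON document, built as a Python dict; the axis values and the parameter are invented, and required referencing and parameter fields are omitted or simplified, so consult the CoverageJSON specification for a complete document:

```python
import json

coverage = {
    "type": "Coverage",
    "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {
            "x": {"values": [-10.0, 0.0, 10.0]},
            "y": {"values": [40.0, 50.0]},
            "t": {"values": ["2016-01-01T00:00:00Z"]},
        },
        # Referencing systems (CRS, calendar) omitted for brevity.
    },
    "parameters": {
        "LC": {  # invented land-cover parameter
            "type": "Parameter",
            "observedProperty": {"label": {"en": "Land cover class"}},
        }
    },
    "ranges": {
        "LC": {
            "type": "NdArray",
            "dataType": "integer",
            "axisNames": ["t", "y", "x"],
            "shape": [1, 2, 3],
            "values": [1, 2, 2, 3, 1, 1],
        }
    },
}
print(json.dumps(coverage, indent=2))
```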
Rescue, Archival and Discovery of Tsunami Events on Marigrams
NASA Astrophysics Data System (ADS)
Eble, M. C.; Wright, L. M.; Stroker, K. J.; Sweeney, A.; Lancaster, M.
2017-12-01
The Big Earth Data Initiative made possible the reformatting of paper marigram records on which were recorded measurements of the 1946, 1952, 1960, and 1964 tsunamis generated in the Pacific Ocean. The data contained within each record were determined to be invaluable for tsunami researchers and operational agencies with a responsibility for issuing warnings during a tsunami event. All marigrams were carefully digitized, and metadata were generated to form numerical datasets, in order to provide the tsunami and other research and application-driven communities with quality data. Data were then packaged as CF-compliant netCDF files and submitted to the NOAA Centers for Environmental Information for long-term stewardship, archival, and public discovery of both the original scanned images and the data in digital netCDF and CSC formats. PNG plots of each time series were generated and included with the data packages to provide a visual representation of the numerical data sets. ISO-compliant metadata were compiled for the collection at the event level, and individual DOIs were minted for each of the four events included in this project. The procedure followed to reformat each record in this four-event subset of the larger NCEI scanned-marigram inventory is presented and discussed. The practical use of these data is presented to highlight that even infrequent measurements of tsunamis hold information that may potentially help constrain earthquake rupture area, provide estimates of earthquake co-seismic slip distribution, identify subsidence or uplift, and significantly increase the holdings of in situ data available for tsunami model validation. These same data may also prove valuable to the broader global tide community for the validation and further development of tide models and for investigation into the stability of tidal harmonic constants. Data reformatted as part of this project are PARR compliant and meet the requirements for Data Management, Discoverability, Accessibility, Documentation, Readability, and Data Preservation and Stewardship as per the Big Earth Data Initiative.
A Comparison of a Brain-Based Adaptive System and a Manual Adaptable System for Invoking Automation
NASA Technical Reports Server (NTRS)
Bailey, Nathan R.; Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Scott, Lorissa A.
2004-01-01
Two experiments are presented that examine alternative methods for invoking automation. In each experiment, participants were asked to perform simultaneously a monitoring task and a resource management task as well as a tracking task that changed between automatic and manual modes. The monitoring task required participants to detect failures of an automated system to correct aberrant conditions under either high or low system reliability. Performance on each task was assessed as well as situation awareness and subjective workload. In the first experiment, half of the participants worked with a brain-based system that used their EEG signals to switch the tracking task between automatic and manual modes. The remaining participants were yoked to participants from the adaptive condition and received the same schedule of mode switches, but their EEG had no effect on the automation. Within each group, half of the participants were assigned to either the low or high reliability monitoring task. In addition, within each combination of automation invocation and system reliability, participants were separated into high and low complacency potential groups. The results revealed no significant effects of automation invocation on the performance measures; however, the high complacency individuals demonstrated better situation awareness when working with the adaptive automation system. The second experiment was the same as the first with one important exception. Automation was invoked manually. Thus, half of the participants pressed a button to invoke automation for 10 s. The remaining participants were yoked to participants from the adaptable condition and received the same schedule of mode switches, but they had no control over the automation. The results showed that participants who could invoke automation performed more poorly on the resource management task and reported higher levels of subjective workload. Further, those who invoked automation more frequently performed more poorly on the tracking task and reported higher levels of subjective workload. A comparison between the adaptive condition in the first experiment and the adaptable condition in the second experiment revealed only one significant difference: subjective workload was higher in the adaptable condition. Overall, the results show that a brain-based, adaptive automation system may facilitate situation awareness for those individuals who are more complacent toward automation. By contrast, requiring operators to invoke automation manually may have some detrimental impact on performance and does appear to increase subjective workload relative to an adaptive system.
NCL script: cmaq_ensemble_isam_4panels_subdomain.ncl
NetCDF input file for the NCL script, containing ensemble means and standard deviations of ISAM SO4 and O3 contributions from IPM: test.nc
Plot (ps): maps_isam_mean_std_lasthour_ipm_so4_o3_east.ps
Plot (pdf): maps_isam_mean_std_lasthour_ipm_so4_o3_east.pdf
Plot (ncgm): maps_isam_mean_std_lasthour_ipm_so4_o3_east.ncgm
This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. Journal of Geophysical Research-Atmospheres, American Geophysical Union, Washington, DC, USA, 120(23): 12,259-12,280 (2015).
Development of a Multilayer MODIS IST-Albedo Product of Greenland
NASA Technical Reports Server (NTRS)
Hall, D. K.; Comiso, J. C.; Cullather, R. I.; Digirolamo, N. E.; Nowicki, S. M.; Medley, B. C.
2017-01-01
A new multilayer IST-albedo Moderate Resolution Imaging Spectroradiometer (MODIS) product of Greenland was developed to meet the needs of the ice sheet modeling community. The multiple layers of the product enable the relationship between IST and albedo to be evaluated easily. Surface temperature is a fundamental input for dynamical ice sheet models because it is a component of the ice sheet radiation budget and mass balance. Albedo influences absorption of incoming solar radiation. The daily product will combine the existing standard MODIS Collection-6 ice-surface temperature, derived melt maps, snow albedo and water vapor products. The new product is available in a polar stereographic projection in NetCDF format. The product will ultimately extend from March 2000 through the end of 2017.
EverVIEW: a visualization platform for hydrologic and Earth science gridded data
Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig
2015-01-01
The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
Ramses-GPU: Second order MUSCL-Hancock finite volume fluid solver
NASA Astrophysics Data System (ADS)
Kestener, Pierre
2017-10-01
RamsesGPU is a reimplementation of RAMSES (ascl:1011.007) which drops the adaptive mesh refinement (AMR) features to optimize 3D uniform grid algorithms for modern graphics processing units (GPUs), providing an efficient software package for astrophysics applications that do not need AMR features but do require a very large number of integration time steps. RamsesGPU provides a very efficient C++/CUDA/MPI software implementation of a second order MUSCL-Hancock finite volume fluid solver for compressible hydrodynamics, as well as a magnetohydrodynamics solver based on the constrained transport technique. Other useful modules include static gravity, dissipative terms (viscosity, resistivity), and a forcing source term for turbulence studies; special care was taken to enhance parallel input/output performance by using state-of-the-art libraries such as HDF5 and Parallel-netCDF.
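For orientation, the MUSCL-Hancock scheme in its schematic one-dimensional textbook form (cf. Toro); RamsesGPU's production implementation is, of course, more elaborate.

```latex
% Schematic one-dimensional MUSCL-Hancock update (textbook form);
% the production code's details may differ.
\begin{align*}
  U_i^{L,R} &= U_i^n \mp \tfrac{1}{2}\,\bar{\Delta}_i
      && \text{(slope-limited reconstruction)} \\
  \bar{U}_i^{L,R} &= U_i^{L,R}
      + \frac{\Delta t}{2\,\Delta x}\left[F\!\left(U_i^{L}\right) - F\!\left(U_i^{R}\right)\right]
      && \text{(half-step predictor)} \\
  U_i^{n+1} &= U_i^n
      - \frac{\Delta t}{\Delta x}\left(F_{i+1/2} - F_{i-1/2}\right)
      && \text{(corrector with Riemann fluxes)}
\end{align*}
```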
SysSon - A Framework for Systematic Sonification Design
NASA Astrophysics Data System (ADS)
Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns
2015-04-01
SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used, e.g., climate science. Both technical and socio-cultural barriers have to be overcome. The approach was developed together with climate scientists, who participated in contextual inquiries, usability tests and a collaborative design workshop. Our final software framework resulted from these extensive user tests. As a frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.
Efficiently Serving HDF5 Products via OPeNDAP
NASA Technical Reports Server (NTRS)
Yang, Kent
2017-01-01
Hyrax OPeNDAP services are widely used by Earth Science data centers at NASA, NOAA and other organizations to serve end users. In this talk, we will present some key features added to the HDF5 Hyrax OPeNDAP handler that can help data centers better serve HDF5/netCDF-4 data products. Among these new features, we will focus on the following:
1. DAP4 support.
2. Memory-cache and disk-cache support that can reduce service access time.
3. An enhancement that enables swath-like HDF5 products to be visualized by CF-aware client tools.
We will also discuss in depth the role of the HDF5 handler in the recent study of the Hyrax service in a cloud environment.
Rational rates of uniform decay for strong solutions to a fluid-structure PDE system
NASA Astrophysics Data System (ADS)
Avalos, George; Bucci, Francesca
2015-06-01
In this work we investigate the uniform stability properties of solutions to a well-established partial differential equation (PDE) model for a fluid-structure interaction. The PDE system under consideration comprises a Stokes flow which evolves within a three-dimensional cavity; moreover, a Kirchhoff plate equation is invoked to describe the displacements along a (fixed) portion, say Ω, of the cavity wall. Contact between the respective fluid and structure dynamics occurs on the boundary interface Ω. The main result in the paper is as follows: the solutions to the composite PDE system, corresponding to smooth initial data, decay at the rate of O(1/t). Our method of proof hinges upon the appropriate invocation of a relatively recent resolvent criterion for polynomial decays of C0-semigroups. While the characterization provided by said criterion originates in the context of operator theory and functional analysis, the work entailed here is wholly within the realm of PDE.
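The "relatively recent resolvent criterion" is presumably of the Borichev-Tomilov type, which links the growth of the resolvent along the imaginary axis to the polynomial decay rate of the semigroup:

```latex
% Polynomial-decay resolvent criterion of Borichev-Tomilov type:
% for a bounded $C_0$-semigroup $T(t) = e^{At}$ on a Hilbert space
% with $i\mathbb{R} \subset \rho(A)$ and a fixed $\alpha > 0$,
\[
  \|(is - A)^{-1}\| = O\!\left(|s|^{\alpha}\right) \ (|s| \to \infty)
  \iff
  \|T(t)A^{-1}\| = O\!\left(t^{-1/\alpha}\right) \ (t \to \infty),
\]
% so the rate $O(1/t)$ quoted in the abstract corresponds to $\alpha = 1$.
```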
NASA Astrophysics Data System (ADS)
Ragozzine, Brett
The invocation of dark matter in the universe is predicated upon gravitational observations that cannot be explained by the amount of luminous matter that we detect. There is an ongoing debate over which gravitational model is correct. The work herein tests a prescription of gravity theory known as Tensor-Vector-Scalar and is based upon the work of Angus et al. (2007). We build upon this work by extending the sample of galaxy clusters to five and testing the accepted Navarro, Frenk & White (NFW) dark matter potential (Navarro et al., 1996). Our independent implementation of this method includes weak gravitational lensing analysis to determine the amount of dark matter in these galaxy clusters by calculating the gas fraction ƒgas = Mgas/Mtot. The ability of the Tensor-Vector-Scalar theory to predict a consistent ƒgas across all galaxy clusters is a measure of its likelihood of being the correct gravity model.
NASA Technical Reports Server (NTRS)
Engelberg, N.; Shaw, C., III
1984-01-01
The design of a uniform command language to be used in a local area network of heterogeneous, autonomous nodes is considered. After examining the major characteristics of such a network, and after considering the profile of a scientist using the computers on the net as an investigative aid, a set of reasonable requirements for the command language are derived. Taking into account the possible inefficiencies in implementing a guest-layered network operating system and command language on a heterogeneous net, the authors examine command language naming, process/procedure invocation, parameter acquisition, help and response facilities, and other features found in single-node command languages, and conclude that some features may extend simply to the network case, others extend after some restrictions are imposed, and still others require modifications. In addition, it is noted that some requirements considered reasonable (user accounting reports, for example) demand further study before they can be efficiently implemented on a network of the sort described.
Computing Flows Using Chimera and Unstructured Grids
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Zheng, Yao
2006-01-01
DRAGONFLOW is a computer program that solves the Navier-Stokes equations of flows in complexly shaped three-dimensional regions discretized by use of a direct replacement of arbitrary grid overlapping by nonstructured (DRAGON) grid. A DRAGON grid is a combination of a chimera grid (a composite of structured subgrids) and a collection of unstructured subgrids. DRAGONFLOW incorporates modified versions of two prior Navier-Stokes-equation-solving programs: OVERFLOW, which is designed to solve on chimera grids; and USM3D, which is used to solve on unstructured grids. A master module controls the invocation of individual modules in the libraries. At each time step of a simulated flow, OVERFLOW is invoked on the chimera portion of the DRAGON grid in alternation with USM3D, which is invoked on the unstructured subgrids of the DRAGON grid. The USM3D and OVERFLOW modules then immediately exchange their solutions and other data. As a result, USM3D and OVERFLOW are coupled seamlessly.
Hybridization can facilitate species invasions, even without enhancing local adaptation.
Mesgaran, Mohsen B; Lewis, Mark A; Ades, Peter K; Donohue, Kathleen; Ohadi, Sara; Li, Chengjun; Cousens, Roger D
2016-09-06
The founding population in most new species introductions, or at the leading edge of an ongoing invasion, is likely to be small. Severe Allee effects (reductions in individual fitness at low population density) may then result in a failure of the species to colonize, even if the habitat could support a much larger population. Using a simulation model for plant populations that incorporates demography, mating systems, quantitative genetics, and pollinators, we show that Allee effects can potentially be overcome by transient hybridization with a resident species or an earlier colonizer. This mechanism does not require the invocation of adaptive changes usually attributed to invasions following hybridization. We verify our result in a case study of sequential invasions by two plant species where the outcrosser Cakile maritima has replaced an earlier, inbreeding, colonizer Cakile edentula (Brassicaceae). Observed historical rates of replacement are consistent with model predictions from hybrid-alleviated Allee effects in outcrossers, although other causes cannot be ruled out.
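For intuition, here is a standard textbook form of a strong Allee effect in a scalar growth model; the paper's simulation model is individual-based and far richer than this sketch.

```latex
% A standard strong-Allee-effect growth model (textbook form; the
% paper's simulation model is individual-based and far richer):
\[
  \frac{dN}{dt} = r\,N\left(1 - \frac{N}{K}\right)\left(\frac{N}{A} - 1\right),
  \qquad 0 < A < K,
\]
% growth is negative for $N < A$, so a founding population below the
% Allee threshold $A$ fails to establish unless per-capita fitness is
% boosted, e.g. by transient hybridization with a resident species.
```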
LexValueSets: An Approach for Context-Driven Value Sets Extraction
Pathak, Jyotishman; Jiang, Guoqian; Dwarkanath, Sridhar O.; Buntrock, James D.; Chute, Christopher G.
2008-01-01
The ability to model, share and re-use value sets across multiple medical information systems is an important requirement. However, generating value sets semi-automatically from a terminology service is still an unresolved issue, in part due to the lack of linkage to the clinical context patterns that constrain the definition of a concept domain and the invocation of value set extraction. Towards this goal, we develop and evaluate an approach for context-driven automatic value set extraction based on a formal terminology model. The crux of the technique is to identify and define context patterns from various domains of discourse and leverage them for value set extraction using two complementary ideas based on (i) local terms provided by Subject Matter Experts (extensional) and (ii) the semantic definition of the concepts in coding schemes (intensional). A prototype was implemented based on SNOMED CT rendered in the LexGrid terminology model, and a preliminary evaluation is presented. PMID:18998955
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWare™) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline networks facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
Language and human nature: Kurt Goldstein's neurolinguistic foundation of a holistic philosophy.
Ludwig, David
2012-01-01
Holism in interwar Germany provides an excellent example of social and political influences on scientific developments. Deeply impressed by the ubiquitous invocation of a cultural crisis, biologists, physicians, and psychologists presented holistic accounts as an alternative to the "mechanistic worldview" of the nineteenth century. Although the ideological background of these accounts is often blatantly obvious, many holistic scientists did not content themselves with a general opposition to a mechanistic worldview but aimed at a rational foundation of their holistic projects. This article will discuss the work of Kurt Goldstein, who is known for both his groundbreaking contributions to neuropsychology and his holistic philosophy of human nature. By focusing on Goldstein's neurolinguistic research, I want to reconstruct the empirical foundations of his holistic program without ignoring its cultural background. In this sense, Goldstein's work provides a case study for the formation of a scientific theory through the complex interplay between specific empirical evidence and the general cultural developments of the Weimar Republic. © 2012 Wiley Periodicals, Inc.
Secure Service Invocation in a Peer-to-Peer Environment Using JXTA-SOAP
NASA Astrophysics Data System (ADS)
Laghi, Maria Chiara; Amoretti, Michele; Conte, Gianni
The effective convergence of service-oriented architectures (SOA) and peer-to-peer (P2P) is an urgent task, with many important applications ranging from e-business to ambient intelligence. A considerable standardization effort is being carried out by both the SOA and P2P communities, but a complete platform for the development of secure, distributed applications is still missing. In this context, the result of our research and development activity is JXTA-SOAP, an official extension for JXTA enabling Web Service sharing in peer-to-peer networks. Recently we focused on security aspects, providing JXTA-SOAP with a general security management system and specialized policies that target both the J2SE and J2ME versions of the component. Among others, we implemented a policy based on Multimedia Internet KEYing (MIKEY), which can be used to create a key pair and all the required parameters for encryption and decryption of service messages in consumer and provider peers running on resource-constrained devices.
Graph-Based Semantic Web Service Composition for Healthcare Data Integration.
Arch-Int, Ngamnij; Arch-Int, Somjit; Sonsilphong, Suphachoke; Wanchai, Paweena
2017-01-01
Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement.
PMID:29065602
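A toy sketch of the underlying idea: forward chaining over a service dependency graph until the goal concepts are covered. The service names and concepts are hypothetical, and the paper's system additionally applies semantic matchmaking rules when building the graph.

```python
from collections import deque

# Toy dependency-graph composition: each service maps a set of input
# concepts to a set of output concepts. All names are hypothetical.
SERVICES = {
    "getPatientID":  ({"citizen_id"}, {"patient_id"}),
    "getLabResults": ({"patient_id"}, {"lab_results"}),
    "getDiagnosis":  ({"patient_id", "lab_results"}, {"diagnosis"}),
}

def compose(available, goal):
    """Breadth-first forward chaining over the service dependency graph."""
    frontier = deque([(frozenset(available), [])])
    seen = {frozenset(available)}
    while frontier:
        known, plan = frontier.popleft()
        if goal <= known:
            return plan                     # non-redundant invocation order
        for name, (inputs, outputs) in SERVICES.items():
            if inputs <= known and not outputs <= known:
                nxt = frozenset(known | outputs)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, plan + [name]))
    return None

print(compose({"citizen_id"}, {"diagnosis"}))
# ['getPatientID', 'getLabResults', 'getDiagnosis']
```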
NASA Technical Reports Server (NTRS)
Zank, G. P.; Khabibrakhmanov, I. KH.; Story, T.
1993-01-01
A new two-fluid model which describes mass loading in the solar wind (e.g., the interaction of the solar wind with a cometary coma or the local interstellar medium) is presented. The self-consistent back-reaction of the mass-loaded ions is included through their effective scattering in low-frequency MHD turbulence and the invocation of a diffusive approximation. Such an approximation has the advantage of introducing self-consistent dissipation coefficients into the governing equations, thereby facilitating the investigation of the internal structure of shocks in mass-loading environments. To illustrate the utility of the new model, we consider the structure of cometary shocks in the hypersonic one-dimensional limit, finding that the incoming solar wind is slowed by both mass loading and the development of a large cometary ion pressure gradient. The shock is broadened and smoothed by the cometary ions with a thickness of the order of the cometary ion diffusion scale.
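Schematically, and in a single-fluid hypersonic limit rather than the paper's two-fluid formulation, mass loading can be pictured as follows:

```latex
% Schematic steady, one-dimensional mass-loaded flow (single-fluid,
% hypersonic limit; the paper's two-fluid model adds a cometary-ion
% pressure equation with diffusive transport):
\begin{align*}
  \frac{d}{dx}\left(\rho u\right) &= Q
      && \text{(ion mass source from ionized neutrals)} \\
  \frac{d}{dx}\left(\rho u^{2} + p\right) &\simeq 0
      && \text{(new ions injected nearly at rest)}
\end{align*}
% Adding mass at roughly constant momentum flux forces $u$ to decrease,
% which is why the incoming solar wind is slowed ahead of the shock.
```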
Cross-Layer Adaptive Feedback Scheduling of Wireless Control Systems
Xia, Feng; Ma, Longhua; Peng, Chen; Sun, Youxian; Dong, Jinxiang
2008-01-01
There is a trend towards using wireless technologies in networked control systems. However, the adverse properties of the radio channels make it difficult to design and implement control systems in wireless environments. To attack the uncertainty in available communication resources in wireless control systems closed over WLAN, a cross-layer adaptive feedback scheduling (CLAFS) scheme is developed, which takes advantage of the co-design of control and wireless communications. By exploiting cross-layer design, CLAFS adjusts the sampling periods of control systems at the application layer based on information about deadline miss ratio and transmission rate from the physical layer. Within the framework of feedback scheduling, the control performance is maximized through controlling the deadline miss ratio. Key design parameters of the feedback scheduler are adapted to dynamic changes in the channel condition. An event-driven invocation mechanism for the feedback scheduler is also developed. Simulation results show that the proposed approach is efficient in dealing with channel capacity variations and noise interference, thus providing an enabling technology for control over WLAN. PMID:27879934
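A minimal sketch of the feedback-scheduling idea: adjust the control-loop sampling period from the deadline miss ratio reported by the physical layer. The gain, set point, and bounds are invented for illustration and are not the CLAFS design values.

```python
# Toy feedback scheduler in the spirit of CLAFS: lengthen sampling
# periods when the deadline miss ratio exceeds a set point, shorten
# them when bandwidth is plentiful. All parameter values are hypothetical.
def adjust_period(period_s, miss_ratio, setpoint=0.05, gain=0.5,
                  p_min=0.01, p_max=0.5):
    """One invocation of the feedback scheduler (event-driven in CLAFS)."""
    error = miss_ratio - setpoint          # > 0 means the network is overloaded
    period_s *= (1.0 + gain * error)       # multiplicative adjustment
    return min(max(period_s, p_min), p_max)

period = 0.05                              # 50 ms initial sampling period
for miss in [0.20, 0.12, 0.04, 0.01]:      # miss ratios reported per window
    period = adjust_period(period, miss)
    print(f"miss={miss:.2f} -> period={period * 1000:.1f} ms")
```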
Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System
NASA Technical Reports Server (NTRS)
Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan
2009-01-01
A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, and elaborates on the solutions that were implemented. The first is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money. Changes such as reordering the sequence of programs being called, or changing a parameter value in a program that is being automated, should not result in code changes or redelivery.
The precautionary principle and pharmaceutical risk management.
Callréus, Torbjörn
2005-01-01
Although it is often vigorously contested and has several different formulations, the precautionary principle has in recent decades guided environmental policy making in the face of scientific uncertainty. Originating from a criticism of traditional risk assessment, the key element of the precautionary principle is the justification for acting in the face of uncertain knowledge about risks. In the light of its growing invocation in various areas that are related to public health and recently in relation to drug safety issues, this article presents an introductory review of the main elements of the precautionary principle and some arguments conveyed by its advocates and opponents. A comparison of the characteristics of pharmaceutical risk management and environmental policy making (i.e. the setting within which the precautionary principle evolved), indicates that several important differences exist. If believed to be of relevance, in order to avoid arbitrary and unpredictable decision making, both the interpretation and possible application of the precautionary principle need to be adapted to the conditions of pharmaceutical risk management.
NASA Astrophysics Data System (ADS)
Baudel, S.; Blanc, F.; Jolibois, T.; Rosmorduc, V.
2004-12-01
The Products and Services (P&S) department of the Space Oceanography Division at CLS is in charge of distributing and promoting altimetry and operational oceanography data. P&S is thus involved in the Aviso satellite altimetry project, in the Mercator ocean operational forecasting system, and in the European Godae/Mersea ocean portal. Aiming at a standardised, common vision and management of all these ocean data, these projects led to the implementation of several OPeNDAP/LAS Internet servers. OPeNDAP allows the user to extract, via client software (like IDL, Matlab or Ferret), the data of interest and only that data, avoiding the download of full information files. OPeNDAP supports extraction by geographic area, time period and oceanic variable, with a choice of output format. LAS is an OPeNDAP data access web server whose special feature is the ability to unify, in a single view, access to multiple types of data from distributed data sources. The LAS can make requests to different remote OPeNDAP servers, which enables comparisons or statistics across several different data types. Aviso is the CNES/CLS service which has distributed altimetry products since 1993. The Aviso LAS distributes several Ssalto/Duacs altimetry products such as delayed-time and near-real-time mean sea level anomaly, absolute dynamic topography, absolute geostrophic velocities, gridded significant wave height and gridded wind speed modulus. Mercator-Ocean is a French operational oceanography centre which distributes its products by several means, among them LAS/OPeNDAP servers, as part of the Mercator Mersea Strand-1 contribution. A 3D ocean description (temperature, salinity, current and other oceanic variables) of the North Atlantic and Mediterranean is available in real time and updated weekly. The LAS's ability to make requests to several remote data centres with the same OPeNDAP configuration was particularly well fitted to the Mersea Strand-1 problem. This European project (June 2003 to June 2004), sponsored by the European Commission, was the first integrated operational oceanography project of its kind. The objective was the assessment of several existing operational in situ and satellite monitoring and numerical forecasting systems for the future elaboration (Mersea Integrated Project, 2004-2008) of an integrated system able to deliver, operationally, information products (physical, chemical, biological) to end-users in several domains related to environment, security and safety. Five forecasting ocean models with data assimilation, fed by operational in situ and satellite data centres, were intercompared. The main difficulty of this LAS implementation lay in the definition of the ocean model metrics and the adoption of a common file format, which forced the model teams to produce the same datasets in the same formats (NetCDF, COARDS/CF convention). Note that this was a pioneering approach and that it has been adopted by Godae standards (see F. Blanc's paper in this session). Building on this implementation of web technologies and turning to more user-oriented issues, our perspectives include the implementation of a Map Server, an open-source GIS server which will communicate with the OPeNDAP server. The Map Server will be able to manipulate raster and vector multidisciplinary remote data simultaneously. The aim is to construct a complete web-based oceanic data distribution service, and the projects in which we are involved allow us to progress towards that goal.
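As a usage illustration (not part of the original abstract), the netCDF4 Python library can open an OPeNDAP endpoint directly, so only the requested hyperslab crosses the network; the URL and variable name below are placeholders, not live Aviso paths.

```python
from netCDF4 import Dataset

# netCDF4 (built with DAP support) can open an OPeNDAP endpoint directly;
# the server then returns only the slices requested below, not the file.
url = "https://example.org/opendap/duacs/msla_merged.nc"  # placeholder URL
ds = Dataset(url)

sla = ds.variables["sla"]              # sea level anomaly, dims (time, lat, lon)
subset = sla[0:10, 100:140, 200:260]   # only this hyperslab is transferred
print(subset.shape, getattr(sla, "units", "unknown units"))
ds.close()
```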
SAR Altimetry Processing on Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD
NASA Astrophysics Data System (ADS)
Dinardo, Salvatore; Lucas, Bruno; Benveniste, Jerome
2015-12-01
The scope of this work is to feature the new ESA service (SARvatore) for the exploitation of CryoSat-2 data, designed and developed entirely by the Altimetry Team at ESA-ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The G-POD service SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation) for CryoSat-2 is a web platform that provides the capability to process CryoSat-2 SAR/SARIN data on-line and on-demand, from L1a (FBR) data products up to SAR/SARIN Level-2 geophysical data products. The processor makes use of the G-POD (Grid-Processing On Demand) distributed computing platform to deliver the output data products in a timely manner. These output data products are generated in standard NetCDF format (using the CF convention) and are compatible with BRAT (Basic Radar Altimetry Toolbox) and other NetCDF tools. Using the G-POD graphic interface, it is easy to select the geographical area of interest along with the time frame of interest, based on the CryoSat-2 SAR/SARIN FBR data products available in the service's catalogue. After task submission, users can follow the status of the processing task in real time. The processor prototype is versatile in the sense that users can customize and adapt the processing according to their specific requirements by setting a list of configurable options. The processing service is meant to be used for research & development experiments, supporting development contracts by comparing their deliverables to ESA, on-site demonstrations and training in courses and workshops, cross-comparison against third-party products (CLS/CNES CPP products, for instance), preparation for the Sentinel-3 Topographic mission, producing data and graphics for publications, etc. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to augment this processing capacity over coastal zones, inland water and land, in view of maximizing the exploitation of the upcoming Sentinel-3 Topographic mission over all surfaces. The service is open and free of charge.
High performance geospatial and climate data visualization using GeoJS
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Beezley, J. D.
2015-12-01
GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs, with sophisticated Scalable Vector Graphics (SVG) features, in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited in regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations:
• To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
• To develop an extensible library that can combine data from multiple sources and render them using multiple backends
• To build a library that works well with existing scientific visualization tools such as VTK
We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.
SAR Altimetry Processing on Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD
NASA Astrophysics Data System (ADS)
Benveniste, J.; Dinardo, S.; Lucas, B.
2014-12-01
The scope of this work is to show the new ESA service (SARvatore) for the exploitation of CryoSat-2 data and upcoming Sentinel-3 data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER. The G-POD (Grid-Processing On Demand) service SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation) for CryoSat-2 is a web platform that provides the capability to process CryoSat-2 SAR data on-line and on demand, starting from L1a (FBR) data up to SAR Level-2 geophysical data products. The service is based on the SARvatore processor prototype. The output data products are generated in standard NetCDF format (using the CF convention) and are compatible with BRAT (Basic Radar Altimetry Toolbox) and its successor, the upcoming Sentinel-3 Altimetry Toolbox, as well as other NetCDF tools. Using the G-POD graphic interface, it is possible to easily select the geographical area of interest along with the time of interest. As of August 2014, the service allows the user to select data for most of 2013 and part of 2014, with no geographical restriction on the data. It is expected that before Fall 2014 the whole mission data set (when available) will be at the disposal of the users. The processor prototype is versatile in the sense that users can customize and adapt the processing according to their specific requirements by setting a list of configurable options. The processing service is meant to be used for research & development purposes, supporting development contracts, on-site demonstrations and training for selected users, cross-comparison against third-party products, preparation for the Sentinel-3 mission, publications, etc. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to augment this processing capacity over coastal zones, inland waters and land, in view of maximizing the exploitation of the upcoming Sentinel-3 Topographic mission over all surfaces.
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hamre, Torill; Lygre, Kjetil
2014-05-01
The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for the Environment and Security (GMES) programme. A key feature of the system is its ability to compare data from different datasets, including an option to upload one's own netCDF files. The user can, for example, search an in situ database for different variables (like temperature, salinity, different elements, light, specific plankton types or rate measurements) with different criteria (bounding box, date/time, depth, Longhurst region, cruise/transect) and compare the data with model data. The user can choose model data or Earth observation data from a list, or upload his/her own netCDF files to use in the comparison. The data can be visualized on a map, as graphs and plots (e.g. time series and property-property plots), or downloaded in various formats. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators for a range of stakeholders, such as scientists, policy makers and environmental managers. We have implemented a web-based GIS (Geographical Information System) and want to demonstrate its use. The tool is designed for a wide range of users: novice users, who want a simple way to get basic information about the current state of the marine planktonic ecosystem by utilizing predefined queries and comparisons with models; intermediate-level users, who want to explore the database on their own and customize the predefined setups; and advanced users, who want to perform complex queries and inventory searches and compare the data in their own way or with their own models.
NASA Astrophysics Data System (ADS)
Forte, M.; Hesser, T.; Knee, K.; Ingram, I.; Hathaway, K. K.; Brodie, K. L.; Spore, N.; Bird, A.; Fratantonio, R.; Dopsovic, R.; Keith, A.; Gadomski, K.
2016-02-01
The U.S. Army Engineer Research and Development Center's (USACE ERDC) Coastal and Hydraulics Laboratory (CHL) Coastal Observations and Analysis Branch (COAB) Measurements Program has a 35-year record of coastal observations. These datasets include oceanographic point-source measurements, Real-Time Kinematic (RTK) GPS bathymetry surveys, and remote sensing data from both the Field Research Facility (FRF) in Duck, NC and other project and experiment sites around the nation. The data have been used to support a variety of USACE mission areas, including coastal wave model development, beach and bar response, coastal project design, coastal storm surge, and other coastal hazard investigations. Furthermore, these data have been widely used by a number of federal and state agencies, academic institutions, and private industries in hundreds of scientific and engineering investigations, publications, conference presentations and model advancement studies. A limiting factor in the use of FRF data has been the lack of rapid, reliable access and of publicly available metadata for each data type. The addition of web tools, accessible data files, and well-documented metadata will open the door to much future collaboration. With the help of industry partner RPS ASA and the U.S. Army Corps of Engineers Mobile District Spatial Data Branch, a Data Integration Framework (DIF) was developed. The DIF represents a combination of processes, standards, people, and tools used to transform disconnected enterprise data into useful, easily accessible information for analysis and reporting. A front-end data portal connects the user to the framework, which integrates both oceanographic observation and geomorphology measurements using a combination of ESRI and open-source technology while providing a seamless data discovery, access, and analysis experience to the user. The user interface was built with ESRI's JavaScript API and all project metadata are managed using Geoportal. The geomorphology data are made available through ArcGIS Server, while the oceanographic data sets have been formatted to netCDF4 and made available through a THREDDS server. Additional web tools run alongside the THREDDS server to provide rapid statistical calculations and plotting, allowing for user-defined data access and visualization.
NASA Astrophysics Data System (ADS)
Hausman, J.; Sanchez, A.; Armstrong, E. M.
2014-12-01
Seasat-A was NASA's first ocean-observing satellite mission. It launched in June 1978 and operated continuously until it suffered a power failure 106 days later. It carried an altimeter (ALT), a scatterometer (SASS), a SAR, a microwave radiometer (SMMR), and a visible/infrared radiometer (VIRR). These instruments allowed Seasat to measure sea surface height, ocean winds, and both brightness and sea surface temperatures. The data, except for the SAR, are archived at PO.DAAC. Since these are the only oceanographic satellite data available for this early period of remote sensing, their importance for climate studies has grown. Although the datasets were digitized from the original tapes, the Seasat data have since been maintained in the same flat binary format technology of 1980, when the data were first distributed. In 2013, PO.DAAC began a project to reformat the original data into a user-friendly, modern and maintainable format consistent with the netCDF data model and the Climate and Forecast (CF) and Attribute Conventions Dataset Discovery (ACDD) metadata standards. Significant benefits of this data format include improved interoperability with tools and web services such as OPeNDAP, THREDDS, and various subsetting software, such as PO.DAAC's HiTIDE. Additionally, application of such metadata standards provides an opportunity to correctly document the data at the granule level. The first step in the conversion process involved going through the original documentation to understand the source binary data format. Documentation was found for processing levels 1 and 2 for ALT, SASS and SMMR. Software readers were then written for each of the datasets using Matlab, followed by regression tests performed on the newly output data in order to demonstrate that the readers were correctly interpreting the source data. Next, writers were created to convert the data into the updated format. The reformatted data were also regression-tested and science-validated to ensure that the data were not corrupted during the reformatting process. The resulting modernized Seasat datasets will be made available iteratively, by instrument and processing level, on PO.DAAC's web portal http://podaac.jpl.nasa.gov, the anonymous ftp site ftp://podaac.jpl.nasa.gov/allData/seasat, and other web services.
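A minimal sketch of the read-then-rewrite pattern, in Python rather than the Matlab readers the project actually wrote; the binary record layout below is invented for illustration, since the real ALT/SASS/SMMR layouts come from the original documentation.

```python
import numpy as np
from netCDF4 import Dataset

# Hypothetical big-endian flat-binary record layout (illustrative only).
record = np.dtype([("time", ">f8"), ("lat", ">f4"),
                   ("lon", ">f4"), ("ssh", ">f4")])
data = np.fromfile("seasat_alt_l2.bin", dtype=record)  # hypothetical file

with Dataset("seasat_alt_l2.nc", "w") as nc:
    nc.Conventions = "CF-1.6, ACDD-1.3"
    nc.createDimension("time", data.size)
    for name, units in [("time", "seconds since 1978-01-01"),
                        ("lat", "degrees_north"),
                        ("lon", "degrees_east"),
                        ("ssh", "m")]:
        v = nc.createVariable(name, "f8" if name == "time" else "f4", ("time",))
        v.units = units
        v[:] = data[name]   # copy each field of the structured array
```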
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high-performance environment close to the massive data stores at NASA. The data is accessed in standard formats (NetCDF, HDF, etc.) in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules, allowing maximal flexibility in deployment choices. The current set of module managers includes:
Staging Manager: runs the computation locally on the WPS server or remotely using tools such as Celery or SLURM.
Compute Engine Manager: runs the computation serially or distributed over nodes using a parallelization framework such as Celery or Spark.
Decomposition Manager: manages strategies for distributing the data over nodes.
Data Manager: handles the import of domain data from long-term storage and manages the in-memory and disk-based caching architectures.
Kernel Manager: a kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in Python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions.
CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using direct web service calls, a Python script or application, or a JavaScript-based web application. Client packages in Python or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
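A hedged sketch of calling a WPS endpoint from Python using the generic OGC WPS 1.0 key-value encoding; the endpoint URL, operation identifier, and datainputs string are illustrative, not the actual CDAS deployment's names.

```python
import requests

# Generic OGC WPS 1.0 Execute request in key-value-pair form; the
# endpoint, identifier, and datainputs below are placeholders.
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "cdas.average",                        # hypothetical kernel
    "datainputs": "variable=tas;domain=[-90,90,0,360];axes=t",
}
resp = requests.get("https://example.nasa.gov/wps", params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:400])   # WPS returns an XML ExecuteResponse document
```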
Web-based CERES Clouds QC Property Viewing Tool
NASA Astrophysics Data System (ADS)
Smith, R. A.
2015-12-01
Churngwei Chu (1), Rita Smith (1), Sunny Sun-Mack (1), Yan Chen (1), Elizabeth Heckert (1), Patrick Minnis (2); (1) Science Systems and Applications, Inc., Hampton, Virginia; (2) NASA Langley Research Center, Hampton, Virginia. This presentation will display the capabilities of a web-based CERES cloud property viewer. Aqua/Terra/NPP data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time-series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Images can be manipulated by the user to narrow the boundaries of the map as well as to adjust color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool.
Atmospheric data access for the geospatial user community
NASA Astrophysics Data System (ADS)
van de Vegte, John; Som de Cerff, Wim-Jan; van den Oord, Gijsbertus H. J.; Sluiter, Raymond; van der Neut, Ian A.; Plieger, Maarten; van Hees, Richard M.; de Jeu, Richard A. M.; Schaepman, Michael E.; Hoogerwerf, Marc R.; Groot, Nikée E.; Domenico, Ben; Nativi, Stefano; Wilhelmi, Olga V.
2007-10-01
Historically, the atmospheric and meteorological communities have been separate worlds with their own data formats and tools for data handling, making sharing of data difficult and cumbersome. On the other hand, these information sources are becoming increasingly of interest outside these communities because of the continuously improving spatial and temporal resolution of, e.g., model and satellite data, and the interest in historical datasets. New user communities that use geographically based datasets in a cross-domain manner are emerging. This development is supported by the progress made in Geographical Information System (GIS) software. Current GIS software is not yet ready for the wealth of atmospheric data, although the faint outlines of a new generation of software are already visible: support for HDF and NetCDF and an increasing understanding of temporal issues are only a few of the hints.
NASA Astrophysics Data System (ADS)
Palanisamy, Giriprakash; Wilson, Bruce E.; Cook, Robert B.; Lenhardt, Chris W.; Santhana Vannan, Suresh; Pan, Jerry; McMurry, Ben F.; Devarakonda, Ranjeet
2010-12-01
The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) is one of the science-oriented data centers in EOSDIS, aligned primarily with terrestrial ecology. The ORNL DAAC archives and serves data from NASA-funded field campaigns (such as BOREAS, FIFE, and LBA), regional and global data sets relevant to biogeochemical cycles, land validation studies for remote sensing, and source code for some terrestrial ecology models. Users of the ORNL DAAC include field ecologists, remote sensing scientists, modelers at various scales, synthesis science groups, a range of educational users (particularly baccalaureate and graduate instruction), and decision support analysts. It is clear that the wide range of users served by the ORNL DAAC have differing needs and differing capabilities for accessing and using data. It is also not possible for the ORNL DAAC, or the other data centers in EOSDIS, to directly develop all of the tools and interfaces needed to support even most of the potential uses of data. As is typical of information technology supporting a research enterprise, user needs will continue to evolve rapidly over time, and users themselves cannot predict future needs, as those needs depend on the results of current investigation. The ORNL DAAC is addressing these needs by targeted implementation of web services and tools which can be consumed by other applications, so that a modeler can retrieve data in netCDF format with the Climate and Forecast (CF) convention and a field ecologist can retrieve subsets of that same data in a comma-separated-value format suitable for use in Excel or R. Tools such as our MODIS subsetting capability, the Spatial Data Access Tool (SDAT; based on OGC web services), and OPeNDAP-compliant servers such as THREDDS particularly enable such diverse means of access. We also seek interoperability of metadata, recognizing that terrestrial ecology is a field with a very large number of relevant data repositories. ORNL DAAC metadata is published to several metadata repositories using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), to increase the chances that users can find data holdings relevant to their particular scientific problem. ORNL also seeks to leverage technology across these various data projects and to encourage standardization of processes and technical architecture. This standardization is behind current efforts involving the use of Drupal and Fedora Commons. This poster describes the current and planned approaches that the ORNL DAAC is taking to enable cost-effective interoperability among data centers, both across the NASA EOSDIS data centers and across the international spectrum of terrestrial-ecology-related data centers. The poster will highlight the standards that we are currently using across data formats, metadata formats, and data protocols.
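A minimal OAI-PMH harvesting call in Python; the verb and argument names are fixed by the protocol itself, but the endpoint URL below is a placeholder.

```python
import requests
import xml.etree.ElementTree as ET

# Minimal OAI-PMH ListRecords harvest; the endpoint URL is a placeholder.
OAI = "{http://www.openarchives.org/OAI/2.0/}"
resp = requests.get("https://example.org/oai/provider",
                    params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
                    timeout=60)
resp.raise_for_status()
root = ET.fromstring(resp.content)
for rec in root.iter(f"{OAI}record"):
    ident = rec.find(f"{OAI}header/{OAI}identifier")
    if ident is not None:
        print(ident.text)   # one identifier per harvested metadata record
```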
Satellite Level 3 & 4 Data Subsetting at NASA GES DISC
NASA Technical Reports Server (NTRS)
Huwe, Paul; Su, Jian; Loeser, Carlee; Ostrenga, Dana; Rui, Hualan; Vollmer, Bruce
2017-01-01
Earth Science data are available in many file formats (NetCDF, HDF, GRB, etc.) and in a wide range of sizes, from kilobytes to gigabytes. These properties have become a challenge to users if they are not familiar with these formats or only want a small region of interest (ROI) from a specific dataset. At NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), we have developed and implemented a multipurpose subset service to ease user access to Earth Science data. Our Level 3 & 4 Regridder is capable of subsetting across multiple parameters (spatially, temporally, by level, and by variable) as well as having additional beneficial features (temporal means, regridding to target grids, and file conversion to other data formats). In this presentation, we will demonstrate how users can use this service to better access only the data they need in the form they require.
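A local analogue of what the Level 3 & 4 Regridder does, sketched with xarray under invented file and variable names:

```python
import xarray as xr

# Sketch of service-style subsetting done locally with xarray;
# the file and variable names are hypothetical.
ds = xr.open_dataset("GLDAS_monthly.nc")

subset = (
    ds["SoilMoi0_10cm"]                              # variable subsetting
    .sel(lat=slice(25, 50), lon=slice(-125, -65))    # spatial ROI
    .sel(time=slice("2015-01-01", "2015-12-31"))     # temporal range
)
annual_mean = subset.mean(dim="time")    # temporal mean, as the service offers
annual_mean.to_netcdf("soilmoi_conus_2015_mean.nc")  # file conversion/output
```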
Recommendations resulting from the SPDS Community-Wide Workshop
NASA Technical Reports Server (NTRS)
1993-01-01
The Data Systems Panel identified three critical functionalities of a Space Physics Data System (SPDS): the delivery of self-documenting data, the existence of a matrix of translators between various standard formats (IDFS, CDF, netCDF, HDF, TENNIS, UCLA flat file, and FITS), and a network-based capability for browsing and examining inventory records for the system's data holdings. The recommendations resulting from the workshop include the philosophy, funding, and objectives of a SPDS. Access to quality data is seen as the most important objective by the Policy Panel, with curation and information about the data being integral parts of any accessible data set. The Data Issues Panel concluded that the SPDS can supply encouragement, guidelines, and ultimately provide a mechanism for financial support for data archiving, restoration, and curation. The Software Panel of the SPDS focused on defining the requirements and priorities for SPDS to support common data analysis and data visualization tools and packages.
MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.
Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten
2006-12-01
MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows the parallel and fast calibration for several metabolites simultaneously. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for the installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
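GC/MS runs are commonly exchanged as ANDI/AIA netCDF files; the sketch below reads one with the netCDF4 Python library. The variable names follow the common ANDI-MS convention, which is presumably what MetaQuant imports, but they should be checked against a given instrument's export.

```python
import numpy as np
from netCDF4 import Dataset

# Read an ANDI/AIA-style GC/MS netCDF file; variable names follow the
# common ANDI-MS convention and the file name is hypothetical.
with Dataset("run01.cdf") as nc:
    scan_times = nc.variables["scan_acquisition_time"][:]  # seconds
    tic = nc.variables["total_intensity"][:]               # total ion current
    masses = nc.variables["mass_values"][:]                # all scans, flattened
    intensities = nc.variables["intensity_values"][:]

peak_scan = int(np.argmax(tic))
print(f"TIC peak at {scan_times[peak_scan]:.1f} s "
      f"({masses.size} mass/intensity pairs in file)")
```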
Task 28: Web Accessible APIs in the Cloud Trade Study
NASA Technical Reports Server (NTRS)
Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun
2017-01-01
This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being the AWS Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required to inform a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.
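One access pattern from this problem space, sketched here under stated assumptions (the bucket, key, and dataset names are invented, and this is not one of the study's benchmarked architectures verbatim), is reading an HDF5 granule directly out of S3 with byte-range requests:

    import h5py
    import s3fs

    # Open the object lazily; only the byte ranges h5py asks for are fetched.
    fs = s3fs.S3FileSystem(anon=True)
    with fs.open("s3://example-bucket/granule.h5", "rb") as f:
        with h5py.File(f, "r") as h5:
            temps = h5["/science/temperature"][0:10, :]  # partial read only
    print(temps.shape)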
Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud
NASA Astrophysics Data System (ADS)
Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.
2016-12-01
We present an account of our experience building an ecosystem for the analysis of big atmospheric data-sets. Using modern technologies, we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems, such as Hadoop MapReduce, Spark and Dask, in order to find the one best suited to the analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable, and which can scale to accommodate changes in demand. We make this platform readily accessible using browser based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which can analyse very large amounts of data using cutting-edge big-data technology.
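A minimal notebook-style fragment of the kind of analysis described (illustrative only; the file pattern and variable name are assumptions):

    import xarray as xr

    # Lazily open many NetCDF files as one Dask-backed dataset.
    ds = xr.open_mfdataset("data/*.nc", chunks={"time": 100})

    # Build the computation graph, then execute it in parallel.
    global_mean = ds["air_temperature"].mean(dim=("lat", "lon"))
    result = global_mean.compute()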
Coordinating Resource Usage through Adaptive Service Provisioning in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Fok, Chien-Liang; Roman, Gruia-Catalin; Lu, Chenyang
Wireless sensor networks (WSNs) exhibit high levels of network dynamics and consist of devices with limited energy. This results in the need to coordinate applications not only at the functional level, as is traditionally done, but also in terms of resource utilization. In this paper, we present a middleware that does this using adaptive service provisioning. Novel service binding strategies automatically adapt application behavior when opportunities for energy savings surface, and switch providers when the network topology changes. The former is accomplished by providing limited information about the energy consumption associated with using various services, systematically exploiting opportunities for sharing service invocations, and exploiting the broadcast nature of wireless communication in WSNs. The middleware has been implemented and evaluated on two disparate WSN platforms, the TelosB and Imote2. Empirical results show that adaptive service provisioning can enable energy-aware service binding decisions that result in increased energy efficiency and significantly increase service availability, while imposing minimal additional burden on the application, service, and device developers. Two applications, medical patient monitoring and structural health monitoring, demonstrate the middleware's efficacy.
Compositional mining of multiple object API protocols through state abstraction.
Dai, Ziying; Mao, Xiaoguang; Lei, Yan; Qi, Yuhua; Wang, Rui; Gu, Bin
2013-01-01
API protocols specify correct sequences of method invocations. Despite their usefulness, API protocols are often unavailable in practice because writing them is cumbersome and error prone. Multiple object API protocols are more expressive than single object API protocols. However, the huge number of objects of typical object-oriented programs poses a major challenge to the automatic mining of multiple object API protocols: besides maintaining scalability, it is important to capture various object interactions. Current approaches utilize various heuristics to focus on small sets of methods. In this paper, we present a general, scalable, multiple object API protocols mining approach that can capture all object interactions. Our approach uses abstract field values to label object states during the mining process. We first mine single object typestates as finite state automata whose transitions are annotated with states of interacting objects before and after the execution of the corresponding method and then construct multiple object API protocols by composing these annotated single object typestates. We implement our approach for Java and evaluate it through a series of experiments.
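To make the notion of an API protocol concrete, here is a toy finite state automaton over method invocations; it only illustrates what a mined protocol encodes, not the paper's mining or state-abstraction algorithms.

    class ProtocolFSA:
        def __init__(self, transitions, start, accepting):
            self.transitions = transitions      # (state, method) -> state
            self.start = start
            self.accepting = accepting

        def accepts(self, trace):
            state = self.start
            for method in trace:
                if (state, method) not in self.transitions:
                    return False                # illegal invocation sequence
                state = self.transitions[(state, method)]
            return state in self.accepting

    # Example protocol: open() must precede read(); close() ends the session.
    fsa = ProtocolFSA({("init", "open"): "opened",
                       ("opened", "read"): "opened",
                       ("opened", "close"): "closed"},
                      start="init", accepting={"closed"})
    print(fsa.accepts(["open", "read", "close"]))   # True
    print(fsa.accepts(["read", "close"]))           # False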
Gender scripts and unwanted pregnancy among urban Kenyan women.
Izugbara, Chimaraoke O; Ochako, Rhoune; Izugbara, Chibuogwu
2011-10-01
Women's lived experiences and lay accounts of unwanted pregnancy remain poorly interrogated. We investigated portrayals of unwanted pregnancy using narrative data gathered from 80 women in Nairobi, Kenya. Unwanted pregnancy had a diversity of significance for the women. Pregnancies were not simply unwanted because they occurred when women became pregnant without wanting to. Rather, pregnancies were considered unwanted largely because they had occurred in contexts that did not reinforce socially-sanctioned notions of motherhood and 'proper' procreation and/or revealed women's use of their sexuality in ways deemed culturally-inappropriate. Kenyan women's invocation of femininity scripts to explain unwanted pregnancy; the centrality of gender in everyday life in contemporary Kenya; women's and girls' poor access to effective family planning services; growing female poverty; and Kenya's restrictive abortion policy imply that unwanted pregnancy and its consequences will persist in the country. Addressing unwanted pregnancy and its consequences requires making accessible quality contraceptive and abortion services as well as sexuality information. It also calls for providers who understand the socio-cultural norms that circumscribe fertility and reproductive behaviours.
Invocations and intoxication: does prayer decrease alcohol consumption?
Lambert, Nathaniel M; Fincham, Frank D; Marks, Loren D; Stillman, Tyler F
2010-06-01
Four methodologically diverse studies (N = 1,758) show that prayer frequency and alcohol consumption are negatively related. In Study 1 (n = 824), we used a cross-sectional design and found that higher prayer frequency was related to lower alcohol consumption and problematic drinking behavior. Study 2 (n = 702) used a longitudinal design and found that more frequent prayer at Time 1 predicted less alcohol consumption and problematic drinking behavior at Time 2, and this relationship held when controlling for baseline levels of drinking and prayer. In Study 3 (n = 117), we used an experimental design to test for a causal relationship between prayer frequency and alcohol consumption. Participants assigned to pray every day (either an undirected prayer or a prayer for a relationship partner) for 4 weeks drank about half as much alcohol at the conclusion of the study as control participants. Study 4 (n = 115) replicated the findings of Study 3, as prayer again reduced drinking by about half. These findings are discussed in terms of prayer as reducing drinking motives. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Service-based analysis of biological pathways
Zheng, George; Bouguettaya, Athman
2009-01-01
Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, also described in this paper. PMID:19796403
NASA Astrophysics Data System (ADS)
Rivers, M. L.; Gualda, G. A.
2009-05-01
One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information from tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine (TM), available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from http://cars.uchicago.edu/software/idl/tomography.html and http://sites.google.com/site/voltools/
Web processing service for landslide hazard assessment
NASA Astrophysics Data System (ADS)
Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.
2012-04-01
Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, effective precipitation, runoff, lithology and land use. All these parameters can be supplied by the client from other WFS services or by uploading and processing the data on the server. The user can choose to create the first and second derivatives of the DEM automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. LAI can be derived from various satellite images or downloaded as a product; the upload of such data (time series) is possible using the NetCDF file format. The model runs at a monthly time step, and for each time step all parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server together with the log file.
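A client might invoke such a WPS process along the lines of the following OWSLib sketch; the service URL, the process identifier, and the input names are placeholders, not the paper's actual endpoint.

    from owslib.wps import WebProcessingService

    wps = WebProcessingService("http://example.org/services/wps")  # placeholder
    execution = wps.execute(
        "LandslideHazard",                                # identifier assumed
        inputs=[("dem", "http://example.org/data/dem.tif"),
                ("precipitation", "precip_2012.nc")])

    # Poll the asynchronous job until the server reports completion.
    while not execution.isComplete():
        execution.checkStatus(sleepSecs=5)
    print(execution.status)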
NASA Astrophysics Data System (ADS)
Hardman, M.; Brodzik, M. J.; Long, D. G.
2017-12-01
Since 1978, the satellite passive microwave data record has been a mainstay of remote sensing of the cryosphere, providing twice-daily, near-global spatial coverage for monitoring changes in hydrologic and cryospheric parameters that include precipitation, soil moisture, surface water, vegetation, snow water equivalent, sea ice concentration and sea ice motion. Until recently, the available global gridded passive microwave data sets were not produced consistently: various projections (equal-area, polar stereographic) and gridding techniques were used, along with varying temporal sampling and a mix of Level 2 source data versions. In addition, not all data from all sensors had been processed completely, or in any one consistent way. Furthermore, the original gridding techniques were relatively primitive, and grids were produced at 25 km using the original EASE-Grid definition, which is not easily accommodated in modern software packages. As part of NASA MEaSUREs, we have re-processed all data from the SMMR, SSM/I-SSMIS and AMSR-E instruments, using the most mature Level 2 data. The Calibrated, Enhanced-Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) gridded data are now available from the NSIDC DAAC. The data are distributed as netCDF files that comply with CF-1.6 and ACDD-1.3 conventions. The data have been produced on EASE-Grid 2.0 projections at a smoothed 25 km resolution and at spatially enhanced resolutions up to 3.125 km, depending on channel frequency, using the radiometer version of the Scatterometer Image Reconstruction (rSIR) method. We expect this newly produced data set to enable analyses of trends in coastal regions, marginal ice zones and mountainous terrain that were not possible with the previous gridded passive microwave data. The use of the EASE-Grid 2.0 definition and netCDF-CF formatting allows users to extract compliant geotiff images and provides for easy importing and correct reprojection in many standard packages. As a consistently processed, high-quality satellite passive microwave ESDR, we expect this data set to replace earlier gridded passive microwave data sets and to pave the way for new insights from higher-resolution derived geophysical products.
Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node
NASA Astrophysics Data System (ADS)
Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten
2016-04-01
The Earth System Grid Federation (ESGF) aims to provide access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. ESGF users may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will continuously increase during the next 5 years. IPSL holds replicas of output from various global and regional climate models, observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. To let scientists perform analysis of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). The PyWPS implementation of the Web Processing Service standard from the Open Geospatial Consortium (OGC), in the framework of the birdhouse software, is used. The processes can be run remotely by the user through a web-based WPS client or a command-line tool. All calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they are downloaded and cached by the WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives or NetCDF files. We present the architecture of WPS at IPSL along with processes for evaluation of model performance, on-site diagnostics and post-analysis processing of model output, e.g.: regridding/interpolation/aggregation; ocgis (OpenClimateGIS)-based polygon subsetting of the data; average seasonal cycle, multimodel mean and multimodel mean bias; calculation of climate indices with the icclim library (CERFACS); and atmospheric modes of variability. To evaluate the performance of any new model once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node, following the needs of the scientific community.
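For illustration (not from the paper), a client can locate candidate datasets through the ESGF search API before a WPS process stages them; the node URL and the facet values below are examples only.

    import requests

    # Query the ESGF search endpoint for monthly near-surface temperature.
    resp = requests.get(
        "https://esgf-node.ipsl.upmc.fr/esg-search/search",
        params={"project": "CMIP5", "variable": "tas",
                "time_frequency": "mon",
                "format": "application/solr+json"})
    docs = resp.json()["response"]["docs"]
    print(len(docs), "matching datasets")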
Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.
2004-12-01
From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services, as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end-to-end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization, coupled with technologies from others in the community, are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, an NSF-funded multi-institutional large Information Technology Research effort. The goal of LEAD is to create an integrated and scalable framework for identifying, accessing, preparing, assimilating, predicting, managing, analyzing, mining, and visualizing a broad array of meteorological data and model output, independent of format and physical location. To that end, LEAD will create a series of interconnected, heterogeneous Grid environments to provide a complete framework for mesoscale research, including a set of integrated Grid and Web Services. This talk will focus on the transition from today's end-to-end systems into the types of systems that the LEAD project envisions and the multidisciplinary research problems they will enable.
Using R for analysing spatio-temporal datasets: a satellite-based precipitation case study
NASA Astrophysics Data System (ADS)
Zambrano-Bigiarini, Mauricio
2017-04-01
Increasing computer power and the availability of remote-sensing data measuring different environmental variables have led to unprecedented opportunities for Earth sciences in recent decades. However, dealing with hundreds or thousands of files, usually in different vector and raster formats and measured at different temporal frequencies, poses high computational challenges to taking full advantage of all the available data. R is a language and environment for statistical computing and graphics which includes several functions for data manipulation, calculation and graphical display that are particularly well suited for Earth sciences. In this work I describe how R was used to exhaustively evaluate seven state-of-the-art satellite-based rainfall estimate (SRE) products (TMPA 3B42v7, CHIRPSv2, CMORPH, PERSIANN-CDR, PERSIANN-CCS-adj, MSWEPv1.1 and PGFv3) over the complex topography and diverse climatic gradients of Chile. First, built-in functions were used to automatically download the satellite images in different raster formats and spatial resolutions and to clip them to the Chilean spatial extent when necessary. Second, the raster package was used to read, plot, and conduct an exploratory data analysis on selected files of each SRE product, in order to detect unexpected problems (rotated spatial domains, order of variables in NetCDF files, etc.). Third, raster was used along with the hydroTSM package to aggregate SRE files to different temporal scales (daily, monthly, seasonal, annual). Finally, the hydroTSM and hydroGOF packages were used to carry out a point-to-pixel comparison between precipitation time series measured at 366 stations and the corresponding grid cell of each SRE. The modified Kling-Gupta index of model performance was used to identify possible sources of systematic errors in each SRE, while five categorical indices (PC, POD, FAR, ETS, fBIAS) were used to assess the ability of each SRE to correctly identify different precipitation intensities. In the end, R proved to be an efficient environment for dealing with thousands of raster, vector and time series files, with different spatial and temporal resolutions and spatial reference systems. In addition, the use of well-documented R scripts made the code readable and reusable, facilitating reproducible research, which is essential to building trust among stakeholders and the scientific community.
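The study worked in R with hydroGOF; as a rough Python analogue, a minimal version of the modified Kling-Gupta efficiency used for the point-to-pixel comparison (assuming the Kling et al., 2012 formulation) might look like this:

    import numpy as np

    def kge_modified(sim, obs):
        """Modified KGE: 1 minus distance from the ideal point (1, 1, 1)."""
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        r = np.corrcoef(sim, obs)[0, 1]                  # linear correlation
        beta = sim.mean() / obs.mean()                   # bias ratio
        gamma = (sim.std() / sim.mean()) / (obs.std() / obs.mean())  # CV ratio
        return 1.0 - np.sqrt((r - 1) ** 2 + (beta - 1) ** 2
                             + (gamma - 1) ** 2)

    print(kge_modified([1.1, 2.0, 2.9], [1.0, 2.0, 3.0]))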
NASA Technical Reports Server (NTRS)
Liu, Zhong; Ostrenga, D.; Teng, W. L.; Trivedi, Bhagirath; Kempler, S.
2012-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is home to global precipitation product archives, in particular, the Tropical Rainfall Measuring Mission (TRMM) products. TRMM is a joint U.S.-Japan satellite mission to monitor tropical and subtropical (40°S to 40°N) precipitation and to estimate its associated latent heating. The TRMM satellite provides the first detailed and comprehensive dataset on the four-dimensional distribution of rainfall and latent heating over vastly undersampled tropical and subtropical oceans and continents. The TRMM satellite was launched on November 27, 1997. TRMM data products are archived at and distributed by GES DISC. The newly released TRMM Version 7 includes several changes: new parameters, new products, new metadata, new data structures, etc. For example, hydrometeor profiles in 2A12 now have 28 layers (14 in V6). New parameters have been added to several popular Level-3 products, such as 3B42 and 3B43. Version 2.2 of the Global Precipitation Climatology Project (GPCP) dataset has been added to the TRMM Online Visualization and Analysis System (TOVAS; URL: http://disc2.nascom.nasa.gov/Giovanni/tovas/), allowing online analysis and visualization without downloading data and software. The GPCP dataset extends back to 1979. Version 3 of the Global Precipitation Climatology Centre (GPCC) monitoring product has been updated in TOVAS as well. The product provides global gauge-based monthly rainfall along with the number of gauges per grid cell. The dataset begins in January 1986. To facilitate data and information access and to support precipitation research and applications, we have developed a Precipitation Data and Information Services Center (PDISC; URL: http://disc.gsfc.nasa.gov/precipitation). In addition to TRMM, PDISC provides current and past observational precipitation data. Users can access precipitation data archives consisting of both remote sensing and in-situ observations and can use these data products to conduct a wide variety of activities, including case studies, model evaluation, uncertainty investigation, etc. To support Earth science applications, PDISC provides users near-real-time precipitation products over the Internet. At PDISC, users can access tools and software; documentation, FAQ and assistance are also available. Other capabilities include: 1) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at GES DISC, designed to be fast and easy to learn; 2) TOVAS; 3) NetCDF data download for the GIS community; 4) data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/), which provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; and 5) the Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml), an interface that allows the use of data and enables clients to build customized maps with data coming from different networks.
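The OPeNDAP access pattern mentioned above can be sketched as follows; the URL and variable name are placeholders, and subsetting happens server side so only the requested slice is transferred.

    from netCDF4 import Dataset

    # Placeholder OPeNDAP URL; netCDF4 (built with DAP support) reads it like
    # a local file, fetching only the requested slice from the server.
    url = "http://disc.sci.gsfc.nasa.gov/opendap/example/3B43.nc"
    with Dataset(url) as ds:
        rain = ds.variables["pcp"][0, 100:120, 200:220]
    print(rain.mean())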
NASA Astrophysics Data System (ADS)
Ostrenga, D.; Liu, Z.; Teng, W. L.; Trivedi, B.; Kempler, S.
2011-12-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is home to global precipitation product archives, in particular, the Tropical Rainfall Measuring Mission (TRMM) products. TRMM is a joint U.S.-Japan satellite mission to monitor tropical and subtropical (40°S to 40°N) precipitation and to estimate its associated latent heating. The TRMM satellite provides the first detailed and comprehensive dataset on the four-dimensional distribution of rainfall and latent heating over vastly undersampled tropical and subtropical oceans and continents. The TRMM satellite was launched on November 27, 1997. TRMM data products are archived at and distributed by GES DISC. The newly released TRMM Version 7 includes several changes: new parameters, new products, new metadata, new data structures, etc. For example, hydrometeor profiles in 2A12 now have 28 layers (14 in V6). New parameters have been added to several popular Level-3 products, such as 3B42 and 3B43. Version 2.2 of the Global Precipitation Climatology Project (GPCP) dataset has been added to the TRMM Online Visualization and Analysis System (TOVAS; URL: http://disc2.nascom.nasa.gov/Giovanni/tovas/), allowing online analysis and visualization without downloading data and software. The GPCP dataset extends back to 1979. Results of a basic intercomparison between the new and the previous versions of both TRMM and GPCP will be presented to help understand changes in data product characteristics. To facilitate data and information access and to support precipitation research and applications, we have developed a Precipitation Data and Information Services Center (PDISC; URL: http://disc.gsfc.nasa.gov/precipitation). In addition to TRMM, PDISC provides current and past observational precipitation data. Users can access precipitation data archives consisting of both remote sensing and in-situ observations and can use these data products to conduct a wide variety of activities, including case studies, model evaluation, uncertainty investigation, etc. To support Earth science applications, PDISC provides users near-real-time precipitation products over the Internet. At PDISC, users can access tools and software; documentation, FAQ and assistance are also available. Other capabilities include: 1) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at GES DISC, designed to be fast and easy to learn; 2) TOVAS; 3) NetCDF data download for the GIS community; 4) data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/), which provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; and 5) the Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml), an interface that allows the use of data and enables clients to build customized maps with data coming from different networks. More details along with examples will be presented.
NASA Astrophysics Data System (ADS)
Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.
2015-05-01
Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing-canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.
NASA Astrophysics Data System (ADS)
Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.
2015-01-01
Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.
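A drastically simplified sketch of such a cellular automaton (not the authors' model): anisotropic local facilitation plus a global feedback that nudges total ridge cover toward a target, hydroperiod-controlled fraction. All coefficients are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    grid = (rng.random((100, 100)) < 0.5).astype(int)  # 1 = ridge, 0 = slough
    target = 0.4                                       # target ridge fraction

    for step in range(200):
        # Anisotropic facilitation: neighbours along flow (N-S) count double.
        nbrs = 2 * (np.roll(grid, 1, 0) + np.roll(grid, -1, 0)) \
             + np.roll(grid, 1, 1) + np.roll(grid, -1, 1)
        local = nbrs / 6.0                             # local ridge pressure
        global_fb = target - grid.mean()               # hydrologic feedback
        p_ridge = np.clip(0.5 * local + 0.25 + global_fb, 0.0, 1.0)
        grid = (rng.random(grid.shape) < p_ridge).astype(int)

    print("final ridge fraction:", grid.mean())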
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
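The discrete-event core the abstract describes can be illustrated with a textbook event queue (this generic sketch is not the patented tool): events carry a time and an effect, and effects schedule further events after stated delays.

    import heapq

    def simulate(initial_events):
        """Run events in time order; each effect returns follow-on events."""
        queue = list(initial_events)          # entries: (time, name, effect)
        heapq.heapify(queue)
        while queue:
            time, name, effect = heapq.heappop(queue)
            print(f"t={time}: {name}")
            for delay, nxt_name, nxt_effect in effect():
                heapq.heappush(queue, (time + delay, nxt_name, nxt_effect))

    # A pump switches on at t=0 and schedules its own shutoff 5 units later.
    def pump_on():
        return [(5, "pump_off", lambda: [])]

    simulate([(0, "pump_on", pump_on)])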
Effects-based strategy development through center of gravity and target system analysis
NASA Astrophysics Data System (ADS)
White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen
2003-09-01
This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.
Mrs. Hugh Dryden unveils the memorial to her late husband at center dedication, with center director
NASA Technical Reports Server (NTRS)
1976-01-01
On March 26, 1976, the NASA Flight Research Center opened its doors to hundreds of guests for the dedication of the center in honor of Hugh Latimer Dryden. The dedication was very much a local event; following Center Director David Scott's opening remarks, the Antelope Valley High School's symphonic band played the national anthem. An invocation was given, followed by recognition of the invited guests. Dr. Hugh Dryden, a man of total humility, received praise from all those present. Dryden, who died in 1965, had been a pioneering aeronautical scientist who became director of the National Advisory Committee for Aeronautics (NACA) in 1949 and then deputy administrator of the NACA's successor, NASA, in 1958. Very much interested in flight research, he had been responsible for establishing a permanent facility at the location later named in his honor. As Center Director David Scott looks on, Mrs. Hugh L. Dryden (Mary Libbie Travers) unveils the memorial to her husband at the dedication ceremony.
Liberty has its responsibilities
Caplan, Arthur
2013-01-01
“The only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others.” J.S. Mill, On Liberty “Liberty consists in the freedom to do everything which injures no one else” Declaration of the Rights of Man and of the Citizen “The right to swing my fist ends where the other man's nose begins.” Oliver Wendell Holmes David Ropeik, in his useful essay on how society should respond to the risks created by those who choose not to vaccinate themselves or their children, does a very useful job of identifying the enormous costs in money and health that non-vaccinators create.1 He also pinpoints the many factors that drive vaccine resistance, locating them not in a misunderstanding of the facts but in fears and negative emotions.1 It is important to pay attention to his message, since frequently those who want to reduce vaccine hesitation or outright non-vaccination put their faith in education and resort to an invocation of the facts about the value of vaccines, when it is fear and emotions that must be addressed.2 PMID:24013297
How Does Hawai’i Really Feel about the Thirty Meter Telescope?
NASA Astrophysics Data System (ADS)
Currie, Thayne; Ha, Richard; Imai-Hong, Amber; Silva, Jasmin; Stark, Chris S.; Naea Stevens, Dashiel
2018-01-01
In 2015, protests temporarily halted the construction of the Thirty Meter Telescope on Mauna Kea in Hawai’i and a Hawai’i Supreme Court decision later revoked the permit for this observatory, requiring that the permitting process be restarted. Mainland United States media sources often alternately and simplistically described the opposition to TMT as similar to creationism or as a stand against colonialism, pitting astronomers on one side and Native Hawaiians on the other side. Both of these descriptions are wildly inaccurate, despite their continued invocation. Using a combination of scientific polling and on-the-ground discussions with Hawai’i community members, we present our impression of how Hawai’i residents and Hawaiians feel about the Thirty Meter Telescope. Polls show that support for TMT is very strong (70+% island wide) and increasing over time. The Hawaiian community is either split 50/50 on TMT or now is slightly in favor of the telescope. Finally, we describe *why* Hawai’i residents are for or against the telescope. Perhaps surprisingly, we find that support for TMT often has little to do with the scientific merit of astronomy; those against TMT often do not hold a blanket opposition based on sacredness or sovereignty.
A multiple process solution to the logical problem of language acquisition
MacWhinney, Brian
2006-01-01
Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, G.R.
1986-03-03
Prototypes of components of the Saguaro distributed operating system were implemented, and the design of the entire system was refined based on the experience. The philosophy behind Saguaro is to support the illusion of a single virtual machine while taking advantage of the concurrency and robustness that are possible in a network architecture. Within the system, these advantages are realized by the use of pools of server processes and decentralized allocation protocols. Potential concurrency and robustness are also made available to the user through low-cost mechanisms to control placement of executing commands and files, and to support semi-transparent file replication and access. Another unique aspect of Saguaro is its extensive use of a type system to describe user data such as files and to specify the types of arguments to commands and procedures. This enables the system to assist in type checking and leads to a user interface in which command-specific templates are available to facilitate command invocation. A mechanism, channels, is also provided to enable users to construct applications containing general graphs of communicating processes.
RBSE: Product development team research activity deliverables
NASA Technical Reports Server (NTRS)
1992-01-01
The GHG Functions and Extensions to be added to the NASA Electronic Library System (NELS) 1.1 product are described. These functions will implement the 'output request' capability within the Object Browser. The functions will be implemented in two parts. The first part is a code to be added to the Object Browser (X version) to implement menus allowing the user to request that objects be copied to specific media, or that objects be downloaded to the user's system following a specific protocol, or that the object be printed to one of the printers attached to the host system. The second part is shell scripts which support the various menu selections. Additional scripts to support functions within the GHG shell (X version) will also be created along with the X version of the GHG Shell as initial capability for the 27 Mar. prototype. The scripts will be composed of C shell routines that will accept parameters (primary file pathways). Certain limitations in functionality will invoke Mail instead of Oracle Mail since that has yet to be delivered and the NELS invocation will default to the X-Windows version instead of the ASCII version.
Introduction. Presidential disability and presidential succession.
Gilbert, Robert E; Bucy, Erik P
2014-01-01
This introduction to the special issue on presidential disability and succession focuses on the distinctly positive contributions that invocations of the Twenty-Fifth Amendment have made to American political life since the Amendment's ratification in 1967. It also underlines the importance for Presidents, their family members and aides to understand the necessity of putting the welfare of the country first, above all else, even at times above the wishes of a disabled Chief Executive. As the articles in this special issue make clear, the Twenty-Fifth Amendment provides an effective constitutional mechanism by which the country's well-being can be maintained while simultaneously showing compassion and respect for a disabled leader. The idea for this issue emerged from a conference organized by Professor Robert E. Gilbert focusing on presidential disability and succession held on the campus of Northeastern University in April 2014. Papers from the conference assembled here clarify and add to the historical record about presidential inability while illuminating the many political, legal, and constitutional contingencies that future presidential administrations may face. Contributors to this issue have varied disciplinary and professional backgrounds, including expertise in American politics, constitutional law, the presidency and vice presidency, presidential impairment, and, of course, the Twenty-Fifth Amendment to the Constitution.
Rawlinson, Mary C; Donchin, Anne
2005-09-01
This essay focuses on two underlying presumptions that impinge on the effort of UNESCO to engender universal agreement on a set of bioethical norms: the conception of universality that pervades much of the document, and its disregard of structural inequalities that significantly impact health. Drawing on other UN system documents and recent feminist bioethics scholarship, we argue that the formulation of universal principles should not rely solely on shared ethical values, as the draft document affirms, but also on differences in ethical values that obtain across cultures. UNESCO's earlier work on gender mainstreaming illustrates the necessity of thinking from multiple perspectives in generating universal norms. The declaration asserts the 'fundamental equality of all human beings in dignity and rights'(1) and insists that 'the highest attainable standard of health is one of the fundamental rights of every human being without distinction of race, religion, political belief, economic or social condition'(2) yet it does not explicitly recognize disparities of power and wealth that deny equal dignity and rights to many. Without attention to structural (as opposed to merely accidental) inequities, UNESCO's invocation of rights is so abstract as to be incompatible with its avowed intention.
Ensemble of European regional climate simulations for the winter of 2013 and 2014 from HadAM3P-RM3P
NASA Astrophysics Data System (ADS)
Schaller, Nathalie; Sparrow, Sarah N.; Massey, Neil R.; Bowery, Andy; Miller, Jonathan; Wilson, Simon; Wallom, David C. H.; Otto, Friederike E. L.
2018-04-01
Large data sets used to study the impact of anthropogenic climate change on the 2013/14 floods in the UK are provided. The data consist of perturbed initial conditions simulations using the Weather@Home regional climate modelling framework. Two different base conditions are available: Actual, with atmospheric forcings (anthropogenic greenhouse gases and human-induced aerosols) as at present, and Natural, with these forcings all removed. The data set is made up of 13 different ensembles (2 Actual and 11 Natural), each having more than 7500 members. The data are available as NetCDF V3 files representing monthly data within the period of interest (1 December 2013 to 15 February 2014), both for a specified European region at 50 km horizontal resolution and globally at N96 resolution. The data are stored within the UK Natural Environment Research Council Centre for Environmental Data Analysis repository.
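Combining such an ensemble for analysis might look like the following sketch; the directory layout and variable name are assumptions, not the data set's documented structure.

    import xarray as xr

    # Concatenate the members of one ensemble along a new "member" dimension.
    ds = xr.open_mfdataset("actual/member_*.nc", combine="nested",
                           concat_dim="member")
    precip = ds["precipitation"]                  # variable name assumed

    print(precip.mean(dim="member"))              # ensemble-mean field
    print(precip.quantile(0.99, dim="member"))    # extreme-tail statistic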
A visualization tool to support decision making in environmental and biological planning
Romañach, Stephanie S.; McKelvy, James M.; Conzelmann, Craig; Suir, Kevin J.
2014-01-01
Large-scale ecosystem management involves consideration of many factors for informed decision making. The EverVIEW Data Viewer is a cross-platform desktop decision support tool to help decision makers compare simulation model outputs from competing plans for restoring Florida's Greater Everglades. The integration of NetCDF metadata conventions into EverVIEW allows end-users from multiple institutions within and beyond the Everglades restoration community to share information and tools. Our development process incorporates continuous interaction with targeted end-users for increased likelihood of adoption. One of EverVIEW's signature features is side-by-side map panels, which can be used to simultaneously compare species or habitat impacts from alternative restoration plans. Other features include examination of potential restoration plan impacts across multiple geographic or tabular displays, and animation through time. As a result of an iterative, standards-driven approach, EverVIEW is relevant to large-scale planning beyond Florida, and is used in multiple biological planning efforts in the United States.
McrEngine: A Scalable Checkpointing System Using Data-Aware Aggregation and Compression
Islam, Tanzima Zerin; Mohror, Kathryn; Bagchi, Saurabh; ...
2013-01-01
High performance computing (HPC) systems use checkpoint-restart to tolerate failures. Typically, applications store their states in checkpoints on a parallel file system (PFS). As applications scale up, checkpoint-restart incurs high overheads due to contention for PFS resources. The high overheads force large-scale applications to reduce checkpoint frequency, which means more compute time is lost in the event of failure. We alleviate this problem through a scalable checkpoint-restart system, mcrEngine. McrEngine aggregates checkpoints from multiple application processes with knowledge of the data semantics available through widely-used I/O libraries, e.g., HDF5 and netCDF, and compresses them. Our novel scheme improves compressibility of checkpoints up to 115% over simple concatenation and compression. Our evaluation with large-scale application checkpoints shows that mcrEngine reduces checkpointing overhead by up to 87% and restart overhead by up to 62% over a baseline with no aggregation or compression.
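The aggregation idea can be illustrated with a toy sketch (not mcrEngine itself): grouping like-named variables across process checkpoints gives the compressor more homogeneous streams, which is what drives the reported compressibility gains.

    import zlib
    import numpy as np

    # Four fake process checkpoints with a temperature and a velocity field.
    rng = np.random.default_rng(1)
    ckpts = [{"temp": 300 + rng.standard_normal(1000),
              "vel": 0.01 * rng.standard_normal(1000)} for _ in range(4)]

    # Data-aware: concatenate like-named variables across processes.
    aware = b"".join(c[name].tobytes()
                     for name in ("temp", "vel") for c in ckpts)
    # Naive: concatenate each checkpoint's variables in file order.
    naive = b"".join(c[name].tobytes()
                     for c in ckpts for name in ("temp", "vel"))

    print(len(zlib.compress(aware)), "vs", len(zlib.compress(naive)))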
Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework
NASA Astrophysics Data System (ADS)
Gannon, C.
2017-12-01
As climate models become progressively more advanced, and spatial resolution is further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process of using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, and cloud computing services to access publicly hosted datasets, and we describe how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
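A hedged sketch of that setup, with an invented scheduler address, file pattern, and threshold: a dask.distributed cluster backs a custom metric (days per year above a temperature threshold) computed straight from NetCDF archives.

    import xarray as xr
    from dask.distributed import Client

    client = Client("scheduler-address:8786")     # cluster address assumed
    ds = xr.open_mfdataset("tasmax_*.nc", chunks={"time": 365})

    # Custom climate-risk metric: annual count of days above 35 C (308.15 K).
    hot_days = (ds["tasmax"] > 308.15).groupby("time.year").sum(dim="time")
    print(hot_days.sel(lat=37.8, lon=-122.3, method="nearest").compute())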
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run, by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
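SchemaOnRead itself is an R package; purely to illustrate the paradigm, here is a minimal Python analogue that recursively reads a folder and dispatches on file extension, deferring any schema decisions to query time.

    import csv
    import json
    import pathlib

    def read_file(path):
        """Read one file in native form, chosen by extension."""
        if path.suffix == ".csv":
            with open(path, newline="") as f:
                return list(csv.reader(f))
        if path.suffix == ".json":
            return json.loads(path.read_text())
        if path.suffix == ".txt":
            return path.read_text()
        return path.read_bytes()              # fall back to raw bytes

    def schema_on_read(folder):
        """Recursively read a folder into a path -> content mapping."""
        return {str(p): read_file(p)
                for p in pathlib.Path(folder).rglob("*") if p.is_file()}

    data = schema_on_read("datasets/")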
An interactive environment for the analysis of large Earth observation and model data sets
NASA Technical Reports Server (NTRS)
Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.
1994-01-01
Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy-to-use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.
A flexible tool for diagnosing water, energy, and entropy budgets in climate models
NASA Astrophysics Data System (ADS)
Lembo, Valerio; Lucarini, Valerio
2017-04-01
We have developed new flexible software for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names follow the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
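Since the diagnostics above are built on CDO command-line operators, a minimal sketch of driving CDO from Python follows. The operator names (yearmean, fldmean) are real CDO operators; the file names are hypothetical, and this is an illustration of the calling pattern, not the authors' actual driver script.

```python
# Sketch: invoking Climate Data Operators (CDO) from Python via subprocess.
import subprocess

def cdo(operator: str, infile: str, outfile: str) -> None:
    # Each CDO call is one command-line invocation: cdo <operator> <in> <out>
    subprocess.run(["cdo", operator, infile, outfile], check=True)

# Annual-mean maps and a global-mean time series of a CF-compliant flux file.
cdo("yearmean", "rlut_model.nc", "rlut_annual.nc")
cdo("fldmean", "rlut_model.nc", "rlut_globalmean_timeseries.nc")
```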
MWR3C physical retrievals of precipitable water vapor and cloud liquid water path
Cadeddu, Maria
2016-10-12
The data set contains physical retrievals of PWV (precipitable water vapor) and cloud LWP (liquid water path) retrieved from MWR3C measurements during the MAGIC campaign. Additional data used in the retrieval process include radiosonde and ceilometer observations. The retrieval is based on an optimal estimation technique that starts from a first guess and iteratively repeats the forward model calculations until a predefined convergence criterion is satisfied. The first guess is a vector of [PWV, LWP] from the neural network retrieval fields in the netCDF file. When convergence is achieved, the 'a posteriori' covariance is computed and its square root is expressed in the file as the retrieval 1-sigma uncertainty. The closest radiosonde profile is used for the radiative transfer calculations and ceilometer data are used to constrain the cloud base height. The RMS error between the measured and modeled brightness temperatures is computed at the last iteration as a consistency check and is written in the last column of the output file.
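A hedged sketch of the optimal-estimation loop described above (a Rodgers-style Gauss-Newton iteration) is given below. The forward model and its Jacobian are placeholders standing in for the radiative transfer calculation; all names and inputs here are hypothetical, not the MWR3C production code.

```python
# Sketch: optimal-estimation retrieval iterating from a first guess x_a.
import numpy as np

def retrieve(y, x_a, S_a, S_e, forward_model, jacobian, max_iter=10, tol=1e-3):
    """Gauss-Newton iteration; returns the state and its 1-sigma uncertainty."""
    x = x_a.copy()
    Sa_inv, Se_inv = np.linalg.inv(S_a), np.linalg.inv(S_e)
    for _ in range(max_iter):
        K = jacobian(x)                       # sensitivity of Tb to [PWV, LWP]
        S_post = np.linalg.inv(Sa_inv + K.T @ Se_inv @ K)
        x_new = x_a + S_post @ K.T @ Se_inv @ (y - forward_model(x) + K @ (x - x_a))
        if np.linalg.norm(x_new - x) < tol:   # predefined convergence criterion
            break
        x = x_new
    # 1-sigma uncertainty: square root of the a posteriori covariance diagonal,
    # as described for the output file above.
    return x_new, np.sqrt(np.diag(S_post))
```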
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable to several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
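The attribute-based idea above can be sketched in a few lines: tag each variable in a file with enough metadata that more than one community standard can locate it. The attribute names below are illustrative inventions, not an established fusion convention.

```python
# Sketch: making one netCDF variable advertise mappings onto two standards.
from netCDF4 import Dataset
import numpy as np

with Dataset("plasma_state_example.nc", "w") as nc:
    nc.createDimension("rho", 64)
    te = nc.createVariable("Te", "f8", ("rho",))
    te[:] = np.linspace(5.0, 0.1, 64)
    te.units = "keV"
    te.long_name = "electron temperature"
    # Interoperability attributes: the same variable can point at its place
    # in several standards simultaneously (names here are hypothetical).
    te.plasma_state_name = "Ts(:,0)"    # Plasma-State-style mapping
    te.cpo_path = "coreprof/te/value"   # CPO-style hierarchical mapping
```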
Weather forecasting with open source software
NASA Astrophysics Data System (ADS)
Rautenhaus, Marc; Dörnbrack, Andreas
2013-04-01
To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
ParCAT: A Parallel Climate Analysis Toolkit
NASA Astrophysics Data System (ADS)
Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.
2012-12-01
Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single-run data sets increasing into tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file I/O. The file I/O operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel I/O capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation and a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
A polarimetric scattering database for non-spherical ice particles at microwave wavelengths
NASA Astrophysics Data System (ADS)
Lu, Yinghui; Jiang, Zhiyuan; Aydin, Kultegin; Verlinde, Johannes; Clothiaux, Eugene E.; Botta, Giovanni
2016-10-01
The atmospheric science community has entered a period in which electromagnetic scattering properties at microwave frequencies of realistically constructed ice particles are necessary for making progress on a number of fronts. One front includes retrieval of ice-particle properties and signatures from ground-based, airborne, and satellite-based radar and radiometer observations. Another front is evaluation of model microphysics by application of forward operators to their outputs and comparison to observations during case study periods. Yet a third front is data assimilation, where again forward operators are applied to databases of ice-particle scattering properties and the results compared to observations, with their differences leading to corrections of the model state. Over the past decade investigators have developed databases of ice-particle scattering properties at microwave frequencies and made them openly available. Motivated by and complementing these earlier efforts, a database containing polarimetric single-scattering properties of various types of ice particles at millimeter to centimeter wavelengths is presented. While the database presented here contains only single-scattering properties of ice particles in a fixed orientation, ice-particle scattering properties are computed for many different directions of the radiation incident on them. These results are useful for understanding the dependence of ice-particle scattering properties on ice-particle orientation with respect to the incident radiation. For ice particles that are small compared to the wavelength, the number of incident directions of the radiation is sufficient to compute reasonable estimates of their (randomly) orientation-averaged scattering properties. This database is complementary to earlier ones in that it contains complete (polarimetric) scattering property information for each ice particle - 44 plates, 30 columns, 405 branched planar crystals, 660 aggregates, and 640 conical graupel - and direction of incident radiation but is limited to four frequencies (X-, Ku-, Ka-, and W-bands), does not include temperature dependencies of the single-scattering properties, and does not include scattering properties averaged over randomly oriented ice particles. Rules for constructing the morphologies of ice particles from one database to the next often differ; consequently, analyses that incorporate all of the different databases will contain the most variability, while illuminating important differences between them. Publication of this database is in support of future analyses of this nature and comes with the hope that doing so helps contribute to the development of a database standard for ice-particle scattering properties, like the NetCDF (Network Common Data Form) CF (Climate and Forecast) or NetCDF CF/Radial metadata conventions.
NASA Astrophysics Data System (ADS)
Chiriaco, Marjolaine; Dupont, Jean-Charles; Bastin, Sophie; Badosa, Jordi; Lopez, Julio; Haeffelin, Martial; Chepfer, Helene; Guzman, Rodrigo
2018-05-01
A scientific approach is presented to aggregate and harmonize a set of 60 geophysical variables at hourly timescale over a decade, and to allow multiannual and multi-variable studies combining atmospheric dynamics and thermodynamics, radiation, clouds and aerosols from ground-based observations. Many datasets from ground-based observations are currently in use worldwide. They are very valuable because they contain complete and precise information due to their spatio-temporal co-localization over more than a decade. These datasets, in particular the synergy between different types of observations, are under-used because of their complexity and diversity due to calibration, quality control, treatment, format, temporal averaging, metadata, etc. Two main results are presented in this article: (1) a set of methods available for the community to robustly and reliably process ground-based data at an hourly timescale over a decade is described and (2) a single netCDF file is provided based on the SIRTA supersite observations. This file contains approximately 60 geophysical variables (atmospheric and in ground), hourly averaged over a decade (for the longest variables). The netCDF file is available and easy to use for the community. In this article, observations are re-analyzed. The prefix re refers to six main steps: calibration, quality control, treatment, hourly averaging, homogenization of the formats and associated metadata, as well as expertise on more than a decade of observations. In contrast, previous studies (i) took only some of these six steps into account for each variable, (ii) did not aggregate all variables together in a single file and (iii) did not offer an hourly resolution for about 60 variables over a decade (for the longest variables). The approach described in this article can be applied to different supersites and to additional variables. The main implication of this work is that complex atmospheric observations are made readily available for scientists who are non-experts in measurements. The dataset from SIRTA observations can be downloaded at http://sirta.ipsl.fr/reobs.html (last access: April 2017) (Downloads tab, no password required) under https://doi.org/10.14768/4F63BAD4-E6AF-4101-AD5A-61D4A34620DE.
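Of the six steps above, the hourly-averaging step is easy to sketch. The snippet below is a minimal illustration with xarray; the input file and variable name are hypothetical stand-ins for a raw high-rate ground-based series, not the actual SIRTA processing chain.

```python
# Sketch: reduce a raw (e.g. 1-minute) observation series to hourly means.
import xarray as xr

raw = xr.open_dataset("sirta_raw_irradiance.nc")    # hypothetical input file
hourly = raw["swdn"].resample(time="1H").mean()     # hourly averaging step
hourly.to_netcdf("sirta_reobs_hourly.nc")           # one harmonized output
```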
ABINIT: First-principles approach to material and nanosystem properties
NASA Astrophysics Data System (ADS)
Gonze, X.; Amadon, B.; Anglade, P.-M.; Beuken, J.-M.; Bottin, F.; Boulanger, P.; Bruneval, F.; Caliste, D.; Caracas, R.; Côté, M.; Deutsch, T.; Genovese, L.; Ghosez, Ph.; Giantomassi, M.; Goedecker, S.; Hamann, D. R.; Hermet, P.; Jollet, F.; Jomard, G.; Leroux, S.; Mancini, M.; Mazevet, S.; Oliveira, M. J. T.; Onida, G.; Pouillon, Y.; Rangel, T.; Rignanese, G.-M.; Sangalli, D.; Shaltaf, R.; Torrent, M.; Verstraete, M. J.; Zerah, G.; Zwanziger, J. W.
2009-12-01
ABINIT [http://www.abinit.org] allows one to study, from first principles, systems made of electrons and nuclei (e.g. periodic solids, molecules, nanostructures, etc.), on the basis of Density-Functional Theory (DFT) and Many-Body Perturbation Theory. Beyond the computation of the total energy, charge density and electronic structure of such systems, ABINIT also implements many dynamical, dielectric, thermodynamical, mechanical, and electronic properties, at different levels of approximation. The present paper provides an exhaustive account of the capabilities of ABINIT. It should be helpful to scientists who are not familiar with ABINIT, as well as to regular users. First, we give a broad overview of ABINIT, including the list of capabilities and how to access them. Then, we present in more detail the recent, advanced developments of ABINIT, with adequate references to the underlying theory, as well as the relevant input variables, tests and, if available, ABINIT tutorials.
Program summary
Program title: ABINIT
Catalogue identifier: AEEU_v1_0
Distribution format: tar.gz
Journal reference: Comput. Phys. Comm.
Programming language: Fortran95, PERL scripts, Python scripts
Computer: All systems with a Fortran95 compiler
Operating system: All systems with a Fortran95 compiler
Has the code been vectorized or parallelized?: Sequential, or parallel with proven speed-up up to one thousand processors.
RAM: Ranges from a few Mbytes to several hundred Gbytes, depending on the input file.
Classification: 7.3, 7.8
External routines (all optional): BigDFT [1], ETSF IO [2], libxc [3], NetCDF [4], MPI [5], Wannier90 [6]
Nature of problem: This package has the purpose of computing accurately material and nanostructure properties: electronic structure, bond lengths, bond angles, primitive cell size, cohesive energy, dielectric properties, vibrational properties, elastic properties, optical properties, magnetic properties, non-linear couplings, electronic and vibrational lifetimes, etc.
Solution method: Software application based on Density-Functional Theory and Many-Body Perturbation Theory, pseudopotentials, with planewaves, Projector-Augmented Waves (PAW) or wavelets as basis functions.
Running time: From less than one second for the simplest tests, to several weeks. The vast majority of the >600 provided tests run in less than 30 seconds.
References: [1] http://inac.cea.fr/LSim/BigDFT. [2] http://etsf.eu/index.php?page=standardization. [3] http://www.tddft.org/programs/octopus/wiki/index.php/Libxc. [4] http://www.unidata.ucar.edu/software/netcdf. [5] http://en.wikipedia.org/wiki/MessagePassingInterface. [6] http://www.wannier.org.
Fluid models and simulations of biological cell phenomena
NASA Technical Reports Server (NTRS)
Greenspan, H. P.
1982-01-01
The dynamics of coated droplets are examined within the context of biofluids. Of specific interest is the manner in which the shape of a droplet, the motion within it, as well as that of aggregates of droplets, can be controlled by the modulation of surface properties, and the extent to which such fluid phenomena are an intrinsic part of cellular processes. From the standpoint of biology, an objective is to elucidate some of the general dynamical features that affect the disposition of an entire cell, cell colonies and tissues. Conventionally averaged field variables of continuum mechanics are used to describe the overall global effects which result from the myriad of small-scale molecular interactions. An attempt is made to establish cause-and-effect relationships from correct dynamical laws of motion rather than by what may have been unnecessary invocation of metabolic or life processes. Several topics are discussed where there are strong analogies between droplets and cells, including: encapsulated droplets/cell membranes; droplet shape/cell shape; adhesion and spreading of a droplet/cell motility and adhesion; and foams and multiphase flows/cell aggregates and tissues. Evidence is presented to show that certain concepts of continuum theory such as surface tension, surface free energy, contact angle, bending moments, etc. are relevant and applicable to the study of cell biology.
Plemons, Eric D
2014-10-01
This article explores the research project that led to the development of facial feminization surgery, a set of bone and soft tissue reconstructive surgical procedures intended to feminize the faces of male-to-female trans women. Conducted by a pioneering surgeon in the mid-1980s, this research consisted of three steps: (1) assessments of sexual differences of the skull taken from early 20th-century physical anthropology, (2) the application of statistical analyses taken from late 20th-century orthodontic research, and (3) the vetting of this new morphological and metric knowledge in a dry skull collection. When the 'feminine type' of early 20th-century physical anthropology was made to articulate with the 'female mean' of 1970s statistical analysis, these two very different epistemological artifacts worked together to produce something new: a singular model of a distinctively female skull. In this article, I show how the development of facial feminization surgery worked across epistemic styles, transforming historically racialized and gendered descriptions of sex difference into contemporary surgical prescriptions for sex change. Fundamental to this transformation was an explicit invocation of the scientific origins of facial sexual dimorphism, a claim that frames surgical sex change of the face as not only possible, but objectively certain.
Network Location-Aware Service Recommendation with Random Walk in Cyber-Physical Systems.
Yin, Yuyu; Yu, Fangzheng; Xu, Yueshen; Yu, Lifeng; Mu, Jinglong
2017-09-08
Cyber-physical systems (CPS) have received much attention from both academia and industry. An increasing number of functions in CPS are provided in the form of services, which gives rise to an urgent task: how to recommend suitable services from the huge number of available services in CPS. In traditional service recommendation, collaborative filtering (CF) has been studied in academia and used in industry. However, several defects limit the application of CF-based methods in CPS. One is that in the case of high data sparsity, CF-based methods are likely to generate inaccurate prediction results. In this paper, we discover that mining the potential similarity relations among users or services in CPS is very helpful for improving prediction accuracy. Besides, most traditional CF-based methods are only capable of using service invocation records, and ignore context information, such as network location, which is a typical context in CPS. In this paper, we propose a novel service recommendation method for CPS, which utilizes network location as context information and contains three prediction models using random walking. We conduct sufficient experiments on two real-world datasets, and the results demonstrate the effectiveness of our proposed methods and verify that network location is indeed useful in QoS prediction.
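To illustrate the random-walk idea in the abstract, here is a toy sketch of random walk with restart over a user-service invocation graph, which surfaces similarity relations even when direct co-invocations are sparse. The graph, parameters, and scoring are illustrative inventions, not the authors' three prediction models.

```python
# Sketch: random walk with restart (RWR) proximity on an invocation graph.
import numpy as np

def random_walk_scores(adjacency: np.ndarray, restart: float = 0.15,
                       steps: int = 50) -> np.ndarray:
    """Return per-node proximity to each start node after RWR iterations."""
    # Row-normalize the adjacency matrix into transition probabilities.
    row_sums = adjacency.sum(axis=1, keepdims=True)
    P = np.divide(adjacency, row_sums,
                  out=np.zeros_like(adjacency), where=row_sums > 0)
    n = adjacency.shape[0]
    scores = np.eye(n)
    for _ in range(steps):
        scores = (1 - restart) * scores @ P + restart * np.eye(n)
    return scores

# Toy bipartite graph: users 0-2 invoking services 3-4, folded into 5 nodes.
A = np.zeros((5, 5))
for user, service in [(0, 3), (1, 3), (1, 4), (2, 4)]:
    A[user, service] = A[service, user] = 1.0
print(random_walk_scores(A)[0])  # proximity of every node to user 0
```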
Ma, Ke; Forsman, Jan; Woodward, Clifford E
2015-05-07
We explore the influence of ion pairing in room temperature ionic liquids confined by planar electrode surfaces. Using a coarse-grained model for the aromatic ionic liquid [C4MIM(+)][BF4 (-)], we account for an ion pairing component as an equilibrium associating species within a classical density functional theory. We investigated the resulting structure of the electrical double layer as well as the ensuing surface forces and differential capacitance, as a function of the degree of ion association. We found that the short-range structure adjacent to surfaces was remarkably unaffected by the degree of ion pairing, up to several molecular diameters. This was even the case for 100% of ions being paired. The physical implications of ion pairing only become apparent in equilibrium properties that depend upon the long-range screening of charges, such as the asymptotic behaviour of surface forces and the differential capacitance, especially at low surface potential. The effect of ion pairing on capacitance is consistent with their invocation as a source of the anomalous temperature dependence of the latter. This work shows that ion pairing effects on equilibrium properties are subtle and may be difficult to extract directly from simulations.
ATLAS Metadata Infrastructure Evolution for Run 2 and Beyond
NASA Astrophysics Data System (ADS)
van Gemmeren, P.; Cranshaw, J.; Malon, D.; Vaniachine, A.
2015-12-01
ATLAS developed and employed for Run 1 of the Large Hadron Collider a sophisticated infrastructure for metadata handling in event processing jobs. This infrastructure profits from a rich feature set provided by the ATLAS execution control framework, including standardized interfaces and invocation mechanisms for tools and services, segregation of transient data stores with concomitant object lifetime management, and mechanisms for handling occurrences asynchronous to the control framework's state machine transitions. This metadata infrastructure is evolving and being extended for Run 2 to allow its use and reuse in downstream physics analyses, analyses that may or may not utilize the ATLAS control framework. At the same time, multiprocessing versions of the control framework and the requirements of future multithreaded frameworks are leading to redesign of components that use an incident-handling approach to asynchrony. The increased use of scatter-gather architectures, both local and distributed, requires further enhancement of metadata infrastructure in order to ensure semantic coherence and robust bookkeeping. This paper describes the evolution of ATLAS metadata infrastructure for Run 2 and beyond, including the transition to dual-use tools—tools that can operate inside or outside the ATLAS control framework—and the implications thereof. It further examines how the design of this infrastructure is changing to accommodate the requirements of future frameworks and emerging event processing architectures.
Succeeding in Science Communication amid Contentious Public Policy Debates
NASA Astrophysics Data System (ADS)
Huertas, A.
2014-12-01
Scientists are often hesitant to engage in public dialogues about their work, especially when their research has bearing on contentious public policy issues. The Union of Concerned Scientists has conducted dozens of workshops to assist its members in communicating science fairly, accurately and effectively to audiences with mixed opinions about relevant public policy. While public polling indicates that people admire scientists and support scientific research, public understanding lags behind scientific understanding on a variety of issues, from climate change to evolution to vaccination. In many cases, people reject or discount scientific evidence when they perceive their ideology, beliefs or policy preferences as being in conflict with that evidence. These biases make it difficult for scientists to convey their research to many audiences. Based on reviews of social science literature and interactions with its members, the Union of Concerned Scientists has explored methods for surmounting public ideological biases while staying true to the science. In particular, scientists have found success with communicating based on shared values, asking audience members questions about their reactions to science, avoiding unintentional invocation of ideological biases and partnering with non-scientist speakers who can address contentious public policy questions. These methods can allow scientists to more effectively collaborate with stakeholders interested in their research and can build public support for science.
Implementing embedded artificial intelligence rules within algorithmic programming languages
NASA Technical Reports Server (NTRS)
Feyock, Stefan
1988-01-01
Most integrations of artificial intelligence (AI) capabilities with non-AI (usually FORTRAN-based) application programs require the latter to execute separately to run as a subprogram or, at best, as a coroutine, of the AI system. In many cases, this organization is unacceptable; instead, the requirement is for an AI facility that runs in embedded mode; i.e., is called as subprogram by the application program. The design and implementation of a Prolog-based AI capability that can be invoked in embedded mode are described. The significance of this system is twofold: Provision of Prolog-based symbol-manipulation and deduction facilities makes a powerful symbolic reasoning mechanism available to applications programs written in non-AI languages. The power of the deductive and non-procedural descriptive capabilities of Prolog, which allow the user to describe the problem to be solved, rather than the solution, is to a large extent vitiated by the absence of the standard control structures provided by other languages. Embedding invocations of Prolog rule bases in programs written in non-AI languages makes it possible to put Prolog calls inside DO loops and similar control constructs. The resulting merger of non-AI and AI languages thus results in a symbiotic system in which the advantages of both programming systems are retained, and their deficiencies largely remedied.
Modular VO oriented Java EE service deployer
NASA Astrophysics Data System (ADS)
Molinaro, Marco; Cepparo, Francesco; De Marco, Marco; Knapic, Cristina; Apollo, Pietro; Smareglia, Riccardo
2014-07-01
The International Virtual Observatory Alliance (IVOA) has produced many standards and recommendations whose aim is to generate an architecture that starts from astrophysical resources, in a general sense, and ends up in deployed consumable services (which are themselves astrophysical resources). Focusing on the Data Access Layer (DAL) system architecture that these standards define, in recent years a web-based application has been developed and maintained at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian centre of Astronomical Archives) to deploy and manage multiple VO (Virtual Observatory) services in a uniform way: VO-Dance. However, a set of criticalities has arisen since the VO-Dance idea was conceived, and some major changes have occurred and are ongoing at the IVOA DAL layer (and related standards): this urged IA2 to identify a new solution for its own service layer. Keeping the basic ideas from VO-Dance (simple service configuration, service instantiation at call time and modularity) while switching to different software technologies (e.g. dismissing Java Reflection in favour of an Enterprise Java Bean, EJB, based solution), the new solution has been sketched out and tested for feasibility. Here we present the results originating from this test study. The main constraints for this new project come from various fields: a better homogenized solution arising from IVOA DAL standards, for example the new DALI (Data Access Layer Interface) specification that acts as a common interface system for previous and upcoming access protocols; the need for a modular system where each component is based upon a single VO specification, allowing services to rely on common capabilities instead of homogenizing them inside service components directly; and the search for a scalable system that takes advantage of distributed systems. These constraints are addressed by the adopted solutions sketched hereafter. Developing the new system with Java Enterprise technologies allows it to benefit from existing libraries to build up the single tokens implementing the IVOA standards. Each component can be built from a single standard, and each deployed service (i.e. service component instantiation) can consume the other components' exposed methods and services without the need to homogenize them in dedicated libraries. Scalability can be achieved more easily by deploying components or sets of services in a distributed environment and using JNDI (Java Naming and Directory Interface) and RMI (Remote Method Invocation) technologies. Single service configuration will not differ significantly from the VO-Dance solution, given that the Java class instantiation that relied on Java Reflection is simply moved to Java EJB pooling (and not, e.g., embedded in bundles for subsequent deployment).
CryoSat Ice Processor: Known Processor Anomalies and Potential Future Product Evolutions
NASA Astrophysics Data System (ADS)
Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Bouffard, J.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.
2016-08-01
Launched in 2010, CryoSat was designed to measure changes in polar sea ice thickness and ice sheet elevation. To reach this goal the CryoSat data products have to meet the highest performance standards and are subjected to a continual cycle of improvement achieved through upgrades to the Instrument Processing Facilities (IPFs). Following the switch to the Baseline-C Ice IPFs there are already planned evolutions for the next processing Baseline, based on recommendations from the Scientific Community, Expert Support Laboratory (ESL), Quality Control (QC) Centres and Validation campaigns. Some of the proposed evolutions, to be discussed with the scientific community, include the activation of freeboard computation in SARin mode, the potential operation of SARin mode over flat-to-slope transitory land ice areas, further tuning of the land ice retracker, the switch to NetCDF format and the resolution of anomalies arising in Baseline-C. This paper describes some of the anomalies known to affect Baseline-C in addition to potential evolutions that are planned and foreseen for Baseline-D.
Development of a gridded meteorological dataset over Java island, Indonesia 1985-2014.
Yanto; Livneh, Ben; Rajagopalan, Balaji
2017-05-23
We describe a gridded daily meteorology dataset consisting of precipitation, minimum temperature and maximum temperature over Java Island, Indonesia at 0.125°×0.125° (~14 km) resolution spanning the 30 years from 1985-2014. Importantly, this data set represents a marked improvement over existing gridded data sets for Java, with higher spatial resolution and derived exclusively from ground-based observations, unlike existing satellite- or reanalysis-based products. Gap-infilling and gridding were performed via the Inverse Distance Weighting (IDW) interpolation method (radius, r, of 25 km and power of influence, α, of 3 as optimal parameters), restricted to only those stations with at least 3,650 days (~10 years) of valid data. We employed the MSWEP and CHIRPS rainfall products in the cross-validation, which shows that the gridded rainfall presented here gives the most reasonable performance. Visual inspection reveals increasing performance of the gridded precipitation from grid to watershed to island scale. The data set, stored in network common data form (NetCDF), is intended to support watershed-scale and island-scale studies of short-term and long-term climate, hydrology and ecology.
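A minimal sketch of the IDW gap-filling step follows, using the paper's stated parameters (r = 25 km, α = 3). The station coordinates and values are hypothetical; the real gridding also handles projection and station screening.

```python
# Sketch: inverse distance weighting estimate at one grid point.
import numpy as np

def idw(x, y, values, x0, y0, radius_km=25.0, alpha=3.0):
    """IDW estimate at (x0, y0) from stations within radius_km."""
    d = np.hypot(x - x0, y - y0)           # station distances in km
    mask = (d <= radius_km) & (d > 0)      # only stations inside the radius
    if not mask.any():
        return np.nan                      # no station close enough
    w = 1.0 / d[mask] ** alpha             # nearer stations dominate
    return float(np.sum(w * values[mask]) / np.sum(w))

# Three nearby rain gauges (km coordinates) and a grid cell at the origin.
x = np.array([5.0, 12.0, 30.0]); y = np.array([-3.0, 8.0, 1.0])
rain = np.array([12.0, 7.5, 20.0])
print(idw(x, y, rain, 0.0, 0.0))           # the 30-km station is excluded
```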
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A computer program partly automates the task of determining whether an HDF-EOS 5 file is valid, in that it conforms to specifications for such characteristics as attribute names, dimensionality of data products, and ranges of legal data values. ["HDF-EOS" and variants thereof are defined in "Converting EOS Data From HDF-EOS to netCDF" (GSC-15007-1), which is the first of several preceding articles in this issue of NASA Tech Briefs.] Previously, the validity of a file was determined in a tedious and error-prone process in which a person examined human-readable dumps of data-file-format information. The present software helps a user to encode the specifications for an HDF-EOS 5 file, and then inspects the file for conformity with the specifications: First, the user writes the specifications in Extensible Markup Language (XML) by use of a document type definition (DTD) that is part of the program. Next, the portion of the program (denoted the validator) that performs the inspection is executed, using, as inputs, the specifications in XML and the HDF-EOS 5 file to be validated. Finally, the user examines the output of the validator.
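The validator idea can be sketched in a few lines: parse an XML specification, then check the file against it. The snippet below uses h5py and a hypothetical, much-simplified XML schema (one `<dataset name min max>` element per product); the real HDF-EOS 5 validator and its DTD are considerably more elaborate.

```python
# Sketch: XML-specification-driven range/presence checks on an HDF5 file.
import xml.etree.ElementTree as ET
import h5py

def validate(spec_path: str, h5_path: str) -> list:
    spec = ET.parse(spec_path).getroot()
    problems = []
    with h5py.File(h5_path, "r") as f:
        for d in spec.iter("dataset"):          # assumed element layout
            name = d.get("name")
            if name not in f:
                problems.append(f"missing dataset: {name}")
                continue
            data = f[name][...]
            if d.get("min") is not None and data.min() < float(d.get("min")):
                problems.append(f"{name}: value below legal range")
            if d.get("max") is not None and data.max() > float(d.get("max")):
                problems.append(f"{name}: value above legal range")
    return problems
```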
Use Hierarchical Storage and Analysis to Exploit Intrinsic Parallelism
NASA Astrophysics Data System (ADS)
Zender, C. S.; Wang, W.; Vicente, P.
2013-12-01
Big Data is an ugly name for the scientific opportunities and challenges created by the growing wealth of geoscience data. How do we weave large, disparate datasets together to best reveal their underlying properties, to exploit their strengths and minimize their weaknesses, and to continually aggregate more information than the world knew yesterday and less than we will learn tomorrow? Data analytics techniques (statistics, data mining, machine learning, etc.) can accelerate pattern recognition and discovery. However, researchers must often organize multiple related datasets into a coherent framework prior to analysis. Hierarchical organization permits entire datasets to be stored in nested groups that reflect their intrinsic relationships and similarities. Hierarchical data can be simpler and faster to analyze by coding operators to automatically parallelize processes over isomorphic storage units, i.e., groups. The newest generation of netCDF Operators (NCO) embodies this hierarchical approach, while still supporting traditional analysis approaches.  We will use NCO to demonstrate the trade-offs involved in processing a prototypical Big Data application (analysis of CMIP5 datasets) using hierarchical and traditional analysis approaches.
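NCO itself works on the command line; the Python sketch below only mirrors the per-group idea: walk the group tree of a netCDF4 file and apply the same reduction to every group, which is exactly the kind of isomorphic unit an operator can parallelize over. The file and its group layout are hypothetical.

```python
# Sketch: apply one operation uniformly across a nested netCDF group tree.
from netCDF4 import Dataset

def walk(group, depth=0):
    for name, var in group.variables.items():
        print("  " * depth + f"{group.path}/{name}: mean = {var[:].mean():.3f}")
    for child in group.groups.values():   # recurse into nested groups
        walk(child, depth + 1)

with Dataset("cmip5_ensemble.nc", "r") as nc:   # hypothetical grouped file
    walk(nc)
```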
OWGIS 2.0: Open Source Java Application that Builds Web GIS Interfaces for Desktop and Mobile Devices
NASA Astrophysics Data System (ADS)
Zavala Romero, O.; Chassignet, E.; Zavala-Hidalgo, J.; Pandav, H.; Velissariou, P.; Meyer-Baese, A.
2016-12-01
OWGIS is an open source Java and JavaScript application that builds easily configurable Web GIS sites for desktop and mobile devices. The current version of OWGIS generates mobile interfaces based on HTML5 technology and can be used to create mobile applications. The style of the generated websites can be modified using Compass, a well-known CSS authoring framework. In addition, OWGIS uses several Open Geospatial Consortium standards to request data from the most common map servers, such as GeoServer. It is also able to request data from ncWMS servers, allowing the websites to display 4D data from NetCDF files. This application is configured by XML files that define which layers (geographic datasets) are displayed on the Web GIS sites. Among other features, OWGIS allows for animations; streamlines from vector data; virtual globe display; vertical profiles and vertical transects; different color palettes; the ability to download data; and display of text in multiple languages. OWGIS users are mainly scientists in the oceanography, meteorology and climate fields.
Xray: N-dimensional, labeled arrays for analyzing physical datasets in Python
NASA Astrophysics Data System (ADS)
Hoyer, S.
2015-12-01
Efficient analysis of geophysical datasets requires tools that both preserve and utilize metadata, and that transparently scale to process large datasets. Xray is such a tool, in the form of an open source Python library for analyzing the labeled, multi-dimensional array (tensor) datasets that are ubiquitous in the Earth sciences. Xray's approach pairs Python data structures based on the data model of the netCDF file format with the proven design and user interface of pandas, the popular Python data analysis library for labeled tabular data. On top of the NumPy array, xray adds labeled dimensions (e.g., "time") and coordinate values (e.g., "2015-04-10"), which it uses to enable a host of operations powered by these labels: selection, aggregation, alignment, broadcasting, split-apply-combine, interoperability with pandas and serialization to netCDF/HDF5. Many of these operations are enabled by xray's tight integration with pandas. Finally, to allow for easy parallelism and to enable its labeled data operations to scale to datasets that do not fit into memory, xray integrates with the parallel processing library dask.
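The label-powered operations named above can be shown on a toy dataset so the example is self-contained (no external file needed); the dimension names and values are invented for illustration. (Xray has since been renamed xarray, so the import below uses the modern package name.)

```python
# Sketch: label-based selection, aggregation, and split-apply-combine.
import numpy as np
import pandas as pd
import xarray as xr

temps = xr.DataArray(
    np.random.rand(4, 2, 3),
    dims=("time", "lat", "lon"),
    coords={"time": pd.date_range("2015-04-10", periods=4),
            "lat": [10.0, 20.0], "lon": [100.0, 110.0, 120.0]},
    name="temperature",
)

print(temps.sel(time="2015-04-10"))       # selection by coordinate label
print(temps.mean(dim="lon"))              # aggregation over a named axis
print(temps.groupby("time.day").mean())   # split-apply-combine
temps.to_netcdf("example.nc")             # serialization to netCDF
```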
Design of FastQuery: How to Generalize Indexing and Querying System for Scientific Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jerry; Wu, Kesheng
2011-04-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit are critical for facilitating interactive exploration of large datasets. These technologies rely on adding auxiliary information to existing datasets to accelerate query processing. To use these indices, we need to match the relational data model used by the indexing systems with the array data model used by most scientific data, and to provide an efficient input and output layer for reading and writing the indices. In this work, we present a flexible design that can be easily applied to most scientific data formats. We demonstrate this flexibility by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using simulation data from the particle accelerator and climate simulation communities. To demonstrate the effectiveness of the new design, we also present a detailed performance study using both synthetic and real scientific workloads.
NASA Astrophysics Data System (ADS)
Wilkinson, D. C.
2012-12-01
NOAA's Geosynchronous Operational Environmental Satellites (GOES) have been observing the environment in near-earth-space for over 37 years. Those data are down-linked and processed by the Space Weather Prediction Center (SWPC) and form the cornerstone of their alert and forecast services. At the close of each UT day these data are ingested by the National Geophysical Data Center (NGDC) where they are merged into the national archive and made available to the user community in a uniform manner. In 2012 NGDC unveiled a RESTful web service for accessing these data. What does this mean? Users can now build a web-like URL using simple predefined constructs that allows their browser or custom software to directly access the relational archives and bundle the requested data into a variety of popular formats. The user can select precisely the data they need and the results are delivered immediately. NGDC understands that many users are perfectly happy retrieving data via pre-generated files and will continue to provide internally documented NetCDF and CSV files far into the future.
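The "build a web-like URL" pattern described above can be sketched as follows. The endpoint and parameter names below are hypothetical stand-ins, not the actual NGDC service constructs; only the pattern of a parameterized RESTful request is being illustrated.

```python
# Sketch: fetching precisely-selected satellite data via a RESTful URL.
import requests

base = "https://example.gov/goes/data"           # hypothetical endpoint
params = {
    "satellite": "goes15",                       # hypothetical parameters
    "instrument": "magnetometer",
    "begin": "2012-10-01T00:00:00Z",
    "end": "2012-10-02T00:00:00Z",
    "format": "csv",                             # or netcdf, json, ...
}
resp = requests.get(base, params=params, timeout=60)
resp.raise_for_status()
with open("goes_mag.csv", "wb") as f:            # exactly the data requested
    f.write(resp.content)
```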
In this study, modeled gas- and aerosol-phase ammonia, nitric acid, and hydrogen chloride are compared to measurements taken during a field campaign conducted in northern Colorado in February and March 2011. We compare the modeled and observed gas-particle partitioning, and assess potential reasons for discrepancies between the model and measurements. This data set contains scripts and data used for each figure in the associated manuscript. Figures are generated using the R project statistical programming language. Data files are in either comma-separated value (CSV) format or netCDF, a standard self-describing binary data format commonly used in the earth and atmospheric sciences. This dataset is associated with the following publication: Kelly, J., K. Baker, C. Nolte, S. Napelenok, W.C. Keene, and A.A.P. Pszenny. Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter. ATMOSPHERIC ENVIRONMENT. Elsevier Science Ltd, New York, NY, USA, 131: 67-77, (2016).
Ocean Tracking Network (OTN): Development of Oceanographic Data Integration with Animal Movement
NASA Astrophysics Data System (ADS)
Bajona, L.
2016-02-01
OTN is a $168-million ocean research and technology development platform headquartered at Dalhousie University, Canada, using acoustic and satellite telemetry to globally document the movements and survival of aquatic animals and their environmental correlates. The OTN mission is to foster conservation and sustainability of valued species by generating knowledge on the movement patterns of aquatic species in their changing environment. OTN's ever-expanding global network of acoustic receivers, listening for over 90 different key animal species, provides the data needed for collaborative work with researchers on the development of oceanographic data integration with animal movement. Presented here is the Data Management team's work to date, its status, and the challenges in OTN's move towards a community standard to enable sharing between projects nationally and internationally, permitting interoperability with other large national (e.g. CHONe, ArcticNet) and international (IOOS, IMOS) networks. This work includes co-development of the Animal Acoustic Telemetry (AAT) metadata standard and implementation using an ERDDAP data server (NOAA Environmental Research Division's Data Access Program), facilitating ingestion for modelers (e.g., netCDF).
Estimation of gross land-use change and its uncertainty using a Bayesian data assimilation approach
NASA Astrophysics Data System (ADS)
Levy, Peter; van Oijen, Marcel; Buys, Gwen; Tomlinson, Sam
2018-03-01
We present a method for estimating land-use change using a Bayesian data assimilation approach. The approach provides a general framework for combining multiple disparate data sources with a simple model. This allows us to constrain estimates of gross land-use change with reliable national-scale census data, whilst retaining the detailed information available from several other sources. Eight different data sources, with three different data structures, were combined in our posterior estimate of land use and land-use change, and other data sources could easily be added in future. The tendency for observations to underestimate gross land-use change is accounted for by allowing for a skewed distribution in the likelihood function. The data structure produced has high temporal and spatial resolution, and is appropriate for dynamic process-based modelling. Uncertainty is propagated appropriately into the output, so we have a full posterior distribution of output and parameters. The data are available in the widely used netCDF file format from http://eidc.ceh.ac.uk/.
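The skewed-likelihood idea above can be illustrated with a toy score: evaluate an observed gross land-use change against a modelled value under a distribution that puts more mass below the model, encoding the reported tendency of observations to underestimate gross change. The distribution family and parameter values below are illustrative only, not the authors' actual likelihood.

```python
# Sketch: a skewed likelihood for observations that underestimate change.
from scipy.stats import skewnorm

def log_likelihood(observed, modelled, skew=-4.0, scale=0.2):
    # Negative skew places more probability on observations falling below
    # the modelled value, i.e., systematic underestimation of gross change.
    return skewnorm.logpdf(observed, a=skew, loc=modelled, scale=scale)

print(log_likelihood(observed=0.8, modelled=1.0))
```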
Error mitigation for CCSD compressed imager data
NASA Astrophysics Data System (ADS)
Gladkova, Irina; Grossberg, Michael; Gottipati, Srikanth; Shahriar, Fazlul; Bonev, George
2009-08-01
To efficiently use the limited bandwidth available on the downlink from satellite to ground station, imager data is usually compressed before transmission. Transmission introduces unavoidable errors, which are only partially removed by forward error correction and packetization. In the case of the commonly used CCSD Rice-based compression, these errors result in a contiguous sequence of dummy values along scan lines in a band of the imager data. We have developed a method capable of using the image statistics to provide a principled estimate of the missing data. Our method outperforms interpolation yet can be performed fast enough to provide uninterrupted data flow. The estimation of the lost data provides significant value to end users who may use only part of the data, may not have statistical tools, or lack the expertise to mitigate the impact of the lost data. Since the locations of the lost data will be clearly marked as metadata in the HDF or NetCDF header, experts who prefer to handle error mitigation themselves will be free to use or ignore our estimates as they see fit.
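A much-simplified sketch of the idea follows: locate the dummy-valued run in a scan line, then estimate it from image statistics rather than pure interpolation. Here the "statistics" are just a scale/offset regression against neighboring lines; the authors' actual method is more sophisticated, and all arrays below are synthetic.

```python
# Sketch: statistics-informed infill of a dummy-valued run in a scan line.
import numpy as np

def fill_scanline(img: np.ndarray, row: int, lo: int, hi: int) -> np.ndarray:
    """Estimate img[row, lo:hi] from the rows above and below."""
    neighbors = 0.5 * (img[row - 1, lo:hi] + img[row + 1, lo:hi])
    # Fit a scale/offset against the intact part of the damaged row so the
    # estimate stays consistent with that line's own radiometry.
    intact = np.r_[img[row, :lo], img[row, hi:]]
    ref = np.r_[0.5 * (img[row - 1, :lo] + img[row + 1, :lo]),
                0.5 * (img[row - 1, hi:] + img[row + 1, hi:])]
    a, b = np.polyfit(ref, intact, 1)     # linear fit: intact ~ a*ref + b
    img[row, lo:hi] = a * neighbors + b   # apply the fit to the lost span
    return img
```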
Common Data Format (CDF) and Coordinated Data Analysis Web (CDAWeb)
NASA Technical Reports Server (NTRS)
Candey, Robert M.
2010-01-01
The Coordinated Data Analysis Web (CDAWeb)
Using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.
2016-12-01
Cloud based infrastructure may offer several key benefits of scalability, built in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Service, Microsoft Azure, Google Cloud, etc.) and private (Open Stack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using clouds services running on Amazon Web Services.
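As a minimal illustration of reading HDF5 data out of object storage, the sketch below downloads an object from an S3-style service and opens it with h5py, which accepts file-like objects. The bucket, key, and dataset names are hypothetical, and a production system like the one described above would use range requests and API-compatible client libraries rather than whole-object downloads.

```python
# Sketch: HDF5 data served from object storage instead of a file system.
import io
import boto3
import h5py

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="earth-science-data",          # hypothetical
                    Key="merra2/tavg1_2d_slv.h5")         # hypothetical
buf = io.BytesIO(obj["Body"].read())      # whole object into memory

with h5py.File(buf, "r") as f:            # h5py reads file-like objects
    t2m = f["T2M"][0, :10, :10]           # hypothetical dataset name
    print(t2m.mean())
```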
US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access
NASA Astrophysics Data System (ADS)
Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.
2012-04-01
The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and OGC CSW 2.0.2. Currently, data services are being deployed for the US Department of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational clients include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse-and-search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.
Climate Data Provenance Tracking for Just-In-Time Computation
NASA Astrophysics Data System (ADS)
Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.
2016-12-01
The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.
NASA World Wind Near Real Time Data for Earth
NASA Astrophysics Data System (ADS)
Hogan, P.
2013-12-01
Innovation requires open standards for data exchange, not to mention "access to data", so that the value added, the information intelligence, can be continually created and advanced by the larger community. Likewise, innovation by academia and entrepreneurial enterprise alike is greatly benefited by an open platform that provides the basic technology for access and visualization of that data. NASA World Wind Java, and now NASA World Wind iOS for the iPhone and iPad, provide that technology. Whether the interest is weather science or climate science, emergency response or supply chain, seeing spatial data in its native context of Earth accelerates understanding and improves decision-making. NASA World Wind open source technology provides the basic elements for 4D visualization, using Open Geospatial Consortium (OGC) protocols, while allowing for customized access to any data, big or small, including support for NetCDF. NASA World Wind includes access to a suite of US Government WMS servers with near real time data. The larger community can readily capitalize on this technology, building their own value-added applications, either open or proprietary. [Figure captions: night lights heat map; Glacier National Park]
NASA Astrophysics Data System (ADS)
Keck, N. N.; Macduff, M.; Martin, T.
2017-12-01
The Atmospheric Radiation Measurement (ARM) program's Data Management Facility (DMF) plays a critical support role in processing and curating data generated by the Department of Energy's ARM Program. Data are collected in near real time from hundreds of observational instruments spread out all over the globe. Data are then ingested hourly to provide time series data in NetCDF (Network Common Data Form), including standardized metadata. Based on automated processes and a variety of user reviews, the data may need to be reprocessed. Final data sets are then stored and accessed by users through the ARM Archive. Over the course of 20 years, a suite of data visualization tools has been developed to facilitate the operational processes that manage and maintain the more than 18,000 real-time events that move 1.3 TB of data each day through the various stages of the DMF's data system. This poster will present the resources and methodology used to capture metadata and the tools that assist in routine data management and discoverability.
SchemaOnRead: A Package for Schema-on-Read in R
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R's flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper's contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
Development of an Oceanographic Data Archiving and Service System for the Korean Researchers
NASA Astrophysics Data System (ADS)
Kim, Sung Dae; Park, Hyuk Min; Baek, Sang Ho
2014-05-01
The Oceanographic Data and Information Center of the Korea Institute of Ocean Science and Technology (KIOST) started to develop an oceanographic data archiving and service system in 2010 to support Korean ocean researchers by providing quality-controlled data continuously. Many physical oceanographic data available in the public domain, as well as Korean domestic data, were collected periodically, quality controlled, manipulated, and provided to ocean modelers who need ocean data continuously and to marine biologists who are unfamiliar with physical data but need it. The northern and southern limits of the spatial coverage are 20°N and 55°N, and the western and eastern limits are 110°E and 150°E, respectively. To archive TS (temperature and salinity) profile data, ARGO data were gathered from the ARGO GDACs (France and USA) and many historical TS profile data observed by CTD, OSD and BT were retrieved from World Ocean Database 2009. Quality control software for TS profile data, which meets the QC criteria suggested by the ARGO program and the GTSPP (Global Temperature-Salinity Profile Program), was programmed and applied to the collected data. By the end of 2013, the total number of vertical profile data from the ARGO GDACs was 59,642 and the total number of station data from WOD 2009 was 1,604,422. We also collected the global satellite SST data produced by NCDC and global SSH data from AVISO every day. An automatic program was coded to collect satellite data, extract sub-datasets for the Northwest Pacific area and produce distribution maps. The total number of collected satellite data sets was 3,613 by the end of 2013. We use three different data services to provide archived data to Korean experts. An FTP service was prepared to allow data users to download data in the original format. We developed a TS database system using the Oracle RDBMS to contain all collected temperature-salinity data and support SQL data retrieval with various conditions. The KIOST ocean data portal serves as the data retrieval service for the TS database, using a GIS interface built with open-source GIS software. We also installed the Live Access Server developed by US PMEL for serving the satellite netCDF data files, which supports on-the-fly visualization and OPeNDAP (Open-source Project for a Network Data Access Protocol) service for remote connection and sub-setting of large data sets.
Best Practices for Preparing Interoperable Geospatial Data
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.
2010-12-01
Geospatial data is critically important for a wide scope of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods, from remote sensing observations, field surveys, model simulations, etc., and is stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a significant barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that standard mechanisms for data discovery and access alone are not enough. The geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how the spatial and temporal information are properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling user community; how the standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how the metadata should be collected and organized so that they can be discovered through standard catalog services.
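As a concrete illustration of the standardization step described above, the short sketch below writes a small CF-convention netCDF file with xarray. The variable, grid, and attribute values are illustrative examples only, not ORNL DAAC products.

```python
# A minimal sketch of producing a CF-convention netCDF file; the variable,
# coordinates, and attribute values here are illustrative placeholders.
import numpy as np
import pandas as pd
import xarray as xr

gpp = xr.DataArray(
    np.zeros((12, 180, 360), dtype="float32"),
    dims=("time", "lat", "lon"),
    coords={
        "time": pd.date_range("2000-01-01", periods=12, freq="MS"),
        "lat": np.arange(-89.5, 90, 1.0),
        "lon": np.arange(-179.5, 180, 1.0),
    },
    attrs={  # CF attributes make the variable self-describing
        "standard_name": "gross_primary_productivity_of_biomass_expressed_as_carbon",
        "units": "kg m-2 s-1",
    },
)
gpp["lat"].attrs = {"standard_name": "latitude", "units": "degrees_north"}
gpp["lon"].attrs = {"standard_name": "longitude", "units": "degrees_east"}
ds = xr.Dataset({"gpp": gpp}, attrs={"Conventions": "CF-1.8"})
ds.to_netcdf("gpp_example.nc")
```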
Unified Access Architecture for Large-Scale Scientific Datasets
NASA Astrophysics Data System (ADS)
Karna, Risav
2014-05-01
Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists are now dealing with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of the other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's work flow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a database query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language and interface back and forth between the query body and the side-loaded functions would be needed for this purpose. For the purpose of this research we attempt coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising from the coupling of tools with different paradigms, niche functionalities, separate processes and output data formats have been anticipated and considered during the design of the unified architecture. The research focuses on the feasibility of the designed coupling mechanism and the evaluation of the efficiency and benefits of our proposed unified access architecture. Zhang 2011: Zhang, Ying and Kersten, Martin and Ivanova, Milena and Nes, Niels, SciQL: Bridging the Gap Between Science and Relational DBMS, Proceedings of the 15th Symposium on International Database Engineering & Applications, 2011. Baumann 98: Baumann, P., Dehmel, A., Furtado, P., Ritsch, R., Widmann, N., "The Multidimensional Database System RasDaMan", SIGMOD 1998, Proceedings ACM SIGMOD International Conference on Management of Data, June 2-4, 1998, Seattle, Washington, 1998. hadoop1: hadoop.apache.org, "Hadoop", http://hadoop.apache.org/, [Online; accessed 12-Jan-2014]. scalapack1: netlib.org/scalapack, "ScaLAPACK", http://www.netlib.org/scalapack, [Online; accessed 12-Jan-2014]. r1: r-project.org, "R", http://www.r-project.org/, [Online; accessed 12-Jan-2014]. matlab1: mathworks.com, "Matlab Documentation", http://www.mathworks.de/de/help/matlab/, [Online; accessed 12-Jan-2014]. scidbusr1: scidb.org, "SciDB User's Guide", http://scidb.org/HTMLmanual/13.6/scidb_ug, [Online; accessed 01-Dec-2013].
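To make the UDF idea concrete, here is an illustrative sketch only: a toy Python stand-in for a side-loaded function that ships an array tile to a foreign tool (R) and returns the result to the calling query engine. The function name is hypothetical, and rasdaman's actual UDF interface differs.

```python
# Toy stand-in for a UDF that embeds a foreign routine (R) in a query
# pipeline; the function name is hypothetical, not rasdaman's interface.
import subprocess
import numpy as np

def udf_r_quantile(tile: np.ndarray, q: float) -> float:
    """Ship an array tile to R, run quantile(), return the result."""
    r_script = f"x <- scan('stdin', quiet=TRUE); cat(quantile(x, {q}))"
    out = subprocess.run(
        ["Rscript", "-e", r_script],
        input=" ".join(map(str, tile.ravel())),
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout)

# A query processor could invoke the UDF per tile and merge the results:
tile = np.random.default_rng(0).normal(size=1000)
print(udf_r_quantile(tile, 0.95))
```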
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hamre, Torill; Lygre, Kjetil
2014-05-01
The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for Environment and Security (GMES) programme. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators for a range of stakeholders, such as scientists, policy makers and environmental managers. To this end, we have developed a geo-spatial database of both historical and new in situ physical, biological and chemical parameters for the Southern Ocean, Atlantic, Nordic Seas and the Arctic, and organized related satellite-derived quantities and model forecasts in a joint geo-spatial repository. For easy access to these data, we have implemented a web-based GIS (Geographical Information System) where observed, derived and forecast parameters can be searched, displayed, compared and exported. Model forecasts can also be uploaded dynamically to the system, to allow modelers to quickly compare their results with available in situ and satellite observations. We implemented the web-based GIS using free and open-source technologies: Thredds Data Server, ncWMS, GeoServer, OpenLayers, PostGIS, Liferay, Apache Tomcat, PRTree, NetCDF-Java, json-simple, Geotoolkit, Highcharts, GeoExt, MapFish, FileSaver, jQuery, jstree and qUnit. We also wanted to use open standards to communicate between the different services, so we use WMS, WFS, netCDF, GML, OPeNDAP, JSON, and SLD. The main advantage we got from using FOSS was that we did not have to reinvent the wheel, but could build on existing code and functionality for free. Of course, most of the software did not need to be open source for this, but in some cases we had to make minor modifications to make the different technologies work together, and we could extract the parts of the code that we needed for a specific task. One example was using parts of the code from ncWMS and Thredds to help our main application both read netCDF files and present them in the browser. This presentation will focus on both the difficulties we had with, and the advantages we got from, developing this tool with FOSS.
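Services such as ncWMS are reachable with any standard WMS client; the sketch below uses OWSLib to request a rendered map layer. The endpoint URL, layer name, and bounding box are hypothetical placeholders, not the actual GreenSeas service.

```python
# Minimal WMS GetMap sketch with OWSLib; URL and layer name are
# hypothetical placeholders for an ncWMS endpoint like the one above.
from owslib.wms import WebMapService

wms = WebMapService("http://example-greenseas-server/ncWMS/wms", version="1.1.1")
img = wms.getmap(
    layers=["chlorophyll_a"],        # assumed layer name
    srs="EPSG:4326",
    bbox=(-30.0, 50.0, 20.0, 80.0),  # Nordic Seas / Arctic window
    size=(800, 480),
    format="image/png",
)
with open("chl.png", "wb") as f:
    f.write(img.read())
```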
A model-driven privacy compliance decision support for medical data sharing in Europe.
Boussi Rahmouni, H; Solomonides, T; Casassa Mont, M; Shiu, S; Rahmouni, M
2011-01-01
Clinical practitioners and medical researchers often have to share health data with colleagues across Europe. Privacy compliance in this context is very important but challenging. Automated privacy guidelines are a practical way of increasing users' awareness of privacy obligations and help eliminate unintentional breaches of privacy. In this paper we present an ontology-plus-rules based approach to privacy decision support for the sharing of patient data across European platforms. We use ontologies to model the required domain and context information about data sharing and privacy requirements. In addition, we use a set of Semantic Web Rule Language rules to reason about legal privacy requirements that are applicable to a specific context of data disclosure. We make the complete model invocable through a semantic web application that acts as an interactive privacy guideline system and invokes the full model to provide decision support. When asked, the system will generate privacy reports applicable to a specific case of data disclosure described by the user. Reports showing guidelines per Member State may also be obtained. The advantage of this approach lies in the expressiveness and extensibility of the modelling and inference languages adopted and the ability they confer to reason with complex requirements interpreted from high-level regulations. However, the system cannot at this stage fully simulate the role of an ethics committee or review board.
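As an illustration of the kind of rule such a system encodes, here is a drastically simplified Python stand-in, assuming one hypothetical obligation (cross-border disclosure of health data requires explicit consent). The real system expresses such rules in SWRL over an OWL ontology rather than in procedural code.

```python
# Drastically simplified stand-in for one hypothetical privacy rule; the
# actual system reasons over an ontology with SWRL rules.
from dataclasses import dataclass

@dataclass
class Disclosure:
    data_category: str       # e.g. "health"
    origin_state: str        # Member State where the data was collected
    recipient_state: str
    has_explicit_consent: bool

def privacy_obligations(d: Disclosure) -> list[str]:
    obligations = []
    # Rule: health data crossing a Member State border needs explicit consent.
    if d.data_category == "health" and d.recipient_state != d.origin_state:
        if not d.has_explicit_consent:
            obligations.append("Obtain explicit patient consent before transfer")
    return obligations

print(privacy_obligations(Disclosure("health", "FR", "DE", False)))
```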
Network Location-Aware Service Recommendation with Random Walk in Cyber-Physical Systems
Yin, Yuyu; Yu, Fangzheng; Xu, Yueshen; Yu, Lifeng; Mu, Jinglong
2017-01-01
Cyber-physical systems (CPS) have received much attention from both academia and industry. An increasing number of functions in CPS are provided in the form of services, which gives rise to an urgent task: how to recommend suitable services from the huge number of available services in CPS. In traditional service recommendation, collaborative filtering (CF) has been studied in academia and used in industry. However, several defects limit the application of CF-based methods in CPS. One is that, in cases of high data sparsity, CF-based methods are likely to generate inaccurate prediction results. In this paper, we discover that mining the potential similarity relations among users or services in CPS is genuinely helpful for improving prediction accuracy. Besides, most traditional CF-based methods are only capable of using the service invocation records, and ignore context information such as network location, which is a typical context in CPS. In this paper, we propose a novel service recommendation method for CPS, which utilizes network location as context information and contains three prediction models using random walks. We conduct extensive experiments on two real-world datasets, and the results demonstrate the effectiveness of our proposed methods and verify that network location is indeed useful in QoS prediction. PMID:28885602
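The following is a minimal sketch of the general random-walk-with-restart idea behind such models (not the paper's exact formulation): repeatedly multiplying by a row-normalized adjacency matrix propagates similarity along indirect paths, which helps when direct co-invocation records are sparse.

```python
# Minimal random-walk-with-restart similarity sketch (illustrative only).
import numpy as np

def walk_similarity(adj: np.ndarray, steps: int = 3, restart: float = 0.2) -> np.ndarray:
    """Propagate similarity along walks over a user (or service) graph."""
    row_sums = adj.sum(axis=1, keepdims=True)
    # Row-normalize the adjacency matrix into a transition matrix.
    P = np.divide(adj, row_sums, out=np.zeros_like(adj, dtype=float),
                  where=row_sums > 0)
    n = adj.shape[0]
    sim = np.eye(n)
    for _ in range(steps):
        sim = (1 - restart) * sim @ P + restart * np.eye(n)
    return sim

adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 0],
                [1, 0, 0, 0]], dtype=float)
print(walk_similarity(adj).round(3))
```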
The martyrdom of St. Zoilus, a urological issue. History and development of the tradition.
Domínguez-Freire, F
2016-06-01
To highlight, for its urological importance, the martyrdom of St. Zoilus; to elaborate on the tradition of invocation and worship of the saint; and to establish its historical bases. We conducted a study of the images of the martyrdom of St. Zoilus, with a detailed review of the history and tradition of the saint, and performed a comparative study of the various saints known as patrons of kidney pain and disease. We found three paintings in different churches and locations depicting the kidney extraction of St. Zoilus. In addition to these three pieces, a chest preserved at the National Archaeological Museum and two tapestries in the sacristy of the church of the monastery of St. Zoilus in the Palencian town of Carrion de los Condes provided abundant information on the circumstances in which they were made. By analysing the style, we can deduce their affiliation to a specific artistic milieu and thereby propose a timeframe. Without meaning to dethrone St. Liborius as the patron saint of urologists, an office claimed earlier by colleagues from various European countries, the martyrdom of St. Zoilus is, in light of the tradition and images presented, an unquestionably urological issue. The tradition is vindicated from a new viewpoint 1,712 years later. Copyright © 2015 AEU. Published by Elsevier España, S.L.U. All rights reserved.
[The meeting of Einstein with Cajal (Madrid, 1923): a lost tide of fortune].
Montes-Santiago, J
The year 2005 marked the centennial of Albert Einstein's transcendental works, which forever changed human thinking about the universe, as well as the 50th anniversary of his death. It was proclaimed the 'World Year of Physics', and a multitude of celebrations exhaustively analyzed Einstein's cardinal contributions. However, among these, the meeting of Einstein with another titan of science, Santiago Ramon y Cajal, has passed somewhat unnoticed. In this study the circumstances of this meeting are evoked. The parallels between the lives of these two prominent figures, both awarded the Nobel Prize, are highlighted; they are the 'classic' authors most widely cited in the current scientific literature. The events and persons who made possible that shining but forgotten interview are detailed. The meeting took place in Madrid, on the occasion of Einstein's trip to Spain in 1923. That trip exceeded its primarily scientific purpose, becoming a social phenomenon widely covered by the print media of the time. Finally, the curious coincidence of the invocation of Cajal's theories to justify the genius of the German physicist nearly 75 years after their meeting is mentioned. Although it was a brief meeting, and the circumstances surrounding it remain largely unknown, it made a great impression on Einstein and constitutes a supreme moment in the history of the 20th century.
The right to fashion in the age of terrorism.
Pham, Minh-Ha T
2011-01-01
As part of a feminist commitment to collaboration, this article appears as a companion essay to Mimi Thi Nguyen's "The Biopower of Beauty: Humanitarian Imperialisms and Global Feminisms" and offers a point of departure for thinking about fashion and beauty as processes that produce subjects recruited to, and aligned with, the national interests of the United States in the war on terror. The Muslim woman in the veil and her imagined opposite in the fashionably modern - and implicitly Western - woman become convenient metaphors for articulating geopolitical contests of power as a human rights concern, as a rescue mission, as a beautifying mandate. This article examines newer iterations of this opposition, in the wake of September 11, 2001, in order to demonstrate the critical resonance of a biopolitics of fashion and beauty. In "The Right to Fashion in the Age of Terrorism," the author examines the relationship between the U.S. war on terror, targeting persons whose sartorial choices are described as terrorist-looking and oppressive, and the right-to-fashion discourse, which promotes fashion's mass-market diffusion as a civil liberty. Looking at these multiple invocations of the democratization of fashion, this article argues that the right-to-fashion discourse colludes with the war on terror by fabricating a neoliberal consumer-citizen who is also a couture-citizen and whose right to fashion reasserts U.S. exceptionalism, which is secured by private property, social mobility, and individualism.
Multiple node remote messaging
Blumrich, Matthias A.; Chen, Dong; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Ohmacht, Martin; Salapura, Valentina; Steinmacher-Burow, Burkhard; Vranas, Pavlos
2010-08-31
A method for passing remote messages in a parallel computer system formed as a network of interconnected compute nodes includes that a first compute node (A) sends a single remote message to a remote second compute node (B) in order to control the remote second compute node (B) to send at least one remote message. The method includes various steps including controlling a DMA engine at first compute node (A) to prepare the single remote message to include a first message descriptor and at least one remote message descriptor for controlling the remote second compute node (B) to send at least one remote message, including putting the first message descriptor into an injection FIFO at the first compute node (A) and sending the single remote message and the at least one remote message descriptor to the second compute node (B).
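Purely as an illustration of the descriptor-chaining idea above, here is a toy sketch in Python; all names are hypothetical, and the patented mechanism operates at the DMA-engine level rather than in application code.

```python
# Toy model of descriptor chaining (hypothetical names, illustration only):
# one remote message from node A carries descriptors that node B feeds
# straight into its own injection FIFO, so B sends follow-on messages
# without processor involvement.
from dataclasses import dataclass, field

@dataclass
class MessageDescriptor:
    dest_node: int
    payload: bytes

@dataclass
class RemoteMessage:
    first: MessageDescriptor                      # injected into A's FIFO
    chained: list[MessageDescriptor] = field(default_factory=list)

def receive_at_node_b(msg: RemoteMessage, b_injection_fifo: list) -> None:
    # B re-injects the chained descriptors to trigger its own sends.
    b_injection_fifo.extend(msg.chained)

fifo_b: list[MessageDescriptor] = []
msg = RemoteMessage(
    first=MessageDescriptor(dest_node=1, payload=b"control"),
    chained=[MessageDescriptor(dest_node=2, payload=b"data-0"),
             MessageDescriptor(dest_node=3, payload=b"data-1")],
)
receive_at_node_b(msg, fifo_b)
print([d.dest_node for d in fifo_b])  # B will now send to nodes 2 and 3
```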
Propagation Limitations in Remote Sensing.
Contents: Multi-sensors and systems in remote sensing; Radar sensing systems over land; Remote sensing techniques in oceanography; Influence of...propagation media and background; Infrared techniques in remote sensing; Photography in remote sensing; Analytical studies in remote sensing.
Earth view: A business guide to orbital remote sensing
NASA Technical Reports Server (NTRS)
Bishop, Peter C.
1990-01-01
The following subject areas are covered: Earth view - a guide to orbital remote sensing; current orbital remote sensing systems (LANDSAT, SPOT image, MOS-1, Soviet remote sensing systems); remote sensing satellite; and remote sensing organizations.
NASA Technical Reports Server (NTRS)
Fang, Hongliang; Hrubiak, Patricia; Kato, Hiroko; Rodell, Matthew; Teng, William L.; Vollmer, Bruce E.
2008-01-01
The Global Land Data Assimilation System (GLDAS) is generating a series of land surface state (e.g., soil moisture and surface temperature) and flux (e.g., evaporation and sensible heat flux) products simulated by four land surface models (CLM, Mosaic, Noah and VIC). These products are now accessible at the Hydrology Data and Information Services Center (HDISC), a component of the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Current data holdings include a set of 1.0 degree resolution data products from the four models, covering 1979 to the present, and a 0.25 degree data product from the Noah model, covering 2000 to the present. The products are in Gridded Binary (GRIB) format and can be accessed through a number of interfaces. New data formats (e.g., netCDF), temporal averaging and spatial subsetting will be available in the future. The HDISC has the capability to support more hydrology data products and more advanced analysis tools. The goal is to develop HDISC as a data and services portal that supports weather and climate forecasting, and water and energy cycle research.
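For users working with GRIB distributions such as these, a typical client-side workflow looks like the sketch below. The filename is a hypothetical placeholder, and the cfgrib engine for xarray is assumed to be installed.

```python
# Minimal GRIB-reading sketch; the filename is a hypothetical placeholder
# and the cfgrib engine for xarray must be installed.
import xarray as xr

ds = xr.open_dataset("GLDAS_NOAH025_M.sample.grb", engine="cfgrib")
print(ds.data_vars)  # e.g. soil moisture, surface temperature fields
# Slice direction depends on whether the latitude coordinate is ascending
# or descending in the file:
subset = ds.sel(latitude=slice(50, 30), longitude=slice(-110, -80))
subset.to_netcdf("gldas_subset.nc")  # re-export in the newer netCDF format
```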
Experimenter's laboratory for visualized interactive science
NASA Technical Reports Server (NTRS)
Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.
1992-01-01
The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
Experimenter's laboratory for visualized interactive science
NASA Technical Reports Server (NTRS)
Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.
1993-01-01
The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
NASA Technical Reports Server (NTRS)
Smit, Christine; Hegde, Mahabaleshwara; Strub, Richard; Bryant, Keith; Li, Angela; Petrenko, Maksym
2017-01-01
Giovanni is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted with the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as the parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters both during parameter selection and after visualization.
NetCDF-CF: Supporting Earth System Science with Data Access, Analysis, and Visualization
NASA Astrophysics Data System (ADS)
Davis, E.; Zender, C. S.; Arctur, D. K.; O'Brien, K.; Jelenak, A.; Santek, D.; Dixon, M. J.; Whiteaker, T. L.; Yang, K.
2017-12-01
NetCDF-CF is a community-developed convention for storing and describing earth system science data in the netCDF binary data format. It is an OGC-recognized standard, and numerous existing FOSS (Free and Open Source Software) and commercial software tools can explore, analyze, and visualize data that is stored and described as netCDF-CF data. To better support a larger segment of the earth system science community, a number of efforts are underway to extend the netCDF-CF convention with the goal of increasing the types of data that can be represented as netCDF-CF data. This presentation will provide an overview and update of work to extend the existing netCDF-CF convention. It will detail the types of earth system science data currently supported by netCDF-CF and the types of data targeted for support by current netCDF-CF convention development efforts. It will also describe some of the tools that support the use of netCDF-CF compliant datasets, the types of data they support, and efforts to extend them to handle the new data types that netCDF-CF will support.
Development of a gridded meteorological dataset over Java island, Indonesia 1985–2014
Yanto; Livneh, Ben; Rajagopalan, Balaji
2017-01-01
We describe a gridded daily meteorology dataset consisting of precipitation, minimum and maximum temperature over Java Island, Indonesia at 0.125°×0.125° (~14 km) resolution spanning the 30 years from 1985–2014. Importantly, this data set represents a marked improvement over existing gridded data sets for Java, offering higher spatial resolution and being derived exclusively from ground-based observations, unlike existing satellite- or reanalysis-based products. Gap-infilling and gridding were performed via the Inverse Distance Weighting (IDW) interpolation method (radius, r, of 25 km and power of influence, α, of 3 as optimal parameters), restricted to only those stations with at least 3,650 days (~10 years) of valid data. We employed the MSWEP and CHIRPS rainfall products in the cross-validation, which shows that the gridded rainfall presented here produces the most reasonable performance. Visual inspection reveals increasing performance of the gridded precipitation from grid to watershed to island scale. The data set, stored in network common data form (NetCDF), is intended to support watershed-scale and island-scale studies of short-term and long-term climate, hydrology and ecology. PMID:28534871
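The interpolation step is straightforward to reproduce. Below is a minimal IDW sketch using the reported optimal parameters (25 km search radius, power α = 3); Euclidean distances in km are assumed for simplicity, whereas the published grid works in geographic coordinates.

```python
# Minimal inverse distance weighting (IDW) sketch with the paper's reported
# parameters (radius 25 km, power alpha = 3); distances here are Euclidean.
import numpy as np

def idw(xy_obs: np.ndarray, values: np.ndarray, xy_grid: np.ndarray,
        radius_km: float = 25.0, alpha: float = 3.0) -> np.ndarray:
    est = np.full(len(xy_grid), np.nan)
    for i, p in enumerate(xy_grid):
        d = np.linalg.norm(xy_obs - p, axis=1)
        near = d < radius_km
        if not near.any():
            continue  # no station within the search radius: leave a gap
        w = 1.0 / np.maximum(d[near], 1e-6) ** alpha
        est[i] = np.sum(w * values[near]) / np.sum(w)
    return est

stations = np.array([[0.0, 0.0], [10.0, 5.0], [30.0, 0.0]])  # station x, y in km
rain = np.array([12.0, 8.0, 20.0])                           # daily rainfall (mm)
grid = np.array([[5.0, 2.0], [28.0, 1.0]])                   # target grid cells
print(idw(stations, rain, grid))
```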
IRIS Earthquake Browser with Integration to the GEON IDV for 3-D Visualization of Hypocenters.
NASA Astrophysics Data System (ADS)
Weertman, B. R.
2007-12-01
We present a new-generation web-based earthquake query tool: the IRIS Earthquake Browser (IEB). The IEB combines the DMC's large set of earthquake catalogs (provided by USGS/NEIC, ISC and the ANF) with the popular Google Maps web interface. With the IEB you can quickly and easily find earthquakes in any region of the globe. Using Google's detailed satellite images, earthquakes can be easily co-located with natural geographic features such as volcanoes as well as man-made features such as commercial mines. A set of controls allows earthquakes to be filtered by time, magnitude, and depth range as well as catalog name, contributor name and magnitude type. Displayed events can be easily exported in NetCDF format into the GEON Integrated Data Viewer (IDV), where hypocenters may be visualized in three dimensions. Looking "under the hood", the IEB is based on AJAX technology and utilizes REST-style web services hosted at the IRIS DMC. The IEB is part of a broader effort at the DMC aimed at making our data holdings available via web services. The IEB is useful both educationally and as a research tool.
Ripamonti, Ugo; Parak, Ruqayya; Klar, Roland M; Dickens, Caroline; Dix-Peek, Thérèse; Duarte, Raquel
2016-10-01
The momentum to compose this Leading Opinion on the synergistic induction of bone formation suddenly arose when a simple question was formulated during a discussion session on how to boost the often limited induction of bone formation seen in clinical contexts. Re-examination of morphological and molecular data available on the rapid induction of bone formation by the recombinant human transforming growth factor-β3 (hTGF-β3) shows that hTGF-β3 replicates the synergistic induction of bone formation as invoked by binary applications of hOP-1:hTGF-β1 at 20:1 by weight when implanted in heterotopic sites of the rectus abdominis muscle of the Chacma baboon, Papio ursinus. The rapid induction of bone formation in primates by hTGF-β3 may stem from bursts of cladistic evolution, now redundant in lower animal species but still activated in primates by relatively high doses of hTGF-β3. Contrary to rodents, lagomorphs and canines, the three mammalian TGF-β isoforms induce rapid and substantial bone formation when implanted in heterotopic rectus abdominis muscle sites of P. ursinus, with unprecedented regeneration of full thickness mandibular defects with rapid mineralization and corticalization. Provocatively, thus providing potential molecular and biological rationales for the apparent redundancy of osteogenic molecular signals in primates, binary applications of recombinant human osteogenic protein-1 (hOP-1) with low doses of hTGF-β1 and -β3 synergize to induce massive ossicles in heterotopic rectus abdominis, orthotopic calvarial and mandibular sites of P. ursinus. The synergistic binary application of homologous but molecularly different soluble molecular signals has indicated that perforce several secreted molecular signals are required singly, synchronously and synergistically to induce optimal osteogenesis. The morphological hallmark of the synergistic induction of bone formation is the rapid differentiation of large osteoid seams enveloping haematopoietic bone marrow that forms by day 15 in heterotopic rectus abdominis sites. Synergistic binary applications also induce the morphogenesis of rudimentary embryonic growth plates indicating that the "memory" of developmental events in embryo can be redeployed postnatally by the application of morphogen combinations. Synergistic binary applications or single relatively high doses of hTGF-β3 have shown that hTGF-β3 induces bone by expressing a variety of inductive morphogenetic proteins that result in the rapid induction of bone formation. Tissue induction thus invoked singly by hTGF-β3 recapitulates the synergistic induction of bone formation by binary applications of hTGF-β1 and -β3 isoforms with hOP-1. Both synergistic strategies result in the rapid induction and expansion of the transformed mesenchymal tissue into large corticalized heterotopic ossicles with osteoblast-like cell differentiation at the periphery of the implanted reconstituted specimens with "tissue transfiguration" in vivo. Molecularly, the rapid induction of bone formation by binary applications of hOP-1 and hTGF-β3 or by hTGF-β3 applied singly resides in the up-regulation of selected genes involved in tissue induction and morphogenesis, Osteocalcin, RUNX-2, OP-1, TGF-β1 and -β3, with however the noted lack of TGF-β2 up-regulation. Copyright © 2016. Published by Elsevier Ltd.
Technology study of quantum remote sensing imaging
NASA Astrophysics Data System (ADS)
Bi, Siwen; Lin, Xuling; Yang, Song; Wu, Zhiqiang
2016-02-01
According to the development of remote sensing science and technology and its application requirements, quantum remote sensing is proposed. First, the background of quantum remote sensing is introduced, together with a brief review of research at home and abroad on quantum remote sensing theory, its information mechanism, imaging experiments, and the principle prototype. We then expound the squeeze operator of the quantum remote sensing radiation field and the basic principles of the single-mode squeeze operator, the preparation of squeezed light fields for quantum remote sensing image compression experiments and optical imaging, and the quantum remote sensing imaging principle prototype. Spaceborne active quantum remote sensing imaging technology is then put forward, mainly covering the composition and working principle of the spaceborne active imaging system, the device for preparing and injecting squeezed light in active imaging, and the quantum noise amplification device. Finally, quantum remote sensing research over the past 15 years is summarized and future developments are outlined.
Fundamentals and advances in the development of remote welding fabrication systems
NASA Technical Reports Server (NTRS)
Agapakis, J. E.; Masubuchi, K.; Von Alt, C.
1986-01-01
Operational and man-machine issues for welding underwater, in outer space, and at other remote sites are investigated, and recent process developments are described. Probable remote welding missions are classified, and the essential characteristics of fundamental remote welding tasks are analyzed. Various possible operational modes for remote welding fabrication are identified, and appropriate roles for humans and machines are suggested. Human operator performance in remote welding fabrication tasks is discussed, and recent advances in the development of remote welding systems are described, including packaged welding systems, stud welding systems, remotely operated welding systems, and vision-aided remote robotic welding and autonomous welding systems.
Versteeg, H; Pedersen, S S; Mastenbroek, M H; Redekop, W K; Schwab, J O; Mabo, P; Meine, M
2014-10-01
Remote patient monitoring is a safe and effective alternative for the in-clinic follow-up of patients with cardiovascular implantable electronic devices (CIEDs). However, evidence on the patient perspective on remote monitoring is scarce and inconsistent. The primary objective of the REMOTE-CIED study is to evaluate the influence of remote patient monitoring versus in-clinic follow-up on patient-reported outcomes. Secondary objectives are to: 1) identify subgroups of patients who may not be satisfied with remote monitoring; and 2) investigate the cost-effectiveness of remote monitoring. The REMOTE-CIED study is an international randomised controlled study that will include 900 consecutive heart failure patients implanted with an implantable cardioverter defibrillator (ICD) compatible with the Boston Scientific LATITUDE® Remote Patient Management system at participating centres in five European countries. Patients will be randomised to remote monitoring or in-clinic follow-up. The In-Clinic group will visit the outpatient clinic every 3-6 months, according to standard practice. The Remote Monitoring group only visits the outpatient clinic at 12 and 24 months post-implantation; other check-ups are performed remotely. Patients are asked to complete questionnaires at five time points during the 2-year follow-up. The REMOTE-CIED study will provide insight into the patient perspective on remote monitoring in ICD patients, which could help to support patient-centred care in the future.
A high throughput geocomputing system for remote sensing quantitative retrieval and a case study
NASA Astrophysics Data System (ADS)
Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting
2011-12-01
The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computing application and one of the research issues in high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid: the dynamic Grid workflow, based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.
NASA Astrophysics Data System (ADS)
Takahashi, Y. O.; Takehiro, S.; Sugiyama, K.; Odaka, M.; Ishiwatari, M.; Sasaki, Y.; Nishizawa, S.; Ishioka, K.; Nakajima, K.; Hayashi, Y.
2012-12-01
Toward the understanding of fluid motions of planetary atmospheres and planetary interiors through multiple numerical experiments with multiple models, we are now carrying out the ``dcmodel project'', in which a series of hierarchical numerical models of various complexity is developed and maintained. In the ``dcmodel project'', the numerical models are developed with attention to the following points: 1) a common ``style'' of program code, assuring readability of the software; 2) release of the models' source code to the public as open source; 3) scalability of the models, assuring execution on various scales of computational resources; and 4) the importance of documentation, with a method presented for writing reference manuals. The lineup of the models and utility programs of the project is as follows: Gtool5, ISPACK/SPML, SPMODEL, Deepconv, Dcpam, and Rdoc-f95. In the following, the features of each component are briefly described. Gtool5 (Ishiwatari et al., 2012) is a Fortran90 library which provides data input/output interfaces and various utilities commonly used in the models of the dcmodel project. The self-describing netCDF data format is adopted as the IO format of Gtool5. The interfaces of the gtool5 library reduce the number of operation steps for data IO in the program code of the models compared with the interfaces of the raw netCDF library. Further, by use of the gtool5 library, procedures for data IO and the addition of metadata for post-processing can be easily implemented in the program codes in a consolidated form independent of the size and complexity of the models. ``ISPACK'' is a spectral transformation library and ``SPML (SPMODEL library)'' (Takehiro et al., 2006) is its wrapper library. The most prominent feature of SPML is a series of array-handling functions with systematic function naming rules, which enables us to write code in a form easily deduced from the mathematical expressions of the governing equations. ``SPMODEL'' (Takehiro et al., 2006) is a collection of various sample programs using ``SPML''. These sample programs provide a base kit for simple numerical experiments in geophysical fluid dynamics. For example, SPMODEL includes a 1-dimensional KdV equation model; 2-dimensional barotropic, shallow water, and Boussinesq models; and 3-dimensional MHD dynamo models in rotating spherical shells. These models are written in a common style in harmony with the SPML functions. ``Deepconv'' (Sugiyama et al., 2010) and ``Dcpam'' are, respectively, a cloud-resolving model and a general circulation model for application to planetary atmospheres. ``Deepconv'' includes several physical processes appropriate for simulations of the Jupiter and Mars atmospheres, while ``Dcpam'' does so for simulations of Earth, Mars, and Venus-like atmospheres. ``Rdoc-f95'' is an automatic generator of reference manuals for Fortran90/95 programs, an extension of the ruby documentation toolkit ``rdoc''. It analyzes the dependencies of modules, functions, and subroutines across multiple program source files, and it can also list the namelist variables in the programs.
A User's Guide to the Tsunami Datasets at NOAA's National Data Buoy Center
NASA Astrophysics Data System (ADS)
Bouchard, R. H.; O'Neil, K.; Grissom, K.; Garcia, M.; Bernard, L. J.; Kern, K. J.
2013-12-01
The National Data Buoy Center (NDBC) has maintained and operated the National Oceanic and Atmospheric Administration's (NOAA) tsunameter network since 2003. The tsunameters employ the NOAA-developed Deep-ocean Assessment and Reporting of Tsunamis (DART) technology. The technology measures the pressure and temperature every 15 seconds on the ocean floor and transforms them into equivalent water-column height observations. A complex series of subsampled observations is transmitted acoustically in real time to a moored buoy or marine autonomous vehicle (MAV) at the ocean surface. The surface platform uses its satellite communications to relay the observations to NDBC. NDBC places the observations onto the Global Telecommunication System (GTS) for relay to NOAA's Tsunami Warning Centers (TWC) in Hawai'i and Alaska and to the international community. It takes less than three minutes for the observations to travel from the ocean floor to the TWCs. NDBC can retrieve limited amounts of the 15-s measurements from the instrumentation on the ocean floor using the technology's two-way communications. NDBC recovers the full resolution 15-s measurements about every 2 years and forwards the datasets and metadata to the National Geophysical Data Center for permanent archive. Meanwhile, NDBC retains the real-time observations on its website. The type of real-time observation depends on the operating mode of the tsunameter. NDBC provides the observations through a variety of traditional and innovative methods and formats that include descriptors of the operating mode. Datasets, organized by station, are available from the NDBC website as text files and from the NDBC THREDDS server in netCDF format. The website provides alerts and lists of events that allow users to focus on the information relevant for tsunami hazard analysis. In addition, NDBC developed a basic web service to query station information and observations to support the Short-term Inundation Forecasting for Tsunamis (SIFT) model. NDBC and NOAA's Integrated Ocean Observing System have fielded the innovative Sensor Observation Service (SOS) that allows users access to observations by station, or groups of stations that have been organized into Features of Interest, such as the 2011 Honshu Tsunami. The user can elect to receive the SOS observations in several different formats, such as Sensor Web Enablement (SWE) or delimiter-separated values. Recently, NDBC's Coastal and Offshore Buoys provided meteorological observations used in analyzing possible meteotsunamis on the U.S. East Coast. However, many of these observations are some distance away from the tsunameters. In a demonstration project, NDBC has added sensors to a tsunameter's surface buoy and a MAV to support program requirements for meteorological observations. All these observations are available from NDBC's website in text files, netCDF, and SOS. To aid users in obtaining information relevant to their applications, the presentation documents, in detail, the characteristics of the different types of real-time observations and the availability and organization of the resulting datasets at NDBC.
Applications of Remote Sensing to Emergency Management.
1980-02-15
Contents: Foundations of Remote Sensing: Data Acquisition and Interpretation; Availability of Remote Sensing Technology for Disaster Response...Imaging Systems, Current and Near Future Satellite and Aircraft Remote Sensing Systems; Utilization of Remote Sensing in Disaster Response: Categories of...Disasters, Phases of Monitoring Activities; Recommendations for Utilization of Remote Sensing Technology in Disaster Response; Selected Reading List.
NASA Astrophysics Data System (ADS)
Sagarminaga, Y.; Galparsoro, I.; Reig, R.; Sánchez, J. A.
2012-04-01
Since 2000, an intense effort has been made in AZTI's Marine Research Division to set up a data management system able to gather all the marine datasets being produced by different in-house research projects. For that, a corporate GIS was designed that included a data and metadata repository, a database, a layer catalog & search application and an internet map viewer. Several layers, mostly dealing with physical, chemical and biological in-situ sampling, and basic and thematic cartography including bathymetry, geomorphology, different species habitat maps, and human pressure and activities maps, were successfully gathered in this system. Very soon, it was realised that new marine technologies yielding continuous multidimensional data, sometimes called FES (Fluid Earth System) data, were difficult to handle in this structure. The data affected mainly included numerical oceanographic and meteorological models, remote sensing data, coastal RADAR data, and some in-situ observational systems such as CTD casts and moored or Lagrangian buoys. A management system for gridded multidimensional data was developed using standardized formats (netCDF following the CF conventions) and tools such as the THREDDS catalog (UNIDATA/UCAR), providing web services such as OPeNDAP, NCSS, and WCS, as well as the ncWMS service developed by the Reading e-Science Centre. At present, a system (ITSASGIS-5D) is being developed, based on OGC standards and open-source tools, to allow interoperability between all the data types mentioned before. This system includes, on the server side, PostgreSQL/PostGIS databases and GeoServer for GIS layers, and THREDDS/OPeNDAP and ncWMS services for FES gridded data. Moreover, an on-line client is being developed to allow joint access, user configuration, data visualisation & query, and data distribution. This client uses the MapFish, ExtJS - GeoExt, and OpenLayers libraries. In this presentation the elements of the first released version of this system will be described and shown, together with the new topics to be developed in future versions, which include, among others, the integration of GeoNetwork libraries and tools for both FES and GIS metadata management, and the use of the new OGC Sensor Observation Service (SOS) to integrate non-gridded multidimensional data such as time series, depth profiles or trajectories provided by different observational systems. The final aim of this approach is to contribute to the multidisciplinary access and use of marine data for management and research activities, and to facilitate the implementation of integrated ecosystem-based approaches in the fields of fisheries advice and management, marine spatial planning, and the implementation of European policies such as the Water Framework Directive, the Marine Strategy Framework Directive and the Habitats Directive.
NASA Astrophysics Data System (ADS)
Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.
2012-12-01
As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed-mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the Carbon Dioxide Information Analysis Center (CDIAC), the Biological and Chemical Oceanography Data Management Office (BCO-DMO), and federal labs, NODC is exploring the challenges of coordinated data flow and quality control for diverse ocean acidification data sets. These data sets include data from coastal and ocean monitoring, laboratory and field experiments, model output, and remotely sensed data. NODC already has in place automated data extraction protocols for archiving oceanographic data from BCO-DMO and CDIAC. We present a vision for how these disparate data streams can be more fully utilized when brought together using data standards. Like the Multiple-Listing Service in the real estate market, the OADS project is dedicated to developing a repository of ocean acidification data from all sources, and to serving them to the ocean acidification community using a user-friendly interface in a timely manner. For further information please contact NODC.Ocean.Acidification@noaa.gov.
The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhou, Liqing
2015-12-01
With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation techniques cannot meet the processing and storage requirements of massive remote sensing imagery. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process, building a cheap and efficient computer cluster that uses parallel processing to implement the mean shift algorithm for remote sensing image segmentation based on the MapReduce model. This not only ensures the quality of remote sensing image segmentation but also improves segmentation speed, better meeting real-time requirements. The MapReduce-based parallel mean shift segmentation algorithm for remote sensing images demonstrates both practical significance and realizable value.
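For reference, the core iteration that such a system distributes looks like the single-process sketch below (illustrative only; the paper parallelizes this with MapReduce over image tiles).

```python
# Minimal single-process mean shift sketch; the paper distributes this
# computation across a cluster using the MapReduce model.
import numpy as np

def mean_shift(points: np.ndarray, bandwidth: float, iters: int = 30) -> np.ndarray:
    """Shift each point toward the weighted mean of its neighbourhood."""
    shifted = points.astype(float).copy()
    for _ in range(iters):
        for i, p in enumerate(shifted):
            d2 = np.sum((points - p) ** 2, axis=1)
            w = np.exp(-d2 / (2 * bandwidth ** 2))   # Gaussian kernel weights
            shifted[i] = (w[:, None] * points).sum(0) / w.sum()
    return shifted  # points converge onto cluster modes (segment labels)

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
modes = mean_shift(pts, bandwidth=0.8)
print(np.round(modes[:3], 2), np.round(modes[-3:], 2))  # two distinct modes
```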
Remote monitoring of implantable cardiac devices: current state and future directions.
Ganeshan, Raj; Enriquez, Alan D; Freeman, James V
2018-01-01
Recent evidence has demonstrated substantial benefits associated with remote monitoring of cardiac implantable electronic devices (CIEDs), and treatment guidelines have endorsed the use of remote monitoring. Familiarity with the features of remote monitoring systems and the data supporting their use is vital for physicians caring for patients with CIEDs. Remote monitoring remains underutilized, but its use is expanding, including in new practice settings such as emergency departments. Patient experience and outcomes are positive, with earlier detection of clinical events such as atrial fibrillation, reductions in inappropriate implantable cardioverter-defibrillator (ICD) shocks, and potentially a decrease in mortality with frequent remote monitoring utilization. Rates of hospitalization are reduced among remote monitoring users, and the replacement of outpatient follow-up visits with remote monitoring transmissions has been shown to be well tolerated. In addition, health resource utilization is lower, and remote monitoring has been associated with considerable cost savings. A dose relationship exists between use of remote monitoring and patient outcomes, and those with early and high transmission rates have superior outcomes. Remote monitoring provides clinicians with the ability to deliver comprehensive follow-up care for patients with CIEDs. Patient outcomes are improved, and resource utilization is decreased with appropriate use of remote monitoring. Future efforts must focus on improving the utilization and efficiency of remote monitoring.
REMOTE SENSING TECHNOLOGIES APPLICATIONS RESEARCH
Remote sensing technologies applications research supports the ORD Landscape Sciences Program (LSP) in two separate areas: operational remote sensing, and remote sensing research and development. Operational remote sensing is provided to the LSP through the use of current and t...
The future of remote ECG monitoring systems.
Guo, Shu-Li; Han, Li-Na; Liu, Hong-Wei; Si, Quan-Jin; Kong, De-Feng; Guo, Fu-Su
2016-09-01
Remote ECG monitoring systems are becoming commonplace medical devices for remote heart monitoring. In recent years, remote ECG monitoring systems have been applied in the monitoring of various kinds of heart disease, and the quality of the transmission and reception of ECG signals during the remote process has kept advancing. However, accompanying challenges remain. This report focuses on the three components of the remote ECG monitoring system: the patient (the end user), the doctor workstation, and the remote server, reviewing and evaluating the imminent challenges in wearable systems, packet loss in remote transmission, portable ECG monitoring systems, patient ECG data collection systems, and ECG signal transmission, including real-time processing of the ST segment, R wave, RR interval and QRS wave. This paper tries to clarify the future developmental strategies of remote ECG monitoring, which can be helpful in guiding its research and development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, A.I.; Pettersson, C.B.
1988-01-01
Papers and discussions concerning the geotechnical applications of remote sensing and remote data transmission, sources of remotely sensed data, and glossaries of remote sensing and remote data transmission terms, acronyms, and abbreviations are presented. Aspects of remote sensing use covered include the significance of lineaments and their effects on ground-water systems, waste-site use and geotechnical characterization, the estimation of reservoir submerging losses using CIR aerial photographs, and satellite-based investigation of the significance of surficial deposits for surface mining operations. Other topics presented include the location of potential ground subsidence and collapse features in soluble carbonate rock, optical Fourier analysis of surface features of interest in geotechnical engineering, geotechnical applications of U.S. Government remote sensing programs, updating the data base for a Geographic Information System, the joint NASA/Geosat Test Case Project, the selection of remote data telemetry methods for geotechnical applications, the standardization of remote sensing data collection and transmission, and a comparison of airborne Goodyear electronic mapping system/SAR with satelliteborne Seasat/SAR radar imagery.
DeLacy, Michael J; Louca, Christalla; Smithers-Sheedy, Hayley; McIntyre, Sarah
2016-02-01
To determine if families of children with cerebral palsy living in Australia move to less remote areas between birth and 5 years. Children on the Australian Cerebral Palsy Register (n=3399) born 1996 to 2005 were assigned a remoteness value for family residence at birth and 5 years using a modification of the Australian Statistical Geography Standard. Each value at birth was subtracted from the value at 5 years, yielding a positive difference if they moved more remotely, a negative difference if they moved less remotely, and a value of zero if they did not move or moved to an equally remote residence. The small net increase in remoteness across this cohort was non-significant (p=0.43). Fifty-seven per cent of families changed postcode but only 20% changed remoteness, 11% more remotely, and 9% less remotely. There was a small trend for families with a child with more impaired gross motor function (Gross Motor Function Classification System levels IV and V) to move to a less remote area. This cohort of families with children with cerebral palsy did not appear to move to less remote areas by age 5 years. Remoteness at birth and level of gross motor function seem to have little effect. © 2016 The Authors. Developmental Medicine & Child Neurology © 2016 Mac Keith Press.
Tunnel-Site Selection by Remote Sensing Techniques
A study of the role of remote sensing for geologic reconnaissance for tunnel-site selection was commenced. For this study, remote sensing was defined...conventional remote sensing. Future research directions are suggested, and the extension of remote sensing to include airborne passive microwave
Kingston, Gail A; Williams, Gary; Judd, Jenni; Gray, Marion A
2015-04-01
The aim of this study was to explore how interventions were provided to meet the needs of rural/remote residents who have had a traumatic hand injury, including the coordination of services between rural/remote and metro/regional therapists. Barriers to providing services, use of technology, and professional support provided to therapists in rural/remote areas were also explored. Cross-sectional survey. Metropolitan/regional and rural/remote public health facilities in Australia. Occupational therapists and physiotherapists who provide hand therapy to rural/remote patients. Quantitative and qualitative questionnaire responses were analysed with descriptive statistics and inductive analysis. There were 64 respondents out of a possible 185. Over half of rural/remote respondents provided initial splinting and exercise prescriptions, and over 85% reported that they continued with exercise protocols. Videoconferencing technology for patient intervention and clinical review was used by 39.1% of respondents. Barriers to providing services in rural/remote locations included transport, travelling time, limited staff, and lack of expert knowledge in hand injuries or rural/remote health care. Four major themes emerged from the open-ended questions: working relationships, patient-centred care, staff development and education, and rural and remote practice. The use of technology across Australia to support rural/remote patient intervention requires attention to achieve equity and ease of use. Flexible and realistic goals and interventions should be considered when working with rural/remote patients. A shared care approach between metropolitan/regional and rural/remote therapists can improve understanding of rural/remote issues and provide support to therapists. Further research is recommended to determine the suitability of this approach when providing hand therapy to rural/remote residents. © 2015 National Rural Health Alliance Inc.
System and method for evaluating wind flow fields using remote sensing devices
Schroeder, John; Hirth, Brian; Guynes, Jerry
2016-12-13
The present invention provides a system and method for obtaining data to determine one or more characteristics of a wind field using a first remote sensing device and a second remote sensing device. Coordinated data is collected from the first and second remote sensing devices and analyzed to determine the one or more characteristics of the wind field. The first remote sensing device is positioned to have a portion of the wind field within a first scanning sector of the first remote sensing device. The second remote sensing device is positioned to have the portion of the wind field disposed within a second scanning sector of the second remote sensing device.
Exploring Models and Data for Remote Sensing Image Caption Generation
NASA Astrophysics Data System (ADS)
Lu, Xiaoqiang; Wang, Binqiang; Zheng, Xiangtao; Li, Xuelong
2018-04-01
Inspired by recent developments in artificial satellites, remote sensing images have attracted extensive attention. Noticeable progress has recently been made in scene classification and target detection. However, it is still not clear how to describe remote sensing image content with accurate and concise sentences. In this paper, we investigate how to describe remote sensing images with accurate and flexible sentences. First, some annotation instructions are presented to better describe remote sensing images, considering their special characteristics. Second, in order to exhaustively exploit the contents of remote sensing images, a large-scale aerial image data set is constructed for remote sensing image captioning. Finally, a comprehensive review is presented of the proposed data set to fully advance the task of remote sensing captioning. Extensive experiments on the proposed data set demonstrate that the content of remote sensing images can be completely described by generated language descriptions. The data set is available at https://github.com/201528014227051/RSICD_optimal
NASA Technical Reports Server (NTRS)
Perry, J. C. (Inventor)
1980-01-01
A system for displaying at a remote station data generated at a central station and for powering the remote station from the central station is presented. A power signal is generated at the central station and time multiplexed with the data and then transmitted to the remote station. An energy storage device at the remote station is responsive to the transmitted power signal to provide energizing power for the circuits at the remote station during the time interval data is being transmitted to the remote station. Energizing power for the circuits at the remote station is provided by the power signal itself during the time this signal is transmitted. Preferably the energy storage device is a capacitor which is charged by the power signal during the time the power is transmitted and is slightly discharged during the time the data is transmitted to energize the circuits at the remote station.
NASA Astrophysics Data System (ADS)
Bi, Siwen; Zhen, Ming; Yang, Song; Lin, Xuling; Wu, Zhiqiang
2017-08-01
Driven by the development and application needs of remote sensing science and technology, Prof. Siwen Bi proposed quantum remote sensing. The paper first gives a brief introduction to the background of quantum remote sensing and to research, in China and abroad, on its theory, information mechanism, and imaging experiments, as well as on the production of a principle prototype. It then emphasizes the quantization of the pure remote sensing radiation field and the state function and squeezing effect of the quantum remote sensing radiation field. It also describes the squeezing optical operator of the quantum light field in active imaging information transmission and imaging experiments, which achieved 2-3 times higher resolution than coherent-light detection imaging and completed the production of a quantum remote sensing imaging prototype. Applying quantum remote sensing technology can significantly improve both the signal-to-noise ratio of information transmission imaging and the spatial resolution of quantum remote sensing. On this basis, Prof. Bi proposed a technical solution for active imaging information transmission with satellite-borne quantum remote sensing, and launched research on its system composition and operating principle and on quantum noiseless amplifying devices, providing solutions and a technical basis for implementing active imaging information technology on satellite-borne quantum remote sensing.
New SPDF Directions and Evolving Services Supporting Heliophysics Research
NASA Technical Reports Server (NTRS)
McGuire, Robert E.; Candey, Robert M.; Bilitza, D.; Chimiak, Reine A.; Cooper, John F.; Fung, Shing F.; Han, David B.; Harris, Bernie; Johnson R.; Klipsch, C.;
2006-01-01
The next advances in Heliophysics science and its paradigm of a Great Observatory require an increasingly integrated and transparent data environment, where data can be easily accessed and used across the boundaries of both missions and traditional disciplines. The Space Physics Data Facility (SPDF) project includes uniquely important multi-mission data services with current data from most operating space physics missions. This paper reviews the capabilities of key services now available and the directions in which they are expected to evolve to enable future multi-mission correlative research. The Coordinated Data Analysis Web (CDAWeb) and Satellite Situation Center Web (SSCWeb), critically supported by the Common Data Format (CDF) effort and supplemented by more focused science services such as OMNIWeb and technical services such as data format translations, are important operational capabilities serving the international community today (and cited last year by 20% of the papers published in JGR Space Physics). These services continue to add data from most current missions as SPDF works with new missions such as THEMIS to help enable their unique science goals and the meaningful sharing of their data in a multi-mission correlative context. We will also show recent enhancements to CDF, our 3D Java interactive orbit viewer (TIPSOD), the CDAWeb Plus system, increasing automation of data service population, the new folding of the VSPO effort into SPDF, and our continuing thrust toward fully functional web-service APIs that allow ready invocation from distributed external middleware and clients.
Testing typicality in multiverse cosmology
NASA Astrophysics Data System (ADS)
Azhar, Feraz
2015-05-01
In extracting predictions from theories that describe a multiverse, we face the difficulty that we must assess probability distributions over possible observations prescribed not just by an underlying theory, but by a theory together with a conditionalization scheme that allows for (anthropic) selection effects. This means we usually need to compare distributions that are consistent with a broad range of possible observations with actual experimental data. One controversial means of making this comparison is by invoking the "principle of mediocrity": that is, the principle that we are typical of the reference class implicit in the conjunction of the theory and the conditionalization scheme. In this paper, we quantitatively assess the principle of mediocrity in a range of cosmological settings, employing "xerographic distributions" to impose a variety of assumptions regarding typicality. We find that for a fixed theory, the assumption that we are typical gives rise to higher likelihoods for our observations. If, however, one allows both the underlying theory and the assumption of typicality to vary, then the assumption of typicality does not always provide the highest likelihoods. Interpreted from a Bayesian perspective, these results support the claim that when one has the freedom to consider different combinations of theories and xerographic distributions (or different "frameworks"), one should favor the framework that has the highest posterior probability; and then from this framework one can infer, in particular, how typical we are. In this way, the invocation of the principle of mediocrity is more questionable than has been recently claimed.
A Semi-Preemptive Garbage Collector for Solid State Drives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Junghee; Kim, Youngjae; Shipman, Galen M
2011-01-01
NAND flash memory is a preferred storage medium for platforms ranging from embedded systems to enterprise-scale systems. Flash devices have no mechanical moving parts and provide low-latency access. They also require less power than rotating media. Unlike hard disks, flash devices use out-of-place updates and require a garbage collection (GC) process to reclaim invalid pages and create free blocks. This GC process is a major cause of performance degradation when running concurrently with other I/O operations, as internal bandwidth is consumed to reclaim these invalid pages. The invocation of the GC process is generally governed by a low watermark on free blocks and other internal device metrics that different workloads meet at different intervals. This results in I/O performance that is highly dependent on workload characteristics. In this paper, we examine the GC process and propose a semi-preemptive GC scheme that can preempt on-going GC processing and service pending I/O requests in the queue. Moreover, we further enhance flash performance by pipelining internal GC operations and merging them with pending I/O requests whenever possible. Our experimental evaluation of this semi-preemptive GC scheme with realistic workloads demonstrates both improved performance and reduced performance variability. Write-dominant workloads show up to a 66.56% improvement in average response time with an 83.30% reduction in response time variance compared to the non-preemptive GC scheme.
Remote sensing of natural resources: Quarterly literature review
NASA Technical Reports Server (NTRS)
1976-01-01
A quarterly review of technical literature concerning remote sensing techniques is presented. The format contains indexed and abstracted materials with emphasis on data gathering techniques performed or obtained remotely from space, aircraft, or ground-based stations. Remote sensor applications including the remote sensing of natural resources are presented.
Impact of Shutting Down En Route Primary Radars within CONUS Interior
1993-06-01
Remote Control Interface Unit (RCIU) RMS software for the primary radar will be deleted. Any dependency of the secondary radar on the primary radar data...Generators; RCIU, Remote Control and Interface Unit; RMM, Remote Monitoring and Maintenance; RMMS, Remote Maintenance Monitoring System; RMS, Remote Maintenance
[Thematic Issue: Remote Sensing.
ERIC Educational Resources Information Center
Howkins, John, Ed.
1978-01-01
Four of the articles in this publication discuss the remote sensing of the Earth and its resources by satellites. Among the topics dealt with are the development and management of remote sensing systems, types of satellites used for remote sensing, the uses of remote sensing, and issues involved in using information obtained through remote…
75 FR 65304 - Advisory Committee on Commercial Remote Sensing (ACCRES); Request for Nominations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-22
... Commercial Remote Sensing (ACCRES); Request for Nominations AGENCY: National Oceanic and Atmospheric... Commercial Remote Sensing (ACCRES). SUMMARY: The Advisory Committee on Commercial Remote Sensing (ACCRES) was... Atmosphere, on matters relating to the U.S. commercial remote sensing industry and NOAA's activities to carry...
Fiber optically isolated and remotely stabilized data transmission system
Nelson, Melvin A.
1992-01-01
A fiber optically isolated and remotely stabilized data transmission system is described wherein optical data may be transmitted over an optical data fiber from a remote source which includes a data transmitter and a power supply at the remote source. The transmitter may be remotely calibrated and stabilized via an optical control fiber, and the power source may be remotely cycled between duty and standby modes via an optical control fiber.
Fiber optically isolated and remotely stabilized data transmission system
Nelson, M.A.
1992-11-10
A fiber optically isolated and remotely stabilized data transmission systems described wherein optical data may be transmitted over an optical data fiber from a remote source which includes a data transmitter and a power supply at the remote source. The transmitter may be remotely calibrated and stabilized via an optical control fiber, and the power source may be remotely cycled between duty and standby modes via an optical control fiber. 3 figs.
Literature relevant to remote sensing of water quality
NASA Technical Reports Server (NTRS)
Middleton, E. M.; Marcell, R. F.
1983-01-01
References relevant to remote sensing of water quality were compiled, organized, and cross-referenced. The following general categories were included: (1) optical properties and measurement of water characteristics; (2) interpretation of water characteristics by remote sensing, including color, transparency, suspended or dissolved inorganic matter, biological materials, and temperature; (3) application of remote sensing for water quality monitoring; (4) application of remote sensing according to water body type; and (5) manipulation, processing and interpretation of remote sensing digital water data.
Learning Methods of Remote Sensing In the 2013 Curriculum of Secondary School
NASA Astrophysics Data System (ADS)
Lili Somantri, Nandi
2016-11-01
Remote sensing material was first included in the geography subjects of the 1994 curriculum. For geography teachers who graduated in the 1990s or earlier and did not study remote sensing in college, teaching it is a difficult matter. Most teachers present only theory and do not carry out practical work, citing the lack of computer laboratory facilities and infrastructure. This paper therefore examines methods of teaching remote sensing material in schools. Its purpose is 1) to explain the position of remote sensing material in the study of geography, 2) to analyze the parts of the 2013 geography curriculum related to remote sensing material, and 3) to describe methods of teaching remote sensing material in schools. The method used in this paper is a descriptive analytical study supported by the literature. The paper concludes that the position of remote sensing in the study of geography is as a method for obtaining spatial data about the earth's surface. In the 2013 curriculum, remote sensing material is applied to the study of land use and transportation. Remote sensing should be taught through a practicum that starts from an introduction to the theory of remote sensing and proceeds through extracting data from remote sensing imagery to produce maps, both visually and digitally, field surveys, accuracy testing of the interpretation, and map improvement.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-21
... filed an immediately effective proposal regarding Remote Specialists (the ``Remote Specialist filing'') that expanded the Remote Specialist concept.\\3\\ By the Remote Specialist filing, the Exchange enhanced the existing Remote Specialist \\4\\ model so that all eligible ROTs \\5\\ on the Exchange could function...
JPRS Report, Science & Technology, China, Remote Sensing Systems, Applications.
1991-01-17
Partial Contents: Short Introduction to Nation's Remote Sensing Units, Domestic Airborne Remote-Sensing System, Applications in Monitoring Natural...Disasters, Applications of Imagery From Experimental Satellites Launched in 1985, 1986, Current Status, Future Prospects for Domestic Remote-Sensing-Satellite...Ground Station, and Radar Remote-Sensing Technology Used to Monitor Yellow River Delta,
10 CFR 35.643 - Periodic spot-checks for remote afterloader units.
Code of Federal Regulations, 2010 CFR
2010-01-01
35.643 Periodic spot-checks for remote afterloader units. (a) A licensee authorized to use a remote afterloader unit for medical use shall perform spot-checks of each remote afterloader facility and on each unit— (1...
[A review on polarization information in the remote sensing detection].
Gong, Jie-Qiong; Zhan, Hai-Gang; Liu, Da-Zhao
2010-04-01
Polarization is one of the inherent characteristics of light. Because targets differ in surface structure, internal structure, and the angle of incident light, the earth's surface and any target in the atmosphere acquire their own characteristic polarization during interaction with light. Polarization remote sensing detection uses the polarimetric characteristics of the radiation from targets as detection information. It can obtain seven-dimensional information about targets against complicated backgrounds, resolve the outlines of targets and low-reflectance regions of objects, and solve problems of atmospheric detection and camouflage identification that traditional remote sensing detection cannot, giving it good application prospects. This paper introduces the development of polarization information in remote sensing detection from four aspects. First, the rationale of polarization remote sensing detection, the foundation of the field, is introduced. Second, current research on the instruments used in polarization remote sensing detection is described in detail. Third, current exploration of the theoretical simulation of polarization remote sensing detection is detailed. Finally, the authors present domestic and international applied research on polarization remote sensing detection in the fields of remote sensing, atmospheric sounding, sea-surface and underwater detection, biology and medical diagnosis, astronomical observation, and military use, and sum up the current problems in polarization remote sensing detection. Future development trends of polarization remote sensing detection technology are pointed out to provide a reference for similar studies.
Use of telemedicine in the remote programming of cochlear implants.
Ramos, Angel; Rodriguez, Carina; Martinez-Beneyto, Paz; Perez, Daniel; Gault, Alexandre; Falcon, Juan Carlos; Boyle, Patrick
2009-05-01
Remote cochlear implant (CI) programming is a viable, safe, user-friendly and cost-effective procedure, equivalent to standard programming in terms of efficacy and user perception, which can complement standard procedures. The potential benefits of this technique are outlined. We assessed the technical viability, risks and difficulties of remote CI programming, and evaluated the benefits for the user by comparing standard on-site CI programming with remote CI programming. The Remote Programming System (RPS) essentially consists of completing the usual programming protocol in a regular CI centre, assisted by local staff but guided by a remote expert, who programs the CI device using a remote programming station that takes control of the local station through the Internet. A randomized prospective study was designed with appropriate controls comparing RPS with standard on-site CI programming. Study subjects were implanted adults with a HiRes 90K® CI, post-lingual onset of profound deafness and 4-12 weeks of device use. Subjects underwent two daily CI programming sessions, either remote or standard, on 4 programming days separated by 3 month intervals. A total of 12 remote and 12 standard sessions were completed. To compare the two CI programming modes we analysed program parameters, subjects' auditory progress, subjects' perceptions of the CI programming sessions, and the technical aspects, risks and difficulties of remote CI programming. Control of the local station from the remote station was carried out successfully, and remote programming sessions were completed fully and without incident. Remote and standard program parameters were compared and no significant differences were found between the groups. The performance evaluated in subjects who had been using either standard or remote programs for 3 months showed no significant difference. Subjects were satisfied with both the remote and standard sessions. Safety was proven by checking emergency stops under different conditions. A very small delay was noticed that did not affect the ease of the fitting. Oral and video communication between the local and the remote equipment was established without difficulty and was of high quality.
Marciniuk, Darcy
2016-01-01
The challenges of providing quality respiratory care to persons living in rural or remote communities can be daunting. These populations are often vulnerable in terms of both health status and access to care, highlighting the need for innovation in service delivery. The rapidly expanding options available using telehealthcare technologies have the capacity to allow patients in rural and remote communities to connect with providers at distant sites and to facilitate the provision of diagnostic, monitoring, and therapeutic services. Successful implementation of telehealthcare programs in rural and remote settings is, however, contingent upon accounting for key technical, organizational, social, and legal considerations at the individual, community, and system levels. This review article discusses five types of telehealthcare delivery that can facilitate respiratory care for residents of rural or remote communities: remote monitoring (including wearable and ambient systems), remote consultations (between providers and between patients and providers), remote pulmonary rehabilitation, telepharmacy, and remote sleep monitoring. Current and future challenges related to telehealthcare are discussed. PMID:26902542
Cybernetic Basis and System Practice of Remote Sensing and Spatial Information Science
NASA Astrophysics Data System (ADS)
Tan, X.; Jing, X.; Chen, R.; Ming, Z.; He, L.; Sun, Y.; Sun, X.; Yan, L.
2017-09-01
Cybernetics provides a new set of ideas and methods for the study of modern science, and it has been applied extensively in many areas. However, few researchers have introduced cybernetics into the field of remote sensing. Based on the imaging process of a remote sensing system, this paper introduces cybernetics into remote sensing and establishes a space-time closed-loop control theory for its actual operation. This makes the processing of spatial information coherent and improves the overall efficiency of spatial information from acquisition, processing, and transformation through to application. We describe the application of cybernetics not only to remote sensing platform control, sensor control, and data processing control, but also to control of the whole remote sensing imaging process, feeding output information back to the input to control the efficient operation of the entire system. This combination of cybernetics and remote sensing science can raise remote sensing science to a higher level.
Martin, Shannon K; Tulla, Kiara; Meltzer, David O; Arora, Vineet M; Farnan, Jeanne M
2017-12-01
Advances in information technology have increased remote access to the electronic health record (EHR). Concurrently, standards defining appropriate resident supervision have evolved. How often and under what circumstances inpatient attending physicians remotely access the EHR for resident supervision is unknown. We described a model of attending remote EHR use for resident supervision, and quantified the frequency and magnitude of use. Using a mixed methods approach, general medicine inpatient attendings were surveyed and interviewed about their remote EHR use. Frequency of use and supervisory actions were quantitatively examined via survey. Transcripts from semistructured interviews were analyzed using grounded theory to identify codes and themes. A total of 83% (59 of 71) of attendings participated. Fifty-seven (97%) reported using the EHR remotely, with 54 (92%) reporting they discovered new clinical information not relayed by residents via remote EHR use. A majority (93%, 55 of 59) reported that this resulted in management changes, and 54% (32 of 59) reported making immediate changes by contacting cross-covering teams. Six major factors around remote EHR use emerged: resident, clinical, educational, personal, technical, and administrative. Attendings described resident and clinical factors as facilitating "backstage" supervision via remote EHR use. In our study to assess attending remote EHR use for resident supervision, attendings reported frequent remote use with resulting supervisory actions, describing a previously uncharacterized form of "backstage" oversight supervision. Future work should explore best practices in remote EHR use to provide effective supervision and ultimately improve patient safety.
The survey on data format of Earth observation satellite data at JAXA.
NASA Astrophysics Data System (ADS)
Matsunaga, M.; Ikehata, Y.
2017-12-01
JAXA's earth observation satellite data are distributed by a portal web site for search and delivery called "G-Portal". Users can download the satellite data of GPM, TRMM, Aqua, ADEOS-II, ALOS (search only), ALOS-2 (search only), MOS-1, MOS-1b, ERS-1 and JERS-1 from G-Portal. However, the data formats differ by satellite (HDF4, HDF5, NetCDF4, CEOS, etc.), and these formats are not familiar to new data users. Although self-describing formats of the HDF type are very convenient and useful for large datasets, the older-style products are not readable by open GIS tools and do not follow OGC standards. Recently, satellite data have been widely applied to needs such as disaster response, earth resources, monitoring the global environment, Geographic Information Systems (GIS), and so on. In order to remove barriers to the use of earth satellite data by new community users, JAXA has been providing format-converted products such as GeoTIFF or KMZ, and also provides the format conversion tool itself. We investigate trends in data formats for archiving, dissemination, and utilization, then study how to improve the current product formats for users in various application fields and make recommendations for new product formats.
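For readers new to these self-describing formats, a practical first step is simply to inspect a file's hierarchy. Below is a minimal sketch using the h5py library; the granule filename is hypothetical and stands in for any HDF5 product downloaded from G-Portal.

```python
import h5py

def describe(name, obj):
    # Print the path of every object; add shape/dtype for datasets.
    if isinstance(obj, h5py.Dataset):
        print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")
    else:
        print(f"{name}/ (group)")

# "GPM_granule.h5" is a hypothetical local filename.
with h5py.File("GPM_granule.h5", "r") as f:
    f.visititems(describe)  # recursively walk groups and datasets
```

This kind of blind exploration is exactly what self-describing formats enable and what older CEOS-style products lack.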
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.
2017-12-01
Hyperparameter optimization of statistical models, i.e. automated model scoring and selection using methods such as evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and the statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). There, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize the statistical preprocessing of forcing data and improve the goodness-of-fit of statistical models (i.e. feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and will show how NSGA-2 is being used to automate the selection of soil moisture forecast statistical models for North America.
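As a concrete illustration of automated model scoring and selection, not of the Elm API itself, the sketch below runs a randomized hyperparameter search with scikit-learn; the forcing and target arrays are synthetic placeholders.

```python
import numpy as np
from scipy.stats import loguniform
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import RandomizedSearchCV

# Placeholder arrays standing in for forcing features and
# observed soil moisture targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = rng.normal(size=500)

# Sample candidate model structures and regularization settings;
# each candidate is scored by cross-validated goodness-of-fit.
search = RandomizedSearchCV(
    GradientBoostingRegressor(),
    param_distributions={
        "n_estimators": [50, 100, 200],
        "max_depth": [2, 3, 4],
        "learning_rate": loguniform(1e-3, 1e-1),
    },
    n_iter=20,
    cv=5,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Evolutionary searches such as NSGA-2 replace the random sampling step with selection and mutation over generations, but the score-and-select loop is the same.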
A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.
2017-12-01
Cloud based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable by in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective but also shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
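The claimed API compatibility can be pictured with the h5pyd client library, which mirrors the h5py interface while issuing REST calls to an object-storage-backed service; both the endpoint URL and the server-side file path below are hypothetical.

```python
import h5pyd  # drop-in analogue of h5py for a RESTful HDF service

# Both the endpoint URL and the server-side path are hypothetical.
f = h5pyd.File(
    "/shared/nasa/sample_dataset.h5",
    "r",
    endpoint="http://hsds.example.org:5101",
)
dset = f["temperature"]   # behaves like an h5py.Dataset (2-D here by assumption)
print(dset.shape, dset.dtype)
print(dset[0:10, 0:10])   # slicing triggers ranged reads over HTTP
f.close()
```

Because only the requested slices travel over the network, existing analysis scripts can often be pointed at the service with little more than a changed import.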
FastQuery: A Parallel Indexing System for Scientific Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Prabhat,
2011-07-29
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit can significantly improve access to these datasets by augmenting the user data with indexes and other secondary information. However, a challenge is that the indexes assume the relational data model while scientific data generally follow the array data model. To match the two data models, we design a generic mapping mechanism and implement an efficient input and output interface for reading and writing the data and their corresponding indexes. To take advantage of the emerging many-core architectures, we also develop a parallel strategy for indexing using threading technology. This approach complements our on-going MPI-based parallelization efforts. We demonstrate the flexibility of our software by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using data from a particle accelerator model and a global climate model. We also conducted a detailed performance study using these scientific datasets. The results show that FastQuery speeds up the query time by a factor of 2.5x to 50x, and it reduces the indexing time by a factor of 16 on 24 cores.
Long-term oceanographic observations in Massachusetts Bay, 1989-2006
Butman, Bradford; Alexander, P. Soupy; Bothner, Michael H.; Borden, Jonathan; Casso, Michael A.; Gutierrez, Benjamin T.; Hastings, Mary E.; Lightsom, Frances L.; Martini, Marinna A.; Montgomery, Ellyn T.; Rendigs, Richard R.; Strahle, William S.
2009-01-01
This data report presents long-term oceanographic observations made in western Massachusetts Bay at long-term site A (LT-A) (42 deg 22.6' N., 70 deg 47.0' W.; nominal water depth 32 meters) from December 1989 through February 2006 and long-term site B (LT-B) (42 deg 9.8' N., 70 deg 38.4' W.; nominal water depth 22 meters) from October 1997 through February 2004 (fig. 1). The observations were collected as part of a U.S. Geological Survey (USGS) study designed to understand the transport and long-term fate of sediments and associated contaminants in Massachusetts Bay. The observations include time-series measurements of current, temperature, salinity, light transmission, pressure, oxygen, fluorescence, and sediment-trapping rate. About 160 separate mooring or tripod deployments were made on about 90 research cruises to collect these long-term observations. This report presents a description of the 16-year field program and the instrumentation used to make the measurements, an overview of the data set, more than 2,500 pages of statistics and plots that summarize the data, and the digital data in Network Common Data Form (NetCDF) format. This research was conducted by the USGS in cooperation with the Massachusetts Water Resources Authority and the U.S. Coast Guard.
The Hierarchical Data Format as a Foundation for Community Data Sharing
NASA Astrophysics Data System (ADS)
Habermann, T.
2017-12-01
Hierarchical Data Format (HDF) formats and libraries have been used by individual researchers and major science programs across many Earth and Space Science disciplines and sectors to provide high-performance information storage and access for several decades. Generic group, dataset, and attribute objects in HDF have been combined in many ways to form domain objects that scientists understand and use. Well-known applications of HDF in the Earth Sciences include thousands of global satellite observations and products produced by NASA's Earth Observing System using the HDF-EOS conventions, navigation-quality bathymetry produced as Bathymetric Attributed Grids (BAGs) by the OpenNavigationSurface project and others, seismic wave collections written into the Adaptable Seismic Data Format (ASDF), and many oceanographic and atmospheric products produced using the Climate and Forecast (CF) conventions with the netCDF4 data model and API on top of HDF5. This is the modus operandi of these communities: 1) develop a model of scientific data objects and associated metadata used in a domain, 2) implement that model using HDF, 3) develop software libraries that connect that model to tools, and 4) encourage adoption of those tools in the community. Understanding these domain object implementations and facilitating communication across communities is an important goal of The HDF Group. We will discuss these examples and approaches to community outreach during this session.
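The composition of generic HDF5 objects into a domain object can be sketched in a few lines of h5py; the file, group, variable, and attribute names below are purely illustrative.

```python
import numpy as np
import h5py

# Compose generic HDF5 groups, datasets, and attributes into a tiny
# domain object, here an illustrative gridded-observation record.
with h5py.File("example_domain.h5", "w") as f:
    grid = f.create_group("grid")                          # group
    elev = grid.create_dataset(
        "elevation", data=np.zeros((180, 360), dtype="f4")  # dataset
    )
    elev.attrs["units"] = "meters"                          # attributes carry
    elev.attrs["convention"] = "illustrative-only"          # domain semantics
```

Conventions such as HDF-EOS or CF are, at bottom, agreements about which groups, datasets, and attributes must exist and what they mean.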
Cloud-Enabled Climate Analytics-as-a-Service using Reanalysis data: A case study.
NASA Astrophysics Data System (ADS)
Nadeau, D.; Duffy, D.; Schnase, J. L.; McInerney, M.; Tamkin, G.; Potter, G. L.; Thompson, J. H.
2014-12-01
The NASA Center for Climate Simulation (NCCS) maintains advanced data capabilities and facilities that allow researchers to access the enormous volume of data generated by weather and climate models. The NASA Climate Model Data Service (CDS) and the NCCS are merging their efforts to provide Climate Analytics-as-a-Service for the comparative study of the major reanalysis projects: ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, JMA JRA25, and JRA55. These reanalyses were repackaged into the NetCDF4 file format following the CMIP5 Climate and Forecast (CF) metadata convention before being sequenced into the Hadoop Distributed File System (HDFS). A small set of operations that represent a common starting point in many analysis workflows was then created: min, max, sum, count, variance, and average. In this example, reanalysis data exploration was performed using Hadoop MapReduce, and access was provided through the Climate Data Service (CDS) application programming interface (API) created at NCCS, which provides uniform treatment of large amounts of data. In this case study, we limited our exploration to two variables, temperature and precipitation, using three operations (min, max, and avg) and 30 years of reanalysis data for three regions of the world: global, polar, and subtropical.
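The primitive operations have straightforward serial analogues. The sketch below, using xarray with a hypothetical CF-compliant file and variable name, stands in for (rather than reproduces) the Hadoop MapReduce implementation.

```python
import xarray as xr

# Hypothetical CF-compliant NetCDF4 reanalysis file with a 'tas'
# (near-surface air temperature) variable on a lat/lon grid.
ds = xr.open_dataset("reanalysis_tas.nc4")
tas = ds["tas"]

# Global statistics over the full record.
print(float(tas.min()), float(tas.max()), float(tas.mean()))

# A polar subset (poleward of 60N) of the same statistic; this
# assumes an ascending 'lat' coordinate.
polar = tas.sel(lat=slice(60, 90))
print(float(polar.mean()))
```

The MapReduce version distributes exactly these reductions across HDFS blocks; the semantics of min/max/avg are unchanged.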
Access to Emissions Distributions and Related Ancillary Data through the ECCAD database
NASA Astrophysics Data System (ADS)
Darras, Sabine; Granier, Claire; Liousse, Catherine; De Graaf, Erica; Enriquez, Edgar; Boulanger, Damien; Brissebrat, Guillaume
2017-04-01
The ECCAD database (Emissions of atmospheric Compounds and Compilation of Ancillary Data) provides user-friendly access to global and regional surface emissions for a large set of chemical compounds and to ancillary data (land use, active fires, burned areas, population, etc.). The emissions inventories are time-series gridded data at spatial resolutions from 1x1 to 0.1x0.1 degrees. ECCAD is the emissions database of the GEIA (Global Emissions InitiAtive) project and a sub-project of the French Atmospheric Data Center AERIS (http://www.aeris-data.fr). ECCAD currently has more than 2200 users originating from more than 80 countries. The project benefits from this large international community of users to expand the number of emission datasets made available. ECCAD provides detailed metadata for each of the datasets and various tools for data visualization, for computing global and regional totals, and for interactive spatial and temporal analysis. The data can be downloaded as interoperable NetCDF CF-compliant files, i.e. the data are compatible with many other client interfaces. The presentation will provide information on the datasets available within ECCAD, as well as examples of the analysis work that can be done online through the website: http://eccad.aeris-data.fr.
Access to Emissions Distributions and Related Ancillary Data through the ECCAD database
NASA Astrophysics Data System (ADS)
Darras, Sabine; Enriquez, Edgar; Granier, Claire; Liousse, Catherine; Boulanger, Damien; Fontaine, Alain
2016-04-01
The ECCAD database (Emissions of atmospheric Compounds and Compilation of Ancillary Data) provides user-friendly access to global and regional surface emissions for a large set of chemical compounds and to ancillary data (land use, active fires, burned areas, population, etc.). The emissions inventories are time-series gridded data at spatial resolutions from 1x1 to 0.1x0.1 degrees. ECCAD is the emissions database of the GEIA (Global Emissions InitiAtive) project and a sub-project of the French Atmospheric Data Center AERIS (http://www.aeris-data.fr). ECCAD currently has more than 2200 users originating from more than 80 countries. The project benefits from this large international community of users to expand the number of emission datasets made available. ECCAD provides detailed metadata for each of the datasets and various tools for data visualization, for computing global and regional totals, and for interactive spatial and temporal analysis. The data can be downloaded as interoperable NetCDF CF-compliant files, i.e. the data are compatible with many other client interfaces. The presentation will provide information on the datasets available within ECCAD, as well as examples of the analysis work that can be done online through the website: http://eccad.aeris-data.fr.
Climate tools in mainstream Linux distributions
NASA Astrophysics Data System (ADS)
McKinstry, Alastair
2015-04-01
Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that are normally ignorable. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, integrating libraries and components (e.g. Python modules) requires planning by their authors: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g. serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
NASA Astrophysics Data System (ADS)
Signell, R. P.; Camossi, E.
2015-11-01
Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free and easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data but virtually aggregates and standardizes the data using the NetCDF Markup Language (NcML). The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
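On the client side, the standardization this brokering provides reduces data access to opening a remote OPeNDAP endpoint as if it were a local file. A minimal sketch with xarray follows (the paper itself uses NCTOOLBOX and Iris); the server URL, variable, and dimension names are hypothetical.

```python
import xarray as xr

# Hypothetical THREDDS/OPeNDAP endpoint exposed by the local broker.
url = "http://thredds.example.org/thredds/dodsC/roms/ocean_model.nc"

# xarray negotiates the OPeNDAP protocol transparently; only the
# requested slices are transferred over the network.
ds = xr.open_dataset(url)
sst = ds["temp"].isel(time=-1, s_rho=-1)  # latest surface field, by assumption
print(float(sst.mean()))
```

The point of the NcML aggregation layer is that this one URL can present many heterogeneous model output files as a single standardized dataset.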
NASA Technical Reports Server (NTRS)
Peng, G.; Meier, W. N.; Scott, D. J.; Savoie, M. H.
2013-01-01
A long-term, consistent, and reproducible satellite-based passive microwave sea ice concentration climate data record (CDR) is available for climate studies, monitoring, and model validation with an initial operation capability (IOC). The daily and monthly sea ice concentration data are on the National Snow and Ice Data Center (NSIDC) polar stereographic grid with nominal 25 km × 25 km grid cells in both the Southern and Northern Hemisphere polar regions from 9 July 1987 to 31 December 2007. The data files are available in the NetCDF data format at http://nsidc.org/data/g02202.html and archived by the National Climatic Data Center (NCDC) of the National Oceanic and Atmospheric Administration (NOAA) under the satellite climate data record program (http://www.ncdc.noaa.gov/cdr/operationalcdrs.html). The description and basic characteristics of the NOAA/NSIDC passive microwave sea ice concentration CDR are presented here. The CDR provides similar spatial and temporal variability as the heritage products to the user communities with the additional documentation, traceability, and reproducibility that meet current standards and guidelines for climate data records. The data set, along with detailed data processing steps and error source information, can be found at http://dx.doi.org/10.7265/N5B56GN3.
Web-based visualization of gridded datasets using OceanBrowser
NASA Astrophysics Data System (ADS)
Barth, Alexander; Watelet, Sylvain; Troupin, Charles; Beckers, Jean-Marie
2015-04-01
OceanBrowser is a web-based visualization tool for gridded oceanographic data sets. Those data sets are typically four-dimensional (longitude, latitude, depth and time). OceanBrowser allows one to visualize horizontal sections at a given depth and time to examine the horizontal distribution of a given variable. It also offers the possibility to display the results on an arbitrary vertical section. To study the evolution of a variable in time, the horizontal and vertical sections can also be animated. Vertical sections can be generated at a fixed distance from the coast or at a fixed ocean depth. The user can customize the plot by changing the color map, the range of the color bar and the type of the plot (linearly interpolated color, simple contours, filled contours), and can download the current view as a simple image or as a Keyhole Markup Language (KML) file for visualization in applications such as Google Earth. The data products can also be accessed as NetCDF files and through OPeNDAP. Third-party layers from a web map service can also be integrated. OceanBrowser is used in the frame of the SeaDataNet project (http://gher-diva.phys.ulg.ac.be/web-vis/) and EMODNET Chemistry (http://oceanbrowser.net/emodnet/) to distribute gridded data sets interpolated from in situ observations using DIVA (Data-Interpolating Variational Analysis).
Cloud Property Retrieval Products for Graciosa Island, Azores
Dong, Xiquan
2014-05-05
The motivation for developing this product was to use the Dong et al. 1998 method to retrieve cloud microphysical properties, such as cloud droplet effective radius, cloud droplet number concentration, and optical thickness. These retrieved properties have been used to validate satellite retrievals and to evaluate climate simulations and reanalyses. We had been using this method to retrieve cloud microphysical properties over the ARM SGP and NSA sites, and we also modified the method for the AMF at Shouxian, China and for some IOPs, e.g. the ARM IOP at SGP in March 2000. The ARSCL data from the ARM data archive over the SGP and NSA have been used to determine cloud boundaries and cloud phase. For those ARM permanent sites, the ARSCL data were developed from MMCR measurements; however, no such data were available for the Azores field campaign. We followed the same steps to generate this derived product and also included the MPLCMASK cloud retrievals to determine the most accurate cloud boundaries, including the thin cirrus clouds that the WACR may under-detect. We use these as input to retrieve the cloud microphysical properties. Because of the different temporal resolutions of the derived cloud boundary heights product and the cloud properties product, we submit them as two separate NetCDF files.
NASA Technical Reports Server (NTRS)
Cullather, Richard; Bosilovich, Michael
2017-01-01
The Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) is a global atmospheric reanalysis produced by the NASA Global Modeling and Assimilation Office (GMAO). It spans the satellite observing era from 1980 to the present. The goals of MERRA-2 are to provide a regularly gridded, homogeneous record of the global atmosphere and to incorporate additional aspects of the climate system, including trace gas constituents (stratospheric ozone), improved land surface representation, and cryospheric processes. MERRA-2 is also the first satellite-era global reanalysis to assimilate space-based observations of aerosols and represent their interactions with other physical processes in the climate system. The inclusion of these additional components is consistent with the overall objectives of an Integrated Earth System Analysis (IESA). MERRA-2 is intended to replace the original MERRA product and reflects recent advances in atmospheric modeling and data assimilation. Modern hyperspectral radiance and microwave observations, along with GPS radio occultation and NASA ozone datasets, are now assimilated in MERRA-2. Much of the structure of the data files remains the same in MERRA-2. While the original MERRA data format was HDF-EOS, the MERRA-2 supplied binary data format is now NetCDF4 (with lossy compression to save space).
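Reading a MERRA-2 granule therefore requires only a standard NetCDF4 library. The sketch below uses netCDF4-python; the granule name and variable are illustrative of the MERRA-2 naming pattern rather than a guaranteed product.

```python
from netCDF4 import Dataset

# Hypothetical MERRA-2 granule name; the collection and date are
# illustrative only.
with Dataset("MERRA2_400.tavg1_2d_slv_Nx.20150101.nc4", "r") as nc:
    print(nc.data_model)          # expected: NETCDF4
    t2m = nc.variables["T2M"]     # assumed 2-m air temperature (time, lat, lon)
    print(t2m.shape)
    first_hour = t2m[0, :, :]     # decompression is transparent on read
```

The lossy compression is applied at write time; clients see ordinary floating-point arrays and need no special handling.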
Pacemaker remote monitoring in the pediatric population: is it a real solution?
Leoni, Loira; Padalino, Massimo; Biffanti, Roberta; Ferretto, Sonia; Vettor, Giulia; Corrado, Domenico; Stellin, Giovanni; Milanesi, Ornella; Iliceto, Sabino
2015-05-01
Clinical utility of remote monitoring of implantable cardiac devices has previously been demonstrated in several trials in the adult population. The aim of this study was to assess the clinical utility of remote monitoring in a pediatric population undergoing pacemaker implantation. The study population included 73 consecutive pediatric patients who received an implantable pacemaker. The remote device check was programmed for every 3 months and all patients had a yearly out-patient visit. Data on device-related events, hospitalization, and other clinical information were collected during remote checks and out-patient visits. During a mean follow-up of 18 ± 10 months, 470 remote transmissions were collected and analyzed. Two deaths were reported. Eight transmissions (1.7%) triggered an urgent out-patient visit. Twenty percent of transmissions reported evidence of significant clinical or technical events. All young patients and their families were very satisfied when using remote monitoring to replace out-patient visits. The ease of use, together with the satisfaction and acceptance of remote monitoring in pediatric patients, produced very good results. The remote management of our pediatric population was safe, and remote monitoring adequately replaced the periodic out-patient device checks without compromising patient safety. ©2015 Wiley Periodicals, Inc.
Lim, Paul Chun Yih; Lee, Audry Shan Yin; Chua, Kelvin Chi Ming; Lim, Eric Tien Siang; Chong, Daniel Thuan Tee; Tan, Boon Yew; Ho, Kah Leng; Teo, Wee Siong; Ching, Chi Keong
2016-07-01
Remote monitoring of cardiac implantable electronic devices (CIED) has been shown to improve patient safety and reduce in-office visits. We report our experience with remote monitoring via the Medtronic CareLink® network. Patients were followed up for six months with scheduled monthly remote monitoring transmissions in addition to routine in-office checks. The efficacy of remote monitoring was evaluated by recording compliance with transmissions, the number of device alerts requiring intervention, and the time from transmission to review. Questionnaires were administered to evaluate the experiences of patients, physicians and medical technicians. A total of 57 patients were enrolled; 16 (28.1%) had permanent pacemakers, 34 (59.6%) had implantable cardioverter defibrillators and 7 (12.3%) had cardiac resynchronisation therapy defibrillators. Overall, of 334 remote transmissions scheduled, 73.7% were on time, 14.5% were overdue and 11.8% were missed. 84.6% of wireless transmissions were on time, compared to 53.8% of non-wireless transmissions. Among all transmissions, 4.4% contained alerts for which physicians were informed and only 1.8% required intervention. 98.6% of remote transmissions were reviewed by the second working day. 73.2% of patients preferred remote monitoring. Physicians agreed that remote transmissions provided information equivalent to in-office checks 97.1% of the time. 77.8% of medical technicians felt that remote monitoring would help the hospital improve patient management. No adverse events were reported. Remote monitoring of CIED is safe and feasible. It has possible benefits to patient safety through earlier detection of arrhythmias or device malfunction, permitting earlier intervention. Wireless remote monitoring, in particular, may improve compliance with device monitoring. Patients may prefer remote monitoring due to possible improvements in quality of life. Copyright: © Singapore Medical Association.
Lim, Paul Chun Yih; Lee, Audry Shan Yin; Chua, Kelvin Chi Ming; Lim, Eric Tien Siang; Chong, Daniel Thuan Tee; Tan, Boon Yew; Ho, Kah Leng; Teo, Wee Siong; Ching, Chi Keong
2016-01-01
INTRODUCTION Remote monitoring of cardiac implantable electronic devices (CIED) has been shown to improve patient safety and reduce in-office visits. We report our experience with remote monitoring via the Medtronic CareLink® network. METHODS Patients were followed up for six months with scheduled monthly remote monitoring transmissions in addition to routine in-office checks. The efficacy of remote monitoring was evaluated by recording compliance to transmissions, number of device alerts requiring intervention and time from transmission to review. Questionnaires were administered to evaluate the experiences of patients, physicians and medical technicians. RESULTS A total of 57 patients were enrolled; 16 (28.1%) had permanent pacemakers, 34 (59.6%) had implantable cardioverter defibrillators and 7 (12.3%) had cardiac resynchronisation therapy defibrillators. Overall, of 334 remote transmissions scheduled, 73.7% were on time, 14.5% were overdue and 11.8% were missed. 84.6% of wireless transmissions were on time, compared to 53.8% of non-wireless transmissions. Among all transmissions, 4.4% contained alerts for which physicians were informed and only 1.8% required intervention. 98.6% of remote transmissions were reviewed by the second working day. 73.2% of patients preferred remote monitoring. Physicians agreed that remote transmissions provided information equivalent to in-office checks 97.1% of the time. 77.8% of medical technicians felt that remote monitoring would help the hospital improve patient management. No adverse events were reported. CONCLUSION Remote monitoring of CIED is safe and feasible. It has possible benefits to patient safety through earlier detection of arrhythmias or device malfunction, permitting earlier intervention. Wireless remote monitoring, in particular, may improve compliance to device monitoring. Patients may prefer remote monitoring due to possible improvements in quality of life. PMID:27439396
A national study into the rural and remote pharmacist workforce.
Smith, Janie D; White, Col; Roufeil, Louise; Veitch, Craig; Pont, Lisa; Patel, Bhavini; Battye, Kristine; Luetsch, Karen; Mitchell, Chris
2013-01-01
As for many health professionals, distance presents an enormous challenge to pharmacists working in rural and remote Australia. Previous studies have identified issues relating to the size of the rural and remote pharmacist workforce, and a number of national initiatives have been implemented to promote the recruitment and retention of pharmacists in rural and remote locations. The aim of this study was to explore and describe the current rural and remote pharmacy workforce, and to identify barriers and drivers influencing rural and remote pharmacy practice. A mixed-methods approach was used, which comprised a qualitative national consultation and a quantitative rural and remote pharmacist workforce survey. Semi-structured interviews (n=83) and focus groups (n=15, 143 participants) were conducted throughout Australia in 2009 with stakeholders with an interest in rural and remote pharmacy, practising rural/remote pharmacists and pharmacy educators, as well as with peak pharmacy organizations, to explore the issues associated with rural/remote practice. Based on the findings of the qualitative work, a 45-item survey was developed to further explore the relevance of the issues identified in the qualitative consultation. All registered Australian pharmacists practising in non-urban locations (RRMA 3-7, n=3,300) were identified and invited to participate in the study, with a response rate of 23.4%. The main themes identified from the qualitative consultation were the impact of national increases in the pharmacist workforce on rural/remote practice; the role of the regional pharmacy schools in contributing to the rural/remote workforce; and the perceptions of differences in pharmacist roles in rural/remote practice. The survey indicated that pharmacists practising in rural and remote locations were older than the national average (55.8 years versus 40 years). Differences in their professional role were seen in different pharmacy sectors, with hospital pharmacists spending significantly more time on the delivery of professional services and education and teaching, but less time on medication supply, than community pharmacists. Rural/remote pharmacists were generally found to be satisfied with their current role. The main 'satisfiers' reported were task variety, customer appreciation, use of advanced skills, appropriate remuneration, happiness in their work location, sound relationships with other pharmacists, a happy team and relationships with other health professionals. This study described the distribution, roles and factors affecting rural and remote pharmacy practice. While the results presented provide an extensive overview of the rural/remote workforce, a comparable national study comparing rural/remote and urban pharmacists would further contribute to this discussion. Knowledge of why pharmacists choose to work in a particular geographical location, or why they choose to leave, would further enrich our understanding of what drives and sustains the rural/remote pharmacist workforce.
Development of wide area environment accelerator operation and diagnostics method
NASA Astrophysics Data System (ADS)
Uchiyama, Akito; Furukawa, Kazuro
2015-08-01
Remote operation and diagnostic systems for particle accelerators have been developed for beam operation and maintenance in various situations. Even where fully remote experiments are not required, remote diagnosis and maintenance of the accelerator are still needed. Considering remote-operation operator interfaces (OPIs), the use of standard protocols such as the hypertext transfer protocol (HTTP) is advantageous, because system-dependent protocols are unnecessary between the remote client and the on-site server. Here, we have developed a client system based on WebSocket, a protocol standardized by the Internet Engineering Task Force for Web-based systems, as a next-generation Web-based OPI using the Experimental Physics and Industrial Control System (EPICS) Channel Access protocol. As a result of this implementation, WebSocket-based client systems have become available for remote operation. In practical application, the remote operation of an accelerator via a wide area network (WAN) faces a number of challenges; for example, the accelerator is both an experimental device and a radiation generator, and any error in remote control system operation could result in an immediate breakdown. Therefore, we propose the implementation of an operator intervention system for remote accelerator diagnostics and support that can obviate any differences between the local control room and remote locations. Here, remote-operation Web-based OPIs, which address security issues, are developed.
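To make the Web-based OPI idea concrete, below is a minimal sketch of such a client in Python, assuming a hypothetical gateway that bridges WebSocket messages to EPICS Channel Access; the endpoint URL, the JSON message shape, and the process-variable name are all illustrative assumptions, not the authors' actual protocol.

```python
# Minimal sketch of a WebSocket-based OPI client (hypothetical endpoint and
# message format; the paper's actual CA-to-WebSocket bridge is not reproduced).
import asyncio
import json

import websockets  # third-party: pip install websockets


async def monitor_pv(uri: str, pv_name: str) -> None:
    """Subscribe to an EPICS process variable through an assumed WebSocket gateway."""
    async with websockets.connect(uri) as ws:
        # Ask the (assumed) gateway to start monitoring the PV.
        await ws.send(json.dumps({"type": "subscribe", "pv": pv_name}))
        # Print each value update pushed by the gateway.
        async for message in ws:
            update = json.loads(message)
            print(f"{update.get('pv')} = {update.get('value')}")


if __name__ == "__main__":
    # Both the URI and the PV name are placeholders for illustration.
    asyncio.run(monitor_pv("wss://accelerator.example.org/ws", "BEAM:CURRENT"))
```

Because WebSocket runs over standard HTTP(S) ports, a client like this needs no system-dependent protocol between the remote site and the on-site server, which is exactly the advantage the abstract highlights.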
Remote programming of cochlear implants: a telecommunications model.
McElveen, John T; Blackburn, Erin L; Green, J Douglas; McLear, Patrick W; Thimsen, Donald J; Wilson, Blake S
2010-09-01
Evaluate the effectiveness of remote programming for cochlear implants. Retrospective review of the cochlear implant performance for patients who had undergone mapping and programming of their cochlear implant via remote connection through the Internet. Postoperative Hearing in Noise Test and Consonant/Nucleus/Consonant word scores for 7 patients who had undergone remote mapping and programming of their cochlear implant were compared with the mean scores of 7 patients who had been programmed by the same audiologist over a 12-month period. Times required for remote and direct programming were also compared. The quality of the Internet connection was assessed using standardized measures. Remote programming was performed via a virtual private network, with a separate software program used for video and audio linkage. All 7 patients were programmed successfully via remote connectivity. No untoward patient experiences were encountered. No statistically significant differences could be found in comparing postoperative Hearing in Noise Test and Consonant/Nucleus/Consonant word scores for patients who had undergone remote programming versus a similar group of patients who had their cochlear implant programmed directly. Remote programming did not require a significantly longer programming time for the audiologist with these 7 patients. Remote programming of a cochlear implant can be performed safely without any deterioration in the quality of the programming. This ability to remotely program cochlear implant patients offers the potential to extend cochlear implantation to underserved areas in the United States and elsewhere.
Operational programs in forest management and priority in the utilization of remote sensing
NASA Technical Reports Server (NTRS)
Douglass, R. W.
1978-01-01
A speech is given on operational remote sensing programs in forest management, and the importance of remote sensing in forestry is emphasized. Forest Service priorities in using remote sensing are outlined.
Single transmission line interrogated multiple channel data acquisition system
Fasching, George E.; Keech, Jr., Thomas W.
1980-01-01
A single transmission line interrogated multiple channel data acquisition system is provided in which a plurality of remote station/sensor circuits each monitors a specific process variable and each transmits measurement values over a single transmission line to a master interrogating station when addressed by said master interrogating station. Typically, as many as 330 remote stations may be connected in parallel to the transmission line, which may exceed 7,000 feet in length. The interrogation rate is typically 330 stations/second. The master interrogating station samples each station according to a shared, charging transmit-receive cycle. All remote station address signals, all data signals from the remote stations/sensors and all power for the remote station/sensor circuits are transmitted via a single continuous terminated coaxial cable. A means is provided for periodically and remotely calibrating all remote sensors for zero and span. A provision is available to remotely disconnect any selected sensor station from the main transmission line.
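As a rough illustration of the interrogation scheme (not the patented circuit), here is a toy Python simulation in which a master station polls numerically addressed stations over a shared bus; timing, power delivery over the coax, and remote calibration are omitted, and all names are invented.

```python
# Toy simulation of the master/remote polling cycle described above: one shared
# "bus", numeric station addresses, and one reply per interrogation.
import random


class RemoteStation:
    def __init__(self, address: int):
        self.address = address

    def respond(self, polled_address: int):
        # Only the addressed station answers on the shared line.
        if polled_address == self.address:
            return {"address": self.address, "value": random.uniform(0.0, 5.0)}
        return None


def interrogate_all(stations):
    """One full polling cycle: address each station in turn, collect replies."""
    readings = []
    for address in range(1, len(stations) + 1):
        for station in stations:
            reply = station.respond(address)
            if reply is not None:
                readings.append(reply)
    return readings


bus = [RemoteStation(addr) for addr in range(1, 331)]  # up to 330 stations
print(len(interrogate_all(bus)), "readings collected in one cycle")
```

At the patent's quoted rate of 330 stations/second, one such cycle would sweep the entire line in about a second.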
Kampik, Timotheus; Larsen, Frank; Bellika, Johan Gustav
2015-01-01
The objective of the study was to identify experiences and attitudes of German and Norwegian general practitioners (GPs) towards Internet-based remote consultation solutions supporting communication between GPs and patients in the context of the German and Norwegian healthcare systems. Interviews with four German and five Norwegian GPs were conducted. The results were qualitatively analyzed. All interviewed GPs stated they would like to make use of Internet-based remote consultations in the future. Current experience with remote consultations is limited; no GP reported using a comprehensive remote consultation solution. The main features GPs would like to see in a remote consultation solution include asynchronous exchange of text messages, video conferencing with text chat, scheduling of remote consultation appointments, secure login and data transfer, and the integration of the remote consultation solution into the GP's EHR system.
Bradetich, Ryan; Dearien, Jason A; Grussling, Barry Jakob; Remaley, Gavin
2013-11-05
The present disclosure provides systems and methods for remote device management. According to various embodiments, a local intelligent electronic device (IED) may be in communication with a remote IED via a limited bandwidth communication link, such as a serial link. The limited bandwidth communication link may not support traditional remote management interfaces. According to one embodiment, a local IED may present an operator with a management interface for a remote IED by rendering locally stored templates. The local IED may render the locally stored templates using sparse data obtained from the remote IED. According to various embodiments, the management interface may be a web client interface and/or an HTML interface. The bandwidth required to present a remote management interface may be significantly reduced by rendering locally stored templates rather than requesting an entire management interface from the remote IED. According to various embodiments, an IED may comprise an encryption transceiver.
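The bandwidth saving comes from shipping only field values, not markup, across the serial link. A minimal sketch of that idea follows, using Python's string.Template as a stand-in for the IED's template engine; the template text and field names are hypothetical.

```python
# Sketch of the locally-stored-template idea: the local IED keeps the full HTML
# page on disk and fills it with a small key/value payload fetched from the
# remote IED over the limited-bandwidth serial link.
from string import Template

LOCAL_TEMPLATE = Template(
    "<html><body>"
    "<h1>Remote IED $device_id</h1>"
    "<p>Breaker status: $breaker_status</p>"
    "<p>Line current: $line_current A</p>"
    "</body></html>"
)


def render_management_page(sparse_data: dict) -> str:
    """Render the locally stored template with sparse data from the remote IED."""
    return LOCAL_TEMPLATE.substitute(sparse_data)


# Only these few values would cross the serial link, not the whole page.
page = render_management_page(
    {"device_id": "IED-42", "breaker_status": "CLOSED", "line_current": "118.4"}
)
print(page)
```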
Remote sensing, land use, and demography - A look at people through their effects on the land
NASA Technical Reports Server (NTRS)
Paul, C. K.; Landini, A. J.
1976-01-01
Relevant causes of failure by the remote sensing community in the urban scene are analyzed. The reasons for the insignificant role of remote sensing in urban land use data collection are called the law of realism, the incompatibility of remote sensing and urban management system data formats is termed the law of nominal/ordinal systems compatibility, and the land use/population correlation dilemma is referred to as the law of missing persons. The study summarizes the three laws of urban land use information for which violations, avoidance, or ignorance have caused the decline of present remote sensing research. Particular attention is given to the rationale for urban land use information and for remote sensing. It is shown that remote sensing of urban land uses compatible with the three laws can be effectively developed by realizing the 10 percent contribution of remote sensing to urban land use planning data collection.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1996-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Remote information service access system based on a client-server-service model
Konrad, A.M.
1997-12-09
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1999-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Remote information service access system based on a client-server-service model
Konrad, A.M.
1996-08-06
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1997-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
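A schematic sketch of the Client-Server-Service decomposition these related records describe: three service functionalities, each split into client, server, and service roles, yield the nine logical components deployed across the local and remote hosts. The placement policy below is a simplified illustration, not the patent's actual deployment (which, for example, merges two components into a single Remote Object Client).

```python
# Schematic rendering of the CSS model: 3 functionalities x 3 roles = 9 logical
# components, distributed over two physical hosts. Names are illustrative only.
from dataclasses import dataclass

ROLES = ("client", "server", "service")
FUNCTIONALITIES = ("human_interface", "starter", "utility")


@dataclass
class LogicalComponent:
    functionality: str
    role: str
    host: str  # "local" or "remote"


def build_css_model() -> list:
    components = []
    for functionality in FUNCTIONALITIES:
        for role in ROLES:
            # A deployment policy decides local vs. remote placement; here, as a
            # simplification, the utility's server/service sides live remotely.
            host = "remote" if (functionality == "utility" and role != "client") else "local"
            components.append(LogicalComponent(functionality, role, host))
    return components


for c in build_css_model():
    print(f"{c.functionality}.{c.role} -> {c.host} host")
```

The point of the decomposition is that the user only ever touches components on the local host, which is what sustains the illusion that the remote utility service resides locally.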
NASA Technical Reports Server (NTRS)
1991-01-01
The proceedings contain papers discussing the state-of-the-art exploration, engineering, and environmental applications of geologic remote sensing, along with the research and development activities aimed at increasing the future capabilities of this technology. The following topics are addressed: spectral geology, U.S. and international hydrocarbon exploration, radar and thermal infrared remote sensing, engineering geology and hydrogeology, mineral exploration, remote sensing for marine and environmental applications, image processing and analysis, geobotanical remote sensing, and data integration and geographic information systems. Particular attention is given to spectral alteration mapping with imaging spectrometers, mapping the coastal plain of the Congo with airborne digital radar, applications of remote sensing techniques to the assessment of dam safety, remote sensing of ferric iron minerals as guides for gold exploration, principal component analysis for alteration mapping, and the application of remote sensing techniques for gold prospecting in the north Fujian province.
Methods of training the graduate level and professional geologist in remote sensing technology
NASA Technical Reports Server (NTRS)
Kolm, K. E.
1981-01-01
Requirements for a basic course in remote sensing to accommodate the needs of the graduate level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands-on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.
Remote sensing of Earth terrain
NASA Technical Reports Server (NTRS)
Kong, J. A.
1993-01-01
A progress report on remote sensing of Earth terrain covering the period from January to June 1993 is presented. Areas of research include: radiative transfer model for active and passive remote sensing of vegetation canopy; polarimetric thermal emission from rough ocean surfaces; polarimetric passive remote sensing of ocean wind vectors; polarimetric thermal emission from periodic water surfaces; layer model with random spheroidal scatterers for remote sensing of vegetation canopy; application of theoretical models to active and passive remote sensing of saline ice; radiative transfer theory for polarimetric remote sensing of pine forest; scattering of electromagnetic waves from a dense medium consisting of correlated Mie scatterers with size distributions and applications to dry snow; variance of phase fluctuations of waves propagating through a random medium; polarimetric signatures of a canopy of dielectric cylinders based on first and second order vector radiative transfer theory; branching model for vegetation; polarimetric passive remote sensing of periodic surfaces; composite volume and surface scattering model; and radar image classification.
Remote sensing by satellite - Technical and operational implications for international cooperation
NASA Technical Reports Server (NTRS)
Doyle, S. E.
1976-01-01
International cooperation in the U.S. Space Program is discussed and related to the NASA program for remote sensing of the earth. Satellite remote sensing techniques are considered along with the selection of the best sensors and wavelength bands. The technology of remote sensing satellites is considered with emphasis on the Landsat system configuration. Future aspects of remote sensing satellites are considered.
Remote sensing in operational range management programs in Western Canada
NASA Technical Reports Server (NTRS)
Thompson, M. D.
1977-01-01
A pilot program carried out in Western Canada to test remote sensing under semi-operational conditions and demonstrate its applicability to operational range management programs is described. Four agencies were involved in the program, two in Alberta and two in Manitoba. Each had different objectives and needs for remote sensing within its range management programs, and each was generally unfamiliar with remote sensing techniques and their applications. Personnel with experience and expertise in the remote sensing and range management fields worked with the agency personnel through every phase of the pilot program. Results indicate that these agencies have found remote sensing to be a cost-effective tool and will begin to utilize remote sensing in their operational work during ensuing seasons.
Remote Monitoring of Cardiac Implantable Electronic Devices.
Cheung, Christopher C; Deyell, Marc W
2018-01-08
Over the past decade, technological advancements have transformed the delivery of care for arrhythmia patients. From early transtelephonic monitoring to new devices capable of wireless and cellular transmission, remote monitoring has revolutionized device care. In this article, we review the current evolution and evidence for remote monitoring in patients with cardiac implantable electronic devices. From passive transmission of device diagnostics to active transmission of patient- and device-triggered alerts, remote monitoring can shorten the time to diagnosis and treatment. Studies have shown that remote monitoring can reduce hospitalization and emergency room visits, and improve survival. Remote monitoring can also reduce health care costs, while providing increased access to patients living in rural or marginalized communities. Unfortunately, as many as two-thirds of patients with remote monitoring-capable devices do not use, or are not offered, this feature. Current guidelines recommend remote monitoring and interrogation, combined with annual in-person evaluation, in all cardiac device patients. Remote monitoring should be considered in all eligible device patients and should be considered standard of care. Copyright © 2018 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
SAR Altimetry Processing on Demand Service for Cryosat-2 and Sentinel-3 at ESA G-Pod
NASA Astrophysics Data System (ADS)
Dinardo, Salvatore; Benveniste, Jérôme; Ambrózio, Américo; Restano, Marco
2016-07-01
The G-POD SARvatore service for the exploitation of CryoSat-2 data was designed and developed by the Altimetry Team at ESA-ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The G-POD service, coined SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation), is a web platform that allows any scientist to process CryoSat-2 SAR/SARin data on-line, on-demand and with a user-selectable configuration, from L1a (FBR) data products up to SAR/SARin Level-2 geophysical data products. The processor takes advantage of the G-POD (Grid Processing On Demand) distributed computing platform (350 CPUs in ~70 working nodes) to deliver output data products in a timely manner and to interface with the ESA-ESRIN FBR data archive (155'000 SAR passes and 41'000 SARin passes). The output data products are generated in standard NetCDF format (using the CF Convention), and are therefore compatible with the Multi-Mission Radar Altimetry Toolbox (BRAT) and other NetCDF tools. Using the G-POD graphical interface, it is straightforward to select a geographical area of interest within the time frame of CryoSat-2 SAR/SARin FBR data product availability in the service catalogue. The processor prototype is versatile, allowing users to customize and adapt the processing to their specific requirements by setting a list of configurable options. After task submission, users can follow, in real time, the status of the processing, which can be lengthy owing to the intense number-crunching inherent in SAR processing. From the web interface, users can choose to generate experimental SAR data products such as stack data and RIP (Range Integrated Power) waveforms. The processing service, initially developed to support the awarded development contracts by comparing the deliverables against ESA's prototype, is now made available to the worldwide SAR altimetry community for research and development experiments, for on-site demonstrations/training in training courses and workshops, for cross-comparison with third-party products (e.g. CLS/CNES CPP or ESA SAR COP data products), for the preparation of the Sentinel-3 Surface Topography Mission, for producing data and graphics for publications, etc. Initially, the processing was designed and uniquely optimized for open ocean studies, based on the SAMOSA model developed for the Sentinel-3 Ground Segment using CryoSat data (Cotton et al., 2008; Ray et al., 2014). Since June 2015, however, a new retracker (SAMOSA+) has been offered within the service as a dedicated retracker for the coastal zone, inland water and sea-ice/ice-sheets. In view of the Sentinel-3 launch, a new flavor of the service will be initiated, exclusively dedicated to the processing of Sentinel-3 mission data products. The scope of this new service will be to maximize the exploitation of the upcoming Sentinel-3 Surface Topography Mission's data over all surfaces. The service is open and free of charge (supported by the ESA SEOM Programme Element) for worldwide scientific applications and is available at https://gpod.eo.esa.int/services/CRYOSAT_SAR/
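Since the service delivers CF-convention NetCDF, a user can inspect an output product with any standard NetCDF tool. Below is a minimal sketch using the netCDF4 Python package; the file name is a placeholder and the presence of a Conventions attribute is an expectation based on the abstract, since the actual product layout is defined by the service.

```python
# Minimal sketch of inspecting a downloaded SARvatore NetCDF output product.
# The file name and attribute/variable layout are assumptions for illustration.
from netCDF4 import Dataset  # third-party: pip install netCDF4

with Dataset("cryosat2_sar_l2_example.nc") as nc:
    # A CF-compliant product is expected to advertise its convention here.
    print("Conventions:", getattr(nc, "Conventions", "not declared"))
    # List each variable with its dimensions and units, if declared.
    for name, var in nc.variables.items():
        print(name, var.dimensions, getattr(var, "units", "no units"))
```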
Changing knowledge perspective in a changing world: The Adriatic multidisciplinary TDS approach
NASA Astrophysics Data System (ADS)
Bergamasco, Andrea; Carniel, Sandro; Nativi, Stefano; Signell, Richard P.; Benetazzo, Alvise; Falcieri, Francesco M.; Bonaldo, Davide; Minuzzo, Tiziano; Sclavo, Mauro
2013-04-01
The use and exploitation of the marine environment has intensified in recent years, calling for better description, monitoring and understanding of its behavior. However, marine scientists and managers often spend too much time accessing and reformatting data instead of focusing on discovering new knowledge from the processes observed and the data acquired. There is therefore a need to make our approach to data mining more efficient, especially in a world where rapid climate change demands rapid decisions. In this context, it is essential to explore ways of making large amounts of distributed data usable efficiently and easily, an effort that requires standardized data protocols, web services and standards-based tools. Following the US-IOOS approach, which has been adopted in many oceanographic and meteorological sectors, we present a CNR experience in the direction of setting up a national Italian IOOS framework (at the moment confined to the Adriatic Sea), using the THREDDS (THematic Real-time Environmental Distributed Data Services) Data Server (TDS). A TDS is a middleware designed to fill the gap between data providers and data users; it provides services allowing data users to find the data sets pertaining to their scientific needs and to access, visualize and use them in an easy way, without the need to download files to the local workspace. To achieve this, the data providers must make their data available in a standard form that the TDS understands, with sufficient metadata so that the data can be read and searched for in a standard way. The TDS core is a NetCDF-Java library implementing a Common Data Model (CDM), as developed by Unidata (http://www.unidata.ucar.edu), allowing access to "array-based" scientific data. Climate and Forecast (CF) compliant NetCDF files can be read directly with no modification, while non-compliant files can be modified to meet appropriate metadata requirements. Once standardized in the CDM, the TDS makes datasets available through a series of web services such as OPeNDAP or the Open Geospatial Consortium Web Coverage Service (WCS), allowing data users to easily obtain small subsets from large datasets and to quickly visualize their content using tools such as GODIVA2 or the Integrated Data Viewer (IDV). In addition, an ISO metadata service is available through the TDS that can be harvested by catalogue broker services (e.g. GI-cat) to enable distributed search across federated data servers. Examples of TDS datasets describing oceanographic variables (currents, waves, sediments...) will be described and discussed; some examples can be accessed directly at the Venice site http://tds.ve.ismar.cnr.it:8080/thredds/catalog.html (Bergamasco et al., 2012), also within the framework of the RITMARE Project. References Bergamasco A., Benetazzo A., Carniel S., Falcieri F., Minuzzo T., Signell R.P. and M. Sclavo, 2012. From interoperability to knowledge discovery using large model datasets in the marine environment: the THREDDS Data Server example. Advances in Oceanography and Limnology, 3(1), 41-50. DOI:10.1080/19475721.2012.669637
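The practical payoff of the TDS approach is server-side subsetting: a data user opens a dataset through OPeNDAP and transfers only the slice of interest, never the whole file. A minimal sketch follows; the catalogue URL and variable name are placeholders, not an actual ISMAR endpoint.

```python
# Sketch of the data-user side of a THREDDS Data Server: open a dataset via its
# OPeNDAP endpoint and pull a small subset. URL and variable name are invented.
from netCDF4 import Dataset  # third-party: pip install netCDF4

url = "http://tds.example.org/thredds/dodsC/adriatic/forecast.nc"  # hypothetical
with Dataset(url) as nc:
    sst = nc.variables["sea_surface_temperature"]  # assumed CF-named variable
    # Only this slice travels over the network; the full array stays on the server.
    patch = sst[0, 100:110, 200:210]
    print(patch.shape, getattr(sst, "units", "unknown units"))
```

This is the "no download to the local workspace" workflow the abstract describes: the same URL also serves GODIVA2 or IDV for quick visualization.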
High speed polling protocol for multiple node network
NASA Technical Reports Server (NTRS)
Kirkham, Harold (Inventor)
1995-01-01
The invention is a multiple interconnected network of intelligent message-repeating remote nodes which employs a remote node polling process in which a master node transmits a polling message generically addressed to all remote nodes associated with that master node. Each remote node responds upon receipt of the generically addressed polling message by transmitting a poll-answering informational message and by relaying the polling message to other adjacent remote nodes.
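The generically addressed poll behaves like a breadth-first flood: every node that hears it answers once and relays it onward. The toy Python model below captures just that propagation, with a seen-set standing in for whatever duplicate suppression the real network uses; the topology and names are invented.

```python
# Toy model of a generically addressed poll relayed node-to-node (BFS flood).
def poll(network: dict, start: str):
    """network maps node -> list of adjacent nodes; returns answering order."""
    answered = []
    frontier = [start]
    seen = {start}
    while frontier:
        node = frontier.pop(0)
        answered.append(node)            # node transmits its poll answer...
        for neighbour in network[node]:  # ...then relays the poll onward
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return answered


mesh = {"master": ["a", "b"], "a": ["master", "c"],
        "b": ["master", "c"], "c": ["a", "b"]}
print(poll(mesh, "master"))  # -> ['master', 'a', 'b', 'c']
```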
Protocol for multiple node network
NASA Technical Reports Server (NTRS)
Kirkham, Harold (Inventor)
1995-01-01
The invention is a multiple interconnected network of intelligent message-repeating remote nodes which employs an antibody recognition message termination process performed by all remote nodes and a remote node polling process performed by other nodes which are master units controlling remote nodes in respective zones of the network assigned to respective master nodes. Each remote node repeats only those messages originated in the local zone, to provide isolation among the master nodes.
Protocol for multiple node network
NASA Technical Reports Server (NTRS)
Kirkham, Harold (Inventor)
1994-01-01
The invention is a multiple interconnected network of intelligent message-repeating remote nodes which employs an antibody recognition message termination process performed by all remote nodes and a remote node polling process performed by other nodes which are master units controlling remote nodes in respective zones of the network assigned to respective master nodes. Each remote node repeats only those messages originated in the local zone, to provide isolation among the master nodes.
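The zone-isolation rule in these two related records reduces to a simple repeat/terminate predicate: a node repeats a message only if it originated in the node's own zone. A minimal sketch, with invented message fields:

```python
# Sketch of zone-based message termination: repeat within the origin zone,
# terminate at the zone boundary. Field names are illustrative only.
def should_repeat(node_zone: str, message: dict) -> bool:
    return message["origin_zone"] == node_zone


msg = {"origin_zone": "zone-1", "payload": "meter reading"}
print(should_repeat("zone-1", msg))  # True: repeated within its own zone
print(should_repeat("zone-2", msg))  # False: terminated at the zone boundary
```

This per-node filtering is what keeps traffic in one master's zone from propagating into another's.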
PROCEEDINGS OF THE FOURTH SYMPOSIUM ON REMOTE SENSING OF ENVIRONMENT; 12, 13, 14 APRIL 1966.
The symposium was conducted as part of a continuing program investigating the field of remote sensing, its potential in scientific research and...information on all aspects of remote sensing, with special emphasis on such topics as needs for remotely sensed data, data management, and the special... remote sensing programs, data acquisition, data analysis and application, and equipment design, were presented. (Author)
Optical Fiber Networks for Remote Fiber Optic Sensors
Fernandez-Vallejo, Montserrat; Lopez-Amo, Manuel
2012-01-01
This paper presents an overview of optical fiber sensor networks for remote sensing. Firstly, the state of the art of remote fiber sensor systems has been considered. We have summarized the great evolution of these systems in recent years; this progress confirms that fiber-optic remote sensing is a promising technology with a wide field of practical applications. Afterwards, the most representative remote fiber-optic sensor systems are briefly explained, discussing their schemes, challenges, pros and cons. Finally, a synopsis of the main factors to take into consideration in the design of a remote sensor system is gathered. PMID:22666011
Introduction to the physics and techniques of remote sensing
NASA Technical Reports Server (NTRS)
Elachi, Charles
1987-01-01
This book presents a comprehensive overview of the basics behind remote-sensing physics, techniques, and technology. The physics of wave/matter interactions, techniques of remote sensing across the electromagnetic spectrum, and the concepts behind remote sensing techniques now established and future ones under development are discussed. Applications of remote sensing are described for a wide variety of earth and planetary atmosphere and surface sciences. Solid surface sensing across the electromagnetic spectrum, ocean surface sensing, basic principles of atmospheric sensing and radiative transfer, and atmospheric remote sensing in the microwave, millimeter, submillimeter, and infrared regions are examined.
Remote observing with the Nickel Telescope at Lick Observatory
NASA Astrophysics Data System (ADS)
Grigsby, Bryant; Chloros, Konstantinos; Gates, John; Deich, William T. S.; Gates, Elinor; Kibrick, Robert
2008-07-01
We describe a project to enable remote observing on the Nickel 1-meter Telescope at Lick Observatory. The purpose was to increase the subscription rate and create more economical means for graduate and undergraduate students to observe with this telescope. The Nickel Telescope resides in a 125-year-old dome on Mount Hamilton. Remote observers may work from any of the University of California (UC) remote observing facilities that have been created to support remote work at both Keck Observatory and Lick Observatory. The project included hardware and software upgrades to enable computer control of all equipment that must be operated by the astronomer; a remote observing architecture that is closely modeled on UCO/Lick's work to implement remote observing between UC campuses and Keck Observatory; new policies to ensure safety of Observatory staff and equipment, while ensuring that the telescope subsystems would be suitably configured for remote use; and new software to enforce the safety-related policies. The results increased the subscription rate from a few nights per month to nearly full subscription, and have spurred the installation of remote observing sites at more UC campuses. Thanks to the increased automation and computer control, local observing has also benefited and is more efficient. Remote observing is now being implemented for the Shane 3-meter telescope.
Kuzovkov, Vladislav; Yanov, Yuri; Levin, Sergey; Bovo, Roberto; Rosignoli, Monica; Eskilsson, Gunnar; Willbas, Staffan
2014-07-01
Remote programming is safe and is well received by health-care professionals and cochlear implant (CI) users. It can be adopted into clinic routine as an alternative to face-to-face programming. Telemedicine allows a patient to be treated anywhere in the world. Although it is a growing field, little research has been published on its application to CI programming. We examined hearing professionals' and CI users' subjective reactions to the remote programming experience, including the quality of the programming and the use of the relevant technology. Remote CI programming was performed in Italy, Sweden, and Russia. Programming sessions had three participants: a CI user, a local host, and a remote expert. After the session, each CI user, local host, and remote expert completed a questionnaire on their experience. In all, 33 remote programming sessions were carried out, resulting in 99 completed questionnaires. The overwhelming majority of study participants responded positively to all aspects of remote programming. CI users were satisfied with the results in 96.9% of the programming sessions; 100% of participants would use remote programming again. Although technical problems were encountered, they did not cause the sessions to be considerably longer than face-to-face sessions.
Point-of-Care Programming for Neuromodulation: A Feasibility Study Using Remote Presence.
Mendez, Ivar; Song, Michael; Chiasson, Paula; Bustamante, Luis
2013-01-01
The expansion of neuromodulation and its indications has resulted in hundreds of thousands of patients with implanted devices worldwide. Because all patients require programming, this growth has created a heavy burden on neuromodulation centers and patients. Remote point-of-care programming may provide patients with real-time access to neuromodulation expertise in their communities. To test the feasibility of remotely programming a neuromodulation device using a remote-presence robot and to determine the ability of an expert programmer to telementor a nonexpert in programming the device. A remote-presence robot (RP-7) was used for remote programming. Twenty patients were randomly assigned to either conventional programming or a robotic session. The expert remotely mentored 10 nurses with no previous experience to program the devices of patients assigned to the remote-presence sessions. Accuracy of programming, adverse events, and satisfaction scores for all participants were assessed. There was no difference in the accuracy or clinical outcomes of programming between the standard and remote-presence sessions. No adverse events occurred in any session. The patients, nurses, and the expert programmer expressed high satisfaction scores with the remote-presence sessions. This study establishes the proof-of-principle that remote programming of neuromodulation devices using telepresence and expert telementoring of an individual with no previous experience to accurately program a device is feasible. We envision a time in the future when patients with implanted devices will have real-time access to neuromodulation expertise from the comfort of their own home.
Education in Environmental Remote Sensing: Potentials and Problems.
ERIC Educational Resources Information Center
Kiefer, Ralph W.; Lillesand, Thomas M.
1983-01-01
Discusses remote sensing principles and applications and the status and needs of remote sensing education in the United States. A summary of the fundamental policy issues that will determine remote sensing's future role in environmental and resource management is included. (Author/BC)
1996-04-08
Development tasks and products of remote sensing ground stations in Europe are exemplified by the In-Sec Corporation and the Schlumberger Industries Corporation. The article presents the main products of these two corporations.
REMOTE: Modem Communicator Program for the IBM personal computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGirt, F.
1984-06-01
REMOTE, a Modem Communicator Program, was developed to provide full duplex serial communication with arbitrary remote computers via either dial-up telephone modems or direct lines. The latest version of REMOTE (documented in this report) was developed for the IBM Personal Computer.
THE EPA REMOTE SENSING ARCHIVE
What would you do if you were faced with organizing 30 years of remote sensing projects that had been haphazardly stored at two separate locations for years and then combined? The EPA Remote Sensing Archive, currently located in Las Vegas, Nevada, contains the remote sensing data and...
Remote observing environment using a KVM-over-IP for the OAO 188 cm telescope
NASA Astrophysics Data System (ADS)
Yanagisawa, Kenshi; Inoue, Goki; Kuroda, Daisuke; Ukita, Nobuharu; Mizumoto, Yoshihiko; Izumiura, Hideyuki
2016-08-01
We have prepared a remote observing environment for the 188 cm telescope at Okayama Astrophysical Observatory. A KVM-over-IP and a VPN gateway are employed as core devices, which offer a reliable, secure and fast link between the on-site and remote sites. We have confirmed that the KVM-over-IP has ideal characteristics for serving the remote observing environment: it is simple for both users and maintainers; access from any platform is available; multiple, simultaneous access is possible; and the maintenance load is small. We also demonstrated that the degradation of observing efficiency specific to remote observing is negligibly small. The remote observing environment has been fully open since semester 2016A; about 30% of the total observing time in the last semester was used for remote observing.
Ohta, Kunio; Kurosawa, Hiroshi; Shiima, Yuko; Ikeyama, Takanari; Scott, James; Hayes, Scott; Gould, Michael; Buchanan, Newton; Nadkarni, Vinay; Nishisaki, Akira
2017-08-01
To assess the effectiveness of pediatric simulation by remote facilitation. We hypothesized that simulation by remote facilitation is more effective than simulation with an on-site facilitator. We defined remote facilitation as a model in which a facilitator remotely (1) introduces simulation-based learning and the simulation environment, (2) runs scenarios, and (3) performs debriefing together with an on-site facilitator. A remote simulation program for medical students during their pediatric rotation was implemented. Groups were allocated to either remote or on-site facilitation depending on the availability of telemedicine technology. Both groups had identical 1-hour simulation sessions with 2 scenarios and debriefing. Their team performance was assessed with a behavioral assessment tool by a trained rater. Perception by students was evaluated with a Likert scale (1-7). Fifteen groups with 89 students participated in simulation by remote facilitation, and 8 groups with 47 students participated in simulation by on-site facilitation. Participant demographics and previous simulation experience were similar. Both groups improved their performance from the first to the second scenario: groups by remote simulation (first [8.5 ± 4.2] vs second [13.2 ± 6.2], P = 0.003), and groups by on-site simulation (first [6.9 ± 4.1] vs second [12.4 ± 6.4], P = 0.056). The performance improvement was not significantly different between the 2 groups (P = 0.94). Faculty evaluation by students was equally high in both groups (7 vs 7; P = 0.65). A pediatric acute care simulation by remote facilitation significantly improved students' performance. In this pilot study, remote facilitation seems as effective as a traditional, locally facilitated simulation. Remote simulation can be a strong alternative method, especially where experienced facilitators are limited.
Research on remote sensing image pixel attribute data acquisition method in AutoCAD
NASA Astrophysics Data System (ADS)
Liu, Xiaoyang; Sun, Guangtong; Liu, Jun; Liu, Hui
2013-07-01
Remote sensing images are widely used in AutoCAD, but AutoCAD lacks remote sensing image processing functions. In this paper, ObjectARX was used as the secondary development tool, combined with the Image Engine SDK, to realize remote sensing image pixel attribute data acquisition in AutoCAD, which provides critical technical support for remote sensing image processing algorithms in the AutoCAD environment.
Bibliography of Remote Sensing Techniques Used in Wetland Research.
1993-01-01
remote sensing technology for detecting changes in wetland environments. This report documents a bibliographic search conducted as part of that work unit on applications of remote sensing techniques in wetland research. Results were used to guide research efforts on the use of remote sensing technology for wetland change detection and assessment. The citations are presented in three appendixes, organized by wetland type, sensor type, and author.... Keywords: Change detection, Wetland assessment, Remote sensing.
Predicting risk of invasive species occurrence - remote-sensing strategies
USDA-ARS?s Scientific Manuscript database
Remote sensing is a means to describe characteristics of an area without physically sampling the area. Remote sensors can be mounted on a satellite, plane, or other airborne structure. Remotely sensed data allow for landscape perspectives on management issues. Sensors measure the electromagnetic ene...
Remote sensing for detecting and mapping whitefly (Bemisia tabaci) infestations
USDA-ARS?s Scientific Manuscript database
Remote sensing technology has long been used for detecting insect infestations on agricultural crops. With recent advances in remote sensing sensors and other spatial information technologies such as Global Position Systems (GPS) and Geographic Information Systems (GIS), remote sensing is finding mo...
Reflections on Earth--Remote-Sensing Research from Your Classroom.
ERIC Educational Resources Information Center
Campbell, Bruce A.
2001-01-01
Points out the uses of remote sensing in different areas, and introduces the program "Reflections on Earth" which provides access to basic and instructional information on remote sensing to students and teachers. Introduces students to concepts related to remote sensing and measuring distances. (YDS)
Remote-Sensing Practice and Potential
1974-05-01
Six essential processes that must be accomplished if use of a remote-sensing system is to result in useful information are defined as problem...to be useful in remote-sensing projects are described. An overview of the current state-of-the-art of remote sensing is presented.
History and future of remote sensing technology and education
NASA Technical Reports Server (NTRS)
Colwell, R. N.
1980-01-01
A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training is described.
NASA Astrophysics Data System (ADS)
Childers, Gina; Jones, M. Gail
2017-02-01
Through partnerships with scientists, students can now conduct research in science laboratories from a distance via remote access technologies. The purpose of this study was to explore factors that contribute to a remote learning environment by documenting high school students' perceptions of science motivation, science identity, and virtual presence during a remote microscopy investigation. Exploratory factor analysis identified 3 factors accounting for 63% of the variance, which suggests that Science Learning Drive (students' perception of their competence and performance in science and intrinsic motivation to do science), Environmental Presence (students' perception of control of the remote technology, sensory and distraction factors in the learning environment, and relatedness to scientists), and Inner Realism Presence (students' perceptions of how real the remote programme is and of being recognised as a science-oriented individual) were factors that contribute to a student's experience during a remote investigation. Motivation, science identity, and virtual presence in remote investigations are explored.