Sample records for extensible grid-based rich

  1. [Analysis on difference of richness of traditional Chinese medicine resources in Chongqing based on grid technology].

    PubMed

    Zhang, Xiao-Bo; Qu, Xian-You; Li, Meng; Wang, Hui; Jing, Zhi-Xian; Liu, Xiang; Zhang, Zhi-Wei; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    When the national and local traditional Chinese medicine resources census is completed, a large volume of data on Chinese medicine resources and their distribution will need to be summarized. Species richness between regions is a valid indicator for objectively comparing inter-regional Chinese medicine resources. However, because county areas differ greatly in size, assessing the richness of traditional Chinese medicine resources with the county as the statistical unit biases the regional richness statistics. Statistical methods based on a regular grid can reduce the differences in apparent richness that arise solely from differences in the size of the statistical unit. Taking Chongqing as an example and using the existing survey data, we compared and analyzed the differences in richness of traditional Chinese medicine resources under different grid scales. The results showed that a 30 km grid can be selected, at which the richness of Chinese medicine resources in Chongqing better reflects the objective inter-regional pattern of resource richness. Copyright© by the Chinese Pharmaceutical Association.
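
    As a toy illustration of the grid-based richness statistic described above, the following sketch tallies species richness per square grid cell at two grid scales. It assumes point occurrence records with species names and projected coordinates; all data and names are hypothetical.

    ```python
    # Sketch: species richness per grid cell at two grid scales (illustrative only).
    # Assumes occurrence records as (species, x, y) tuples in projected coordinates (km).
    from collections import defaultdict

    def richness_by_cell(records, cell_km):
        """Count distinct species falling in each square grid cell of side cell_km."""
        cells = defaultdict(set)
        for species, x, y in records:
            key = (int(x // cell_km), int(y // cell_km))
            cells[key].add(species)
        return {key: len(spp) for key, spp in cells.items()}

    # Hypothetical occurrences: (species, easting km, northing km).
    records = [("A", 3, 4), ("B", 5, 9), ("A", 31, 2), ("C", 33, 8), ("B", 61, 55)]
    for scale in (10, 30):  # compare a 10 km and a 30 km grid
        print(scale, "km grid:", richness_by_cell(records, scale))
    ```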

  2. GridAPPS-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-03-28

    GridAPPS-D is an open-source, open-architecture, standards-based platform for the development of advanced electric power system planning and operations applications. GridAPPS-D provides a documented data abstraction that enables developers to create applications that can run on any compliant system or platform. This makes applications independent of any platform vendor and allows them to take advantage of the data-rich, data-driven possibilities opened up by the deployment of smart grid devices and systems.

  3. Testing the Efficacy of Global Biodiversity Hotspots for Insect Conservation: The Case of South African Katydids.

    PubMed

    Bazelet, Corinna S; Thompson, Aileen C; Naskrecki, Piotr

    2016-01-01

    The use of endemism and vascular plants only for biodiversity hotspot delineation has long been contested. Few studies have focused on the efficacy of global biodiversity hotspots for the conservation of insects, an important, abundant, and often ignored component of biodiversity. We aimed to test five alternative diversity measures for hotspot delineation and examine the efficacy of biodiversity hotspots for conserving a non-typical target organism, South African katydids. Using a 1° fishnet grid, we delineated katydid hotspots in two ways: (1) count-based: grid cells in the top 10% of total, endemic, threatened and/or sensitive species richness; vs. (2) score-based: grid cells with a mean value in the top 10% on a scoring system which scored each species on the basis of its IUCN Red List threat status, distribution, mobility and trophic level. We then compared katydid hotspots with each other and with recognized biodiversity hotspots. Grid cells within biodiversity hotspots had significantly higher count-based and score-based diversity than non-hotspot grid cells. There was a significant association between the three types of hotspots. Of the count-based measures, endemic species richness was the best surrogate for the others. However, the score-based measure out-performed all count-based diversity measures. Species richness was the least successful surrogate of all. The strong performance of the score-based method for hotspot prediction emphasizes the importance of including species' natural history information for conservation decision-making, and is easily adaptable to other organisms. Furthermore, these results add empirical support for the efficacy of biodiversity hotspots in conserving non-target organisms.
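
    The two delineation rules described above (count-based versus score-based, each taking the top 10% of grid cells) can be sketched in a few lines; the data below are synthetic and purely illustrative.

    ```python
    # Sketch of count-based vs score-based hotspot delineation (synthetic data).
    import numpy as np

    rng = np.random.default_rng(0)
    n_cells = 100
    richness = rng.poisson(6, n_cells)        # count-based: species per 1-degree cell
    mean_score = rng.uniform(1, 10, n_cells)  # score-based: mean species score per cell

    def top_decile(values):
        """Boolean mask of cells in the top 10% of a diversity measure."""
        cutoff = np.quantile(values, 0.9)
        return values >= cutoff

    count_hotspots = top_decile(richness)
    score_hotspots = top_decile(mean_score)
    print("count-based hotspots:", count_hotspots.sum(),
          "score-based hotspots:", score_hotspots.sum(),
          "overlap:", np.sum(count_hotspots & score_hotspots))
    ```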

  4. Testing the Efficacy of Global Biodiversity Hotspots for Insect Conservation: The Case of South African Katydids

    PubMed Central

    Bazelet, Corinna S.; Thompson, Aileen C.; Naskrecki, Piotr

    2016-01-01

    The use of endemism and vascular plants only for biodiversity hotspot delineation has long been contested. Few studies have focused on the efficacy of global biodiversity hotspots for the conservation of insects, an important, abundant, and often ignored component of biodiversity. We aimed to test five alternative diversity measures for hotspot delineation and examine the efficacy of biodiversity hotspots for conserving a non-typical target organism, South African katydids. Using a 1° fishnet grid, we delineated katydid hotspots in two ways: (1) count-based: grid cells in the top 10% of total, endemic, threatened and/or sensitive species richness; vs. (2) score-based: grid cells with a mean value in the top 10% on a scoring system which scored each species on the basis of its IUCN Red List threat status, distribution, mobility and trophic level. We then compared katydid hotspots with each other and with recognized biodiversity hotspots. Grid cells within biodiversity hotspots had significantly higher count-based and score-based diversity than non-hotspot grid cells. There was a significant association between the three types of hotspots. Of the count-based measures, endemic species richness was the best surrogate for the others. However, the score-based measure out-performed all count-based diversity measures. Species richness was the least successful surrogate of all. The strong performance of the score-based method for hotspot prediction emphasizes the importance of including species’ natural history information for conservation decision-making, and is easily adaptable to other organisms. Furthermore, these results add empirical support for the efficacy of biodiversity hotspots in conserving non-target organisms. PMID:27631131

  5. The Mass-loss Return from Evolved Stars to the Large Magellanic Cloud. IV. Construction and Validation of a Grid of Models for Oxygen-rich AGB Stars, Red Supergiants, and Extreme AGB Stars

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin A.; Srinivasan, S.; Meixner, M.

    2011-02-01

    To measure the mass loss from dusty oxygen-rich (O-rich) evolved stars in the Large Magellanic Cloud (LMC), we have constructed a grid of models of spherically symmetric dust shells around stars with constant mass-loss rates using 2Dust. These models will constitute the O-rich model part of the "Grid of Red supergiant and Asymptotic giant branch star ModelS" (GRAMS). This model grid explores four parameters: stellar effective temperature from 2100 K to 4700 K; luminosity from 10³ to 10⁶ L_sun; dust shell inner radii of 3, 7, 11, and 15 R_star; and 10.0 μm optical depth from 10⁻⁴ to 26. From an initial grid of ~1200 2Dust models, we create a larger grid of ~69,000 models by scaling to cover the luminosity range required by the data. These models are available online to the public. The matching in color-magnitude diagrams and color-color diagrams to observed O-rich asymptotic giant branch (AGB) and red supergiant (RSG) candidate stars from the SAGE and SAGE-Spec LMC samples and a small sample of OH/IR stars is generally very good. The extreme AGB star candidates from SAGE are more consistent with carbon-rich (C-rich) than O-rich dust composition. Our model grid suggests lower limits to the mid-infrared colors of the dustiest AGB stars for which the chemistry could be O-rich. Finally, the fitting of GRAMS models to spectral energy distributions of sources fit by other studies provides additional verification of our grid and anticipates future, more expansive efforts.
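
    The scaling step mentioned above (expanding an initial model grid by rescaling to the luminosities required by the data) can be illustrated with a toy sketch. The assumption that the emergent spectrum simply scales linearly with luminosity at a fixed shape is a simplification made for illustration only; the actual GRAMS scaling prescription is not reproduced here.

    ```python
    # Sketch: expanding a base model grid by luminosity scaling (toy SED, simplified physics).
    import numpy as np

    wavelength_um = np.logspace(-0.3, 2, 200)                    # ~0.5-100 micron grid
    base_flux = np.exp(-((np.log10(wavelength_um) - 0.5) ** 2))  # toy SED shape
    L_base = 1.0e4                                               # base luminosity, L_sun

    target_luminosities = np.logspace(3, 6, 25)                  # 10^3 to 10^6 L_sun
    scaled_grid = {L: base_flux * (L / L_base) for L in target_luminosities}
    print(len(scaled_grid), "scaled models generated from one base model")
    ```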

  6. Surfer: An Extensible Pull-Based Framework for Resource Selection and Ranking

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.

    2004-01-01

    Grid computing aims to connect large numbers of geographically and organizationally distributed resources to increase computational power, resource utilization, and resource accessibility. In order to effectively utilize grids, users need to be connected to the best available resources at any given time. As grids are in constant flux, users cannot be expected to keep up with the configuration and status of the grid; thus, they must be provided with automatic resource brokering for selecting and ranking resources meeting constraints and preferences they specify. This paper presents a new OGSI-compliant resource selection and ranking framework called Surfer that has been implemented as part of NASA's Information Power Grid (IPG) project. Surfer is highly extensible and may be integrated into any grid environment by adding information providers knowledgeable about that environment.
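
    A minimal sketch of the select-then-rank pattern described above, with hypothetical resource records, a hard constraint, and a single preference; Surfer's actual provider interfaces are not modeled.

    ```python
    # Sketch: constraint filtering plus preference ranking (field names are hypothetical).
    resources = [
        {"name": "clusterA", "cpus": 128, "mem_gb": 512,  "load": 0.7},
        {"name": "clusterB", "cpus": 64,  "mem_gb": 256,  "load": 0.2},
        {"name": "clusterC", "cpus": 256, "mem_gb": 1024, "load": 0.9},
    ]

    def select_and_rank(resources, min_cpus, min_mem_gb):
        """Keep resources meeting hard constraints, then rank by preference (lowest load first)."""
        feasible = [r for r in resources
                    if r["cpus"] >= min_cpus and r["mem_gb"] >= min_mem_gb]
        return sorted(feasible, key=lambda r: r["load"])

    for r in select_and_rank(resources, min_cpus=64, min_mem_gb=256):
        print(r["name"], r["load"])
    ```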

  7. Grid Generation Techniques Utilizing the Volume Grid Manipulator

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1998-01-01

    This paper presents grid generation techniques available in the Volume Grid Manipulation (VGM) code. The VGM code is designed to manipulate existing line, surface and volume grids to improve the quality of the data. It embodies an easy-to-read, rich command language that enables such alterations as topology changes, grid adaptation and smoothing. Additionally, the VGM code can be used to construct simplified straight lines, splines, and conic sections, which are common curves used in the generation and manipulation of points, lines, surfaces and volumes (i.e., grid data). These simple geometric curves are essential in the construction of domain discretizations for computational fluid dynamic simulations. By comparison to previously established methods of generating these curves interactively, the VGM code provides control of slope continuity and grid point-to-point stretching as well as quick changes in the controlling parameters. The VGM code offers the capability to couple the generation of these geometries with an extensive manipulation methodology in a scripting language. The scripting language allows parametric studies of a vehicle geometry to be performed efficiently to evaluate favorable trends in the design process. As examples of the powerful capabilities of the VGM code, a wake flow field domain will be appended to an existing X33 Venturestar volume grid; negative volumes, resulting from grid expansions used to enable flow field capture on a simple geometry, will be corrected; and geometrical changes to a vehicle component of the X33 Venturestar will be shown.

  8. Design of Grid Portal System Based on RIA

    NASA Astrophysics Data System (ADS)

    Cao, Caifeng; Luo, Jianguo; Qiu, Zhixin

    Grid portals are an important branch of grid research. To address the weak expressiveness, poor interactivity, low operating efficiency and other shortcomings of first- and second-generation grid portal systems, RIA technology was introduced. A new portal architecture was designed based on RIA and Web services, and a concrete realization of the portal system using Adobe Flex/Flash technology is presented, forming a new design pattern. In terms of architecture, the design pattern combines the advantages of the B/S and C/S models, balances load between the server and the client, optimizes system performance, and achieves platform independence. In terms of functionality, it realizes grid service calls, provides a client interface with a rich user experience, and integrates local resources using FABridge, LCDS, Flash Player and other components.

  9. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776

  10. Semantics-enabled service discovery framework in the SIMDAT pharma grid.

    PubMed

    Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert

    2008-03-01

    We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.

  11. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  12. Urbanization and the more-individuals hypothesis.

    PubMed

    Chiari, Claudia; Dinetti, Marco; Licciardello, Cinzia; Licitra, Gaetano; Pautasso, Marco

    2010-03-01

    1. Urbanization is a landscape process affecting biodiversity world-wide. Despite many urban-rural studies of bird assemblages, it is still unclear whether more species-rich communities have more individuals, regardless of the level of urbanization. The more-individuals hypothesis assumes that species-rich communities have larger populations, thus reducing the chance of local extinctions. 2. Using newly collated avian distribution data for 1 km² grid cells across Florence, Italy, we show a significantly positive relationship between species richness and assemblage abundance for the whole urban area. This richness-abundance relationship persists for the 1 km² grid cells with less than 50% of urbanized territory, as well as for the remaining grid cells, with no significant difference in the slope of the relationship. These results support the more-individuals hypothesis as an explanation of patterns in species richness, also in human modified and fragmented habitats. 3. However, the intercept of the species richness-abundance relationship is significantly lower for highly urbanized grid cells. Our study confirms that urban communities have lower species richness but counters the common notion that assemblages in densely urbanized ecosystems have more individuals. In Florence, highly inhabited areas show fewer species and lower assemblage abundance. 4. Urbanized ecosystems are an ongoing large-scale natural experiment which can be used to test ecological theories empirically.
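
    The test described above amounts to fitting the richness-abundance relationship separately for cells above and below 50% urbanized cover and comparing slopes and intercepts; the sketch below does this on synthetic data purely to illustrate the procedure.

    ```python
    # Sketch: richness vs abundance fitted separately by urbanization class (synthetic data).
    import numpy as np

    rng = np.random.default_rng(1)
    abundance = rng.uniform(50, 500, 200)
    urban_frac = rng.uniform(0, 1, 200)
    # Assume a common slope but a lower intercept in highly urbanized cells.
    richness = 5 + 0.05 * abundance - 8 * (urban_frac > 0.5) + rng.normal(0, 2, 200)

    for label, mask in [("<50% urban", urban_frac <= 0.5), (">50% urban", urban_frac > 0.5)]:
        slope, intercept = np.polyfit(abundance[mask], richness[mask], 1)
        print(f"{label}: slope={slope:.3f}, intercept={intercept:.2f}")
    ```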

  13. A Vertically Flow-Following, Icosahedral Grid Model for Medium-Range and Seasonal Prediction. Part 1: Model Description

    NASA Technical Reports Server (NTRS)

    Bleck, Rainer; Bao, Jian-Wen; Benjamin, Stanley G.; Brown, John M.; Fiorino, Michael; Henderson, Thomas B.; Lee, Jin-Luen; MacDonald, Alexander E.; Madden, Paul; Middlecoff, Jacques

    2015-01-01

    A hydrostatic global weather prediction model based on an icosahedral horizontal grid and a hybrid terrain-following/isentropic vertical coordinate is described. The model is an extension to three spatial dimensions of a previously developed, icosahedral, shallow-water model featuring user-selectable horizontal resolution and employing indirect addressing techniques. The vertical grid is adaptive to maximize the portion of the atmosphere mapped into the isentropic coordinate subdomain. The model, best described as a stacked shallow-water model, is being tested extensively on real-time medium-range forecasts to ready it for possible inclusion in operational multimodel ensembles for medium-range to seasonal prediction.

  14. GridWise Standards Mapping Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosquet, Mia L.

    "GridWise" is a concept of how advanced communications, information, and control technologies can transform the nation's energy system, across the spectrum from large-scale central generation to common consumer appliances and equipment, into a collaborative network rich in the exchange of decision-making information and market-based opportunities (Widergren and Bosquet 2003), bringing the electric transmission and distribution system fully into the information and telecommunication age. This report summarizes a broad review of standards efforts related to GridWise: those that could ultimately contribute significantly to advancement toward the GridWise vision, and those that represent the current technological basis upon which this vision must build.

  15. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, we report for the first time on the design, implementation and demonstration of a novel OpenFlow based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching.

  16. Non-LTE Line-Blanketed Model Atmospheres of B-type Stars

    NASA Astrophysics Data System (ADS)

    Lanz, T.; Hubeny, I.

    2005-12-01

    We present an extension of our OSTAR2002 grid of NLTE model atmospheres to B-type stars. We have calculated over 1,300 metal line-blanketed, NLTE, plane-parallel, hydrostatic model atmospheres for the basic parameters appropriate to B stars. The grid covers 16 effective temperatures from 15,000 to 30,000 K, with 1000 K steps; 13 surface gravities, log g ≤ 4.75 down to the Eddington limit; and 5 compositions (2, 1, 0.5, 0.2, and 0.1 times solar). We have adopted a microturbulent velocity of 2 km/s for all models. In the lower surface gravity range (log g ≤ 3.0), we supplemented the main grid with additional model atmospheres accounting for a higher microturbulent velocity (10 km/s) and for altered surface composition (He- and N-rich, C-deficient), as observed in B supergiants. The models incorporate essentially all known atomic levels of 46 ions of H, He, C, N, O, Ne, Mg, Al, Si, S, and Fe, which are grouped into 1127 superlevels. Models and spectra will be available at our Web site, http://nova.astro.umd.edu.
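
    For orientation, the main grid dimensions quoted above can be enumerated as a Cartesian product; the gravity values below are placeholders, since only their count and upper limit are stated, and the quoted total of over 1,300 presumably includes the supplemental models.

    ```python
    # Sketch: enumerating the main (Teff, log g, metallicity) model grid quoted above.
    from itertools import product

    teff = range(15000, 30001, 1000)             # 16 effective temperatures, 1000 K steps
    logg = [1.75 + 0.25 * i for i in range(13)]  # 13 gravities up to 4.75 (placeholder values)
    z = [2.0, 1.0, 0.5, 0.2, 0.1]                # 5 compositions in solar units

    grid = list(product(teff, logg, z))
    print(len(grid), "models in the main grid")  # 16 * 13 * 5 = 1040
    ```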

  17. Geometrical Characteristics of Cd-Rich Inclusion Defects in CdZnTe Materials

    NASA Astrophysics Data System (ADS)

    Xu, Chao; Sheng, Fengfeng; Yang, Jianrong

    2017-08-01

    The geometrical characteristics of Cd-rich inclusion defects in CdZnTe crystals have been investigated by infrared transmission (IRT) microscopy and chemical etching methods, revealing that they are composed of a Cd-rich inclusion core zone with high dislocation density and defect extension belts. Based on the experimental results, the orientation and shape of these belts were determined, showing that their extension directions in three-dimensional (3-D) space are along <211> crystal orientation. To explain the observed IRT images of Cd-rich inclusion defects, a 3-D model with plate-shaped structure for dislocation extension belts is proposed. Greyscale IRT images of dislocation extension belts thus depend on their absorption layer thickness. Assuming that defects can be discerned by IRT microscopy only when their absorption layer thickness is greater than twice that of the plate-shaped dislocation extension belts, this 3-D defect model can rationalize the IRT images of Cd-rich inclusion defects.

  18. Visual Environment for Rich Data Interpretation (VERDI) program for environmental modeling systems

    EPA Pesticide Factsheets

    VERDI is a flexible, modular, Java-based program used for visualizing multivariate gridded meteorology, emissions and air quality modeling data created by environmental modeling systems such as the CMAQ model and WRF.

  19. Research and design of smart grid monitoring control via terminal based on iOS system

    NASA Astrophysics Data System (ADS)

    Fu, Wei; Gong, Li; Chen, Heli; Pan, Guangji

    2017-06-01

    Aiming at a series of problems in current smart grid monitoring control terminals, such as high cost, poor portability, overly simple monitoring functions, poor software extensibility, low reliability of information transmission, a limited man-machine interface and weak security, a smart grid remote monitoring system based on the iOS system has been designed. The system interacts with the smart grid server to acquire grid data over WiFi/3G/4G networks and to monitor the running status of each grid line as well as the operating conditions of power plant equipment. When an exception occurs in the power plant, incident information can be sent to the user's iOS terminal in a timely manner, providing troubleshooting information that helps grid staff make the right decisions quickly and avoid further accidents. Field tests have shown that the system realizes the intended integrated grid monitoring functions with low maintenance cost, a friendly interface, and high security and reliability, and that it possesses practical value.

  20. An Extensible Information Grid for Risk Management

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David G.

    2003-01-01

    This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.

  1. Generation of three-dimensional body-fitted grids by solving hyperbolic partial differential equations

    NASA Technical Reports Server (NTRS)

    Steger, Joseph L.

    1989-01-01

    Hyperbolic grid generation procedures are described which have been used in external flow simulations about complex configurations. For many practical applications a single well-ordered (i.e., structured) grid can be used to mesh an entire configuration; in other problems, composite or unstructured grid procedures are needed. Although the hyperbolic partial differential equation grid generation procedure has mainly been utilized to generate structured grids, an extension of the procedure to semiunstructured grids is briefly described. Extensions of the methodology are also described using two-dimensional equations.

  2. Generation of three-dimensional body-fitted grids by solving hyperbolic and parabolic partial differential equations

    NASA Technical Reports Server (NTRS)

    Steger, Joseph L.

    1989-01-01

    Hyperbolic grid generation procedures are described which have been used in external flow simulations about complex configurations. For many practical applications a single well-ordered (i.e., structured) grid can be used to mesh an entire configuration; in other problems, composite or unstructured grid procedures are needed. Although the hyperbolic partial differential equation grid generation procedure has mainly been utilized to generate structured grids, an extension of the procedure to semiunstructured grids is briefly described. Extensions of the methodology are also described using two-dimensional equations.

  3. Application of a Scalable, Parallel, Unstructured-Grid-Based Navier-Stokes Solver

    NASA Technical Reports Server (NTRS)

    Parikh, Paresh

    2001-01-01

    A parallel version of an unstructured-grid based Navier-Stokes solver, USM3Dns, previously developed for efficient operation on a variety of parallel computers, has been enhanced to incorporate upgrades made to the serial version. The resultant parallel code has been extensively tested on a variety of problems of aerospace interest and on two sets of parallel computers to understand and document its characteristics. An innovative grid renumbering construct and use of non-blocking communication are shown to produce superlinear computing performance. Preliminary results from parallelization of a recently introduced "porous surface" boundary condition are also presented.

  4. Wildlife monitoring across multiple spatial scales using grid-based sampling

    Treesearch

    Kevin S. McKelvey; Samuel A. Cushman; Michael K. Schwartz; Leonard F. Ruggiero

    2009-01-01

    Recently, noninvasive genetic sampling has become the most effective way to reliably sample occurrence of many species. In addition, genetic data provide a rich data source enabling the monitoring of population status. The combination of genetically based animal data collected at known spatial coordinates with vegetation, topography, and other available covariates...

  5. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
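
    As a toy illustration of spectral clustering on grid structure, the sketch below partitions a small synthetic 6-bus network using the sign of the Fiedler vector of its weighted graph Laplacian; the admittance-like weights are invented for the example and do not come from the study.

    ```python
    # Sketch: spectral bipartition of a toy 6-bus network via the graph Laplacian.
    import numpy as np

    # Symmetric weighted adjacency of two strongly coupled triangles joined by a weak tie.
    W = np.array([
        [0,   5, 5, 0.1, 0, 0],
        [5,   0, 5, 0,   0, 0],
        [5,   5, 0, 0,   0, 0],
        [0.1, 0, 0, 0,   5, 5],
        [0,   0, 0, 5,   0, 5],
        [0,   0, 0, 5,   5, 0],
    ], dtype=float)

    L = np.diag(W.sum(axis=1)) - W       # graph Laplacian
    vals, vecs = np.linalg.eigh(L)       # eigenvalues in ascending order
    fiedler = vecs[:, 1]                 # second eigenvector partitions the graph
    clusters = (fiedler > 0).astype(int)
    print("cluster labels:", clusters)   # the two triangles should separate
    ```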

  6. Property Grids for the Kansas High Plains Aquifer from Water Well Drillers' Logs

    NASA Astrophysics Data System (ADS)

    Bohling, G.; Adkins-Heljeson, D.; Wilson, B. B.

    2017-12-01

    Like a number of state and provincial geological agencies, the Kansas Geological Survey hosts a database of water well drillers' logs, containing the records of sediments and lithologies characterized during drilling. At the moment, the KGS database contains records associated with over 90,000 wells statewide. Over 60,000 of these wells are within the High Plains aquifer (HPA) in Kansas, with the corresponding logs containing descriptions of over 500,000 individual depth intervals. We will present grids of hydrogeological properties for the Kansas HPA developed from this extensive, but highly qualitative, data resource. The process of converting the logs into quantitative form consists of first translating the vast number of unique (and often idiosyncratic) sediment descriptions into a fairly comprehensive set of standardized lithology codes and then mapping the standardized lithologies into a smaller number of property categories. A grid is superimposed on the region and the proportion of each property category is computed within each grid cell, with category proportions in empty grid cells computed by interpolation. Grids of properties such as hydraulic conductivity and specific yield are then computed based on the category proportion grids and category-specific property values. A two-dimensional grid is employed for this large-scale, regional application, with category proportions averaged between two surfaces, such as bedrock and the water table at a particular time (to estimate transmissivity at that time) or water tables at two different times (to estimate specific yield over the intervening time period). We have employed a sequence of water tables for different years, based on annual measurements from an extensive network of wells, providing an assessment of temporal variations in the vertically averaged aquifer properties resulting from water level variations (primarily declines) over time.
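
    The conversion pipeline described above (standardized lithology codes, property categories, per-cell category proportions, then property estimates) can be sketched as follows; the categories, thicknesses and hydraulic conductivity values are placeholders chosen only to show the thickness-weighted averaging step.

    ```python
    # Sketch: cell-averaged hydraulic conductivity from lithology-category proportions.
    # Per grid cell: list of (category, interval_thickness_ft) from translated drillers' logs.
    cells = {
        (0, 0): [("gravel", 30), ("sand", 50), ("clay", 20)],
        (0, 1): [("sand", 60), ("clay", 40)],
    }
    K_by_category = {"gravel": 200.0, "sand": 60.0, "clay": 0.5}  # ft/day, illustrative

    def cell_conductivity(intervals):
        """Thickness-weighted (arithmetic) mean K from category proportions in one cell."""
        total = sum(t for _, t in intervals)
        return sum(K_by_category[c] * t for c, t in intervals) / total

    for cell, intervals in cells.items():
        print(cell, f"{cell_conductivity(intervals):.1f} ft/day")
    ```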

  7. Boundary-Layer Stability Analysis of the Mean Flows Obtained Using Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Liao, Wei; Malik, Mujeeb R.; Lee-Rausch, Elizabeth M.; Li, Fei; Nielsen, Eric J.; Buning, Pieter G.; Chang, Chau-Lyan; Choudhari, Meelan M.

    2012-01-01

    Boundary-layer stability analyses of mean flows extracted from unstructured-grid Navier-Stokes solutions have been performed. A procedure has been developed to extract mean flow profiles from the FUN3D unstructured-grid solutions. Extensive code-to-code validations have been performed by comparing the extracted mean flows as well as the corresponding stability characteristics to the predictions based on structured-grid solutions. Comparisons are made on a range of problems from a simple flat plate to a full aircraft configuration, a modified Gulfstream-III with a natural laminar flow glove. The future aim of the project is to extend the adjoint-based design capability in FUN3D to include natural laminar flow and laminar flow control by integrating it with boundary-layer stability analysis codes, such as LASTRAC.

  8. PCTDSE: A parallel Cartesian-grid-based TDSE solver for modeling laser-atom interactions

    NASA Astrophysics Data System (ADS)

    Fu, Yongsheng; Zeng, Jiaolong; Yuan, Jianmin

    2017-01-01

    We present a parallel Cartesian-grid-based time-dependent Schrödinger equation (TDSE) solver for modeling laser-atom interactions. It can simulate the single-electron dynamics of atoms in arbitrary time-dependent vector potentials. We use a split-operator method combined with fast Fourier transforms (FFT), on a three-dimensional (3D) Cartesian grid. Parallelization is realized using a 2D decomposition strategy based on the Message Passing Interface (MPI) library, which results in a good parallel scaling on modern supercomputers. We give simple applications for the hydrogen atom using the benchmark problems coming from the references and obtain repeatable results. The extensions to other laser-atom systems are straightforward with minimal modifications of the source code.
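
    A one-dimensional analogue of the split-operator/FFT scheme described above is easy to write down; the sketch below propagates a wave packet in a soft-core Coulomb potential in atomic units and checks norm conservation. The grid size, potential and time step are illustrative, and no parallel decomposition is attempted.

    ```python
    # Sketch: one Strang split-operator step for a 1D TDSE on a uniform grid (atomic units).
    import numpy as np

    n, box = 512, 40.0
    x = np.linspace(-box / 2, box / 2, n, endpoint=False)
    dx = x[1] - x[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)       # angular wavenumbers
    dt = 0.01

    V = -1.0 / np.sqrt(x**2 + 2.0)                # soft-core Coulomb potential
    psi = np.exp(-x**2)                           # initial wave packet
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize

    def split_step(psi):
        """Half kick in V, full kinetic drift via FFT, half kick in V."""
        psi = np.exp(-0.5j * V * dt) * psi
        psi = np.fft.ifft(np.exp(-0.5j * k**2 * dt) * np.fft.fft(psi))
        return np.exp(-0.5j * V * dt) * psi

    for _ in range(100):
        psi = split_step(psi)
    print("norm after 100 steps:", np.sum(np.abs(psi)**2) * dx)  # should remain ~1
    ```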

  9. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are applied extensively to data processing, the application failure probability has become an important indicator of application quality and an important consideration for operators. This paper presents a task-based method for analyzing application failure probability in an optical grid. The failure probability of an entire application can then be quantified, and the effectiveness of different backup strategies in reducing application failure probability can be compared, so that the different requirements of different clients can be satisfied. In an optical grid, when an application described by a DAG (directed acyclic graph) is executed under different backup strategies, both the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
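
    In the simplest case, an application with independent task failures and no backup resources fails if any task fails; the sketch below computes that baseline probability. Task names and failure rates are hypothetical, and the paper's DAG- and backup-aware analysis is not reproduced.

    ```python
    # Sketch: baseline application failure probability with independent task failures.
    dag_tasks = {"fetch": 0.01, "transform": 0.02, "compute": 0.05, "store": 0.01}  # P(fail)

    def application_failure_probability(task_fail_probs):
        """The application fails if any task fails: 1 - product of task success probabilities."""
        p_success = 1.0
        for p_fail in task_fail_probs.values():
            p_success *= (1.0 - p_fail)
        return 1.0 - p_success

    print(f"P(application fails) = {application_failure_probability(dag_tasks):.4f}")
    # With an independent backup resource for "compute" at the same failure rate, that task's
    # effective failure probability would drop to 0.05 * 0.05, lowering the overall figure.
    ```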

  10. Infrastructure for collaborative science and societal applications in the Columbia River estuary

    NASA Astrophysics Data System (ADS)

    Baptista, António M.; Seaton, Charles; Wilkin, Michael P.; Riseman, Sarah F.; Needoba, Joseph A.; Maier, David; Turner, Paul J.; Kärnä, Tuomas; Lopez, Jesse E.; Herfort, Lydie; Megler, V. M.; McNeil, Craig; Crump, Byron C.; Peterson, Tawnya D.; Spitz, Yvette H.; Simon, Holly M.

    2015-12-01

    To meet societal needs, modern estuarine science needs to be interdisciplinary and collaborative, combine discovery with hypotheses testing, and be responsive to issues facing both regional and global stakeholders. Such an approach is best conducted with the benefit of data-rich environments, where information from sensors and models is openly accessible within convenient timeframes. Here, we introduce the operational infrastructure of one such data-rich environment, a collaboratory created to support (a) interdisciplinary research in the Columbia River estuary by the multi-institutional team of investigators of the Science and Technology Center for Coastal Margin Observation & Prediction and (b) the integration of scientific knowledge into regional decision making. Core components of the operational infrastructure are an observation network, a modeling system and a cyber-infrastructure, each of which is described. The observation network is anchored on an extensive array of long-term stations, many of them interdisciplinary, and is complemented by on-demand deployment of temporary stations and mobile platforms, often in coordinated field campaigns. The modeling system is based on finite-element unstructured-grid codes and includes operational and process-oriented simulations of circulation, sediments and ecosystem processes. The flow of information is managed through a dedicated cyber-infrastructure, conversant with regional and national observing systems.

  11. NSTAR Extended Life Test Discharge Chamber Flake Analysis

    NASA Technical Reports Server (NTRS)

    deGroh, Kim K.; Banks, Bruce A.; Karniotis, Christina A.

    2005-01-01

    The Extended Life Test (ELT) of the NASA Solar Electric Propulsion Technology Readiness (NSTAR) ion thruster was concluded after 30,352 hours of operation. The ELT was conducted using the Deep Space 1 (DS1) back-up flight engine, a 30 cm diameter xenon ion thruster. Post-test inspection of the ELT engine revealed numerous contaminant flakes distributed over the bottom of the cylindrical section of the anode within the discharge chamber (DC). Extensive analyses were conducted to determine the source of the particles, which is critical to the understanding of degradation mechanisms of long life ion thruster operation. Analyses included: optical microscopy (OM) and particle length histograms, field emission scanning electron microscopy (FESEM) combined with energy dispersive spectroscopy (EDS), and atomic oxygen plasma exposure tests. Analyses of the particles indicate that the majority of the DC flakes consist of a layered structure, typically with either two or three layers. The flakes comprising two layers were typically found to have a molybdenum-rich (Mo-rich) layer on one side and a carbon-rich (C-rich) layer on the other side. The flakes comprising three layers were found to be sandwich-like structures with Mo-rich exterior layers and a C-rich interior layer. The presence of the C-rich layers indicates that these particles were produced by sputter deposition build-up on a surface external to the discharge chamber from ion sputter erosion of the graphite target in the test chamber. This contaminant layer became thick enough that particles spalled off, and then were electro-statically attracted into the ion thruster interior, where they were coated with Mo from internal sputter erosion of the screen grid and cathode components. Atomic oxygen tests provided evidence that the DC chamber flakes are composed of a significant fraction of carbon. Particle size histograms further indicated that the source of the particles was spalling of carbon flakes from downstream surfaces. Analyses of flakes taken from the downstream surface of the accelerator grid provided additional supportive information. The production of the downstream carbon flakes, and hence the potential problems associated with the flake particles in the ELT ion thruster engine is a facility induced effect and would not occur in the space environment.

  12. Task Assignment Heuristics for Parallel and Distributed CFD Applications

    NASA Technical Reports Server (NTRS)

    Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used to not only balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, the smallest task first (STF) and the largest task first (LTF), are first presented. They are then systematically refined to account for inter-grid communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.
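
    The two baseline assignments named above can be sketched as a greedy load-balancing pass that differs only in task ordering; the task sizes stand in for grid-point counts of overset blocks and are invented for the example, and the communication-cost refinements are not modeled.

    ```python
    # Sketch: smallest-task-first (STF) vs largest-task-first (LTF) greedy assignment.
    import heapq

    def greedy_assign(task_sizes, n_procs, largest_first):
        """Place tasks in the chosen order onto the currently least-loaded processor."""
        order = sorted(task_sizes, reverse=largest_first)
        loads = [(0.0, p) for p in range(n_procs)]     # (current load, processor id)
        heapq.heapify(loads)
        assignment = {p: [] for p in range(n_procs)}
        for size in order:
            load, p = heapq.heappop(loads)
            assignment[p].append(size)
            heapq.heappush(loads, (load + size, p))
        return assignment

    tasks = [120, 80, 300, 45, 210, 95, 60]            # grid points (thousands), illustrative
    for name, flag in [("STF", False), ("LTF", True)]:
        a = greedy_assign(tasks, 3, flag)
        print(name, {p: sum(s) for p, s in a.items()})
    ```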

  13. Foundations for Protecting Renewable-Rich Distribution Systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, Abraham; Brahma, Sukumar; Ranade, Satish

    High proliferation of Inverter Interfaced Distributed Energy Resources (IIDERs) into the electric distribution grid introduces new challenges to protection of such systems. This is because the existing protection systems are designed with two assumptions: (1) the system is single-sourced, resulting in unidirectional fault current, and (2) fault currents are easily detectable due to much higher magnitudes compared to load currents. Because most renewables interface with the grid through inverters, and inverters restrict their current output to levels close to the full load currents, both these assumptions are no longer valid: the system becomes multi-sourced, and overcurrent-based protection does not work. The primary scope of this study is to analyze the response of a grid-tied inverter to different faults in the grid, leading to new guidelines on protecting renewable-rich distribution systems.

  14. Adaptive Grid Based Localized Learning for Multidimensional Data

    ERIC Educational Resources Information Center

    Saini, Sheetal

    2012-01-01

    Rapid advances in data-rich domains of science, technology, and business have amplified the computational challenges of "Big Data" synthesis necessary to slow the widening gap between the rate at which the data is being collected and analyzed for knowledge. This has led to the renewed need for efficient and accurate algorithms, framework,…

  15. Do sampling methods differ in their utility for ecological monitoring? Comparison of line-point intercept, grid-point intercept, and ocular estimate methods

    USDA-ARS?s Scientific Manuscript database

    This study compared the utility of three sampling methods for ecological monitoring based on: interchangeability of data (rank correlations), precision (coefficient of variation), cost (minutes/transect), and potential of each method to generate multiple indicators. Species richness and foliar cover...

  16. Flow solution on a dual-block grid around an airplane

    NASA Technical Reports Server (NTRS)

    Eriksson, Lars-Erik

    1987-01-01

    The compressible flow around a complex fighter-aircraft configuration (fuselage, cranked delta wing, canard, and inlet) is simulated numerically using a novel grid scheme and a finite-volume Euler solver. The patched dual-block grid is generated by an algebraic procedure based on transfinite interpolation, and the explicit Runge-Kutta time-stepping Euler solver is implemented with a high degree of vectorization on a Cyber 205 processor. Results are presented in extensive graphs and diagrams and characterized in detail. The concentration of grid points near the wing apex in the present scheme is shown to facilitate capture of the vortex generated by the leading edge at high angles of attack and modeling of its interaction with the canard wake.

  17. Resilience Metrics for the Electric Power System: A Performance-Based Approach.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vugrin, Eric D.; Castillo, Andrea R; Silva-Monroy, Cesar Augusto

    Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows for both outputs from system models and for historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. This document describes the grid resilience metrics and analysis methods. Demonstration of the metrics and methods is shown through a set of illustrative use cases.

  18. Grid sensitivity for aerodynamic optimization and flow analysis

    NASA Technical Reports Server (NTRS)

    Sadrehaghighi, I.; Tiwari, S. N.

    1993-01-01

    After reviewing relevant literature, it is apparent that one aspect of aerodynamic sensitivity analysis, namely grid sensitivity, has not been investigated extensively. The grid sensitivity algorithms in most of these studies are based on structural design models. Such models, although sufficient for preliminary or conceptual design, are not acceptable for detailed design analysis. Careless grid sensitivity evaluations would introduce gradient errors within the sensitivity module, thereby contaminating the overall optimization process. Development of an efficient and reliable grid sensitivity module with special emphasis on aerodynamic applications appears essential. The organization of this study is as follows. The physical and geometric representations of a typical model are derived in chapter 2. The grid generation algorithm and boundary grid distribution are developed in chapter 3. Chapter 4 discusses the theoretical formulation and aerodynamic sensitivity equation. The method of solution is provided in chapter 5. The results are presented and discussed in chapter 6. Finally, some concluding remarks are provided in chapter 7.

  19. mantisGRID: a grid platform for DICOM medical images management in Colombia and Latin America.

    PubMed

    Garcia Ruiz, Manuel; Garcia Chaves, Alvin; Ruiz Ibañez, Carlos; Gutierrez Mazo, Jorge Mario; Ramirez Giraldo, Juan Carlos; Pelaez Echavarria, Alejandro; Valencia Diaz, Edison; Pelaez Restrepo, Gustavo; Montoya Munera, Edwin Nelson; Garcia Loaiza, Bernardo; Gomez Gonzalez, Sebastian

    2011-04-01

    This paper presents the mantisGRID project, an interinstitutional initiative from Colombian medical and academic centers aiming to provide medical grid services for Colombia and Latin America. The mantisGRID is a GRID platform, based on open source grid infrastructure that provides the necessary services to access and exchange medical images and associated information following digital imaging and communications in medicine (DICOM) and health level 7 standards. The paper focuses first on the data abstraction architecture, which is achieved via Open Grid Services Architecture Data Access and Integration (OGSA-DAI) services and supported by the Globus Toolkit. The grid currently uses a 30-Mb bandwidth of the Colombian High Technology Academic Network, RENATA, connected to Internet 2. It also includes a discussion on the relational database created to handle the DICOM objects that were represented using Extensible Markup Language Schema documents, as well as other features implemented such as data security, user authentication, and patient confidentiality. Grid performance was tested using the three current operative nodes and the results demonstrated comparable query times between the mantisGRID (OGSA-DAI) and Distributed mySQL databases, especially for a large number of records.

  20. Biogeography of seabirds within a high-latitude ecosystem: Use of a data-assimilative ocean model to assess impacts of mesoscale oceanography

    NASA Astrophysics Data System (ADS)

    Santora, Jarrod A.; Eisner, Lisa B.; Kuletz, Kathy J.; Ladd, Carol; Renner, Martin; Hunt, George L., Jr.

    2018-02-01

    We assessed the biogeography of seabirds within the Bering Sea Large Marine Ecosystem (LME), a highly productive and extensive continental shelf system that supports important fishing grounds. Our objective was to investigate how physical ocean conditions impact the distribution of seabirds along latitudinal gradients. We tested the hypothesis that seabird biogeographic patterns reflect differences in ocean conditions relating to the boundary between northern and southern shelf ecosystems. We used a grid-based approach to develop spatial means (1975-2014) of summertime seabird species' abundance, species' richness, and a multivariate seabird assemblage index to examine species composition. Seabird indices were linked to ocean conditions derived from a data-assimilative oceanographic model to quantify relationships between physics (e.g., temperature, salinity, and current velocity), bathymetry and seabirds along latitudinal gradients. Species assemblages reflected two main sources of variation: a mode for elevated richness and abundance, and a mode related to partitioning of inner/middle shelf species from outer shelf-slope species. Overall, species richness and abundance increased markedly at higher latitudes. We found that latitudinal changes in species assemblages, richness and abundance indicate a major shift around 59-60°N within the inner and middle shelf regions, but not in the outer shelf. Within the middle shelf, latitudinal shifts in seabird assemblages related strongly to hydrographic structure, as opposed to the inner and outer shelf waters. As expected, elevated species richness and abundance were associated with major breeding colonies and with important coastal foraging areas. Our study also indicates that the seabird observations support the conclusion that the oceanographic model captured mesoscale variability of ocean conditions important for understanding seabird distributions, and it represents an important step for evaluating modeling and empirical studies. Biogeographic assessments of LMEs that integrate top predator distributions resolve critical habitat requirements and will benefit assessment of climate change impacts (e.g., sea-ice loss) predicted to affect high-latitude marine ecosystems.

  1. Research on the comparison of extension mechanism of cellular automaton based on hexagon grid and rectangular grid

    NASA Astrophysics Data System (ADS)

    Zhai, Xiaofang; Zhu, Xinyan; Xiao, Zhifeng; Weng, Jie

    2009-10-01

    Historically, a cellular automaton (CA) is a discrete dynamical mathematical structure defined on a spatial grid. Research on cellular automata systems (CAS) has focused on rule sets and initial conditions and has not discussed adjacency. Thus, the main focus of our study is the effect of adjacency on CA behavior. This paper compares rectangular grids with hexagonal grids in terms of their characteristics, strengths and weaknesses. These grids have great influence on modeling results and other applications, including the role of the nearest neighborhood in experimental design. Our research shows that rectangular and hexagonal grids have different characteristics and are suited to distinct applications; the regular rectangular or square grid is used more often than the hexagonal grid, but their relative merits have not been widely discussed. The rectangular grid is generally preferred because of its symmetry, especially in orthogonal coordinate systems, and because of the frequent use of raster data from Geographic Information Systems (GIS). However, for complex terrain and uncertain, multidirectional regions, we prefer hexagonal grids and methods, which facilitate and simplify the problem. Hexagonal grids can overcome directional warping and have some unique characteristics; for example, they have a simpler and more symmetric nearest neighborhood, which avoids the ambiguities of rectangular grids, and their movement paths, connectivity, and most compact arrangement of pixels give them clear advantages in modeling and analysis. The selection of an appropriate grid should be based on the requirements and objectives of the application. We use rectangular and hexagonal grids respectively to develop a city model, making use of remote sensing images to acquire the 2002 and 2005 land state of Wuhan. Starting from the city land state in 2002, we use the CA to simulate a reasonable form of the city in 2005. These results provide a proof of concept for the advantages of the hexagonal grid.
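
    The adjacency difference at the heart of the comparison above can be made concrete with the neighbor offsets of the two grids; the sketch uses axial coordinates for the hexagonal case and a Moore neighborhood for the square case.

    ```python
    # Sketch: square-grid (Moore) vs hexagonal (axial-coordinate) neighborhoods.
    MOORE = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
    HEX_AXIAL = [(+1, 0), (-1, 0), (0, +1), (0, -1), (+1, -1), (-1, +1)]

    def neighbors(cell, offsets):
        q, r = cell
        return [(q + dq, r + dr) for dq, dr in offsets]

    print(len(neighbors((0, 0), MOORE)), "square-grid neighbors (mixed edge/corner adjacency)")
    print(len(neighbors((0, 0), HEX_AXIAL)), "hex-grid neighbors (all edge-sharing, equal spacing)")
    ```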

  2. Domain modeling and grid generation for multi-block structured grids with application to aerodynamic and hydrodynamic configurations

    NASA Technical Reports Server (NTRS)

    Spekreijse, S. P.; Boerstoel, J. W.; Vitagliano, P. L.; Kuyvenhoven, J. L.

    1992-01-01

    About five years ago, a joint development was started of a flow simulation system for engine-airframe integration studies on propeller as well as jet aircraft. The initial system was based on the Euler equations and made operational for industrial aerodynamic design work. The system consists of three major components: a domain modeller, for the graphical interactive subdivision of flow domains into an unstructured collection of blocks; a grid generator, for the graphical interactive computation of structured grids in blocks; and a flow solver, for the computation of flows on multi-block grids. The industrial partners of the collaboration and NLR have demonstrated that the domain modeller, grid generator and flow solver can be applied to simulate Euler flows around complete aircraft, including propulsion system simulation. Extension to Navier-Stokes flows is in progress. Delft Hydraulics has shown that both the domain modeller and grid generator can also be applied successfully for hydrodynamic configurations. An overview is given about the main aspects of both domain modelling and grid generation.

  3. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    NASA Astrophysics Data System (ADS)

    Brun, R.; Duellmann, D.; Ganis, G.; Hanushevsky, A.; Janyst, L.; Peters, A. J.; Rademakers, F.; Sindrilaru, E.

    2011-12-01

    The use of proxy caches has been extensively studied in the HEP environment for efficient access to database data and has shown significant performance gains with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyse the possible performance gains, the operational impact on site services and the applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.
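
    As a rough illustration of the read-through behaviour such a file-access proxy provides (a generic sketch; the class and callable names below are hypothetical and this is not the modified XROOT proxy), the fragment caches whole files on first access and serves subsequent reads from local disk:

    ```python
    import os
    import shutil

    class ReadThroughFileCache:
        """Minimal read-through cache: fetch from remote storage once, serve locally after."""

        def __init__(self, cache_dir, fetch_remote):
            self.cache_dir = cache_dir          # local disk area of the proxy
            self.fetch_remote = fetch_remote    # callable(logical_path) -> local temp file path
            os.makedirs(cache_dir, exist_ok=True)

        def open(self, logical_path):
            local = os.path.join(self.cache_dir, logical_path.strip("/").replace("/", "_"))
            if not os.path.exists(local):       # cache miss: pull once from the remote tier
                tmp = self.fetch_remote(logical_path)
                shutil.move(tmp, local)
            return open(local, "rb")            # cache hit (or freshly filled): local read
    ```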

  4. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, R.; Dullmann, D.; Ganis, G.

    2012-04-19

    The use of proxy caches has been extensively studied in the HEP environment for efficient access to database data and has shown significant performance gains with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyze the possible performance gains, the operational impact on site services and the applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.

  5. Effects of habitat map generalization in biodiversity assessment

    NASA Technical Reports Server (NTRS)

    Stoms, David M.

    1992-01-01

    Species richness is being mapped as part of an inventory of biological diversity in California (i.e., gap analysis). Species distributions are modeled with a GIS on the basis of maps of each species' preferred habitats. Species richness is then tallied in equal-area sampling units. A GIS sensitivity analysis examined the effects of the level of generalization of the habitat map on the predicted distribution of species richness in the southern Sierra Nevada. As the habitat map was generalized, the number of habitat types mapped within grid cells tended to decrease with a corresponding decline in numbers of species predicted. Further, the ranking of grid cells in order of predicted numbers of species changed dramatically between levels of generalization. Areas predicted to be of greatest conservation value on the basis of species richness may therefore be sensitive to GIS data resolution.
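
    A minimal sketch of the tallying step described here (hypothetical data structures; not the original GIS workflow): species richness per equal-area cell is simply the count of distinct species whose preferred habitat types are mapped within that cell, so generalizing the habitat map, which reduces the habitat types per cell, lowers the tallies.

    ```python
    from collections import defaultdict

    def richness_per_cell(species_habitats, habitat_by_cell):
        """Count distinct species predicted per grid cell.

        species_habitats: dict species -> set of preferred habitat types
        habitat_by_cell:  dict cell_id -> set of habitat types mapped in that cell
        """
        species_in_cell = defaultdict(set)
        for cell, habitats in habitat_by_cell.items():
            for species, preferred in species_habitats.items():
                if habitats & preferred:      # species predicted present if any habitat matches
                    species_in_cell[cell].add(species)
        return {cell: len(spp) for cell, spp in species_in_cell.items()}
    ```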

  6. Voltage collapse in complex power grids

    PubMed Central

    Simpson-Porco, John W.; Dörfler, Florian; Bullo, Francesco

    2016-01-01

    A large-scale power grid's ability to transfer energy from producers to consumers is constrained by both the network structure and the nonlinear physics of power flow. Violations of these constraints have been observed to result in voltage collapse blackouts, where nodal voltages slowly decline before precipitously falling. However, methods to test for voltage collapse are dominantly simulation-based, offering little theoretical insight into how grid structure influences stability margins. For a simplified power flow model, here we derive a closed-form condition under which a power network is safe from voltage collapse. The condition combines the complex structure of the network with the reactive power demands of loads to produce a node-by-node measure of grid stress, a prediction of the largest nodal voltage deviation, and an estimate of the distance to collapse. We extensively test our predictions on large-scale systems, highlighting how our condition can be leveraged to increase grid stability margins. PMID:26887284

  7. Gridless, pattern-driven point cloud completion and extension

    NASA Astrophysics Data System (ADS)

    Gravey, Mathieu; Mariethoz, Gregoire

    2016-04-01

    While satellites offer Earth observation with wide coverage, other remote sensing techniques such as terrestrial LiDAR can acquire very high-resolution data over an area that is limited in extent and often discontinuous due to shadow effects. Here we propose a numerical approach to merge these two types of information, thereby reconstructing high-resolution data over a continuous large area. It is based on a pattern matching process that completes the areas where only low-resolution data are available, using bootstrapped high-resolution patterns. Currently, the most common approach to pattern matching is to interpolate the point data on a grid. While this approach is computationally efficient, it presents major drawbacks for point cloud processing: a significant part of the information is lost in the point-to-grid resampling, and a prohibitive amount of memory is needed to store large grids. To address these issues, we propose a gridless method that compares point cloud subsets without using a grid. The on-the-fly interpolation involves a heavy computational load, which is met by using a highly optimized GPU implementation and a hierarchical pattern searching strategy. The method is illustrated using data from the Val d'Arolla, Swiss Alps, where high-resolution terrestrial LiDAR data are fused with lower-resolution Landsat and WorldView-3 acquisitions, such that the density of points is homogenized (data completion) and extended to a larger area (data extension).

  8. A low-rank control variate for multilevel Monte Carlo simulation of high-dimensional uncertain systems

    NASA Astrophysics Data System (ADS)

    Fairbanks, Hillary R.; Doostan, Alireza; Ketelsen, Christian; Iaccarino, Gianluca

    2017-07-01

    Multilevel Monte Carlo (MLMC) is a recently proposed variation of Monte Carlo (MC) simulation that achieves variance reduction by simulating the governing equations on a series of spatial (or temporal) grids with increasing resolution. Instead of directly employing the fine grid solutions, MLMC estimates the expectation of the quantity of interest from the coarsest grid solutions as well as differences between each two consecutive grid solutions. When the differences corresponding to finer grids become smaller, hence less variable, fewer MC realizations of finer grid solutions are needed to compute the difference expectations, thus leading to a reduction in the overall work. This paper presents an extension of MLMC, referred to as multilevel control variates (MLCV), where a low-rank approximation to the solution on each grid, obtained primarily based on coarser grid solutions, is used as a control variate for estimating the expectations involved in MLMC. Cost estimates as well as numerical examples are presented to demonstrate the advantage of this new MLCV approach over the standard MLMC when the solution of interest admits a low-rank approximation and the cost of simulating finer grids grows fast.
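
    For reference, the multilevel identity the abstract describes is usually written (notation ours) as $\mathbb{E}[Q_L] = \mathbb{E}[Q_0] + \sum_{\ell=1}^{L} \mathbb{E}[Q_\ell - Q_{\ell-1}]$, where $Q_\ell$ is the quantity of interest computed on grid level $\ell$; each term is estimated by an independent Monte Carlo average with $N_\ell$ samples, and because $\mathrm{Var}(Q_\ell - Q_{\ell-1})$ shrinks as the grids are refined, $N_\ell$ can decrease with $\ell$. In the MLCV variant, each such estimator is additionally corrected with a control variate built from the low-rank surrogate of the solution.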

  9. Moving overlapping grids with adaptive mesh refinement for high-speed reactive and non-reactive flow

    NASA Astrophysics Data System (ADS)

    Henshaw, William D.; Schwendeman, Donald W.

    2006-08-01

    We consider the solution of the reactive and non-reactive Euler equations on two-dimensional domains that evolve in time. The domains are discretized using moving overlapping grids. In a typical grid construction, boundary-fitted grids are used to represent moving boundaries, and these grids overlap with stationary background Cartesian grids. Block-structured adaptive mesh refinement (AMR) is used to resolve fine-scale features in the flow such as shocks and detonations. Refinement grids are added to base-level grids according to an estimate of the error, and these refinement grids move with their corresponding base-level grids. The numerical approximation of the governing equations takes place in the parameter space of each component grid, which is defined by a mapping from (fixed) parameter space to (moving) physical space. The mapped equations are solved numerically using a second-order extension of Godunov's method. The stiff source term in the reactive case is handled using a Runge-Kutta error-control scheme. We consider cases when the boundaries move according to a prescribed function of time and when the boundaries of embedded bodies move according to the surface stress exerted by the fluid. In the latter case, the Newton-Euler equations describe the motion of the center of mass of each body and the rotation about it, and these equations are integrated numerically using a second-order predictor-corrector scheme. Numerical boundary conditions at slip walls are described, and numerical results are presented for both reactive and non-reactive flows that demonstrate the use and accuracy of the numerical approach.

  10. Extended depth of field integral imaging using multi-focus fusion

    NASA Astrophysics Data System (ADS)

    Piao, Yongri; Zhang, Miao; Wang, Xiaohui; Li, Peihua

    2018-03-01

    In this paper, we propose a new method for depth-of-field extension in integral imaging by applying image fusion to multi-focus elemental images. In the proposed method, a camera is translated on a 2D grid to take multi-focus elemental images by sweeping the focus plane across the scene. Simply applying an image fusion method to elemental images holding rich parallax information does not work effectively, because registration accuracy is a prerequisite for image fusion. To solve this problem, an elemental image generalization method is proposed. The aim of this generalization process is to geometrically align the objects in all elemental images so that the correct regions of the multi-focus elemental images can be extracted. The all-in-focus elemental images are then generated by fusing the generalized elemental images using a block-based fusion method. The experimental results demonstrate that the depth of field of the synthetic aperture integral imaging system is extended by combining the generalization method with image fusion on multi-focus elemental images.

  11. Semi-implicit integration factor methods on sparse grids for high-dimensional systems

    NASA Astrophysics Data System (ADS)

    Wang, Dongyong; Chen, Weitao; Nie, Qing

    2015-07-01

    Numerical methods for partial differential equations in high-dimensional spaces are often limited by the curse of dimensionality. Though the sparse grid technique, based on a one-dimensional hierarchical basis through tensor products, is popular for handling challenges such as those associated with spatial discretization, the stability conditions on time step size due to temporal discretization, such as those associated with high-order derivatives in space and stiff reactions, remain. Here, we incorporate the sparse grids with the implicit integration factor method (IIF) that is advantageous in terms of stability conditions for systems containing stiff reactions and diffusions. We combine IIF, in which the reaction is treated implicitly and the diffusion is treated explicitly and exactly, with various sparse grid techniques based on the finite element and finite difference methods and a multi-level combination approach. The overall method is found to be efficient in terms of both storage and computational time for solving a wide range of PDEs in high dimensions. In particular, the IIF with the sparse grid combination technique is flexible and effective in solving systems that may include cross-derivatives and non-constant diffusion coefficients. Extensive numerical simulations in both linear and nonlinear systems in high dimensions, along with applications of diffusive logistic equations and Fokker-Planck equations, demonstrate the accuracy, efficiency, and robustness of the new methods, indicating potential broad applications of the sparse grid-based integration factor method.
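
    For context, in one common index convention the two-dimensional sparse grid combination technique referred to here combines solutions $u_{i,j}$ computed on anisotropic full grids of levels $(i,j)$ as $u^{c}_{n} = \sum_{i+j=n} u_{i,j} - \sum_{i+j=n-1} u_{i,j}$, so that only $O(n\,2^{n})$ grid points are used in total instead of the $O(4^{n})$ points of the full tensor grid, while retaining comparable accuracy up to logarithmic factors.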

  12. A robust and contact resolving Riemann solver on unstructured mesh, Part I, Euler method

    NASA Astrophysics Data System (ADS)

    Shen, Zhijun; Yan, Wei; Yuan, Guangwei

    2014-07-01

    This article presents a new cell-centered numerical method for compressible flows on arbitrary unstructured meshes. A multi-dimensional Riemann solver based on the HLLC method (denoted the HLLC-2D solver) is established. The work extends the cell-centered Lagrangian scheme of Maire et al. [27] to the Eulerian framework. As in [27], a two-dimensional contact velocity defined at a grid node is introduced; the motivation is to keep the edge fluxes intrinsically consistent with the velocity of the node connected to each edge. The main new feature of the algorithm is to relax the condition of the traditional HLLC solver that the contact pressures must be equal. The discontinuous fluxes are constructed across each wave sampling direction rather than only along the contact wave direction. The two-dimensional contact velocity of the grid node is determined by enforcing conservation of mass, momentum and total energy, so the new method satisfies these conservation properties at nodes rather than on grid edges. Other good properties of the HLLC-2D solver, such as positivity and contact preservation, are described, and a two-dimensional high-order extension is constructed employing a MUSCL-type reconstruction procedure. Numerical results on both quadrilateral and triangular grids demonstrate the robustness and accuracy of the new solver, which performs better than the existing HLLC method.
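
    For orientation, the classical one-dimensional HLLC flux that this nodal, multi-dimensional construction generalizes reads, in standard notation,
    $$F^{\mathrm{HLLC}} = \begin{cases} F_L, & 0 \le S_L,\\ F_L + S_L\,(U^{*}_L - U_L), & S_L \le 0 \le S^{*},\\ F_R + S_R\,(U^{*}_R - U_R), & S^{*} \le 0 \le S_R,\\ F_R, & S_R \le 0, \end{cases}$$
    with $S_L, S_R$ the outer wave-speed estimates and $S^{*}$ the contact speed; the traditional solver implicitly assumes a single contact pressure $p^{*}_L = p^{*}_R$, which is precisely the condition relaxed by the HLLC-2D solver.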

  13. The relationship between the spectral diversity of satellite imagery, habitat heterogeneity, and plant species richness

    Treesearch

    Steven D. Warren; Martin Alt; Keith D. Olson; Severin D. H. Irl; Manuel J. Steinbauer; Anke Jentsch

    2014-01-01

    Assessment of habitat heterogeneity and plant species richness at the landscape scale is often based on intensive and extensive fieldwork at great cost of time and money. We evaluated the use of satellite imagery as a quantitative measure of the relationship between the spectral diversity of satellite imagery, habitat heterogeneity, and plant species richness. A 16 km2...

  14. High-order central ENO finite-volume scheme for hyperbolic conservation laws on three-dimensional cubed-sphere grids

    NASA Astrophysics Data System (ADS)

    Ivan, L.; De Sterck, H.; Susanto, A.; Groth, C. P. T.

    2015-02-01

    A fourth-order accurate finite-volume scheme for hyperbolic conservation laws on three-dimensional (3D) cubed-sphere grids is described. The approach is based on a central essentially non-oscillatory (CENO) finite-volume method that was recently introduced for two-dimensional compressible flows and is extended to 3D geometries with structured hexahedral grids. Cubed-sphere grids feature hexahedral cells with nonplanar cell surfaces, which are handled with high-order accuracy using trilinear geometry representations in the proposed approach. Varying stencil sizes and slope discontinuities in grid lines occur at the boundaries and corners of the six sectors of the cubed-sphere grid where the grid topology is unstructured, and these difficulties are handled naturally with high-order accuracy by the multidimensional least-squares based 3D CENO reconstruction with overdetermined stencils. A rotation-based mechanism is introduced to automatically select appropriate smaller stencils at degenerate block boundaries, where fewer ghost cells are available and the grid topology changes, requiring stencils to be modified. Combining these building blocks results in a finite-volume discretization for conservation laws on 3D cubed-sphere grids that is uniformly high-order accurate in all three grid directions. While solution-adaptivity is natural in the multi-block setting of our code, high-order accurate adaptive refinement on cubed-sphere grids is not pursued in this paper. The 3D CENO scheme is an accurate and robust solution method for hyperbolic conservation laws on general hexahedral grids that is attractive because it is inherently multidimensional by employing a K-exact overdetermined reconstruction scheme, and it avoids the complexity of considering multiple non-central stencil configurations that characterizes traditional ENO schemes. Extensive numerical tests demonstrate fourth-order convergence for stationary and time-dependent Euler and magnetohydrodynamic flows on cubed-sphere grids, and robustness against spurious oscillations at 3D shocks. Performance tests illustrate efficiency gains that can be potentially achieved using fourth-order schemes as compared to second-order methods for the same error level. Applications on extended cubed-sphere grids incorporating a seventh root block that discretizes the interior of the inner sphere demonstrate the versatility of the spatial discretization method.

  15. A varied subglacial landscape under Thwaites Glacier, West Antarctica

    NASA Astrophysics Data System (ADS)

    Christianson, K. A.; Holschuh, N.; Paden, J. D.; Sprick, J.; Peters, L. E.; Anandakrishnan, S.; Alley, R. B.

    2017-12-01

    Deglaciated landscapes, whether subaerial or submarine, are often host to a rich panoply of subglacial landforms, such as drumlins, crags, megascale glacial lineations, grounding-line wedges, deep meltwater channels, and more. These landforms are formed and shaped by interactions between the ice and underlying substrate, and thus have implications for the flow of the overlying ice. Robust interpretations of the relationship between the ice and its substrate based on subglacial landforms that remain after deglaciation have been inhibited by a dearth of high-resolution observations of currently glaciated subglacial landscapes, where ice flow speed is known and where subglacial conditions can be ascertained using geophysical methods. Past direct observations of landforms under currently fast-flowing ice have been limited to a few ice streams, where relatively homogeneous, thick dilatant till layers may favor formation of specific subglacial features, i.e., megascale glacial lineations and grounding-zone wedges. Here we present two detailed gridded subglacial topographies, obtained from ice-penetrating radar measurements, from Thwaites Glacier, West Antarctica, where ice flows over a highly variable bed (in both topography and model-inferred basal shear stress). One grid is located ˜170 km downstream from the ice divide where ice is moving ˜100 m/yr. Here the ice advects over a broad basin and then flows into a subglacial ridge (of several hundred meters amplitude) oriented orthogonally to flow. A deep canyon (400 m) that cuts through this ridge in roughly the ice-flow direction and relatively soft sediments on the downstream side of the basin (immediately upstream of the canyon) suggest that a large subglacial lake may have formed in this location and drained catastrophically, as has been hypothesized as the formation mechanism for the deep canyons observed on the Amundsen Sea continental shelf. Numerous multiscale glacial lineations are also observed in the subglacial basin. The second grid is located ˜300 km downstream of the ice divide where the ice is moving ˜350 m/yr. A large crag and even more extensive multiscale subglacial lineations are observed in the downstream grid. Our results suggest that multiple subglacial landforms form in close geographic proximity due to heterogeneous basal conditions.

  16. Energy solutions in rural Africa: mapping electrification costs of distributed solar and diesel generation versus grid extension

    NASA Astrophysics Data System (ADS)

    Szabó, S.; Bódis, K.; Huld, T.; Moner-Girona, M.

    2011-07-01

    Three rural electrification options are analysed showing the cost optimal conditions for a sustainable energy development applying renewable energy sources in Africa. A spatial electricity cost model has been designed to point out whether diesel generators, photovoltaic systems or extension of the grid are the least-cost option in off-grid areas. The resulting mapping application offers support to decide in which regions the communities could be electrified either within the grid or in an isolated mini-grid. Donor programs and National Rural Electrification Agencies (or equivalent governmental departments) could use this type of delineation for their program boundaries and then could use the local optimization tools adapted to the prevailing parameters. The views expressed in this paper are those of the authors and do not necessarily represent European Commission and UNEP policy.
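
    A toy version of the per-cell decision logic behind such a map (all cost figures and parameter names below are hypothetical placeholders, not the values used by the authors) compares a crude levelised cost for each option and keeps the cheapest:

    ```python
    def cheapest_option(demand_kwh_yr, distance_to_grid_km,
                        grid_cost_per_km=8000.0, grid_energy_cost=0.10,
                        pv_lcoe=0.35, diesel_lcoe=0.55, lifetime_yr=20):
        """Pick the least-cost electrification option for one map cell (illustrative only).

        Costs are annualised very crudely: grid extension spreads a per-km line cost over
        the lifetime's energy demand and adds the cost of grid energy; PV mini-grids and
        diesel gensets are treated as flat per-kWh costs.
        """
        cost_per_kwh = {
            "grid_extension": grid_cost_per_km * distance_to_grid_km
                              / (lifetime_yr * demand_kwh_yr) + grid_energy_cost,
            "pv_mini_grid": pv_lcoe,
            "diesel_genset": diesel_lcoe,
        }
        best = min(cost_per_kwh, key=cost_per_kwh.get)
        return best, cost_per_kwh

    # Example: a remote cell far from the existing grid tends to favour off-grid options.
    print(cheapest_option(demand_kwh_yr=50_000, distance_to_grid_km=60))
    ```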

  17. Three-Dimensional Viscous Alternating Direction Implicit Algorithm and Strategies for Shape Optimization

    NASA Technical Reports Server (NTRS)

    Pandya, Mohagna J.; Baysal, Oktay

    1997-01-01

    A gradient-based shape optimization based on quasi-analytical sensitivities has been extended for practical three-dimensional aerodynamic applications. The flow analysis has been rendered by a fully implicit, finite-volume formulation of the Euler and Thin-Layer Navier-Stokes (TLNS) equations. Initially, the viscous laminar flow analysis for a wing has been compared with an independent computational fluid dynamics (CFD) code which has been extensively validated. The new procedure has been demonstrated in the design of a cranked arrow wing at Mach 2.4 with coarse- and fine-grid based computations performed with Euler and TLNS equations. The influence of the initial constraints on the geometry and aerodynamics of the optimized shape has been explored. Various final shapes generated for an identical initial problem formulation but with different optimization path options (coarse or fine grid, Euler or TLNS), have been aerodynamically evaluated via a common fine-grid TLNS-based analysis. The initial constraint conditions show significant bearing on the optimization results. Also, the results demonstrate that to produce an aerodynamically efficient design, it is imperative to include the viscous physics in the optimization procedure with the proper resolution. Based upon the present results, to better utilize the scarce computational resources, it is recommended that, a number of viscous coarse grid cases using either a preconditioned bi-conjugate gradient (PbCG) or an alternating-direction-implicit (ADI) method, should initially be employed to improve the optimization problem definition, the design space and initial shape. Optimized shapes should subsequently be analyzed using a high fidelity (viscous with fine-grid resolution) flow analysis to evaluate their true performance potential. Finally, a viscous fine-grid-based shape optimization should be conducted, using an ADI method, to accurately obtain the final optimized shape.

  18. Service-Oriented Architecture for NVO and TeraGrid Computing

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph; Miller, Craig; Williams, Roy; Steenberg, Conrad; Graham, Matthew

    2008-01-01

    The National Virtual Observatory (NVO) Extensible Secure Scalable Service Infrastructure (NESSSI) is a Web service architecture and software framework that enables Web-based astronomical data publishing and processing on grid computers such as the National Science Foundation's TeraGrid. Characteristics of this architecture include the following: (1) Services are created, managed, and upgraded by their developers, who are trusted users of computing platforms on which the services are deployed. (2) Service jobs can be initiated by means of Java or Python client programs run on a command line or with Web portals. (3) Access is granted within a graduated security scheme in which the size of a job that can be initiated depends on the level of authentication of the user.

  19. Job Scheduling in a Heterogeneous Grid Environment

    NASA Technical Reports Server (NTRS)

    Shan, Hong-Zhang; Smith, Warren; Oliker, Leonid; Biswas, Rupak

    2004-01-01

    Computational grids have the potential for solving large-scale scientific problems using heterogeneous and geographically distributed resources. However, a number of major technical hurdles must be overcome before this potential can be realized. One problem that is critical to effective utilization of computational grids is the efficient scheduling of jobs. This work addresses this problem by describing and evaluating a grid scheduling architecture and three job migration algorithms. The architecture is scalable and does not assume control of local site resources. The job migration policies use the availability and performance of computer systems, the network bandwidth available between systems, and the volume of input and output data associated with each job. An extensive performance comparison is presented using real workloads from leading computational centers. The results, based on several key metrics, demonstrate that the performance of our distributed migration algorithms is significantly greater than that of a local scheduling framework and comparable to a non-scalable global scheduling approach.
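
    The kind of migration decision described — weighing system availability and performance, inter-site bandwidth, and the I/O volume a job would have to move — can be sketched as a simple scoring rule (the field names and weighting are illustrative, not the paper's algorithms):

    ```python
    def estimated_turnaround(job, site):
        """Rough turnaround estimate for running `job` at `site` (illustrative sketch).

        job:  dict with 'work' (normalised compute units) and 'data_gb' to stage in/out
        site: dict with 'queue_wait_s', 'relative_speed', and 'bandwidth_mbps' to the job's data
        """
        compute_s = job["work"] / site["relative_speed"]
        transfer_s = job["data_gb"] * 8000.0 / site["bandwidth_mbps"]   # GB -> megabits -> seconds
        return site["queue_wait_s"] + compute_s + transfer_s

    def pick_site(job, sites):
        """Migrate the job to the site with the smallest estimated turnaround."""
        return min(sites, key=lambda s: estimated_turnaround(job, s))
    ```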

  20. Making the most of cloud storage - a toolkit for exploitation by WLCG experiments

    NASA Astrophysics Data System (ADS)

    Alvarez Ayllon, Alejandro; Arsuaga Rios, Maria; Bitzes, Georgios; Furano, Fabrizio; Keeble, Oliver; Manzi, Andrea

    2017-10-01

    Understanding how cloud storage can be effectively used, either standalone or in support of its associated compute, is now an important consideration for WLCG. We report on a suite of extensions to familiar tools targeted at enabling the integration of cloud object stores into traditional grid infrastructures and workflows. Notable updates include support for a number of object store flavours in FTS3, Davix and gfal2, including mitigations for lack of vector reads; the extension of Dynafed to operate as a bridge between grid and cloud domains; protocol translation in FTS3; the implementation of extensions to DPM (also implemented by the dCache project) to allow 3rd party transfers over HTTP. The result is a toolkit which facilitates data movement and access between grid and cloud infrastructures, broadening the range of workflows suitable for cloud. We report on deployment scenarios and prototype experience, explaining how, for example, an Amazon S3 or Azure allocation can be exploited by grid workflows.

  1. Why is China’s wind power generation not living up to its potential?

    NASA Astrophysics Data System (ADS)

    Huenteler, Joern; Tang, Tian; Chan, Gabriel; Diaz Anadon, Laura

    2018-04-01

    Following a decade of unprecedented investment, China now has the world’s largest installed base of wind power capacity. Yet, despite siting most wind farms in the wind-rich Northern and Western provinces, electricity generation from Chinese wind farms has not reached the performance benchmarks of the United States and many other advanced economies. This has resulted in lower environmental, economic, and health benefits than anticipated. We develop a framework to explain the performance of the Chinese and US wind sectors, accounting for a comprehensive set of driving factors. We apply this framework to a novel dataset of virtually all wind farms installed in China and the United States through the end of 2013. We first estimate the wind sector’s technical potential using a methodology that produces consistent estimates for both countries. We compare this potential to actual performance and find that Chinese wind farms generated electricity at 37%–45% of their annual technical potential during 2006–2013 compared to 54%–61% in the United States. Our findings underscore that the larger gap between actual performance and technical potential in China compared to the United States is significantly driven by delays in grid connection (14% of the gap) and curtailment due to constraints in grid management (10% of the gap), two challenges of China’s wind power expansion covered extensively in the literature. However, our findings show that China’s underperformance is also driven by suboptimal turbine model selection (31% of the gap), wind farm siting (23% of the gap), and turbine hub heights (6% of the gap)—factors that have received less attention in the literature and, crucially, are locked-in for the lifetime of wind farms. This suggests that besides addressing grid connection delays and curtailment, China will also need policy measures to address turbine siting and technology choices to achieve its national goals and increase utilization up to US levels.

  2. Extending life for people with a terminal illness: a moral right and an expensive death? Exploring societal perspectives.

    PubMed

    McHugh, Neil; Baker, Rachel M; Mason, Helen; Williamson, Laura; van Exel, Job; Deogaonkar, Rohan; Collins, Marissa; Donaldson, Cam

    2015-03-07

    Many publicly-funded health systems apply cost-benefit frameworks in response to the moral dilemma of how best to allocate scarce healthcare resources. However, implementation of recommendations based on costs and benefit calculations and subsequent challenges have led to 'special cases' with certain types of health benefits considered more valuable than others. Recent debate and research has focused on the relative value of life extensions for people with terminal illnesses. This research investigates societal perspectives in relation to this issue, in the UK. Q methodology was used to elicit societal perspectives from a purposively selected sample of data-rich respondents. Participants ranked 49 statements of opinion (developed for this study), onto a grid, according to level of agreement. These 'Q sorts' were followed by brief interviews. Factor analysis was used to identify shared points of view (patterns of similarity between individuals' Q sorts). Analysis produced a three factor solution. These rich, shared accounts can be broadly summarised as: i) 'A population perspective - value for money, no special cases', ii) 'Life is precious - valuing life-extension and patient choice', iii) 'Valuing wider benefits and opportunity cost - the quality of life and death'. From the factor descriptions it is clear that the main philosophical positions that have long dominated debates on the just allocation of resources have a basis in public opinion. The existence of certain moral positions in the views of society does not ethically imply, and pragmatically cannot mean, that all are translated into policy. Our findings highlight normative tensions and the importance of critically engaging with these normative issues (in addition to the current focus on a procedural justice approach to health policy). Future research should focus on i) the extent to which these perspectives are supported in society, ii) how respondents' perspectives relate to specific resource allocation questions, and iii) the characteristics of respondents associated with each perspective.

  3. GRID[subscript C] Renewable Energy Data Streaming into Classrooms

    ERIC Educational Resources Information Center

    DeLuca, V. William; Carpenter, Pam; Lari, Nasim

    2010-01-01

    For years, researchers have shown the value of using real-world data to enhance instruction in mathematics, science, and social studies. In an effort to help develop students' higher-order thinking skills in a data-rich learning environment, Green Research for Incorporating Data in the Classroom (GRID[subscript C]), a National Science…

  4. Residential expansion as a continental threat to U.S. coastal ecosystems

    Treesearch

    J.G. Bartlett; D.M. Mageean; R.J. O' Connor

    2000-01-01

    Spatially extensive analysis of satellite, climate, and census data reveals human-environment interactions of regional or continental concern in the United States. A grid-based principal components analysis of Bureau of Census variables revealed two independent demographic phenomena, a-settlement reflecting traditional human settlement patterns and p-settlement...

  5. A temperature-precipitation-based model of thirty-year mean snowpack accumulation and melt in Oregon, USA

    EPA Science Inventory

    High-resolution, spatially extensive climate grids can be useful in regional hydrologic applications. However, in regions where precipitation is dominated by snow, snowmelt models are often used to account for timing and magnitude of water delivery. We developed an empirical, non...

  6. Regional and latitudinal patterns of soft-bottom macrobenthic invertebrates along French coasts: Results from the RESOMAR database

    NASA Astrophysics Data System (ADS)

    Gallon, Régis K.; Lavesque, Nicolas; Grall, Jacques; Labrune, Céline; Gremare, Antoine; Bachelet, Guy; Blanchet, Hugues; Bonifácio, Paulo; Bouchet, Vincent M. P.; Dauvin, Jean-Claude; Desroy, Nicolas; Gentil, Franck; Guerin, Laurent; Houbin, Céline; Jourde, Jérôme; Laurand, Sandrine; Le Duff, Michel; Le Garrec, Vincent; de Montaudouin, Xavier; Olivier, Frédéric; Orvain, Francis; Sauriau, Pierre-Guy; Thiebaut, Éric; Gauthier, Olivier

    2017-12-01

    This study aims to describe the patterns of soft bottom macrozoobenthic richness along French coasts. It is based on a collaborative database developed by the "Réseau des Stations et Observatoires Marins" (RESOMAR). We investigated patterns of species richness in sublittoral soft bottom habitats (EUNIS level 3) at two different spatial scales: 1) seaboards: English Channel, Bay of Biscay and Mediterranean Sea and 2) 0.5° latitudinal and longitudinal grid. Total observed richness, rarefaction curves and three incidence-based richness estimators (Chao2, ICE and Jacknife1) were used to compare soft bottom habitats species richness in each seaboard. Overall, the Mediterranean Sea has the highest richness and despite higher sampling effort, the English Channel hosts the lowest number of species. The distribution of species occurrence within and between seaboards was assessed for each major phylum using constrained rarefaction curves. The Mediterranean Sea hosts the highest number of exclusive species. In pairwise comparisons, it also shares a lower proportion of taxa with the Bay of Biscay (34.1%) or the English Channel (27.6%) than that shared between these two seaboards (49.7%). Latitudinal species richness patterns along the Atlantic and English Channel coasts were investigated for each major phylum using partial LOESS regression controlling for sampling effort. This showed the existence of a bell-shaped latitudinal pattern, highlighting Brittany as a hotspot for macrobenthic richness at the confluence of two biogeographic provinces.
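
    For context, the incidence-based Chao2 estimator used here is conventionally written as $\hat{S}_{\mathrm{Chao2}} = S_{\mathrm{obs}} + Q_1^{2}/(2\,Q_2)$, where $S_{\mathrm{obs}}$ is the observed species richness and $Q_1$, $Q_2$ are the numbers of species found in exactly one and exactly two sampling units; ICE and the first-order jackknife adjust $S_{\mathrm{obs}}$ in analogous ways using the frequencies of rare species.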

  7. Universal access to electricity in Burkina Faso: scaling-up renewable energy technologies

    NASA Astrophysics Data System (ADS)

    Moner-Girona, M.; Bódis, K.; Huld, T.; Kougias, I.; Szabó, S.

    2016-08-01

    This paper describes the status quo of the power sector in Burkina Faso and its limitations, and develops a new spatial-analysis methodology aimed at providing a possible pathway to universal electricity access. Following the SE4All initiative approach, it recommends the more extensive use of distributed renewable energy systems to increase access to electricity on an accelerated timeline. Less than 5% of the rural population in Burkina Faso currently has access to electricity, and supply is lacking at many social structures such as schools and hospitals. Energy access achievements in Burkina Faso are still very modest: according to the latest SE4All Global Tracking Framework (2015), the annual growth rate of access to electricity in Burkina Faso from 2010 to 2012 was 0%. The rural electrification strategy for Burkina Faso is scattered across several electricity sector development policies, and a concrete action plan needs to be defined. Planning and coordination between grid extension and the off-grid electrification programme are essential to reach a long-term sustainable energy model and prevent avoidable infrastructure investments. This paper details the methodology and findings of the Geographic Information Systems tool developed. The aim of the dynamic planning tool is to support the national government and development partners in defining an alternative electrification plan. Burkina Faso proves to be a paradigm case for the methodology, as its national electrification policy is still dominated by grid extension and government subsidies for fossil fuel electricity production. However, the results of our analysis suggest that continued grid extension is becoming an inefficient and unsustainable way to reach the national energy access targets. The results also suggest that Burkina Faso’s rural electrification strategy should be driven by local renewable resources powering distributed mini-grids. We find that this approach would connect more people to power more quickly, and would reduce the fossil fuel use that would otherwise be necessary under grid extension options.

  8. Linking the open source, spatial electrification tool (ONSSET) and the open source energy modelling system (OSeMOSYS), with a focus on Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Mentis, Dimitrios; Howells, Mark; Rogner, Holger; Korkovelos, Alexandros; Arderne, Christopher; Siyal, Shahid; Zepeda, Eduardo; Taliotis, Constantinos; Bazilian, Morgan; de Roo, Ad; Tanvez, Yann; Oudalov, Alexandre; Scholtz, Ernst

    2017-04-01

    In September 2015, the United Nations General Assembly adopted Agenda 2030, which comprises a set of 17 Sustainable Development Goals (SDGs) defined by 169 targets. "Ensuring access to affordable, reliable, sustainable and modern energy for all by 2030" is the seventh goal (SDG7). While access to energy refers to more than electricity, the latter is the central focus of this work. According to the World Bank's 2015 Global Tracking Framework, roughly 15% of world population (or 1.1 billion people) lack access to electricity, and many more rely on poor quality electricity services. The majority of those without access (87%) reside in rural areas. This paper presents results of a Geographic Information Systems (GIS) approach coupled with open access data and linked to the Electricity Model Base for Africa (TEMBA), a model that represents each continental African country's electricity supply system. We present least-cost electrification strategies on a country-by-country basis for Sub-Saharan Africa. The electrification options include grid extension, mini-grid and stand-alone systems for rural, peri-urban, and urban contexts across the economy. At low levels of electricity demand there is a strong penetration of standalone technologies. However, higher electricity demand levels move the favourable electrification option from stand-alone systems to mini grid and to grid extensions.

  9. Turbulence generation through intense kinetic energy sources

    NASA Astrophysics Data System (ADS)

    Maqui, Agustin F.; Donzis, Diego A.

    2016-06-01

    Direct numerical simulations (DNS) are used to systematically study the development and establishment of turbulence when the flow is initialized with concentrated regions of intense kinetic energy. This resembles both active and passive grids which have been extensively used to generate and study turbulence in laboratories at different Reynolds numbers and with different characteristics, such as the degree of isotropy and homogeneity. A large DNS database was generated covering a wide range of initial conditions with a focus on perturbations with some directional preference, a condition found in active jet grids and passive grids passed through a contraction as well as a new type of active grid inspired by the experimental use of lasers to photo-excite the molecules that comprise the fluid. The DNS database is used to assert under what conditions the flow becomes turbulent and if so, the time required for this to occur. We identify a natural time scale of the problem which indicates the onset of turbulence and a single Reynolds number based exclusively on initial conditions which controls the evolution of the flow. It is found that a minimum Reynolds number is needed for the flow to evolve towards fully developed turbulence. An extensive analysis of single and two point statistics, velocity as well as spectral dynamics and anisotropy measures is presented to characterize the evolution of the flow towards realistic turbulence.

  10. Large-eddy simulation of wind turbine wake interactions on locally refined Cartesian grids

    NASA Astrophysics Data System (ADS)

    Angelidis, Dionysios; Sotiropoulos, Fotis

    2014-11-01

    Performing high-fidelity numerical simulations of turbulent flow in wind farms remains a challenging issue mainly because of the large computational resources required to accurately simulate the turbine wakes and turbine/turbine interactions. The discretization of the governing equations on structured grids for mesoscale calculations may not be the most efficient approach for resolving the large disparity of spatial scales. A 3D Cartesian grid refinement method enabling the efficient coupling of the Actuator Line Model (ALM) with locally refined unstructured Cartesian grids adapted to accurately resolve tip vortices and multi-turbine interactions, is presented. Second order schemes are employed for the discretization of the incompressible Navier-Stokes equations in a hybrid staggered/non-staggered formulation coupled with a fractional step method that ensures the satisfaction of local mass conservation to machine zero. The current approach enables multi-resolution LES of turbulent flow in multi-turbine wind farms. The numerical simulations are in good agreement with experimental measurements and are able to resolve the rich dynamics of turbine wakes on grids containing only a small fraction of the grid nodes that would be required in simulations without local mesh refinement. This material is based upon work supported by the Department of Energy under Award Number DE-EE0005482 and the National Science Foundation under Award number NSF PFI:BIC 1318201.

  11. A comparative analysis of dynamic grids vs. virtual grids using the A3pviGrid framework.

    PubMed

    Shankaranarayanan, Avinas; Amaldas, Christine

    2010-11-01

    With the proliferation of quad-/multi-core micro-processors in mainstream platforms such as desktops and workstations, a large number of unused CPU cycles can be utilized for running virtual machines (VMs) as dynamic nodes in distributed environments. Grid services and their service-oriented business broker, now termed cloud computing, could deploy image-based virtualization platforms enabling agent-based resource management and dynamic fault management. In this paper we present an efficient way of utilizing heterogeneous virtual machines on idle desktops as an environment for consuming high-performance grid services. Spurious and exponential increases in dataset size are constant concerns in the medical and pharmaceutical industries due to the continuous discovery and publication of large sequence databases. As previously discussed, traditional algorithms are not designed to handle large data sizes under sudden and dynamic changes in the execution environment. This research was undertaken to compare our previous results with those obtained by running the same test dataset on a virtual Grid platform using virtual machines (virtualization). The implemented architecture, A3pviGrid, utilizes game-theoretic optimization and agent-based team formation (coalition) algorithms to improve scalability with respect to team formation. Due to the dynamic nature of distributed systems (as discussed in our previous work), all interactions are made local within a team, transparently. This paper is a proof of concept comparing an experimental mini-Grid test-bed with running the platform on local virtual machines on a local test cluster; this was done to give every agent its own execution platform, enabling anonymity and better control of the dynamic environmental parameters. We also analyze the performance and scalability of BLAST in a multiple-virtual-node setup and present our findings. This paper extends our previous research on improving the BLAST application framework using dynamic Grids on virtualization platforms such as VirtualBox.

  12. Space-based Science Operations Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Welch, Clara L.; Redman, Sandra

    2004-01-01

    Grid technology is an up-and-coming technology that enables widely disparate services to be offered to users in a way that is economical, easy to use and not otherwise widely available. Under the Grid concept, disparate organizations, generally defined as "virtual organizations", can share services, i.e., share the discipline-specific computer applications required to accomplish their specific scientific and engineering goals and objectives. Grids are emerging as the new technology of the future. Grid technology has been enabled by the evolution of increasingly high-speed networking; without it, Grid technology would not have emerged. NASA/Marshall Space Flight Center's (MSFC) Flight Projects Directorate, Ground Systems Department is developing a Space-based Science Operations Grid prototype to provide scientists and engineers the tools necessary to operate space-based science payloads/experiments and for scientists to conduct public and educational outreach. In addition, Grid technology can provide new services not currently available to users. These services include mission voice and video, application sharing, telemetry management and display, payload and experiment commanding, data mining, high-order data processing, discipline-specific application sharing and data storage, all from a single grid portal. The Prototype will provide most of these services in a first-step demonstration of integrated Grid and space-based science operations technologies. It will initially be based on the International Space Station science operational services located at the Payload Operations Integration Center at MSFC, but can be applied to many NASA projects, including free-flying satellites and future projects. The Prototype will use the Internet2 Abilene Research and Education Network, currently a 10 Gb backbone network, to reach the University of Alabama in Huntsville and several other, as yet unidentified, Space Station based science experimenters. There is an international aspect to the Grid involving the America's Pathway (AMPath) network, the Chilean REUNA Research and Education Network and the University of Chile in Santiago that will further demonstrate how extensively these services can be used. From the user's perspective, the Prototype will provide a single interface and logon to these varied services without the complexity of knowing the wheres and hows of each service. There is a separate and deliberate emphasis on security: security will be addressed by specifically outlining the different approaches and tools used. Grid technology, unlike the Internet, is being designed with security in mind. In addition we will show the locations, configurations and network paths associated with each service and virtual organization. We will discuss the separate virtual organizations that we define for the varied user communities. These will include certain, as yet undetermined, space-based science functions and/or processes, and will include specific virtual organizations required for public and educational outreach and for science and engineering collaboration. We will also discuss the Grid Prototype's performance and the potential for further Grid applications in both space-based and ground-based projects and processes. In this paper and presentation we detail each service and how the services are integrated using Grid technologies.

  13. Towards Dynamic Service Level Agreement Negotiation: An Approach Based on WS-Agreement

    NASA Astrophysics Data System (ADS)

    Pichot, Antoine; Wäldrich, Oliver; Ziegler, Wolfgang; Wieder, Philipp

    In Grid, e-Science and e-Business environments, Service Level Agreements are often used to establish frameworks for the delivery of services between service providers and the organisations hosting the researchers. While these high-level SLAs define the overall quality of the services, it is desirable for the end-user to also have dedicated service quality for individual services, such as the orchestration of resources necessary for composed services. Grid-level scheduling services are typically responsible for the orchestration and co-ordination of resources in the Grid; co-allocation, for example, requires the Grid-level scheduler to co-ordinate resource management systems located in different domains. As site autonomy has to be respected, negotiation is the only way to achieve the intended co-ordination. SLAs have emerged as a new way to negotiate and manage the usage of resources in the Grid and are already adopted by a number of management systems. Therefore, it is natural to look for ways to adopt SLAs for Grid-level scheduling. To do this, efficient and flexible protocols are needed that support dynamic negotiation and creation of SLAs. In this paper we propose and discuss extensions to the WS-Agreement protocol addressing these issues.

  14. A Study of the Relationship between Weather Variables and Electric Power Demand inside a Smart Grid/Smart World Framework

    PubMed Central

    Hernández, Luis; Baladrón, Carlos; Aguiar, Javier M.; Calavia, Lorena; Carro, Belén; Sánchez-Esguevillas, Antonio; Cook, Diane J.; Chinarro, David; Gómez, Jorge

    2012-01-01

    One of the main challenges of today's society is the need to fulfill at the same time the two sides of the dichotomy between the growing energy demand and the need to look after the environment. Smart Grids are one of the answers: intelligent energy grids which retrieve data about the environment through extensive sensor networks and react accordingly to optimize resource consumption. In order to do this, the Smart Grids need to understand the existing relationship between energy demand and a set of relevant climatic variables. All smart “systems” (buildings, cities, homes, consumers, etc.) have the potential to employ their intelligence for self-adaptation to climate conditions. After introducing the Smart World, a global framework for the collaboration of these smart systems, this paper presents the relationship found at experimental level between a range of relevant weather variables and electric power demand patterns, presenting a case study using an agent-based system, and emphasizing the need to consider this relationship in certain Smart World (and specifically Smart Grid and microgrid) applications.

  15. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    ERIC Educational Resources Information Center

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is a set of hardware, software, networks, storage, services and interfaces combined to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independency and…

  16. Identification of linearised RMS-voltage dip patterns based on clustering in renewable plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, Edward

    Generation units connected to the grid are currently required to meet low-voltage ride-through (LVRT) requirements. In most developed countries, these requirements also apply to renewable sources, mainly wind power plants and photovoltaic installations connected to the grid. This study proposes an alternative characterisation solution to classify and visualise a large number of collected events in light of current limits and requirements. The authors' approach is based on linearised root-mean-square (RMS) voltage trajectories, taking into account LVRT requirements, and a clustering process to identify the most likely pattern trajectories. The proposed solution gives extensive information on an event's severity by providing a simple but complete visualisation of the linearised RMS-voltage patterns. In addition, these patterns are compared to current LVRT requirements to determine similarities or discrepancies. A large number of collected events can then be automatically classified and visualised for comparative purposes. Real disturbances collected from renewable sources in Spain are used to assess the proposed solution. Extensive results and discussions are also included in this study.
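
    As an illustration of the clustering step (a generic k-means sketch on fixed-length linearised RMS-voltage trajectories; the authors' exact feature construction and clustering algorithm may differ):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def dip_patterns(trajectories, n_patterns=4, random_state=0):
        """Cluster linearised RMS-voltage dip trajectories into representative patterns.

        trajectories: array of shape (n_events, n_samples), each row a per-unit RMS-voltage
                      trajectory resampled onto a common time base.
        Returns the cluster label of each event and the centroid trajectories (the 'patterns').
        """
        X = np.asarray(trajectories, dtype=float)
        km = KMeans(n_clusters=n_patterns, n_init=10, random_state=random_state).fit(X)
        return km.labels_, km.cluster_centers_

    # Example usage (hypothetical file): events = np.load("dips.npy")
    # labels, patterns = dip_patterns(events)
    ```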

  17. The Mass-loss Return from Evolved Stars to the Large Magellanic Cloud. VI. Luminosities and Mass-loss Rates on Population Scales

    NASA Astrophysics Data System (ADS)

    Riebel, D.; Srinivasan, S.; Sargent, B.; Meixner, M.

    2012-07-01

    We present results from the first application of the Grid of Red Supergiant and Asymptotic Giant Branch ModelS (GRAMS) model grid to the entire evolved stellar population of the Large Magellanic Cloud (LMC). GRAMS is a pre-computed grid of 80,843 radiative transfer models of evolved stars and circumstellar dust shells composed of either silicate or carbonaceous dust. We fit GRAMS models to ~30,000 asymptotic giant branch (AGB) and red supergiant (RSG) stars in the LMC, using 12 bands of photometry from the optical to the mid-infrared. Our published data set consists of thousands of evolved stars with individually determined evolutionary parameters such as luminosity and mass-loss rate. The GRAMS grid has a greater than 80% accuracy rate discriminating between oxygen- and carbon-rich chemistry. The global dust injection rate to the interstellar medium (ISM) of the LMC from RSGs and AGB stars is on the order of $2.1 \times 10^{-5}\,M_\odot\,\mathrm{yr}^{-1}$, equivalent to a total mass injection rate (including the gas) into the ISM of $\sim 6 \times 10^{-3}\,M_\odot\,\mathrm{yr}^{-1}$. Carbon stars inject two and a half times as much dust into the ISM as do O-rich AGB stars, but the same amount of mass. We determine a bolometric correction factor for C-rich AGB stars in the $K_s$ band as a function of $J-K_s$ color, $BC_{K_s} = -0.40(J-K_s)^2 + 1.83(J-K_s) + 1.29$. We determine several IR color proxies for the dust mass-loss rate ($\dot{M}_d$) from C-rich AGB stars, such as $\log \dot{M}_d = \frac{-18.90}{(K_s-[8.0])+3.37} - 5.93$. We find that a larger fraction of AGB stars exhibiting the "long-secondary period" phenomenon are more O-rich than stars dominated by radial pulsations, and AGB stars without detectable mass loss do not appear on either the first-overtone or fundamental-mode pulsation sequences.

  18. [Research on tumor information grid framework].

    PubMed

    Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing

    2013-10-01

    In order to realize tumor disease information sharing and unified management, we utilized grid technology to effectively integrate the data and software resources distributed across various medical institutions, so that the heterogeneous resources become consistent and interoperable in both semantic and syntactic terms. This article describes the tumor grid framework: the service types are packaged in Web Services Description Language (WSDL) and XML Schema Definition (XSD), and the client uses the serialized documents to operate on the distributed resources. The service objects can be built with the Unified Modeling Language (UML) as middleware to create application programming interfaces. All of the grid resources are registered in the index and released in the form of Web Services based on the Web Services Resource Framework (WSRF). Using this system we can build a multi-center, large-sample, networked tumor disease resource sharing framework to improve the level of development in medical scientific research institutions and patients' quality of life.

  19. Efficient Authorization of Rich Presence Using Secure and Composed Web Services

    NASA Astrophysics Data System (ADS)

    Li, Li; Chou, Wu

    This paper presents an extended Role-Based Access Control (RBAC) model for efficient authorization of rich presence using secure web services composed with an abstract presence data model. Following the information symmetry principle, the standard RBAC model is extended to support context sensitive social relations and cascaded authority. In conjunction with the extended RBAC model, we introduce an extensible presence architecture prototype using WS-Security and WS-Eventing to secure rich presence information exchanges based on PKI certificates. Applications and performance measurements of our presence system are presented to show that the proposed RBAC framework for presence and collaboration is well suited for real-time communication and collaboration.

  20. ETICS: the international software engineering service for the grid

    NASA Astrophysics Data System (ADS)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  1. Smart Grid as Multi-layer Interacting System for Complex Decision Makings

    NASA Astrophysics Data System (ADS)

    Bompard, Ettore; Han, Bei; Masera, Marcelo; Pons, Enrico

    This chapter presents an approach to the analysis of Smart Grids based on a multi-layer representation of their technical, cyber, social and decision-making aspects, as well as the related environmental constraints. In the Smart Grid paradigm, self-interested active customers (prosumers), system operators and market players interact among themselves making use of an extensive cyber infrastructure. In addition, policy decision makers define regulations, incentives and constraints to drive the behavior of the competing operators and prosumers, with the objective of ensuring the global desired performance (e.g. system stability, fair prices). For these reasons, the policy decision making is more complicated than in traditional power systems, and needs proper modeling and simulation tools for assessing "in vitro" and ex-ante the possible impacts of the decisions assumed. In this chapter, we consider the smart grids as multi-layered interacting complex systems. The intricacy of the framework, characterized by several interacting layers, cannot be captured by closed-form mathematical models. Therefore, a new approach using Multi Agent Simulation is described. With case studies we provide some indications about how to develop agent-based simulation tools presenting some preliminary examples.

  2. An improved cellular automaton method to model multispecies biofilms.

    PubMed

    Tang, Youneng; Valocchi, Albert J

    2013-10-01

    Biomass-spreading rules used in previous cellular automaton methods to simulate multispecies biofilm introduced extensive mixing between different biomass species or resulted in spatially discontinuous biomass concentration and distribution; this caused results based on the cellular automaton methods to deviate from experimental results and those from the more computationally intensive continuous method. To overcome the problems, we propose new biomass-spreading rules in this work: Excess biomass spreads by pushing a line of grid cells that are on the shortest path from the source grid cell to the destination grid cell, and the fractions of different biomass species in the grid cells on the path change due to the spreading. To evaluate the new rules, three two-dimensional simulation examples are used to compare the biomass distribution computed using the continuous method and three cellular automaton methods, one based on the new rules and the other two based on rules presented in two previous studies. The relationship between the biomass species is syntrophic in one example and competitive in the other two examples. Simulation results generated using the cellular automaton method based on the new rules agree much better with the continuous method than do results using the other two cellular automaton methods. The new biomass-spreading rules are no more complex to implement than the existing rules. Copyright © 2013 Elsevier Ltd. All rights reserved.
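
    As a rough illustration of the kind of spreading rule described above (not the authors' implementation, which also redistributes the species fractions along the path), the sketch below pushes the excess biomass of an overfull cell along a shortest path of grid cells toward an assumed empty destination cell, shifting the intermediate cells outward by one position.

```python
import numpy as np

def shortest_path(src, dst):
    """Grid cells on a simple shortest path from src to dst, stepping first
    along rows and then along columns (excluding the source itself)."""
    (r, c), (r2, c2) = src, dst
    path = []
    while r != r2:
        r += 1 if r2 > r else -1
        path.append((r, c))
    while c != c2:
        c += 1 if c2 > c else -1
        path.append((r, c))
    return path

def spread_excess(biomass, src, dst, capacity):
    """Push excess biomass from an overfull source cell toward an (assumed
    empty) destination cell: every cell on the path is shifted one step
    outward, and the excess is deposited in the cell next to the source."""
    cells = [src] + shortest_path(src, dst)
    for i in range(len(cells) - 1, 0, -1):   # shift contents toward the destination
        biomass[cells[i]] = biomass[cells[i - 1]]
    excess = biomass[src] - capacity
    biomass[src] = capacity
    biomass[cells[1]] = excess
    return biomass

grid = np.zeros((5, 5))
grid[2, 2] = 1.5                             # overfull source cell (capacity 1.0)
print(spread_excess(grid, (2, 2), (2, 4), capacity=1.0))
```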

  3. Intelligent energy allocation strategy for PHEV charging station using gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Rahman, Imran; Vasant, Pandian M.; Singh, Balbir Singh Mahinder; Abdullah-Al-Wadud, M.

    2014-10-01

    Recent research on the use of green technologies to reduce pollution and increase the penetration of renewable energy sources in the transportation sector is gaining popularity. The development of a smart grid environment focusing on PHEVs may also heal some of the prevailing grid problems by enabling the implementation of the Vehicle-to-Grid (V2G) concept. Intelligent energy management is an important issue which has already drawn much attention from researchers. Most of these works require the formulation of mathematical models which extensively use computational intelligence-based optimization techniques to solve many technical problems. Higher penetration of PHEVs requires adequate charging infrastructure as well as smart charging strategies. We used the Gravitational Search Algorithm (GSA) to intelligently allocate energy to the PHEVs, considering constraints such as energy price, remaining battery capacity, and remaining charging time.
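
    For reference, a generic (unconstrained) gravitational search step looks like the sketch below; the authors' formulation adds PHEV-specific constraints such as energy price, remaining battery capacity, and remaining charging time, which are not modelled here. The toy objective and parameter values are illustrative only.

```python
import numpy as np

def gsa_minimize(f, bounds, n_agents=20, iters=100, g0=100.0, alpha=20.0, seed=0):
    """Generic Gravitational Search Algorithm for minimising f over a box.
    Agents attract one another with 'gravity' proportional to fitness-derived
    masses; the gravitational constant decays over the iterations."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_agents, lo.size))
    v = np.zeros_like(x)
    best_x, best_f = None, np.inf
    for t in range(iters):
        fit = np.array([f(xi) for xi in x])
        if fit.min() < best_f:
            best_f, best_x = fit.min(), x[fit.argmin()].copy()
        worst, best = fit.max(), fit.min()
        m = (worst - fit) / (worst - best + 1e-12)      # smaller cost -> larger mass
        M = m / (m.sum() + 1e-12)
        G = g0 * np.exp(-alpha * t / iters)             # decaying gravitational constant
        acc = np.zeros_like(x)
        for i in range(n_agents):                       # randomly weighted attraction
            diff = x - x[i]
            dist = np.linalg.norm(diff, axis=1) + 1e-12
            acc[i] = (rng.random(n_agents) * G * M / dist) @ diff
        v = rng.random(x.shape) * v + acc
        x = np.clip(x + v, lo, hi)
    return best_x, best_f

# Toy allocation: energy x_i for 3 chargers, penalising deviation from demand.
demand = np.array([6.0, 4.0, 8.0])
print(gsa_minimize(lambda x: np.sum((x - demand) ** 2),
                   (np.zeros(3), np.full(3, 10.0))))
```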

  4. Stronger tests of mechanisms underlying geographic gradients of biodiversity: insights from the dimensionality of biodiversity.

    PubMed

    Stevens, Richard D; Tello, J Sebastián; Gavilanez, María Mercedes

    2013-01-01

    Inference involving diversity gradients typically is gathered by mechanistic tests involving single dimensions of biodiversity such as species richness. Nonetheless, because traits such as geographic range size, trophic status or phenotypic characteristics are tied to a particular species, mechanistic effects driving broad diversity patterns should manifest across numerous dimensions of biodiversity. We develop an approach of stronger inference based on numerous dimensions of biodiversity and apply it to evaluate one such putative mechanism: the mid-domain effect (MDE). Species composition of 10,000-km(2) grid cells was determined by overlaying geographic range maps of 133 noctilionoid bat taxa. We determined empirical diversity gradients in the Neotropics by calculating species richness and three indices each of phylogenetic, functional and phenetic diversity for each grid cell. We also created 1,000 simulated gradients of each examined metric of biodiversity based on a MDE model to estimate patterns expected if species distributions were randomly placed within the Neotropics. For each simulation run, we regressed the observed gradient onto the MDE-expected gradient. If a MDE drives empirical gradients, then coefficients of determination from such an analysis should be high, the intercept no different from zero and the slope no different than unity. Species richness gradients predicted by the MDE fit empirical patterns. The MDE produced strong spatially structured gradients of taxonomic, phylogenetic, functional and phenetic diversity. Nonetheless, expected values generated from the MDE for most dimensions of biodiversity exhibited poor fit to most empirical patterns. The MDE cannot account for most empirical patterns of biodiversity. Fuller understanding of latitudinal gradients will come from simultaneous examination of relative effects of random, environmental and historical mechanisms to better understand distribution and abundance of the current biota.
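
    The MDE test described above reduces to regressing the observed gradient onto each simulated gradient and checking that the coefficient of determination is high, the intercept is near zero, and the slope is near one. A minimal sketch of that check, with synthetic stand-in data rather than the noctilionoid bat gradients:

```python
import numpy as np

def mde_fit(observed, expected):
    """Regress an observed diversity gradient onto an MDE-expected gradient;
    an MDE-driven pattern should give high R^2, intercept ~ 0, slope ~ 1."""
    X = np.column_stack([np.ones_like(expected), expected])
    coef, res, *_ = np.linalg.lstsq(X, observed, rcond=None)
    intercept, slope = coef
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - res[0] / ss_tot if res.size else 1.0
    return slope, intercept, r2

# Synthetic stand-ins: one "empirical" gradient and 1,000 simulated gradients.
rng = np.random.default_rng(1)
observed = rng.poisson(30, size=500).astype(float)          # richness per grid cell
sims = [0.2 * observed + rng.normal(0.0, 5.0, 500) for _ in range(1000)]
stats = np.array([mde_fit(observed, s) for s in sims])
print(stats.mean(axis=0))    # mean slope, intercept and R^2 across simulations
```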

  5. Stronger Tests of Mechanisms Underlying Geographic Gradients of Biodiversity: Insights from the Dimensionality of Biodiversity

    PubMed Central

    Stevens, Richard D.; Tello, J. Sebastián; Gavilanez, María Mercedes

    2013-01-01

    Inference involving diversity gradients typically is gathered by mechanistic tests involving single dimensions of biodiversity such as species richness. Nonetheless, because traits such as geographic range size, trophic status or phenotypic characteristics are tied to a particular species, mechanistic effects driving broad diversity patterns should manifest across numerous dimensions of biodiversity. We develop an approach of stronger inference based on numerous dimensions of biodiversity and apply it to evaluate one such putative mechanism: the mid-domain effect (MDE). Species composition of 10,000-km2 grid cells was determined by overlaying geographic range maps of 133 noctilionoid bat taxa. We determined empirical diversity gradients in the Neotropics by calculating species richness and three indices each of phylogenetic, functional and phenetic diversity for each grid cell. We also created 1,000 simulated gradients of each examined metric of biodiversity based on a MDE model to estimate patterns expected if species distributions were randomly placed within the Neotropics. For each simulation run, we regressed the observed gradient onto the MDE-expected gradient. If a MDE drives empirical gradients, then coefficients of determination from such an analysis should be high, the intercept no different from zero and the slope no different than unity. Species richness gradients predicted by the MDE fit empirical patterns. The MDE produced strong spatially structured gradients of taxonomic, phylogenetic, functional and phenetic diversity. Nonetheless, expected values generated from the MDE for most dimensions of biodiversity exhibited poor fit to most empirical patterns. The MDE cannot account for most empirical patterns of biodiversity. Fuller understanding of latitudinal gradients will come from simultaneous examination of relative effects of random, environmental and historical mechanisms to better understand distribution and abundance of the current biota. PMID:23451099

  6. Links between Bloom's Taxonomy and Gardener's Multiple Intelligences: The Issue of Textbook Analysis

    ERIC Educational Resources Information Center

    Tabari, Mahmoud Abdi; Tabari, Iman Abdi

    2015-01-01

    The major thrust of this research was to investigate the cognitive aspect of the high school textbooks and interchange series, due to their extensive use, through content analysis based on Bloom's taxonomy and Gardner's Multiple Intelligences (MI). This study embraced two perspectives in a grid in order to broaden and deepen the analysis by…

  7. Toward Verification of USM3D Extensions for Mixed Element Grids

    NASA Technical Reports Server (NTRS)

    Pandya, Mohagna J.; Frink, Neal T.; Ding, Ejiang; Parlette, Edward B.

    2013-01-01

    The unstructured tetrahedral grid cell-centered finite volume flow solver USM3D has been recently extended to handle mixed element grids composed of hexahedral, prismatic, pyramidal, and tetrahedral cells. Presently, two turbulence models, namely, baseline Spalart-Allmaras (SA) and Menter Shear Stress Transport (SST), support mixed element grids. This paper provides an overview of the various numerical discretization options available in the newly enhanced USM3D. Using the SA model, the flow solver extensions are verified on three two-dimensional test cases available on the Turbulence Modeling Resource website at the NASA Langley Research Center. The test cases are zero pressure gradient flat plate, planar shear, and bump-in-channel. The effect of cell topologies on the flow solution is also investigated using the planar shear case. Finally, the assessment of various cell and face gradient options is performed on the zero pressure gradient flat plate case.

  8. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  9. Modelling a Set of Carbon-Rich AGB Stars at High-Angular Resolution

    NASA Astrophysics Data System (ADS)

    Rau, Gioia; Hron, Josef; Paladini, Claudia; Aringer, Bernard; Eriksson, Kjell; Marigo, Paola; Nowotny, Walter; Grellmann, Rebekka

    2016-07-01

    We compared spectro-photometric and interferometric observations of six carbon-rich AGB stars with a grid of self-consistent model atmospheres. The targets are: R Lep, R Vol, Y Pav, AQ Sgr, U Hya and X TrA. Please refer to the publication Rau et al. 2016 (subm.) for further details on those findings.

  10. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707

  11. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.

  12. Hebbian Plasticity Realigns Grid Cell Activity with External Sensory Cues in Continuous Attractor Models

    PubMed Central

    Mulas, Marcello; Waniek, Nicolai; Conradt, Jörg

    2016-01-01

    After the discovery of grid cells, which are an essential component to understand how the mammalian brain encodes spatial information, three main classes of computational models were proposed in order to explain their working principles. Amongst them, the one based on continuous attractor networks (CAN) is promising in terms of biological plausibility and suitable for robotic applications. However, in its current formulation, it is unable to reproduce important electrophysiological findings and cannot be used to perform path integration for long periods of time. In fact, in the absence of an appropriate resetting mechanism, the accumulation of errors over time due to the noise intrinsic in velocity estimation and neural computation prevents CAN models from reproducing stable spatial grid patterns. In this paper, we propose an extension of the CAN model using Hebbian plasticity to anchor grid cell activity to environmental landmarks. To validate our approach we used as input to the neural simulations both artificial data and real data recorded from a robotic setup. The additional neural mechanism can not only anchor grid patterns to external sensory cues but also recall grid patterns generated in previously explored environments. These results might be instrumental for next generation bio-inspired robotic navigation algorithms that take advantage of neural computation in order to cope with complex and dynamic environments. PMID:26924979
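
    A highly simplified, hypothetical sketch of the kind of Hebbian anchoring the extension adds: synapses from active landmark (sensory) cells onto the currently active attractor units are potentiated, so that a later observation of the same landmark injects a corrective input that pulls the attractor state back toward the anchored grid phase. Network sizes and the update rule are illustrative, not the paper's model.

```python
import numpy as np

def hebbian_update(W, landmark, grid_activity, eta=0.05, w_max=1.0):
    """Hebbian plasticity: weights from active landmark cells onto active
    attractor (grid) units are potentiated, with a simple saturation bound."""
    W = W + eta * np.outer(grid_activity, landmark)
    return np.clip(W, 0.0, w_max)

def landmark_correction(W, landmark):
    """Corrective input injected into the attractor when a learned landmark
    is observed: it biases activity back toward the anchored grid state."""
    return W @ landmark

rng = np.random.default_rng(2)
W = np.zeros((100, 20))                   # 100 attractor units, 20 landmark cells
grid_state = rng.random(100)              # attractor activity when landmark first seen
landmark = (rng.random(20) > 0.7).astype(float)
for _ in range(10):                       # repeated pairings strengthen the anchor
    W = hebbian_update(W, landmark, grid_state)
print(landmark_correction(W, landmark)[:5])
```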

  13. Distribution, congruence, and hotspots of higher plants in China

    PubMed Central

    Zhao, Lina; Li, Jinya; Liu, Huiyuan; Qin, Haining

    2016-01-01

    Identifying biodiversity hotspots has become a central issue in setting up priority protection areas, especially as financial resources for biological diversity conservation are limited. Taking China’s Higher Plants Red List (CHPRL), including Bryophytes, Ferns, Gymnosperms, Angiosperms, as the data source, we analyzed the geographic patterns of species richness, endemism, and endangerment via data processing at a fine grid-scale with an average edge length of 30 km based on three aspects of richness information: species richness, endemic species richness, and threatened species richness. We sought to test the accuracy of hotspots used in identifying conservation priorities with regard to higher plants. Next, we tested the congruence of the three aspects and made a comparison of the similarities and differences between the hotspots described in this paper and those in previous studies. We found that over 90% of threatened species in China are concentrated. While a high spatial congruence is observed among the three measures, there is a low congruence between two different sets of hotspots. Our results suggest that biodiversity information should be considered when identifying biological hotspots. Other factors, such as scales, should be included as well to develop biodiversity conservation plans in accordance with the region’s specific conditions. PMID:26750244

  14. Distribution, congruence, and hotspots of higher plants in China.

    PubMed

    Zhao, Lina; Li, Jinya; Liu, Huiyuan; Qin, Haining

    2016-01-11

    Identifying biodiversity hotspots has become a central issue in setting up priority protection areas, especially as financial resources for biological diversity conservation are limited. Taking China's Higher Plants Red List (CHPRL), including Bryophytes, Ferns, Gymnosperms, Angiosperms, as the data source, we analyzed the geographic patterns of species richness, endemism, and endangerment via data processing at a fine grid-scale with an average edge length of 30 km based on three aspects of richness information: species richness, endemic species richness, and threatened species richness. We sought to test the accuracy of hotspots used in identifying conservation priorities with regard to higher plants. Next, we tested the congruence of the three aspects and made a comparison of the similarities and differences between the hotspots described in this paper and those in previous studies. We found that over 90% of threatened species in China are concentrated. While a high spatial congruence is observed among the three measures, there is a low congruence between two different sets of hotspots. Our results suggest that biodiversity information should be considered when identifying biological hotspots. Other factors, such as scales, should be included as well to develop biodiversity conservation plans in accordance with the region's specific conditions.
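
    The grid-scale richness statistics described here amount to binning occurrence records into equal-sized cells and counting distinct species per cell. A small sketch with hypothetical occurrence data (30 km cells assume projected coordinates in metres):

```python
import numpy as np
from collections import defaultdict

def grid_richness(x, y, species, cell_size=30_000.0):
    """Count distinct species per square grid cell. x and y are projected
    coordinates in metres; cell_size is the cell edge length (30 km here)."""
    cells = defaultdict(set)
    for xi, yi, sp in zip(x, y, species):
        cells[(int(xi // cell_size), int(yi // cell_size))].add(sp)
    return {cell: len(spp) for cell, spp in cells.items()}

# Hypothetical occurrence records: coordinates in metres, integer species codes.
rng = np.random.default_rng(3)
x = rng.uniform(0, 300_000, 1000)
y = rng.uniform(0, 300_000, 1000)
species = rng.integers(0, 200, 1000)
richness = grid_richness(x, y, species)
print(max(richness.values()), "species in the richest 30 km cell")
```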

  15. Federated ontology-based queries over cancer data

    PubMed Central

    2012-01-01

    Background Personalised medicine provides patients with treatments that are specific to their genetic profiles. It requires efficient data sharing of disparate data types across a variety of scientific disciplines, such as molecular biology, pathology, radiology and clinical practice. Personalised medicine aims to offer the safest and most effective therapeutic strategy based on the gene variations of each subject. In particular, this is valid in oncology, where knowledge about genetic mutations has already led to new therapies. Current molecular biology techniques (microarrays, proteomics, epigenetic technology and improved DNA sequencing technology) enable better characterisation of cancer tumours. The vast amounts of data, however, coupled with the use of different terms - or semantic heterogeneity - in each discipline makes the retrieval and integration of information difficult. Results Existing software infrastructures for data-sharing in the cancer domain, such as caGrid, support access to distributed information. caGrid follows a service-oriented model-driven architecture. Each data source in caGrid is associated with metadata at increasing levels of abstraction, including syntactic, structural, reference and domain metadata. The domain metadata consists of ontology-based annotations associated with the structural information of each data source. However, caGrid's current querying functionality is given at the structural metadata level, without capitalising on the ontology-based annotations. This paper presents the design of and theoretical foundations for distributed ontology-based queries over cancer research data. Concept-based queries are reformulated to the target query language, where join conditions between multiple data sources are found by exploiting the semantic annotations. The system has been implemented, as a proof of concept, over the caGrid infrastructure. The approach is applicable to other model-driven architectures. A graphical user interface has been developed, supporting ontology-based queries over caGrid data sources. An extensive evaluation of the query reformulation technique is included. Conclusions To support personalised medicine in oncology, it is crucial to retrieve and integrate molecular, pathology, radiology and clinical data in an efficient manner. The semantic heterogeneity of the data makes this a challenging task. Ontologies provide a formal framework to support querying and integration. This paper provides an ontology-based solution for querying distributed databases over service-oriented, model-driven infrastructures. PMID:22373043

  16. TRIAD: The Translational Research Informatics and Data Management Grid

    PubMed Central

    Payne, P.; Ervin, D.; Dhaval, R.; Borlawsky, T.; Lai, A.

    2011-01-01

    Objective Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. Methods A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis “pipelines.” Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile “working interoperability” between data, information, and knowledge resources. Results Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile “working interoperability” between distributed data and knowledge sources. Conclusion Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling “working interoperability” in heterogeneous biomedical environments. PMID:23616879

  17. TRIAD: The Translational Research Informatics and Data Management Grid.

    PubMed

    Payne, P; Ervin, D; Dhaval, R; Borlawsky, T; Lai, A

    2011-01-01

    Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis "pipelines." Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile "working interoperability" between data, information, and knowledge resources. Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile "working interoperability" between distributed data and knowledge sources. Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling "working interoperability" in heterogeneous biomedical environments.

  18. Japanese national forest inventory and its spatial extension by remote sensing

    Treesearch

    Yasumasa Hirata; Mitsuo Matsumoto; Toshiro Iehara

    2009-01-01

    Japan has two independent forest inventory systems. One forest inventory is required by the forest planning system based on the Forest Law, in which forest registers and forest planning maps are prepared. The other system is a forest resource monitoring survey, in which systematic sampling is done at 4-km grid intervals. Here, we present these national forest inventory...

  19. Towards Stochastic Optimization-Based Electric Vehicle Penetration in a Novel Archipelago Microgrid.

    PubMed

    Yang, Qingyu; An, Dou; Yu, Wei; Tan, Zhengan; Yang, Xinyu

    2016-06-17

    Due to the advantage of avoiding upstream disturbance and voltage fluctuation from a power transmission system, Islanded Micro-Grids (IMG) have attracted much attention. In this paper, we first propose a novel self-sufficient Cyber-Physical System (CPS) supported by Internet of Things (IoT) techniques, namely "archipelago micro-grid (MG)", which integrates the power grid and sensor networks to make the grid operation effective and is comprised of multiple MGs while disconnected from the utility grid. The Electric Vehicles (EVs) are used to replace a portion of Conventional Vehicles (CVs) to reduce CO2 emission and operation cost. Nonetheless, the intermittent nature and uncertainty of Renewable Energy Sources (RESs) remain a challenging issue in managing energy resources in the system. To address these issues, we formalize the optimal EV penetration problem as a two-stage Stochastic Optimal Penetration (SOP) model, which aims to minimize the emission and operation cost in the system. Uncertainties coming from RESs (e.g., wind, solar, and load demand) are considered in the stochastic model and random parameters to represent those uncertainties are captured by the Monte Carlo-based method. To enable the reasonable deployment of EVs in each MG, we develop two scheduling schemes, namely Unlimited Coordinated Scheme (UCS) and Limited Coordinated Scheme (LCS), respectively. An extensive simulation study based on a modified 9 bus system with three MGs has been carried out to show the effectiveness of our proposed schemes. The evaluation data indicates that our proposed strategy can reduce both the environmental pollution created by CO2 emissions and operation costs in UCS and LCS.
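
    A heavily simplified sketch of the sample-average idea behind such a two-stage stochastic model (not the authors' SOP formulation or their scheduling schemes): draw Monte Carlo scenarios for the uncertain renewables and demand, then choose the EV penetration level that minimises the expected operation-plus-emission cost across scenarios. All coefficients are illustrative placeholders.

```python
import numpy as np

def scenario_cost(ev_fraction, wind, solar, demand,
                  cv_emission=1.0, grid_price=0.1, ev_charge=8.0):
    """Second-stage cost of one scenario: energy bought from the bulk grid for
    the residual load, plus emissions from the conventional vehicles that the
    EVs did not replace. All coefficients are illustrative placeholders."""
    residual = np.maximum(demand + ev_fraction * ev_charge - wind - solar, 0.0)
    return grid_price * residual.sum() + (1.0 - ev_fraction) * cv_emission

def best_penetration(n_scenarios=500, seed=4):
    """First stage: choose the EV fraction minimising the Monte Carlo estimate
    of the expected scenario cost."""
    rng = np.random.default_rng(seed)
    wind = rng.weibull(2.0, (n_scenarios, 24)) * 3.0          # hourly wind output
    solar = np.maximum(rng.normal(2.0, 1.0, (n_scenarios, 24)), 0.0)
    demand = rng.normal(10.0, 2.0, (n_scenarios, 24))
    fractions = np.linspace(0.0, 1.0, 21)
    expected = [np.mean([scenario_cost(f, w, s, d)
                         for w, s, d in zip(wind, solar, demand)])
                for f in fractions]
    return fractions[int(np.argmin(expected))]

print("EV fraction minimising expected cost:", best_penetration())
```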

  20. Towards Stochastic Optimization-Based Electric Vehicle Penetration in a Novel Archipelago Microgrid

    PubMed Central

    Yang, Qingyu; An, Dou; Yu, Wei; Tan, Zhengan; Yang, Xinyu

    2016-01-01

    Due to the advantage of avoiding upstream disturbance and voltage fluctuation from a power transmission system, Islanded Micro-Grids (IMG) have attracted much attention. In this paper, we first propose a novel self-sufficient Cyber-Physical System (CPS) supported by Internet of Things (IoT) techniques, namely “archipelago micro-grid (MG)”, which integrates the power grid and sensor networks to make the grid operation effective and is comprised of multiple MGs while disconnected from the utility grid. The Electric Vehicles (EVs) are used to replace a portion of Conventional Vehicles (CVs) to reduce CO2 emission and operation cost. Nonetheless, the intermittent nature and uncertainty of Renewable Energy Sources (RESs) remain a challenging issue in managing energy resources in the system. To address these issues, we formalize the optimal EV penetration problem as a two-stage Stochastic Optimal Penetration (SOP) model, which aims to minimize the emission and operation cost in the system. Uncertainties coming from RESs (e.g., wind, solar, and load demand) are considered in the stochastic model and random parameters to represent those uncertainties are captured by the Monte Carlo-based method. To enable the reasonable deployment of EVs in each MG, we develop two scheduling schemes, namely Unlimited Coordinated Scheme (UCS) and Limited Coordinated Scheme (LCS), respectively. An extensive simulation study based on a modified 9 bus system with three MGs has been carried out to show the effectiveness of our proposed schemes. The evaluation data indicates that our proposed strategy can reduce both the environmental pollution created by CO2 emissions and operation costs in UCS and LCS. PMID:27322281

  1. Synchrotron Imaging Computations on the Grid without the Computing Element

    NASA Astrophysics Data System (ADS)

    Curri, A.; Pugliese, R.; Borghes, R.; Kourousias, G.

    2011-12-01

    Besides the heavy use of the Grid in the Synchrotron Radiation Facility (SRF) Elettra, additional special requirements from the beamlines had to be satisfied through a novel solution that we present in this work. In the traditional Grid Computing paradigm the computations are performed on the Worker Nodes of the grid element known as the Computing Element. A Grid middleware extension that our team has been working on is that of the Instrument Element. In general it is used to Grid-enable instrumentation; and it can be seen as a neighbouring concept to that of the traditional Control Systems. As a further extension we demonstrate the Instrument Element as the steering mechanism for a series of computations. In our deployment it interfaces a Control System that manages a series of computationally demanding Scientific Imaging tasks in an online manner. The instrument control in Elettra is done through a suitable Distributed Control System, a common approach in the SRF community. The applications that we present are for a beamline working in medical imaging. The solution resulted in a substantial improvement of a Computed Tomography workflow. The near-real-time requirements could not have been easily satisfied by our Grid's middleware (gLite) due to the various latencies that often occurred during the job submission and queuing phases. Moreover the required deployment of a set of TANGO devices could not have been done in a standard gLite WN. Besides the avoidance of certain core Grid components, the Grid Security infrastructure has been utilised in the final solution.

  2. Towards Integrating Distributed Energy Resources and Storage Devices in Smart Grid.

    PubMed

    Xu, Guobin; Yu, Wei; Griffith, David; Golmie, Nada; Moulema, Paul

    2017-02-01

    Internet of Things (IoT) provides a generic infrastructure for different applications to integrate information communication techniques with physical components to achieve automatic data collection, transmission, exchange, and computation. The smart grid, one of the typical applications supported by IoT and a re-engineering and modernization of the traditional power grid, aims to provide reliable, secure, and efficient energy transmission and distribution to consumers. How to effectively integrate distributed (renewable) energy resources and storage devices to satisfy the energy service requirements of users, while minimizing the power generation and transmission cost, remains a highly pressing challenge in the smart grid. To address this challenge and assess the effectiveness of integrating distributed energy resources and storage devices, in this paper we develop a theoretical framework to model and analyze three types of power grid systems: the power grid with only bulk energy generators, the power grid with distributed energy resources, and the power grid with both distributed energy resources and storage devices. Based on the metrics of the power cumulative cost and the service reliability to users, we formally model and analyze the impact of integrating distributed energy resources and storage devices in the power grid. We also use the concept of network calculus, which has been traditionally used for carrying out traffic engineering in computer networks, to derive the bounds of both power supply and user demand to achieve a high service reliability to users. Through an extensive performance evaluation, our data shows that integrating distributed energy resources conjointly with energy storage devices can reduce generation costs, smooth the curve of bulk power generation over time, reduce bulk power generation and power distribution losses, and provide a sustainable service reliability to users in the power grid.
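
    The network-calculus argument sketched above compares a cumulative demand (arrival) envelope with a cumulative supply (service) envelope; the largest vertical gap bounds the unserved energy backlog and the largest horizontal gap bounds the delay before demand is met. The snippet below illustrates this with textbook token-bucket and rate-latency curves; the envelope shapes and parameters are assumptions, not the paper's model.

```python
import numpy as np

def backlog_and_delay_bounds(arrival, service, horizon, step=0.1):
    """Network-calculus style bounds: the largest vertical gap between the
    cumulative demand (arrival) and supply (service) envelopes bounds the
    unserved-energy backlog; the largest horizontal gap bounds the delay."""
    t = np.arange(0.0, horizon + step, step)
    a, s = arrival(t), service(t)                  # s must be non-decreasing
    backlog = float(np.max(a - s))
    idx = np.searchsorted(s, a)                    # when supply catches up with demand
    catch_up = np.where(idx < t.size, t[np.minimum(idx, t.size - 1)], np.inf)
    delay = float(np.max(catch_up - t))
    return backlog, delay

# Textbook envelopes: token-bucket demand vs. rate-latency supply.
sigma, rho = 15.0, 8.0                 # kWh burst, kW sustained demand
R, T = 12.0, 2.0                       # kW supply rate after a 2 h latency
arrival = lambda t: sigma + rho * t
service = lambda t: R * np.maximum(t - T, 0.0)
print(backlog_and_delay_bounds(arrival, service, horizon=48))
# Closed-form bounds for these curves: backlog <= sigma + rho*T = 31 kWh,
# delay <= T + sigma/R = 3.25 h (the discretised estimate is slightly coarser).
```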

  3. Towards Integrating Distributed Energy Resources and Storage Devices in Smart Grid

    PubMed Central

    Xu, Guobin; Yu, Wei; Griffith, David; Golmie, Nada; Moulema, Paul

    2017-01-01

    Internet of Things (IoT) provides a generic infrastructure for different applications to integrate information communication techniques with physical components to achieve automatic data collection, transmission, exchange, and computation. The smart grid, one of the typical applications supported by IoT and a re-engineering and modernization of the traditional power grid, aims to provide reliable, secure, and efficient energy transmission and distribution to consumers. How to effectively integrate distributed (renewable) energy resources and storage devices to satisfy the energy service requirements of users, while minimizing the power generation and transmission cost, remains a highly pressing challenge in the smart grid. To address this challenge and assess the effectiveness of integrating distributed energy resources and storage devices, in this paper we develop a theoretical framework to model and analyze three types of power grid systems: the power grid with only bulk energy generators, the power grid with distributed energy resources, and the power grid with both distributed energy resources and storage devices. Based on the metrics of the power cumulative cost and the service reliability to users, we formally model and analyze the impact of integrating distributed energy resources and storage devices in the power grid. We also use the concept of network calculus, which has been traditionally used for carrying out traffic engineering in computer networks, to derive the bounds of both power supply and user demand to achieve a high service reliability to users. Through an extensive performance evaluation, our data shows that integrating distributed energy resources conjointly with energy storage devices can reduce generation costs, smooth the curve of bulk power generation over time, reduce bulk power generation and power distribution losses, and provide a sustainable service reliability to users in the power grid. PMID:29354654

  4. Mass-loss From Evolved Stellar Populations In The Large Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Riebel, David

    2012-01-01

    I have conducted a study of a sample of 30,000 evolved stars in the Large Magellanic Cloud (LMC) and 6,000 in the Small Magellanic Cloud (SMC), covering their variability, mass-loss properties, and chemistry. The initial stages of my thesis work focused on the infrared variability of Asymptotic Giant Branch (AGB) stars in the LMC. I determined the period-luminosity (P-L) relations for 6 separate sequences of 30,000 evolved star candidates at 8 wavelengths, as a function of photometrically assigned chemistry, and showed that the P-L relations are different for different chemical populations (O-rich or C-rich). I also present results from the Grid of Red supergiant and Asymptotic giant branch star ModelS (GRAMS) radiative transfer (RT) model grid applied to the evolved stellar population of the LMC. GRAMS is a pre-computed grid of RT models of RSG and AGB stars and surrounding circumstellar dust. Best-fit models are determined based on 12 bands of photometry from the optical to the mid-infrared. Using a pre-computed grid, I can present the first reasonably detailed radiative transfer modeling for tens of thousands of stars, allowing me to make statistically accurate estimations of the carbon-star luminosity function and the global dust mass return to the interstellar medium from AGB stars, both key parameters for stellar population synthesis models to reproduce. In the SAGE-Var program, I used the warm Spitzer mission to take 4 additional epochs of observations of 7500 AGB stars in the LMC and SMC. These epochs, combined with existing data, enable me to derive mean fluxes at 3.6 and 4.5 microns that will be used for tighter constraints for GRAMS, which is currently limited by the variability-induced error on the photometry. This work is supported by NASA NAG5-12595 and Spitzer contract 1415784.

  5. A Numerical Study of Three Moving-Grid Methods for One-Dimensional Partial Differential Equations Which Are Based on the Method of Lines

    NASA Astrophysics Data System (ADS)

    Furzeland, R. M.; Verwer, J. G.; Zegeling, P. A.

    1990-08-01

    In recent years, several sophisticated packages based on the method of lines (MOL) have been developed for the automatic numerical integration of time-dependent problems in partial differential equations (PDEs), notably for problems in one space dimension. These packages greatly benefit from the very successful developments of automatic stiff ordinary differential equation solvers. However, from the PDE point of view, they integrate only in a semiautomatic way in the sense that they automatically adjust the time step sizes, but use just a fixed space grid, chosen a priori, for the entire calculation. For solutions possessing sharp spatial transitions that move, e.g., travelling wave fronts or emerging boundary and interior layers, a grid held fixed for the entire calculation is computationally inefficient, since for a good solution this grid often must contain a very large number of nodes. In such cases methods which attempt automatically to adjust the sizes of both the space and the time steps are likely to be more successful in efficiently resolving critical regions of high spatial and temporal activity. Methods and codes that operate this way belong to the realm of adaptive or moving-grid methods. Following the MOL approach, this paper is devoted to an evaluation and comparison, mainly based on extensive numerical tests, of three moving-grid methods for 1D problems, viz., the finite-element method of Miller and co-workers, the method published by Petzold, and a method based on ideas adopted from Dorfi and Drury. Our examination of these three methods is aimed at assessing which is the most suitable from the point of view of retaining the acknowledged features of reliability, robustness, and efficiency of the conventional MOL approach. Therefore, considerable attention is paid to the temporal performance of the methods.
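
    As a tiny illustration of the equidistribution idea underlying moving-grid methods of the Dorfi-and-Drury type, the sketch below performs only the static regridding step: points are placed so that each cell carries an equal share of an arc-length monitor function of the current solution, concentrating nodes where the solution varies rapidly. The full moving-mesh formulation couples this to the time integration, which is not shown.

```python
import numpy as np

def equidistribute(x, u, n_new):
    """Place n_new grid points so that each interval carries an equal share of
    the arc-length monitor m = sqrt(1 + (du/dx)^2), which concentrates points
    where the solution varies rapidly."""
    m = np.sqrt(1.0 + np.gradient(u, x) ** 2)
    # Cumulative (trapezoidal) integral of the monitor, normalised to [0, 1].
    M = np.concatenate([[0.0], np.cumsum(0.5 * (m[1:] + m[:-1]) * np.diff(x))])
    M /= M[-1]
    # Invert: equal increments of monitor mass map to the new node locations.
    return np.interp(np.linspace(0.0, 1.0, n_new), M, x)

# Example: a steep travelling-front profile sampled on a uniform grid.
x = np.linspace(0.0, 1.0, 401)
u = np.tanh(200.0 * (x - 0.3))
x_new = equidistribute(x, u, 41)
print(np.round(x_new[:10], 3))   # the new points cluster near the front at x = 0.3
```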

  6. Towards Characterization, Modeling, and Uncertainty Quantification in Multi-scale Mechanics of Organic-rich Shales

    NASA Astrophysics Data System (ADS)

    Abedi, S.; Mashhadian, M.; Noshadravan, A.

    2015-12-01

    Increasing the efficiency and sustainability in operation of hydrocarbon recovery from organic-rich shales requires a fundamental understanding of the chemomechanical properties of organic-rich shales. This understanding is manifested in the form of physics-based predictive models capable of capturing the highly heterogeneous and multi-scale structure of organic-rich shale materials. In this work we present a framework of experimental characterization, micromechanical modeling, and uncertainty quantification that spans from the nanoscale to the macroscale. Application of experiments such as coupled grid nano-indentation and energy dispersive x-ray spectroscopy, together with micromechanical modeling attributing the role of organic maturity to the texture of the material, allows us to identify unique clay mechanical properties among different samples that are independent of the maturity of shale formations and total organic content. The results can then be used to inform the physically-based multiscale model for organic-rich shales, consisting of three levels that span from the scale of the elementary building blocks (e.g. clay minerals in clay-dominated formations) of organic-rich shales to the scale of the macroscopic inorganic/organic hard/soft inclusion composite. Although this approach is powerful in capturing the effective properties of organic-rich shale in an average sense, it does not account for the uncertainty in compositional and mechanical model parameters. Thus, we take this model one step forward by systematically incorporating the main sources of uncertainty in modeling the multiscale behavior of organic-rich shales. In particular, we account for the uncertainty in the main model parameters at different scales, such as porosity, elastic properties, and mineralogy mass percent. To that end, we use the Maximum Entropy Principle and random matrix theory to construct probabilistic descriptions of model inputs based on available information. A Monte Carlo simulation is then carried out to propagate the uncertainty and consequently construct probabilistic descriptions of properties at multiple length-scales. The combination of experimental characterization and stochastic multi-scale modeling presented in this work improves the robustness of the prediction of essential subsurface parameters at the engineering scale.
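
    A schematic sketch of the propagation step only: uncertain inputs (porosity, mineral fractions, constituent stiffnesses) are drawn from assumed distributions and pushed through a placeholder upscaling model to obtain a distribution of effective properties. The toy Voigt-Reuss-Hill style average below stands in for the paper's multiscale model, and the distributions are illustrative rather than MaxEnt-derived.

```python
import numpy as np

def effective_modulus(porosity, clay_frac, k_clay, k_quartz):
    """Toy homogenisation: Voigt-Reuss-Hill average of a clay/quartz mixture,
    softened by porosity. A placeholder for the real multiscale model."""
    k_voigt = clay_frac * k_clay + (1.0 - clay_frac) * k_quartz
    k_reuss = 1.0 / (clay_frac / k_clay + (1.0 - clay_frac) / k_quartz)
    return 0.5 * (k_voigt + k_reuss) * (1.0 - porosity) ** 2

rng = np.random.default_rng(5)
n = 100_000
samples = effective_modulus(
    porosity=rng.beta(2, 18, n),          # ~10% mean porosity (assumed prior)
    clay_frac=rng.uniform(0.4, 0.8, n),   # clay volume fraction of the solid
    k_clay=rng.normal(25.0, 3.0, n),      # clay bulk modulus, GPa (assumed)
    k_quartz=rng.normal(37.0, 2.0, n),    # quartz bulk modulus, GPa (assumed)
)
print(f"effective modulus: {samples.mean():.1f} +/- {samples.std():.1f} GPa")
```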

  7. THE MASS-LOSS RETURN FROM EVOLVED STARS TO THE LARGE MAGELLANIC CLOUD. VI. LUMINOSITIES AND MASS-LOSS RATES ON POPULATION SCALES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riebel, D.; Meixner, M.; Srinivasan, S.

    We present results from the first application of the Grid of Red Supergiant and Asymptotic Giant Branch ModelS (GRAMS) model grid to the entire evolved stellar population of the Large Magellanic Cloud (LMC). GRAMS is a pre-computed grid of 80,843 radiative transfer models of evolved stars and circumstellar dust shells composed of either silicate or carbonaceous dust. We fit GRAMS models to ~30,000 asymptotic giant branch (AGB) and red supergiant (RSG) stars in the LMC, using 12 bands of photometry from the optical to the mid-infrared. Our published data set consists of thousands of evolved stars with individually determined evolutionary parameters such as luminosity and mass-loss rate. The GRAMS grid has a greater than 80% accuracy rate discriminating between oxygen- and carbon-rich chemistry. The global dust injection rate to the interstellar medium (ISM) of the LMC from RSGs and AGB stars is on the order of 2.1 × 10^-5 M_⊙ yr^-1, equivalent to a total mass injection rate (including the gas) into the ISM of ~6 × 10^-3 M_⊙ yr^-1. Carbon stars inject two and a half times as much dust into the ISM as do O-rich AGB stars, but the same amount of mass. We determine a bolometric correction factor for C-rich AGB stars in the K_s band as a function of J - K_s color, BC_{K_s} = -0.40(J - K_s)^2 + 1.83(J - K_s) + 1.29. We determine several IR color proxies for the dust mass-loss rate (\dot{M}_d) from C-rich AGB stars, such as log \dot{M}_d = -18.90 / ((K_s - [8.0]) + 3.37) - 5.93. We find that a larger fraction of AGB stars exhibiting the 'long-secondary period' phenomenon are more O-rich than stars dominated by radial pulsations, and AGB stars without detectable mass loss do not appear on either the first-overtone or fundamental-mode pulsation sequences.

  8. A Job Monitoring and Accounting Tool for the LSF Batch System

    NASA Astrophysics Data System (ADS)

    Sarkar, Subir; Taneja, Sonia

    2011-12-01

    This paper presents a web based job monitoring and group-and-user accounting tool for the LSF Batch System. The user oriented job monitoring displays a simple and compact quasi real-time overview of the batch farm for both local and Grid jobs. For Grid jobs the Distinguished Name (DN) of the Grid users is shown. The overview monitor provides the most up-to-date status of a batch farm at any time. The accounting tool works with the LSF accounting log files. The accounting information is shown for a few pre-defined time periods by default. However, one can also compute the same information for any arbitrary time window. The tool already proved to be an extremely useful means to validate more extensive accounting tools available in the Grid world. Several sites have already been using the present tool and more sites running the LSF batch system have shown interest. We shall discuss the various aspects that make the tool essential for site administrators and end-users alike and outline the current status of development as well as future plans.

  9. Extending WS-Agreement with Multi-round Negotiation Capability

    NASA Astrophysics Data System (ADS)

    Rumpl, Angela; Wäldrich, Oliver; Ziegler, Wolfgang

    The WS-Agreement specification of the Open Grid Forum defines a language and a protocol for advertising the capabilities of service providers, creating agreements based on templates, and monitoring agreement compliance at runtime. While the specification, which is currently in the process of transition from a proposed recommendation of the Open Grid Forum to a full recommendation, has been widely used since its initial publication in May 2007, it became obvious that the missing possibility to negotiate an agreement, rather than just accepting an offer, limits or inhibits the use of WS-Agreement for a number of use-cases. Therefore, the Grid Resource Allocation Agreement Working Group of the Open Grid Forum started in 2008 to prepare an extension of WS-Agreement that adds negotiation capabilities without changing the current specification in a way that would lead to an incompatible new version of WS-Agreement. In this paper we present the results of this process with an updated version of the specification in mind, together with the first implementation in the European project SmartLM.

  10. The constitution of the atmospheric layers and the extreme ultraviolet spectrum of hot hydrogen-rich white dwarfs

    NASA Technical Reports Server (NTRS)

    Vennes, Stephane

    1992-01-01

    An analysis is presented of the atmospheric properties of hot, H-rich, DA white dwarfs that is based on optical, UV, and X-ray observations aimed at predicting detailed spectral properties of these stars in the range 80-800 A. The divergences between observations from a sample of 15 hot DA white dwarfs emitting in the EUV/soft X-ray range and pure H synthetic spectra calculated from a grid of model atmospheres characterized by Teff and g are examined. Seven out of 15 DA stars are found to consistently exhibit pure hydrogen atmospheres, the remaining seven stars showing inconsistency between FUV and EUV/soft X-ray data that can be explained by the presence of trace EUV/soft X-ray absorbers. Synthetic data are computed assuming two other possible chemical structures: photospheric traces of radiatively levitated heavy elements and a stratified hydrogen/helium distribution. Predictions about forthcoming medium-resolution observations of the EUV spectrum of selected hot H-rich white dwarfs are made.

  11. Physics-based investigation of negative ion behavior in a negative-ion-rich plasma using integrated diagnostics

    NASA Astrophysics Data System (ADS)

    Tsumori, K.; Takeiri, Y.; Ikeda, K.; Nakano, H.; Geng, S.; Kisaki, M.; Nagaoka, K.; Tokuzawa, T.; Wada, M.; Sasaki, K.; Nishiyama, S.; Goto, M.; Osakabe, M.

    2017-08-01

    A total power of 16 MW has been successfully delivered to the plasma confined in the Large Helical Device (LHD) from three Neutral Beam Injectors (NBIs) equipped with negative hydrogen (H-) ion sources. However, the detailed mechanisms from production through extraction of H- ions are still to be clarified, and a similar-size ion source on an independent acceleration test bench, called the Research and development Negative Ion Source (RNIS), serves as the facility to study the physics related to H- production and transport for further improvement of NBI. The production of negative-ion-rich plasma and the behavior of H- ions in the beam extraction region of RNIS are being investigated with an integrated diagnostic system. Flow patterns of electrons, positive ions and H- ions in the extraction region are described in a two-dimensional map. The measured flow patterns indicate the existence of a stagnation region, where the H- flow changes direction at a distance of about 20 mm from the plasma grid. The patterns also suggest that H- flow originating from the plasma grid (PG) surface turns back toward the extraction apertures. The turning region appears to be formed by a layer of combined magnetic field produced by the magnetic filter field and the Electron-Deflection Magnetic (EDM) field created by magnets installed in the extraction electrode.

  12. The CBM RICH project

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, J.; Akishin, P.; Becker, K.-H.; Belogurov, S.; Bendarouach, J.; Boldyreva, N.; Chernogorov, A.; Deveaux, C.; Dobyrn, V.; Dürr, M.; Eschke, J.; Förtsch, J.; Heep, J.; Höhne, C.; Kampert, K.-H.; Kochenda, L.; Kopfer, J.; Kravtsov, P.; Kres, I.; Lebedev, S.; Lebedeva, E.; Leonova, E.; Linev, S.; Mahmoud, T.; Michel, J.; Miftakhov, N.; Niebur, W.; Ovcharenko, E.; Patel, V.; Pauly, C.; Pfeifer, D.; Querchfeld, S.; Rautenberg, J.; Reinecke, S.; Riabov, Y.; Roshchin, E.; Samsonov, V.; Tarasenkova, O.; Traxler, M.; Ugur, C.; Vznuzdaev, E.; Vznuzdaev, M.

    2017-02-01

    The CBM RICH detector is an integral component of the future CBM experiment at FAIR, providing efficient electron identification and pion suppression necessary for the measurement of rare dileptonic probes in heavy ion collisions. The RICH design is based on CO2 gas as radiator, a segmented spherical glass focussing mirror with Al+MgF2 reflective coating, and Multianode Photomultipliers for efficient Cherenkov photon detection. Hamamatsu H12700 MAPMTs have recently been selected as photon sensors, following an extensive sensor evaluation, including irradiation tests to ensure sufficient radiation hardness of the MAPMTs. A brief overview of the detector design and concept is given, results on the radiation hardness of the photon sensors are shown, and the development of a FPGA-TDC based readout chain is discussed.

  13. Conservation biogeography of red oaks (Quercus, section Lobatae) in Mexico and Central America.

    PubMed

    Torres-Miranda, Andrés; Luna-Vega, Isolda; Oyama, Ken

    2011-02-01

    Oaks are dominant trees and key species in many temperate and subtropical forests in the world. In this study, we analyzed patterns of distribution of red oaks (Quercus, section Lobatae) occurring in Mexico and Central America to determine areas of species richness and endemism to propose areas of conservation. Patterns of richness and endemism of 75 red oak species were analyzed using three different units. Two complementarity algorithms based on species richness and three algorithms based on species rarity were used to identify important areas for conservation. A simulated annealing analysis was performed to evaluate and formulate effective new reserves for red oaks that are useful for conserving the ecosystems associated with them after the systematic conservation planning approach. Two main centers of species richness were detected. The northern Sierra Madre Oriental and Serranías Meridionales of Jalisco had the highest values of endemism. Fourteen areas were considered as priorities for conservation of red oak species based on the 26 priority political entities, 11 floristic units and the priority grid-cells obtained in the complementarity analysis. In the present network of Natural Protected Areas in Mexico and Central America, only 41.3% (31 species) of the red oak species are protected. The simulated annealing analysis indicated that to protect all 75 species of red oaks, 12 current natural protected areas need to be expanded by 120000 ha of additional land, and 26 new natural protected areas with 512500 ha need to be created. Red oaks are a useful model to identify areas for conservation based on species richness and endemism as a result of their wide geographic distribution and a high number of species. We evaluated and reformulated new reserves for red oaks that are also useful for the conservation of ecosystems associated with them.

  14. Framing of grid cells within and beyond navigation boundaries

    PubMed Central

    Savelli, Francesco; Luck, JD; Knierim, James J

    2017-01-01

    Grid cells represent an ideal candidate to investigate the allocentric determinants of the brain’s cognitive map. Most studies of grid cells emphasized the roles of geometric boundaries within the navigational range of the animal. Behaviors such as novel route-taking between local environments indicate the presence of additional inputs from remote cues beyond the navigational borders. To investigate these influences, we recorded grid cells as rats explored an open-field platform in a room with salient, remote cues. The platform was rotated or translated relative to the room frame of reference. Although the local, geometric frame of reference often exerted the strongest control over the grids, the remote cues demonstrated a consistent, sometimes dominant, countervailing influence. Thus, grid cells are controlled by both local geometric boundaries and remote spatial cues, consistent with prior studies of hippocampal place cells and providing a rich representational repertoire to support complex navigational (and perhaps mnemonic) processes. DOI: http://dx.doi.org/10.7554/eLife.21354.001 PMID:28084992

  15. Drag Prediction for the NASA CRM Wing-Body-Tail Using CFL3D and OVERFLOW on an Overset Mesh

    NASA Technical Reports Server (NTRS)

    Sclafani, Anthony J.; DeHaan, Mark A.; Vassberg, John C.; Rumsey, Christopher L.; Pulliam, Thomas H.

    2010-01-01

    In response to the fourth AIAA CFD Drag Prediction Workshop (DPW-IV), the NASA Common Research Model (CRM) wing-body and wing-body-tail configurations are analyzed using the Reynolds-averaged Navier-Stokes (RANS) flow solvers CFL3D and OVERFLOW. Two families of structured, overset grids are built for DPW-IV. Grid Family 1 (GF1) consists of a coarse (7.2 million), medium (16.9 million), fine (56.5 million), and extra-fine (189.4 million) mesh. Grid Family 2 (GF2) is an extension of the first and includes a superfine (714.2 million) and an ultra-fine (2.4 billion) mesh. The medium grid anchors both families with an established build process for accurate cruise drag prediction studies. This base mesh is coarsened and enhanced to form a set of parametrically equivalent grids that increase in size by a factor of roughly 3.4 from one level to the next denser level. Both CFL3D and OVERFLOW are run on GF1 using a consistent numerical approach. Additional OVERFLOW runs are made to study effects of differencing scheme and turbulence model on GF1 and to obtain results for GF2. All CFD results are post-processed using Richardson extrapolation, and approximate grid-converged values of drag are compared. The medium grid is also used to compute a trimmed drag polar for both codes.
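
    The grid-convergence step described above can be illustrated with a short Richardson-extrapolation sketch. The drag values below are illustrative placeholders, not DPW-IV results; the only assumption taken from the abstract is that grid size grows by roughly 3.4x per level, i.e. a linear refinement ratio of about 3.4^(1/3).

      # Minimal sketch of Richardson extrapolation for grid convergence of drag.
      # The drag counts and the refinement ratio below are illustrative
      # placeholders, not values taken from the DPW-IV study.
      import math

      def richardson_extrapolate(f_coarse, f_medium, f_fine, r):
          """Estimate the observed order of accuracy p and the grid-converged
          value from three solutions on grids refined by a constant ratio r
          (h_coarse / h_medium = h_medium / h_fine = r)."""
          p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
          f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
          return p, f_exact

      # Example: drag coefficients (in counts) on coarse/medium/fine grids.
      cd = {"coarse": 287.4, "medium": 281.9, "fine": 279.6}
      ratio = 3.4 ** (1.0 / 3.0)   # grid size grows ~3.4x per level in 3-D
      p, cd_inf = richardson_extrapolate(cd["coarse"], cd["medium"], cd["fine"], ratio)
      print(f"observed order ~ {p:.2f}, extrapolated drag ~ {cd_inf:.1f} counts")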

  16. Security attack detection algorithm for electric power gis system based on mobile application

    NASA Astrophysics Data System (ADS)

    Zhou, Chao; Feng, Renjun; Wang, Liming; Huang, Wei; Guo, Yajuan

    2017-05-01

    Electric power GIS is one of the key information technologies supporting power grid construction in China, and it is widely used in grid construction planning, weather monitoring, and power distribution management. The introduction of electric power GIS based on mobile applications is an effective extension of the geographic information systems already widely used in the electric power industry, providing reliable, cheap and sustainable power service for the country. Accurate state estimation is an important condition for maintaining normal operation of the electric power GIS. Recent research has shown that attackers can inject complex false data into the power system. This new type of false data injection attack (load integrity attack, LIA) can bypass routine detection and achieve its purpose, causing the control center to make a series of wrong decisions and eventually leading to an uneven distribution of power in the grid. To ensure the safety of the electric power GIS system based on mobile applications, it is therefore important to analyze the attack mechanism, formulate this new type of attack, and study the corresponding detection methods and prevention strategies in the mobile-application-based electric power GIS environment.
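
    The abstract does not give the detection model, but the standard argument for why false-data injections of this kind can evade residual-based bad-data checks can be sketched in a generic (DC) state-estimation setting; this setting is an assumption here, not necessarily the authors' formulation:

      \hat{x} = (H^{\top} H)^{-1} H^{\top} z, \qquad r = z - H\hat{x};
      \qquad z_a = z + H c \;\Longrightarrow\; \hat{x}_a = \hat{x} + c, \quad r_a = z_a - H\hat{x}_a = r .

    Because the residual is unchanged by any attack vector of the form a = Hc, a chi-square or largest-normalized-residual test on r cannot distinguish the attacked measurements from clean ones, which is why dedicated detection methods are needed.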

  17. Efficient Gradient-Based Shape Optimization Methodology Using Inviscid/Viscous CFD

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1997-01-01

    The formerly developed preconditioned-biconjugate-gradient (PBCG) solvers for the analysis and the sensitivity equations had resulted in very large error reductions per iteration; quadratic convergence was achieved whenever the solution entered the domain of attraction to the root. Its memory requirement was also lower as compared to a direct inversion solver. However, this memory requirement was high enough to preclude the realistic, high grid-density design of a practical 3D geometry. This limitation served as the impetus to the first-year activity (March 9, 1995 to March 8, 1996). Therefore, the major activity for this period was the development of the low-memory methodology for the discrete-sensitivity-based shape optimization. This was accomplished by solving all the resulting sets of equations using an alternating-direction-implicit (ADI) approach. The results indicated that shape optimization problems which required large numbers of grid points could be resolved with a gradient-based approach. Therefore, to better utilize the computational resources, it was recommended that a number of coarse grid cases, using the PBCG method, should initially be conducted to better define the optimization problem and the design space, and obtain an improved initial shape. Subsequently, a fine grid shape optimization, which necessitates using the ADI method, should be conducted to accurately obtain the final optimized shape. The other activity during this period was the interaction with the members of the Aerodynamic and Aeroacoustic Methods Branch of Langley Research Center during one stage of their investigation to develop an adjoint-variable sensitivity method using the viscous flow equations. This method had algorithmic similarities to the variational sensitivity methods and the control-theory approach. However, unlike the prior studies, it was considered for the three-dimensional, viscous flow equations. The major accomplishment in the second period of this project (March 9, 1996 to March 8, 1997) was the extension of the shape optimization methodology for the Thin-Layer Navier-Stokes equations. Both the Euler-based and the TLNS-based analyses compared well with the analyses obtained using the CFL3D code. The sensitivities, again from both levels of the flow equations, also compared very well with the finite-differenced sensitivities. A fairly large set of shape optimization cases was conducted to study a number of issues previously not well understood. The testbed for these cases was the shaping of an arrow wing in Mach 2.4 flow. All the final shapes, obtained either from a coarse-grid-based or a fine-grid-based optimization, using either an Euler-based or a TLNS-based analysis, were re-analyzed using a fine-grid, TLNS solution for their function evaluations. This allowed for a fairer comparison of their relative merits. From the aerodynamic performance standpoint, the fine-grid TLNS-based optimization produced the best shape, and the fine-grid Euler-based optimization produced the lowest cruise efficiency.

  18. elsA-Hybrid: an all-in-one structured/unstructured solver for the simulation of internal and external flows. Application to turbomachinery

    NASA Astrophysics Data System (ADS)

    de la Llave Plata, M.; Couaillier, V.; Le Pape, M.-C.; Marmignon, C.; Gazaix, M.

    2013-03-01

    This paper reports recent work on the extension of the multiblock structured solver elsA to deal with hybrid grids. The new hybrid-grid solver, called elsA-H (elsA-Hybrid), is based on a new unstructured-grid module that has been built within the original elsA CFD (computational fluid dynamics) system. The implementation benefits from the flexibility of the object-oriented design. The aim of elsA-H is to take advantage of the full potential of structured solvers and unstructured mesh generation by allowing any type of grid to be used within the same simulation process. The main challenge lies in the numerical treatment of the hybrid-grid interfaces where blocks of different types meet. In particular, one must pay attention to the transfer of information across these boundaries, so that the accuracy of the numerical scheme is preserved and flux conservation is guaranteed. In this paper, the numerical approach that achieves this is presented. A comparison between the hybrid and the structured-grid methods is also carried out by considering a fully hexahedral multiblock mesh for which a few blocks have been converted to unstructured form. The performance of elsA-H for the simulation of internal flows is demonstrated on a number of turbomachinery configurations.

  19. Spatial variation in keystone effects: Small mammal diversity associated with black-tailed prairie dog colonies

    USGS Publications Warehouse

    Cully, J.F.; Collinge, S.K.; Van Nimwegen, R. E.; Ray, C.; Johnson, W.C.; Thiagarajan, Bala; Conlin, D.B.; Holmes, B.E.

    2010-01-01

    Species with extensive geographic ranges may interact with different species assemblages at distant locations, with the result that the nature of the interactions may vary spatially. Black-tailed prairie dogs Cynomys ludovicianus occur from Canada to Mexico in grasslands of the western Great Plains of North America. Black-tailed prairie dogs alter vegetation and dig extensive burrow systems that alter grassland habitats for plants and other animal species. These alterations of habitat justify the descriptor "ecological engineer," and the resulting changes in species composition have earned them status as a keystone species. We examined the impact of black-tailed prairie dogs on small mammal assemblages by trapping at on- and off-colony locations at eight study areas across the species' geographic range. We posed two nested hypotheses: 1) prairie dogs function as a keystone species for other rodent species; and 2) the keystone role varies spatially. Assuming that it does, we asked what the sources of the variation are. Black-tailed prairie dogs consistently functioned as a keystone species in that there were strong, statistically significant differences in community composition on versus off prairie dog colonies across the species' range in prairie grassland. Small mammal species composition varied along both latitudinal and longitudinal gradients, and species richness varied from 4 to 11. Assemblages closer together were more similar; such correlations approximately doubled when including only on- or off-colony grids. Black-tailed prairie dogs had a significant effect on associated rodent assemblages that varied regionally, dependent upon the composition of the local rodent species pool. Over the range of the black-tailed prairie dog, on-colony rodent richness and evenness were less variable, and species composition was more consistent than off-colony assemblages. © 2010 The Authors.

  20. GreenView and GreenLand Applications Development on SEE-GRID Infrastructure

    NASA Astrophysics Data System (ADS)

    Mihon, Danut; Bacu, Victor; Gorgan, Dorian; Mészáros, Róbert; Gelybó, Györgyi; Stefanut, Teodor

    2010-05-01

    The GreenView and GreenLand applications [1] have been developed through the SEE-GRID-SCI (SEE-GRID eInfrastructure for regional eScience) FP7 project co-funded by the European Commission [2]. The development of environment applications is a challenge for Grid technologies and software development methodologies. This presentation exemplifies the development of the GreenView and GreenLand applications over the SEE-GRID infrastructure by the Grid Application Development Methodology [3]. Today's environmental applications are used in various domains of Earth Science such as meteorology, ground and atmospheric pollution, ground metal detection or weather prediction. These applications run on satellite images (e.g. Landsat, MERIS, MODIS, etc.) and the accuracy of the output results depends mostly on the quality of these images. The main drawback of such environmental applications is the need for computing power and storage capacity (some images are almost 1 GB in size) in order to process such a large data volume. Indeed, most applications requiring high computational resources have migrated onto the Grid infrastructure. This infrastructure offers the computing power by running the atomic application components on different Grid nodes in sequential or parallel mode. The middleware used between the Grid infrastructure and client applications is ESIP (Environment Oriented Satellite Image Processing Platform), which is based on the gProcess platform [4]. In its current form, gProcess is used for launching new processes on the Grid nodes, but also for monitoring the execution status of these processes. This presentation highlights two case studies of Grid based environmental applications, GreenView and GreenLand [5]. GreenView is used in correlation with MODIS (Moderate Resolution Imaging Spectroradiometer) satellite images and meteorological datasets, in order to produce pseudo-colored temperature and vegetation maps for different geographical CEE (Central Eastern Europe) regions. On the other hand, GreenLand is used for generating maps for different vegetation indexes (e.g. NDVI, EVI, SAVI, GEMI) based on Landsat satellite images. Both applications use interpolation and random value generation algorithms, as well as specific formulas for computing vegetation index values. The GreenView and GreenLand applications have been experimented with over the SEE-GRID infrastructure and the performance evaluation is reported in [6]. The improvement of the execution time (obtained through a better parallelization of jobs), the extension of the geographical areas to other parts of the Earth, and new user interaction techniques on spatial data and large sets of satellite images are the goals of future work. References [1] GreenView application on Wiki, http://wiki.egee-see.org/index.php/GreenView [2] SEE-GRID-SCI Project, http://www.see-grid-sci.eu/ [3] Gorgan D., Stefanut T., Bâcu V., Mihon D., Grid based Environment Application Development Methodology, SCICOM, 7th International Conference on "Large-Scale Scientific Computations", 4-8 June, 2009, Sozopol, Bulgaria, (To be published by Springer), (2009). [4] Gorgan D., Bacu V., Stefanut T., Rodila D., Mihon D., Grid based Satellite Image Processing Platform for Earth Observation Applications Development. IDAACS'2009 - IEEE Fifth International Workshop on "Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications", 21-23 September, Cosenza, Italy, IEEE Published in Computer Press, 247-252 (2009).
[5] Mihon D., Bacu V., Stefanut T., Gorgan D., "Grid Based Environment Application Development - GreenView Application". ICCP2009 - IEEE 5th International Conference on Intelligent Computer Communication and Processing, 27 Aug, 2009 Cluj-Napoca. Published by IEEE Computer Press, pp. 275-282 (2009). [6] Danut Mihon, Victor Bacu, Dorian Gorgan, Róbert Mészáros, Györgyi Gelybó, Teodor Stefanut, Practical Considerations on the GreenView Application Development and Execution over SEE-GRID. SEE-GRID-SCI User Forum, 9-10 Dec 2009, Bogazici University, Istanbul, Turkey, ISBN: 978-975-403-510-0, pp. 167-175 (2009).

  1. Performance Evaluation of Counter-Based Dynamic Load Balancing Schemes for Massive Contingency Analysis with Different Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility test of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today's power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
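
    As a rough illustration of the counter-based idea, the sketch below hands out contingency case indices from a single shared counter, so faster workers simply fetch more cases. It is a generic thread-based toy under those assumptions, not the implementation or the computing environments evaluated in the paper; the solver call is a placeholder.

      # Sketch of a counter-based dynamic load-balancing loop for contingency
      # analysis: one shared counter hands the next case index to any idle worker.
      import threading

      class CounterScheduler:
          """Shared counter that dispenses the next contingency index."""
          def __init__(self, n_cases):
              self.next_case = 0
              self.n_cases = n_cases
              self.lock = threading.Lock()

          def fetch(self):
              with self.lock:
                  if self.next_case >= self.n_cases:
                      return None          # all cases dispatched
                  case = self.next_case
                  self.next_case += 1
                  return case

      def solve_contingency(case_id):
          # placeholder for a power-flow solve with component `case_id` outaged
          return case_id, "converged"

      def worker(sched, results):
          while True:
              case = sched.fetch()
              if case is None:
                  break
              results.append(solve_contingency(case))

      sched = CounterScheduler(n_cases=1000)
      results = []
      threads = [threading.Thread(target=worker, args=(sched, results)) for _ in range(8)]
      for t in threads: t.start()
      for t in threads: t.join()
      print(len(results), "contingencies evaluated")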

  2. e-Human Grid Ecology - understanding and approaching the inverse tragedy of the commons in the e-Grid society.

    PubMed

    Knoch, Tobias A; Baumgärtner, Volkmar; de Zeeuw, Luc V; Grosveld, Frank G; Egger, Kurt

    2009-01-01

    With ever-new technologies emerging, the amount of information to be stored and processed grows exponentially and is believed to be always at the limit. In contrast, however, huge resources are available in the IT sector, much as in e.g. the renewable energy sector, which are often not used at all. This under-usage defies any rationale, especially in the IT sector, where e.g. virtualisation and grid approaches could be implemented quickly owing to the great technical and fast turnover opportunities. Here, we describe this obvious paradox for the first time as the Inverse Tragedy of the Commons, in contrast to the Classical Tragedy of the Commons where resources are overexploited. From this perspective, the grid IT sector, which attempts to share resources for better efficiency, reveals two challenges leading to the heart of the paradox: i) From a macro perspective all grid infrastructures involve not only mere technical solutions but also dominantly all of the autopoietic social sub-systems ranging from religion to policy. ii) On the micro level the individual players and their psychology and risk behaviour are of major importance for acting within the macro autopoietic framework. Thus, the challenges of grid implementation are similar to those of e.g. climate protection. This is well described by the classic Human Ecology triangle and our extension to a rectangle: invironment-individual-society-environment. Extending this classical interdisciplinary field of basic and applied research to an e-Human Grid Ecology rationale allows the Inverse Tragedy of the Commons of the grid sector to be better understood and approached, and implies obvious guidelines for the day-to-day management of grid and other (networked) resources, which is of importance for many fields with similar paradoxes in (e-)society.

  3. Planar triode pulser socket

    DOEpatents

    Booth, Rex

    1994-01-01

    A planar triode is mounted in a PC board orifice by means of a U-shaped capacitor housing and anode contact yoke removably attached to cathode leg extensions passing through and soldered to the cathode side of the PC board by means of a PC cathode pad. A pliant/flexible contact attached to the orifice makes triode grid contact with a grid pad on the grid side of the PC board, permitting quick and easy replacement of bad triodes.

  4. 75 FR 42727 - Implementing the National Broadband Plan; Comment Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ..., state, and private entities seek to develop Smart Grid technologies. The second RFI requested information on the evolving needs of electric utilities as Smart Grid technologies are more broadly deployed... accept reply comments, data, and information regarding the National Broadband Plan RFI: Data Access and...

  5. Subject Expression in L2 Spanish: Convergence of Generative and Usage-Based Perspectives?

    ERIC Educational Resources Information Center

    Zyzik, Eve

    2017-01-01

    The extensive literature on subject expression in Spanish makes for rich comparisons between generative (formal) and usage-based (functional) approaches to language acquisition. This article explores how the problem of subject expression has been conceptualized within each research tradition, as well as unanswered questions that both approaches…

  6. Elastic extension of a local analysis facility on external clouds for the LHC experiments

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Codispoti, G.; Rinaldi, L.; Aiftimiei, D. C.; Bonacorsi, D.; Calligola, P.; Dal Pra, S.; De Girolamo, D.; Di Maria, R.; Grandi, C.; Michelotto, D.; Panella, M.; Taneja, S.; Semeria, F.

    2017-10-01

    The computing infrastructures serving the LHC experiments have been designed to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, the LHC experiments are exploring the opportunity to access Cloud resources provided by external partners or commercial providers. In this work we present the proof of concept of the elastic extension of a local analysis facility, specifically the Bologna Tier-3 Grid site, for the LHC experiments hosted at the site, on an external OpenStack infrastructure. We focus on the Cloud Bursting of the Grid site using DynFarm, a newly designed tool that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on an OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time they serve as an extension of the farm for the local usage.

  7. A Pseudo-Temporal Multi-Grid Relaxation Scheme for Solving the Parabolized Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    White, J. A.; Morrison, J. H.

    1999-01-01

    A multi-grid, flux-difference-split, finite-volume code, VULCAN, is presented for solving the elliptic and parabolized form of the equations governing three-dimensional, turbulent, calorically perfect and non-equilibrium chemically reacting flows. The space marching algorithms developed to improve convergence rate and/or reduce computational cost are emphasized. The algorithms presented are extensions to the class of implicit pseudo-time iterative, upwind space-marching schemes. A full approximation storage, full multi-grid scheme is also described, which is used to accelerate the convergence of a Gauss-Seidel relaxation method. The multi-grid algorithm is shown to significantly improve convergence on high aspect ratio grids.
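
    To make the idea of multi-grid acceleration of a relaxation method concrete, the sketch below wraps one coarse-grid correction around Gauss-Seidel sweeps on a 1-D Poisson model problem. It illustrates only the generic two-grid cycle; it is not the full approximation storage scheme, grids, or flow equations used in VULCAN.

      # Minimal two-grid correction sketch: Gauss-Seidel smoothing accelerated by
      # a coarse-grid correction for -u'' = f on [0,1] with u(0)=u(1)=0.
      import numpy as np

      def gauss_seidel(u, f, h, sweeps):
          for _ in range(sweeps):
              for i in range(1, len(u) - 1):
                  u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
          return u

      def two_grid(u, f, h, pre=3, post=3):
          u = gauss_seidel(u, f, h, pre)                                # pre-smoothing
          r = np.zeros_like(u)
          r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2     # residual
          rc = r[::2].copy()                                            # restrict (injection)
          ec = gauss_seidel(np.zeros_like(rc), rc, 2 * h, 50)           # coarse-grid solve
          e = np.zeros_like(u)
          e[::2] = ec                                                   # prolong: copy coarse nodes
          e[1:-1:2] = 0.5 * (ec[:-1] + ec[1:])                          # prolong: linear interpolation
          u += e                                                        # coarse-grid correction
          return gauss_seidel(u, f, h, post)                            # post-smoothing

      n = 65
      h = 1.0 / (n - 1)
      x = np.linspace(0.0, 1.0, n)
      f = np.pi**2 * np.sin(np.pi * x)        # exact solution u = sin(pi x)
      u = np.zeros(n)
      for cycle in range(10):
          u = two_grid(u, f, h)
      print("max error after 10 two-grid cycles:", np.max(np.abs(u - np.sin(np.pi * x))))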

  8. A Data-Driven Approach to Interactive Visualization of Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jun

    Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools to assist grid operators to perform mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving the business practices in today's electric power industry. The conducted investigation, however, has revealed that the existing commercial power grid visualization tools heavily rely on human designers, hindering the user's ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain the pre-designed visual displays. This project proposes a data-driven approach to overcome the common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on-the-fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost for building and maintaining visual displays. The research and development (R&D) conducted in this project is mainly divided into two phases. The first phase (Phase I & II) focuses on developing data driven techniques for visualization of power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams, fuzzy model based rich data visualization for situational awareness, etc. The R&D conducted during the second phase (Phase IIB) focuses on enhancing the prototyped data driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototyped tool developed during the first phase into a commercial grade product. We will use one of the identified application areas as an example to demonstrate how research results achieved in this project are successfully utilized to address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven to be promising for building the next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently being leveraged by more than 60 utility organizations in North America and Europe.

  9. Conservation conflicts across Africa.

    PubMed

    Balmford, A; Moore, J L; Brooks, T; Burgess, N; Hansen, L A; Williams, P; Rahbek, C

    2001-03-30

    There is increasing evidence that areas of outstanding conservation importance may coincide with dense human settlement or impact. We tested the generality of these findings using 1 degree-resolution data for sub-Saharan Africa. We find that human population density is positively correlated with species richness of birds, mammals, snakes, and amphibians. This association holds for widespread, narrowly endemic, and threatened species and looks set to persist in the face of foreseeable population growth. Our results contradict earlier expectations of low conflict based on the idea that species richness decreases and human impact increases with primary productivity. We find that across Africa, both variables instead exhibit unimodal relationships with productivity. Modifying priority-setting to take account of human density shows that, at this scale, conflicts between conservation and development are not easily avoided, because many densely inhabited grid cells contain species found nowhere else.

  10. Planar triode pulser socket

    DOEpatents

    Booth, R.

    1994-10-25

    A planar triode is mounted in a PC board orifice by means of a U-shaped capacitor housing and anode contact yoke removably attached to cathode leg extensions passing through and soldered to the cathode side of the PC board by means of a PC cathode pad. A pliant/flexible contact attached to the orifice makes triode grid contact with a grid pad on the grid side of the PC board, permitting quick and easy replacement of bad triodes. 14 figs.

  11. Distributed Computing Framework for Synthetic Radar Application

    NASA Technical Reports Server (NTRS)

    Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael

    2006-01-01

    We are developing an extensible software framework, in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python based software framework, that is the framework elements of the middleware that allows developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech), and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
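
    A toy sketch of the framework idea described above (processing functions wrapped as interchangeable components and wired into a data-flow graph) is given below. The class and method names are invented for illustration; they are not the actual Pyre API or its service-deployment facilities.

      # Toy sketch of a component-based data-flow pipeline in the spirit described
      # above: processing stages are interchangeable objects wired into a graph.
      # Names are invented for illustration; this is not the Pyre API.
      class Component:
          def process(self, data):
              raise NotImplementedError

      class RangeCompress(Component):
          def process(self, data):
              return [x * 0.5 for x in data]          # stand-in for a radar step

      class Detect(Component):
          def process(self, data):
              return [x for x in data if x > 1.0]     # stand-in for a detection step

      class Pipeline(Component):
          """A pipeline is itself a component, so graphs can be nested."""
          def __init__(self, *stages):
              self.stages = stages
          def process(self, data):
              for stage in self.stages:
                  data = stage.process(data)
              return data

      echoes = [0.4, 2.2, 3.1, 0.9, 5.0]
      flow = Pipeline(RangeCompress(), Detect())
      print(flow.process(echoes))                     # -> [1.1, 1.55, 2.5]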

  12. EPPRD: An Efficient Privacy-Preserving Power Requirement and Distribution Aggregation Scheme for a Smart Grid.

    PubMed

    Zhang, Lei; Zhang, Jing

    2017-08-07

    A Smart Grid (SG) facilitates bidirectional demand-response communication between individual users and power providers with high computation and communication performance, but it also brings about the risk of leaking users' private information. Therefore, improving the individual power requirement and distribution efficiency to ensure communication reliability while preserving user privacy is a new challenge for SG. To address this issue, we propose an efficient and privacy-preserving power requirement and distribution aggregation scheme (EPPRD) based on a hierarchical communication architecture. In the proposed scheme, an efficient encryption and authentication mechanism is proposed to better fit each individual demand-response situation. Through extensive analysis and experiment, we demonstrate how the EPPRD resists various security threats and preserves user privacy while satisfying the individual requirement in a semi-honest model; it involves less communication overhead and computation time than the existing competing schemes.

  13. EPPRD: An Efficient Privacy-Preserving Power Requirement and Distribution Aggregation Scheme for a Smart Grid

    PubMed Central

    Zhang, Lei; Zhang, Jing

    2017-01-01

    A Smart Grid (SG) facilitates bidirectional demand-response communication between individual users and power providers with high computation and communication performance, but it also brings about the risk of leaking users' private information. Therefore, improving the individual power requirement and distribution efficiency to ensure communication reliability while preserving user privacy is a new challenge for SG. To address this issue, we propose an efficient and privacy-preserving power requirement and distribution aggregation scheme (EPPRD) based on a hierarchical communication architecture. In the proposed scheme, an efficient encryption and authentication mechanism is proposed to better fit each individual demand-response situation. Through extensive analysis and experiment, we demonstrate how the EPPRD resists various security threats and preserves user privacy while satisfying the individual requirement in a semi-honest model; it involves less communication overhead and computation time than the existing competing schemes. PMID:28783122

  14. AGScan: a pluggable microarray image quantification software based on the ImageJ library.

    PubMed

    Cathelin, R; Lopez, F; Klopp, Ch

    2007-01-15

    Many different programs are available to analyze microarray images. Most programs are commercial packages; some are free. In the latter group, only a few offer automatic grid alignment and batch mode. More often than not, a program implements only one quantification algorithm. AGScan is an open source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system to add new functions to manipulate images, align grids and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under the X11 Licence. The installation instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. Questions and plug-ins can be sent to the contact listed below.

  15. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  16. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  17. Design and performance analysis of generalised integrator-based controller for grid connected PV system

    NASA Astrophysics Data System (ADS)

    Saxena, Hemant; Singh, Alka; Rai, J. N.

    2018-07-01

    This article discusses the design and control of a single-phase grid-connected photovoltaic (PV) system. A 5-kW PV system is designed and integrated at the DC link of an H-bridge voltage source converter (VSC). The control of the VSC and the switching logic are modelled using a generalised integrator (GI). The GI and its variants, such as the second-order GI, have recently evolved for synchronisation and are being used as phase-locked loop (PLL) circuits for grid integration. The design of PLL circuits and the use of transformations such as Park's and Clarke's are much easier in three-phase systems, but obtaining in-phase and quadrature components becomes an important and challenging issue in single-phase systems. This article addresses this issue and discusses an altogether different application of the GI for the design of a compensator based on the extraction of in-phase and quadrature components. The GI is frequently used as a PLL; however, in this article, it is not used for synchronisation purposes. A new controller has been designed for a single-phase grid-connected PV system working as a single-phase active compensator. Extensive simulation results are shown for the operation of the integrated PV system under different atmospheric and operating conditions during both day and night. Experimental results demonstrating the proposed control approach are presented and discussed for the hardware set-up developed in the laboratory.
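
    A minimal numerical sketch of how a second-order generalised integrator can extract in-phase and quadrature components from a single-phase signal is given below. The forward-Euler discretisation, the 50 Hz nominal frequency, and the gain value are assumptions made for illustration; they are not the discretisation or compensator design of the article.

      # Sketch of a second-order generalised integrator (SOGI) used as a quadrature
      # signal generator: it produces an in-phase component v_a and a 90-degree
      # lagging component v_b from a single-phase input.  Forward-Euler integration
      # and the parameter values are illustrative assumptions only.
      import math

      f_grid = 50.0                 # assumed nominal grid frequency [Hz]
      w = 2.0 * math.pi * f_grid    # resonance frequency [rad/s]
      k = 1.41                      # SOGI damping gain (typical choice ~ sqrt(2))
      dt = 1.0e-5                   # integration step [s]

      v_a, v_b = 0.0, 0.0
      for n in range(int(0.1 / dt)):                 # simulate 100 ms
          t = n * dt
          v_in = math.sin(w * t)                     # unit single-phase voltage
          dv_a = w * (k * (v_in - v_a) - v_b)
          dv_b = w * v_a
          v_a += dv_a * dt
          v_b += dv_b * dt

      # After the start-up transient, v_a ~ sin(wt) and v_b ~ -cos(wt) (quadrature).
      print(f"v_a = {v_a:+.3f}, v_b = {v_b:+.3f} at t = 0.1 s")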

  18. Spaceflight Operations Services Grid (SOSG)

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.

    2004-01-01

    In an effort to adapt existing space flight operations services to new emerging Grid technologies we are developing a Grid-based prototype space flight operations Grid. This prototype is based on the operational services being provided to the International Space Station's Payload operations located at the Marshall Space Flight Center, Alabama. The prototype services will be Grid or Web enabled and provided to four user communities through portal technology. Users will have the opportunity to assess the value and feasibility of Grid technologies to their specific areas or disciplines. In this presentation descriptions of the prototype development, User-based services, Grid-based services and status of the project will be presented. Expected benefits, findings and observations (if any) to date will also be discussed. The focus of the presentation will be on the project in general, status to date and future plans. The End-use services to be included in the prototype are voice, video, telemetry, commanding, collaboration tools and visualization among others. Security is addressed throughout the project and is being designed into the Grid technologies and standards development. The project is divided into three phases. Phase One establishes the baseline User-based services required for space flight operations listed above. Phase Two involves applying Grid/web technologies to the User-based services and development of portals for access by users. Phase Three will allow NASA and end users to evaluate the services and determine the future of the technology as applied to space flight operational services. Although Phase One, which includes the development of the quasi-operational User-based services of the prototype, will be completed by March 2004, the application of Grid technologies to these services will have just begun. We will provide the status of the Grid technologies applied to the individual User-based services. This effort will result in an extensible environment that incorporates existing and new spaceflight services into a standards-based framework providing current and future NASA programs with cost savings and new and evolvable methods to conduct science. This project will demonstrate how the use of new programming paradigms such as web and grid services can provide three significant benefits to the cost-effective delivery of spaceflight services. They will enable applications to operate more efficiently by being able to utilize pooled resources. They will also permit the reuse of common services to rapidly construct new and more powerful applications. Finally they will permit easy and secure access to services via a combination of grid and portal technology by a distributed user community consisting of NASA operations centers, scientists, the educational community and even the general population as outreach. The approach will be to deploy existing mission support applications such as the Telescience Resource Kit (TReK) and new applications under development, such as the Grid Video Distribution System (GViDS), together with existing grid applications and services such as high-performance computing and visualization services provided by NASA's Information Power Grid (IPG) in the MSFC's Payload Operations Integration Center (POIC) HOSC Annex. Once the initial applications have been moved to the grid, a process will begin to apply the new programming paradigms to integrate them where possible.
For example, with GViDS, instead of viewing the Distribution service as an application that must run on a single node, the new approach is to build it such that it can be dispatched across a pool of resources in response to dynamic loads. To make this a reality, reusable services will be critical, such as a brokering service to locate the appropriate resource within the pool. This brokering service can then be used by other applications such as the TReK. To expand further, if the GViDS application is constructed using a services-based model, then other applications such as the Video Auditorium can use GViDS as a service to easily incorporate these video streams into a collaborative conference. Finally, as these applications are re-factored into this new services-based paradigm, the construction of portals to integrate them will be a simple process. As a result, portals can be tailored to meet the requirements of specific user communities.

  19. Assigning value to energy storage systems at multiple points in an electrical grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balducci, Patrick J.; Alam, M. Jan E.; Hardy, Trevor D.

    This article presents a taxonomy for assigning benefits to the services provided by energy storage systems (ESSs), defines approaches for monetizing the value associated with these services, assigns values to major ESS applications by region based on a review of an extensive set of literature, and summarizes and evaluates the capabilities of several tools currently used to estimate value for specific ESS deployments.

  20. Assigning value to energy storage systems at multiple points in an electrical grid

    DOE PAGES

    Balducci, Patrick J.; Alam, M. Jan E.; Hardy, Trevor D.; ...

    2018-01-01

    This article presents a taxonomy for assigning benefits to the services provided by energy storage systems (ESSs), defines approaches for monetizing the value associated with these services, assigns values to major ESS applications by region based on a review of an extensive set of literature, and summarizes and evaluates the capabilities of several tools currently used to estimate value for specific ESS deployments.

  1. Development of Three-Dimensional DRAGON Grid Technology

    NASA Technical Reports Server (NTRS)

    Zheng, Yao; Liou, Meng-Sing; Civinskas, Kestutis C.

    1999-01-01

    For a typical three dimensional flow in a practical engineering device, the time spent in grid generation can take 70 percent of the total analysis effort, resulting in a serious bottleneck in the design/analysis cycle. The present research attempts to develop a procedure that can considerably reduce the grid generation effort. The DRAGON grid, as a hybrid grid, is created by means of a Direct Replacement of Arbitrary Grid Overlapping by Nonstructured grid. The DRAGON grid scheme is an adaptation of the Chimera thinking. The Chimera grid is a composite structured grid, comprising a set of overlapped structured grids which are independently generated and body-fitted. The grid is of high quality and amenable to efficient solution schemes. However, the interpolation used in the overlapped region between grids introduces error, especially when a sharp-gradient region is encountered. The DRAGON grid scheme is capable of completely eliminating the interpolation and preserving the conservation property. It maximizes the advantages of the Chimera scheme and adopts the strengths of the unstructured grid while keeping its weaknesses minimal. In the present paper, we describe the progress towards extending the DRAGON grid technology into three dimensions. Essential and programming aspects of the extension, and new challenges for the three-dimensional cases, are addressed.

  2. Mass production of extensive air showers for the Pierre Auger Collaboration using Grid Technology

    NASA Astrophysics Data System (ADS)

    Lozano Bahilo, Julio; Pierre Auger Collaboration

    2012-06-01

    When ultra-high energy cosmic rays enter the atmosphere they interact producing extensive air showers (EAS) which are the objects studied by the Pierre Auger Observatory. The number of particles involved in an EAS at these energies is of the order of billions and the generation of a single simulated EAS requires many hours of computing time with current processors. In addition, the storage space consumed by the output of one simulated EAS is very high. Therefore we have to make use of Grid resources to be able to generate sufficient quantities of showers for our physics studies in reasonable time periods. We have developed a set of highly automated scripts written in common software scripting languages in order to deal with the high number of jobs which we have to submit regularly to the Grid. In spite of the low number of sites supporting our Virtual Organization (VO) we have reached the top spot on CPU consumption among non LHC (Large Hadron Collider) VOs within EGI (European Grid Infrastructure).
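
    The automation described above follows a common bookkeeping pattern: generate one input card per simulated shower, submit each card, and record the returned job identifiers for later monitoring. The sketch below only illustrates that pattern; the card fields, file names, and the submit_job placeholder are hypothetical and do not correspond to the Auger production scripts or to any particular Grid middleware command.

      # Illustrative bookkeeping pattern for mass production of simulated showers.
      # The card format, the submit_job() call and the file names are hypothetical.
      import json, random, pathlib

      OUTDIR = pathlib.Path("shower_cards")
      OUTDIR.mkdir(exist_ok=True)

      def make_card(run_id, log10_energy_eV, zenith_deg):
          card = {"run": run_id, "lgE": log10_energy_eV, "theta": zenith_deg,
                  "seed": random.randrange(1, 2**31)}
          path = OUTDIR / f"shower_{run_id:06d}.json"
          path.write_text(json.dumps(card, indent=2))
          return path

      def submit_job(card_path):
          # placeholder: a real production script would call the Grid submission
          # tool for the VO here and return the job identifier it reports
          return f"job-{card_path.stem}"

      registry = {}
      for run_id in range(10):                         # small demo batch
          card = make_card(run_id, log10_energy_eV=19.5,
                           zenith_deg=random.uniform(0.0, 60.0))
          registry[submit_job(card)] = str(card)

      pathlib.Path("submitted_jobs.json").write_text(json.dumps(registry, indent=2))
      print(f"queued {len(registry)} shower jobs")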

  3. Nonlinear adaptive control of grid-connected three-phase inverters for renewable energy applications

    NASA Astrophysics Data System (ADS)

    Mahdian-Dehkordi, N.; Namvar, M.; Karimi, H.; Piya, P.; Karimi-Ghartemani, M.

    2017-01-01

    Distributed generation (DG) units are often interfaced to the main grid using power electronic converters including voltage-source converters (VSCs). A VSC offers dc/ac power conversion, high controllability, and fast dynamic response. Because of nonlinearities, uncertainties, and system parameters' changes involved in the nature of a grid-connected renewable DG system, conventional linear control methods cannot completely and efficiently address all control objectives. In this paper, a nonlinear adaptive control scheme based on adaptive backstepping strategy is presented to control the operation of a grid-connected renewable DG unit. As compared to the popular vector control technique, the proposed controller offers smoother transient responses, and lower level of current distortions. The Lyapunov approach is used to establish global asymptotic stability of the proposed control system. Linearisation technique is employed to develop guidelines for parameters tuning of the controller. Extensive time-domain digital simulations are performed and presented to verify the performance of the proposed controller when employed in a VSC to control the operation of a two-stage DG unit and also that of a single-stage solar photovoltaic system. Desirable and superior performance of the proposed controller is observed.

  4. Wave propagation in anisotropic elastic materials and curvilinear coordinates using a summation-by-parts finite difference method

    DOE PAGES

    Petersson, N. Anders; Sjogreen, Bjorn

    2015-07-20

    We develop a fourth order accurate finite difference method for solving the three-dimensional elastic wave equation in general heterogeneous anisotropic materials on curvilinear grids. The proposed method is an extension of the method for isotropic materials, previously described in the paper by Sjögreen and Petersson (2012) [11]. The method we proposed discretizes the anisotropic elastic wave equation in second order formulation, using a node centered finite difference method that satisfies the principle of summation by parts. The summation by parts technique results in a provably stable numerical method that is energy conserving. Also, we generalize and evaluate the super-grid far-field technique for truncating unbounded domains. Unlike the commonly used perfectly matched layers (PML), the super-grid technique is stable for general anisotropic material, because it is based on a coordinate stretching combined with an artificial dissipation. Moreover, the discretization satisfies an energy estimate, proving that the numerical approximation is stable. We demonstrate by numerical experiments that sufficiently wide super-grid layers result in very small artificial reflections. Applications of the proposed method are demonstrated by three-dimensional simulations of anisotropic wave propagation in crystals.
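
    For reference, the summation-by-parts property mentioned above can be written, for a one-dimensional first-derivative operator (the second-derivative operators used in the paper satisfy an analogous identity), as

      D = H^{-1} Q, \qquad Q + Q^{\top} = B = \mathrm{diag}(-1, 0, \ldots, 0, 1), \qquad H = H^{\top} > 0,

    so that for grid functions u and v

      u^{\top} H (D v) + (D u)^{\top} H v = u^{\top} B v = u_N v_N - u_1 v_1,

    which mimics integration by parts and is what allows the discrete energy estimate referred to in the abstract.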

  5. Advanced microgrid design and analysis for forward operating bases

    NASA Astrophysics Data System (ADS)

    Reasoner, Jonathan

    This thesis takes a holistic approach to creating an improved electric power generation system for a forward operating base (FOB) in the future through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research also looks at some of the challenges associated with incorporating a design which relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable energy powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid small-scale power grid which utilizes high penetration levels of renewable energy.

  6. Research in millimeter wave techniques

    NASA Technical Reports Server (NTRS)

    Mcmillan, R. W.

    1978-01-01

    During the past six months, efforts on this project have been devoted to: (1) continuation of construction and testing of a 6 GHz subharmonic mixer model with extension of the pumping frequency of this mixer to ω_s/4, (2) construction of a 183 GHz subharmonic mixer based on the results of tests on this 6 GHz model, (3) ground-based radiometric measurements at 183 GHz, (4) fabrication and testing of wire grid interferometers, (5) calculations of reflected and lost power in these interferometers, and (6) calculations of the antenna temperature due to water vapor to be expected in down-looking radiometry as a function of frequency. Significant events during the past six months include: (1) receipt of a 183 GHz single-ended fundamental mixer, (2) attainment of 6 dB single sideband conversion loss with the 6 GHz subharmonic mixer model by using a 1.5 GHz (ω_s/4) pump frequency, (3) additional ground-based radiometric measurements and (4) derivation of equations for reflection and loss for wire grid interferometers.

  7. Numerical Simulation of Flow Through an Artificial Heart

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; Kutler, Paul; Kwak, Dochan; Kiris, Cetin

    1989-01-01

    A solution procedure was developed that solves the unsteady, incompressible Navier-Stokes equations, and was used to numerically simulate viscous incompressible flow through a model of the Pennsylvania State artificial heart. The solution algorithm is based on the artificial compressibility method, and uses flux-difference splitting to upwind the convective terms; a line-relaxation scheme is used to solve the equations. The time-accuracy of the method is obtained by iteratively solving the equations at each physical time step. The artificial heart geometry involves a piston-type action with a moving solid wall. A single H-grid is fit inside the heart chamber. The grid is continuously compressed and expanded with a constant number of grid points to accommodate the moving piston. The computational domain ends at the valve openings, where nonreflective boundary conditions based on the method of characteristics are applied. Although a number of simplifying assumptions were made regarding the geometry, the computational results agreed reasonably well with an experimental picture. The computer time requirements for this flow simulation, however, are quite extensive. Computational study of this type of geometry would benefit greatly from improvements in computer hardware speed and algorithm efficiency enhancements.
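
    The artificial compressibility idea referenced above augments the continuity equation with a pseudo-time pressure derivative. Written generically (this is the textbook form, not necessarily the exact formulation of the solver described here), with pseudo-time τ and artificial compressibility parameter β:

      \frac{\partial p}{\partial \tau} + \beta \, \nabla \cdot \mathbf{u} = 0, \qquad
      \frac{\partial \mathbf{u}}{\partial \tau} + \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u} = -\nabla p + \nu \nabla^{2}\mathbf{u} .

    Iterating in τ at each physical time step drives ∂p/∂τ toward zero, so the divergence-free constraint ∇·u = 0 is recovered once the pseudo-time iteration converges, which is how this class of methods obtains time accuracy.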

  8. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortless for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  9. A digital repository with an extensible data model for biobanking and genomic analysis management.

    PubMed

    Izzo, Massimiliano; Mortola, Francesco; Arnulfo, Gabriele; Fato, Marco M; Varesio, Luigi

    2014-01-01

    Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research evolves into international multi-disciplinary collaborations and increasing data sharing among institutions. A single standardization is not feasible, and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data is described by a set of user-defined metadata, and may have one or more associated files. We integrated the model in a web based digital repository with a data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples of over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata for information sharing in specific research projects and purposes. This approach can considerably improve interdisciplinary research collaboration and allows patients' clinical records, sample management information, and genomic data to be tracked. The web interface allows operators to easily manage, query, and annotate the files without dealing with the technicalities of the data grid.
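
    A minimal sketch of the process/event hierarchy described above, with user-defined metadata and attached file references serialised as JSON, is given below. The field names and the sample values are illustrative assumptions, not the repository's actual schema.

      # Minimal sketch of the process/event hierarchy: a process (research study)
      # groups sequential events (analysis steps), each carrying user-defined
      # metadata and optional file references.  Field names are illustrative only.
      import json

      def event(name, metadata, files=None):
          return {"type": "event", "name": name,
                  "metadata": metadata, "files": files or []}

      def process(name, metadata, events):
          return {"type": "process", "name": name,
                  "metadata": metadata, "events": events}

      sample_history = process(
          name="study-01",
          metadata={"patient_id": "P-0042", "consent": True},
          events=[
              event("sample-collection",
                    {"tissue": "bone marrow", "collected_on": "2013-05-14"}),
              event("microarray-analysis",
                    {"platform": "expression array", "normalisation": "quantile"},
                    files=["grid://biobank/arrays/P-0042_run1.cel"]),
          ],
      )

      print(json.dumps(sample_history, indent=2))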

  10. A digital repository with an extensible data model for biobanking and genomic analysis management

    PubMed Central

    2014-01-01

    Motivation: Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research is evolving into international multi-disciplinary collaborations and increasing data sharing among institutions. Single standardization is not feasible and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. Results: We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data are described by a set of user-defined metadata, and may have one or more associated files. We integrated the model in a web-based digital repository with data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples of over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Conclusions: Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata, for information sharing in specific research projects and purposes. This approach can considerably improve interdisciplinary research collaboration and allows tracking of patients' clinical records, sample management information, and genomic data. The web interface allows the operators to easily manage, query, and annotate the files, without dealing with the technicalities of the data grid. PMID:25077808

  11. Functional Resilience against Climate-Driven Extinctions – Comparing the Functional Diversity of European and North American Tree Floras

    PubMed Central

    Liebergesell, Mario; Stahl, Ulrike; Freiberg, Martin; Welk, Erik; Kattge, Jens; Cornelissen, J. Hans C.; Peñuelas, Josep

    2016-01-01

    Future global change scenarios predict a dramatic loss of biodiversity for many regions in the world, potentially reducing the resistance and resilience of ecosystem functions. Once before, during Plio-Pleistocene glaciations, harsher climatic conditions in Europe as compared to North America led to a more depauperate tree flora. Here we hypothesize that this climate driven species loss has also reduced functional diversity in Europe as compared to North America. We used variation in 26 traits for 154 North American and 66 European tree species and grid-based co-occurrences derived from distribution maps to compare functional diversity patterns of the two continents. First, we identified similar regions with respect to contemporary climate in the temperate zone of North America and Europe. Second, we compared the functional diversity of both continents and for the climatically similar sub-regions using the functional dispersion-index (FDis) and the functional richness index (FRic). Third, we accounted in these comparisons for grid-scale differences in species richness, and, fourth, investigated the associated trait spaces using dimensionality reduction. For gymnosperms we find similar functional diversity on both continents, whereas for angiosperms functional diversity is significantly greater in Europe than in North America. These results are consistent across different scales, for climatically similar regions and considering species richness patterns. We decomposed these differences in trait space occupation into differences in functional diversity vs. differences in functional identity. We show that climate-driven species loss on a continental scale might be decoupled from or at least not linearly related to changes in functional diversity. This might be important when analyzing the effects of climate-driven biodiversity change on ecosystem functioning. PMID:26848836

  12. Occurrence and countermeasures of urban power grid accident

    NASA Astrophysics Data System (ADS)

    Wei, Wang; Tao, Zhang

    2018-03-01

    With the advance of technology, the development of network communication and the extensive use of power grids, people can learn about power grid accidents around the world through the network in a timely manner. Power grid accidents occur frequently. Large-scale power system blackouts and casualty accidents caused by electric shock are also fairly commonplace. These accidents have seriously endangered the property and personal safety of the country and its people, and social and economic development is severely affected by them. Drawing on several typical cases of power grid accidents at home and abroad in recent years as the research object, this paper analyzes the three major factors that currently cause power grid accidents. Taking into account the various contributing factors and the impacts caused by power grid accidents, the paper then puts forward corresponding solutions and suggestions to prevent the occurrence of accidents and to lessen their impact.

  13. Efficient parallel seismic simulations including topography and 3-D material heterogeneities on locally refined composite grids

    NASA Astrophysics Data System (ADS)

    Petersson, Anders; Rodgers, Arthur

    2010-05-01

    The finite difference method on a uniform Cartesian grid is a highly efficient and easy to implement technique for solving the elastic wave equation in seismic applications. However, the spacing in a uniform Cartesian grid is fixed throughout the computational domain, whereas the resolution requirements in realistic seismic simulations usually are higher near the surface than at depth. This can be seen from the well-known formula h ≤ L/P, which relates the grid spacing h to the wavelength L and the required number of grid points per wavelength P for obtaining an accurate solution. The compressional and shear wavelengths in the earth generally increase with depth and are often a factor of ten larger below the Moho discontinuity (at about 30 km depth) than in sedimentary basins near the surface. A uniform grid must have a grid spacing based on the small wavelengths near the surface, which results in over-resolving the solution at depth. As a result, the number of points in a uniform grid is unnecessarily large. In the wave propagation project (WPP) code, we address the over-resolution-at-depth issue by generalizing our previously developed single grid finite difference scheme to work on a composite grid consisting of a set of structured rectangular grids of different spacings, with hanging nodes on the grid refinement interfaces. The computational domain in a regional seismic simulation often extends to depth 40-50 km. Hence, using a refinement ratio of two, we need about three grid refinements from the bottom of the computational domain to the surface, to keep the local grid size in approximate parity with the local wavelengths. The challenge of the composite grid approach is to find a stable and accurate method for coupling the solution across the grid refinement interface. Of particular importance is the treatment of the solution at the hanging nodes, i.e., the fine grid points which are located in between coarse grid points. WPP implements a new, energy conserving, coupling procedure for the elastic wave equation at grid refinement interfaces. When used together with our single grid finite difference scheme, it results in a method which is provably stable, without artificial dissipation, for arbitrary heterogeneous isotropic elastic materials. The new coupling procedure is based on satisfying the summation-by-parts principle across refinement interfaces. From a practical standpoint, an important advantage of the proposed method is the absence of tunable numerical parameters, which seldom are appreciated by application experts. In WPP, the composite grid discretization is combined with a curvilinear grid approach that enables accurate modeling of free surfaces on realistic (non-planar) topography. The overall method satisfies the summation-by-parts principle and is stable under a CFL time step restriction. A feature of great practical importance is that WPP automatically generates the composite grid based on the user-provided topography and the depths of the grid refinement interfaces. The WPP code has been verified extensively, for example using the method of manufactured solutions, by solving Lamb's problem, by solving various layer-over-half-space problems and comparing to semi-analytic (FK) results, and by simulating scenario earthquakes where results from other seismic simulation codes are available. WPP has also been validated against seismographic recordings of moderate earthquakes. WPP performs well on large parallel computers and has been run on up to 32,768 processors using about 26 billion grid points (78 billion DOF) and 41,000 time steps. WPP is an open source code that is available under the GNU General Public License.
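
    The grid-spacing rule quoted above can be turned into a quick estimate of how many refinement levels a composite grid needs. The following sketch uses illustrative material velocities and a hypothetical points-per-wavelength value, not parameters from WPP itself.

      # h <= L / P with L = v_min / f_max: the coarsest admissible spacing for a layer.
      import math

      def max_spacing(v_min_m_s, f_max_hz, points_per_wavelength=8):
          return v_min_m_s / (f_max_hz * points_per_wavelength)

      h_surface = max_spacing(600.0, f_max_hz=2.0)    # slow near-surface sediments (illustrative)
      h_depth = max_spacing(4200.0, f_max_hz=2.0)     # faster material below the Moho (illustrative)

      # With a refinement ratio of two, the number of refinement interfaces needed to
      # keep the local spacing in parity with the local wavelength:
      n_refinements = math.ceil(math.log2(h_depth / h_surface))
      print(h_surface, h_depth, n_refinements)        # -> 37.5 262.5 3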

  14. Three-dimensional unstructured grid refinement and optimization using edge-swapping

    NASA Technical Reports Server (NTRS)

    Gandhi, Amar; Barth, Timothy

    1993-01-01

    This paper presents a three-dimensional (3-D) edge-swapping method based on local transformations. This method extends Lawson's edge-swapping algorithm into 3-D. The 3-D edge-swapping algorithm is employed for the purpose of refining and optimizing unstructured meshes according to arbitrary mesh-quality measures. Several criteria including Delaunay triangulations are examined. Extensions from two to three dimensions of several known properties of Delaunay triangulations are also discussed.

  15. Small-Scale Smart Grid Construction and Analysis

    NASA Astrophysics Data System (ADS)

    Surface, Nicholas James

    The smart grid (SG) is a commonly used catch-phrase in the energy industry, yet there is no universally accepted definition. The objectives and most useful concepts have been investigated extensively in economic, environmental and engineering research by applying statistical knowledge and established theories to develop simulations without constructing physical models. In this study, a small-scale version (SSSG) is constructed to physically represent these ideas so they can be evaluated. Results of construction show that data acquisition was three times more expensive than the grid itself, largely because 70% of data acquisition costs could not be downsized to small scale. Experimentation on the fully assembled grid exposes the limitations of low-cost modified-sine-wave power, limitations significant enough to recommend investment in pure sine wave power for future SSSG iterations. Findings can be projected to a full-size SG at a ratio of 1:10, based on the appliance representing average US household peak daily load. However, this exposes disproportionalities in the SSSG compared with previous SG investigations, and recommended changes for future iterations are established to remedy this issue. Also discussed are other ideas investigated in the literature and their suitability for SSSG incorporation. It is highly recommended to develop a user-friendly bidirectional charger to more accurately represent vehicle-to-grid (V2G) infrastructure. Smart homes, BEV swap stations and pumped hydroelectric storage can also be researched on future iterations of the SSSG.

  16. National Grid Deep Energy Retrofit Pilot Program—Clark Residence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-03-30

    In this case study, Building Science Corporation partnered with the local utility company, National Grid, on a deep energy retrofit of a Massachusetts home. This project involved the renovation of an 18th century Cape-style building and achieved a super-insulated enclosure (R-35 walls, R-50+ roof, R-20+ foundation), extensive water management improvements, a high-efficiency water heater, and state-of-the-art ventilation.

  17. IEC 61850: Technology Standards and Cyber-Security Threats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A; El Hariri, Mohamed; Bugay, Nicole

    Substations constitute a fundamental part in providing reliable electricity to consumers. For a substation to maintain electricity reliability and its own real-time operability, communication between its components is inevitable. Before the emergence of IEC 61850, inter-substation communication was established via expensive copper wires with limited capabilities. IEC 61850 is the standard set by the International Electrotechnical Commission (IEC) Technical Committee Number 57 Working Group 10 and IEEE for Ethernet (IEEE 802.3)-based communication in electrical substations. Like many power grid systems standards, IEC 61850 was set without extensive consideration for critical security measures. This paper discusses IEC 61850 technology standards and applications thoroughly and points out major security vulnerabilities it introduces in the context of current cyber-physical smart grid systems.

  18. Impact of spatial proxies on the representation of bottom-up emission inventories: A satellite-based analysis

    NASA Astrophysics Data System (ADS)

    Geng, Guannan; Zhang, Qiang; Martin, Randall V.; Lin, Jintai; Huo, Hong; Zheng, Bo; Wang, Siwen; He, Kebin

    2017-03-01

    Spatial proxies used in bottom-up emission inventories to derive the spatial distributions of emissions are usually empirical and involve additional levels of uncertainty. Although uncertainties in current emission inventories have been discussed extensively, uncertainties resulting from improper spatial proxies have rarely been evaluated. In this work, we investigate the impact of spatial proxies on the representation of gridded emissions by comparing six gridded NOx emission datasets over China developed from the same magnitude of emissions and different spatial proxies. GEOS-Chem-modeled tropospheric NO2 vertical columns simulated from different gridded emission inventories are compared with satellite-based columns. The results show that differences between modeled and satellite-based NO2 vertical columns are sensitive to the spatial proxies used in the gridded emission inventories. The total population density is less suitable for allocating NOx emissions than nighttime light data because population density tends to allocate more emissions to rural areas. Determining the exact locations of large emission sources could significantly strengthen the correlation between modeled and observed NO2 vertical columns. Using vehicle population and an updated road network for the on-road transport sector could substantially enhance urban emissions and improve the model performance. When further applying industrial gross domestic product (IGDP) values for the industrial sector, modeled NO2 vertical columns could better capture pollution hotspots in urban areas and exhibit the best performance of the six cases compared to satellite-based NO2 vertical columns (slope = 1.01 and R² = 0.85). This analysis provides a framework for information from satellite observations to inform bottom-up inventory development. In the future, more effort should be devoted to the representation of spatial proxies to improve spatial patterns in bottom-up emission inventories.
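
    To make the role of a spatial proxy concrete, the sketch below distributes one regional emission total onto a grid in proportion to two different hypothetical proxy fields; the arrays and the total are invented for illustration and are not data from the study.

      # Same emission magnitude, different spatial proxies -> different gridded fields.
      import numpy as np

      def allocate(total_emission, proxy):
          """Distribute a regional total over grid cells in proportion to a proxy field."""
          return total_emission * proxy / proxy.sum()

      population = np.array([[5., 1., 1.],
                             [1., 2., 1.],
                             [1., 1., 5.]])        # hypothetical population density
      nightlights = np.array([[9., 1., 0.],
                              [1., 1., 0.],
                              [0., 0., 1.]])       # hypothetical nighttime light intensity

      total_nox = 100.0                            # kt/yr, hypothetical provincial total
      print(allocate(total_nox, population))       # spreads more weight into rural cells
      print(allocate(total_nox, nightlights))      # concentrates emissions in bright urban cells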

  19. The mass-loss return from evolved stars to the Large Magellanic Cloud. V. The GRAMS carbon-star model grid

    NASA Astrophysics Data System (ADS)

    Srinivasan, S.; Sargent, B. A.; Meixner, M.

    2011-08-01

    Context. Outflows from asymptotic giant branch (AGB) and red supergiant (RSG) stars inject dust into the interstellar medium. The total rate of dust return provides an important constraint to galactic chemical evolution models. However, this requires detailed radiative transfer (RT) modeling of individual stars, which becomes impractical for large data sets. An alternative approach is to select the best-fit spectral energy distribution (SED) from a grid of dust shell models, allowing for a faster determination of the luminosities and mass-loss rates for entire samples. Aims: We have developed the Grid of RSG and AGB ModelS (GRAMS) to measure the mass-loss return from evolved stars. The models span the range of stellar, dust shell and grain properties relevant to evolved stars. The GRAMS model database will be made available to the scientific community. In this paper we present the carbon-rich AGB model grid and compare our results with photometry and spectra of Large Magellanic Cloud (LMC) carbon stars from the SAGE (Surveying the Agents of Galaxy Evolution) and SAGE-Spec programs. Methods: We generate models for spherically symmetric dust shells using the 2Dust code, with hydrostatic models for the central stars. The model photospheres have effective temperatures between 2600 and 4000 K and luminosities from ~2000 L⊙ to ~40 000 L⊙. Assuming a constant expansion velocity, we explore five values of the inner radius Rin of the dust shell (1.5, 3, 4.5, 7 and 12 Rstar). We fix the outer radius at 1000 Rin. Based on the results from our previous study, we use amorphous carbon dust mixed with 10% silicon carbide by mass. The grain size distribution follows a power-law and an exponential falloff at large sizes. The models span twenty-six values of 11.3 μm optical depth, ranging from 0.001 to 4. For each model, 2Dust calculates the output SED from 0.2 to 200 μm. Results: Over 12 000 models have dust temperatures below 1800 K. For these, we derive synthetic photometry in optical, near-infrared and mid-infrared filters for comparison with available data. We find good agreement with magnitudes and colors observed for LMC carbon-rich and extreme AGB star candidates from the SAGE survey, as well as spectroscopically confirmed carbon stars from the SAGE-Spec study. Our models reproduce the IRAC colors of most of the extreme AGB star candidates, consistent with the expectation that a majority of these enshrouded stars have carbon-rich dust. Finally, we fit the SEDs of some well-studied carbon stars and compare the resulting luminosities and mass-loss rates with those from previous studies. The model grid is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/532/A54
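
    A model grid like this one is, in essence, the Cartesian product of a handful of parameter lists. The sketch below enumerates such combinations; the sampling shown echoes the ranges quoted above but is illustrative, not the actual GRAMS sampling.

      # Enumerate (Teff, inner radius, optical depth) combinations for a dust-shell grid.
      import itertools
      import numpy as np

      teff_K = range(2600, 4001, 200)                   # illustrative sampling of 2600-4000 K
      r_in_rstar = [1.5, 3.0, 4.5, 7.0, 12.0]           # inner shell radii from the abstract
      tau_11_3 = np.geomspace(0.001, 4.0, 26)           # twenty-six 11.3 micron optical depths

      grid = [{"teff_K": t, "r_in_rstar": r, "tau_11_3": float(tau)}
              for t, r, tau in itertools.product(teff_K, r_in_rstar, tau_11_3)]
      print(len(grid), grid[0])                         # each entry would drive one 2Dust run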

  20. Implementing groundwater extraction in life cycle impact assessment: characterization factors based on plant species richness for The Netherlands.

    PubMed

    van Zelm, Rosalie; Schipper, Aafke M; Rombouts, Michiel; Snepvangers, Judith; Huijbregts, Mark A J

    2011-01-15

    An operational method to evaluate the environmental impacts associated with groundwater use is currently lacking in life cycle assessment (LCA). This paper outlines a method to calculate characterization factors that address the effects of groundwater extraction on the species richness of terrestrial vegetation. Characterization factors (CF) were derived for The Netherlands and consist of a fate and an effect part. The fate factor equals the change in drawdown due to a change in groundwater extraction and expresses the amount of time required for groundwater replenishment. It was obtained with a grid-specific steady-state groundwater flow model. Effect factors were obtained from groundwater level response curves of potential plant species richness, which were constructed based on the soil moisture requirements of 625 plant species. Depending on the initial groundwater level, effect factors range up to 9.2% loss of species per 10 cm of groundwater level decrease. The total Dutch CF for groundwater extraction depended on the value choices taken and ranged from 0.09 to 0.61 m²·yr/m³. For tap water production, we showed that groundwater extraction can be responsible for up to 32% of the total terrestrial ecosystem damage. With the proposed approach, effects of groundwater extraction on terrestrial ecosystems can be systematically included in LCA.
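
    In LCA practice a characterization factor is simply multiplied by the inventory flow, here the volume of groundwater extracted. A minimal sketch, using the Dutch CF range quoted above and a hypothetical extraction volume:

      # Damage = CF x extracted volume, expressed as species loss integrated over area and time.
      def ecosystem_damage(cf_m2_yr_per_m3, extracted_m3):
          return cf_m2_yr_per_m3 * extracted_m3     # m2·yr of potentially disappeared fraction

      extracted = 1000.0                            # m3 of groundwater, hypothetical
      for cf in (0.09, 0.61):                       # low and high Dutch CF from the abstract
          print(cf, ecosystem_damage(cf, extracted))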

  1. Reptiles of Chubut province, Argentina: richness, diversity, conservation status and geographic distribution maps.

    PubMed

    Minoli, Ignacio; Morando, Mariana; Avila, Luciano Javier

    2015-01-01

    An accurate estimation of species and population geographic ranges is essential for species-focused studies and conservation and management plans. Knowledge of the geographic distributions of reptiles from Patagonian Argentina is in general limited and dispersed over manuscripts from a wide variety of topics. We completed an extensive review of reptile species of central Patagonia (Argentina) based on information from a wide variety of sources. We compiled and checked geographic distribution records from published literature and museum records, including extensive new data from the LJAMM-CNP (CENPAT-CONICET) herpetological collection. Our results show that there are 52 taxa recorded for this region and the highest species richness was seen in the families Liolaemidae and Dipsadidae, with 31 and 10 species, respectively. The Patagónica was the phytogeographic province most diverse in species, and Phymaturus was the genus of conservation concern most strongly associated with it. We present a detailed species list with geographical information, species richness, diversity analyses with comparisons across phytogeographical provinces, conservation status, taxonomic comments and distribution maps for all of these taxa.

  2. Reptiles of Chubut province, Argentina: richness, diversity, conservation status and geographic distribution maps

    PubMed Central

    Minoli, Ignacio; Morando, Mariana; Avila, Luciano Javier

    2015-01-01

    Abstract: An accurate estimation of species and population geographic ranges is essential for species-focused studies and conservation and management plans. Knowledge of the geographic distributions of reptiles from Patagonian Argentina is in general limited and dispersed over manuscripts from a wide variety of topics. We completed an extensive review of reptile species of central Patagonia (Argentina) based on information from a wide variety of sources. We compiled and checked geographic distribution records from published literature and museum records, including extensive new data from the LJAMM-CNP (CENPAT-CONICET) herpetological collection. Our results show that there are 52 taxa recorded for this region and the highest species richness was seen in the families Liolaemidae and Dipsadidae, with 31 and 10 species, respectively. The Patagónica was the phytogeographic province most diverse in species, and Phymaturus was the genus of conservation concern most strongly associated with it. We present a detailed species list with geographical information, species richness, diversity analyses with comparisons across phytogeographical provinces, conservation status, taxonomic comments and distribution maps for all of these taxa. PMID:25931966

  3. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

    Grid computing has a great potential to become a standard cyberinfrastructure for life sciences that often require high-performance computing and large data handling, which exceed the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations - for example, grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation to share tacit knowledge within a community. By extending the concept of grid from computing grid to knowledge grid, it is possible to make use of a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  4. Mass Loss at Higher Metallicity: Quantifying the Mass Return from Evolved Stars in the Galactic Bulge

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin

    Mass-losing evolved stars, and in particular asymptotic giant branch (AGB) stars and red supergiant (RSG) stars, are expected to be the major producers of dust in galaxies. This dust will help form planetary systems around future generations of stars. Our ADAP program to measure the mass loss from the AGB and RSG stars in the Magellanic Clouds is nearing completion, and we wish to extend this successful study to the Galactic bulge of the Milky Way Galaxy. Metallicity should determine the amount of elements available to condense dust in the star's outflow, so evolved stars of differing metallicities should have differing mass-loss rates. Building upon our work on evolved stars in the Magellanic Clouds, we will compare the mass-loss rates from AGB and RSG stars in the older and potentially more metal-rich Bulge to the mass-loss rates of AGB and RSG stars in the Magellanic Clouds, which have lower metallicity, making for an interesting contrast. In addition, the Galactic bulge, like the Clouds, is located at a well-determined distance (~8 kpc), thereby removing the distance ambiguities that present a major uncertainty in determining mass-loss rates and luminosities for evolved stars. To model photometric observations of outflowing dust shells around evolved stars, we have constructed the Grid of Red supergiant and Asymptotic giant branch ModelS (GRAMS; Sargent et al 2011; Srinivasan et al 2011) using the radiative transfer code 2Dust (Ueta and Meixner 2003). Our study will apply these models to the large photometric database of sources identified in the Spitzer Space Telescope GLIMPSE survey of the Milky Way and also to the various infrared spectra of Bulge AGB and RSG stars from Spitzer, ISO, etc. We have already modeled a few Galactic bulge evolved stars with GRAMS, and we will use these results as the foundation for modeling a large and representative sample of Galactic bulge evolved stars identified and measured photometrically by GLIMPSE. We will use our GRAMS grid, expanding it as necessary to enable modeling of the higher metallicity evolved stars of the Galactic bulge, along with models of other types of stars, such as YSOs (Robitaille et al 2006), to identify the evolved stars in the GLIMPSE sample of the Galactic bulge. We will use these well-tested GRAMS models, which we have already extensively applied to study populations of mass-losing evolved stars in the Magellanic Clouds, to fit the Spectral Energy Distributions (SEDs; plots of emitted power versus wavelength) of GLIMPSE Galactic bulge sources identified as RSG stars and oxygen-rich (O-rich), carbon-rich (C-rich), and extreme AGB stars. This modeling will yield stellar luminosities and mass-loss rates, as well as general dust chemistry (O-rich versus C-rich) and other essential characteristics of the dust produced by evolved stars in the galactic plane. Our ongoing Magellanic Cloud and proposed Milky Way Galactic bulge evolved star studies will lay the groundwork for future studies of evolved stars in other nearby galaxies using data from the James Webb Space Telescope and other planned missions.

  5. Convergence acceleration of viscous flow computations

    NASA Technical Reports Server (NTRS)

    Johnson, G. M.

    1982-01-01

    A multiple-grid convergence acceleration technique introduced for application to the solution of the Euler equations by means of Lax-Wendroff algorithms is extended to treat compressible viscous flow. Computational results are presented for the solution of the thin-layer version of the Navier-Stokes equations using the explicit MacCormack algorithm, accelerated by a convective coarse-grid scheme. Extensions and generalizations are mentioned.

  6. NCAR's Research Data Archive: OPeNDAP Access for Complex Datasets

    NASA Astrophysics Data System (ADS)

    Dattore, R.; Worley, S. J.

    2014-12-01

    Many datasets have complex structures including hundreds of parameters and numerous vertical levels, grid resolutions, and temporal products. Making these data accessible is a challenge for a data provider. OPeNDAP is a powerful protocol for delivering in real-time multi-file datasets that can be ingested by many analysis and visualization tools, but for these datasets there are too many choices about how to aggregate. Simple aggregation schemes can fail to support, or at least make very challenging, many potential studies based on complex datasets. We address this issue by using a rich file content metadata collection to create a real-time customized OPeNDAP service to match the full suite of access possibilities for complex datasets. The Climate Forecast System Reanalysis (CFSR) and its extension, the Climate Forecast System Version 2 (CFSv2), datasets produced by the National Centers for Environmental Prediction (NCEP) and hosted by the Research Data Archive (RDA) at the Computational and Information Systems Laboratory (CISL) at NCAR are examples of complex datasets that are difficult to aggregate with existing data server software. CFSR and CFSv2 contain 141 distinct parameters on 152 vertical levels, six grid resolutions and 36 products (analyses, n-hour forecasts, multi-hour averages, etc.) where not all parameter/level combinations are available at all grid resolution/product combinations. These data are archived in the RDA with the data structure provided by the producer; no additional re-organization or aggregation has been applied. Since 2011, users have been able to request customized subsets (e.g., temporal, parameter, spatial) from the CFSR/CFSv2, which are processed in delayed mode and then downloaded to a user's system. Until now, the complexity has made it difficult to provide real-time OPeNDAP access to the data. We have developed a service that leverages the already-existing subsetting interface and allows users to create a virtual dataset with its own structure (das, dds). The user receives a URL to the customized dataset that can be used by existing tools to ingest, analyze, and visualize the data. This presentation will detail the metadata system and OPeNDAP server that enable user-customized real-time access and show an example of how a visualization tool can access the data.
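
    From the client side, such a virtual dataset behaves like any other OPeNDAP endpoint. The sketch below shows how an analysis tool might open it lazily; the URL, variable name, and coordinate names are hypothetical, not the RDA's actual naming scheme.

      # Open a customized virtual dataset over OPeNDAP and pull a small subset.
      import xarray as xr

      url = "https://rda.example.org/opendap/cfsr/my-virtual-dataset"   # hypothetical URL
      ds = xr.open_dataset(url)                      # reads only metadata (das/dds) up front
      t500 = ds["air_temperature"].sel(level=500)    # hypothetical variable and level names
      print(t500.isel(time=slice(0, 8)).mean().values)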

  7. Phylogeny and taxonomy of Ophiognomonia (Gnomoniaceae, Diaporthales), including twenty-five new species in this highly diverse genus

    USDA-ARS?s Scientific Manuscript database

    Species of Ophiognomonia are leaf-inhabiting endophytes, pathogens, and saprobes that infect plants in the families Betulaceae, Fagaceae, Juglandaceae, Lauraceae, Malvaceae, Platanaceae, Rosaceae, Salicaceae, and Sapindaceae. Based on extensive collecting, this species-rich genus is now known to hav...

  8. Probing the Dusty Stellar Populations of the Local Volume Galaxies with JWST/MIRI

    NASA Astrophysics Data System (ADS)

    Jones, Olivia C.; Meixner, Margaret; Justtanont, Kay; Glasse, Alistair

    2017-05-01

    The Mid-Infrared Instrument (MIRI) for the James Webb Space Telescope (JWST) will revolutionize our understanding of infrared stellar populations in the Local Volume. Using the rich Spitzer-IRS spectroscopic data set and spectral classifications from the Surveying the Agents of Galaxy Evolution (SAGE)-Spectroscopic survey of more than 1000 objects in the Magellanic Clouds, the Grid of Red Supergiant and Asymptotic Giant Branch Star Model (GRAMS), and the grid of YSO models by Robitaille et al., we calculate the expected flux densities and colors in the MIRI broadband filters for prominent infrared stellar populations. We use these fluxes to explore the JWST/MIRI colors and magnitudes for composite stellar population studies of Local Volume galaxies. MIRI color classification schemes are presented; these diagrams provide a powerful means of identifying young stellar objects, evolved stars, and extragalactic background galaxies in Local Volume galaxies with a high degree of confidence. Finally, we examine which filter combinations are best for selecting populations of sources based on their JWST colors.

  9. Application of a lower-upper implicit scheme and an interactive grid generation for turbomachinery flow field simulations

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Soh, Woo-Yung; Yoon, Seokkwan

    1989-01-01

    A finite-volume lower-upper (LU) implicit scheme is used to simulate an inviscid flow in a turbine cascade. This approximate factorization scheme requires only the inversion of sparse lower and upper triangular matrices, which can be done efficiently without extensive storage. As an implicit scheme, it allows a large time step to reach the steady state. An interactive grid generation program (TURBO), which is being developed, is used to generate grids. This program uses the control point form of algebraic grid generation, which uses a sparse collection of control points from which the shape and position of coordinate curves can be adjusted. A distinct advantage of TURBO compared with other grid generation programs is that it allows the easy change of local mesh structure without affecting the grid outside the domain of independence. Sample grids are generated by TURBO for a compressor rotor blade and a turbine cascade. The turbine cascade flow is simulated by using the LU implicit scheme on the grid generated by TURBO.

  10. Asynchronous Replica Exchange Software for Grid and Heterogeneous Computing.

    PubMed

    Gallicchio, Emilio; Xia, Junchao; Flynn, William F; Zhang, Baofeng; Samlalsingh, Sade; Mentes, Ahmet; Levy, Ronald M

    2015-11-01

    Parallel replica exchange sampling is an extended ensemble technique often used to accelerate the exploration of the conformational ensemble of atomistic molecular simulations of chemical systems. Inter-process communication and coordination requirements have historically discouraged the deployment of replica exchange on distributed and heterogeneous resources. Here we describe the architecture of a software package (named ASyncRE) for performing asynchronous replica exchange molecular simulations on volunteered computing grids and heterogeneous high performance clusters. The asynchronous replica exchange algorithm on which the software is based avoids centralized synchronization steps and the need for direct communication between remote processes. It allows molecular dynamics threads to progress at different rates and enables parameter exchanges among arbitrary sets of replicas independently from other replicas. ASyncRE is written in Python following a modular design conducive to extensions to various replica exchange schemes and molecular dynamics engines. Applications of the software for the modeling of association equilibria of supramolecular and macromolecular complexes on BOINC campus computational grids and on the CPU/MIC heterogeneous hardware of the XSEDE Stampede supercomputer are illustrated. They show the ability of ASyncRE to utilize large grids of desktop computers running the Windows, MacOS, and/or Linux operating systems as well as collections of high performance heterogeneous hardware devices.
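
    The core idea, exchanging parameters between whichever replicas happen to be idle without a global synchronization barrier, can be sketched in a few lines. The acceptance rule below is the standard Metropolis criterion for temperature exchange; the variable names and values are illustrative and do not reproduce ASyncRE's actual API.

      # Attempt an asynchronous swap of inverse temperatures between two idle replicas.
      import math, random

      def attempt_exchange(rep_a, rep_b):
          delta = (rep_a["beta"] - rep_b["beta"]) * (rep_a["energy"] - rep_b["energy"])
          if delta >= 0 or random.random() < math.exp(delta):
              rep_a["beta"], rep_b["beta"] = rep_b["beta"], rep_a["beta"]
              return True
          return False

      k_B = 0.0019872041   # Boltzmann constant in kcal/(mol K)
      replicas = [{"beta": 1.0 / (k_B * T), "energy": random.uniform(-120.0, -80.0)}
                  for T in (300.0, 310.0, 320.0, 330.0)]
      a, b = random.sample(replicas, 2)   # any idle pair, independent of the other replicas
      print(attempt_exchange(a, b))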

  11. Spectral (Finite) Volume Method for Conservation Laws on Unstructured Grids II: Extension to Two Dimensional Scalar Equation

    NASA Technical Reports Server (NTRS)

    Wang, Z. J.; Liu, Yen; Kwak, Dochan (Technical Monitor)

    2002-01-01

    The framework for constructing a high-order, conservative Spectral (Finite) Volume (SV) method is presented for two-dimensional scalar hyperbolic conservation laws on unstructured triangular grids. Each triangular grid cell forms a spectral volume (SV), and the SV is further subdivided into polygonal control volumes (CVs) to support high-order data reconstructions. Cell-averaged solutions from these CVs are used to reconstruct a high-order polynomial approximation in the SV. Each CV is then updated independently with a Godunov-type finite volume method and a high-order Runge-Kutta time integration scheme. A universal reconstruction is obtained by partitioning all SVs in a geometrically similar manner. The convergence of the SV method is shown to depend on how an SV is partitioned. A criterion based on the Lebesgue constant has been developed and used successfully to determine the quality of various partitions. Symmetric, stable, and convergent linear, quadratic, and cubic SVs have been obtained, and many different types of partitions have been evaluated. The SV method is tested for both linear and non-linear model problems with and without discontinuities.

  12. Development of data acquisition and over-current protection systems for a suppressor-grid current with a neutral-beam ion source

    NASA Astrophysics Data System (ADS)

    Wei, LIU; Chundong, HU; Sheng, LIU; Shihua, SONG; Jinxin, WANG; Yan, WANG; Yuanzhe, ZHAO; Lizhen, LIANG

    2017-12-01

    Neutral beam injection is one of the effective auxiliary heating methods in magnetic-confinement-fusion experiments. In order to acquire the suppressor-grid current signal and avoid the grid being damaged by overheating, a data acquisition and over-current protection system based on the PXI (PCI eXtensions for Instrumentation) platform has been developed. The system consists of a current sensor, a data acquisition module and an over-current protection module. In the data acquisition module, the data acquired for one shot are transferred through an isolated link and saved on a data-storage server as a txt file. They can also be recalled using NBWave for future analysis. The over-current protection module has two modes: remote and local. This gives it the function of setting a threshold voltage remotely and locally, and the forbidden time of over-current protection can also be set by a host PC in remote mode. Experimental results demonstrate that the data acquisition and over-current protection system has the advantages of a settable forbidden time and isolated transmission.
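
    The protection logic itself is simple: compare the sensed voltage against a threshold, but ignore samples that fall inside the configurable forbidden window. A minimal sketch, with hypothetical parameter names and values rather than the PXI system's actual settings:

      # Threshold-based over-current detection with a settable forbidden time.
      def first_trip_index(samples, threshold_v, forbidden_time_s, sample_rate_hz):
          """Index of the first sample that should trigger protection, or None."""
          first_valid = int(forbidden_time_s * sample_rate_hz)   # samples ignored after start-up
          for i, v in enumerate(samples):
              if i >= first_valid and v > threshold_v:
                  return i
          return None

      signal = [0.1, 0.2, 2.5, 0.3, 0.4, 3.1, 3.2]   # sensor voltage tracking the grid current
      print(first_trip_index(signal, threshold_v=3.0, forbidden_time_s=0.4, sample_rate_hz=10))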

  13. A pressure-based semi-implicit space-time discontinuous Galerkin method on staggered unstructured meshes for the solution of the compressible Navier-Stokes equations at all Mach numbers

    NASA Astrophysics Data System (ADS)

    Tavelli, Maurizio; Dumbser, Michael

    2017-07-01

    We propose a new arbitrary high order accurate semi-implicit space-time discontinuous Galerkin (DG) method for the solution of the two- and three-dimensional compressible Euler and Navier-Stokes equations on staggered unstructured curved meshes. The method is pressure-based and semi-implicit and is able to deal with flows at all Mach numbers. The new DG scheme extends the seminal ideas outlined in [1], where a second order semi-implicit finite volume method for the solution of the compressible Navier-Stokes equations with a general equation of state was introduced on staggered Cartesian grids. Regarding the high order extension, we follow [2], where a staggered space-time DG scheme for the incompressible Navier-Stokes equations was presented. In our scheme, the discrete pressure is defined on the primal grid, while the discrete velocity field and the density are defined on a face-based staggered dual grid. Then, the mass conservation equation, as well as the nonlinear convective terms in the momentum equation and the transport of kinetic energy in the energy equation, are discretized explicitly, while the pressure terms appearing in the momentum and energy equation are discretized implicitly. Formal substitution of the discrete momentum equation into the total energy conservation equation yields a linear system for only one unknown, namely the scalar pressure. Here the equation of state is assumed linear with respect to the pressure. The enthalpy and the kinetic energy are taken explicitly and are then updated using a simple Picard procedure. Thanks to the use of a staggered grid, the final pressure system is a very sparse block five-point system for three-dimensional problems and a block four-point system in the two-dimensional case. Furthermore, for high order in space and piecewise constant polynomials in time, the system is observed to be symmetric and positive definite. This allows the use of fast linear solvers such as the conjugate gradient (CG) method. In addition, all the volume and surface integrals needed by the scheme depend only on the geometry and the polynomial degree of the basis and test functions and can therefore be precomputed and stored in a preprocessing stage. This leads to significant savings in terms of computational effort for the time evolution part. In this way, the extension to a fully curved isoparametric approach also becomes natural and affects only the preprocessing step. The viscous terms and the heat flux are also discretized making use of the staggered grid by defining the viscous stress tensor and the heat flux vector on the dual grid, which corresponds to the use of a lifting operator, but on the dual grid. The time step of our new numerical method is limited by a CFL condition based only on the fluid velocity and not on the sound speed. This makes the method particularly interesting for low Mach number flows. Finally, a very simple combination of artificial viscosity and the a posteriori MOOD technique allows us to deal with shock waves and thus also permits the simulation of high Mach number flows. We show computational results for a large set of two- and three-dimensional benchmark problems, including both low and high Mach number flows and using polynomial approximation degrees up to p = 4.

  14. Smart Grid Privacy through Distributed Trust

    NASA Astrophysics Data System (ADS)

    Lipton, Benjamin

    Though the smart electrical grid promises many advantages in efficiency and reliability, the risks to consumer privacy have impeded its deployment. Researchers have proposed protecting privacy by aggregating user data before it reaches the utility, using techniques of homomorphic encryption to prevent exposure of unaggregated values. However, such schemes generally require users to trust in the correct operation of a single aggregation server. We propose two alternative systems based on secret sharing techniques that distribute this trust among multiple service providers, protecting user privacy against a misbehaving server. We also provide an extensive evaluation of the systems considered, comparing their robustness to privacy compromise, error handling, computational performance, and data transmission costs. We conclude that while all the systems should be computationally feasible on smart meters, the two methods based on secret sharing require much less computation while also providing better protection against corrupted aggregators. Building systems using these techniques could help defend the privacy of electricity customers, as well as customers of other utilities as they move to a more data-driven architecture.
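
    One of the secret sharing techniques alluded to above can be illustrated with plain additive sharing: each meter splits its reading into random shares sent to different aggregation servers, so no single server ever sees an individual value. The sketch below is a generic illustration under that assumption, not the specific protocols evaluated in the thesis.

      # Additive secret sharing of meter readings across non-colluding aggregators.
      import random

      PRIME = 2**61 - 1                      # public modulus for the share arithmetic

      def share(reading, n_servers=3):
          """Split one reading into n shares that sum to it modulo PRIME."""
          parts = [random.randrange(PRIME) for _ in range(n_servers - 1)]
          parts.append((reading - sum(parts)) % PRIME)
          return parts

      def aggregate(all_shares):
          """Each server totals its own shares; summing the per-server totals
          reveals only the aggregate consumption, never an individual reading."""
          per_server = [sum(column) % PRIME for column in zip(*all_shares)]
          return sum(per_server) % PRIME

      readings = [321, 455, 198, 602]                    # Wh per interval, hypothetical meters
      print(aggregate([share(r) for r in readings]))     # -> 1576, the sum of the readings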

  15. Erratum: ``A Grid of Non-LTE Line-blanketed Model Atmospheres of O-Type Stars'' (ApJS, 146, 417 [2003])

    NASA Astrophysics Data System (ADS)

    Lanz, Thierry; Hubeny, Ivan

    2003-07-01

    We have constructed a comprehensive grid of 680 metal line-blanketed, non-LTE, plane-parallel, hydrostatic model atmospheres for the basic parameters appropriate to O-type stars. The OSTAR2002 grid considers 12 values of effective temperatures, 27,500 K ≤ Teff ≤ 55,000 K with 2500 K steps, eight surface gravities, 3.0 ≤ log g ≤ 4.75 with 0.25 dex steps, and 10 chemical compositions, from metal-rich relative to the Sun to metal-free. The lower limit of log g for a given effective temperature is set by an approximate location of the Eddington limit. The selected chemical compositions have been chosen to cover a number of typical environments of massive stars: the Galactic center, the Magellanic Clouds, blue compact dwarf galaxies like I Zw 18, and galaxies at high redshifts. The paper contains a description of the OSTAR2002 grid and some illustrative examples and comparisons. The complete OSTAR2002 grid is available at our Web site at ApJS, 146, 417 [2003]. Laboratory for Astronomy and Solar Physics, NASA Goddard Space Flight Center, Code 681, Greenbelt, MD 20771.

  16. An adaptive grid scheme using the boundary element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munipalli, R.; Anderson, D.A.

    1996-09-01

    A technique to solve the Poisson grid generation equations by Green's function related methods has been proposed, with the source terms being purely position dependent. The use of distributed singularities in the flow domain coupled with the boundary element method (BEM) formulation is presented in this paper as a natural extension of the Green's function method. This scheme greatly simplifies the adaption process. The BEM reduces the dimensionality of the given problem by one. Internal grid-point placement can be achieved for a given boundary distribution by adding continuous and discrete source terms in the BEM formulation. A distribution of vortex doublets is suggested as a means of controlling grid-point placement and grid-line orientation. Examples for sample adaption problems are presented and discussed. 15 refs., 20 figs.

  17. UDE-based control of variable-speed wind turbine systems

    NASA Astrophysics Data System (ADS)

    Ren, Beibei; Wang, Yeqin; Zhong, Qing-Chang

    2017-01-01

    In this paper, the control of a PMSG (permanent magnet synchronous generator)-based variable-speed wind turbine system with a back-to-back converter is considered. The uncertainty and disturbance estimator (UDE)-based control approach is applied to the regulation of the DC-link voltage and the control of the RSC (rotor-side converter) and the GSC (grid-side converter). For the rotor-side controller, the UDE-based vector control is developed for the RSC with PMSG control to facilitate the application of the MPPT (maximum power point tracking) algorithm for the maximum wind energy capture. For the grid-side controller, the UDE-based vector control is developed to control the GSC with the power reference generated by a UDE-based DC-link voltage controller. Compared with the conventional vector control, the UDE-based vector control can achieve reliable current decoupling control with fast response. Moreover, the UDE-based DC-link voltage regulation can achieve stable DC-link voltage under model uncertainties and external disturbances, e.g. wind speed variations. The effectiveness of the proposed UDE-based control approach is demonstrated through extensive simulation studies in the presence of coupled dynamics, model uncertainties and external disturbances under varying wind speeds. The UDE-based control is able to generate more energy, e.g. by 5% for the wind profile tested.

  18. Exploring the Impact of Individualism and Uncertainty Avoidance in Web-Based Electronic Learning: An Empirical Analysis in European Higher Education

    ERIC Educational Resources Information Center

    Sanchez-Franco, Manuel J.; Martinez-Lopez, Francisco J.; Martin-Velicia, Felix A.

    2009-01-01

    Our research specifically focuses on the effects of the national cultural background of educators on the acceptance and usage of ICT, particularly the Web as an extensive and expanding information base that provides the ultimate in resource-rich learning. Most research has used North Americans as subjects. For this reason, we interviewed…

  19. A new interpolation method for gridded extensive variables with application in Lagrangian transport and dispersion models

    NASA Astrophysics Data System (ADS)

    Hittmeir, Sabine; Philipp, Anne; Seibert, Petra

    2017-04-01

    In discretised form, an extensive variable usually represents an integral over a 3-dimensional (x,y,z) grid cell. In the case of vertical fluxes, gridded values represent integrals over a horizontal (x,y) grid face. In meteorological models, fluxes (precipitation, turbulent fluxes, etc.) are usually written out as temporally integrated values, thus effectively forming 3D (x,y,t) integrals. Lagrangian transport models require interpolation of all relevant variables towards the location in 4D space of each of the computational particles. Trivial interpolation algorithms usually implicitly assume the integral value to be a point value valid at the grid centre. If the integral value were reconstructed from the interpolated point values, it would in general not be correct. If nonlinear interpolation methods are used, non-negativity cannot easily be ensured. This problem became obvious with respect to the interpolation of precipitation for the calculation of wet deposition in FLEXPART (http://flexpart.eu), which uses ECMWF model output or other gridded input data. The presently implemented method consists of special preprocessing in the input preparation software and subsequent linear interpolation in the model. The interpolated values are positive but the criterion of cell-wise conservation of the integral property is violated; it is also not very accurate as it smoothes the field. A new interpolation algorithm was developed which introduces additional supporting grid points in each time interval, with linear interpolation applied between them later in FLEXPART. It preserves the integral precipitation in each time interval, guarantees the continuity of the time series, and maintains non-negativity. The function values of the remapping algorithm at these subgrid points constitute the degrees of freedom which can be prescribed in various ways. Combining the advantages of different approaches leads to a final algorithm respecting all the required conditions. To improve the monotonicity behaviour we additionally derived a filter to restrict over- or undershooting. At the current stage, the algorithm is meant primarily for the temporal dimension. It can also be applied with operator-splitting to include the two horizontal dimensions. An extension to 2D appears feasible, while a fully 3D version would most likely not justify the effort compared to the operator-splitting approach.
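
    The essential constraint, that piecewise-linear interpolation between the new support points must reproduce each interval's integral while staying non-negative, can be sketched as follows. This is an illustrative construction under simplified assumptions (one extra midpoint per interval, boundary values taken as the smaller of the adjacent interval means), not the algorithm actually derived in the study.

      # One midpoint per interval chosen so the trapezoidal integral matches the given total.
      import numpy as np

      def conservative_nodes(totals, dt):
          means = np.asarray(totals, dtype=float) / dt
          f = np.zeros(len(means) + 1)                      # values at interval boundaries
          f[1:-1] = np.minimum(means[:-1], means[1:])       # non-negative by construction
          m = (4.0 * means - f[:-1] - f[1:]) / 2.0          # dt*(f0 + 2*m + f1)/4 == total
          return f, m

      totals = [0.0, 2.4, 0.6, 0.0]                         # mm precipitation per interval (hypothetical)
      f, m = conservative_nodes(totals, dt=1.0)
      print([(f[i] + 2.0 * m[i] + f[i + 1]) / 4.0 for i in range(len(totals))])  # recovers totals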

  20. Gradient-Based Aerodynamic Shape Optimization Using ADI Method for Large-Scale Problems

    NASA Technical Reports Server (NTRS)

    Pandya, Mohagna J.; Baysal, Oktay

    1997-01-01

    A gradient-based shape optimization methodology, intended for practical three-dimensional aerodynamic applications, has been developed. It is based on quasi-analytical sensitivities. The flow analysis is rendered by a fully implicit, finite volume formulation of the Euler equations. The aerodynamic sensitivity equation is solved using the alternating-direction-implicit (ADI) algorithm for memory efficiency. A flexible wing geometry model, based on surface parameterization and planform schedules, is utilized. The present methodology and its components have been tested via several comparisons. Initially, the flow analysis for a wing is compared with those obtained using an unfactored, preconditioned conjugate gradient approach (PCG) and an extensively validated CFD code. Then, the sensitivities computed with the present method have been compared with those obtained using the finite-difference and the PCG approaches. Effects of grid refinement and convergence tolerance on the analysis and shape optimization have been explored. Finally, the new procedure has been demonstrated in the design of a cranked arrow wing at Mach 2.4. Despite the expected increase in computational time, the results indicate that shape optimization problems, which require large numbers of grid points, can be resolved with a gradient-based approach.

  1. Integrated renewable energy networks

    NASA Astrophysics Data System (ADS)

    Mansouri Kouhestani, F.; Byrne, J. M.; Hazendonk, P.; Brown, M. B.; Spencer, L.

    2015-12-01

    This multidisciplinary research is focused on studying implementation of diverse renewable energy networks. Our modern economy now depends heavily on large-scale, energy-intensive technologies. A transition to low carbon, renewable sources of energy is needed. We will develop a procedure for designing and analyzing renewable energy systems based on the magnitude, distribution, temporal characteristics, reliability and costs of the various renewable resources (including biomass waste streams) in combination with various measures to control the magnitude and timing of energy demand. The southern Canadian prairies are an ideal location for developing renewable energy networks. The region is blessed with steady, westerly winds and bright sunshine for more hours annually than Houston Texas. Extensive irrigation agriculture provides huge waste streams that can be processed biologically and chemically to create a range of biofuels. The first stage involves mapping existing energy and waste flows on a neighbourhood, municipal, and regional level. Optimal sites and combinations of sites for solar and wind electrical generation, such as ridges, rooftops and valley walls, will be identified. Geomatics based site and grid analyses will identify best locations for energy production based on efficient production and connectivity to regional grids.

  2. Transient Side Load Analysis of Out-of-Round Film-Cooled Nozzle Extensions

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2012-01-01

    There was interest in understanding the impact of an out-of-round nozzle extension on the nozzle side load during transient startup operations. The out-of-round nozzle extension could be the result of asymmetric internal stresses, deformation induced by previous tests, and asymmetric loads induced by hardware attached to the nozzle. The objective of this study was therefore to computationally investigate the effect of an out-of-round nozzle extension on the nozzle side loads during an engine startup transient. The rocket engine studied encompasses a regeneratively cooled chamber and nozzle, along with a film-cooled nozzle extension. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and transient inlet boundary flow properties derived from an engine system simulation. Six three-dimensional cases were performed with the out-of-roundness achieved by three different degrees of ovalization, elongated on the lateral y and z axes: one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The results show that the separation line jump was the primary source of the peak side loads. Compared to the peak side load of the perfectly round nozzle, the peak side loads increased for the slightly and more ovalized nozzle extensions, and either increased or decreased for the two significantly ovalized nozzle extensions. A theory based on the counteraction of the flow destabilizing effect of an exacerbated asymmetrical flow caused by a lower degree of ovalization, and the flow stabilizing effect of a more symmetrical flow, created also by ovalization, is presented to explain the observations obtained in this effort.

  3. A socio-technical investigation of the smart grid: Implications for demand-side activities of electricity service providers

    NASA Astrophysics Data System (ADS)

    Corbett, Jacqueline Marie

    Enabled by advanced communication and information technologies, the smart grid represents a major transformation for the electricity sector. Vast quantities of data and two-way communications abilities create the potential for a flexible, data-driven, multi-directional supply and consumption network well equipped to meet the challenges of the next century. For electricity service providers ("utilities"), the smart grid provides opportunities for improved business practices and new business models; however, a transformation of such magnitude is not without risks. Three related studies are conducted to explore the implications of the smart grid for utilities' demand-side activities. An initial conceptual framework, based on organizational information processing theory (OIPT), suggests that utilities' performance depends on the fit between the information processing requirements and capacities associated with a given demand-side activity. Using secondary data and multiple regression analyses, the first study finds, consistent with OIPT, a positive relationship between utilities' advanced meter deployments and demand-side management performance. However, it also finds that meters with only data collection capacities are associated with lower performance, suggesting the presence of information waste causing operational inefficiencies. In the second study, interviews with industry participants provide partial support for the initial conceptual model; new insights are gained with respect to information processing fit and information waste, and "big data" is identified as a central theme of the smart grid. To derive richer theoretical insights, the third study employs a grounded theory approach examining the experience of one successful utility in detail. Based on interviews and documentary data, the paradox of dynamic stability emerges as an essential enabler of utilities' performance in the smart grid environment. Within this context, the frames of opportunity, control, and data limitation interact to support dynamic stability and contribute to innovation within tradition. The main contributions of this thesis include theoretical extensions to OIPT and the development of an emergent model of dynamic stability in relation to big data. The thesis also adds to the green IS literature and identifies important practical implications for utilities as they endeavour to bring the smart grid to reality.

  4. Job Superscheduler Architecture and Performance in Computational Grid Environments

    NASA Technical Reports Server (NTRS)

    Shan, Hongzhang; Oliker, Leonid; Biswas, Rupak

    2003-01-01

    Computational grids hold great promise in utilizing geographically separated heterogeneous resources to solve large-scale complex scientific problems. However, a number of major technical hurdles, including distributed resource management and effective job scheduling, stand in the way of realizing these gains. In this paper, we propose a novel grid superscheduler architecture and three distributed job migration algorithms. We also model the critical interaction between the superscheduler and autonomous local schedulers. Extensive performance comparisons with ideal, central, and local schemes using real workloads from leading computational centers are conducted in a simulation environment. Additionally, synthetic workloads are used to perform a detailed sensitivity analysis of our superscheduler. Several key metrics demonstrate that substantial performance gains can be achieved via smart superscheduling in distributed computational grids.

  5. A Petri Net model for distributed energy system

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Electrical networks need to evolve to become more intelligent, more flexible and less costly. The smart grid, the next generation of the power system, uses two-way flows of electricity and information to create a distributed, automated energy delivery network. Building a comprehensive smart grid is a challenge for system protection, optimization and energy efficiency. Proper modeling and analysis is needed to build an extensive distributed energy system and intelligent electricity infrastructure. In this paper, a complete model of the smart grid is proposed using Generalized Stochastic Petri Nets (GSPN). Simulation of the created model is also explored. The simulation allows analysis of how closely the behavior of the model matches that of a real smart grid.
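
    The abstract does not give the net itself, so the following is only a minimal sketch, assuming a toy GSPN with hypothetical places (generation, storage, load_served) and exponentially timed transitions, simulated Gillespie-style in Python:

        # Minimal sketch (not the author's model): a tiny stochastic Petri net
        # simulated Gillespie-style, with tokens flowing between hypothetical
        # places representing generation, storage, and served load.
        import random

        places = {"generation": 3, "storage": 0, "load_served": 0}

        # Each transition: (inputs, outputs, firing rate). Names are hypothetical.
        transitions = {
            "charge":    ({"generation": 1}, {"storage": 1}, 2.0),
            "discharge": ({"storage": 1}, {"load_served": 1}, 1.0),
            "direct":    ({"generation": 1}, {"load_served": 1}, 3.0),
        }

        def enabled(inputs):
            return all(places[p] >= n for p, n in inputs.items())

        t = 0.0
        while t < 10.0:
            live = {k: v for k, v in transitions.items() if enabled(v[0])}
            if not live:
                break
            total = sum(v[2] for v in live.values())
            t += random.expovariate(total)          # exponential waiting time
            r, acc = random.uniform(0, total), 0.0
            for name, (ins, outs, rate) in live.items():
                acc += rate
                if r <= acc:                        # fire this transition
                    for p, n in ins.items():
                        places[p] -= n
                    for p, n in outs.items():
                        places[p] += n
                    break
        print(places)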

  6. Separating Added Value from Hype: Some Experiences and Prognostications

    NASA Astrophysics Data System (ADS)

    Reed, Dan

    2004-03-01

    These are exciting times for the interplay of science and computing technology. As new data archives, instruments and computing facilities are connected nationally and internationally, a new model of distributed scientific collaboration is emerging. However, any new technology brings both opportunities and challenges -- Grids are no exception. In this talk, we will discuss some of the experiences deploying Grid software in production environments, illustrated with experiences from the NSF PACI Alliance, the NSF Extensible Terascale Facility (ETF) and other Grid projects. From these experiences, we derive some guidelines for deployment and some suggestions for community engagement, software development and infrastructure.

  7. Comparison of High-Frequency Solar Irradiance: Ground Measured vs. Satellite-Derived

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lave, Matthew; Weekley, Andrew

    2016-11-21

    High-frequency solar variability is important to grid integration studies, but ground measurements are scarce. The high resolution irradiance algorithm (HRIA) has the ability to produce 4-second resolution global horizontal irradiance (GHI) samples at locations across North America. However, the HRIA has not been extensively validated. In this work, we evaluate the HRIA against a database of 10 high-frequency ground-based measurements of irradiance. The evaluation focuses on variability-based metrics. This results in a greater understanding of the errors in the HRIA as well as suggestions for improvements to the HRIA.
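
    The record does not specify which variability metrics were used; purely as an illustration, the sketch below computes one plausible metric (the standard deviation of ramp rates) on synthetic 4-second GHI series standing in for ground-measured and satellite-derived data:

        # Illustrative sketch (an assumption, not the HRIA evaluation code):
        # compare a simple ramp-rate variability metric between two GHI series.
        import numpy as np

        def ramp_rates(ghi, dt_seconds=4.0):
            """Ramp rate in W/m^2 per second between consecutive samples."""
            return np.diff(ghi) / dt_seconds

        def variability_score(ghi, dt_seconds=4.0):
            """A simple variability metric: standard deviation of ramp rates."""
            return np.std(ramp_rates(ghi, dt_seconds))

        # Synthetic stand-ins for measured vs. derived irradiance (W/m^2).
        rng = np.random.default_rng(0)
        ground = 800 + 50 * rng.standard_normal(900)
        derived = 800 + 30 * rng.standard_normal(900)
        print(variability_score(ground), variability_score(derived))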

  8. Supporting Collocation Learning with a Digital Library

    ERIC Educational Resources Information Center

    Wu, Shaoqun; Franken, Margaret; Witten, Ian H.

    2010-01-01

    Extensive knowledge of collocations is a key factor that distinguishes learners from fluent native speakers. Such knowledge is difficult to acquire simply because there is so much of it. This paper describes a system that exploits the facilities offered by digital libraries to provide a rich collocation-learning environment. The design is based on…

  9. Testing the influence of environmental heterogeneity on fish species richness in two biogeographic provinces.

    PubMed

    Massicotte, Philippe; Proulx, Raphaël; Cabana, Gilbert; Rodríguez, Marco A

    2015-01-01

    Environmental homogenization in coastal ecosystems impacted by human activities may be an important factor explaining the observed decline in fish species richness. We used fish community data (>200 species) from extensive surveys conducted in two biogeographic provinces (extent >1,000 km) in North America to quantify the relationship between fish species richness and local (grain <10 km²) environmental heterogeneity. Our analyses are based on samples collected at nearly 800 stations over a period of five years. We demonstrate that fish species richness in coastal ecosystems is associated locally with the spatial heterogeneity of environmental variables but not with their magnitude. The observed effect of heterogeneity on species richness was substantially greater than that generated by simulations from a random placement model of community assembly, indicating that the observed relationship is unlikely to arise from veil or sampling effects. Our results suggest that restoring or actively protecting areas of high habitat heterogeneity may be of great importance for slowing current trends of decreasing biodiversity in coastal ecosystems.

  10. Electron-Poor Polar Intermetallics: Complex Structures, Novel Clusters, and Intriguing Bonding with Pronounced Electron Delocalization.

    PubMed

    Lin, Qisheng; Miller, Gordon J

    2018-01-16

    Intermetallic compounds represent an extensive pool of candidates for energy related applications stemming from magnetic, electric, optic, caloric, and catalytic properties. The discovery of novel intermetallic compounds can enhance understanding of the chemical principles that govern structural stability and chemical bonding as well as lead to new applications. Valence electron-poor polar intermetallics with valence electron concentrations (VECs) between 2.0 and 3.0 e-/atom show a plethora of unprecedented and fascinating structural motifs and bonding features. Therefore, establishing simple structure-bonding-property relationships is especially challenging for this compound class because commonly accepted valence electron counting rules are inappropriate. During our efforts to find quasicrystals and crystalline approximants by valence electron tuning near 2.0 e-/atom, we observed that compositions close to those of quasicrystals are exceptional sources for unprecedented valence electron-poor polar intermetallics, e.g., Ca4Au10In3 containing (Au10In3) wavy layers, Li14.7Mg36.8Cu21.5Ga66 adopting a type IV clathrate framework, and Sc4MgxCu15-xGa7.5, which is incommensurately modulated. In particular, exploratory syntheses of AAu3T (A = Ca, Sr, Ba and T = Ge, Sn) phases led to interesting bonding features for Au, such as columns, layers, and lonsdaleite-type tetrahedral frameworks. Overall, the breadth of Au-rich polar intermetallics originates, in part, from significant relativistic effects on the valence electrons of Au, effects which result in greater 6s/5d orbital mixing, a small effective metallic radius, and an enhanced Mulliken electronegativity, all leading to enhanced binding with nearly all metals, including itself. Two other successful strategies to mine electron-poor polar intermetallics include lithiation and "cation-rich" phases. Along these lines, we have studied lithiated Zn-rich compounds in which structural complexity can be realized by small amounts of Li replacing Zn atoms in the parent binary compounds CaZn2, CaZn3, and CaZn5; their phase formation and bonding schemes can be rationalized by Fermi surface-Brillouin zone interactions between nearly free-electron states. "Cation-rich", electron-poor polar intermetallics have emerged using rare earth metals as the electropositive ("cationic") component together with metal/metalloid clusters that mimic the backbones of aromatic hydrocarbon molecules, which give evidence of extensive electronic delocalization and multicenter bonding. Thus, we can identify three distinct, valence electron-poor, polar intermetallic systems that have yielded unprecedented phases adopting novel structures containing complex clusters and intriguing bonding characteristics. In this Account, we summarize our recent progress in the development of novel Au-rich BaAl4-type related structures, shown in the "gold-rich grid", lithiation-modulated Ca-Li-Zn phases stabilized by different bonding characteristics, and rare earth-rich polar intermetallics containing unprecedented hydrocarbon-like planar Co-Ge metal clusters and pronounced delocalized multicenter bonding. We will focus mainly on novel structural motifs, bonding analyses, and the role of valence electrons for phase stability.

  11. Spatial partitioning of environmental correlates of avian biodiversity in the conterminous United States

    USGS Publications Warehouse

    O'Connor, R.J.; Jones, M.T.; White, D.; Hunsaker, C.; Loveland, Tom; Jones, Bruce; Preston, E.

    1996-01-01

    Classification and regression tree (CART) analysis was used to create hierarchically organized models of the distribution of bird species richness across the conterminous United States. Species richness data were taken from the Breeding Bird Survey and were related to climatic and land use data. We used a systematic spatial grid of approximately 12,500 hexagons, each approximately 640 square kilometres in area. Within each hexagon land use was characterized by the Loveland et al. land cover classification based on Advanced Very High Resolution Radiometer (AVHRR) data from NOAA polar orbiting meteorological satellites. These data were aggregated to yield fourteen land classes equivalent to an Anderson level II coverage; urban areas were added from the Digital Chart of the World. Each hexagon was characterized by climate data and landscape pattern metrics calculated from the land cover. A CART model then related the variation in species richness across the 1162 hexagons for which bird species richness data were available to the independent variables, yielding an R²-type goodness of fit metric of 47.5% deviance explained. The resulting model recognized eleven groups of hexagons, with species richness within each group determined by unique sequences of hierarchically constrained independent variables. Within the hierarchy, climate data accounted for more variability in the bird data, followed by land cover proportion, and then pattern metrics. The model was then used to predict species richness in all 12,500 hexagons of the conterminous United States yielding a map of the distribution of these eleven classes of bird species richness as determined by the environmental correlates. The potential for using this technique to interface biogeographic theory with the hierarchy theory of ecology is discussed. © 1996 Blackwell Science Ltd.
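
    As a toy illustration of this kind of CART fit (synthetic data and made-up covariates, not the authors' Breeding Bird Survey analysis), per-hexagon richness can be regressed on a few climate and land-cover variables with a regression tree limited to eleven terminal groups:

        # Illustrative sketch only: a regression tree relating species richness
        # per hexagon to hypothetical climate and land-cover covariates.
        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(42)
        n_hex = 1162
        X = np.column_stack([
            rng.normal(10, 5, n_hex),      # hypothetical mean temperature
            rng.normal(800, 200, n_hex),   # hypothetical annual precipitation
            rng.uniform(0, 1, n_hex),      # hypothetical forest cover fraction
        ])
        richness = (40 + 2 * X[:, 0] + 0.01 * X[:, 1] + 20 * X[:, 2]
                    + rng.normal(0, 5, n_hex))

        tree = DecisionTreeRegressor(max_leaf_nodes=11)  # eleven terminal groups
        tree.fit(X, richness)
        print("deviance explained (R^2-type):", tree.score(X, richness))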

  12. An Analysis for an Internet Grid to Support Space Based Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Currently, and in the past, dedicated communication circuits and "network services" with very stringent performance requirements have been used to support manned and unmanned mission-critical ground operations at GSFC, JSC, MSFC, KSC and other NASA facilities. Because of the evolution of network technology, it is time to investigate other approaches to providing mission services for space ground and flight operations. In various scientific disciplines, effort is under way to develop network/computing grids. These grids, consisting of networks and computing equipment, are enabling lower-cost science. Specifically, earthquake research is headed in this direction. With a standard for network and computing interfaces using a grid, a researcher would not be required to develop and engineer NASA/DoD-specific interfaces with the attendant increased cost. Use of the Internet Protocol (IP), the CCSDS packet specification, Reed-Solomon coding for satellite error correction, and similar standards could be adopted to provide these interfaces. Generally, most interfaces are developed at least to some degree end to end. This study would investigate the feasibility of using the existing standards and protocols necessary to implement a SpaceOps Grid. New interface definitions, or adoption and modification of existing ones, would be required for the various space operational services: voice (both space-based and ground), video, telemetry, commanding, and planning may play a role to some undefined level. Security will be a separate focus in the study since security is such a large issue in using public networks. This SpaceOps Grid would be transparent to users. It would be analogous to the Ethernet protocol's ease of use, in that a researcher would plug in their experiment or instrument at one end and would be connected to the appropriate host or server without further intervention. Free flyers would be in this category as well; they would be launched and would transmit without any further intervention by the researcher or ground operations personnel. The payback from developing these new approaches in support of manned and unmanned operations is lower cost, enabling direct participation by more people in organizations and educational institutions in space-based science. By lowering the high cost of space-based operations and networking, more resources will be available to the science community for science. With a specific grid in place, experiment development and operations would be much less costly through the use of standardized network interfaces. Because of the extensive connectivity on a global basis, significant numbers of people would participate in science who otherwise would not be able to.

  13. Revised and annotated checklist of aquatic and semi-aquatic Heteroptera of Hungary with comments on biodiversity patterns

    PubMed Central

    Boda, Pál; Bozóki, Tamás; Vásárhelyi, Tamás; Bakonyi, Gábor; Várbíró, Gábor

    2015-01-01

    A basic knowledge of regional faunas is necessary to follow the changes in macroinvertebrate communities caused by environmental influences and climatic trends in the future. We collected all the available data on water bugs in Hungary using an inventory method, a UTM grid-based database was built, and Jackknife richness estimates and species accumulation curves were calculated. Fauna compositions were compared among Central European states. As a result, an updated and annotated checklist for Hungary is provided, containing 58 species in 21 genera and 12 families. A total of 66.8% of the UTM 10 × 10 km squares in Hungary possess faunistic data for water bugs. The number of species per grid cell ranged from 0 to 42, and the diversity patterns showed heterogeneity. The estimated species number of 58 is equal to the actual number of species known from the country. The asymptotic shape of the species accumulation curve predicts that additional sampling efforts will not increase the number of species currently known from Hungary. These results suggest that the number of species in the country was estimated correctly and that the species accumulation curve levels off at an asymptotic value. Thus a considerable increase in species richness is not expected in the future, although species turnover may still change the species composition. Overall, 36.7% of the European water bug species were found in Hungary. The differences in faunal composition between Hungary and its surrounding countries were caused by the rare or unique species, whereas 33 species are common to the faunas of the eight countries. Species richness shows a correlation with latitude, and similar species compositions were observed in countries along the same latitude. The species list and the UTM-based database are now up to date for Hungary, and they will provide a basis for future studies of distributional and biodiversity patterns, biogeography, relative abundance and frequency of occurrences important in community ecology, or the determination of conservation status. PMID:25987880
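
    For illustration, a first-order jackknife richness estimate of the kind mentioned above can be computed from incidence data per grid cell; the sketch below uses a tiny hypothetical set of cells and species, not the Hungarian dataset:

        # Illustrative sketch (hypothetical data): first-order jackknife richness
        # estimate from per-cell incidence records.
        def jackknife1(incidence):
            """incidence: list of sets, one set of species per sampled grid cell."""
            m = len(incidence)
            observed = set().union(*incidence)
            uniques = sum(
                1 for sp in observed
                if sum(sp in cell for cell in incidence) == 1
            )
            return len(observed) + uniques * (m - 1) / m

        cells = [{"Gerris lacustris", "Nepa cinerea"},
                 {"Gerris lacustris", "Notonecta glauca"},
                 {"Gerris lacustris"}]
        print(jackknife1(cells))   # 3 observed species, two uniques -> ~4.3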

  14. Modelling and Simulation of Grid Connected SPV System with Active Power Filtering Features

    NASA Astrophysics Data System (ADS)

    Saroha, Jaipal; Pandove, Gitanjali; Singh, Mukhtiar

    2017-09-01

    In this paper, detailed simulation studies for a grid-connected solar photovoltaic (SPV) system are presented. Power electronics devices such as the DC-DC boost converter and the grid-interfacing inverter are the most important components of the proposed system. Here, the DC-DC boost converter is controlled to extract maximum power out of the SPV under different irradiation levels, while the grid-interfacing inverter is utilized to evacuate the active power and feed it into the grid at synchronized voltage and frequency. Moreover, the grid-interfacing inverter is also controlled to sort out the issues related to power quality by compensating the reactive power and harmonic current components of nearby loads at the point of common coupling. Besides, detailed modeling of the various components utilized in the proposed system is also presented. Finally, extensive simulations have been performed under different irradiation levels with various kinds of load to validate the aforementioned claims. The overall system design and simulation have been performed using the Sim Power System toolbox available in the library of MATLAB.
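
    The paper does not state which maximum power point tracking (MPPT) method is used for the boost converter; as a generic illustration only, a perturb-and-observe duty-cycle controller might look like the following (the toy PV model is made up):

        # Generic perturb-and-observe MPPT sketch, not necessarily the authors'
        # controller: nudge the duty cycle and keep the direction that raises power.
        def perturb_and_observe(measure_pv, duty=0.5, step=0.01, iterations=50):
            """measure_pv(duty) -> (voltage, current); returns final duty cycle."""
            v, i = measure_pv(duty)
            power = v * i
            direction = 1
            for _ in range(iterations):
                duty = min(max(duty + direction * step, 0.0), 1.0)
                v, i = measure_pv(duty)
                new_power = v * i
                if new_power < power:      # wrong way: reverse the perturbation
                    direction = -direction
                power = new_power
            return duty

        # Toy PV model with a maximum power point near duty ~ 0.25 (illustrative).
        toy_pv = lambda d: (100 * (1 - d), 8 * d / (0.1 + d))
        print(perturb_and_observe(toy_pv))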

  15. A De-centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper, we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is decentralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  16. A De-Centralized Scheduling and Load Balancing Algorithm for Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Arora, Manish; Das, Sajal K.; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In the past two decades, numerous scheduling and load balancing techniques have been proposed for locally distributed multiprocessor systems. However, they all suffer from significant deficiencies when extended to a Grid environment: some use a centralized approach that renders the algorithm unscalable, while others assume the overhead involved in searching for appropriate resources to be negligible. Furthermore, classical scheduling algorithms do not consider a Grid node to be N-resource rich and merely work towards maximizing the utilization of one of the resources. In this paper, we propose a new scheduling and load balancing algorithm for a generalized Grid model of N-resource nodes that not only takes into account the node and network heterogeneity, but also considers the overhead involved in coordinating among the nodes. Our algorithm is de-centralized, scalable, and overlaps the node coordination time with that of the actual processing of ready jobs, thus saving valuable clock cycles needed for making decisions. The proposed algorithm is studied by conducting simulations using the Message Passing Interface (MPI) paradigm.

  17. CFD Simulation On The Pressure Distribution For An Isolated Single-Story House With Extension: Grid Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Yahya, W. N. W.; Zaini, S. S.; Ismail, M. A.; Majid, T. A.; Deraman, S. N. C.; Abdullah, J.

    2018-04-01

    Damage due to wind-related disasters is increasing as a result of global climate change. Many studies have been conducted to study the wind effect surrounding low-rise buildings using wind tunnel tests or numerical simulations. The use of numerical simulation is relatively cheap but requires very good command of the software, the correct input parameters and an optimum grid or mesh. However, before a study can be conducted, a grid sensitivity test must be carried out to determine a suitable cell count for the final model, ensuring an accurate result with less computing time. This study demonstrates the numerical procedures for conducting a grid sensitivity analysis using five models with different grid schemes. The pressure coefficients (CP) were observed along the wall and roof profile and compared between the models. The results showed that the medium grid scheme can be used and is able to produce results as accurate as the finer grid scheme, as the difference in the CP values was found to be insignificant.

  18. Seismic data are rich in information about subsurface formations and fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farfour, Mohammed; Yoon, Wang Jung; Kim, Dongshin

    2016-06-08

    Seismic attributes are defined as any measured or computed information derived from seismic data. Throughout the last decades, extensive work has been done in developing a variety of mathematical approaches to extract maximum information from seismic data. Nevertheless, geoscientists have found that seismic data remain rich in information yet to be extracted. In this paper a new seismic attribute is introduced. The instantaneous energy seismic attribute is an amplitude-based attribute that has the potential to emphasize anomalous amplitudes associated with hydrocarbons. Promising results have been obtained from applying the attribute to a seismic section traversing hydrocarbon-filled sand from Alberta, Canada.
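
    One common way to define an instantaneous energy attribute is as the squared envelope of the analytic (Hilbert-transformed) trace; the sketch below assumes that definition, which may differ in detail from the authors' formulation, and runs on a synthetic trace:

        # Sketch of an amplitude-based attribute: instantaneous energy as the
        # squared Hilbert envelope of a seismic trace (one common definition).
        import numpy as np
        from scipy.signal import hilbert

        def instantaneous_energy(trace):
            envelope = np.abs(hilbert(trace))   # analytic-signal amplitude
            return envelope ** 2

        # Synthetic trace: a 30 Hz wavelet burst ("bright spot") in noise.
        t = np.linspace(0, 1, 1000)
        trace = 0.05 * np.random.randn(t.size)
        trace += np.exp(-((t - 0.5) ** 2) / 0.001) * np.sin(2 * np.pi * 30 * t)
        energy = instantaneous_energy(trace)
        print("peak energy sample index:", int(np.argmax(energy)))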

  19. Transmission Electron Microscopy of Plagioclase-Rich Itokawa Grains: Space Weathering Effects and Solar Flare Track Exposure Ages

    NASA Technical Reports Server (NTRS)

    Keller, Lindsay P.; Berger, Eve L.

    2017-01-01

    Limited samples are available for the study of space weathering effects on airless bodies. The grains returned by the Hayabusa mission to asteroid 25143 Itokawa provide the only samples currently available to study space weathering of ordinary chondrite regolith. We have previously studied olivine-rich Itokawa grains and documented their surface alteration and exposure ages based on the observed density of solar flare particle tracks. Here we focus on the rarer Itokawa plagioclase grains, in order to allow comparisons between Itokawa and lunar soil plagioclase grains for which an extensive data set exists.

  20. Impacts of the transformation of the German energy system on the transmission grid

    NASA Astrophysics Data System (ADS)

    Pesch, T.; Allelein, H.-J.; Hake, J.-F.

    2014-10-01

    The German Energiewende, the transformation of the energy system, has deep impacts on all parts of the system. This paper presents an approach that has been developed to simultaneously analyse impacts on the energy system as a whole and on the electricity system in particular. In the analysis, special emphasis is placed on the transmission grid and the efficiency of recommended grid extensions according to the German Network Development Plan. The analysis reveals that the measures in the concept are basically suitable for integrating the assumed high share of renewables in the future electricity system. Whereas a high feed-in from PV will not cause problems in the transmission grid in 2022, congestion may occur in situations with a high proportion of wind feed-in. Moreover, future bottlenecks in the grid are located in the same regions as today.

  1. Automatic Overset Grid Generation with Heuristic Feedback Control

    NASA Technical Reports Server (NTRS)

    Robinson, Peter I.

    2001-01-01

    An advancing front grid generation system for structured Overset grids is presented which automatically modifies Overset structured surface grids and control lines until user-specified grid qualities are achieved. The system is demonstrated on two examples: the first refines a space shuttle fuselage control line until a user-specified global truncation error is achieved; the second advances, from control lines, the space shuttle orbiter fuselage top and fuselage side surface grids until proper overlap is achieved. Surface grids are generated in minutes for complex geometries. The system is implemented as a heuristic feedback control (HFC) expert system which iteratively modifies the input specifications for Overset control lines and surface grids. It is developed as an extension of modern control theory, production rule systems and subsumption architectures. The methodology provides benefits over the full knowledge lifecycle of an expert system: knowledge acquisition, knowledge representation, and knowledge execution. The vector/matrix framework of modern control theory systematically acquires and represents expert system knowledge. Missing matrix elements imply missing expert knowledge. The execution of the expert system knowledge is performed through symbolic execution of the matrix algebra equations of modern control theory. The dot product operation of matrix algebra is generalized for heuristic symbolic terms. Constant time execution is guaranteed.

  2. Extensible Interest Management for Scalable Persistent Distributed Virtual Environments

    DTIC Science & Technology

    1999-12-01

    Calvin, Cebula et al. 1995; Morse, Bic et al. 2000) uses a two-grid scheme, with each grid cell having two multicast addresses. An entity expresses interest... [figure omitted: entity distribution for experimental runs] ...Multiple Users and Shared Applications with VRML. VRML 97, Monterey, CA. pp. 33-40. Calvin, J. O., D. P. Cebula, et al. (1995). Data Subscription in...

  3. Lighting the World: the first application of an open source, spatial electrification tool (OnSSET) on Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Mentis, Dimitrios; Howells, Mark; Rogner, Holger; Korkovelos, Alexandros; Arderne, Christopher; Zepeda, Eduardo; Siyal, Shahid; Taliotis, Costantinos; Bazilian, Morgan; de Roo, Ad; Tanvez, Yann; Oudalov, Alexandre; Scholtz, Ernst

    2017-08-01

    In September 2015, the United Nations General Assembly adopted Agenda 2030, which comprises a set of 17 Sustainable Development Goals (SDGs) defined by 169 targets. ‘Ensuring access to affordable, reliable, sustainable and modern energy for all by 2030’ is the seventh goal (SDG7). While access to energy refers to more than electricity, the latter is the central focus of this work. According to the World Bank’s 2015 Global Tracking Framework, roughly 15% of the world’s population (or 1.1 billion people) lack access to electricity, and many more rely on poor quality electricity services. The majority of those without access (87%) reside in rural areas. This paper presents results of a geographic information systems approach coupled with open access data. We present least-cost electrification strategies on a country-by-country basis for Sub-Saharan Africa. The electrification options include grid extension, mini-grid and stand-alone systems for rural, peri-urban, and urban contexts across the economy. At low levels of electricity demand there is a strong penetration of stand-alone technologies. However, higher electricity demand levels move the favourable electrification option from stand-alone systems to mini-grids and grid extensions.
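
    The full OnSSET cost model is not reproduced here; purely as an illustration of the least-cost logic, the sketch below compares three electrification options for a settlement using invented cost figures and a simple distance- and demand-dependent grid-extension cost:

        # Highly simplified sketch (not OnSSET's actual cost model): pick the
        # cheapest of grid extension, mini-grid, or stand-alone for a settlement.
        # All cost figures are illustrative assumptions.
        def cheapest_option(demand_kwh_per_year, km_to_grid):
            # Line cost amortized over 20 years of demand, plus generation cost.
            grid_ext = 0.08 + (15_000 * km_to_grid) / (demand_kwh_per_year * 20)
            mini_grid = 0.25 if demand_kwh_per_year > 50_000 else 0.60
            stand_alone = 0.55
            costs = {"grid extension": grid_ext,
                     "mini-grid": mini_grid,
                     "stand-alone": stand_alone}
            return min(costs, key=costs.get), costs

        print(cheapest_option(100_000, 5))    # dense demand near the grid
        print(cheapest_option(10_000, 120))   # low demand, far from the grid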

  4. Internet-based wide area measurement applications in deregulated power systems

    NASA Astrophysics Data System (ADS)

    Khatib, Abdel-Rahman Amin

    Since the deregulation of power systems began in 1989 in the UK, many countries have been motivated to undergo deregulation. The United States started deregulation in the energy sector in California back in 1996. Since that time, many other states have also started deregulation procedures in different utilities. Most of the deregulated market in the United States is now in the wholesale market area; however, the retail market is still undergoing changes. Deregulation has many impacts on power system network operation and control. The number of power transactions among utilities has increased, and many Independent Power Producers (IPPs) now have a rich market for competition, especially in the green power market. The Federal Energy Regulatory Commission (FERC) called upon utilities to develop Regional Transmission Organizations (RTOs). The RTO is a step toward the national transmission grid: an independent entity that will operate the transmission system in a large region. The main goal of forming RTOs is to increase the operational efficiency of the power network under the impact of the deregulated market. The objective of this work is to study Internet-based Wide Area Information Sharing (WAIS) applications in the deregulated power system. The study is the first step toward building a national transmission grid picture using information sharing among utilities. Two main topics are covered as applications of WAIS in the deregulated power system: state estimation and Total Transfer Capability (TTC) calculations. As a first step toward building this national transmission grid picture, WAIS and the level of information sharing required for state estimation calculations are discussed. WAIS impacts on the TTC calculations are also covered. A new technique to update the TTC using online measurements, based on WAIS created by sharing state estimation results, is presented.

  5. Simulating pre-galactic metal enrichment for JWST deep-field observations

    NASA Astrophysics Data System (ADS)

    Jaacks, Jason

    2017-08-01

    We propose to create a new suite of mesoscale cosmological volume simulations with custom built sub-grid physics in which we independently track the contribution from Population III and Population II star formation to the total metals in the interstellar medium (ISM) of the first galaxies, and in the diffuse IGM at an epoch prior to reionization. These simulations will fill a gap in our simulation knowledge about chemical enrichment in the pre-reionization universe, which is a crucial need given the impending observational push into this epoch with near-future ground and space-based telescopes. This project is the natural extension of our successful Cycle 24 theory proposal (HST-AR-14569.001-A; PI Jaacks) in which we developed a new Pop III star formation sub-grid model which is currently being utilized to study the baseline metal enrichment of pre-reionization systems.

  6. Extension of a streamwise upwind algorithm to a moving grid system

    NASA Technical Reports Server (NTRS)

    Obayashi, Shigeru; Goorjian, Peter M.; Guruswamy, Guru P.

    1990-01-01

    A new streamwise upwind algorithm was derived to compute unsteady flow fields with the use of a moving-grid system. The temporally nonconservative LU-ADI (lower-upper-factored, alternating-direction-implicit) method was applied for time-marching computations. A comparison of the temporally nonconservative method with a time-conservative implicit upwind method indicates that the solutions are insensitive to the conservative properties of the implicit solvers when practical time steps are used. Using this new method, computations were made for an oscillating wing at a transonic Mach number. The computed results confirm that the present upwind scheme captures the shock motion better than the central-difference scheme based on the Beam-Warming algorithm. The new upwind option of the code allows larger time steps and thus is more efficient, even though it requires slightly more computational time per time step than the central-difference option.

  7. Shape coexistence in neutron-rich nuclei near N=40

    NASA Astrophysics Data System (ADS)

    Carpenter, M. P.; Janssens, R. V. F.; Zhu, S.

    2013-04-01

    Recent data show that both the 2+ and 4+ levels in the even neutron-rich Cr and Fe isotopes decrease in excitation energy toward N=40. This observation, along with Coulomb excitation and lifetime data, strongly indicates an increase in collectivity near N=40 in contradiction with expectations based on first principles. A straightforward two-band mixing model is used to investigate the structure of these neutron-rich Cr and Fe nuclei. The approach takes advantage of the extensive data available for 60Fe to provide the parameter values with which to reproduce the experimental observations in the 58-64Cr and 60-68Fe isotopic chains. Comparisons between the model and the data suggest marked structural differences for the ground-state configurations of N=40 Cr and Fe.

  8. A Petri Net model for distributed energy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konopko, Joanna

    2015-12-31

    Electrical networks need to evolve to become more intelligent, more flexible and less costly. The smart grid, the next generation of the power system, uses two-way flows of electricity and information to create a distributed, automated energy delivery network. Building a comprehensive smart grid is a challenge for system protection, optimization and energy efficiency. Proper modeling and analysis is needed to build an extensive distributed energy system and intelligent electricity infrastructure. In this paper, a complete model of the smart grid is proposed using Generalized Stochastic Petri Nets (GSPN). Simulation of the created model is also explored. The simulation allows analysis of how closely the behavior of the model matches that of a real smart grid.

  9. Magma-poor vs. magma-rich continental rifting and breakup in the Labrador Sea

    NASA Astrophysics Data System (ADS)

    Gouiza, M.; Paton, D.

    2017-12-01

    Magma-poor and magma-rich rifted margins show distinct structural and stratigraphic geometries during the rift-to-breakup period. In magma-poor margins, crustal stretching is accommodated mainly by brittle faulting and the formation of wide rift basins shaped by numerous graben and half-graben structures. Continental breakup and oceanic crust accretion are often preceded by a localised phase of (hyper-) extension where the upper mantle is embrittled, serpentinized, and exhumed to the surface. In magma-rich margins, the rift basin is narrow and extension is accompanied by a large magmatic supply. Continental breakup and oceanic crust accretion are preceded by the emplacement of a thick volcanic crust juxtaposing and underplating a moderately thinned continental crust. Both magma-poor and magma-rich rifting occur in response to lithospheric extension, but the driving forces and processes are believed to be different. In the former, extension is assumed to be driven by plate boundary forces, while in the latter, extension is supposed to be controlled by sublithospheric mantle dynamics. However, this view fails to explain observations from many Atlantic conjugate margins where magma-poor and magma-rich segments alternate in a relatively abrupt fashion. This is the case for the Labrador margin, where the northern segment shows major magmatic supply during most of the syn-rift phase, which culminates in the emplacement of a thick volcanic crust in the transitional domain along with high-density bodies underplating the thinned continental crust, while the southern segment is characterized mainly by brittle extension, mantle serpentinization and exhumation prior to continental breakup. In this work, we use seismic and potential field data to describe the crustal and structural architectures of the Labrador margin, and investigate the tectonic and mechanical processes of rifting that may have controlled the magmatic supply in the different segments of the margin.

  10. Global mammal distributions, biodiversity hotspots, and conservation.

    PubMed

    Ceballos, Gerardo; Ehrlich, Paul R

    2006-12-19

    Hotspots, which have played a central role in the selection of sites for reserves, require careful rethinking. We carried out a global examination of distributions of all nonmarine mammals to determine patterns of species richness, endemism, and endangerment, and to evaluate the degree of congruence among hotspots of these three measures of diversity in mammals. We then compare congruence of hotspots in two animal groups (mammals and birds) to assess the generality of these patterns. We defined hotspots as the richest 2.5% of cells in a global equal-area grid comparable to 1° latitude × 1° longitude. Hotspots of species richness, "endemism," and extinction threat were noncongruent. Only 1% of cells and 16% of species were common to the three types of mammalian hotspots. Congruence increased with increases in both the geographic scope of the analysis and the percentage of cells defined as being hotspots. The within-mammal hotspot noncongruence was similar to the pattern recently found for birds. Thus, assigning global conservation priorities based on hotspots is at best a limited strategy.
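
    As a small, purely illustrative sketch of this kind of delineation (synthetic cell values, not the mammal dataset), the richest 2.5% of grid cells can be flagged for each measure and the overlap of the resulting hotspot sets counted:

        # Illustrative sketch: flag the richest 2.5% of cells per measure and
        # count the cells shared by all three hotspot sets (congruence).
        import numpy as np

        rng = np.random.default_rng(1)
        n_cells = 10_000
        richness = rng.poisson(60, n_cells)
        endemism = rng.poisson(5, n_cells)
        threat = rng.poisson(3, n_cells)

        def hotspot_mask(values, fraction=0.025):
            cutoff = np.quantile(values, 1 - fraction)
            return values >= cutoff

        masks = [hotspot_mask(v) for v in (richness, endemism, threat)]
        shared = masks[0] & masks[1] & masks[2]
        print("cells in all three hotspot sets:", int(shared.sum()))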

  11. Containerless low gravity processing of glass forming and immiscible alloys

    NASA Technical Reports Server (NTRS)

    Andrews, J. Barry; Briggs, Craig; Robinson, M. B.

    1990-01-01

    Under normal one-g conditions, immiscible alloys segregate extensively during solidification due to sedimentation of the more dense of the immiscible liquid phases. Immiscible (hypermonotectic) gold-rhodium alloys were processed in the 100 meter drop tube under low-gravity, containerless conditions to determine the feasibility of producing dispersed structures. Three alloy compositions were utilized. Alloys containing 10 percent by volume of the gold-rich hypermonotectic phase exhibited a tendency for the gold-rich liquid to wet the outer surface of the samples. This wetting tendency led to extensive segregation in several cases. Alloys containing 80 and 90 percent by volume of the gold-rich phase possessed completely different microstructures from the 10 percent samples when processed under low-g, containerless conditions. Several samples exhibited microstructures consisting of well-dispersed rhodium-rich spheres, 2 to 3 microns in diameter, in a gold-rich matrix.

  12. Challenging the Gifted through Problem Solving Experiences: Design and Evaluation of the COMET Program.

    ERIC Educational Resources Information Center

    Feldhusen, John F.; And Others

    1992-01-01

    The COMET summer residential program at Purdue University (Indiana) offers gifted and talented youth in grades 4-6 a week of intensive study in a single content area. Courses stress specific problem-solving skills and development of a rich knowledge base. Extensive program evaluation by students, teachers, counselors, and parents was highly…

  13. Semantic Interoperability for Computational Mineralogy: Experiences of the eMinerals Consortium

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; White, T. O.; Dove, M. T.; Bruin, R. P.; Couch, P. A.; Tyer, R. P.

    2006-12-01

    The use of atomic scale computer simulation of minerals to obtain information for geophysics and environmental science has grown enormously over the past couple of decades. It is now routine to probe mineral behavior in the Earth's deep interior and in the surface environment by borrowing methods and simulation codes from computational chemistry and physics. It is becoming increasingly important to use methods embodied in more than one of these codes to solve any single scientific problem. However, scientific codes are rarely designed for easy interoperability and data exchange; data formats are often code-specific, poorly documented and fragile, liable to frequent change between software versions, and even compiler versions. This means that the scientist's simple desire to use the methodological approaches offered by multiple codes is frustrated, and even the sharing of data between collaborators becomes fraught with difficulties. The eMinerals consortium was formed in the early stages of the UK eScience program with the aim of developing the tools needed to apply atomic scale simulation to environmental problems in a grid-enabled world, and to harness the computational power offered by grid technologies to address some outstanding mineralogical problems. One example of the kind of problem we can tackle is the origin of the compressibility anomaly in silica glass. By passing data directly between simulation and analysis tools we were able to probe this effect in more detail than has previously been possible and have shown how the anomaly is related to the details of the amorphous structure. In order to approach this kind of problem we have constructed a mini-grid, a small scale and extensible combined compute- and data-grid that allows the execution of many calculations in parallel, and the transparent storage of semantically-rich marked-up result data. Importantly, we automatically capture multiple kinds of metadata and key results from each calculation. We believe that the lessons learned and tools developed will be useful in many areas of science beyond computational mineralogy. Key tools that will be described include: a pure Fortran XML library (FoX) that presents XPath, SAX and DOM interfaces as well as permitting the easy production of valid XML from legacy Fortran programs; a job submission framework that automatically schedules calculations to remote grid resources and handles data staging and metadata capture; and a tool (AgentX) that maps concepts from an ontology onto locations in documents of various formats, which we use to enable data exchange.

  14. Wind Farm Stabilization by using DFIG with Current Controlled Voltage Source Converters Taking Grid Codes into Consideration

    NASA Astrophysics Data System (ADS)

    Okedu, Kenneth Eloghene; Muyeen, S. M.; Takahashi, Rion; Tamura, Junji

    Recent wind farm grid codes require wind generators to ride through voltage sags, which means that normal power production should be re-initiated once the nominal grid voltage is recovered. However, a fixed-speed wind turbine generator system using an induction generator (IG) has a stability problem similar to the step-out phenomenon of a synchronous generator. On the other hand, a doubly fed induction generator (DFIG) can control its real and reactive powers independently while being operated in variable speed mode. This paper proposes a new control strategy using DFIGs for stabilizing a wind farm composed of DFIGs and IGs, without incorporating additional FACTS devices. A new current controlled voltage source converter (CC-VSC) scheme is proposed to control the converters of the DFIG, and the performance is verified by comparing the results with those of a voltage controlled voltage source converter (VC-VSC) scheme. Another salient feature of this study is to reduce the number of proportional-integral (PI) controllers used in the rotor side converter without degrading dynamic and transient performance. Moreover, the DC-link protection scheme during grid faults can be omitted in the proposed scheme, which reduces the overall cost of the system. Extensive simulation analyses using PSCAD/EMTDC are carried out to clarify the effectiveness of the proposed CC-VSC based control scheme for DFIGs.

  15. DIRAC3 - the new generation of the LHCb grid software

    NASA Astrophysics Data System (ADS)

    Tsaregorodtsev, A.; Brook, N.; Casajus Ramo, A.; Charpentier, Ph; Closier, J.; Cowan, G.; Graciani Diaz, R.; Lanciotti, E.; Mathe, Z.; Nandakumar, R.; Paterson, S.; Romanovsky, V.; Santinelli, R.; Sapunov, M.; Smith, A. C.; Seco Miguelez, M.; Zhelezov, A.

    2010-04-01

    DIRAC, the LHCb community Grid solution, was considerably reengineered in order to meet all the requirements for processing the data coming from the LHCb experiment. It covers all the tasks, starting with raw data transportation from the experiment area to grid storage, through data processing, up to the final user analysis. The reengineered DIRAC3 version of the system includes a fully grid-security-compliant framework for building service-oriented distributed systems; a complete Pilot Job framework for creating efficient workload management systems; and several subsystems to manage high-level operations like data production and distribution management. The user interfaces of the DIRAC3 system, providing rich command line and scripting tools, are complemented by a full-featured Web portal giving users secure access to all the details of the system status and ongoing activities. We will present an overview of the DIRAC3 architecture, new innovative features and the achieved performance. Extending DIRAC3 to manage computing resources beyond the WLCG grid will be discussed. Experience with using DIRAC3 by user communities other than LHCb and in application domains other than High Energy Physics will be shown to demonstrate the general-purpose nature of the system.

  16. Self-adjusting grid methods for one-dimensional hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Harten, A.; Hyman, J. M.

    1983-01-01

    A method for the automatic adjustment of a grid that follows the dynamics of the numerical solution of hyperbolic conservation laws is presented. The grid motion is determined by averaging the local characteristic velocities of the equations with respect to the amplitudes of the signals. The resulting algorithm is a simple extension of many currently popular Godunov-type methods. Computer codes using one of these methods can be easily modified to add the moving mesh as an option. Numerical examples are given that illustrate the improved accuracy of Godunov's and Roe's methods on a self-adjusting mesh. Previously announced in STAR as N83-15008.
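
    As a rough illustration of that grid-motion rule (a sketch under assumed details, not the paper's scheme), node velocities for a 1-D Burgers' equation solution can be taken as an amplitude-weighted average of the neighbouring characteristic speeds:

        # Schematic sketch: move each grid node with an average of local
        # characteristic speeds weighted by the signal amplitudes (jumps in u).
        import numpy as np

        def node_velocities(x, u, eps=1e-12):
            """For Burgers' equation u_t + (u^2/2)_x = 0 the characteristic
            speed is u; weight each neighbouring speed by the jump across it."""
            speeds = 0.5 * (u[:-1] + u[1:])      # characteristic speed per interface
            weights = np.abs(np.diff(u)) + eps   # signal amplitude per interface
            v = np.zeros_like(x)
            v[1:-1] = (weights[:-1] * speeds[:-1] + weights[1:] * speeds[1:]) / (
                weights[:-1] + weights[1:])
            return v                              # endpoints stay fixed

        x = np.linspace(0.0, 1.0, 21)
        u = np.where(x < 0.5, 1.0, 0.0)          # a right-moving shock profile
        dt = 0.01
        x_new = x + dt * node_velocities(x, u)   # nodes drift with the shock
        print(x_new[9:12])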

  17. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience.

    PubMed

    Corradi, Luca; Porro, Ivan; Schenone, Andrea; Momeni, Parastoo; Ferrari, Raffaele; Nobili, Flavio; Ferrara, Michela; Arnulfo, Gabriele; Fato, Marco M

    2012-10-08

    Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing request from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of "meta" data model. This way, users are not constrainted to a set of predefined data and the model can be easily extensible and applicable to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed by two generic main objects: events and processes. Then, a repository has been built based on such data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data integration aspects have been addressed by providing the repository application with an efficient dynamic interface designed to enable the user to both easily query the data depending on defined datatypes and view all the data of every patient in an integrated and simple way. The results of our work have been twofold. First, a dynamically extensible data model has been implemented and tested based on a "meta" data-model enabling users to define their own data types independently from the application context. This data model has allowed users to dynamically include additional data types without the need of rebuilding the underlying database. Then a complex process-event data structure has been built, based on this data model, describing patient-centered diagnostic processes and merging information from data and metadata. Second, a repository implementing such a data structure has been deployed on a distributed Data Grid in order to provide scalability both in terms of data input and data storage and to exploit distributed data and computational approaches in order to share resources more efficiently. Moreover, data managing has been made possible through a friendly web interface. The driving principle of not being forced to preconfigured data types has been satisfied. It is up to users to dynamically configure the data model for the given experiment or data acquisition program, thus making it potentially suitable for customized applications. Based on such repository, data managing has been made possible through a friendly web interface. 
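
    To make the "meta" data-model idea concrete, the sketch below shows one illustrative way (not the repository's actual schema) to declare data types and their fields at run time and validate records against them:

        # Illustrative sketch of a dynamically extensible "meta" data model:
        # data types are declared at run time, so new acquisition types can be
        # added without rebuilding the underlying database schema.
        class MetaModel:
            def __init__(self):
                self.types = {}                     # type name -> field definitions

            def define_type(self, name, **fields):  # e.g. subject_id=str, channels=int
                self.types[name] = fields

            def create_record(self, type_name, **values):
                fields = self.types[type_name]
                unknown = set(values) - set(fields)
                if unknown:
                    raise ValueError(f"fields not declared for {type_name}: {unknown}")
                for key, expected in fields.items():
                    if key in values and not isinstance(values[key], expected):
                        raise TypeError(f"{key} must be {expected.__name__}")
                return {"type": type_name, **values}

        model = MetaModel()
        model.define_type("eeg_session", subject_id=str, channels=int, notes=str)
        record = model.create_record("eeg_session", subject_id="P001", channels=64)
        print(record)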

  18. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience

    PubMed Central

    2012-01-01

    Background Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing request from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. Methods A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of “meta” data model. This way, users are not constrainted to a set of predefined data and the model can be easily extensible and applicable to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed by two generic main objects: events and processes. Then, a repository has been built based on such data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data integration aspects have been addressed by providing the repository application with an efficient dynamic interface designed to enable the user to both easily query the data depending on defined datatypes and view all the data of every patient in an integrated and simple way. Results The results of our work have been twofold. First, a dynamically extensible data model has been implemented and tested based on a “meta” data-model enabling users to define their own data types independently from the application context. This data model has allowed users to dynamically include additional data types without the need of rebuilding the underlying database. Then a complex process-event data structure has been built, based on this data model, describing patient-centered diagnostic processes and merging information from data and metadata. Second, a repository implementing such a data structure has been deployed on a distributed Data Grid in order to provide scalability both in terms of data input and data storage and to exploit distributed data and computational approaches in order to share resources more efficiently. Moreover, data managing has been made possible through a friendly web interface. The driving principle of not being forced to preconfigured data types has been satisfied. It is up to users to dynamically configure the data model for the given experiment or data acquisition program, thus making it potentially suitable for customized applications. 
Conclusions Based on such a repository, data management has been made possible through a friendly web interface. The driving principle of not forcing users into preconfigured data types has been satisfied. It is up to users to dynamically configure the data model for the given experiment or data acquisition program, thus making the repository potentially suitable for customized applications. PMID:23043673
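
    The central pattern of the “meta” data model (data types declared at runtime and attached to generic process and event objects rather than to fixed database tables) can be sketched in a few lines. The following fragment is only an illustration of that pattern; the class and field names (DataType, Event, Process, mri_scan) are hypothetical and are not taken from the authors' implementation.

        # Minimal sketch (not the authors' implementation) of a "meta" data model in
        # which data types are defined at runtime rather than hard-coded as tables.
        # All class and field names here are hypothetical illustrations.

        from dataclasses import dataclass, field
        from typing import Any, Dict, List


        @dataclass
        class DataType:
            """A user-defined data type: a name plus a mapping field -> Python type."""
            name: str
            fields: Dict[str, type]

            def validate(self, record: Dict[str, Any]) -> bool:
                # A record conforms if every declared field is present with the right type.
                return all(k in record and isinstance(record[k], t)
                           for k, t in self.fields.items())


        @dataclass
        class Event:
            """A generic event (e.g. an acquisition step) holding a typed payload."""
            datatype: DataType
            payload: Dict[str, Any]


        @dataclass
        class Process:
            """A generic process (e.g. a diagnostic workflow) chaining events."""
            name: str
            events: List[Event] = field(default_factory=list)

            def add_event(self, event: Event) -> None:
                if not event.datatype.validate(event.payload):
                    raise ValueError(f"payload does not match type {event.datatype.name}")
                self.events.append(event)


        # Usage: a new "MRI scan" type is declared at runtime, without schema migration.
        mri = DataType("mri_scan", {"patient_id": str, "tesla": float, "sequence": str})
        visit = Process("baseline_visit")
        visit.add_event(Event(mri, {"patient_id": "P001", "tesla": 3.0, "sequence": "T1"}))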

  19. Intelligent Operation and Maintenance of Micro-grid Technology and System Development

    NASA Astrophysics Data System (ADS)

    Fu, Ming; Song, Jinyan; Zhao, Jingtao; Du, Jian

    2018-01-01

    To support micro-grid operation and management, a micro-grid operation and maintenance knowledge base is studied. Based on advanced Petri net theory, a fault diagnosis model of the micro-grid is established, and an intelligent diagnosis and analysis method for micro-grid faults is put forward. On this basis, the functional composition and architecture of an intelligent operation and maintenance system for the micro-grid are studied, and the micro-grid fault diagnosis function is described in detail. Finally, the system is deployed on the micro-grid of a park, and micro-grid fault diagnosis and analysis is carried out on live operating data. The operation and maintenance interface of the system is presented, which verifies the correctness and reliability of the system.
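
    As a rough illustration of Petri-net-based diagnosis, the sketch below follows the common formulation in which protection alarms mark the input places of fault transitions, and a fault hypothesis is accepted when all of its input places are marked. The net structure, place names and alarm data are illustrative assumptions, not the model from the paper.

        # Minimal sketch of Petri-net-style fault diagnosis: alarm "places" carry
        # tokens, and a fault hypothesis ("transition") is enabled when all of its
        # input places are marked. Names and values below are illustrative only.

        places = {"breaker_A_trip": 1, "overcurrent_relay_A": 1, "breaker_B_trip": 0}

        transitions = {
            # fault hypothesis -> set of alarm places that must all be marked
            "fault_on_feeder_A": {"breaker_A_trip", "overcurrent_relay_A"},
            "fault_on_feeder_B": {"breaker_B_trip"},
        }


        def enabled_faults(marking, net):
            """Return the fault transitions enabled by the current alarm marking."""
            return [t for t, inputs in net.items()
                    if all(marking.get(p, 0) > 0 for p in inputs)]


        print(enabled_faults(places, transitions))  # -> ['fault_on_feeder_A']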

  20. Compressive-sampling-based positioning in wireless body area networks.

    PubMed

    Banitalebi-Dehkordi, Mehdi; Abouei, Jamshid; Plataniotis, Konstantinos N

    2014-01-01

    Recent achievements in wireless technologies have opened up enormous opportunities for the implementation of ubiquitous health care systems that provide rich contextual information and warning mechanisms against abnormal conditions. This helps with the automatic and remote monitoring/tracking of patients in hospitals and facilitates the supervision of fragile, elderly people in their own domestic environment through automatic systems that handle remote drug delivery. This paper presents a new modeling and analysis framework for multipatient positioning in a wireless body area network (WBAN) which exploits the spatial sparsity of patients and a sparse fast Fourier transform (FFT)-based feature extraction mechanism for monitoring patients and reporting movement tracking to a central database server containing patient vital information. The main goal of this paper is to achieve a high degree of accuracy and resolution in patient localization with less computational complexity in the implementation, using compressive sensing theory. We represent the patients' positions as a sparse vector obtained by the discrete segmentation of the patient movement space in a circular grid. To estimate this vector, a compressive-sampling-based two-level FFT (CS-2FFT) feature vector is synthesized for each received signal from the biosensors embedded on the patient's body at each grid point. This feature extraction process benefits from combining both short-time and long-time properties of the received signals. The robustness of the proposed CS-2FFT-based algorithm in terms of the average positioning error is numerically evaluated using realistic parameters from the IEEE 802.15.6-WBAN standard in the presence of additive white Gaussian noise. Due to the circular grid pattern and the CS-2FFT feature extraction method, the proposed scheme achieves a significant reduction in computational complexity, while improving resolution and localization accuracy when compared to some classical CS-based positioning algorithms.
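
    The underlying compressive-sensing step can be illustrated independently of the CS-2FFT features: the patients' positions form a sparse vector over the discretized grid, and a sparse-recovery solver selects the few active grid cells from a measurement (dictionary) matrix. The sketch below uses orthogonal matching pursuit and random measurements purely as stand-ins; the dimensions and dictionary are assumptions for illustration.

        # Illustrative sketch of the compressive-sensing idea behind grid-based
        # positioning: the patients' positions form a sparse vector over a discretized
        # grid, and a sparse-recovery solver picks the few active grid points from a
        # feature (dictionary) matrix. Dimensions and the use of OMP are assumptions
        # for illustration; the paper uses a CS-2FFT feature vector.

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        rng = np.random.default_rng(0)
        n_grid, n_meas, n_patients = 64, 20, 2     # grid points, measurements, patients

        A = rng.standard_normal((n_meas, n_grid))  # dictionary: one column per grid point
        x_true = np.zeros(n_grid)
        x_true[rng.choice(n_grid, n_patients, replace=False)] = 1.0  # sparse positions

        y = A @ x_true + 0.01 * rng.standard_normal(n_meas)          # noisy measurements

        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_patients).fit(A, y)
        estimated_cells = np.flatnonzero(omp.coef_)
        print("true:", np.flatnonzero(x_true), "estimated:", estimated_cells)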

  1. Estimating the system price of redox flow batteries for grid storage

    NASA Astrophysics Data System (ADS)

    Ha, Seungbum; Gallagher, Kevin G.

    2015-11-01

    Low-cost energy storage systems are required to support extensive deployment of intermittent renewable energy on the electricity grid. Redox flow batteries have potential advantages to meet the stringent cost target for grid applications as compared to more traditional batteries based on an enclosed architecture. However, the manufacturing process and therefore potential high-volume production price of redox flow batteries is largely unquantified. We present a comprehensive assessment of a prospective production process for aqueous all vanadium flow battery and nonaqueous lithium polysulfide flow battery. The estimated investment and variable costs are translated to fixed expenses, profit, and warranty as a function of production volume. When compared to lithium-ion batteries, redox flow batteries are estimated to exhibit lower costs of manufacture, here calculated as the unit price less materials costs, owing to their simpler reactor (cell) design, lower required area, and thus simpler manufacturing process. Redox flow batteries are also projected to achieve the majority of manufacturing scale benefits at lower production volumes as compared to lithium-ion. However, this advantage is offset due to the dramatically lower present production volume of flow batteries compared to competitive technologies such as lithium-ion.

  2. Development of design technique for vacuum insulation in large size multi-aperture multi-grid accelerator for nuclear fusion.

    PubMed

    Kojima, A; Hanada, M; Tobari, H; Nishikiori, R; Hiratsuka, J; Kashiwagi, M; Umeda, N; Yoshida, M; Ichikawa, M; Watanabe, K; Yamano, Y; Grisham, L R

    2016-02-01

    Design techniques for vacuum insulation have been developed in order to realize a reliable voltage holding capability of multi-aperture multi-grid (MAMuG) accelerators for fusion applications. In this method, the nested multi-stage configuration of the MAMuG accelerator can be uniquely designed to satisfy the target voltage within given boundary conditions. The evaluation of the voltage holding capability of each acceleration stage was based on previous experimental results on the area effect and the multi-aperture effect. Since the multi-grid effect was found in this work to be an extension of the area effect through the total facing area, the total voltage holding capability of the multi-stage accelerator can be estimated from that of a single stage by assuming the stage with the highest electric field, the total facing area, and the total number of apertures. By applying these considerations, the analysis of the 3-stage MAMuG accelerator for JT-60SA agreed with past gap-scan experiments to within 10%, which demonstrates that MAMuG accelerators and multi-stage high-voltage bushings can be designed with high reliability.

  3. "Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; vanGelder, Allen

    1999-01-01

    During the four years of this grant (including the one-year extension), we have explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These have included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple-zone grids and time-varying grids. Next, we developed techniques for organizing the sample data in a k-d tree, a simple hierarchical data model that approximates the samples in the region covered by each node of the tree, together with an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.
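
    The image-comparison step lends itself to a short sketch: render (or load) the same volume from each viewpoint with and without approximation, then score the approximation by a per-viewpoint pixel difference. The file names and the RMS metric are assumptions for illustration; the cited work uses a light-field-based comparison, and the 32 viewpoints follow the description above.

        # Minimal sketch of image-based comparison of volume approximations: load the
        # same view rendered from the full and the decimated volume and compute a
        # per-viewpoint RMS pixel difference. File names are placeholder assumptions.

        import numpy as np
        from imageio.v3 import imread

        def rms_difference(img_a, img_b):
            """Root-mean-square difference between two images of equal shape."""
            a = img_a.astype(np.float64)
            b = img_b.astype(np.float64)
            return float(np.sqrt(np.mean((a - b) ** 2)))

        scores = []
        for view in range(32):
            reference = imread(f"reference_view_{view:02d}.png")   # full-resolution render
            approx = imread(f"decimated_view_{view:02d}.png")      # decimated-volume render
            scores.append(rms_difference(reference, approx))

        print("mean RMS error over all viewpoints:", np.mean(scores))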

  4. Development of design technique for vacuum insulation in large size multi-aperture multi-grid accelerator for nuclear fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kojima, A., E-mail: kojima.atsushi@jaea.go.jp; Hanada, M.; Tobari, H.

    Design techniques for vacuum insulation have been developed in order to realize a reliable voltage holding capability of multi-aperture multi-grid (MAMuG) accelerators for fusion applications. In this method, the nested multi-stage configuration of the MAMuG accelerator can be uniquely designed to satisfy the target voltage within given boundary conditions. The evaluation of the voltage holding capability of each acceleration stage was based on previous experimental results on the area effect and the multi-aperture effect. Since the multi-grid effect was found in this work to be an extension of the area effect through the total facing area, the total voltage holding capability of the multi-stage accelerator can be estimated from that of a single stage by assuming the stage with the highest electric field, the total facing area, and the total number of apertures. By applying these considerations, the analysis of the 3-stage MAMuG accelerator for JT-60SA agreed with past gap-scan experiments to within 10%, which demonstrates that MAMuG accelerators and multi-stage high-voltage bushings can be designed with high reliability.

  5. Spatial Pattern Classification for More Accurate Forecasting of Variable Energy Resources

    NASA Astrophysics Data System (ADS)

    Novakovskaia, E.; Hayes, C.; Collier, C.

    2014-12-01

    The accuracy of solar and wind forecasts is becoming increasingly essential as grid operators continue to integrate additional renewable generation onto the electric grid. Forecast errors affect rate payers, grid operators, wind and solar plant maintenance crews and energy traders through increases in prices, project down time or lost revenue. While extensive and beneficial efforts have been undertaken in recent years to improve physical weather models for a broad spectrum of applications, these improvements have generally not been sufficient to meet the accuracy demands of system planners. For renewables, these models are often used in conjunction with additional statistical models utilizing both meteorological observations and power generation data. Forecast accuracy can depend on the specific weather regime at a given location. To account for these dependencies it is important that the parameterizations used in statistical models change as the regime changes. An automated tool, based on an artificial neural network model, has been developed to identify different weather regimes as they impact power output forecast accuracy at wind or solar farms. In this study, improvements in forecast accuracy were analyzed for varying time horizons for wind farms and utility-scale PV plants located in different geographical regions.
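
    A minimal sketch of regime classification with a small neural network is given below: meteorological observations are mapped to a weather-regime label, which can then select regime-specific statistical-model parameters. The feature set, regime labels, synthetic data and network size are all illustrative assumptions, not the tool described above.

        # Illustrative sketch of regime classification with a small neural network:
        # meteorological observations are mapped to a weather-regime label, which can
        # then select regime-specific forecast-correction parameters. Feature names,
        # labels and network size are assumptions, not the actual operational model.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)

        # toy training set: [wind speed (m/s), pressure gradient, cloud fraction]
        X = rng.uniform([0, -5, 0], [25, 5, 1], size=(500, 3))
        y = np.where(X[:, 0] > 12, "frontal", np.where(X[:, 2] > 0.7, "overcast", "clear"))

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

        regime = clf.predict([[15.0, 1.2, 0.4]])[0]
        print("detected regime:", regime)   # downstream, regime-specific parameters would be applied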

  6. Development of iterative techniques for the solution of unsteady compressible viscous flows

    NASA Technical Reports Server (NTRS)

    Hixon, Duane; Sankar, L. N.

    1993-01-01

    During the past two decades, there has been significant progress in the field of numerical simulation of unsteady compressible viscous flows. At present, a variety of solution techniques exist, such as transonic small disturbance (TSD) analyses, transonic full potential equation-based methods, unsteady Euler solvers, and unsteady Navier-Stokes solvers. These advances have been made possible by developments in three areas: (1) improved numerical algorithms; (2) automation of body-fitted grid generation schemes; and (3) advanced computer architectures with vector processing and massively parallel processing features. In this work, the GMRES scheme has been considered as a candidate for accelerating a Newton iteration time marching scheme for unsteady 2-D and 3-D compressible viscous flow calculations; preliminary calculations indicate that this will provide up to a 65 percent reduction in computer time requirements over the existing class of explicit and implicit time marching schemes. The proposed method has been tested on structured grids, but is flexible enough for extension to unstructured grids. The described scheme has been tested only on the current generation of vector processor architectures of the Cray Y/MP class, but should be suitable for adaptation to massively parallel machines.
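
    The pattern of wrapping GMRES around a Newton iteration can be sketched generically. The fragment below solves a toy two-equation nonlinear system with a matrix-free, finite-difference Jacobian action; it illustrates the Newton-GMRES structure only and is not the flow solver described above.

        # Generic sketch of a Newton iteration whose linear systems are solved with
        # GMRES. The residual function is a stand-in nonlinear system, not the
        # discretized Navier-Stokes equations.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def residual(u):
            """Toy nonlinear residual R(u) = 0 standing in for the flow equations."""
            return np.array([u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 2 - 5.0])

        def jacobian_vector_product(u, v, eps=1e-7):
            """Matrix-free Jacobian action J(u) v via finite differences."""
            return (residual(u + eps * v) - residual(u)) / eps

        u = np.array([1.0, 1.0])
        for it in range(20):
            r = residual(u)
            if np.linalg.norm(r) < 1e-10:
                break
            J = LinearOperator((2, 2), matvec=lambda v: jacobian_vector_product(u, v))
            du, info = gmres(J, -r)        # GMRES solves J du = -R at each Newton step
            u = u + du

        print("converged solution:", u)    # approaches (1, 2) for this toy system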

  7. Spitzer-IRS Spectroscopic Studies of Oxygen-Rich Asymptotic Giant Branch Star and Red Supergiant Star Dust Properties

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin A.; Srinivasan, Sundar; Speck, Angela; Volk, Kevin; Kemper, Ciska; Reach, William T.; Lagadec, Eric; Bernard, Jean-Philippe; McDonald, Iain; Meixner, Margaret

    2015-01-01

    We analyze the dust emission features seen in Spitzer Space Telescope Infrared Spectrograph (IRS) spectra of Oxygen-rich (O-rich) asymptotic giant branch (AGB) and red supergiant (RSG) stars. The spectra come from the Spitzer Legacy program SAGE-Spectroscopy (PI: F. Kemper) and other archival Spitzer-IRS programs. The broad 10 and 20 micron emission features attributed to amorphous dust of silicate composition seen in the spectra show evidence for systematic differences in the centroid of both emission features between O-rich AGB and RSG populations. Radiative transfer modeling using the GRAMS grid of models of AGB and RSG stars suggests that the centroid differences are due to differences in dust properties. We present an update of our investigation of differences in dust composition, size, shape, etc. that might be responsible for these spectral differences. We explore how these differences may arise from the different circumstellar environments around RSG and O-rich AGB stars. BAS acknowledges funding from NASA ADAP grant NNX13AD54G.

  8. Spitzer-IRS Spectroscopic Studies of the Properties of Dust from Oxygen-Rich Asymptotic Giant Branch and Red Supergiant Stars

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin A.; Speck, A.; Volk, K.; Kemper, C.; Reach, W. T.; Lagadec, E.; Bernard, J.; McDonald, I.; Meixner, M.; Srinivasan, S.

    2014-01-01

    We analyze the dust emission features seen in Spitzer Space Telescope Infrared Spectrograph (IRS) spectra of Oxygen-rich (O-rich) asymptotic giant branch (AGB) and red supergiant (RSG) stars. The spectra come from the Spitzer Legacy program SAGE-Spectroscopy (PI: F. Kemper) and other archival Spitzer-IRS programs. The broad 10 and 20 micron emission features attributed to amorphous dust of silicate composition seen in the spectra show evidence for systematic differences in the centroid of both emission features between O-rich AGB and RSG populations. Radiative transfer modeling using the GRAMS grid of models of AGB and RSG stars suggests that the centroid differences are due to differences in dust properties. We investigate differences in dust composition, size, shape, etc. that might be responsible for these spectral differences. We explore how these differences may arise from the different circumstellar environments around RSG and O-rich AGB stars. BAS acknowledges funding from NASA ADAP grant NNX13AD54G.

  9. Fabrication of core-shell nanostructures via silicon on insulator dewetting and germanium condensation: towards a strain tuning method for SiGe-based heterostructures in a three-dimensional geometry.

    PubMed

    Naffouti, Meher; David, Thomas; Benkouider, Abdelmalek; Favre, Luc; Cabie, Martiane; Ronda, Antoine; Berbezier, Isabelle; Abbarchi, Marco

    2016-07-29

    We report on a novel method for the implementation of core-shell SiGe-based nanocrystals combining silicon on insulator dewetting in a molecular beam epitaxy reactor with an ex situ Ge condensation process. With an in situ two-step process (annealing and Ge deposition) we produce two families of islands on the same sample: Si-rich, formed during the first step and, all around them, Ge-rich formed after Ge deposition. By increasing the amount of Ge deposited on the annealed samples from 0 to 18 monolayers, the islands' shape in the Si-rich zones can be tuned from elongated and flat to more symmetric and with a larger vertical aspect ratio. At the same time, the spatial extension of the Ge-rich zones is progressively increased as well as the Ge content in the islands. Further processing by ex situ rapid thermal oxidation results in the formation of a core-shell composition profile in both Si and Ge-rich zones with atomically sharp heterointerfaces. The Ge condensation induces a Ge enrichment of the islands' shell of up to 50% while keeping a pure Si core in the Si-rich zones and a ∼25% SiGe alloy in the Ge-rich ones. The large lattice mismatch between core and shell, the absence of dislocations and the islands' monocrystalline nature render this novel class of nanostructures a promising device platform for strain-based band-gap engineering. Finally, this method can be used for the implementation of ultralarge scale meta-surfaces with dielectric Mie resonators for light manipulation at the nanoscale.

  10. Probing the Dusty Stellar Populations of the Local Volume Galaxies with JWST /MIRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Olivia C.; Meixner, Margaret; Justtanont, Kay

    The Mid-Infrared Instrument (MIRI) for the James Webb Space Telescope (JWST) will revolutionize our understanding of infrared stellar populations in the Local Volume. Using the rich Spitzer-IRS spectroscopic data set and spectral classifications from the Surveying the Agents of Galaxy Evolution (SAGE)-Spectroscopic survey of more than 1000 objects in the Magellanic Clouds, the Grid of Red Supergiant and Asymptotic Giant Branch Star Models (GRAMS), and the grid of YSO models by Robitaille et al., we calculate the expected flux densities and colors in the MIRI broadband filters for prominent infrared stellar populations. We use these fluxes to explore the JWST/MIRI colors and magnitudes for composite stellar population studies of Local Volume galaxies. MIRI color classification schemes are presented; these diagrams provide a powerful means of identifying young stellar objects, evolved stars, and extragalactic background galaxies in Local Volume galaxies with a high degree of confidence. Finally, we examine which filter combinations are best for selecting populations of sources based on their JWST colors.

  11. Grid Integration | Water Power | NREL

    Science.gov Websites

    Variable and weather-dependent resources can create challenges for load/generation balancing and planning for reserves. NREL has conducted extensive in-depth wind and solar grid integration studies to support the deployment planning and commercialization process.

  12. Evaluating penalized logistic regression models to predict Heat-Related Electric grid stress days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramer, L. M.; Rounds, J.; Burleyson, C. D.

    Understanding the conditions associated with stress on the electricity grid is important in the development of contingency plans for maintaining reliability during periods when the grid is stressed. In this paper, heat-related grid stress and its relationship with weather conditions are examined using data from the eastern United States. Penalized logistic regression models were developed and applied to predict stress on the electric grid using weather data. The inclusion of other weather variables, such as precipitation, in addition to temperature improved model performance. Several candidate models and datasets were examined. A penalized logistic regression model fit at the operation-zone level was found to provide predictive value and interpretability. Additionally, the importance of different weather variables observed at different time scales was examined. Maximum temperature and precipitation were identified as important across all zones while the importance of other weather variables was zone specific. The methods presented in this work are extensible to other regions and can be used to aid in planning and development of the electrical grid.
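
    The modeling approach can be sketched in a few lines: an L1-penalized logistic regression maps daily weather features to a binary grid-stress label. The features, synthetic data and penalty strength below are placeholders for illustration, not the study's dataset or fitted model.

        # Illustrative sketch of a penalized (L1) logistic regression predicting a
        # binary "grid stress day" label from daily weather features. Data and
        # feature names are placeholders, not the study's dataset.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(42)

        # toy daily features: [max temperature (C), precipitation (mm), humidity (%)]
        X = rng.uniform([10, 0, 20], [42, 30, 100], size=(1000, 3))
        y = (X[:, 0] > 35).astype(int)            # toy labels: stress days follow extreme heat

        model = make_pipeline(
            StandardScaler(),
            LogisticRegression(penalty="l1", solver="liblinear", C=0.5),  # penalized fit
        ).fit(X, y)

        print("coefficients:", model.named_steps["logisticregression"].coef_)
        print("P(stress) for a 38 C, dry day:", model.predict_proba([[38.0, 0.0, 40.0]])[0, 1])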

  13. Evaluating penalized logistic regression models to predict Heat-Related Electric grid stress days

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramer, Lisa M.; Rounds, J.; Burleyson, C. D.

    Understanding the conditions associated with stress on the electricity grid is important in the development of contingency plans for maintaining reliability during periods when the grid is stressed. In this paper, heat-related grid stress and the relationship with weather conditions were examined using data from the eastern United States. Penalized logistic regression models were developed and applied to predict stress on the electric grid using weather data. The inclusion of other weather variables, such as precipitation, in addition to temperature improved model performance. Several candidate models and combinations of predictive variables were examined. A penalized logistic regression model which was fit at the operation-zone level was found to provide predictive value and interpretability. Additionally, the importance of different weather variables observed at various time scales was examined. Maximum temperature and precipitation were identified as important across all zones while the importance of other weather variables was zone specific. In conclusion, the methods presented in this work are extensible to other regions and can be used to aid in planning and development of the electrical grid.

  14. Evaluating penalized logistic regression models to predict Heat-Related Electric grid stress days

    DOE PAGES

    Bramer, Lisa M.; Rounds, J.; Burleyson, C. D.; ...

    2017-09-22

    Understanding the conditions associated with stress on the electricity grid is important in the development of contingency plans for maintaining reliability during periods when the grid is stressed. In this paper, heat-related grid stress and the relationship with weather conditions were examined using data from the eastern United States. Penalized logistic regression models were developed and applied to predict stress on the electric grid using weather data. The inclusion of other weather variables, such as precipitation, in addition to temperature improved model performance. Several candidate models and combinations of predictive variables were examined. A penalized logistic regression model which was fit at the operation-zone level was found to provide predictive value and interpretability. Additionally, the importance of different weather variables observed at various time scales was examined. Maximum temperature and precipitation were identified as important across all zones while the importance of other weather variables was zone specific. In conclusion, the methods presented in this work are extensible to other regions and can be used to aid in planning and development of the electrical grid.

  15. Subgrid Modeling Geomorphological and Ecological Processes in Salt Marsh Evolution

    NASA Astrophysics Data System (ADS)

    Shi, F.; Kirby, J. T., Jr.; Wu, G.; Abdolali, A.; Deb, M.

    2016-12-01

    Numerically modeling the long-term evolution of salt marshes is challenging because it requires an extensive use of computational resources. Due to the presence of narrow tidal creeks, variations of salt marsh topography can be significant over spatial length scales on the order of a meter. With the growing availability of high-resolution bathymetry measurements, like LiDAR-derived DEM data, it is increasingly desirable to run a high-resolution model in a large domain and for a long period of time to get trends of sedimentation patterns, morphological change and marsh evolution. However, high spatial resolution poses a big challenge in both computational time and memory storage when simulating a salt marsh with dimensions of up to O(100 km^2) with a small time step. In this study, we have developed a so-called Pre-storage, Sub-grid Model (PSM, Wu et al., 2015) for simulating flooding and draining processes in salt marshes. The simulation of Brokenbridge salt marsh, Delaware, shows that, with the combination of the sub-grid model and the pre-storage method, a computational speed-up of over two orders of magnitude can be achieved with minimal loss of model accuracy. We recently extended PSM to include a sediment transport component and models for biomass growth and sedimentation in the sub-grid model framework. The sediment transport model is formulated based on a newly derived sub-grid sediment concentration equation following Defina's (2000) area-averaging procedure. Suspended sediment transport is modeled by the advection-diffusion equation at the coarse grid level, but the local erosion and sedimentation rates are integrated at the sub-grid level. The morphological model is based on the existing morphological model in NearCoM (Shi et al., 2013), extended to include organic production from the biomass model. The vegetation biomass is predicted by a simple logistic equation model proposed by Marani et al. (2010). The biomass component is loosely coupled with the hydrodynamic and sedimentation models owing to the different time scales of the physical and ecological processes. The coupled model is being applied to Delaware marsh evolution in response to rising sea level and changing sediment supplies.
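
    The biomass component mentioned above follows a simple logistic growth law, dB/dt = r B (1 - B/Bmax), which can be stepped on the slower ecological time scale and passed to the sedimentation model. The sketch below uses forward Euler and illustrative parameter values; the coefficients are not those of the Delaware application.

        # Sketch of the logistic biomass model of Marani et al. (2010),
        # dB/dt = r * B * (1 - B / Bmax), stepped with forward Euler on the slower
        # ecological time scale. Parameter values are illustrative only.

        def step_biomass(B, r=0.01, B_max=2.0, dt=1.0):
            """Advance biomass density B (kg/m^2) one step of the logistic equation."""
            return B + dt * r * B * (1.0 - B / B_max)

        B = 0.1                      # initial biomass density
        for day in range(365):       # biomass evolves much more slowly than the tides
            B = step_biomass(B)
        print(f"biomass after one year: {B:.3f} kg/m^2")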

  16. The distribution of cultural and biological diversity in Africa.

    PubMed Central

    Moore, Joslin L; Manne, Lisa; Brooks, Thomas; Burgess, Neil D; Davies, Robert; Rahbek, Carsten; Williams, Paul; Balmford, Andrew

    2002-01-01

    Anthropologists, biologists and linguists have all noted an apparent coincidence in species diversity and human cultural or linguistic diversity. We present, to our knowledge, one of the first quantitative descriptions of this coincidence and show that, for 2 degrees x 2 degrees grid cells across sub-Saharan Africa, cultural diversity and vertebrate species diversity exhibit marked similarities in their overall distribution. In addition, we show that 71% of the observed variation in species richness and 36% in language richness can be explained on the basis of environmental factors, suggesting that similar factors, especially those associated with rainfall and productivity, affect the distributions of both species and languages. Nevertheless, the form of the relationships between species richness and language richness and environmental factors differs, and it is unlikely that comparable mechanisms underpin the similar patterns of species and language richness. Moreover, the fact that the environmental factors considered here explain less than half of the variation in language richness indicates that other factors, many of which are likely to be historical or social, also influence the distribution of languages. PMID:12204124

  17. DRACO development for 3D simulations

    NASA Astrophysics Data System (ADS)

    Fatenejad, Milad; Moses, Gregory

    2006-10-01

    The DRACO (r-z) Lagrangian radiation-hydrodynamics laser fusion simulation code is being extended to model 3D hydrodynamics in (x-y-z) coordinates with hexahedral cells on a structured grid. The equation of motion is solved with a Lagrangian update with optional rezoning. The fluid equations are solved using an explicit scheme based on (Schulz, 1964), while the SALE-3D algorithm (Amsden, 1981) is used as a template for computing cell volumes and other quantities. A second-order rezoner has been added which uses linear interpolation of the underlying continuous functions to preserve accuracy (Van Leer, 1976). Artificial restoring force terms and smoothing algorithms are used to avoid grid distortion in high-aspect-ratio cells. These include alternate node couplers along with a rotational restoring force based on the Tensor Code (Maenchen, 1964). Electron and ion thermal conduction is modeled using an extension of Kershaw's method (Kershaw, 1981) to 3D geometry. Test problem simulations will be presented to demonstrate the applicability of this new version of DRACO to the study of fluid instabilities in three dimensions.

  18. Laser polymerization-based novel lift-off technique

    NASA Astrophysics Data System (ADS)

    Bhuian, B.; Winfield, R. J.; Crean, G. M.

    2009-03-01

    The fabrication of microstructures by two-photon polymerization has been widely reported as a means of directly writing three-dimensional nanoscale structures. In the majority of cases a single point serial writing technique is used to form a polymer model. Single layer writing can also be used to fabricate two-dimensional patterns and we report an extension of this capability by using two-photon polymerization to form a template that can be used as a sacrificial layer for a novel lift-off process. A Ti:sapphire laser, with wavelength 795 nm, 80 MHz repetition rate, 100 fs pulse duration and an average power of 700 mW, was used to write 2D grid patterns with pitches of 0.8 and 1.0 μm in a urethane acrylate resin that was spun on to a lift-off base layer. This was overcoated with gold and the grid lifted away to leave an array of gold islands. The optical transmission properties of the gold arrays were measured and found to be in agreement with a rigorous coupled-wave analysis simulation.

  19. Loci-STREAM Version 0.9

    NASA Technical Reports Server (NTRS)

    Wright, Jeffrey; Thakur, Siddharth

    2006-01-01

    Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure- based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.
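
    As an indication of what the PETSc interface provides, the sketch below assembles a small sparse system (a 1-D Laplacian standing in for a pressure-correction equation) and hands it to a PETSc Krylov solver through petsc4py. This is a generic petsc4py usage example under assumed matrix contents and solver settings, not Loci-STREAM's actual interface code.

        # Hedged sketch of the kind of linear solve a PETSc interface supplies: a
        # sparse system handed to a Krylov solver via petsc4py. Matrix contents and
        # solver choices are illustrative assumptions.

        from petsc4py import PETSc

        n = 100
        A = PETSc.Mat().createAIJ([n, n])           # sparse matrix in AIJ (CSR) format
        A.setUp()
        for i in range(n):
            A.setValue(i, i, 2.0)
            if i > 0:
                A.setValue(i, i - 1, -1.0)
            if i < n - 1:
                A.setValue(i, i + 1, -1.0)
        A.assemble()

        b = PETSc.Vec().createSeq(n)
        b.set(1.0)                                  # right-hand side
        x = b.duplicate()

        ksp = PETSc.KSP().create()                  # Krylov solver context
        ksp.setOperators(A)
        ksp.setType("gmres")
        ksp.getPC().setType("ilu")                  # simple preconditioner choice
        ksp.solve(b, x)

        print("iterations:", ksp.getIterationNumber(), "residual norm:", ksp.getResidualNorm())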

  20. Differentiated protection services with failure probability guarantee for workflow-based applications

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2010-12-01

    A cost-effective and service-differentiated provisioning strategy is very desirable to service providers so that they can offer users satisfactory services, while optimizing network resource allocation. Providing differentiated protection services to connections for surviving link failure has been extensively studied in recent years. However, the differentiated protection services for workflow-based applications, which consist of many interdependent tasks, have scarcely been studied. This paper investigates the problem of providing differentiated services for workflow-based applications in optical grid. In this paper, we develop three differentiated protection services provisioning strategies which can provide security level guarantee and network-resource optimization for workflow-based applications. The simulation demonstrates that these heuristic algorithms provide protection cost-effectively while satisfying the applications' failure probability requirements.
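
    The quantity such strategies must bound can be illustrated directly: under independent link failures, the failure probability of a path is one minus the product of its links' survival probabilities, and a dedicated backup reduces service failure to the event that both paths fail. The link probabilities in the sketch below are illustrative placeholders, not values from the paper.

        # Small sketch of the failure-probability bookkeeping behind differentiated
        # protection, assuming independent link failures. Probabilities are placeholders.

        from math import prod

        def path_failure_probability(link_failure_probs):
            """P(path fails) = 1 - product over links of (1 - p_link)."""
            return 1.0 - prod(1.0 - p for p in link_failure_probs)

        primary = [0.01, 0.02, 0.015]        # per-link failure probabilities, working path
        backup = [0.02, 0.01]                # per-link failure probabilities, backup path

        p_unprotected = path_failure_probability(primary)
        # With a dedicated backup, the service fails only if both paths fail (independence assumed).
        p_protected = p_unprotected * path_failure_probability(backup)

        print(f"unprotected: {p_unprotected:.4f}, protected: {p_protected:.6f}")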

  1. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  2. GRAMS: A Grid of RSG and AGB Models

    NASA Astrophysics Data System (ADS)

    Srinivasan, S.; Sargent, B. A.; Meixner, M.

    2011-09-01

    We present a grid of oxygen- and carbon-rich circumstellar dust radiative transfer models for asymptotic giant branch (AGB) and red supergiant (RSG) stars. The grid samples a large region of the relevant parameter space, and it allows for a quick calculation of bolometric fluxes and dust mass-loss rates from multi-wavelength photometry. This method of fitting observed spectral energy distributions (SEDs) is preferred over detailed radiative transfer calculations, especially for large data sets such as the SAGE (Surveying the Agents of a Galaxy's Evolution) survey of the Magellanic Clouds. The mass-loss rates calculated for SAGE data will allow us to quantify the dust returned to the interstellar medium (ISM) by the entire AGB population. The total injection rate provides an important constraint for models of galactic chemical evolution. Here, we discuss our carbon star models and compare the results to SAGE observations in the Large Magellanic Cloud (LMC).
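
    The grid-fitting step amounts to a chi-square comparison between observed photometry and the synthetic photometry of each precomputed model, with a free scale factor. The miniature grid, band fluxes and uncertainties below are synthetic placeholders used only to illustrate that selection step; GRAMS itself is a far larger model grid.

        # Sketch of grid-based SED fitting: pick the precomputed model whose synthetic
        # photometry minimizes chi-square against the observed fluxes, allowing a free
        # multiplicative scale. All numbers below are synthetic placeholders.

        import numpy as np

        observed = np.array([12.0, 30.0, 22.0, 9.0])       # band fluxes (arbitrary units)
        errors = np.array([1.0, 2.0, 2.0, 1.0])

        model_grid = np.array([                             # one row of band fluxes per model
            [10.0, 28.0, 25.0, 10.0],
            [12.5, 31.0, 21.0, 8.5],
            [20.0, 18.0, 10.0, 4.0],
        ])

        def chi2(model, obs, err):
            # best-fitting multiplicative scale (analytic least-squares solution)
            scale = np.sum(model * obs / err**2) / np.sum(model**2 / err**2)
            return np.sum(((obs - scale * model) / err) ** 2)

        scores = [chi2(m, observed, errors) for m in model_grid]
        best = int(np.argmin(scores))
        print("best model index:", best, "chi2:", scores[best])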

  3. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

    Grid-like technologies and approaches became an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the greater the burden on operations, and the other way around. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. The reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us. If no standardisation and convergence efforts take place, the Grid will remain limited to HEP; if, however, the current multitude of Grid-like systems converges to a generic, modular and extensible solution, the Grid will become true to its name.

  4. Wildlife-friendly farming benefits rare birds, bees and plants.

    PubMed

    Pywell, Richard F; Heard, Matthew S; Bradbury, Richard B; Hinsley, Shelley; Nowakowski, Marek; Walker, Kevin J; Bullock, James M

    2012-10-23

    Agricultural intensification is a leading cause of global biodiversity loss, especially for threatened and near-threatened species. One widely implemented response is 'wildlife-friendly farming', involving the close integration of conservation and extensive farming practices within agricultural landscapes. However, the putative benefits from this controversial policy are currently either unknown or thought unlikely to extend to rare and declining species. Here, we show that new, evidence-based approaches to habitat creation on intensively managed farmland in England can achieve large increases in plant, bee and bird species. In particular, we found that habitat enhancement methods designed to provide the requirements of sensitive target biota consistently increased the richness and abundance of both rare and common species, with 10-fold to greater than 100-fold more rare species per sample area than generalized conventional conservation measures. Furthermore, targeting landscapes of high species richness amplified beneficial effects on the least mobile taxa: plants and bees. Our results provide the first unequivocal support for a national wildlife-friendly farming policy and suggest that this approach should be implemented much more extensively to address global biodiversity loss. However, to be effective, these conservation measures must be evidence-based, and developed using sound knowledge of the ecological requirements of key species.

  5. Minimizing Dispersion in FDTD Methods with CFL Limit Extension

    NASA Astrophysics Data System (ADS)

    Sun, Chen

    The CFL extension in FDTD methods is receiving considerable attention in order to reduce the computational effort and save the simulation time. One of the major issues in the CFL extension methods is the increased dispersion. We formulate a decomposition of FDTD equations to study the behaviour of the dispersion. A compensation scheme to reduce the dispersion in CFL extension is constructed and proposed. We further study the CFL extension in a FDTD subgridding case, where we improve the accuracy by acting only on the FDTD equations of the fine grid. Numerical results confirm the efficiency of the proposed method for minimising dispersion.
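
    For context, the conventional stability bound that CFL-extension schemes are designed to exceed is the standard limit of the explicit Yee scheme on a uniform 3-D grid, Δt ≤ 1/(c √(1/Δx² + 1/Δy² + 1/Δz²)). The sketch below simply evaluates this bound; the cell size is an illustrative assumption.

        # Standard CFL stability limit of the explicit 3-D Yee FDTD scheme; CFL-extension
        # methods run with a time step larger than this bound and must then control the
        # added dispersion. The grid spacing below is an arbitrary example.

        import math

        C0 = 299_792_458.0                    # speed of light in vacuum (m/s)

        def fdtd_cfl_limit(dx, dy, dz, c=C0):
            """Largest stable time step of the standard 3-D Yee scheme."""
            return 1.0 / (c * math.sqrt(1.0 / dx**2 + 1.0 / dy**2 + 1.0 / dz**2))

        dt_max = fdtd_cfl_limit(1e-3, 1e-3, 1e-3)    # 1 mm cells
        print(f"conventional CFL limit: {dt_max:.3e} s")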

  6. Improving National Capability in Biogeochemical Flux Modelling: the UK Environmental Virtual Observatory (EVOp)

    NASA Astrophysics Data System (ADS)

    Johnes, P.; Greene, S.; Freer, J. E.; Bloomfield, J.; Macleod, K.; Reaney, S. M.; Odoni, N. A.

    2012-12-01

    The best outcomes from watershed management arise where policy and mitigation efforts are underpinned by strong science evidence, but there are major resourcing problems associated with the scale of monitoring needed to effectively characterise the sources, rates and impacts of nutrient enrichment nationally. The challenge is to increase national capability in predictive modelling of nutrient flux to waters, securing an effective mechanism for transferring knowledge and management tools from data-rich to data-poor regions. The inadequacy of existing tools and approaches to address these challenges provided the motivation for the Environmental Virtual Observatory programme (EVOp), an innovation from the UK Natural Environment Research Council (NERC). EVOp is exploring the use of a cloud-based infrastructure in catchment science, developing an exemplar to explore N and P fluxes to inland and coastal waters in the UK from grid to catchment and national scale. EVOp is bringing together for the first time national data sets, models and uncertainty analysis into cloud computing environments to explore and benchmark current predictive capability for national-scale biogeochemical modelling. The objective is to develop national biogeochemical modelling capability, capitalising on extensive national investment in the development of science understanding and modelling tools to support integrated catchment management, and supporting knowledge transfer from data-rich to data-poor regions. The AERC export coefficient model (Johnes et al., 2007) has been adapted to function within the EVOp cloud environment, and on a geoclimatic basis, using a range of high-resolution, geo-referenced digital datasets as an initial demonstration of the enhanced national capacity for N and P flux modelling using cloud computing infrastructure. Geoclimatic regions, landscape units displaying homogeneous or quasi-homogeneous functional behaviour in terms of process controls on N and P cycling, underpin this approach (Johnes & Butterfield, 2002). Ten regions have been defined across the UK using GIS manipulation of spatial data describing hydrogeology, runoff, topographical slope and soil parent material. The export coefficient model operates within this regional modelling framework, providing mapped, tabulated and statistical outputs at scales from the 1 km2 grid to river catchment, WFD river basin district, major coastal drainage units to the North Sea, North Atlantic and English Channel, and the international reporting units defined under OSPAR, the International Convention for the protection of the marine environment of the North-East Atlantic. Here the geoclimatic modelling framework is presented together with modelled fluxes of N and P for each scale of reporting unit, along with scenario analysis applied at regional scale and mapped at national scale. The ways in which the results can be used to further explore the primary drivers of spatial variation and to identify waterbodies at risk, especially in unmonitored and data-poor catchments, are discussed, and the technical and computational support of a cloud-based infrastructure is evaluated as a mechanism to explore potential water quality impacts of future mitigation strategies applied at catchment to national scale.
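
    An export coefficient calculation of the kind the AERC model performs can be sketched simply: the annual nutrient load of a reporting unit is the sum over sources of an export coefficient multiplied by the extent of that source, plus any point-source inputs. The land-use categories, coefficient values and areas below are placeholders, not the calibrated AERC coefficients.

        # Hedged sketch of an export-coefficient nutrient-load calculation:
        # load = sum over land uses of (coefficient x area) + point sources.
        # Coefficients and areas are illustrative placeholders only.

        # kg N exported per hectare per year for each land use (illustrative values)
        n_export_coefficients = {"arable": 20.0, "grassland": 8.0, "woodland": 2.0, "urban": 5.0}

        def annual_n_load(areas_ha, coefficients, point_sources_kg=0.0):
            """Total annual nitrogen flux (kg N/yr) for one catchment or grid cell."""
            diffuse = sum(coefficients[lu] * area for lu, area in areas_ha.items())
            return diffuse + point_sources_kg

        catchment = {"arable": 1200.0, "grassland": 800.0, "woodland": 300.0, "urban": 150.0}
        print("N load (kg/yr):", annual_n_load(catchment, n_export_coefficients,
                                               point_sources_kg=5000.0))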

  7. Aligning PEV Charging Times with Electricity Supply and Demand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Cabell

    Plug-in electric vehicles (PEVs) are a growing source of electricity consumption that could either exacerbate supply shortages or smooth electricity demand curves. Extensive research has explored how vehicle-grid integration (VGI) can be optimized by controlling PEV charging timing or providing vehicle-to-grid (V2G) services, such as storing energy in vehicle batteries and returning it to the grid at peak times. While much of this research has modeled charging, implementation in the real world requires a cost-effective solution that accounts for consumer behavior. To function across different contexts, several types of charging administrators and methods of control are necessary to minimize costs in the VGI context.

  8. The other prey-capture silk: Fibres made by glow-worms (Diptera: Keroplatidae) comprise cross-β-sheet crystallites in an abundant amorphous fraction.

    PubMed

    Walker, Andrew A; Weisman, Sarah; Trueman, Holly E; Merritt, David J; Sutherland, Tara D

    2015-09-01

    Glow-worms (larvae of dipteran genus Arachnocampa) are restricted to moist habitats where they capture flying prey using snares composed of highly extensible silk fibres and sticky mucus droplets. Little is known about the composition or structure of glow-worm snares, or the extent of possible convergence between glow-worm and arachnid capture silks. We characterised Arachnocampa richardsae silk and mucus using X-ray scattering, Fourier transform infrared spectroscopy and amino acid analysis. Silk but not mucus contained crystallites of the cross-β-sheet type, which occur in unrelated insect silks but have not been reported previously in fibres used for prey capture. Mucus proteins were rich in Gly (28.5%) and existed predominantly in a random coil structure, typical of many adhesive proteins. In contrast, the silk fibres were unusually rich in charged and polar residues, particularly Lys (18.1%), which we propose is related to their use in a highly hydrated state. Comparison of X-ray scattering, infrared spectroscopy and amino acid analysis data suggests that silk fibres contain a high fraction of disordered protein. We suggest that in the native hydrated state, silk fibres are capable of extension via deformation of both disordered regions and cross-β-sheet crystallites, and that high extensibility is an adaptation promoting successful prey capture. This study illustrates the rich variety of protein motifs that are available for recruitment into biopolymers, and how convergently evolved materials can nevertheless be based on fundamentally different protein structures. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  9. Sampling designs matching species biology produce accurate and affordable abundance indices

    PubMed Central

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by the number of traps, their location (traps placed randomly, systematically or by expert opinion), and whether traps were stationary or moved between capture sessions. We began by identifying when to sample and whether bears had an equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions, which raised capture probabilities. The grid design was least biased (−10.5%), but imprecise (CV 21.2%), and used the most effort (16,100 trap-nights). The targeted configuration was more biased (−17.3%), but most precise (CV 12.3%), with the least effort (7,000 trap-nights). Targeted sampling generated encounter rates four times higher, and capture and recapture probabilities 11% and 60% higher, than grid sampling, in a sampling frame 88% smaller. Bears had an unequal probability of capture with both sampling designs, partly because some bears never had traps available to sample them. Hence, grid and targeted sampling generated abundance indices, not estimates. Overall, targeted sampling provided the most accurate and affordable design to index abundance. Targeted sampling may offer an alternative method to index the abundance of other species inhabiting expansive and inaccessible landscapes elsewhere, provided they are attracted to resource concentrations. PMID:24392290

  10. Water withdrawals reduce native fish diversity across the sunbelt of the US

    NASA Astrophysics Data System (ADS)

    Sabo, J. L.; Bowling, L. C.; Roath, J.; Sinha, T.; Kominoski, J.; Fuller, P.

    2012-12-01

    Water withdrawals for urban, industrial and agricultural uses are known to have negative effects on freshwater biodiversity, but this conclusion is based largely on a small number of place based studies. In this talk we will present results from a continental scale analysis of water withdrawals on the species richness of native and non-native fishes in the coterminous US. To do this we compiled data from the USGS on water withdrawals and the species richness of non-native fishes. We obtained data on the native fish species richness from NatureServe's native fish database. We also compiled spatial data on cropland area and urban impervious surfaces. Finally, we used gridded estimates of streamflow from the Variable Infiltration Capacity model and a routing model to estimate streamflow (less upstream water withdrawal). We estimate the water stress index (WSI) as withdrawals standardized by streamflow (local and upstream deliveries) and use this as a metric of sustainability of human water use. All data were compiled at the spatial resolution of 8-digit hydrologic unit code hydrologic accounting units. Our key finding is that human water use (WSI)--and not impervious surfaces or cropland area--has a strong negative effect on native, but not non-native biodiversity in rivers. This result was robust across the US sunbelt but weaker across the coterminous US. Our result suggests that the effects of cities and farms on native freshwater fauna are outweighed by the upstream and cross-basin extraction of water to support these land uses.

  11. Progress in Grid Generation: From Chimera to DRAGON Grids

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Kao, Kai-Hsiung

    1994-01-01

    Hybrid grids, composed of structured and unstructured grids, combine the best features of both. The chimera method is a major stepping stone toward a hybrid grid, and the present approach evolved from it. A chimera grid comprises a set of overlapping structured grids which are independently generated and body-fitted, yielding a high-quality grid readily accessible for efficient solution schemes. The chimera method has been shown to be efficient for generating grids about complex geometries and has been demonstrated to deliver accurate aerodynamic predictions of complex flows. While its geometrical flexibility is attractive, interpolation of data in the overlapped regions - which in today's practice in 3D is done in a nonconservative fashion - is not. In the present paper we propose a hybrid grid scheme that maximizes the advantages of the chimera scheme and adapts the strengths of the unstructured grid while at the same time keeping its weaknesses minimal. Like the chimera method, we first divide up the physical domain by a set of structured body-fitted grids which are separately generated and overlaid throughout a complex configuration. To eliminate any pure data manipulation which does not necessarily follow the governing equations, we use unstructured grids only to directly replace the region of the arbitrarily overlapped grids. This new adaptation of the chimera thinking is coined the DRAGON grid. The unstructured grid region sandwiched between the structured grids is limited in size, resulting in only a small increase in memory and computational effort. The DRAGON method has three important advantages: (1) preserving the strengths of the chimera grid; (2) eliminating difficulties sometimes encountered in the chimera scheme, such as orphan points and poor-quality interpolation stencils; and (3) making grid communication fully conservative and consistent insofar as the governing equations are concerned. To demonstrate its use, the governing equations are discretized using the newly proposed flux scheme, AUSM+, which is briefly described herein. Numerical tests on representative 2D inviscid flows are given for demonstration. Finally, extension to 3D is underway, paced only by the availability of the 3D unstructured grid generator.

  12. Extending OPeNDAP's Data-Access Protocol to Include Enhanced Pre-Retrieval Operations

    NASA Astrophysics Data System (ADS)

    Fulker, D. W.

    2013-12-01

    We describe plans to extend OPeNDAP's Web-services protocol as a Building Block for NSF's EarthCube initiative. Though some data-access services have offered forms of subset-selection for decades, other pre-retrieval operations have been unavailable, in part because their benefits (over equivalent post-retrieval actions) are only now becoming fully evident. This is due in part to rapid growth in the volumes of data that are pertinent to the geosciences, exacerbated by limitations such as Internet speeds and latencies as well as pressures toward data usage on ever-smaller devices. In this context, as recipients of a "Building Blocks" award from the most recent round of EarthCube funding, we are launching the specification and prototype implementation of a new Open Data Services Invocation Protocol (ODSIP), by which clients may invoke a newly rich set of data-acquisition services, ranging from statistical summarization and criteria-driven subsetting to re-gridding/resampling. ODSIP will be an extension to DAP4, the latest version of OPeNDAP's widely used data access protocol, which underpins a number of open-source, multilingual, client-server systems (offering data access as a Web service), including THREDDS, PyDAP, GrADS, ERDAP and FERRET, as well as OPeNDAP's own Hyrax servers. We are motivated by the idea that key parts of EarthCube can be built effectively around clients and servers that employ a common and conceptually rich protocol for data acquisition. This concept extends 'data provision' to include pre-retrieval operations that, even when invoked by remote clients, exhibit efficiencies of data-proximate computation. Our aim for ODSIP is to embed a largely domain-neutral algebra of server functions that, despite being deliberately compact, can fulfill a broad range of user needs for pre-retrieval operations. To that end, our approach builds upon languages and tools that have proven effective in multi-domain contexts, and we will employ a user-centered design process built around three science scenarios: 1) accelerated visualization/analysis of model outputs on non-rectangular meshes (over coastal North Carolina); 2) dynamic downscaling of climate predictions for regional utility (over Hawaii); and 3) feature-oriented retrievals of satellite imagery (focusing on satellite-derived sea-surface-temperature fronts). These scenarios will test important aspects of the server-function algebra:
    * The Hawaii climate study requires coping with issues of scale on rectangular grids, placing strong emphasis on statistical functions.
    * The east-coast storm-surge study requires irregular grids, thus exploring mathematical challenges that have been addressed in many domains via the GridFields library, which we will employ. We think important classes of geoscience problems in multiple domains--where dealing with discontinuities, for example--are essentially intractable without polygonal meshes.
    * The sea-surface fronts study integrates vector-style features with array-style coverages, thus touching on the kinds of mathematics that arise when mixing Eulerian and Lagrangian frameworks.
    Our presentation will sketch the context for ODSIP, our process for a user-centered design, and our hopes for how ODSIP, as an emerging cyberinfrastructure concept for the Geosciences, may serve as a fundamental building block for EarthCube.
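
    For orientation, the sketch below shows today's client-side pattern that ODSIP would generalize: a DAP2-style constraint expression appended to a dataset URL asks the server to subset a variable before anything is transferred, so only the selected hyperslab crosses the network. The endpoint URL and variable name are hypothetical placeholders.

        # Hedged sketch of a server-side subset request using the DAP2 constraint
        # syntax (index ranges written as [start:stride:stop]). The endpoint and
        # variable name are placeholders; ODSIP would add richer server functions
        # (statistics, regridding) on top of this pattern.

        import requests

        base = "http://example.org/opendap/sst_climatology.nc"   # hypothetical dataset URL
        constraint = "SST[0:1:0][10:1:19][10:1:19]"              # one time step, 10x10 spatial window

        response = requests.get(f"{base}.ascii?{constraint}", timeout=30)
        response.raise_for_status()
        print(response.text[:300])    # the server returns only the requested hyperslab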

  13. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of strong ground-motion sensor networks, earthquake data centers and seismic wave propagation analysis over the gLite e-Science Infrastructure, we can gain much better knowledge of the impact and vulnerability of potential earthquake hazards. On the other hand, this application also demonstrated the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks could aid in fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and also to support seismology research by e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among 5 European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services based on users' workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services and user communities will be implemented based on typical use cases. In the future, extension of the earthquake wave propagation analysis to tsunami mitigation would be feasible once the user community support is in place.

  14. Early stage fatigue damage occurs in bovine tendon fascicles in the absence of changes in mechanics at either the gross or micro-structural level

    PubMed Central

    Shepherd, Jennifer H.; Riley, Graham P.; Screen, Hazel R.C.

    2014-01-01

    Many tendon injuries are believed to result from repetitive motion or overuse, leading to the accumulation of micro-damage over time. In vitro fatigue loading can be used to characterise damage during repeated use and investigate how this may relate to the aetiology of tendinopathy. This study considered the effect of fatigue loading on fascicles from two functionally distinct bovine tendons: the digital extensor and deep digital flexor. Micro-scale extension mechanisms were investigated in fascicles before or after a period of cyclic creep loading, comparing two different measurement techniques – the displacement of a photo-bleached grid and the use of nuclei as fiducial markers. Whilst visual damage was clearly identified after only 300 cycles of creep loading, these visual changes did not affect either gross fascicle mechanics or fascicle microstructural extension mechanisms over the 900 fatigue cycles investigated. However, significantly greater fibre sliding was measured when observing grid deformation rather than the analysis of nuclei movement. Measurement of microstructural extension with both techniques was localised and this may explain the absence of change in microstructural deformation in response to fatigue loading. Alternatively, the data may demonstrate that fascicles can withstand a degree of matrix disruption with no impact on mechanics. Whilst use of a photo-bleached grid to directly measure the collagen is the best indicator of matrix deformation, nuclei tracking may provide a better measure of the strain perceived directly by the cells. PMID:25001495

  15. IGI (the Italian Grid initiative) and its impact on the Astrophysics community

    NASA Astrophysics Data System (ADS)

    Pasian, F.; Vuerli, C.; Taffoni, G.

    IGI - the Association for the Italian Grid Infrastructure - has been established as a consortium of 14 different national institutions to provide long term sustainability to the Italian Grid. Its formal predecessor, the Grid.it project, came to a close in 2006; to extend the benefits of this project, IGI has taken over and acts as the national coordinator for the different sectors of the Italian e-Infrastructure present in EGEE. IGI plans to support activities in a vast range of scientific disciplines - e.g. Physics, Astrophysics, Biology, Health, Chemistry, Geophysics, Economy, Finance - and any possible extensions to other sectors such as Civil Protection, e-Learning, and dissemination in Universities and secondary schools. Among these, the Astrophysics community is active as a user, by porting applications of various kinds, but also as a resource provider in terms of computing power and storage, and as a middleware developer.

  16. A grid-enabled web service for low-resolution crystal structure refinement.

    PubMed

    O'Donovan, Daniel J; Stokes-Rees, Ian; Nam, Yunsun; Blacklow, Stephen C; Schröder, Gunnar F; Brunger, Axel T; Sliz, Piotr

    2012-03-01

    Deformable elastic network (DEN) restraints have proved to be a powerful tool for refining structures from low-resolution X-ray crystallographic data sets. Unfortunately, optimal refinement using DEN restraints requires extensive calculations and is often hindered by a lack of access to sufficient computational resources. The DEN web service presented here intends to provide structural biologists with access to resources for running computationally intensive DEN refinements in parallel on the Open Science Grid, the US cyberinfrastructure. Access to the grid is provided through a simple and intuitive web interface integrated into the SBGrid Science Portal. Using this portal, refinements combined with full parameter optimization that would take many thousands of hours on standard computational resources can now be completed in several hours. An example of the successful application of DEN restraints to the human Notch1 transcriptional complex using the grid resource, and summaries of all submitted refinements, are presented as justification.

  17. Convergence experiments with a hydrodynamic model of Port Royal Sound, South Carolina

    USGS Publications Warehouse

    Lee, J.K.; Schaffranek, R.W.; Baltzer, R.A.

    1989-01-01

    A two-dimensional, depth-averaged, finite-difference, flow/transport model, SIM2D, is being used to simulate tidal circulation and transport in the Port Royal Sound, South Carolina, estuarine system. Models of a subregion of the Port Royal Sound system have been derived from an earlier-developed model of the entire system having a grid size of 600 ft. The submodels were implemented with grid sizes of 600, 300, and 150 ft in order to determine the effects of changes in grid size on computed flows in the subregion, which is characterized by narrow channels and extensive tidal flats that flood and dewater with each rise and fall of the tide. Tidal amplitudes changed by less than 5 percent as the grid size was decreased. Simulations were performed with the 300-foot submodel for time steps of 60, 30, and 15 s. Study results are discussed.

  18. Rotational-translational Fourier imaging system

    NASA Technical Reports Server (NTRS)

    Campbell, Jonathan W. (Inventor)

    2004-01-01

    This invention has the ability to create Fourier-based images with only two grid pairs. The two grid pairs are manipulated in a manner that allows (1) a first grid pair to provide multiple real components of the Fourier-based image and (2) a second grid pair to provide multiple imaginary components of the Fourier-based image. The novelty of this invention resides in the use of only two grid pairs to provide the same imaging information that has been traditionally collected with multiple grid pairs.

  19. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  20. Flow characteristics and scaling past highly porous wall-mounted fences

    NASA Astrophysics Data System (ADS)

    Rodríguez-López, Eduardo; Bruce, Paul J. K.; Buxton, Oliver R. H.

    2017-07-01

    An extensive characterization of the flow past wall-mounted highly porous fences based on single- and multi-scale geometries has been performed using hot-wire anemometry in a low-speed wind tunnel. Whilst drag properties (estimated from the time-averaged momentum equation) seem to be mostly dependent on the grids' blockage ratio, the wakes of bars of different sizes and orientations seem to generate distinct behaviours regarding turbulence properties. Far from the near-grid region, the flow is dominated by the presence of two well-differentiated layers: one close to the wall dominated by the near-wall behaviour and another one corresponding to the grid's wake and the shear layer originating between this and the freestream. It is proposed that the effective thickness of the wall layer can be inferred from the wall-normal profile of root-mean-square streamwise velocity or, alternatively, from the wall-normal profile of streamwise velocity correlation. Using these definitions of wall-layer thickness enables us to collapse different trends of the turbulence behaviour inside this layer. In particular, the root-mean-square level of the wall shear stress fluctuations, the longitudinal integral length scale, and the spanwise turbulent structure are shown to display a satisfactory scaling with this thickness rather than with the whole thickness of the grid's wake. Moreover, it is shown that certain grids destroy the spanwise arrangement of large turbulence structures in the logarithmic region, which are then re-formed after a particular streamwise extent. It is finally shown that for fences subject to a boundary layer of thickness comparable to their height, the effective thickness of the wall layer scales with the incoming boundary layer thickness. Analogously, it is hypothesized that the growth rate of the internal layer is also partly dependent on the incoming boundary layer thickness.

  1. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address its reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most of the electricity problems occur. In this work, we will examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in conducting this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of the automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures will be the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of the Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, which is intelligent power distribution software developed by General Reliability Co.
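
    Since the analysis is reported through standard reliability indices, a minimal sketch of how SAIFI, SAIDI, and EUE are commonly computed from outage records may help; the outage list and customer count below are illustrative placeholders, not results from the IEEE 34 node feeder study or the DISREL software.

        # Illustrative reliability-index calculation (not the DISREL implementation).
        # Each outage record: (customers_interrupted, duration_hours, unserved_energy_kWh).
        outages = [(120, 1.5, 90.0), (40, 0.5, 12.0), (300, 2.0, 450.0)]   # hypothetical events
        total_customers = 1000                                             # hypothetical system size

        saifi = sum(c for c, _, _ in outages) / total_customers            # interruptions per customer
        saidi = sum(c * d for c, d, _ in outages) / total_customers        # interruption hours per customer
        eue = sum(e for _, _, e in outages)                                # unserved energy, kWh

        print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.3f} h, EUE = {eue:.1f} kWh")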

  2. Latitudinal Diversity Gradients in New World Bats: Are They a Consequence of Niche Conservatism?

    PubMed Central

    Ramos Pereira, Maria João; Palmeirim, Jorge M.

    2013-01-01

    The increase in species diversity from the Poles to the Equator is a major biogeographic pattern, but the mechanisms underlying it remain obscure. Our aim is to contribute to their clarification by describing the latitudinal gradients in species richness and in evolutionary age of species of New World bats, and testing if those patterns may be explained by the niche conservatism hypothesis. Maps of species ranges were used to estimate species richness in a 100 x 100 km grid. Root distances in a molecular phylogeny were used as a proxy for the age of species, and the mean root distance of the species in each cell of the grid was estimated. Generalised additive models were used to relate latitude with both species richness and mean root distance. This was done for each of the three most speciose bat families and for all Chiroptera combined. Species richness increases towards the Equator in the whole of the Chiroptera and in the Phyllostomidae and Molossidae, families that radiated in the tropics, but the opposite trend is observed in the Vespertilionidae, which has a presumed temperate origin. In the whole of the Chiroptera, and in the three main families, there were more basal species in the higher latitudes, and more derived species in tropical areas. In general, our results were not consistent with the predictions of niche conservatism. Tropical niche conservatism seems to keep bat clades of tropical origin from colonizing temperate zones, as they lack adaptations to survive cold winters, such as the capacity to hibernate. However, the lower diversity of Vespertilionidae in the Neotropics is better explained by competition with a diverse pre-existing community of bats than by niche conservatism. PMID:23935963
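
    A minimal sketch of the two per-cell quantities mapped in this study, assuming each species' range has already been rasterized onto the same 100 x 100 km grid; the presence data and root distances below are randomly generated placeholders, not the bat data set.

        import numpy as np

        # Illustrative presence grid: presence[s, i, j] is True where species s occurs in cell (i, j).
        rng = np.random.default_rng(0)
        n_species, n_rows, n_cols = 5, 4, 6
        presence = rng.random((n_species, n_rows, n_cols)) > 0.5
        root_distance = rng.integers(1, 10, n_species).astype(float)   # proxy for evolutionary age

        richness = presence.sum(axis=0)                                 # species per grid cell

        # Mean root distance of the species present in each cell (NaN where the cell is empty).
        weighted = np.tensordot(root_distance, presence, axes=1)        # summed root distances per cell
        mrd = np.where(richness > 0, weighted / np.maximum(richness, 1), np.nan)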

  3. Latitudinal diversity gradients in New World bats: are they a consequence of niche conservatism?

    PubMed

    Ramos Pereira, Maria João; Palmeirim, Jorge M

    2013-01-01

    The increase in species diversity from the Poles to the Equator is a major biogeographic pattern, but the mechanisms underlying it remain obscure. Our aim is to contribute to their clarification by describing the latitudinal gradients in species richness and in evolutionary age of species of New World bats, and testing if those patterns may be explained by the niche conservatism hypothesis. Maps of species ranges were used to estimate species richness in a 100 x 100 km grid. Root distances in a molecular phylogeny were used as a proxy for the age of species, and the mean root distance of the species in each cell of the grid was estimated. Generalised additive models were used to relate latitude with both species richness and mean root distance. This was done for each of the three most speciose bat families and for all Chiroptera combined. Species richness increases towards the Equator in the whole of the Chiroptera and in the Phyllostomidae and Molossidae, families that radiated in the tropics, but the opposite trend is observed in the Vespertilionidae, which has a presumed temperate origin. In the whole of the Chiroptera, and in the three main families, there were more basal species in the higher latitudes, and more derived species in tropical areas. In general, our results were not consistent with the predictions of niche conservatism. Tropical niche conservatism seems to keep bat clades of tropical origin from colonizing temperate zones, as they lack adaptations to survive cold winters, such as the capacity to hibernate. However, the lower diversity of Vespertilionidae in the Neotropics is better explained by competition with a diverse pre-existing community of bats than by niche conservatism.

  4. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated into the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  5. Fast and accurate grid representations for atom-based docking with partner flexibility.

    PubMed

    de Vries, Sjoerd J; Zacharias, Martin

    2017-06-30

    Macromolecular docking methods can broadly be divided into geometric and atom-based methods. Geometric methods use fast algorithms that operate on simplified, grid-like molecular representations, while atom-based methods are more realistic and flexible, but far less efficient. Here, a hybrid approach of grid-based and atom-based docking is presented, combining precalculated grid potentials with neighbor lists for fast and accurate calculation of atom-based intermolecular energies and forces. The grid representation is compatible with simultaneous multibody docking and can tolerate considerable protein flexibility. When implemented in our docking method ATTRACT, grid-based docking was found to be ∼35x faster. With the OPLSX forcefield instead of the ATTRACT coarse-grained forcefield, the average speed improvement was >100x. Grid-based representations may allow atom-based docking methods to explore large conformational spaces with many degrees of freedom, such as multiple macromolecules including flexibility. This increases the domain of biological problems to which docking methods can be applied. © 2017 Wiley Periodicals, Inc.
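
    A minimal sketch of the core operation behind grid-based energy evaluation - looking up a precalculated potential at an arbitrary atom position by trilinear interpolation; this is a generic illustration, not the ATTRACT grid code, and the grid values, spacing, and position are hypothetical.

        import numpy as np

        def trilinear(potential, origin, spacing, pos):
            """Interpolate a 3D potential grid at one atom position."""
            f = (np.asarray(pos) - origin) / spacing        # fractional grid coordinates
            i0 = np.floor(f).astype(int)                    # lower corner of the enclosing voxel
            t = f - i0                                      # interpolation weights within the voxel
            val = 0.0
            for dx in (0, 1):
                for dy in (0, 1):
                    for dz in (0, 1):
                        w = ((t[0] if dx else 1 - t[0]) *
                             (t[1] if dy else 1 - t[1]) *
                             (t[2] if dz else 1 - t[2]))
                        val += w * potential[i0[0] + dx, i0[1] + dy, i0[2] + dz]
            return val

        grid = np.random.default_rng(1).random((20, 20, 20))   # hypothetical precalculated potential
        energy = trilinear(grid, origin=np.zeros(3), spacing=1.0, pos=(3.2, 7.9, 10.5))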

  6. Development of a Flexible Framework of Common Hypersonic Navier-Stokes Meshes for the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.; Reuthler, James J.; McDaniel, Ryan D.

    2003-01-01

    A flexible framework for the development of block structured volume grids for hypersonic Navier-Stokes flow simulations was developed for analysis of the Shuttle Orbiter Columbia. The development of the flexible framework resulted in an ability to quickly generate meshes to directly correlate solutions contributed by participating groups on a common surface mesh, providing confidence for the extension of the envelope of solutions and damage scenarios. The framework draws on the experience of NASA Langley and NASA Ames Research Centers in structured grid generation, and consists of a grid generation process that is implemented through a division of responsibilities. The nominal division of labor consisted of NASA Johnson Space Center coordinating the damage scenarios to be analyzed by the Aerothermodynamics Columbia Accident Investigation (CAI) team, Ames developing the surface grids that described the computational volume about the orbiter, and Langley improving the grid quality of Ames-generated data and constructing the final volume grids. Distributing the work among the participants in the Aerothermodynamic CAI team resulted in significantly less time required to construct complete meshes than would have been possible for any individual participant. The approach demonstrated that the One-NASA grid generation team could sustain the demand for new meshes to explore new damage scenarios within an aggressive timeline.

  7. The GRID[subscript C] Project: Developing Students' Thinking Skills in a Data-Rich Environment

    ERIC Educational Resources Information Center

    DeLuca, V. William; Lari, Nasim

    2011-01-01

    The purpose of this study was to determine the impact of using renewable energy data, obtained from a comprehensive data acquisition system, on improving students' learning and developing their higher-order learning skills. This study used renewable energy data available through a data acquisition system installed and tested by the Green Research…

  8. Savannah, Georgia: The Lasting Legacy of Colonial City Planning. Teaching with Historic Places.

    ERIC Educational Resources Information Center

    Kratzer, Judson

    Strolling through the old city of Savannah, Georgia's rigid, grid pattern streets, down its linear brick walkways, past over 1,100 residential and public buildings of unparalleled architectural richness and diversity, visitors and residents come to appreciate the original plan that has existed intact since Savannah's founding in 1733. Twenty-four…

  9. Multi-scale recordings for neuroprosthetic control of finger movements.

    PubMed

    Baker, Justin; Bishop, William; Kellis, Spencer; Levy, Todd; House, Paul; Greger, Bradley

    2009-01-01

    We trained a rhesus monkey to perform individuated and combined finger flexions and extensions of the thumb, index, and middle finger. A Utah Electrode Array (UEA) was implanted into the hand region of the motor cortex contralateral to the monkey's trained hand. We also implanted a microwire electrocorticography grid (microECoG) epidurally so that it covered the UEA. The microECoG grid spanned the arm and hand regions of both the primary motor and somatosensory cortices. Previously this monkey had Implantable MyoElectric Sensors (IMES) surgically implanted into the finger muscles of the monkey's forearm. Action potentials (APs), local field potentials (LFPs), and microECoG signals were recorded from wired head-stage connectors for the UEA and microECoG grids, while EMG was recorded wirelessly. The monkey performed a finger flexion/extension task while neural and EMG data were acquired. We wrote an algorithm that uses the spike data from the UEA to perform a real-time decode of the monkey's finger movements. Also, analyses of the LFP and microECoG data indicate that these data show trial-averaged differences between different finger movements, indicating the data are potentially decodeable.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierre, John W.; Wies, Richard; Trudnowski, Daniel

    Time-synchronized measurements provide rich information for estimating a power-system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and testing those techniques on field measured data and through simulation. Experimental data from the western area power system was provided by PNNL and Bonneville Power Administration (BPA) for both ambient conditions and for signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data was provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) Block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field measured data. Subspace-based methods have been used to improve previous results from block processing techniques. Bootstrap techniques have been developed to estimate confidence intervals for the electromechanical modes from field measured data. Results were obtained using injected signal data provided by BPA. A new probing signal was designed that puts more strength into the signal for a given maximum peak-to-peak swing. Further simulations were conducted on a model based on measured data and with the modifications of the 19-machine simulation model. Montana Tech researchers participated in two primary activities: (1) continued development of the 19-machine simulation test system to include a DC line; and (2) extensive simulation analysis of the various system identification algorithms and bootstrap techniques using the 19-machine model. Researchers at the University of Alaska-Fairbanks focused on the development and testing of adaptive filter algorithms for mode estimation using data generated from simulation models and on data provided in collaboration with BPA and PNNL. Their efforts consisted of pre-processing field data and testing and refining adaptive filter techniques (specifically the Least Mean Squares (LMS), the Adaptive Step-size LMS (ASLMS), and Error Tracking (ET) algorithms). They also improved convergence of the adaptive algorithms by using an initial estimate from the block-processing AR method to initialize the weight vector for LMS. Extensive testing was performed on simulated data from the 19-machine model.
This project was also extensively involved in the WECC (Western Electricity Coordinating Council) system-wide tests carried out in 2005 and 2006. These tests involved injecting known probing signals into the western power grid. One of the primary goals of these tests was the reliable estimation of electromechanical mode properties from measured PMU data. Applied to the system were three types of probing inputs: (1) activation of the Chief Joseph Dynamic Brake, (2) mid-level probing at the Pacific DC Intertie (PDCI), and (3) low-level probing on the PDCI. The Chief Joseph Dynamic Brake is a 1400 MW disturbance to the system and is injected for half a second. For the mid- and low-level probing, the Celilo terminal of the PDCI is modulated with a known probing signal. Similar but less extensive tests were conducted in June of 2000. The low-level probing signals were designed at the University of Wyoming. A number of important design factors were considered. The designed low-level probing signal used in the tests is a multi-sine signal. Its frequency content is focused in the range of the inter-area electromechanical modes. The most frequently used of these low-level multi-sine signals had a period of over two minutes, a root-mean-square (rms) value of 14 MW, and a peak magnitude of 20 MW. Up to 15 cycles of this probing signal were injected into the system, resulting in a processing gain of 15. The resulting measured response at points throughout the system was not much larger than the ambient noise present in the measurements.
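
    A minimal sketch of the block-processing AR approach mentioned above - fitting an autoregressive model to a measured signal by least squares and reading mode frequency and damping from the roots of the AR polynomial; the synthetic ringdown below stands in for PMU data and is not taken from the WECC tests, and the project's actual algorithms are more elaborate.

        import numpy as np

        dt = 0.1                                    # 10 samples/s, a typical PMU reporting rate
        t = np.arange(0.0, 60.0, dt)
        # Synthetic ringdown standing in for PMU data: a 0.4 Hz mode with 5% damping plus noise.
        f0, zeta = 0.4, 0.05
        sigma = -zeta * 2 * np.pi * f0 / np.sqrt(1 - zeta**2)
        rng = np.random.default_rng(2)
        y = np.exp(sigma * t) * np.cos(2 * np.pi * f0 * t) + 0.01 * rng.standard_normal(t.size)

        order = 10                                  # AR model order
        # Least-squares AR fit: y[n] = -a1*y[n-1] - ... - a_p*y[n-p] + e[n]
        A = np.column_stack([y[order - k - 1: -k - 1] for k in range(order)])
        a = np.linalg.lstsq(A, -y[order:], rcond=None)[0]

        poles = np.roots(np.concatenate(([1.0], a))).astype(complex)   # discrete-time poles
        s = np.log(poles) / dt                                         # continuous-time poles
        freq = np.abs(s.imag) / (2 * np.pi)                            # mode frequencies, Hz
        damping = -s.real / np.abs(s)                                  # damping ratios
        # The dominant pole pair should appear near 0.4 Hz with roughly 5% damping.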

  11. funcLAB/G-service-oriented architecture for standards-based analysis of functional magnetic resonance imaging in HealthGrids.

    PubMed

    Erberich, Stephan G; Bhandekar, Manasee; Chervenak, Ann; Kesselman, Carl; Nelson, Marvin D

    2007-01-01

    Functional MRI is successfully being used in clinical and research applications including preoperative planning, language mapping, and outcome monitoring. However, clinical use of fMRI is less widespread due to the complexity of imaging, image workflow, and post-processing, and a lack of algorithmic standards hindering result comparability. As a consequence, widespread adoption of fMRI as a clinical tool is low, contributing to community physicians' uncertainty about how to integrate fMRI into practice. In addition, training of physicians in fMRI is in its infancy and requires clinical and technical understanding. Therefore, many institutions which perform fMRI have a team of basic researchers and physicians to run fMRI as a routine imaging tool. In order to provide fMRI as an advanced diagnostic tool to the benefit of a larger patient population, image acquisition and image post-processing must be streamlined, standardized, and available at institutions which do not have these resources. Here we describe a software architecture, the functional imaging laboratory (funcLAB/G), which addresses (i) standardized image processing using Statistical Parametric Mapping and (ii) its extension to secure sharing and availability for the community using standards-based Grid technology (Globus Toolkit). funcLAB/G carries the potential to overcome the limitations of fMRI in clinical use and thus makes standardized fMRI available to the broader healthcare enterprise utilizing the Internet and HealthGrid Web Services technology.

  12. Evidence for Sub-Chandrasekhar Mass Type Ia Supernovae from an Extensive Survey of Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Goldstein, Daniel A.; Kasen, Daniel

    2018-01-01

    There are two classes of viable progenitors for normal Type Ia supernovae (SNe Ia): systems in which a white dwarf explodes at the Chandrasekhar mass (M_Ch), and systems in which a white dwarf explodes below the Chandrasekhar mass (sub-M_Ch). It is not clear which of these channels is dominant; observations and light-curve modeling have provided evidence for both. Here we use an extensive grid of 4500 time-dependent, multiwavelength radiation transport simulations to show that the sub-M_Ch model can reproduce the entirety of the width-luminosity relation, while the M_Ch model can only produce the brighter events (0.8 < Δm15(B) < 1.55), implying that fast-declining SNe Ia come from sub-M_Ch explosions. We do not assume a particular theoretical paradigm for the progenitor or explosion mechanism, but instead construct parameterized models that vary the mass, kinetic energy, and compositional structure of the ejecta, thereby realizing a broad range of possible outcomes of white dwarf explosions. We provide fitting functions based on our large grid of detailed simulations that map observable properties of SNe Ia, such as peak brightness and light-curve width, to physical parameters such as the ^56Ni and total ejected masses. These can be used to estimate the physical properties of observed SNe Ia.

  13. Behavior of plastic sand confinement grids

    DOT National Transportation Integrated Search

    1986-01-01

    The concept of improving the load carrying ability of unbound aggregates, particularly sand, by lateral confinement has been investigated for some time. Extensive full-scale testing of the trafficability of confined beach sand pavement layers has bee...

  14. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly alongside network technology and can solve large-scale, complex computing problems by sharing large-scale computing resources. In a grid environment, we can realize a distributed, load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then describes how grid computing characteristics apply to a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it presents a distributed intrusion detection system based on the grid security system that reduces processing delay and maintains detection rates.

  15. Results from the Operational Testing of the Eaton Smart Grid Capable Electric Vehicle Supply Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Brion

    2014-10-01

    The Idaho National Laboratory conducted testing and analysis of the Eaton smart grid capable electric vehicle supply equipment (EVSE), which was a deliverable from Eaton for the U.S. Department of Energy FOA-554. The Idaho National Laboratory has extensive knowledge and experience in testing advanced conductive and wireless charging systems through INL’s support of the U.S. Department of Energy’s Advanced Vehicle Testing Activity. This document details the findings from the EVSE operational testing conducted at the Idaho National Laboratory on the Eaton smart grid capable EVSE. The testing conducted on the EVSE included energy efficiency testing, SAE J1772 functionality testing, abnormal conditions testing, and charging of a plug-in vehicle.

  16. Results from Operational Testing of the Siemens Smart Grid-Capable Electric Vehicle Supply Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Brion

    2015-05-01

    The Idaho National Laboratory conducted testing and analysis of the Siemens smart grid capable electric vehicle supply equipment (EVSE), which was a deliverable from Siemens for the U.S. Department of Energy FOA-554. The Idaho National Laboratory has extensive knowledge and experience in testing advanced conductive and wireless charging systems through INL’s support of the U.S. Department of Energy’s Advanced Vehicle Testing Activity. This document details the findings from the EVSE operational testing conducted at the Idaho National Laboratory on the Siemens smart grid capable EVSE. The testing conducted on the EVSE included energy efficiency testing, SAE J1772 functionality testing, abnormal conditions testing, and charging of a plug-in vehicle.

  17. A Probabilistic Risk Mitigation Model for Cyber-Attacks to PMU Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousavian, Seyedamirabbas; Valenzuela, Jorge; Wang, Jianhui

    The power grid is becoming more dependent on information and communication technologies. Complex networks of advanced sensors such as phasor measurement units (PMUs) are used to collect real time data to improve the observability of the power system. Recent studies have shown that the power grid has significant cyber vulnerabilities which could increase when PMUs are used extensively. Therefore, recognizing and responding to vulnerabilities are critical to the security of the power grid. This paper proposes a risk mitigation model for optimal response to cyber-attacks to PMU networks. We model the optimal response action as a mixed integer linear programming (MILP) problem to prevent propagation of the cyber-attacks and maintain the observability of the power system.

  18. Experiences of engineering Grid-based medical software.

    PubMed

    Estrella, F; Hauer, T; McClatchey, R; Odeh, M; Rogulin, D; Solomonides, T

    2007-08-01

    Grid-based technologies are emerging as potential solutions for managing and collaborating on distributed resources in the biomedical domain. Few examples exist, however, of successful implementations of Grid-enabled medical systems and even fewer have been deployed for evaluation in practice. The objective of this paper is to evaluate the use in clinical practice of a Grid-based imaging prototype and to establish directions for engineering future medical Grid developments and their subsequent deployment. The MammoGrid project has deployed a prototype system for clinicians using the Grid as its information infrastructure. To assist in the specification of the system requirements (and for the first time in healthgrid applications), use-case modelling has been carried out in close collaboration with clinicians and radiologists who had no prior experience of this modelling technique. A critical qualitative and, where possible, quantitative analysis of the MammoGrid prototype is presented, leading to a set of recommendations from the delivery of the first deployed Grid-based medical imaging application. We report critically on the application of software engineering techniques in the specification and implementation of the MammoGrid project and show that use-case modelling is a suitable vehicle for representing medical requirements and for communicating effectively with the clinical community. This paper also discusses the practical advantages and limitations of applying the Grid to real-life clinical applications and presents the consequent lessons learned. The work presented in this paper demonstrates that, given suitable commitment from collaborating radiologists, it is practical to deploy medical imaging analysis applications using the Grid in clinical practice, but that standardization and stability of the Grid software are a necessary pre-requisite for successful healthgrids. The MammoGrid prototype has therefore paved the way for further advanced Grid-based deployments in the medical and biomedical domains.

  19. Global Population Distribution (1990),Terrestrial Area and Country Name Information on a One by One Degree Grid Cell Basis

    DOE Data Explorer

    Li, Yi-Fan [Canadian Global Emissions Inventory Centre, Downsview, Ontario (Canada)]; Brenkert, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]

    1996-01-01

    This data base contains gridded (one degree by one degree) information on the world-wide distribution of the population for 1990 and country-specific information on the percentage of the country's population present in each grid cell (Li, 1996a). Secondly, the data base contains the percentage of a country's total area in a grid cell and the country's percentage of the grid cell that is terrestrial (Li, 1996b). Li (1996b) also developed an indicator signifying how many countries are represented in a grid cell and if a grid cell is part of the sea; this indicator is only relevant for the land, countries, and sea-partitioning information of the grid cell. Thirdly, the data base includes the latitude and longitude coordinates of each grid cell; a grid code number, which is a translation of the latitude/longitude value and is used in the Global Emission Inventory Activity (GEIA) data bases; the country or region's name; and the United Nations three-digit country code that represents that name.
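
    A minimal sketch of mapping a latitude/longitude pair to a one-degree cell index; the grid-code encoding used by the actual GEIA data base is not reproduced here, so the row/column convention below is only an assumption for illustration.

        # Assumed convention: row 0 spans 90-89 N and column 0 spans 180-179 W.
        def one_degree_cell(lat, lon):
            row = int(90 - lat)          # 0..179 from north to south
            col = int(lon + 180)         # 0..359 from west to east
            return row, col

        # Example: a point near Oak Ridge, TN (~35.93 N, 84.31 W) falls in cell (54, 95).
        print(one_degree_cell(35.93, -84.31))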

  20. A Data Parallel Multizone Navier-Stokes Code

    NASA Technical Reports Server (NTRS)

    Jespersen, Dennis C.; Levit, Creon; Kwak, Dochan (Technical Monitor)

    1995-01-01

    We have developed a data parallel multizone compressible Navier-Stokes code on the Connection Machine CM-5. The code is set up for implicit time-stepping on single or multiple structured grids. For multiple grids and geometrically complex problems, we follow the "chimera" approach, where flow data on one zone is interpolated onto another in the region of overlap. We will describe our design philosophy and give some timing results for the current code. The design choices can be summarized as: 1. finite differences on structured grids; 2. implicit time-stepping with either distributed solves or data motion and local solves; 3. sequential stepping through multiple zones with interzone data transfer via a distributed data structure. We have implemented these ideas on the CM-5 using CMF (Connection Machine Fortran), a data parallel language which combines elements of Fortran 90 and certain extensions, and which bears a strong similarity to High Performance Fortran (HPF). One interesting feature is the issue of turbulence modeling, where the architecture of a parallel machine makes the use of an algebraic turbulence model awkward, whereas models based on transport equations are more natural. We will present some performance figures for the code on the CM-5, and consider the issues involved in transitioning the code to HPF for portability to other parallel platforms.

  1. Acceleration of 500 keV Negative Ion Beams By Tuning Vacuum Insulation Distance On JT-60 Negative Ion Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kojima, A.; Hanada, M.; Tanaka, Y.

    2011-09-26

    Acceleration of a 500 keV beam up to 2.8 A has been achieved on a JT-60U negative ion source with a three-stage accelerator by overcoming low voltage holding, which is one of the critical issues for realization of the JT-60SA ion source. In order to improve the voltage holding, preliminary voltage holding tests with small-size grids with uniform and locally intense electric fields were carried out, and suggested that the voltage holding was degraded by both the size and local electric field effects. Therefore, the local electric field was reduced by tuning gap lengths between the large-size grids and grid support structures of the accelerator. Moreover, a beam radiation shield which limited extension of the minimum gap length was also optimized so as to reduce the local electric field while maintaining the shielding effect. These modifications were based on the experimental results, and significantly increased the voltage holding from <150 kV/stage for the original configuration to 200 kV/stage. These techniques for improvement of voltage holding should also be applicable to other large ion source accelerators, such as those for ITER.

  2. Sliding over the Blocks in Enzyme-Free RNA Copying – One-Pot Primer Extension in Ice

    PubMed Central

    Löffler, Philipp M. G.; Groen, Joost; Dörr, Mark; Monnard, Pierre-Alain

    2013-01-01

    Template-directed polymerization of RNA in the absence of enzymes is the basis for an information transfer in the ‘RNA-world’ hypothesis and in novel nucleic acid based technology. Previous investigations established that only cytidine-rich strands are efficient templates in bulk aqueous solutions while a few specific sequences completely block the extension of hybridized primers. We show that a eutectic water/ice system can support Pb2+/Mg2+-ion catalyzed extension of a primer across such sequences, i.e. AA, AU and AG, in a one-pot synthesis. Using mixtures of imidazole activated nucleotide 5′-monophosphates, the first two “blocking” residues could be passed during template-directed polymerization, i.e., formation of triply extended products containing a high fraction of faithful copies was demonstrated. Across the AG sequence, a mismatch sequence was formed in similar amounts to the correct product due to U·G wobble pairing. Thus, the template-directed extension occurs both across pyrimidine and purine-rich sequences and insertions of pyrimidines did not inhibit the subsequent insertions. Products were mainly formed with 2′-5′-phosphodiester linkages; however, the abundance of 3′–5′-linkages was higher than previously reported for pyrimidine insertions. When enzyme-free, template-directed RNA polymerization is performed in a eutectic water ice environment, various intrinsic reaction limitations observed in bulk solution can be overcome. PMID:24058695

  3. GeoPAT: A toolbox for pattern-based information retrieval from large geospatial databases

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Jarosław; Netzel, Paweł; Stepinski, Tomasz

    2015-07-01

    Geospatial Pattern Analysis Toolbox (GeoPAT) is a collection of GRASS GIS modules for carrying out pattern-based geospatial analysis of images and other spatial datasets. The need for pattern-based analysis arises when images/rasters contain rich spatial information either because of their very high resolution or their very large spatial extent. Elementary units of pattern-based analysis are scenes - patches of surface consisting of a complex arrangement of individual pixels (patterns). GeoPAT modules implement popular GIS algorithms, such as query, overlay, and segmentation, to operate on the grid of scenes. To achieve these capabilities GeoPAT includes a library of scene signatures - compact numerical descriptors of patterns, and a library of distance functions - providing numerical means of assessing dissimilarity between scenes. Ancillary GeoPAT modules use these functions to construct a grid of scenes or to assign signatures to individual scenes having regular or irregular geometries. Thus GeoPAT combines knowledge retrieval from patterns with mapping tasks within a single integrated GIS environment. GeoPAT is designed to identify and analyze complex, highly generalized classes in spatial datasets. Examples include distinguishing between different styles of urban settlements using VHR images, delineating different landscape types in land cover maps, and mapping physiographic units from DEM. The concept of pattern-based spatial analysis is explained and the roles of all modules and functions are described. A case study example pertaining to delineation of landscape types in a subregion of NLCD is given. Performance evaluation is included to highlight GeoPAT's applicability to very large datasets. The GeoPAT toolbox is available for download from

  4. Species richness of Eurasian Zephyrus hairstreaks (Lepidoptera: Lycaenidae: Theclini) with implications on historical biogeography: An NDM/VNDM approach

    PubMed Central

    Yago, Masaya; Settele, Josef; Li, Xiushan; Ueshima, Rei; Grishin, Nick V.; Wang, Min

    2018-01-01

    Aim A database based on distributional records of Eurasian Zephyrus hairstreaks (Lepidoptera: Lycaenidae: Theclini) was compiled to analyse their areas of endemism (AoEs), species richness and distribution patterns, to explore their locations of past glacial refugia and dispersal routes. Methods Over 2000 Zephyrus hairstreak occurrences are analysed using the NDM/VNDM algorithm, for the recognition of AoEs. Species richness was calculated by using the option ‘Number of different classes’ to count the different classes of a variable present in each 3.0°×3.0° grid cell, and GIS software was used to visualize distribution patterns of endemic species. Results Centres of species richness of Zephyrus hairstreaks are situated in the eastern Qinghai-Tibet Plateau (EQTP), Hengduan Mountain Region (HDMR) and the Qinling Mountain Region (QLMR). Latitudinal gradients in species richness show a normal distribution with the peak between 25° N and 35° N in the temperate zone, gradually decreasing towards the poles. Moreover, most parts of central and southern China, especially the area of QLMR-EQTP-HDMR, were identified as AoEs that may have played a significant role as refugia during Quaternary global cooling. There are four major distributional patterns of Zephyrus hairstreaks in Eurasia: Sino-Japanese, Sino-Himalayan, high-mountain and a combined distribution covering all three patterns. Conclusions Zephyrus hairstreaks probably originated at least 23–24 Myr ago in E. Asia between 25° N and 35° N in the temperate zone. Cenozoic orogenies caused rapid speciation of this tribe, and extrusion of the Indochina block resulted in vicariance between the Sino-Japanese and the Sino-Himalayan patterns. The four distribution patterns provided two possible dispersal directions: Sino-Japanese dispersal and Sino-Himalayan dispersal. PMID:29351314
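
    A minimal sketch of the per-cell richness calculation described above - counting the distinct species recorded in each 3.0° x 3.0° grid cell from a table of occurrence points; the records below are invented placeholders, not Theclini data, and the NDM/VNDM analysis itself is not reproduced.

        from collections import defaultdict

        # Each occurrence: (species_name, latitude, longitude); illustrative records only.
        occurrences = [("sp_A", 30.2, 103.5), ("sp_B", 30.9, 104.1), ("sp_A", 27.4, 100.2)]

        cell_species = defaultdict(set)
        for species, lat, lon in occurrences:
            cell = (int(lat // 3), int(lon // 3))     # 3-degree grid cell index
            cell_species[cell].add(species)

        richness = {cell: len(spp) for cell, spp in cell_species.items()}
        # richness[(10, 34)] == 2 for the two species recorded near 30 N, 103-104 E.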

  5. Species richness of Eurasian Zephyrus hairstreaks (Lepidoptera: Lycaenidae: Theclini) with implications on historical biogeography: An NDM/VNDM approach.

    PubMed

    Zhuang, Hailing; Yago, Masaya; Settele, Josef; Li, Xiushan; Ueshima, Rei; Grishin, Nick V; Wang, Min

    2018-01-01

    A database based on distributional records of Eurasian Zephyrus hairstreaks (Lepidoptera: Lycaenidae: Theclini) was compiled to analyse their areas of endemism (AoEs), species richness and distribution patterns, to explore their locations of past glacial refugia and dispersal routes. Over 2000 Zephyrus hairstreak occurrences are analysed using the NDM/VNDM algorithm, for the recognition of AoEs. Species richness was calculated by using the option 'Number of different classes' to count the different classes of a variable present in each 3.0°×3.0° grid cell, and GIS software was used to visualize distribution patterns of endemic species. Centres of species richness of Zephyrus hairstreaks are situated in the eastern Qinghai-Tibet Plateau (EQTP), Hengduan Mountain Region (HDMR) and the Qinling Mountain Region (QLMR). Latitudinal gradients in species richness show a normal distribution with the peak between 25° N and 35° N in the temperate zone, gradually decreasing towards the poles. Moreover, most parts of central and southern China, especially the area of QLMR-EQTP-HDMR, were identified as AoEs that may have played a significant role as refugia during Quaternary global cooling. There are four major distributional patterns of Zephyrus hairstreaks in Eurasia: Sino-Japanese, Sino-Himalayan, high-mountain and a combined distribution covering all three patterns. Zephyrus hairstreaks probably originated at least 23-24 Myr ago in E. Asia between 25° N and 35° N in the temperate zone. Cenozoic orogenies caused rapid speciation of this tribe, and extrusion of the Indochina block resulted in vicariance between the Sino-Japanese and the Sino-Himalayan patterns. The four distribution patterns provided two possible dispersal directions: Sino-Japanese dispersal and Sino-Himalayan dispersal.

  6. Considering the Spatial Layout Information of Bag of Features (BoF) Framework for Image Classification.

    PubMed

    Mu, Guangyu; Liu, Ying; Wang, Limin

    2015-01-01

    Spatial pooling methods such as spatial pyramid matching (SPM) are crucial in the bag-of-features model used in image classification. SPM partitions the image into a set of regular grids and assumes that the spatial layout of all visual words obeys a uniform distribution over these regular grids. However, in practice, we consider that different visual words should obey different spatial layout distributions. To improve SPM, we develop a novel spatial pooling method, namely spatial distribution pooling (SDP). The proposed SDP method uses an extension of the Gaussian mixture model to estimate the spatial layout distributions of the visual vocabulary. For each visual word type, SDP can generate a set of flexible grids rather than the regular grids of the traditional SPM. Furthermore, we can compute the grid weights for visual word tokens according to their spatial coordinates. The experimental results demonstrate that SDP outperforms the traditional spatial pooling methods and achieves classification accuracy competitive with the state of the art on several challenging image datasets.
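
    A minimal sketch of the regular-grid pooling performed by SPM, the baseline that SDP generalizes: each visual-word assignment is counted in the histogram of the grid cell containing it. The feature coordinates, word assignments, and vocabulary size below are illustrative.

        import numpy as np

        def spm_level_histograms(coords, words, vocab_size, n_grid, image_size):
            """Per-cell visual-word histograms for one pyramid level with n_grid x n_grid cells."""
            hists = np.zeros((n_grid, n_grid, vocab_size))
            cell = np.minimum((coords / image_size * n_grid).astype(int), n_grid - 1)
            for (cx, cy), w in zip(cell, words):
                hists[cx, cy, w] += 1
            return hists.reshape(-1)                     # concatenated cell histograms

        rng = np.random.default_rng(3)
        coords = rng.uniform(0, 256, size=(500, 2))      # (x, y) positions of 500 local features
        words = rng.integers(0, 100, size=500)           # assigned visual-word indices
        level1 = spm_level_histograms(coords, words, vocab_size=100, n_grid=2, image_size=256.0)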

  7. Statistical Analysis of CFD Solutions from the Third AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop, held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third Drag Prediction Workshop focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This work evaluated the effect of grid refinement on the code-to-code scatter for the clean attached flow test cases and the separated flow test cases.

  8. The scale dependence of optical diversity in a prairie ecosystem

    NASA Astrophysics Data System (ADS)

    Gamon, J. A.; Wang, R.; Stilwell, A.; Zygielbaum, A. I.; Cavender-Bares, J.; Townsend, P. A.

    2015-12-01

    Biodiversity loss, one of the most crucial challenges of our time, endangers ecosystem services that maintain human wellbeing. Traditional methods of measuring biodiversity require extensive and costly field sampling by biologists with considerable experience in species identification. Remote sensing can be used for such assessment based upon patterns of optical variation. This provides efficient and cost-effective means to determine ecosystem diversity at different scales and over large areas. Sampling scale has been described as a "fundamental conceptual problem" in ecology, and is an important practical consideration in both remote sensing and traditional biodiversity studies. On the one hand, with decreasing spatial and spectral resolution, the differences among different optical types may become weak or even disappear. On the other hand, high spatial and/or spectral resolution may introduce redundant or contradictory information. For example, at high resolution, the variation within optical types (e.g., between leaves on a single plant canopy) may add complexity unrelated to species richness. We studied the scale-dependence of optical diversity in a prairie ecosystem at Cedar Creek Ecosystem Science Reserve, Minnesota, USA, using a variety of spectrometers from several platforms on the ground and in the air. Using the coefficient of variation (CV) of spectra as an indicator of optical diversity, we found that high richness plots generally have a higher coefficient of variation. High resolution imaging spectrometer data (1 mm pixels) showed the highest sensitivity to richness level. With decreasing spatial resolution, the difference in CV between richness levels decreased, but remained significant. These findings can be used to guide airborne studies of biodiversity and develop more effective large-scale biodiversity sampling methods.
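
    A minimal sketch of the optical diversity metric used here - the coefficient of variation of reflectance across the pixels of a plot, averaged over wavelengths; the reflectance array below is simulated, not Cedar Creek data.

        import numpy as np

        # reflectance[pixel, band]: spectra of all pixels in one plot (illustrative values).
        rng = np.random.default_rng(4)
        reflectance = rng.uniform(0.05, 0.5, size=(1000, 150))

        band_cv = reflectance.std(axis=0, ddof=1) / reflectance.mean(axis=0)   # CV per wavelength
        optical_diversity = band_cv.mean()                                     # plot-level index
        print(f"mean spectral CV = {optical_diversity:.3f}")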

  9. DEM Based Modeling: Grid or TIN? The Answer Depends

    NASA Astrophysics Data System (ADS)

    Ogden, F. L.; Moreno, H. A.

    2015-12-01

    The availability of petascale supercomputing power has enabled process-based hydrological simulations on large watersheds and two-way coupling with mesoscale atmospheric models. Of course with increasing watershed scale come corresponding increases in watershed complexity, including wide ranging water management infrastructure and objectives, and ever increasing demands for forcing data. Simulations of large watersheds using grid-based models apply a fixed resolution over the entire watershed. In large watersheds, this means an enormous number of grids, or coarsening of the grid resolution to reduce memory requirements. One alternative to grid-based methods is the triangular irregular network (TIN) approach. TINs provide the flexibility of variable resolution, which allows optimization of computational resources by providing high resolution where necessary and low resolution elsewhere. TINs also increase required effort in model setup, parameter estimation, and coupling with forcing data which are often gridded. This presentation discusses the costs and benefits of the use of TINs compared to grid-based methods, in the context of large watershed simulations within the traditional gridded WRF-HYDRO framework and the new TIN-based ADHydro high performance computing watershed simulator.

  10. Estimating dust production rate of carbon-rich stars in the Small Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Nanni, A.; Marigo, P.; Groenewegen, M. A. T.; Aringer, B.; Pastorelli, G.; Rubele, S.; Girardi, L.; Bressan, A.; Bladh, S.

    We compute a grid of spectra describing dusty Circumstellar Envelopes of Thermally Pulsing Asymptotic Giant Branch carbon-rich stars by employing a physically grounded description of dust growth. The optical constants for carbon dust have been selected in order to reproduce simultaneously the most important color-color diagrams in the Near and Mid Infrared bands. We fit the Spectral Energy Distribution of ≈2000 carbon-rich stars in the Small Magellanic Cloud and compute their total dust production rate. We compare our results with those in the literature. Different choices of the dust-to-gas ratio and outflow expansion velocity adopted in different works yield, in some cases, a total dust budget about three times lower than the one derived from our scheme, with the same optical data set for carbon dust.

  11. An Offload NIC for NASA, NLR, and Grid Computing

    NASA Technical Reports Server (NTRS)

    Awrach, James

    2013-01-01

    This work addresses distributed data management and dynamically configurable high-speed access to data distributed and shared over wide-area high-speed network environments. An offload engine NIC (network interface card) is proposed that scales at nX10-Gbps increments through 100-Gbps full duplex. The Globus de facto standard was used in projects requiring secure, robust, high-speed bulk data transport. Novel extension mechanisms were derived that will combine these technologies for use by GridFTP, bandwidth management resources, and host CPU (central processing unit) acceleration. The result will be wire-rate encrypted Globus grid data transactions through offload for splintering, encryption, and compression. As the need for greater network bandwidth increases, there is an inherent need for faster CPUs. The best way to accelerate CPUs is through a network acceleration engine. Grid computing data transfers for the Globus tool set did not have wire-rate encryption or compression. Existing technology cannot keep pace with the greater bandwidths of backplane and network connections. Present offload engines with ports to Ethernet are 32 to 40 Gbps f-d at best. The best of ultra-high-speed offload engines use expensive ASICs (application specific integrated circuits) or NPUs (network processing units). The present state of the art also includes bonding and the use of multiple NICs that are also in the planning stages for future portability to ASICs and software to accommodate data rates at 100 Gbps. The remaining industry solutions are for carrier-grade equipment manufacturers, with costly line cards having multiples of 10-Gbps ports, or 100-Gbps ports such as CFP modules that interface to costly ASICs and related circuitry. All of the existing solutions vary in configuration based on requirements of the host, motherboard, or carrier-grade equipment. The purpose of the innovation is to eliminate data bottlenecks within cluster, grid, and cloud computing systems, and to add several more capabilities while reducing space consumption and cost. Provisions were designed for interoperability with systems used in the NASA HEC (High-End Computing) program. The new acceleration engine consists of state-of-the-art FPGA (field-programmable gate array) core IP, C, and Verilog code; a novel communication protocol; and extensions to the Globus structure. The engine provides the functions of network acceleration, encryption, compression, packet-ordering, and security added to Globus grid or cloud data transfer. This system is scalable in nX10-Gbps increments through 100-Gbps f-d. It can be interfaced to industry-standard system-side or network-side devices or core IP in increments of 10 GigE, scaling to provide IEEE 40/100 GigE compliance.

  12. Extensive Management Promotes Plant and Microbial Nitrogen Retention in Temperate Grassland

    PubMed Central

    de Vries, Franciska T.; Bloem, Jaap; Quirk, Helen; Stevens, Carly J.; Bol, Roland; Bardgett, Richard D.

    2012-01-01

    Leaching losses of nitrogen (N) from soil and atmospheric N deposition have led to widespread changes in plant community and microbial community composition, but our knowledge of the factors that determine ecosystem N retention is limited. A common feature of extensively managed, species-rich grasslands is that they have fungal-dominated microbial communities, which might reduce soil N losses and increase ecosystem N retention, which is pivotal for pollution mitigation and sustainable food production. However, the mechanisms that underpin improved N retention in extensively managed, species-rich grasslands are unclear. We combined a landscape-scale field study and glasshouse experiment to test how grassland management affects plant and soil N retention. Specifically, we hypothesised that extensively managed, species-rich grasslands of high conservation value would have lower N loss and greater N retention than intensively managed, species-poor grasslands, and that this would be due to a greater immobilisation of N by a more fungal-dominated microbial community. In the field study, we found that extensively managed, species-rich grasslands had lower N leaching losses. Soil inorganic N availability decreased with increasing abundance of fungi relative to bacteria, although the best predictor of soil N leaching was the C/N ratio of aboveground plant biomass. In the associated glasshouse experiment we found that retention of added 15N was greater in extensively than in intensively managed grasslands, which was attributed to a combination of greater root uptake and microbial immobilisation of 15N in the former, and that microbial immobilisation increased with increasing biomass and abundance of fungi. These findings show that grassland management affects mechanisms of N retention in soil through changes in root and microbial uptake of N. Moreover, they support the notion that microbial communities might be the key to improved N retention through tightening linkages between plants and microbes and reducing N availability. PMID:23227252

  13. Near-Body Grid Adaption for Overset Grids

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; Pulliam, Thomas H.

    2016-01-01

    A solution adaption capability for curvilinear near-body grids has been implemented in the OVERFLOW overset grid computational fluid dynamics code. The approach follows closely that used for the Cartesian off-body grids, but inserts refined grids in the computational space of original near-body grids. Refined curvilinear grids are generated using parametric cubic interpolation, with one-sided biasing based on curvature and stretching ratio of the original grid. Sensor functions, grid marking, and solution interpolation tasks are implemented in the same fashion as for off-body grids. A goal-oriented procedure, based on largest error first, is included for controlling growth rate and maximum size of the adapted grid system. The adaption process is almost entirely parallelized using MPI, resulting in a capability suitable for viscous, moving body simulations. Two- and three-dimensional examples are presented.

  14. Adaptive EAGLE dynamic solution adaptation and grid quality enhancement

    NASA Technical Reports Server (NTRS)

    Luong, Phu Vinh; Thompson, J. F.; Gatlin, B.; Mastin, C. W.; Kim, H. J.

    1992-01-01

    In the effort described here, the elliptic grid generation procedure in the EAGLE grid code was separated from the main code into a subroutine, and a new subroutine which evaluates several grid quality measures at each grid point was added. The elliptic grid routine can now be called, either by a computational fluid dynamics (CFD) code to generate a new adaptive grid based on flow variables and quality measures through multiple adaptation, or by the EAGLE main code to generate a grid based on quality measure variables through static adaptation. Arrays of flow variables can be read into the EAGLE grid code for use in static adaptation as well. These major changes in the EAGLE adaptive grid system make it easier to convert any CFD code that operates on a block-structured grid (or single-block grid) into a multiple adaptive code.

  15. LOOS: an extensible platform for the structural analysis of simulations.

    PubMed

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++ and makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms.

  16. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware, which exploits task-based parallelism. Two bioinformatics benchmark applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  17. Schwarz-Christoffel Conformal Mapping based Grid Generation for Global Oceanic Circulation Models

    NASA Astrophysics Data System (ADS)

    Xu, Shiming

    2015-04-01

    We propose new grid generation algorithms for global ocean general circulation models (OGCMs). In contrast to conventional dipolar or tripolar grids based on analytical forms, the new algorithms are based on Schwarz-Christoffel (SC) conformal mapping with prescribed boundary information. While dealing with the conventional grid design problem of pole relocation, they also address more advanced issues of computational efficiency and the new requirements on OGCM grids arising from the recent trend of high-resolution and multi-scale modeling. The proposed grid generation algorithm could potentially achieve the alignment of grid lines to coastlines, enhanced spatial resolution in coastal regions, and easier computational load balance. Since the generated grids are still orthogonal curvilinear, they can be readily utilized in existing Bryan-Cox-Semtner type ocean models. The proposed methodology can also be applied to the grid generation task for regional ocean modeling when complex land-ocean distribution is present.
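
    For reference, the classical Schwarz-Christoffel formula being built upon here (a textbook result, not an expression quoted from the abstract) maps the upper half-plane onto the interior of an n-sided polygon with interior angles alpha_k*pi, through prevertices x_1 < ... < x_{n-1} on the real axis (the n-th prevertex taken at infinity):

      f(z) = A + C \int^{z} \prod_{k=1}^{n-1} (\zeta - x_k)^{\alpha_k - 1} \, d\zeta

    Here A and C are complex constants fixing the translation, rotation, and scale of the image polygon; prescribing boundary information for an ocean domain amounts to choosing the polygon vertices and their boundary correspondence.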

  18. Spatial services grid

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Li, Qi; Cheng, Jicheng

    2005-10-01

    This paper discusses the concept, key technologies and main applications of the Spatial Services Grid. The technologies of grid computing and web services are playing a revolutionary role in the study of spatial information services. The concept of the SSG (Spatial Services Grid) is put forward based on the SIG (Spatial Information Grid) and OGSA (open grid service architecture). Firstly, grid computing is reviewed and the key technologies of SIG and their main applications are reviewed. Secondly, grid computing and three kinds of SIG (in the broad sense)--SDG (spatial data grid), SIG (spatial information grid) and SSG (spatial services grid)--and their relationships are proposed. Thirdly, the key technologies of the SSG (spatial services grid) are put forward. Finally, three representative applications of SSG (spatial services grid) are discussed. The first application is the urban location based services grid, which is a typical spatial services grid and can be constructed on OGSA (Open Grid Services Architecture) and a digital city platform. The second application is the region sustainable development grid, which is key to urban development. The third application is the region disaster and emergency management services grid.

  19. Wildlife-friendly farming benefits rare birds, bees and plants

    PubMed Central

    Pywell, Richard F.; Heard, Matthew S.; Bradbury, Richard B.; Hinsley, Shelley; Nowakowski, Marek; Walker, Kevin J.; Bullock, James M.

    2012-01-01

    Agricultural intensification is a leading cause of global biodiversity loss, especially for threatened and near-threatened species. One widely implemented response is ‘wildlife-friendly farming’, involving the close integration of conservation and extensive farming practices within agricultural landscapes. However, the putative benefits from this controversial policy are currently either unknown or thought unlikely to extend to rare and declining species. Here, we show that new, evidence-based approaches to habitat creation on intensively managed farmland in England can achieve large increases in plant, bee and bird species. In particular, we found that habitat enhancement methods designed to provide the requirements of sensitive target biota consistently increased the richness and abundance of both rare and common species, with 10-fold to greater than 100-fold more rare species per sample area than generalized conventional conservation measures. Furthermore, targeting landscapes of high species richness amplified beneficial effects on the least mobile taxa: plants and bees. Our results provide the first unequivocal support for a national wildlife-friendly farming policy and suggest that this approach should be implemented much more extensively to address global biodiversity loss. However, to be effective, these conservation measures must be evidence-based, and developed using sound knowledge of the ecological requirements of key species. PMID:22675140

  20. On modelling the pressure-strain correlations in wall bounded flows

    NASA Technical Reports Server (NTRS)

    Peltier, L. J.; Biringen, S.

    1990-01-01

    Turbulence models for the pressure-strain term of the Reynolds-stress equations in the vicinity of a moving wall are evaluated for a high Reynolds number flow using decaying grid turbulence as a model problem. The data of Thomas and Hancock are used as a base for evaluating the different turbulence models. In particular, the Rotta model for return-to-isotropy is evaluated both in its inclusion into the Reynolds-stress equation model and in comparison to a nonlinear model advanced by Sarkar and Speziale. Further, models for the wall correction to the transfer term advanced by Launder et al., Shir, and Shih and Lumley are compared. Initial data using the decaying grid turbulence experiment as a base suggests that the coefficients proposed for these models are high perhaps by as much as an order of magnitude. The Shih and Lumley model which satisfies realizability constraints, in particular, seems to hold promise in adequately modeling the Reynolds stress components of this flow. Extensions of this work are to include testing the homogeneous transfer model by Shih and Lumley and the testing of the wall transfer models using their proposed coefficients and the coefficients chosen from this work in a flow with mean shear component.
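
    As context for the return-to-isotropy comparison above, the Rotta model referred to takes the standard linear form (textbook notation, not an expression quoted from the paper), with the slow pressure-strain term proportional to the Reynolds-stress anisotropy:

      \Pi_{ij}^{(\mathrm{slow})} = -\, C_1 \, \frac{\varepsilon}{k} \left( \overline{u_i u_j} - \tfrac{2}{3} \, k \, \delta_{ij} \right)

    Here k is the turbulent kinetic energy, epsilon its dissipation rate, and C_1 the Rotta constant; nonlinear alternatives such as the Sarkar-Speziale model mentioned above add terms of higher order in the anisotropy tensor.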

  1. Functional diversity measures revealed impacts of non-native species and habitat degradation on species-poor freshwater fish assemblages.

    PubMed

    Colin, Nicole; Villéger, Sébastien; Wilkes, Martin; de Sostoa, Adolfo; Maceda-Veiga, Alberto

    2018-06-01

    Trait-based ecology has been developed for decades to infer ecosystem responses to stressors based on the functional structure of communities, yet its value in species-poor systems is largely unknown. Here, we used an extensive dataset in a Spanish region highly prone to non-native fish invasions (15 catchments, N=389 sites) to assess for the first time how species-poor communities respond to large-scale environmental gradients using a taxonomic and functional trait-based approach in riverine fish. We examined total species richness and three functional trait-based indices available when many sites have ≤3 species (specialization, FSpe; originality, FOri and entropy, FEnt). We assessed the responses of these taxonomic and functional indices along gradients of altitude, water pollution, physical habitat degradation and non-native fish biomass. Whilst species richness was relatively sensitive to spatial effects, functional diversity indices were responsive across natural and anthropogenic gradients. All four diversity measures declined with altitude but this decline was modulated by physical habitat degradation (richness, FSpe and FEnt) and the non-native:total fish biomass ratio (FSpe and FOri) in ways that varied between indices. Furthermore, FSpe and FOri were significantly correlated with Total Nitrogen. Non-native fish were a major component of the taxonomic and functional structure of fish communities, raising concerns about potential misdiagnosis between invaded and environmentally-degraded river reaches. Such misdiagnosis was evident in a regional fish index widely used in official monitoring programs. We recommend the application of FSpe and FOri to extensive datasets from monitoring programs in order to generate valuable cross-system information about the impacts of non-native species and habitat degradation, even in species-poor systems. Scoring non-native species apart from habitat degradation in the indices used to determine ecosystem health is essential to develop better management strategies. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Reversible structural alterations of undifferentiated and differentiated human neuroblastoma cells induced by phorbol ester.

    PubMed Central

    Tint, I S; Bonder, E M; Feder, H H; Reboulleau, C P; Vasiliev, J M; Gelfand, I M

    1992-01-01

    Morphological alterations in the structure of undifferentiated and morphologically differentiated human neuroblastoma cells induced by phorbol 12-myristate 13-acetate (PMA), an activator of protein kinase C, were examined by video microscopy and immunomorphology. In undifferentiated cells, PMA induced the formation of motile actin-rich lamellas and of stable cylindrical processes rich in microtubules. Formation of stable processes resulted either from the collapse of lamellas or the movement of the cell body away from the base of a process. In differentiated cells, PMA induced the rapid extension of small lamellas and subsequent formation of short-lived elongated processes from the lateral edges of neurites. Additionally, growth cones exhibited enhanced modulation in shape after PMA treatment. These reversible reorganizations were similar to the actinoplast-tubuloplast transformations exhibited by PMA-treated fibroblasts. We suggest that actinoplast-tubuloplast reorganizations play essential roles in morphogenesis where stable cytoplasmic extensions are induced by external stimuli. In particular, PMA-induced reorganizations of neural cells in culture may be a model for morphological modulations that occur in nerve tissue. Images PMID:1518842

  3. Normal and Oblique Impact of Cylindro-Conical and Cylindrical Projectiles on Metallic Plates

    DTIC Science & Technology

    1985-06-01

    light grid for projectile recovery, and a witness paper marked initially at the extension of the barrel centerline placed on the front of the catcher...transducer were obtained for experiments involving normal impact on 3.175-mm-thick 2024-O aluminum targets struck at and just above the ballistic limit...spite of the presence at its rear of a plastic gas check. These particles also prevented the use of a fine copper-wire grid conducting a current whose

  4. Conservative treatment of boundary interfaces for overlaid grids and multi-level grid adaptations

    NASA Technical Reports Server (NTRS)

    Moon, Young J.; Liou, Meng-Sing

    1989-01-01

    Conservative algorithms for boundary interfaces of overlaid grids are presented. The basic method is zeroth order, and is extended to a higher order method using interpolation and subcell decomposition. The present method, strictly based on a conservative constraint, is tested with overlaid grids for various applications of unsteady and steady supersonic inviscid flows with strong shock waves. The algorithm is also applied to a multi-level grid adaptation in which the next level finer grid is overlaid on the coarse base grid with an arbitrary orientation.

  5. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
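
    The multiplicative model and homomorphic filtering step lend themselves to a short illustration. The sketch below is a simplified, one-axis version under assumed parameters (a known normalized grid frequency and a fixed notch bandwidth); the paper's method additionally optimizes grid frequency and angle for rotated grids.

      import numpy as np

      def suppress_grid_artifact(image, grid_freq, bandwidth=0.01, axis=0):
          # Homomorphic band-stop filtering of a multiplicative grid pattern:
          # the log turns the multiplicative artifact into an additive one,
          # a narrow notch removes it around the assumed grid frequency,
          # and exponentiation restores the image.
          log_img = np.log(np.clip(image, 1e-6, None))
          spec = np.fft.fft(log_img, axis=axis)
          freqs = np.fft.fftfreq(image.shape[axis])
          mask = (np.abs(np.abs(freqs) - grid_freq) >= bandwidth).astype(float)
          shape = [1, 1]
          shape[axis] = -1
          spec *= mask.reshape(shape)
          return np.exp(np.real(np.fft.ifft(spec, axis=axis)))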

  6. A modified adjoint-based grid adaptation and error correction method for unstructured grid

    NASA Astrophysics Data System (ADS)

    Cui, Pengcheng; Li, Bin; Tang, Jing; Chen, Jiangtao; Deng, Youqi

    2018-05-01

    Grid adaptation is an important strategy to improve the accuracy of output functions (e.g. drag, lift, etc.) in computational fluid dynamics (CFD) analysis and design applications. This paper presents a modified robust grid adaptation and error correction method for reducing simulation errors in integral outputs. The procedure is based on discrete adjoint optimization theory in which the estimated global error of output functions can be directly related to the local residual error. According to this relationship, the local residual error contribution can be used as an indicator in a grid adaptation strategy designed to generate refined grids for accurately estimating the output functions. This grid adaptation and error correction method is applied to subsonic and supersonic simulations around three-dimensional configurations. Numerical results demonstrate that the grid regions sensitive to the output functions are detected and refined after grid adaptation, and the accuracy of the output functions is clearly improved after error correction. The proposed grid adaptation and error correction method is shown to compare very favorably in terms of output accuracy and computational efficiency relative to traditional feature-based grid adaptation.
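
    The relationship invoked above between output error and local residuals is usually written as an adjoint-weighted residual estimate. The form below is the generic textbook statement (sign conventions vary), not the paper's exact expression: for an output J, a discrete solution u_h with residual R(u_h), and the discrete adjoint psi_h,

      J(u) - J(u_h) \;\approx\; -\, \psi_h^{T} R(u_h)

    The magnitude of each local contribution then serves as a natural refinement indicator, consistent with the adaptation strategy described in the abstract.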

  7. Agile Datacube Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Merticariu, Vlad; Baumann, Peter

    2017-04-01

    Metadata are considered small, smart, and queryable; data, on the other hand, are known as big, clumsy, hard to analyze. Consequently, gridded data - such as images, image timeseries, and climate datacubes - are managed separately from the metadata, and with different, restricted retrieval capabilities. One reason for this silo approach is that databases, while good at tables, XML hierarchies, RDF graphs, etc., traditionally do not support multi-dimensional arrays well. This gap is being closed by Array Databases which extend the SQL paradigm of "any query, anytime" to NoSQL arrays. They introduce semantically rich modelling combined with declarative, high-level query languages on n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. This way, they offer new vistas in flexibility, scalability, performance, and data integration. In this respect, the forthcoming ISO SQL extension MDA ("Multi-dimensional Arrays") will be a game changer in Big Data Analytics. We introduce concepts and opportunities through the example of rasdaman ("raster data manager") which in fact has pioneered the field of Array Databases and forms the blueprint for ISO SQL/MDA and further Big Data standards, such as OGC WCPS for querying spatio-temporal Earth datacubes. With operational installations exceeding 140 TB, queries have been split across more than one thousand cloud nodes, using CPUs as well as GPUs. Installations can easily be mashed up securely, enabling large-scale location-transparent query processing in federations. Federation queries have been demonstrated live at EGU 2016, spanning Europe and Australia in the context of the intercontinental EarthServer initiative, visualized through NASA WorldWind.

  8. Agile Datacube Analytics (not just) for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2016-12-01

    Metadata are considered small, smart, and queryable; data, on the other hand, are known as big, clumsy, hard to analyze. Consequently, gridded data - such as images, image timeseries, and climate datacubes - are managed separately from the metadata, and with different, restricted retrieval capabilities. One reason for this silo approach is that databases, while good at tables, XML hierarchies, RDF graphs, etc., traditionally do not support multi-dimensional arrays well. This gap is being closed by Array Databases which extend the SQL paradigm of "any query, anytime" to NoSQL arrays. They introduce semantically rich modelling combined with declarative, high-level query languages on n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. This way, they offer new vistas in flexibility, scalability, performance, and data integration. In this respect, the forthcoming ISO SQL extension MDA ("Multi-dimensional Arrays") will be a game changer in Big Data Analytics. We introduce concepts and opportunities through the example of rasdaman ("raster data manager") which in fact has pioneered the field of Array Databases and forms the blueprint for ISO SQL/MDA and further Big Data standards, such as OGC WCPS for querying spatio-temporal Earth datacubes. With operational installations exceeding 140 TB, queries have been split across more than one thousand cloud nodes, using CPUs as well as GPUs. Installations can easily be mashed up securely, enabling large-scale location-transparent query processing in federations. Federation queries have been demonstrated live at EGU 2016, spanning Europe and Australia in the context of the intercontinental EarthServer initiative, visualized through NASA WorldWind.

  9. Early stage fatigue damage occurs in bovine tendon fascicles in the absence of changes in mechanics at either the gross or micro-structural level.

    PubMed

    Shepherd, Jennifer H; Riley, Graham P; Screen, Hazel R C

    2014-10-01

    Many tendon injuries are believed to result from repetitive motion or overuse, leading to the accumulation of micro-damage over time. In vitro fatigue loading can be used to characterise damage during repeated use and investigate how this may relate to the aetiology of tendinopathy. This study considered the effect of fatigue loading on fascicles from two functionally distinct bovine tendons: the digital extensor and deep digital flexor. Micro-scale extension mechanisms were investigated in fascicles before or after a period of cyclic creep loading, comparing two different measurement techniques - the displacement of a photo-bleached grid and the use of nuclei as fiducial markers. Whilst visual damage was clearly identified after only 300 cycles of creep loading, these visual changes did not affect either gross fascicle mechanics or fascicle microstructural extension mechanisms over the 900 fatigue cycles investigated. However, significantly greater fibre sliding was measured when observing grid deformation rather than the analysis of nuclei movement. Measurement of microstructural extension with both techniques was localised and this may explain the absence of change in microstructural deformation in response to fatigue loading. Alternatively, the data may demonstrate that fascicles can withstand a degree of matrix disruption with no impact on mechanics. Whilst use of a photo-bleached grid to directly measure the collagen is the best indicator of matrix deformation, nuclei tracking may provide a better measure of the strain perceived directly by the cells. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  10. Grid point extraction and coding for structured light system

    NASA Astrophysics Data System (ADS)

    Song, Zhan; Chung, Ronald

    2011-09-01

    A structured light system simplifies three-dimensional reconstruction by illuminating a specially designed pattern to the target object, thereby generating a distinct texture on it for imaging and further processing. Success of the system hinges upon what features are to be coded in the projected pattern, extracted in the captured image, and matched between the projector's display panel and the camera's image plane. The codes have to be such that they are largely preserved in the image data upon illumination from the projector, reflection from the target object, and projective distortion in the imaging process. The features also need to be reliably extracted in the image domain. In this article, a two-dimensional pseudorandom pattern consisting of rhombic color elements is proposed, and the grid points between the pattern elements are chosen as the feature points. We describe how a type classification of the grid points plus the pseudorandomness of the projected pattern can equip each grid point with a unique label that is preserved in the captured image. We also present a grid point detector that extracts the grid points without the need of segmenting the pattern elements, and that localizes the grid points in subpixel accuracy. Extensive experiments are presented to illustrate that, with the proposed pattern feature definition and feature detector, more features points in higher accuracy can be reconstructed in comparison with the existing pseudorandomly encoded structured light systems.

  11. Accuracy Analysis for Finite-Volume Discretization Schemes on Irregular Grids

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2010-01-01

    A new computational analysis tool, the downscaling test, is introduced and applied for studying the convergence rates of truncation and discretization errors of finite-volume discretization schemes on general irregular (e.g., unstructured) grids. The study shows that the design-order convergence of discretization errors can be achieved even when truncation errors exhibit a lower-order convergence or, in some cases, do not converge at all. The downscaling test is a general, efficient, accurate, and practical tool, enabling straightforward extension of verification and validation to general unstructured grid formulations. It also allows separate analysis of the interior, boundaries, and singularities that could be useful even in structured-grid settings. There are several new findings arising from the use of the downscaling test analysis. It is shown that the discretization accuracy of a common node-centered finite-volume scheme, known to be second-order accurate for inviscid equations on triangular grids, degenerates to first order for mixed grids. Alternative node-centered schemes are presented and demonstrated to provide second and third order accuracies on general mixed grids. The local accuracy deterioration at intersections of tangency and inflow/outflow boundaries is demonstrated using downscaling tests tailored to examining the local behavior of the boundary conditions. The discretization-error order reduction within inviscid stagnation regions is demonstrated. The accuracy deterioration is local, affecting mainly the velocity components, but applies to any order scheme.
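
    The convergence rates discussed above are conventionally quantified by the observed order of accuracy obtained from errors on two systematically refined grids. The relation below is the standard definition from general verification practice (not a formula quoted from the paper), with E_h and E_rh the error norms on grids of characteristic spacing h and rh, and r > 1 the refinement ratio:

      p = \frac{\log\left( E_{rh} / E_{h} \right)}{\log r}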

  12. NAVIS-An UGV Indoor Positioning System Using Laser Scan Matching for Large-Area Real-Time Applications

    PubMed Central

    Tang, Jian; Chen, Yuwei; Jaakkola, Anttoni; Liu, Jinbing; Hyyppä, Juha; Hyyppä, Hannu

    2014-01-01

    Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). While there are critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning in which numerous accumulated errors easily lead to loss of positioning accuracy. This severely restricts its applications in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closest Point (ICP) preprocessing, which adaptively decreases the search scope; (b) a brute-force search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the global optimum position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested based on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS proved that the proposed IMLE algorithm is a better way to perform local scan matching that can offer a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an updating rate of 12 Hz in a feature-rich environment and 2 Hz even in a feature-poor environment. Therefore, it can be utilized in real-time applications. PMID:24999715
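
    Step (b) above, scoring candidate poses against a grid likelihood map, is the core of the approach and is easy to sketch. The snippet below is an illustrative, single-resolution version with assumed data layouts; the ICP preprocessing, multi-resolution layers, and patch division described in the abstract are omitted.

      import numpy as np

      def scan_log_likelihood(log_map, resolution, origin, scan_xy, pose):
          # Score one candidate pose by summing log-likelihood map values at the
          # grid cells hit by the transformed scan points.
          # log_map: 2-D array of cell log-likelihoods; resolution: metres/cell;
          # origin: world coordinates of cell (0, 0); scan_xy: (N, 2) points in
          # the sensor frame; pose: (x, y, theta) in the map frame.
          x, y, theta = pose
          c, s = np.cos(theta), np.sin(theta)
          R = np.array([[c, -s], [s, c]])
          world = scan_xy @ R.T + np.array([x, y])
          cells = np.floor((world - origin) / resolution).astype(int)
          inside = ((cells[:, 0] >= 0) & (cells[:, 0] < log_map.shape[1]) &
                    (cells[:, 1] >= 0) & (cells[:, 1] < log_map.shape[0]))
          cells = cells[inside]
          return log_map[cells[:, 1], cells[:, 0]].sum()

      def brute_force_search(log_map, resolution, origin, scan_xy, poses):
          # Exhaustively evaluate candidate poses and return the best one.
          scores = [scan_log_likelihood(log_map, resolution, origin, scan_xy, p)
                    for p in poses]
          return poses[int(np.argmax(scores))]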

  13. NAVIS-An UGV indoor positioning system using laser scan matching for large-area real-time applications.

    PubMed

    Tang, Jian; Chen, Yuwei; Jaakkola, Anttoni; Liu, Jinbing; Hyyppä, Juha; Hyyppä, Hannu

    2014-07-04

    Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). While there are critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning in which numerous accumulated errors easily lead to loss of positioning accuracy. This severely restricts its applications in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closest Point (ICP) preprocessing, which adaptively decreases the search scope; (b) a brute-force search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the global optimum position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested based on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS proved that the proposed IMLE algorithm is a better way to perform local scan matching that can offer a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an updating rate of 12 Hz in a feature-rich environment and 2 Hz even in a feature-poor environment. Therefore, it can be utilized in real-time applications.

  14. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    The end-to-end tunability is important to provision elastic channel for the burst traffic of data center optical networks. Then, how to complete the end-to-end tunability based on elastic optical networks? Software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, the flexible grid all optical networks with Tbps end-to-end tunable transport and switch system have been online demonstrated for data center interconnection, which are controlled by OpenDayLight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedure.

  15. Results from the Operational Testing of the General Electric Smart Grid Capable Electric Vehicle Supply Equipment (EVSE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, Richard Barney; Scoffield, Don; Bennett, Brion

    2013-12-01

    The Idaho National Laboratory conducted testing and analysis of the General Electric (GE) smart grid capable electric vehicle supply equipment (EVSE), which was a deliverable from GE for the U.S. Department of Energy FOA-554. The Idaho National Laboratory has extensive knowledge and experience in testing advanced conductive and wireless charging systems through INL's support of the U.S. Department of Energy's Advanced Vehicle Testing Activity. This document details the findings from the EVSE operational testing conducted at the Idaho National Laboratory on the GE smart grid capable EVSE. The testing conducted on the EVSE included energy efficiency testing, SAE J1772 functionality testing, abnormal conditions testing, and charging of a plug-in vehicle.

  16. The path to active living: physical activity through community design in Somerville, Massachusetts.

    PubMed

    Burke, Noreen M; Chomitz, Virginia R; Rioles, Nicole A; Winslow, Stephen P; Brukilacchio, Lisa B; Baker, Jessie C

    2009-12-01

    Somerville, Massachusetts, an ethnically diverse, urban community northwest of Boston, presents opportunities and challenges for active living. With a dense street grid, well-maintained sidewalks, neighborhood parks, and existing Community Path, Somerville is very walkable. However, two major surface arteries traverse and bisect neighborhoods, creating pedestrian safety and environmental justice issues. Major goals included promoting increased collaboration and communication among existing active-living efforts; managing the Community Path extension project; encouraging Portuguese-speaking adults to incorporate daily physical activity; leveraging existing urban planning work to establish secure, attractive walking/biking corridors; and embedding active-living messages in everyday life. The Somerville Active Living by Design Partnership (ALbD) successfully created a robust task force that was integrated with citywide active-living efforts, secured resources to increase infrastructure and support for active living, including city-level coordinator positions, and changed decision-making practices that led to incorporation of pedestrian and bicycle transportation priorities into city planning and that influenced the extension of the Community Path. Partnerships must employ sustainability planning early on, utilize skilled facilitative leaders to manage leadership transitions, and engage new partners. Identifying, cultivating, and celebrating champions, especially those with political power, are critical. Working closely with research partners leads to rich data sources for planning and evaluation. Changing the built environment is difficult; working toward smaller wins is realistic and achievable. The synergy of ALbD and other community interventions created a foundation for short-term successes and accelerated political-cultural changes already underway with respect to active living.

  17. Numerical Study of Boundary Layer Interaction with Shocks: Method Improvement and Test Computation

    NASA Technical Reports Server (NTRS)

    Adams, N. A.

    1995-01-01

    The objective is the development of a high-order and high-resolution method for the direct numerical simulation of shock turbulent-boundary-layer interaction. Details concerning the spatial discretization of the convective terms can be found in Adams and Shariff (1995). The computer code based on this method as introduced in Adams (1994) was formulated in Cartesian coordinates and thus has been limited to simple rectangular domains. For more general two-dimensional geometries, as a compression corner, an extension to generalized coordinates is necessary. To keep the requirements or limitations for grid generation low, the extended formulation should allow for non-orthogonal grids. Still, for simplicity and cost efficiency, periodicity can be assumed in one cross-flow direction. For easy vectorization, the compact-ENO coupling algorithm as used in Adams (1994) treated whole planes normal to the derivative direction with the ENO scheme whenever at least one point of this plane satisfied the detection criterion. This is apparently too restrictive for more general geometries and more complex shock patterns. Here we introduce a localized compact-ENO coupling algorithm, which is efficient as long as the overall number of grid points treated by the ENO scheme is small compared to the total number of grid points. Validation and test computations with the final code are performed to assess the efficiency and suitability of the computer code for the problems of interest. We define a set of parameters where a direct numerical simulation of a turbulent boundary layer along a compression corner with reasonably fine resolution is affordable.
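
    The localized coupling idea above - handing only flagged points to the ENO scheme - can be illustrated with a simple shock sensor. The sketch below is a hypothetical detector based on normalized second differences with a halo margin; the actual detection criterion and the compact and ENO schemes themselves follow Adams (1994) and are not reproduced here.

      import numpy as np

      def mark_eno_points(p, threshold=0.05, halo=2):
          # Flag grid points whose normalized second difference of a sensor
          # variable (here pressure p) exceeds a threshold, then grow the set
          # by a small halo so ENO stencils have room near the discontinuity.
          # Only these points would be handed to the ENO scheme; the rest keep
          # the compact scheme.
          sensor = np.zeros_like(p)
          sensor[1:-1] = np.abs(p[2:] - 2.0 * p[1:-1] + p[:-2]) / (
              np.abs(p[2:]) + 2.0 * np.abs(p[1:-1]) + np.abs(p[:-2]))
          flagged = sensor > threshold
          for i in np.flatnonzero(flagged):
              flagged[max(i - halo, 0):i + halo + 1] = True
          return flagged

      # Toy example: a smeared shock in the middle of the domain.
      x = np.linspace(0.0, 1.0, 201)
      p = 1.0 + 0.5 * (1.0 + np.tanh((x - 0.5) / 0.01))
      print(np.count_nonzero(mark_eno_points(p)), "of", x.size, "points flagged")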

  18. The Evolution of the Internet Community and the"Yet-to-Evolve" Smart Grid Community: Parallels and Lessons-to-be-Learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McParland, Charles

    The Smart Grid envisions a transformed US power distribution grid that enables communicating devices, under human supervision, to moderate loads and increase overall system stability and security. This vision explicitly promotes increased participation from a community that, in the past, has had little involvement in power grid operations - the consumer. The potential size of this new community and its members' extensive experience with the public Internet prompts an analysis of the evolution and current state of the Internet as a predictor for best practices in the architectural design of certain portions of the Smart Grid network. Although still evolving, the vision of the Smart Grid is that of a community of communicating and cooperating energy related devices that can be directed to route power and modulate loads in pursuit of an integrated, efficient and secure electrical power grid. The remaking of the present power grid into the Smart Grid is considered as fundamentally transformative as previous developments such as modern computing technology and high bandwidth data communications. However, unlike these earlier developments, which relied on the discovery of critical new technologies (e.g. the transistor or optical fiber transmission lines), the technologies required for the Smart Grid currently exist and, in many cases, are already widely deployed. In contrast to other examples of technical transformations, the path (and success) of the Smart Grid will be determined not by its technology, but by its system architecture. Fortunately, we have a recent example of a transformative force of similar scope that shares a fundamental dependence on our existing communications infrastructure - namely, the Internet. We will explore several ways in which the scale of the Internet and expectations of its users have shaped the present Internet environment. As the presence of consumers within the Smart Grid increases, some experiences from the early growth of the Internet are expected to be informative and pertinent.

  19. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakeman, J.D., E-mail: jdjakem@sandia.gov; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the physical discretization error and the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity of the sparse grid. Utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  20. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids, and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  1. A Review on Development Practice of Smart Grid Technology in China

    NASA Astrophysics Data System (ADS)

    Han, Liu; Chen, Wei; Zhuang, Bo; Shen, Hongming

    2017-05-01

    Smart grid has become an inexorable trend of energy and economy development worldwide. Since the development of smart grid was put forward in China in 2009, we have obtained abundant research results and practical experiences as well as extensive attention from international community in this field. This paper reviews the key technologies and demonstration projects on new energy connection forecasts; energy storage; smart substations; disaster prevention and reduction for power transmission lines; flexible DC transmission; distribution automation; distributed generation access and micro grid; smart power consumption; the comprehensive demonstration of power distribution and utilization; smart power dispatching and control systems; and the communication networks and information platforms of China, systematically, on the basis of 5 fields, i.e., renewable energy integration, smart power transmission and transformation, smart power distribution and consumption, smart power dispatching and control systems and information and communication platforms. Meanwhile, it also analyzes and compares with the developmental level of similar technologies abroad, providing an outlook on the future development trends of various technologies.

  2. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and Javascript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.

  3. Musical rhythm and reading development: does beat processing matter?

    PubMed

    Ozernov-Palchik, Ola; Patel, Aniruddh D

    2018-05-20

    There is mounting evidence for links between musical rhythm processing and reading-related cognitive skills, such as phonological awareness. This may be because music and speech are rhythmic: both involve processing complex sound sequences with systematic patterns of timing, accent, and grouping. Yet, there is a salient difference between musical and speech rhythm: musical rhythm is often beat-based (based on an underlying grid of equal time intervals), while speech rhythm is not. Thus, the role of beat-based processing in the reading-rhythm relationship is not clear. Is there a distinct relation between beat-based processing mechanisms and reading-related language skills, or is the rhythm-reading link entirely due to shared mechanisms for processing nonbeat-based aspects of temporal structure? We discuss recent evidence for a distinct link between beat-based processing and early reading abilities in young children, and suggest experimental designs that would allow one to further methodically investigate this relationship. We propose that beat-based processing taps into a listener's ability to use rich contextual regularities to form predictions, a skill important for reading development. © 2018 New York Academy of Sciences.

  4. On the use of Schwarz-Christoffel conformal mappings to the grid generation for global ocean models

    NASA Astrophysics Data System (ADS)

    Xu, S.; Wang, B.; Liu, J.

    2015-02-01

    In this article we propose two conformal mapping based grid generation algorithms for global ocean general circulation models (OGCMs). In contrast to conventional dipolar or tripolar grids based on analytical forms, the new algorithms are based on Schwarz-Christoffel (SC) conformal mapping with prescribed boundary information. While dealing with the basic grid design problem of pole relocation, these new algorithms also address more advanced issues such as a smoothed scaling factor, or the new requirements on OGCM grids arising from the recent trend of high-resolution and multi-scale modeling. The proposed grid generation algorithm could potentially achieve the alignment of grid lines to coastlines, enhanced spatial resolution in coastal regions, and easier computational load balance. Since the generated grids are still orthogonal curvilinear, they can be readily utilized in existing Bryan-Cox-Semtner type ocean models. The proposed methodology can also be applied to the grid generation task for regional ocean modeling where complex land-ocean distribution is present.

  5. Invited: Advances Toward Practical Detection of Trace Chemical Hazards with Solid State Microarray Devices

    NASA Astrophysics Data System (ADS)

    Raman, Barani; Meier, Douglas; Shenoy, Rupa; Benkstein, Kurt; Semancik, Steve

    2011-09-01

    We describe progress on an array-based microsensor approach employed for detecting trace levels of toxic industrial chemicals (TICs) in air-based backgrounds with varied levels of humidity, and with occasional introduction of aggressive interferents. Our MEMS microhotplate arrays are populated with multiple chemiresistive sensing materials, and all elements are programmed to go through extensive temperature cycling over repetitive cycles with lengths of approximately 20 s. Under such operation, analytically-rich data streams are produced containing the required information for target recognition.

  6. Exploration of the Core and Variable Dimensions of Extensive Reading Research and Pedagogy

    ERIC Educational Resources Information Center

    Waring, Rob; McLean, Stuart

    2015-01-01

    The Extensive Reading Foundation's bibliography now boasts over 530 articles with "Extensive Reading" in the title. About 35% of this rich and diverse body of papers were published in the past decade. A meta-review of this literature shows it is quite fragmented as evidenced by considerable variability in the conceptualization of…

  7. Durable Hybrid Coatings Annual Performance Report (2009)

    DTIC Science & Technology

    2009-10-01

    results based on lengths of cracks on different topcoat/primer combinations. [Figure: crack-length results for non-topcoated, high-gloss, low-gloss, and white-enamel topcoat/primer combinations.] ...SR-285, showed extensive cracking and delamination upon cure and, thus, were eliminated from further investigation. Figure 3.15 shows the viscosity... solids polyurethane gloss enamel (AKZO NOBEL 646-58-7925 with AKZO NOBEL X-501 curing component) and a Mg-rich primer developed at NDSU. In this

  8. Middle Devonian depositional environments for Rapid Member (Cedar Valley Group) interpreted from exposures at the Coralville spillway and elsewhere in eastern Iowa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witzke, B.J.; Bunker, B.J.

    1994-04-01

    Ongoing studies of the Cedar Valley Group were fortuitously aided by the exhumation of Rapid Mbr (Little Cedar Fm) carbonate strata below the Coralville spillway, Johnson Co., Iowa, during the summer floods of 1993. Extensive bedding-plane exposures provided an exceptional opportunity to document fine-scale lateral biotic and lithologic variations within the member, and to compare these with data from elsewhere in eastern Iowa. The base of the Rapid Mbr is drawn at an abrupt lithic change above packstones of the Solon Mbr, marking a regional transgressive event. The basal 2.6 m of the Rapid is dominated by argillaceous skeletal wackestones with common brachiopods and echinoderm debris, interspersed with thin mudstones. The overlying 4 m comprises a series of 50-100 cm thick couplets which display alternations of thin mudstones and thicker brachiopod-rich wackepackstones. The next unit (2.9 m) is dominated by sparse-skeletal argillaceous mudstones. The paucity of burrowing and skeletal benthos through much of the unit is interpreted to reflect bottom oxygen stress in a relatively deep, possibly stratified cratonic seaway. Nevertheless, skeletal horizons within the unit indicate episodically favorable benthic conditions. The mudstone unit shallows upward into a brachiopod-rich wackestone interval which is, in turn, capped by a condensed horizon of phosphatic and glauconitic enrichment (near base of subterminus Fauna). Above this, two regionally extensive coral-rich biostromes occur. Upper Rapid strata show a complex of wackestone and packstone facies, with glauconitic enrichment and hardgrounds noted. The member is capped by shoal-water grainstones in the Coralville area, and by peritidal facies in northern Iowa.

  9. Thermo-Gas-Dynamic Model of Afterburning in Explosions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhl, A L; Ferguson, R E; Bell, J B

    2003-07-27

    A theoretical model of afterburning in explosions created by turbulent mixing of the detonation products from fuel-rich charges with air is described. It contains three key elements: (i) a thermodynamic-equilibrium description of the fluids (fuel, air, and products), (ii) a multi-component gas-dynamic treatment of the flow field, and (iii) a sub-grid model of molecular processes of mixing, combustion and equilibration.

  10. FUN3D Grid Refinement and Adaptation Studies for the Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Vasta, Veer; Carlson, Jan-Renee; Park, Mike; Mineck, Raymond E.

    2010-01-01

    This paper presents grid refinement and adaptation studies performed in conjunction with computational aeroelastic analyses of the Ares crew launch vehicle (CLV). The unstructured grids used in this analysis were created with GridTool and VGRID while the adaptation was performed using the Computational Fluid Dynamic (CFD) code FUN3D with a feature based adaptation software tool. GridTool was developed by ViGYAN, Inc. while the last three software suites were developed by NASA Langley Research Center. The feature based adaptation software used here operates by aligning control volumes with shock and Mach line structures and by refining/de-refining where necessary. It does not redistribute node points on the surface. This paper assesses the sensitivity of the complex flow field about a launch vehicle to grid refinement. It also assesses the potential of feature based grid adaptation to improve the accuracy of CFD analysis for a complex launch vehicle configuration. The feature based adaptation shows the potential to improve the resolution of shocks and shear layers. Further development of the capability to adapt the boundary layer and surface grids of a tetrahedral grid is required for significant improvements in modeling the flow field.

  11. Editorial Introduction: Lunar Reconnaissance Orbiter, Part II

    NASA Technical Reports Server (NTRS)

    Petro, Noah E.; Keller, John W.; Gaddis, Lisa R.

    2016-01-01

    The Lunar Reconnaissance Orbiter (LRO) mission has shifted our understanding of the history of the Moon. The seven instruments on LRO each have contributed to creating new paradigms for the evolution of the Moon by providing unprecedented measurements of the surface, subsurface, and lunar environment. In this second volume of the LRO Special Issue, we present 21 papers from a broad range of the areas of investigation from LRO, from the volatile inventory, to the shape of the Moon's surface, to its rich volcanic history, and the interactions between the lunar surface and the space environment. These themes provide rich science for the instrument teams, as well as for the broader science community who continue to use the LRO data in their research. Each paper uses publicly available data from one or more instruments on LRO, illustrating the value of a robust spacecraft. For example, the production of high-resolution topographic data products from the LRO Camera Narrow Angle Camera (Henriksen et al., pp. 122-137, this issue) relies on the accurate geodetic grid produced by the LOLA instrument (Mao et al., pp. 55-69, this issue; Smith et al., pp. 70-91, this issue). Additionally, analysis of LRO data coupled with other spacecraft data, such as LADEE (Hurley et al., pp. 31-37, this issue) and GRAIL (e.g., Jozwiak et al., pp. 224-231, this issue), illustrates the utility of merging not only data from multiple instruments, but also multiple orbital platforms. These synergistic studies show the value of the inter-team approach adopted by the LRO mission. This second volume represents the culmination of an extensive effort to highlight the high-quality science still being produced by the LRO instrument teams, even after more than seven years in orbit at the Moon.

  12. Sage Studies Of The Mass Return From AGB And RSG Stars In The Large Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Sargent, Benjamin A.; Srinivasan, S.; Meixner, M.

    2011-01-01

    The Surveying the Agents of a Galaxy's Evolution (SAGE; PI: M. Meixner) Spitzer Space Telescope Legacy project aims to further our understanding of the life cycle of matter in galaxies by studying this life cycle in our neighboring galaxy, the Large Magellanic Cloud (LMC). Combining SAGE mid-infrared photometry with that at shorter wavelengths from other catalogs, the spectral energy distribution (SED) for each of >25000 Asymptotic Giant Branch (AGB) and Red Supergiant (RSG) stars in the LMC has been assembled. To model mass loss from these stars, my colleagues and I have constructed the grid of RSG and AGB models (GRAMS) using the radiative transfer code 2Dust. I will discuss how GRAMS was constructed, and how we use it to determine the mass-loss rate for each evolved star studied, which gives the total mass-loss return to the LMC from AGB and RSG stars. In my talk, I show how this total mass-loss return is divided into oxygen-rich (O-rich) and carbon-rich (C-rich) dust using SED-fitting to identify O-rich versus C-rich AGB stars. Applications of this work to determining the mass return from evolved stars in other galaxies, including the Milky Way, will also be discussed.

  13. Error Estimates of the Ares I Computed Turbulent Ascent Longitudinal Aerodynamic Analysis

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Ghaffari, Farhad

    2012-01-01

    Numerical predictions of the longitudinal aerodynamic characteristics for the Ares I class of vehicles, along with the associated error estimate derived from an iterative convergence grid refinement, are presented. Computational results are based on an unstructured grid, Reynolds-averaged Navier-Stokes analysis. The validity of the approach to compute the associated error estimates, derived from a base grid to an extrapolated infinite-size grid, was first demonstrated on a sub-scaled wind tunnel model at representative ascent flow conditions for which the experimental data existed. Such analysis at the transonic flow conditions revealed a maximum deviation of about 23% between the computed longitudinal aerodynamic coefficients with the base grid and the measured data across the entire range of roll angles. This maximum deviation from the wind tunnel data was associated with the computed normal force coefficient at the transonic flow condition and was reduced to approximately 16% based on the infinite-size grid. However, all the computed aerodynamic coefficients with the base grid at the supersonic flow conditions showed a maximum deviation of only about 8%, with that level being improved to approximately 5% for the infinite-size grid. The results and the error estimates based on the established procedure are also presented for the flight flow conditions.
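    The extrapolation from a base grid to an "infinite-size" grid mentioned above is commonly done with Richardson extrapolation over a sequence of systematically refined grids. The sketch below shows that generic procedure; the coefficient values are made up for illustration and the paper's exact formulation may differ.

    ```python
    # Generic Richardson extrapolation from three systematically refined grids,
    # as commonly used to estimate a grid-converged ("infinite-grid") value.
    # Coefficient values below are made up for illustration; the paper's exact
    # procedure may differ in detail.
    import math

    def richardson(f_coarse, f_medium, f_fine, r):
        """Return (observed order p, extrapolated value) for refinement ratio r."""
        p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
        f_inf = f_fine + (f_fine - f_medium) / (r**p - 1.0)
        return p, f_inf

    # Normal-force coefficient computed on coarse, medium, and fine grids (r = 2).
    p, cn_inf = richardson(0.512, 0.531, 0.538, r=2.0)
    print(f"observed order ~ {p:.2f}, extrapolated CN ~ {cn_inf:.4f}")
    print(f"estimated error of the fine-grid value: {abs(0.538 - cn_inf):.4f}")
    ```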

  14. An infrastructure for the integration of geoscience instruments and sensors on the Grid

    NASA Astrophysics Data System (ADS)

    Pugliese, R.; Prica, M.; Kourousias, G.; Del Linz, A.; Curri, A.

    2009-04-01

    The Grid, as a computing paradigm, has long held the attention of both academia and industry[1]. The distributed and expandable nature of its general architecture results in scalability and more efficient utilisation of the computing infrastructures. The scientific community, including that of the geosciences, often handles problems with very high requirements for data processing, transfer, and storage[2,3]. This has raised interest in Grid technologies, but these are often viewed solely as an access gateway to HPC. Suitable Grid infrastructures could provide the geoscience community with additional benefits such as sharing, remote access, and control of scientific systems. These systems can be scientific instruments, sensors, robots, cameras and any other device used in the geosciences. A practical, general, and feasible solution for Grid-enabling such devices requires non-intrusive extensions to core parts of the current Grid architecture. We propose an extended version of an architecture[4] that can serve as the solution to the problem. The solution we propose is called the Grid Instrument Element (IE)[5]. It is an addition to the existing core Grid parts, the Computing Element (CE) and the Storage Element (SE), which serve the purposes their names suggest. The IE that we will be referring to, and the related technologies, have been developed in the EU project on the Deployment of Remote Instrumentation Infrastructure (DORII). In DORII, partners from various scientific communities, including earthquake, environmental, and experimental science, have adopted the Instrument Element technology in order to integrate their devices into the Grid. The Oceanographic and coastal observation and modelling Mediterranean Ocean Observing Network (OGS), a DORII partner, is in the process of deploying the above-mentioned Grid technologies on two types of observational modules: Argo profiling floats and a novel Autonomous Underwater Vehicle (AUV). In this paper i) we define the need for integration of instrumentation in the Grid, ii) we introduce the solution of the Instrument Element, iii) we demonstrate a suitable end-user web portal for accessing Grid resources, and iv) we describe, from the Grid-technology point of view, the process of integrating two advanced environmental monitoring devices into the Grid.
    References: [1] M. Surridge, S. Taylor, D. De Roure, and E. Zaluska, "Experiences with GRIA—Industrial Applications on a Web Services Grid," e-Science and Grid Computing, First International Conference on e-Science and Grid Computing, 2005, pp. 98-105. [2] A. Chervenak, I. Foster, C. Kesselman, C. Salisbury, and S. Tuecke, "The data grid: Towards an architecture for the distributed management and analysis of large scientific datasets," Journal of Network and Computer Applications, vol. 23, 2000, pp. 187-200. [3] B. Allcock, J. Bester, J. Bresnahan, A.L. Chervenak, I. Foster, C. Kesselman, S. Meder, V. Nefedova, D. Quesnel, and S. Tuecke, "Data management and transfer in high-performance computational grid environments," Parallel Computing, vol. 28, 2002, pp. 749-771. [4] E. Frizziero, M. Gulmini, F. Lelli, G. Maron, A. Oh, S. Orlando, A. Petrucci, S. Squizzato, and S. Traldi, "Instrument Element: A New Grid component that Enables the Control of Remote Instrumentation," Proceedings of the Sixth IEEE International Symposium on Cluster Computing and the Grid (CCGRID'06)-Volume 00, IEEE Computer Society Washington, DC, USA, 2006. [5] R. Ranon, L. De Marco, A. Senerchia, S. Gabrielli, L. Chittaro, R. Pugliese, L. Del Cano, F. Asnicar, and M. Prica, "A Web-based Tool for Collaborative Access to Scientific Instruments in Cyberinfrastructures."
    Notes: The DORII project is supported by the European Commission within the 7th Framework Programme (FP7/2007-2013) under grant agreement no. RI-213110; URL: http://www.dorii.eu. OGS: Istituto Nazionale di Oceanografia e di Geofisica Sperimentale; URL: http://www.ogs.trieste.it.

  15. Recent advances in the study of the UO2-PuO2 phase diagram at high temperatures

    NASA Astrophysics Data System (ADS)

    Böhler, R.; Welland, M. J.; Prieur, D.; Cakir, P.; Vitova, T.; Pruessmann, T.; Pidchenko, I.; Hennig, C.; Guéneau, C.; Konings, R. J. M.; Manara, D.

    2014-05-01

    Recently, novel container-less laser heating experimental data have been published on the melting behaviour of pure PuO2 and PuO2-rich compositions in the uranium dioxide-plutonium dioxide system. Such data showed that previous data obtained by more traditional furnace heating techniques were affected by extensive interaction between the sample and its containment. It is therefore paramount to check whether data so far used by nuclear engineers for the uranium-rich side of the pseudo-binary dioxide system can be confirmed or not. In the present work, new data are presented both in the UO2-rich part of the phase diagram, most interesting for the uranium-plutonium dioxide based nuclear fuel safety, and in the PuO2 side. The new results confirm earlier furnace heating data in the uranium-dioxide rich part of the phase diagram, and more recent laser-heating data in the plutonium-dioxide side of the system. As a consequence, it is also confirmed that a minimum melting point must exist in the UO2-PuO2 system, at a composition between x(PuO2) = 0.4 and x(PuO2) = 0.7 and 2900 K ⩽ T ⩽ 3000 K. Taking into account that, especially at high temperature, oxygen chemistry has an effect on the reported phase boundary uncertainties, the current results should be projected in the ternary U-Pu-O system. This aspect has been extensively studied here by X-ray diffraction and X-ray absorption spectroscopy. The current results suggest that uncertainty bands related to oxygen behaviour in the equilibria between condensed phases and gas should not significantly affect the qualitative trend of the current solid-liquid phase boundaries.

  16. Cause and Cure-Deterioration in Accuracy of CFD Simulations with Use of High-Aspect-Ratio Triangular/Tetrahedral Grids

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Chang, Chau-Lyan; Venkatachari, Balaji

    2017-01-01

    In the multi-dimensional space-time conservation element and solution element (CESE) method, triangles and tetrahedral mesh elements turn out to be the most natural building blocks for 2D and 3D spatial grids, respectively. As such, the CESE method is naturally compatible with the simplest 2D and 3D unstructured grids and thus can be easily applied to solve problems with complex geometries. However, (a) accurate solution of a high-Reynolds-number flow field near a solid wall requires that the grid intervals along the direction normal to the wall be much finer than those in a direction parallel to the wall, so the use of grid cells with extremely high aspect ratio (10^3 to 10^6) may become mandatory; and (b) unlike for quadrilateral/hexahedral grids, it is well known that the accuracy of gradient computations on triangular/tetrahedral grids tends to deteriorate rapidly as cell aspect ratio increases. As a result, the use of triangular/tetrahedral grid cells near a solid wall has long been deemed impractical by CFD researchers. In view of (a) the critical role played by triangular/tetrahedral grids in the CESE development, and (b) the importance of accurately resolving high-Reynolds-number flow fields near a solid wall, a comprehensive and rigorous mathematical framework that clearly identifies the reasons behind the accuracy deterioration described above has been developed for the 2D case involving triangular cells, as presented in the main paper. By avoiding the pitfalls identified by the 2D framework and its 3D extension, it is shown numerically that this accuracy deterioration can be avoided.
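    Point (b) above can be illustrated with a one-triangle experiment: the gradient of the linear interpolant of a smooth function on a triangle that becomes increasingly flat. This is a generic demonstration of the aspect-ratio sensitivity of gradient computations, not the CESE method itself.

    ```python
    # Gradient of the linear (P1) interpolant of f(x, y) = x**2 on a triangle
    # that degenerates as eps -> 0 (vertices (0,0), (1,0), (0.5, eps)).
    # The y-derivative error grows like 0.25/eps, illustrating how gradient
    # accuracy deteriorates with cell aspect ratio. Generic demo, not CESE.
    import numpy as np

    def p1_gradient(vertices, values):
        """Gradient (b, c) of the plane a + b*x + c*y through three points."""
        A = np.column_stack([np.ones(3), vertices])      # rows are [1, x, y]
        a, b, c = np.linalg.solve(A, values)
        return np.array([b, c])

    f = lambda x, y: x**2                                 # exact gradient: (2x, 0)
    for eps in (1e-1, 1e-2, 1e-3, 1e-4):
        verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, eps]])
        grad = p1_gradient(verts, f(verts[:, 0], verts[:, 1]))
        exact = np.array([1.0, 0.0])                      # gradient at centroid, x = 0.5
        print(f"eps={eps:8.0e}  aspect ratio~{1/eps:8.0e}  "
              f"gradient error={np.linalg.norm(grad - exact):.3e}")
    ```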

  17. Broad-band beam buncher

    DOEpatents

    Goldberg, D.A.; Flood, W.S.; Arthur, A.A.; Voelker, F.

    1984-03-20

    A broad-band beam buncher is disclosed, comprising an evacuated housing, an electron gun therein for producing an electron beam, a buncher cavity having entrance and exit openings through which the beam is directed, grids across such openings, a source providing a positive DC voltage between the cavity and the electron gun, a drift tube through which the electron beam travels in passing through such cavity, grids across the ends of such drift tube, gaps being provided between the drift tube grids and the entrance and exit grids, a modulator for supplying an ultrahigh frequency modulating signal to the drift tube for producing velocity modulation of the electrons in the beam, a drift space in the housing through which the velocity modulated electron beam travels and in which the beam is bunched, and a discharge opening from such drift tube and having a grid across such opening through which the bunched electron beam is discharged into an accelerator or the like. The buncher cavity and the drift tube may be arranged to constitute an extension of a coaxial transmission line which is employed to deliver the modulating signal from a signal source. The extended transmission line may be terminated in its characteristic impedance to afford a broad-band response.

  18. From grid cells to place cells with realistic field sizes

    PubMed Central

    2017-01-01

    While grid cells in the medial entorhinal cortex (MEC) of rodents have multiple, regularly arranged firing fields, place cells in the cornu ammonis (CA) regions of the hippocampus mostly have single spatial firing fields. Since there are extensive projections from MEC to the CA regions, many models have suggested that a feedforward network can transform grid cell firing into robust place cell firing. However, these models generate place fields that are consistently too small compared to those recorded in experiments. Here, we argue that it is implausible that grid cell activity alone can be transformed into place cells with robust place fields of realistic size in a feedforward network. We propose two solutions to this problem. Firstly, weakly spatially modulated cells, which are abundant throughout EC, provide input to downstream place cells along with grid cells. This simple model reproduces many place cell characteristics as well as results from lesion studies. Secondly, the recurrent connections between place cells in the CA3 network generate robust and realistic place fields. Both mechanisms could work in parallel in the hippocampal formation and this redundancy might account for the robustness of place cell responses to a range of disruptions of the hippocampal circuitry. PMID:28750005
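    The feedforward models referred to above are often illustrated with a toy construction in which a downstream unit sums randomly weighted, thresholded grid-cell firing maps of different spacings, orientations, and phases. The sketch below is such a generic toy with arbitrary parameter values; it is not the authors' model.

    ```python
    # Toy feedforward grid-to-place transformation: a downstream unit sums
    # randomly weighted grid-cell firing maps and thresholds the result.
    # Generic illustration with arbitrary parameters, not the authors' model.
    import numpy as np

    rng = np.random.default_rng(0)
    L = 1.0                                   # side of square environment (m)
    n = 100                                   # spatial bins per side
    xs = np.linspace(0.0, L, n)
    X, Y = np.meshgrid(xs, xs)

    def grid_map(spacing, orientation, phase):
        """Hexagonal firing map as a sum of three plane waves 60 degrees apart."""
        rate = np.zeros_like(X)
        for k in range(3):
            theta = orientation + k * np.pi / 3.0
            kx, ky = (4.0 * np.pi / (np.sqrt(3.0) * spacing)) * np.array(
                [np.cos(theta), np.sin(theta)])
            rate += np.cos(kx * (X - phase[0]) + ky * (Y - phase[1]))
        return np.maximum(rate, 0.0)          # rectify to non-negative rates

    # Sum many grid inputs with random spacings, orientations, phases, weights.
    total = np.zeros_like(X)
    for _ in range(50):
        g = grid_map(spacing=rng.uniform(0.3, 0.7),
                     orientation=rng.uniform(0.0, np.pi / 3.0),
                     phase=rng.uniform(0.0, L, size=2))
        total += rng.uniform(0.0, 1.0) * g

    thr = 0.8 * total.max()                   # simple fixed output threshold
    field_frac = (total > thr).mean()
    print(f"fraction of the environment above threshold: {field_frac:.3f}")
    ```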

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arakawa, Akio; Konor, C.S.

    Two types of vertical grids are used for atmospheric models: the Lorenz grid (L grid) and the Charney-Phillips grid (CP grid). In this paper, problems with the L grid are pointed out that are due to the existence of an extra degree of freedom in the vertical distribution of the temperature (and the potential temperature). Then a vertical differencing of the primitive equations based on the CP grid is presented, while most of the advantages of the L grid in a hybrid σ-p vertical coordinate are maintained. The discrete hydrostatic equation is constructed in such a way that it is free from the vertical computational mode in the thermal field. Also, the vertical advection of the potential temperature in the discrete thermodynamic equation is constructed in such a way that it reduces to the standard (and most straightforward) vertical differencing of the quasigeostrophic equations based on the CP grid. Simulations of standing oscillations superposed on a resting atmosphere are presented using two vertically discrete models, one based on the L grid and the other on the CP grid. The comparison of the simulations shows that with the L grid a stationary vertically zigzag pattern dominates in the thermal field, while with the CP grid no such pattern is evident. Simulations of the growth of an extratropical cyclone in a cyclic channel on a β plane are also presented using two different σ-coordinate models, again one with the L grid and the other with the CP grid, starting from random disturbances. 17 refs., 8 figs.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Sangwook; Bhalerao, Jayant, E-mail: s.park@uta.edu

    The supernova remnant (SNR) N49B in the Large Magellanic Cloud is a peculiar example of a core-collapse SNR that shows the shocked metal-rich ejecta enriched only in Mg without evidence for a similar overabundance in O and Ne. Based on archival Chandra data, we present results from our extensive spatially resolved spectral analysis of N49B. We find that the Mg-rich ejecta gas extends from the central regions of the SNR out to the southeastern outermost boundary of the SNR. This elongated feature shows an overabundance for Mg similar to that of the main ejecta region at the SNR center, and its electron temperature appears to be higher than that of the central main ejecta gas. We estimate that the Mg mass in this southeastern elongated ejecta feature is ∼10% of the total Mg ejecta mass. Our estimated lower limit of >0.1 M⊙ on the total mass of the Mg-rich ejecta confirms the previously suggested large mass for the progenitor star (M ≳ 25 M⊙). We entertain scenarios of an SNR expanding into a nonuniform medium and an energetic jet-driven supernova in an attempt to interpret these results. However, with the current results, the origins of the extended Mg-rich ejecta and the Mg-only-rich nature of the overall metal-rich ejecta in this SNR remain elusive.

  1. caGrid 1.0: a Grid enterprise architecture for cancer research.

    PubMed

    Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel

    2007-10-11

    caGrid is the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG) program. The current release, caGrid version 1.0, is developed as the production Grid software infrastructure of caBIG. Based on feedback from adopters of the previous version (caGrid 0.5), it has been significantly enhanced with new features and improvements to existing components. This paper presents an overview of caGrid 1.0, its main components, and enhancements over caGrid 0.5.

  2. Spatial Data Transfer Standard (SDTS), part 5 : SDTS raster profile and extensions

    DOT National Transportation Integrated Search

    1998-01-01

    The SRPE contains specifications for a profile for use with georeferenced two-dimensional raster data. Both raster image and raster grid data are included within the scope of this profile. The transfer of indirectly referenced images is permitted, i....

  3. FLUXCOM - Overview and First Synthesis

    NASA Astrophysics Data System (ADS)

    Jung, M.; Ichii, K.; Tramontana, G.; Camps-Valls, G.; Schwalm, C. R.; Papale, D.; Reichstein, M.; Gans, F.; Weber, U.

    2015-12-01

    We present a community effort aiming at generating an ensemble of global gridded flux products by upscaling FLUXNET data using an array of different machine learning methods including regression/model tree ensembles, neural networks, and kernel machines. We produced products for gross primary production, terrestrial ecosystem respiration, net ecosystem exchange, latent heat, sensible heat, and net radiation for two experimental protocols: 1) at a high spatial and 8-daily temporal resolution (5 arc-minute) using only remote sensing based inputs for the MODIS era; 2) 30 year records of daily, 0.5 degree spatial resolution by incorporating meteorological driver data. Within each set-up, all machine learning methods were trained with the same input data for carbon and energy fluxes respectively. Sets of input driver variables were derived using an extensive formal variable selection exercise. The performance of the extrapolation capacities of the approaches is assessed with a fully internally consistent cross-validation. We perform cross-consistency checks of the gridded flux products with independent data streams from atmospheric inversions (NEE), sun-induced fluorescence (GPP), catchment water balances (LE, H), satellite products (Rn), and process-models. We analyze the uncertainties of the gridded flux products and for example provide a breakdown of the uncertainty of mean annual GPP originating from different machine learning methods, different climate input data sets, and different flux partitioning methods. The FLUXCOM archive will provide an unprecedented source of information for water, energy, and carbon cycle studies.
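    A minimal sketch of the upscaling pattern described above: several machine-learning regressors are trained on a common predictor set and scored by cross-validation. The data here are synthetic stand-ins for FLUXNET tower fluxes and their drivers; FLUXCOM's actual methods, predictors, and protocols are far more extensive.

    ```python
    # Minimal sketch of flux upscaling with an ensemble of ML regressors trained
    # on a common predictor set and scored by cross-validation. Synthetic data
    # stand in for FLUXNET tower fluxes and their remote-sensing/meteo drivers.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_sites = 500
    # Predictors: e.g. NDVI, air temperature, incoming shortwave, precipitation.
    X = rng.uniform(size=(n_sites, 4))
    # Synthetic "GPP" with a nonlinear dependence on the drivers plus noise.
    y = 10 * X[:, 0] * np.tanh(3 * X[:, 1]) + 2 * X[:, 2] + rng.normal(0, 0.5, n_sites)

    models = {
        "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
        "neural_net": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                   random_state=0),
    }
    for name, model in models.items():
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
        print(f"{name}: cross-validated R^2 = {r2.mean():.2f} +/- {r2.std():.2f}")

    # The trained models would then be applied to gridded driver fields to
    # produce an ensemble of flux estimates.
    ```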

  4. The numerics of hydrostatic structured-grid coastal ocean models: State of the art and future perspectives

    NASA Astrophysics Data System (ADS)

    Klingbeil, Knut; Lemarié, Florian; Debreu, Laurent; Burchard, Hans

    2018-05-01

    The state of the art of the numerics of hydrostatic structured-grid coastal ocean models is reviewed here. First, some fundamental differences in the hydrodynamics of the coastal ocean, such as the large surface elevation variation compared to the mean water depth, are contrasted against large scale ocean dynamics. Then the hydrodynamic equations as they are used in coastal ocean models as well as in large scale ocean models are presented, including parameterisations for turbulent transports. As steps towards discretisation, coordinate transformations and spatial discretisations based on a finite-volume approach are discussed with focus on the specific requirements for coastal ocean models. As in large scale ocean models, splitting of internal and external modes is essential also for coastal ocean models, but specific care is needed when drying & flooding of intertidal flats is included. As one obvious characteristic of coastal ocean models, open boundaries occur and need to be treated in a way that correct model forcing from outside is transmitted to the model domain without reflecting waves from the inside. Here, also new developments in two-way nesting are presented. Single processes such as internal inertia-gravity waves, advection and turbulence closure models are discussed with focus on the coastal scales. Some overview on existing hydrostatic structured-grid coastal ocean models is given, including their extensions towards non-hydrostatic models. Finally, an outlook on future perspectives is made.

  5. A Decentralized Wireless Solution to Monitor and Diagnose PV Solar Module Performance Based on Symmetrized-Shifted Gompertz Functions

    PubMed Central

    Molina-García, Angel; Campelo, José Carlos; Blanc, Sara; Serrano, Juan José; García-Sánchez, Tania; Bueso, María C.

    2015-01-01

    This paper proposes and assesses an integrated solution to monitor and diagnose photovoltaic (PV) solar modules based on a decentralized wireless sensor acquisition system. Both DC electrical variables and environmental data are collected at PV module level using low-cost, energy-efficient sensor nodes. Data are processed locally in real time and compared with expected PV module performances obtained by a PV module model based on symmetrized-shifted Gompertz functions (as previously developed and assessed by the authors). Sensor nodes send data to a centralized sink-computing module using a multi-hop wireless sensor network architecture. Such integration thus provides extensive analysis of PV installations, and avoids off-line tests or post-processing steps. In comparison with previous approaches, this solution is enhanced with a low-cost system and non-critical performance constraints, and it is suitable for extensive deployment in PV power plants. Moreover, it is easily implemented in existing PV installations, since no additional wiring is required. The system has been implemented and assessed in a Spanish PV power plant connected to the grid. Results and estimations of PV module performances are also included in the paper. PMID:26230694

  6. A Decentralized Wireless Solution to Monitor and Diagnose PV Solar Module Performance Based on Symmetrized-Shifted Gompertz Functions.

    PubMed

    Molina-García, Angel; Campelo, José Carlos; Blanc, Sara; Serrano, Juan José; García-Sánchez, Tania; Bueso, María C

    2015-07-29

    This paper proposes and assesses an integrated solution to monitor and diagnose photovoltaic (PV) solar modules based on a decentralized wireless sensor acquisition system. Both DC electrical variables and environmental data are collected at PV module level using low-cost, energy-efficient sensor nodes. Data are processed locally in real time and compared with expected PV module performances obtained by a PV module model based on symmetrized-shifted Gompertz functions (as previously developed and assessed by the authors). Sensor nodes send data to a centralized sink-computing module using a multi-hop wireless sensor network architecture. Such integration thus provides extensive analysis of PV installations, and avoids off-line tests or post-processing steps. In comparison with previous approaches, this solution is enhanced with a low-cost system and non-critical performance constraints, and it is suitable for extensive deployment in PV power plants. Moreover, it is easily implemented in existing PV installations, since no additional wiring is required. The system has been implemented and assessed in a Spanish PV power plant connected to the grid. Results and estimations of PV module performances are also included in the paper.
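    The node-level check described above amounts to comparing measured module power against an expected-performance curve and flagging large deviations. In the sketch below a plain Gompertz curve in irradiance stands in for the paper's symmetrized-shifted Gompertz model, and all parameter values are illustrative.

    ```python
    # Sketch of the node-level check described above: compare measured PV module
    # power against an expected-performance curve and flag large deviations.
    # A plain Gompertz curve in irradiance stands in for the paper's
    # symmetrized-shifted Gompertz model; parameters below are illustrative.
    import numpy as np

    def expected_power(irradiance, p_max=300.0, b=5.0, c=0.01):
        """Illustrative Gompertz-shaped expected power (W) vs irradiance (W/m^2)."""
        return p_max * np.exp(-b * np.exp(-c * irradiance))

    def flag_anomalies(irradiance, measured_power, rel_tol=0.15):
        """Return a boolean mask of samples deviating more than rel_tol."""
        expected = expected_power(irradiance)
        return np.abs(measured_power - expected) > rel_tol * np.maximum(expected, 1.0)

    # Example: one underperforming sample among otherwise nominal readings.
    g = np.array([200.0, 400.0, 600.0, 800.0, 1000.0])
    p = expected_power(g) * np.array([1.0, 0.98, 0.55, 1.02, 1.0])   # 0.55 = fault
    print("anomalous samples:", np.where(flag_anomalies(g, p))[0])
    ```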

  7. Disentangling the influence of environmental and anthropogenic factors on the distribution of endemic vascular plants in Sardinia.

    PubMed

    Fois, Mauro; Fenu, Giuseppe; Cañadas, Eva Maria; Bacchetta, Gianluigi

    2017-01-01

    Due to the pressing urgency of plant conservation and the increasing availability of high resolution spatially interpolated (e.g. climate variables) and categorical data (e.g. land cover and vegetation type), many recent studies have examined relationships between plant species distributions and a diverse set of explanatory factors; nevertheless, global and regional patterns of endemic plant richness remain in many cases unexplained. One such pattern is that of the 294 endemic vascular plant taxa recorded on a 1 km resolution grid on the environmentally heterogeneous island of Sardinia. Sixteen predictors, including topographic, geological, climatic and anthropogenic factors, were used to model local (number of taxa inside each 1 km grid cell) Endemic Vascular Plant Richness (EVPR). Generalized Linear Models were used to evaluate how each factor affected the distribution of local EVPR. Significant relationships between local EVPR and topographic, geological, climatic and anthropogenic factors were found. In particular, elevation explained the largest fraction of variation in endemic richness, but other environmental factors (e.g. precipitation seasonality and slope) and human-related factors (e.g. the Human Influence Index (HII) and the proportion of anthropogenic land uses) were, respectively, positively and negatively correlated with local EVPR. Regional EVPR (number of endemic taxa inside each 100 m elevation interval) was also measured to compare local and regional EVPR patterns along the elevation gradient. In contrast to local EVPR, regional EVPR tended to decrease with altitude, partly due to the decreasing area covered along the elevation gradient. The contrasting results between local and regional patterns suggest that local richness increases as a result of increased interspecific aggregation along altitude, whereas regional richness may depend on the interaction between area and altitude. This suggests that the shape and magnitude of the species-area relationship might vary with elevation. This work provides, for the first time in Sardinia, a comprehensive analysis of the influence of environmental factors on the pattern of EVPR in the entire territory, from sea level to the highest peaks. Elevation, as well as other environmental and human-related variables, were confirmed to be influencing factors. In addition, variations of EVPR patterns at regional-to-local spatial scales motivate further investigations of the possible interaction between elevation and area in explaining patterns of plant species richness.
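    The grid-cell modelling described above can be sketched with a Poisson-family GLM of per-cell richness counts against environmental predictors. The data and predictor names below are synthetic and illustrative, not the study's actual variables or coefficients.

    ```python
    # Poisson GLM of per-cell endemic richness against environmental predictors,
    # mirroring the grid-cell analysis described above. Data are synthetic and
    # the predictor names are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n_cells = 1000
    df = pd.DataFrame({
        "elevation": rng.uniform(0, 1800, n_cells),           # m
        "precip_seasonality": rng.uniform(10, 60, n_cells),   # CV of monthly precip
        "human_influence": rng.uniform(0, 60, n_cells),       # HII-like index
    })
    # Synthetic richness: increases with elevation, decreases with human influence.
    lam = np.exp(-1.0 + 0.0015 * df["elevation"] - 0.02 * df["human_influence"])
    df["richness"] = rng.poisson(lam)

    X = sm.add_constant(df[["elevation", "precip_seasonality", "human_influence"]])
    model = sm.GLM(df["richness"], X, family=sm.families.Poisson()).fit()
    print(model.summary())   # coefficient estimates per predictor
    ```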

  8. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    DOE PAGES

    Jakeman, J. D.; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity. We show that utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  9. FROG: Time Series Analysis for the Web Service Era

    NASA Astrophysics Data System (ADS)

    Allan, A.

    2005-12-01

    The FROG application is part of the next generation Starlink (http://www.starlink.ac.uk) software work (Draper et al. 2005) and released under the GNU Public License (GPL; http://www.gnu.org/copyleft/gpl.html). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed to the user for use in their own code, and to be used remotely over the Grid, as part of the Virtual Observatory (VO).

  10. MaGate Simulator: A Simulation Environment for a Decentralized Grid Scheduler

    NASA Astrophysics Data System (ADS)

    Huang, Ye; Brocco, Amos; Courant, Michele; Hirsbrunner, Beat; Kuonen, Pierre

    This paper presents a simulator for a decentralized modular grid scheduler named MaGate. MaGate's design emphasizes scheduler interoperability by providing intelligent scheduling that serves the grid community as a whole. Each MaGate scheduler instance is able to deal with dynamic scheduling conditions, with continuously arriving grid jobs. Received jobs are either allocated on local resources, or delegated to other MaGates for remote execution. The proposed MaGate simulator is based on the GridSim toolkit and the Alea simulator, and abstracts the features and behaviors of complex fundamental grid elements, such as grid jobs, grid resources, and grid users. Simulation of scheduling tasks is supported by a grid network overlay simulator executing distributed ant-based swarm intelligence algorithms to provide services such as group communication and resource discovery. For evaluation, a comparison of behaviors of different collaborative policies among a community of MaGates is provided. Results support the use of the proposed approach as a functional, ready-to-use grid scheduler simulator.
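    The local-or-delegate decision at the core of such a scheduler can be sketched as follows; the capacity threshold and peer-selection rule are simplified assumptions for illustration, not MaGate's actual policies.

    ```python
    # Toy decentralized scheduling: each scheduler runs jobs locally while it has
    # free slots, otherwise delegates to the least-loaded known peer. The
    # threshold and peer-selection rule are simplified assumptions, not MaGate's
    # actual policies.
    from dataclasses import dataclass, field

    @dataclass
    class Scheduler:
        name: str
        slots: int
        queue: list = field(default_factory=list)
        peers: list = field(default_factory=list)

        def load(self):
            return len(self.queue) / self.slots

        def submit(self, job):
            if len(self.queue) < self.slots or not self.peers:
                self.queue.append(job)
                return self.name                      # executed locally
            target = min(self.peers, key=Scheduler.load)
            if target.load() < self.load():
                return target.submit(job)             # delegate to a lighter peer
            self.queue.append(job)
            return self.name

    a, b, c = Scheduler("A", 2), Scheduler("B", 4), Scheduler("C", 4)
    a.peers, b.peers, c.peers = [b, c], [a, c], [a, b]
    for i in range(8):
        print(f"job{i} -> scheduler {a.submit(f'job{i}')}")
    ```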

  11. OlyMPUS - The Ontology-based Metadata Portal for Unified Semantics

    NASA Astrophysics Data System (ADS)

    Huffer, E.; Gleason, J. L.

    2015-12-01

    The Ontology-based Metadata Portal for Unified Semantics (OlyMPUS), funded by the NASA Earth Science Technology Office Advanced Information Systems Technology program, is an end-to-end system designed to support data consumers and data providers, enabling the latter to register their data sets and provision them with the semantically rich metadata that drives the Ontology-Driven Interactive Search Environment for Earth Sciences (ODISEES). OlyMPUS leverages the semantics and reasoning capabilities of ODISEES to provide data producers with a semi-automated interface for producing the semantically rich metadata needed to support ODISEES' data discovery and access services. It integrates the ODISEES metadata search system with multiple NASA data delivery tools to enable data consumers to create customized data sets for download to their computers, or for NASA Advanced Supercomputing (NAS) facility registered users, directly to NAS storage resources for access by applications running on NAS supercomputers. A core function of NASA's Earth Science Division is research and analysis that uses the full spectrum of data products available in NASA archives. Scientists need to perform complex analyses that identify correlations and non-obvious relationships across all types of Earth System phenomena. Comprehensive analytics are hindered, however, by the fact that many Earth science data products are disparate and hard to synthesize. Variations in how data are collected, processed, gridded, and stored, create challenges for data interoperability and synthesis, which are exacerbated by the sheer volume of available data. Robust, semantically rich metadata can support tools for data discovery and facilitate machine-to-machine transactions with services such as data subsetting, regridding, and reformatting. Such capabilities are critical to enabling the research activities integral to NASA's strategic plans. However, as metadata requirements increase and competing standards emerge, metadata provisioning becomes increasingly burdensome to data producers. The OlyMPUS system helps data providers produce semantically rich metadata, making their data more accessible to data consumers, and helps data consumers quickly discover and download the right data for their research.

  12. Renormalization group analysis of turbulence

    NASA Technical Reports Server (NTRS)

    Smith, Leslie M.

    1989-01-01

    The objective is to understand and extend a recent theory of turbulence based on dynamic renormalization group (RNG) techniques. The application of RNG methods to hydrodynamic turbulence was explored most extensively by Yakhot and Orszag (1986). An eddy viscosity was calculated which was consistent with the Kolmogorov inertial range by systematic elimination of the small scales in the flow. Further, assumed smallness of the nonlinear terms in the redefined equations for the large scales results in predictions for important flow constants such as the Kolmogorov constant. It is emphasized that no adjustable parameters are needed. The parameterization of the small scales in a self-consistent manner has important implications for sub-grid modeling.

  13. Terrestrial photovoltaic collector technology trends

    NASA Technical Reports Server (NTRS)

    Shimada, K.; Costogue, E.

    1984-01-01

    Following the path of space PV collector development in its early stages, terrestrial PV technologies based upon single-crystal silicon have matured rapidly. Currently, terrestrial PV cells with efficiencies approaching space cell efficiencies are being fabricated into modules at a fraction of the space PV module cost. New materials, including CuInSe2 and amorphous silicon, are being developed for lowering the cost, and multijunction materials for achieving higher efficiency. Large grid-interactive, tracking flat-plate power systems and concentrator PV systems totaling about 10 MW, are already in operation. Collector technology development both flat-plate and concentrator, will continue under an extensive government and private industry partnership.

  14. Spatial operator algebra framework for multibody system dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Jain, Abhinandan; Kreutz, K.

    1989-01-01

    The Spatial Operator Algebra framework for the dynamics of general multibody systems is described. The use of a spatial operator-based methodology permits the formulation of the dynamical equations of motion of multibody systems in a concise and systematic way. The dynamical equations of progressively more complex grid multibody systems are developed in an evolutionary manner beginning with a serial chain system, followed by a tree topology system and finally, systems with arbitrary closed loops. Operator factorizations and identities are used to develop novel recursive algorithms for the forward dynamics of systems with closed loops. Extensions required to deal with flexible elements are also discussed.

  15. Spatial Operator Algebra for multibody system dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Jain, A.; Kreutz-Delgado, K.

    1992-01-01

    The Spatial Operator Algebra framework for the dynamics of general multibody systems is described. The use of a spatial operator-based methodology permits the formulation of the dynamical equations of motion of multibody systems in a concise and systematic way. The dynamical equations of progressively more complex grid multibody systems are developed in an evolutionary manner beginning with a serial chain system, followed by a tree topology system and finally, systems with arbitrary closed loops. Operator factorizations and identities are used to develop novel recursive algorithms for the forward dynamics of systems with closed loops. Extensions required to deal with flexible elements are also discussed.

  16. Orchestrating Bulk Data Movement in Grid Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vazhkudai, SS

    2005-01-25

    Data Grids provide a convenient environment for researchers to manage and access massively distributed bulk data by addressing several system and transfer challenges inherent to these environments. This work addresses issues involved in the efficient selection and access of replicated data in Grid environments in the context of the Globus Toolkit™, building middleware that (1) selects datasets in highly replicated environments, enabling efficient scheduling of data transfer requests; (2) predicts transfer times of bulk wide-area data transfers using extensive statistical analysis; and (3) co-allocates bulk data transfer requests, enabling parallel downloads from mirrored sites. These efforts have demonstrated a decentralized data scheduling architecture, a set of forecasting tools that predict bandwidth availability within 15% error, and a co-allocation architecture and heuristics that expedite data downloads by up to 2 times.
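    The co-allocation idea in item (3) can be sketched as splitting a file's byte range across mirrors in proportion to their predicted bandwidth and fetching the pieces in parallel. The mirror URLs and bandwidth predictions below are hypothetical, and this is not the middleware's actual interface.

    ```python
    # Sketch of co-allocated bulk transfer: split a file's byte range across
    # mirrors in proportion to their predicted bandwidth, then fetch the pieces
    # in parallel. Mirror URLs and bandwidth predictions are hypothetical.
    import concurrent.futures
    import urllib.request

    def plan_ranges(file_size, predicted_bw):
        """Assign contiguous byte ranges proportionally to predicted bandwidth."""
        total_bw = sum(predicted_bw.values())
        ranges, start = {}, 0
        for i, (mirror, bw) in enumerate(predicted_bw.items()):
            end = file_size - 1 if i == len(predicted_bw) - 1 else \
                start + int(file_size * bw / total_bw) - 1
            ranges[mirror] = (start, end)
            start = end + 1
        return ranges

    def fetch_range(mirror, byte_range):
        req = urllib.request.Request(
            mirror, headers={"Range": f"bytes={byte_range[0]}-{byte_range[1]}"})
        with urllib.request.urlopen(req) as resp:
            return byte_range[0], resp.read()

    def coallocated_download(predicted_bw, file_size):
        ranges = plan_ranges(file_size, predicted_bw)
        with concurrent.futures.ThreadPoolExecutor() as pool:
            pieces = pool.map(fetch_range, ranges.keys(), ranges.values())
        return b"".join(data for _, data in sorted(pieces))

    # Hypothetical mirrors with predicted bandwidths in MB/s.
    mirrors = {"https://mirror-a.example.org/data.bin": 40.0,
               "https://mirror-b.example.org/data.bin": 10.0}
    # data = coallocated_download(mirrors, file_size=100 * 2**20)
    ```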

  17. Topographic heterogeneity and temperature amplitude explain species richness patterns of birds in the Qinghai-Tibetan Plateau.

    PubMed

    Zhang, Chunlan; Quan, Qing; Wu, Yongjie; Chen, Youhua; He, Peng; Qu, Yanhua; Lei, Fumin

    2017-04-01

    Large-scale patterns of species richness have gained much attention in recent years; however, the factors that drive high species richness are still controversial in local regions, especially in highly diversified montane regions. The Qinghai-Tibetan Plateau (QTP) and the surrounding mountains are biodiversity hot spots due to a high number of endemic montane species. Here, we explored the factors underlying this high level of diversity by studying the relationship between species richness and environmental variables. The richness patterns of 758 resident bird species were summarized at the scale of 1°×1° grid cells at different taxonomic levels (order, family, genus, and species) and in different taxonomic groups (Passeriformes, Galliformes, Falconiformes, and Columbiformes). These richness patterns were subsequently analyzed against habitat heterogeneity (topographical heterogeneity and land cover), temperature amplitude (annual temperature, annual precipitation, precipitation seasonality, and temperature seasonality) and a vegetation index (net primary productivity). Our results showed that the highest richness was found in the southeastern part of the QTP, the eastern Himalayas. The lowest richness was observed in the central plateau of the QTP. Topographical heterogeneity and temperature amplitude are the primary factors that explain overall patterns of species richness in the QTP, although the specific effect of each environmental variable varies between the different taxonomic groups depending on their own evolutionary histories and ecological requirements. High species richness in the southeastern QTP is mostly due to highly diversified habitat types and temperature zones along elevation gradients, whereas the low species richness in the central plateau of the QTP may be due to environmental and energetic constraints, as the central plateau is a harsh environment.
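    Summarizing richness at the scale of 1°×1° grid cells, as described above, amounts to assigning occurrence records to cells and counting distinct species per cell. The records in the sketch below are made-up placeholders.

    ```python
    # Sketch of summarizing species richness on a 1 degree x 1 degree grid from
    # point occurrence records, as in the analysis described above. The records
    # below are made-up placeholders.
    import pandas as pd

    records = pd.DataFrame({
        "species": ["sp_a", "sp_a", "sp_b", "sp_c", "sp_b", "sp_c"],
        "lon":     [91.2,   91.8,   91.4,   95.1,   95.6,   91.9],
        "lat":     [29.7,   29.1,   29.5,   31.2,   31.8,   29.3],
    })

    # Assign each record to a 1-degree grid cell (lower-left corner as the label).
    records["cell"] = list(zip(records["lon"].astype(int), records["lat"].astype(int)))

    # Richness = number of distinct species per cell.
    richness = records.groupby("cell")["species"].nunique()
    print(richness)
    ```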

  18. On Accuracy of Adaptive Grid Methods for Captured Shocks

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail K.; Carpenter, Mark H.

    2002-01-01

    The accuracy of two grid adaptation strategies, grid redistribution and local grid refinement, is examined by solving the 2-D Euler equations for the supersonic steady flow around a cylinder. Second- and fourth-order linear finite difference shock-capturing schemes, based on the Lax-Friedrichs flux splitting, are used to discretize the governing equations. The grid refinement study shows that for the second-order scheme, neither grid adaptation strategy improves the numerical solution accuracy compared to that calculated on a uniform grid with the same number of grid points. For the fourth-order scheme, the dominant first-order error component is reduced by the grid adaptation, while the design-order error component drastically increases because of the grid nonuniformity. As a result, both grid adaptation techniques improve the numerical solution accuracy only on the coarsest mesh or on very fine grids that are seldom found in practical applications because of the computational cost involved. Similar error behavior has been obtained for the pressure integral across the shock. A simple analysis shows that both grid adaptation strategies are not without penalties in the numerical solution accuracy. Based on these results, a new grid adaptation criterion for captured shocks is proposed.

  19. On the use of Schwarz-Christoffel conformal mappings to the grid generation for global ocean models

    NASA Astrophysics Data System (ADS)

    Xu, S.; Wang, B.; Liu, J.

    2015-10-01

    In this article we propose two grid generation methods for global ocean general circulation models. Contrary to conventional dipolar or tripolar grids, the proposed methods are based on Schwarz-Christoffel conformal mappings that map areas with user-prescribed, irregular boundaries to those with regular boundaries (i.e., disks, slits, etc.). The first method aims at improving existing dipolar grids. Compared with existing grids, the sample grid achieves a better trade-off between the enlargement of the latitudinal-longitudinal portion and the overall smooth grid cell size transition. The second method addresses more modern and advanced grid design requirements arising from high-resolution and multi-scale ocean modeling. The generated grids could potentially achieve the alignment of grid lines to the large-scale coastlines, enhanced spatial resolution in coastal regions, and easier computational load balance. Since the grids are orthogonal curvilinear, they can be easily utilized by the majority of ocean general circulation models that are based on finite difference and require grid orthogonality. The proposed grid generation algorithms can also be applied to the grid generation for regional ocean modeling where complex land-sea distribution is present.

  20. Time-Dependent Hartree-Fock Approach to Nuclear Pasta at Finite Temperature

    NASA Astrophysics Data System (ADS)

    Schuetrumpf, B.; Klatt, M. A.; Iida, K.; Maruhn, J. A.; Mecke, K.; Reinhard, P.-G.

    2013-03-01

    We present simulations of neutron-rich matter at subnuclear densities, like supernova matter, with the time-dependent Hartree-Fock approximation at temperatures of several MeV. The initial state consists of α particles randomly distributed in space that have a Maxwell-Boltzmann distribution in momentum space. Adding a neutron background initialized with Fermi distributed plane waves the calculations reflect a reasonable approximation of astrophysical matter. This matter evolves into spherical, rod-like, and slab-like shapes and mixtures thereof. The simulations employ a full Skyrme interaction in a periodic three-dimensional grid. By an improved morphological analysis based on Minkowski functionals, all eight pasta shapes can be uniquely identified by the sign of only two valuations, namely the Euler characteristic and the integral mean curvature.
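    The two-valuation classification mentioned at the end can be sketched as a sign lookup on the integral mean curvature and the Euler characteristic. The mapping below lists only the commonly quoted cases (sphere, rod, slab and their hole-like counterparts) and is an illustrative reading of the idea, not the paper's full eight-shape table.

    ```python
    # Classify a pasta-like structure from the signs of two Minkowski valuations:
    # the integral mean curvature H and the Euler characteristic chi. Only the
    # commonly quoted shapes are listed; this is an illustrative reading of the
    # idea, not the paper's full eight-shape table.
    def sign(x, tol=1e-9):
        return 0 if abs(x) < tol else (1 if x > 0 else -1)

    SHAPE_BY_SIGNS = {
        # (sign of mean curvature, sign of Euler characteristic): shape
        ( 1,  1): "sphere (isolated clusters)",
        ( 1,  0): "rod",
        ( 0,  0): "slab",
        (-1,  0): "cylindrical hole (anti-rod)",
        (-1,  1): "spherical hole (bubble)",
    }

    def classify(mean_curvature, euler_characteristic):
        key = (sign(mean_curvature), sign(euler_characteristic))
        return SHAPE_BY_SIGNS.get(key, "intermediate / not in this reduced table")

    print(classify(+2.3, +8))   # -> sphere-like
    print(classify(+1.1, 0))    # -> rod-like
    print(classify(0.0, 0))     # -> slab-like
    ```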

  1. GEM-AC, a stratospheric-tropospheric global and regional model for air quality and climate change: evaluation of gas phase properties

    NASA Astrophysics Data System (ADS)

    Kaminski, J. W.; Semeniuk, K.; McConnell, J. C.; Lupu, A.; Mamun, A.

    2012-12-01

    The Global Environmental Multiscale model for Air Quality and climate change (GEM-AC) is a global general circulation model based on the GEM model developed by the Meteorological Service of Canada for operational weather forecasting. It can be run with a global uniform (GU) grid or a global variable (GV) grid where the core has uniform grid spacing and the exterior grid expands. With a GV grid high resolution regional runs can be accomplished without a concern for boundary conditions. The work described here uses GEM version 3.3.2. The gas-phase chemistry consists in detailed reactions of Ox, NOx, HOx, CO, CH4, NMVOCs, halocarbons, ClOx and BrO. We have recently added elements of the Global Modal-aerosol eXtension (GMXe) scheme to address aerosol microphysics and gas-aerosol partitioning. The evaluation of the MESSY GMXe aerosol scheme is addressed in another poster. The Canadian aerosol module (CAM) is also available. Tracers are advected using the semi-Lagrangian scheme native to GEM. The vertical transport includes parameterized subgrid scale turbulence and large scale convection. Dry deposition is implemented as a flux boundary condition in the vertical diffusion equation. For climate runs the GHGs CO2, CH4, N2O, CFCs in the radiation scheme are adjusted to the scenario considered. In GV regional mode at high resolutions a lake model, FLAKE is also included. Wet removal comprises both in-cloud and below-cloud scavenging. With the gas phase chemistry the model has been run for a series of ten year time slices on a 3°×3° global grid with 77 hybrid levels from the surface to 0.15 hPa. The tropospheric and stratospheric gas phase results are compared with satellite measurements including, ACE, MIPAS, MOPITT, and OSIRIS. Current evaluations of the ozone field and other stratospheric fields are encouraging and tropospheric lifetimes for CH4 and CH3CCl3 are in reasonable accord with tropospheric models. We will present results for current and future climate conditions forced by SST for 2050.

  2. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive and markets are still being designed to leverage their full potential and mitigate their limitation (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output) thereby providing the grid advance warning to schedule ancillary generation more accurately, or curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by the grid operators we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue required atmospheric science and engineering research, while the second required detailed knowledge of energy markets, and power engineering. Motivated by this background we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools that have seen extensive use in the atmospheric sciences supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.

  3. Reserve networks based on richness hotspots and representation vary with scale.

    PubMed

    Shriner, Susan A; Wilson, Kenneth R; Flather, Curtis H

    2006-10-01

    While the importance of spatial scale in ecology is well established, few studies have investigated the impact of data grain on conservation planning outcomes. In this study, we compared species richness hotspot and representation networks developed at five grain sizes. We used species distribution maps for mammals and birds developed by the Arizona and New Mexico Gap Analysis Programs (GAP) to produce 1-km2, 100-km2, 625-km2, 2500-km2, and 10,000-km2 grid cell resolution distribution maps. We used these distribution maps to generate species richness and hotspot (95th quantile) maps for each taxon in each state. Species composition information at each grain size was used to develop two types of representation networks using the reserve selection software MARXAN. Reserve selection analyses were restricted to Arizona birds due to considerable computational requirements. We used MARXAN to create best reserve networks based on the minimum area required to represent each species at least once and equal-area networks based on irreplaceability values. We also measured the median area of each species' distribution included in hotspot (mammals and birds of Arizona and New Mexico) and irreplaceability (Arizona birds) networks across all species. Mean area overlap between richness hotspot reserves identified at the five grain sizes was 29% (grand mean for four within-taxon/state comparisons), mean overlap for irreplaceability reserve networks was 32%, and mean overlap for best reserve networks was 53%. Hotspots for mammals and birds showed low overlap, with a mean of 30%. Comparison of hotspots and irreplaceability networks showed very low overlap, with a mean of 13%. For hotspots, the median species distribution area protected within reserves declined monotonically from a high of 11% for 1-km2 networks down to 6% for 10,000-km2 networks. Irreplaceability networks showed a similar, but more variable, pattern of decline. This work clearly shows that map resolution has a profound effect on conservation planning outcomes and that hotspot and representation outcomes may be strikingly dissimilar. Thus, conservation planning is scale dependent, such that reserves developed using coarse-grained data do not subsume fine-grained reserves. Moreover, preserving both full species representation and species-rich areas may require combined reserve design strategies.
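
    As a minimal illustration of the hotspot definition used above (cells at or above the 95th quantile of richness), and not of the GAP/MARXAN workflow itself, the following sketch uses hypothetical richness values on a single-grain grid:

        import numpy as np

        # Hypothetical species-richness grid (rows x cols of cells at one grain size).
        rng = np.random.default_rng(0)
        richness = rng.poisson(lam=40, size=(50, 60))

        # Hotspots: cells at or above the 95th quantile of richness.
        threshold = np.quantile(richness, 0.95)
        hotspots = richness >= threshold

        print(f"threshold = {threshold:.1f}, hotspot cells = {hotspots.sum()} of {hotspots.size}")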

  4. caGrid 1.0: A Grid Enterprise Architecture for Cancer Research

    PubMed Central

    Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel

    2007-01-01

    caGrid is the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. The current release, caGrid version 1.0, is developed as the production Grid software infrastructure of caBIG™. Based on feedback from adopters of the previous version (caGrid 0.5), it has been significantly enhanced with new features and improvements to existing components. This paper presents an overview of caGrid 1.0, its main components, and enhancements over caGrid 0.5. PMID:18693901

  5. Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks.

    PubMed

    Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue

    2017-06-06

    Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors to guarantee full-view coverage for a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed as long as the regular hexagons determined by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) for the deterministic deployment. To reduce the redundancy in random deployment, we propose a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions.
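
    As a purely geometric illustration of the deterministic deployment idea (a seamless tiling of the ROI by regular hexagons of a given side length), and not the DPA itself, the sketch below generates candidate positions on such a tiling; the hexagon side s and the ROI size are hypothetical:

        import math

        def hex_centres(width, height, s):
            """Centres of a seamless tiling of flat-top regular hexagons of side s
            covering a width x height region (generic geometry, not the DPA)."""
            dx, dy = 1.5 * s, math.sqrt(3) * s
            centres, col, x = [], 0, 0.0
            while x <= width:
                y = 0.0 if col % 2 == 0 else dy / 2   # offset alternate columns
                while y <= height:
                    centres.append((x, y))
                    y += dy
                x += dx
                col += 1
            return centres

        print(len(hex_centres(100.0, 100.0, s=10.0)))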

  6. A Transparent Translation from Legacy System Model into Common Information Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen

    Advances in the smart grid are forcing utilities toward better monitoring, control, and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information-exchange and data-sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems are not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper develops an approach and an open-source tool to translate legacy system models into CIM format. The developed tool is tested on a commercial distribution management system, and simulation results have demonstrated its effectiveness.
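
    A minimal sketch of the translation idea, mapping one hypothetical legacy record to a CIM-style RDF/XML element; the namespace URI, the legacy field names, and the choice of CIM classes here are illustrative assumptions, not the tool's actual mapping or output:

        import xml.etree.ElementTree as ET

        # Assumed namespaces; the exact CIM profile/version URI depends on the target system.
        RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
        CIM = "http://iec.ch/TC57/2012/CIM-schema-cim16#"
        ET.register_namespace("rdf", RDF)
        ET.register_namespace("cim", CIM)

        # A toy legacy record for one line segment (field names invented for illustration).
        legacy_line = {"id": "LINE_001", "name": "Feeder-1 segment A", "length_m": 120.0}

        root = ET.Element(f"{{{RDF}}}RDF")
        seg = ET.SubElement(root, f"{{{CIM}}}ACLineSegment", {f"{{{RDF}}}ID": legacy_line["id"]})
        ET.SubElement(seg, f"{{{CIM}}}IdentifiedObject.name").text = legacy_line["name"]
        ET.SubElement(seg, f"{{{CIM}}}Conductor.length").text = str(legacy_line["length_m"])

        print(ET.tostring(root, encoding="unicode"))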

  7. Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks

    PubMed Central

    Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue

    2017-01-01

    Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors to guarantee full-view coverage for a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed as long as the regular hexagons determined by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) for the deterministic deployment. To reduce the redundancy in random deployment, we propose a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions. PMID:28587304

  8. Integrating technologies for oil spill response in the SW Iberian coast

    NASA Astrophysics Data System (ADS)

    Janeiro, J.; Neves, A.; Martins, F.; Relvas, P.

    2017-09-01

    An operational oil spill modelling system developed for the SW Iberian coast is used to investigate the relative importance of the different components and technologies integrating an oil spill monitoring and response structure. A backtrack of a CleanSeaNet oil detection in the region is used to demonstrate the concept. Taking advantage of the regional operational products available, the system provides the necessary resolution to go from regional to coastal scales using a downscaling approach, while a multi-grid methodology allows the oil spill model to span across model domains, taking full advantage of the increasing resolution between the model grids. An extensive validation procedure using a multiplicity of sensors, with good spatial and temporal coverage, strengthens the operational system's ability to accurately resolve coastal-scale processes. The model is validated using available trajectories from satellite-tracked drifters. Finally, a methodology for identifying potential origins of the CleanSeaNet oil detection was developed by combining the model backtrack results with ship trajectories supplied by AIS, taking into account the error estimates found in the backtrack validation.

  9. Distributed Relaxation for Conservative Discretizations

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2001-01-01

    A multigrid method is defined as having textbook multigrid efficiency (TME) if the solutions to the governing system of equations are attained in a computational work that is a small (less than 10) multiple of the operation count in one target-grid residual evaluation. The way to achieve this efficiency is the distributed relaxation approach. TME solvers employing distributed relaxation have already been demonstrated for nonconservative formulations of high-Reynolds-number viscous incompressible and subsonic compressible flow regimes. The purpose of this paper is to provide foundations for applications of distributed relaxation to conservative discretizations. A direct correspondence between the primitive variable interpolations for calculating fluxes in conservative finite-volume discretizations and stencils of the discretized derivatives in the nonconservative formulation has been established. Based on this correspondence, one can arrive at a conservative discretization which is very efficiently solved with a nonconservative relaxation scheme and this is demonstrated for conservative discretization of the quasi one-dimensional Euler equations. Formulations for both staggered and collocated grid arrangements are considered and extensions of the general procedure to multiple dimensions are discussed.

  10. Influence of current climate, historical climate stability and topography on species richness and endemism in Mesoamerican geophyte plants

    PubMed Central

    2017-01-01

    Background A number of biotic and abiotic factors have been proposed as drivers of geographic variation in species richness. As biotic elements, inter-specific interactions are the most widely recognized. Among abiotic factors, in particular for plants, climate and topographic variables as well as their historical variation have been correlated with species richness and endemism. In this study, we determine the extent to which the species richness and endemism of monocot geophyte species in Mesoamerica are predicted by current climate, historical climate stability and topography. Methods Using approximately 2,650 occurrence points representing 507 geophyte taxa, species richness (SR) and weighted endemism (WE) were estimated at a geographic scale using grids of 0.5 × 0.5 decimal-degree resolution, with Mexico as the geographic extent. SR and WE were also estimated using species distributions inferred from ecological niche modeling for species with at least five spatially unique occurrence points. Current climate, current-to-Last Glacial Maximum temperature and precipitation stability, and topographic features were used as predictor variables in multiple spatial regression analyses (i.e., spatial autoregressive models, SAR) using the estimates of SR and WE as response variables. The standardized coefficients of the predictor variables that were significant in the regression models were used to interpret the observed patterns of species richness and endemism. Results Our estimates of SR and WE based on direct occurrence data and distribution modeling generally yielded similar results, though estimates based on ecological niche modeling indicated broader distribution areas for SR and WE than when species richness was directly estimated using georeferenced coordinates. The SR and WE of monocot geophytes were highest along the Trans-Mexican Volcanic Belt, in both cases with higher levels in the central area of this mountain chain. Richness and endemism were also elevated in the southern regions of the Sierra Madre Oriental and Occidental mountain ranges, and in the Tehuacán Valley. Some areas of the Sierra Madre del Sur and Sierra Madre Oriental had high levels of WE, though they are not the areas with the highest SR. The spatial regressions suggest that SR is mostly influenced by current climate, whereas endemism is mainly affected by topography and precipitation stability. Conclusions Both methods (direct occurrence data and ecological niche modeling) used to estimate SR and WE in this study yielded similar results and detected a key area that should be considered in plant conservation strategies: the central region of the Trans-Mexican Volcanic Belt. Our results also corroborate that species richness is more closely correlated with current climate factors, while endemism is related to differences in topography and to changes in precipitation levels relative to the LGM climatic conditions. PMID:29062605
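
    As a minimal sketch of the two response variables (not the authors' code): species richness per cell is a presence count, and weighted endemism is commonly computed as the sum, over the species present in a cell, of the inverse of each species' range size in cells. The presence/absence data below are hypothetical:

        import numpy as np

        # Hypothetical presence/absence stack: (n_species, n_rows, n_cols) of 0/1 values.
        rng = np.random.default_rng(1)
        presence = (rng.random((200, 30, 40)) < 0.05).astype(int)

        # Species richness: number of species recorded in each grid cell.
        richness = presence.sum(axis=0)

        # Weighted endemism: each species contributes 1 / (number of cells it occupies).
        range_size = presence.sum(axis=(1, 2))
        range_size = np.where(range_size == 0, 1, range_size)   # guard against empty ranges
        weighted_endemism = np.tensordot(1.0 / range_size, presence, axes=(0, 0))

        print(richness.shape, weighted_endemism.shape)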

  11. Optimal Control of Micro Grid Operation Mode Seamless Switching Based on Radau Allocation Method

    NASA Astrophysics Data System (ADS)

    Chen, Xiaomin; Wang, Gang

    2017-05-01

    The seamless switching process between micro grid operation modes directly affects the safety and stability of its operation. For the switching process from island mode to grid-connected mode, we establish a dynamic optimization model based on two grid-connected inverters. We use the Radau allocation method to discretize the model and the Newton iteration method to obtain the optimal solution. Finally, we implement the optimization model in MATLAB and obtain the optimal control trajectories of the inverters.

  12. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically, so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked list, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects, such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, discussed later.

  13. Navigation in Grid Space with the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a navigational tool for computational grids. The navigational process is based on measuring the grid characteristics with the NAS Grid Benchmarks (NGB) and using the measurements to assign the tasks of a grid application to the grid machines. The tool allows the user to explore the grid space and to navigate the execution of a grid application to minimize its turnaround time. We introduce the notion of gridscape as a user view of the grid and show how it can be measured by NGB. Then we demonstrate how the gridscape can be used with two different schedulers to navigate a grid application through a rudimentary grid.

  14. An open source software for fast grid-based data-mining in spatial epidemiology (FGBASE).

    PubMed

    Baker, David M; Valleron, Alain-Jacques

    2014-10-30

    Examining whether disease cases are clustered in space is an important part of epidemiological research. Another important part of spatial epidemiology is testing whether patients suffering from a disease are more, or less, exposed to environmental factors of interest than adequately defined controls. Both approaches involve determining the number of cases and controls (or population at risk) in specific zones. For cluster searches, this often must be done for millions of different zones. Doing this by calculating distances can lead to very lengthy computations. In this work we discuss the computational advantages of geographical grid-based methods, and introduce an open-source software program (FGBASE) which we have created for this purpose. Geographical grids based on the Lambert Azimuthal Equal Area projection are well suited for spatial epidemiology because they preserve area: each cell of the grid has the same area. We describe how data are projected onto such a grid, as well as grid-based algorithms for spatial epidemiological data-mining. The software program (FGBASE) that we have developed implements these grid-based methods. The grid-based algorithms are extremely fast, particularly for cluster searches. When applied to a cohort of French Type 1 Diabetes (T1D) patients, as an example, the grid-based algorithms detected potential clusters in a few seconds on a modern laptop. This compares very favorably to an equivalent cluster search using distance calculations instead of a grid, which took over 4 hours on the same computer. In the case study we discovered 4 potential clusters of T1D cases near the cities of Le Havre, Dunkerque, Toulouse and Nantes. One example of environmental analysis with our software was to test whether a significant association could be found between case status and distance to vineyards with heavy pesticide use; none was found. In both examples, the software facilitates the rapid testing of hypotheses. Grid-based algorithms for mining spatial epidemiological data provide advantages in terms of computational complexity, thus improving the speed of computations. We believe that these methods and this software tool (FGBASE) will lower the computational barriers to entry for those performing epidemiological research.
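
    A minimal sketch of the grid-based idea, assuming the pyproj library and the EPSG:3035 European LAEA grid as stand-ins (not FGBASE's actual implementation): project each case once onto the equal-area grid and count cases per cell, avoiding pairwise distance computations.

        from collections import Counter
        from pyproj import Transformer   # assumed dependency for the LAEA projection

        CELL = 10_000.0   # hypothetical 10 km grid cells, in projected metres

        # Hypothetical case locations as (longitude, latitude) pairs.
        cases = [(0.10, 49.49), (0.12, 49.50), (-1.55, 43.49), (1.44, 43.60)]

        # EPSG:3035 is the standard European Lambert Azimuthal Equal Area grid; any
        # LAEA projection centred on the study region would serve the same purpose.
        to_laea = Transformer.from_crs("EPSG:4326", "EPSG:3035", always_xy=True)

        counts = Counter()
        for lon, lat in cases:
            x, y = to_laea.transform(lon, lat)
            counts[(int(x // CELL), int(y // CELL))] += 1   # cell index from metres

        print(counts.most_common(3))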

  15. Computational studies of horizontal axis wind turbines

    NASA Astrophysics Data System (ADS)

    Xu, Guanpeng

    A numerical technique has been developed for efficiently simulating fully three-dimensional viscous fluid flow around horizontal axis wind turbines (HAWT) using a zonal approach. The flow field is viewed as a combination of viscous regions, inviscid regions and vortices. The method solves the costly unsteady Reynolds-averaged Navier-Stokes (RANS) equations only in the viscous region around the turbine blades. It solves the full potential equation in the inviscid region where the flow is irrotational and isentropic. The tip vortices are simulated using a Lagrangian approach, thus removing the need to accurately resolve them on a fine grid. The hybrid method is shown to provide good results with modest CPU resources. A full Navier-Stokes based methodology has also been developed for modeling wind turbines at high wind conditions where extensive stall may occur. An overset grid based version that can model rotor-tower interactions has been developed. Finally, a blade element theory based methodology has been developed for the purpose of developing improved tip loss models and stall delay models. The effects of turbulence are simulated using a zero-equation eddy viscosity model or a one-equation Spalart-Allmaras model. Two transition models, one based on Eppler's criterion and the other based on Michel's criterion, have been developed and tested. The hybrid method has been extensively validated for axial wind conditions for three rotors: the NREL Phase II, Phase III, and Phase VI configurations. A limited set of calculations has been done for rotors operating under yaw conditions. Preliminary simulations have also been carried out to assess the effects of the tower wake on the rotor. In most of these cases, satisfactory agreement has been obtained with measurements. Using the numerical results from the present methodologies as a guide, Prandtl's tip loss model and Corrigan's stall delay model were correlated with the present calculations. An improved tip loss model has been obtained. A correction to Corrigan's stall delay model has also been developed. Incorporation of these corrections is shown to considerably improve power predictions, even when a very simple aerodynamic theory (the blade element method with annular inflow) is used.
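
    For reference, the classical Prandtl tip-loss factor from which such corrections start (stated here in its standard textbook form, not the improved model developed in this work) multiplies the blade-element loading by

        F = \frac{2}{\pi} \arccos\!\left[ \exp\!\left( -\frac{B\,(R - r)}{2\,r\,\sin\varphi} \right) \right]

    where B is the number of blades, R the rotor radius, r the local radius, and \varphi the local inflow angle.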

  16. Formation of polycyclic aromatic hydrocarbons in circumstellar envelopes

    NASA Technical Reports Server (NTRS)

    Frenklach, Michael; Feigelson, Eric D.

    1989-01-01

    Production of polycyclic aromatic hydrocarbons in carbon-rich circumstellar envelopes was investigated using a kinetic approach. A detailed chemical reaction mechanism of gas-phase PAH formation and growth, containing approximately 100 reactions of 40 species, was numerically solved under the physical conditions expected in cool stellar winds. The chemistry is based on studies of soot production in hydrocarbon pyrolysis and combustion. Several first-ring and second-ring cyclization processes were considered. A linear lumping algorithm was used to describe PAH growth beyond the second aromatic ring. PAH production using this mechanism was examined with respect to a grid of idealized constant-velocity stellar winds as well as several published astrophysical models. The basic result is that the onset of PAH production in the circumstellar envelopes is predicted to occur within the temperature interval of 1100 to 900 K. The absolute amounts of the PAHs formed, however, are very sensitive to a number of parameters, both chemical and astrophysical, whose values are not accurately known. Astrophysically meaningful quantities of PAHs require particularly dense and slow stellar winds and a high initial acetylene abundance. It is suggested that most of the PAHs may be produced in a relatively small fraction of carbon-rich red giants.

  17. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area in the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced modern numerical weather and regional climate models have emerged. The use of numerical models ensures that a full set of climate variables is generated in the process of downscaling, and these variables are dynamically consistent due to the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. There are studies demonstrating the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged to, and thus the conclusions remain controversial. Alongside a companion work developing approaches for quantitative assessment of the downscaled climate, in this study the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides objectivity of comparison. Three types of downscaling experiments were performed for one chosen month. The first serves as a baseline, in which the large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and with nudging toward different variables in grid analysis nudging, while in spectral nudging we focus on testing the nudging coefficients and the wavenumbers nudged on different model levels.
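
    Both techniques relax the model state toward the driving analysis with a Newtonian term of the form -(X - X_analysis)/tau; grid (analysis) nudging applies it at every grid point, whereas spectral nudging applies it only below a wavenumber cutoff. A schematic numpy illustration of that distinction (not WRF's implementation; the coefficient, cutoff and fields are hypothetical):

        import numpy as np

        def grid_nudging_tendency(model, analysis, g=3e-4):
            """Newtonian relaxation toward the analysis at every grid point (g in 1/s)."""
            return -g * (model - analysis)

        def spectral_nudging_tendency(model, analysis, g=3e-4, kmax=3):
            """Relax only the large-scale (low-wavenumber) part of the difference field."""
            diff = np.fft.fft2(model - analysis)
            kx = np.fft.fftfreq(model.shape[0]) * model.shape[0]
            ky = np.fft.fftfreq(model.shape[1]) * model.shape[1]
            keep = (np.abs(kx)[:, None] <= kmax) & (np.abs(ky)[None, :] <= kmax)
            return -g * np.real(np.fft.ifft2(diff * keep))

        rng = np.random.default_rng(2)
        model, analysis = rng.standard_normal((64, 64)), rng.standard_normal((64, 64))
        print(grid_nudging_tendency(model, analysis).shape,
              spectral_nudging_tendency(model, analysis).shape)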

  18. RGLite, an interface between ROOT and gLite—proof on the grid

    NASA Astrophysics Data System (ADS)

    Malzacher, P.; Manafov, A.; Schwarz, K.

    2008-07-01

    Using the gLitePROOF package it is possible to perform PROOF-based distributed data analysis on the gLite Grid. The LHC experiments have managed to run globally distributed Monte Carlo productions on the Grid; now the development of tools for data analysis is in the foreground. To grant access, interfaces must be provided. The ROOT/PROOF framework is used as a starting point. Using abstract ROOT classes (TGrid, ...), interfaces can be implemented via which Grid access from ROOT can be accomplished. A concrete implementation exists for the ALICE Grid environment AliEn. Within the D-Grid project an interface to gLite, the common Grid middleware of all LHC experiments, has been created. It is therefore possible to query Grid file catalogues from ROOT for the location of the data to be analysed. Grid jobs can be submitted into a gLite-based Grid, their status can be queried, and their results can be obtained.

  19. Sharing Data and Analytical Resources Securely in a Biomedical Research Grid Environment

    PubMed Central

    Langella, Stephen; Hastings, Shannon; Oster, Scott; Pan, Tony; Sharma, Ashish; Permar, Justin; Ervin, David; Cambazoglu, B. Barla; Kurc, Tahsin; Saltz, Joel

    2008-01-01

    Objectives To develop a security infrastructure to support controlled and secure access to data and analytical resources in a biomedical research Grid environment, while facilitating resource sharing among collaborators. Design A Grid security infrastructure, called Grid Authentication and Authorization with Reliably Distributed Services (GAARDS), is developed as a key architecture component of the NCI-funded cancer Biomedical Informatics Grid (caBIG™). GAARDS is designed to support, in a distributed environment, 1) efficient provisioning and federation of user identities and credentials; 2) group-based access control with which resource providers can enforce policies based on community-accepted groups and local groups; and 3) management of a trust fabric so that policies can be enforced based on required levels of assurance. Measurements GAARDS is implemented as a suite of Grid services and administrative tools. It provides three core services: Dorian for management and federation of user identities, the Grid Trust Service for maintaining and provisioning a federated trust fabric within the Grid environment, and Grid Grouper for enforcing authorization policies based on both local and Grid-level groups. Results The GAARDS infrastructure is available as a stand-alone system and as a component of the caGrid infrastructure. More information about GAARDS can be accessed at http://www.cagrid.org. Conclusions GAARDS provides a comprehensive system to address the security challenges associated with environments in which resources may be located at different sites, requests to access the resources may cross institutional boundaries, and user credentials are created, managed, and revoked dynamically in a decentralized manner. PMID:18308979

  20. Intrinsic alignment of redMaPPer clusters: cluster shape-matter density correlation

    NASA Astrophysics Data System (ADS)

    van Uitert, Edo; Joachimi, Benjamin

    2017-07-01

    We measure the alignment of the shapes of galaxy clusters, as traced by their satellite distributions, with the matter density field using the public redMaPPer catalogue based on Sloan Digital Sky Survey-Data Release 8 (SDSS-DR8), which contains 26 111 clusters up to z ≈ 0.6. The clusters are split into nine redshift and richness samples; in each of them, we detect a positive alignment, showing that clusters point towards density peaks. We interpret the measurements within the tidal alignment paradigm, allowing for a richness and redshift dependence. The intrinsic alignment (IA) amplitude at the pivot redshift z = 0.3 and pivot richness λ = 30 is A_IA^gen = 12.6 (+1.5/-1.2). We obtain tentative evidence that the signal increases towards higher richness and lower redshift. Our measurements agree well with results of maxBCG clusters and with dark-matter-only simulations. Comparing our results to the IA measurements of luminous red galaxies, we find that the IA amplitude of galaxy clusters forms a smooth extension towards higher mass. This suggests that these systems share a common alignment mechanism, which can be exploited to improve our physical understanding of IA.

  1. Resource assessment in Western Australia using a geographic information system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, A.

    1991-03-01

    Three study areas in Western Australia covering from 77,000 to 425,000 mi² were examined for oil and gas potential using a geographic information system (GIS). A data base of source rock thickness, source richness, maturity, and expulsion efficiency was created for each interval. The GIS (Arc/Info) was used to create, manage, and analyze data for each interval in each study area. Source rock thickness and source richness data were added to the data base from digitized data. Maturity information was generated with Arc/Info by combining geochemical and depth-to-structure data. Expulsion efficiency data were created by a system-level Arc/Info program. After the data base for each interval was built, the GIS was used to analyze the geologic data. The analysis consisted of converting each data layer into a lattice (grid) and using the lattice operations in Arc/Info (addition, multiplication, division, and subtraction) to combine the data layers. Additional techniques for combining and selecting data were developed using Arc/Info system-level programs. The procedure for performing the analyses was written as macros in Arc/Info's macro programming language (AML). The results of the analysis were estimates of oil and gas volumes for each interval. The resultant volumes were produced in tabular form for reports and in cartographic form for presentation. The geographic information system provided several clear advantages over traditional methods of resource assessment, including simplified management, updating, and editing of geologic data.
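
    The lattice-algebra step amounts to cell-by-cell arithmetic on co-registered rasters; a generic numpy analogue (the layer ranges and the combination formula below are hypothetical, not the study's assessment equations) is:

        import numpy as np

        rng = np.random.default_rng(3)
        shape = (200, 200)   # hypothetical lattice over one study area

        thickness_m = rng.uniform(0, 50, shape)   # source rock thickness layer
        toc_pct = rng.uniform(0, 8, shape)        # source richness (total organic carbon)
        maturity = rng.uniform(0, 1, shape)       # maturity index from geochemistry/depth
        expulsion = rng.uniform(0, 1, shape)      # expulsion efficiency layer

        # Cell-by-cell combination of layers, the raster equivalent of the lattice
        # addition/multiplication used to produce a per-interval volume estimate.
        relative_yield = thickness_m * toc_pct * maturity * expulsion
        print(f"relative yield index for the interval: {relative_yield.sum():.3e}")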

  2. Data based abnormality detection

    NASA Astrophysics Data System (ADS)

    Purwar, Yashasvi

    Data-based abnormality detection is a growing research field focused on extracting information from feature-rich data. Such methods are considered non-intrusive and non-destructive in nature, which gives them a clear advantage over conventional methods. In this study, we explore different streams of data-based anomaly detection. We propose extensions and revisions to an existing valve stiction detection algorithm, supported by an industrial case study. We also explore the area of image analysis and propose a complete solution for malaria diagnosis. The proposed method is tested on images provided by the pathology laboratory at Alberta Health Service. We also address the robustness and practicality of the proposed solution.

  3. Conceptual Design of the Everglades Depth Estimation Network (EDEN) Grid

    USGS Publications Warehouse

    Jones, John W.; Price, Susan D.

    2007-01-01

    INTRODUCTION The Everglades Depth Estimation Network (EDEN) offers a consistent and documented dataset that can be used to guide large-scale field operations, to integrate hydrologic and ecological responses, and to support biological and ecological assessments that measure ecosystem responses to the Comprehensive Everglades Restoration Plan (Telis, 2006). Ground elevation data for the greater Everglades and the digital ground elevation models derived from them form the foundation for all EDEN water depth and associated ecologic/hydrologic modeling (Jones, 2004; Jones and Price, 2007). To use EDEN water depth and duration information most effectively, it is important to be able to view and manipulate information on elevation data quality and other land cover and habitat characteristics across the Everglades region. These requirements led to the development of the geographic data layer described in this techniques and methods report. Extensive experience in GIS data development, distribution, and analysis informed the design of the geographic data layer used to index elevation and other surface characteristics for the Greater Everglades region. To allow for simplicity of design and use, the EDEN area was broken into a large number of equal-sized rectangles ('Cells') that in total are referred to here as the 'grid'. Some characteristics of this grid, such as the size of its cells, its origin, the area of Florida it is designed to represent, and the individual grid cell identifiers, could not be changed once the grid database was developed. Therefore, these characteristics were selected to design as robust a grid as possible and to ensure the grid's long-term utility. It is desirable to include all pertinent information known about elevation and elevation data collection as grid attributes. Also, it is very important to allow for efficient grid post-processing, sub-setting, analysis, and distribution. This document details the conceptual design of the EDEN grid spatial parameters and cell attribute-table content.

  4. Initial Conceptualization and Application of the Alaska Thermokarst Model

    NASA Astrophysics Data System (ADS)

    Bolton, W. R.; Lara, M. J.; Genet, H.; Romanovsky, V. E.; McGuire, A. D.

    2015-12-01

    Thermokarst topography forms whenever ice-rich permafrost thaws and the ground subsides due to the volume loss when ground ice transitions to water. The Alaska Thermokarst Model (ATM) is a large-scale, state-and-transition model designed to simulate transitions between landscape units affected by thermokarst disturbance. The ATM uses a frame-based methodology to track transitions and the proportions of cohorts within a 1-km2 grid cell. In the arctic tundra environment, the ATM tracks thermokarst-related transitions among wetland tundra, graminoid tundra, shrub tundra, and thermokarst lakes. In the boreal forest environment, the ATM tracks transitions among forested permafrost plateau, thermokarst lakes, and collapse-scar fens and bogs. The transition from one cohort to another due to thermokarst processes can take place if thaw reaches ice-rich ground layers, either due to pulse disturbance (e.g., a large precipitation event or fire) or due to gradual active layer deepening that eventually results in penetration of the protective layer. The protective layer buffers the ice-rich soils from the land surface and is critical in determining how susceptible an area is to thermokarst degradation. The rate of terrain transition in our model is determined by a set of rules that are based upon the ice content of the soil, the drainage efficiency (or the ability of the landscape to store or transport water), the cumulative probability of thermokarst initiation, distance from rivers, lake dynamics (increasing, decreasing, or stable), and other factors. Tundra types are allowed to transition from one type to another (for example, wetland tundra to graminoid tundra) under favorable climatic conditions. In this study, we present our conceptualization and initial simulation results from the arctic (the Barrow Peninsula) and boreal (the Tanana Flats) regions of Alaska.
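
    A frame-based state-and-transition scheme of this kind reduces to bookkeeping of cohort fractions within each grid cell; a toy sketch (the cohort names follow the abstract, but the transition rates and rule set are invented for illustration) is:

        # Hypothetical annual transition rates between cohorts within one 1-km2 cell,
        # applied only when the rule set (ice content, drainage, protective layer) permits.
        rates = {("wetland_tundra", "thermokarst_lake"): 0.02,
                 ("graminoid_tundra", "wetland_tundra"): 0.01,
                 ("shrub_tundra", "graminoid_tundra"): 0.005}

        cell = {"wetland_tundra": 0.40, "graminoid_tundra": 0.35,
                "shrub_tundra": 0.20, "thermokarst_lake": 0.05}

        def step(cell, rates):
            """Advance the cohort fractions in one cell by one year of transitions."""
            new = dict(cell)
            for (src, dst), r in rates.items():
                moved = cell[src] * r
                new[src] -= moved
                new[dst] += moved
            return new

        for _ in range(10):
            cell = step(cell, rates)
        print({k: round(v, 3) for k, v in cell.items()})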

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rush, Jason; Holubnyak, Yevhen; Watney, Willard

    This DOE-funded project evaluates the utility of seismic volumetric curvature (VC) for predicting stratal and structural architecture diagnostic of paleokarst reservoirs. Of special interest are applications geared toward carbon capture, utilization, and storage (CCUS). VC has been championed for identifying faults (offset <¼ λ) that cannot be imaged by conventional 3-D seismic attributes such as coherence. The objective of this research was to evaluate VC-techniques for reducing uncertainties in reservoir compartmentalization studies and seal risk assessments especially for saline aquifers. A 2000-ft horizontal lateral was purposefully drilled across VC-imaged lineaments—interpreted to record a fractured and a fault-bounded doline—to physically confirm their presence. The 15-mi² study area is located in southeastern Bemis-Shutts Field, which is situated along the crest of the Central Kansas Uplift (CKU) in Ellis County, Kansas. The uppermost Arbuckle (200+ ft) has extensive paleokarst including collapsed paleocaverns and dolines related to exceedingly prolonged pre-Simpson (Sauk–Tippecanoe) and/or pre-Pennsylvanian subaerial exposure. A lateral borehole was successfully drilled across the full extent (~1100 ft) of a VC-inferred paleokarst doline. Triple combo (GR-neutron/density-resistivity), full-wave sonic, and borehole micro-imager logs were successfully run to TD on drill-pipe. Results from the formation evaluation reveal breccias (e.g., crackle, mosaic, chaotic), fractures, faults, vugs (1-6"), and unaffected host strata consistent with the pre-spud interpretation. Well-rounded pebbles were also observed on the image log. VC-inferred lineaments coincide with 20–80-ft wide intervals of high GR values (100+ API), matrix-rich breccias, and faults. To further demonstrate their utility, VC attributes are integrated into a geocellular modeling workflow: 1) to constrain the structural model; 2) to generate facies probability grids; and 3) to collocate petrophysical models to separate-vug rock fabrics along solution-enlarged fault and fracture systems. Simulation-based studies demonstrate a potential alternative field development model for developing CO2 storage sites that target carbonate reservoirs overprinted by paleokarst. Simulation results for this complex reservoir indicate that individual fault blocks could function as discrete containers for CO2 storage, thereby reducing the risk of plume migration outside the legally defined extent of the permitted storage site. Vertically extensive, anastomosing, solution-enlarged fault/fracture systems, infilled by clay-rich sediments, would operate as non-to-low permeability vertical "curtains" that restrict CO2 movement beyond the confines of the CO2 storage site. Such a location could be developed in a checker-board fashion, with CO2 injection operations occurring in one block and surveillance operations occurring in the adjacent block. Such naturally partitioned reservoirs may be ideal candidates for reducing risks associated with CO2 plume breakthrough.

  6. Arc Length Based Grid Distribution For Surface and Volume Grids

    NASA Technical Reports Server (NTRS)

    Mastin, C. Wayne

    1996-01-01

    Techniques are presented for distributing grid points on parametric surfaces and in volumes according to a specified distribution of arc length. Interpolation techniques are introduced which permit a given distribution of grid points on the edges of a three-dimensional grid block to be propagated through the surface and volume grids. Examples demonstrate how these methods can be used to improve the quality of grids generated by transfinite interpolation.

  7. JTS and its Application in Environmental Protection Applications

    NASA Astrophysics Data System (ADS)

    Atanassov, Emanouil; Gurov, Todor; Slavov, Dimitar; Ivanovska, Sofiya; Karaivanova, Aneta

    2010-05-01

    The environmental protection domain was identified as being of high interest for South East Europe, addressing practical problems related to security and quality of life. The gridification of the Bulgarian applications MCSAES (Monte Carlo Sensitivity Analysis for Environmental Studies, which aims to develop an efficient Grid implementation of a sensitivity analysis of the Danish Eulerian Model), MSACM (Multi-Scale Atmospheric Composition Modeling, which aims to produce an integrated, multi-scale, Balkan-region-oriented modelling system able to interface the scales of the problem from emissions on the urban scale to their transport and transformation on the local and regional scales), and MSERRHSA (Modeling System for Emergency Response to the Release of Harmful Substances in the Atmosphere, which aims to develop and deploy a modeling system for emergency response to the release of harmful substances in the atmosphere, targeted at the SEE and more specifically the Balkan region) faces several challenges. These applications are resource intensive, in terms of both CPU utilization and data transfers and storage. The use of the applications for operational purposes poses requirements for availability of resources, which are difficult to meet in a dynamically changing Grid environment. The validation of the applications is resource intensive and time consuming. The successful resolution of these problems requires collaborative work and support on the part of the infrastructure operators. However, the infrastructure operators are interested in avoiding underutilization of resources. That is why we developed the Job Track Service and tested it during the development of the grid implementations of MCSAES, MSACM and MSERRHSA. The Job Track Service (JTS) is a grid middleware component which facilitates the provision of Quality of Service in grid infrastructures using gLite middleware, such as EGEE and SEEGRID. The service is based on messaging middleware and uses standard protocols such as AMQP (Advanced Message Queuing Protocol) and XMPP (eXtensible Messaging and Presence Protocol) for real-time communication, while its security model is based on GSI authentication. It enables resource owners to provide the most popular types of QoS of execution to some of their users, using a standardized model. The first version of the service offered services to individual users. In this work we describe a new version of the Job Track Service offering application-specific functionality, geared towards the specific needs of the Environmental Modelling and Protection applications and oriented towards collaborative usage by groups and subgroups of users. We used the modular design of the JTS to implement plugins enabling smoother interaction of the users with the Grid environment. Our experience shows improved response times and a decreased failure rate for executions of the applications. In this work we present such observations from the use of the South East European Grid infrastructure.

  8. Red palm oil-supplemented and biofortified gari on the carotenoid and retinyl palmitate concentrations of triacylglycerol-rich plasma of women

    USDA-ARS?s Scientific Manuscript database

    Boiled biofortified cassava containing β-carotene (BC) can increase retinyl palmitate (RP) in triacylglycerol (TAG)-rich plasma. Thus, it might alleviate vitamin A deficiency. Cassava requires extensive preparation to decrease its level of cyanogenic glucosides, which can be fatal. Garification ...

  9. Effective Selection: A Study of First-Line Supervisor Selection Processes in the Department of Homeland Security

    DTIC Science & Technology

    2011-03-01

    performance and the extensive studies connecting perceptive measures to actual performance (Bommer, Johnson, Rich, Podsakoff , & MacKenzie, 1995; Brewer, 2005...theory of modern politics. London: Polity. Bommer, W. H., Johnson, J. L., Rich, G., Podsakoff , P. M., & MacKenzie, S. B. (1995). On the

  10. Black-sphere approximation to nuclei and its application to reactions with neutron-rich nuclei

    NASA Astrophysics Data System (ADS)

    Kohama, Akihisa; Iida, Kei; Oyamatsu, Kazuhiro

    2013-09-01

    We briefly review our formula for a proton-nucleus total reaction cross section, σR, constructed in the black-sphere approximation of nuclei, in which a nucleus is viewed as a "black" sphere of radius "a". An extension to reactions involving neutron-rich nuclei is also reported.

  11. Summer distribution and species richness of non-native fishes in the mainstem Willamette River, oregon, 1944-2006

    EPA Science Inventory

    We reviewed the results of seven extensive and two reach-specific fish surveys conducted on the mainstem Willamette River between 1944 and 2006 to document changes in the summer distribution and species richness of non-native fishes through time and the relative abundances of the...

  12. Feasibility and its characteristics of CO2 laser micromachining-based PMMA anti-scattering grid estimated by MCNP code simulation.

    PubMed

    Bae, Jun Woo; Kim, Hee Reyoung

    2018-01-01

    Anti-scattering grids have been used to improve image quality. However, applying a commonly used linear or parallel grid causes image distortion, and a focusing grid requires precise fabrication technology, which is expensive. The aim was to investigate and analyze whether a CO2 laser micromachining-based PMMA anti-scattering grid can improve grid performance at a lower cost; improved grid performance would in turn improve image quality. The cross-sectional shape of CO2 laser-machined PMMA is similar to the letter 'V'. Performance was characterized by the contrast improvement factor (CIF) and the Bucky factor. Four types of grid were tested: thin parallel, thick parallel, 'V'-type and 'inverse V'-type grids. For a Bucky factor of 2.1, the CIF of the grids with both the 'V' and inverse 'V' shapes had a value of 1.53, while the thin and thick parallel types had values of 1.43 and 1.65, respectively. The 'V'-shape grid manufactured by CO2 laser micromachining showed a higher CIF than the parallel grid with the same shielding-material channel width. The 'V'-shape grid could therefore replace the conventional parallel grid when it is difficult to fabricate a high-aspect-ratio grid.
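
    For context, the two figures of merit quoted here are conventionally defined (standard radiographic definitions, not formulas taken from this study) as

        \mathrm{CIF} = \frac{C_{\mathrm{with\ grid}}}{C_{\mathrm{without\ grid}}}, \qquad \mathrm{Bucky\ factor} = \frac{\mathrm{exposure\ required\ with\ grid}}{\mathrm{exposure\ required\ without\ grid}}

    so a higher CIF at a given Bucky factor indicates better scatter rejection for the same dose penalty.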

  13. Low gravity containerless processing of immiscible gold rhodium alloys

    NASA Technical Reports Server (NTRS)

    Andrews, J. Barry

    1986-01-01

    Under normal one-g conditions, immiscible alloys segregate extensively during solidification due to sedimentation of the denser of the immiscible liquid phases. However, under low-g conditions it should be possible to form a dispersion of the two immiscible liquids and maintain this dispersed structure during solidification. Immiscible (hypermonotectic) gold-rhodium alloys were processed in the Marshall Space Flight Center 105-meter drop tube in order to investigate the influence of low-gravity, containerless solidification on their microstructure. Hypermonotectic alloys composed of 65 atomic % rhodium exhibited a tendency for the gold-rich liquid to wet the outer surface of the containerless-processed samples. This tendency led to extensive segregation in several cases. However, well-dispersed microstructures consisting of 2 to 3 micron diameter rhodium-rich spheres in a gold-rich matrix were produced in 23.4 atomic % rhodium alloys. This is one of the best dispersions obtained in research on immiscible alloy systems to date.

  14. The two-point correlation function for groups of galaxies in the Center for Astrophysics redshift survey

    NASA Technical Reports Server (NTRS)

    Ramella, Massimo; Geller, Margaret J.; Huchra, John P.

    1990-01-01

    The large-scale distribution of groups of galaxies selected from complete slices of the CfA redshift survey extension is examined. The survey is used to reexamine the contribution of group members to the galaxy correlation function. The relationship between the correlation function for groups and those calculated for rich clusters is discussed, and the results for groups are examined as an extension of the relation between correlation function amplitude and richness. The group correlation function indicates that groups and individual galaxies are equivalent tracers of the large-scale matter distribution. The distribution of group centers is equivalent to random sampling of the galaxy distribution. The amplitude of the correlation function for groups is consistent with an extrapolation of the amplitude-richness relation for clusters. The amplitude scaled by the mean intersystem separation is also consistent with results for richer clusters.
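
    Correlation functions of this kind are conventionally fitted with the power law (a standard parameterization, not a value quoted from this record)

        \xi(r) = \left( \frac{r}{r_0} \right)^{-\gamma}

    so the "amplitude" compared between groups and rich clusters is essentially the correlation length r_0, here also scaled by the mean intersystem separation.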

  15. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE PAGES

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  16. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  17. Smart grid initialization reduces the computational complexity of multi-objective image registration based on a dual-dynamic transformation model to account for large anatomical differences

    NASA Astrophysics Data System (ADS)

    Bosman, Peter A. N.; Alderliesten, Tanja

    2016-03-01

    We recently demonstrated the strong potential of using dual-dynamic transformation models when tackling deformable image registration problems involving large anatomical differences. Dual-dynamic transformation models employ two moving grids instead of the common single moving grid for the target image (and single fixed grid for the source image). We previously employed powerful optimization algorithms to make use of the additional flexibility offered by a dual-dynamic transformation model, with good results, directly obtaining insight into the trade-off between important registration objectives as a result of taking a multi-objective approach to optimization. However, optimization has so far been initialized using two regular grids, which still leaves a great potential of dual-dynamic transformation models untapped: a priori grid alignment with image structures/areas that are expected to deform more. This allows (far) fewer grid points to be used, compared to using a sufficiently refined regular grid, leading to (far) more efficient optimization or, equivalently, more accurate results using the same number of grid points. We study the implications of exploiting this potential by experimenting with two new smart grid initialization procedures: one manual and expert-based, and one automated and image-feature-based. We consider a CT test case with large differences in bladder volume, with and without a multi-resolution scheme, and find a substantial benefit of using smart grid initialization.

  18. An objective decision model of power grid environmental protection based on environmental influence index and energy-saving and emission-reducing index

    NASA Astrophysics Data System (ADS)

    Feng, Jun-shu; Jin, Yan-ming; Hao, Wei-hua

    2017-01-01

    Based on modelling the environmental influence index of power transmission and transformation projects and the energy-saving and emission-reduction index of the source-grid-load power system, this paper establishes an objective decision model for power grid environmental protection, with the constraints that the power grid environmental protection objectives be legal and economical, and considering both the positive and negative influences of the grid on the environment over the whole grid life cycle. This model can be used to guide the planning of power grid environmental protection. A numerical simulation of Jiangsu province's power grid environmental protection objective decision model has been carried out, and the results show that, as investment increases, the energy-saving and emission-reduction benefit objective is the first to reach its maximum, followed by the environmental influence objective reaching its minimum.

  19. Integration and management of massive remote-sensing data based on GeoSOT subdivision model

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Cheng, Chengqi; Chen, Bo; Meng, Li

    2016-07-01

    Owing to the rapid development of earth observation technology, the volume of spatial information is growing rapidly; therefore, improving query and retrieval speed from large, rich data sources in remote-sensing data management systems is quite urgent. A global subdivision model, the geographic coordinate subdivision grid with one-dimension integer coding on a 2^n-tree (GeoSOT), which we propose as a solution, has been used in data management organizations. However, because a spatial object may cover several grid cells, considerable data redundancy will occur when the data are stored in relational databases. To solve this redundancy problem, we first combined the subdivision model with a spatial array database containing an inverted index. We proposed an improved approach for integrating and managing massive remote-sensing data. By adding a spatial code column in an array format to a database, spatial information in remote-sensing metadata can be stored and logically subdivided. We implemented our method in a Kingbase Enterprise Server database system and compared the results with the Oracle platform by simulating worldwide image data. Experimental results showed that our approach performed better than Oracle in terms of data integration and time and space efficiency. Our approach also offers an efficient storage management system for existing storage centers and management systems.

  20. Hippocampome.org: a knowledge base of neuron types in the rodent hippocampus.

    PubMed

    Wheeler, Diek W; White, Charise M; Rees, Christopher L; Komendantov, Alexander O; Hamilton, David J; Ascoli, Giorgio A

    2015-09-24

    Hippocampome.org is a comprehensive knowledge base of neuron types in the rodent hippocampal formation (dentate gyrus, CA3, CA2, CA1, subiculum, and entorhinal cortex). Although the hippocampal literature is remarkably information-rich, neuron properties are often reported with incompletely defined and notoriously inconsistent terminology, creating a formidable challenge for data integration. Our extensive literature mining and data reconciliation identified 122 neuron types based on neurotransmitter, axonal and dendritic patterns, synaptic specificity, electrophysiology, and molecular biomarkers. All ∼3700 annotated properties are individually supported by specific evidence (∼14,000 pieces) in peer-reviewed publications. Systematic analysis of this unprecedented amount of machine-readable information reveals novel correlations among neuron types and properties, the potential connectivity of the full hippocampal circuitry, and outstanding knowledge gaps. User-friendly browsing and online querying of Hippocampome.org may aid design and interpretation of both experiments and simulations. This powerful, simple, and extensible neuron classification endeavor is unique in its detail, utility, and completeness.

  1. From three-dimensional long-term tectonic numerical models to synthetic structural data: semi-automatic extraction of instantaneous & finite strain quantities

    NASA Astrophysics Data System (ADS)

    Duclaux, Guillaume; May, Dave

    2017-04-01

    Over the past three decades thermo-mechanical numerical modelling has transformed the way we look at deformation in the lithosphere. More than just generating aesthetically pleasing pictures, the output from a numerical model contains a rich source of quantitative information that can be used to measure deformation quantities in plan view or three dimensions. Adding value to any numerical experiment requires a thorough post-processing of the modelling results. Such work aims to produce visual information that will resonate with seasoned structural geologists and assist with comparing experimental and observational data. Here we introduce two methods to generate synthetic structural data from numerical model outputs. We first present an image processing and shape recognition workflow developed to extract active fault orientations from surface velocity gradients. In order to measure active fault lengths and directions, along with their distribution at the surface of the model, we implemented an automated sequential mapping technique based on the second invariant of the strain rate tensor and using a suite of Python functions. Active fault direction measurements are achieved using a probabilistic method for extracting the orientation of linear features from any surface. This method has the undeniable advantage of avoiding interpretation bias. Strike measurements for individual segments are weighted according to their length, and orientation distribution data are presented in equal-area, moving-average rose diagrams produced using a weighted method. Finally, we discuss a method for mapping finite strain in three dimensions. A high-resolution Lagrangian regular grid which advects during the numerical experiment is used to track the progressive deformation within the model. With these data we can measure the finite strain ellipsoids for any region of interest in the model. This method assumes that the finite strain is homogeneous within one unit cell of the grid. We can compute individual ellipsoid parameters (orientation, shape, etc.) and represent the finite deformation for any region of interest in a Flinn diagram. In addition, we can use the finite strain ellipsoids to estimate the prevailing foliation and/or lineation directions anywhere in the model. These two methods are applied to measure the instantaneous and finite deformation patterns within an oblique rift zone undergoing constant extension in the absence of surface processes.
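    As an illustration of the finite-strain step, assuming the advected Lagrangian grid yields a deformation gradient F for each unit cell (strain homogeneous within the cell), the ellipsoid axes follow from the left Cauchy-Green tensor B = F F^T; the sketch below, with a made-up F, also returns the Flinn-diagram coordinates.

```python
# Minimal sketch: finite strain ellipsoid from a deformation gradient F.
# The principal stretches are the square roots of the eigenvalues of the
# left Cauchy-Green tensor B = F F^T.
import numpy as np

def finite_strain_ellipsoid(F):
    B = F @ F.T                        # left Cauchy-Green tensor
    eigval, eigvec = np.linalg.eigh(B)
    stretches = np.sqrt(eigval)[::-1]  # ordered X >= Y >= Z
    axes = eigvec[:, ::-1]             # corresponding principal directions
    return stretches, axes

def flinn_parameters(stretches):
    X, Y, Z = stretches
    a, b = X / Y, Y / Z                # Flinn diagram coordinates
    k = (a - 1.0) / (b - 1.0) if b != 1.0 else np.inf
    return a, b, k

# Hypothetical deformation gradient: simple shear plus stretching/thinning.
F = np.array([[1.4, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.7]])
stretches, axes = finite_strain_ellipsoid(F)
print(stretches, flinn_parameters(stretches))
```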

  2. The multidimensional Self-Adaptive Grid code, SAGE, version 2

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1995-01-01

    This new report on Version 2 of the SAGE code includes all the information in the original publication plus all upgrades and changes to the SAGE code since that time. The two most significant upgrades are the inclusion of a finite-volume option and the ability to adapt and manipulate zonal-matching multiple-grid files. In addition, the original SAGE code has been upgraded to Version 1.1 and includes all options mentioned in this report, with the exception of the multiple grid option and its associated features. Since Version 2 is a larger and more complex code, it is suggested (but not required) that Version 1.1 be used for single-grid applications. This document contains all the information required to run both versions of SAGE. The formulation of the adaption method is described in the first section of this document. The second section is presented in the form of a user guide that explains the input and execution of the code. The third section provides many examples. Successful application of the SAGE code in both two and three dimensions for the solution of various flow problems has proven the code to be robust, portable, and simple to use. Although the basic formulation follows the method of Nakahashi and Deiwert, many modifications have been made to facilitate the use of the self-adaptive grid method for complex grid structures. Modifications to the method and the simple but extensive input options make this a flexible and user-friendly code. The SAGE code can accommodate two-dimensional and three-dimensional, finite-difference and finite-volume, single grid, and zonal-matching multiple grid flow problems.
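    As a rough illustration of solution-adaptive gridding (not the Nakahashi-Deiwert spring-analogy formulation that SAGE follows), the sketch below redistributes a 1-D grid by equidistributing a gradient-based weight function, clustering points where the solution varies rapidly; all parameters are illustrative.

```python
# Minimal sketch of 1-D solution-adaptive grid redistribution by
# equidistributing a weight function based on the solution gradient.
import numpy as np

def adapt_grid_1d(x, f, alpha=5.0):
    """Return a grid with the same endpoints, clustered where |df/dx| is large."""
    w = 1.0 + alpha * np.abs(np.gradient(f, x))        # weight function
    # Cumulative "arc length" in the weighted metric, normalised to [0, 1].
    s = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    s /= s[-1]
    targets = np.linspace(0.0, 1.0, len(x))
    return np.interp(targets, s, x)                    # invert s(x) at equal increments

# Example: cluster points around a steep tanh front on [0, 1].
x = np.linspace(0.0, 1.0, 41)
f = np.tanh(20.0 * (x - 0.5))
x_adapted = adapt_grid_1d(x, f)
print(np.round(x_adapted[18:23], 3))                   # points concentrated near x = 0.5
```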

  3. Coherent field propagation between tilted planes.

    PubMed

    Stock, Johannes; Worku, Norman Girma; Gross, Herbert

    2017-10-01

    Propagating electromagnetic light fields between nonparallel planes is of special importance, e.g., within the design of novel computer-generated holograms or the simulation of optical systems. In contrast to the extensively discussed evaluation between parallel planes, the diffraction-based propagation of light onto a tilted plane is more burdensome, since discrete fast Fourier transforms cannot be applied directly. In this work, we propose a quasi-fast algorithm (O(N^3 log N)) that deals with this problem. Based on a proper decomposition into three rotations, the vectorial field distribution is calculated on a tilted plane using the spectrum of plane waves. The algorithm works on equidistant grids, so neither nonuniform Fourier transforms nor an explicit complex interpolation is necessary. The proposed algorithm is discussed in detail and applied to several examples of practical interest.
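    The tilted-plane algorithm builds on the plane-wave (angular) spectrum propagator between parallel planes; the sketch below shows only that standard building block on an equidistant grid, not the rotation decomposition itself, and the grid and beam parameters are illustrative assumptions.

```python
# Minimal sketch of angular-spectrum propagation between *parallel* planes.
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, dz):
    """Propagate a sampled scalar field by distance dz on an equidistant grid."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    # Longitudinal spatial frequency; evanescent components are suppressed.
    arg = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * dz) * (arg > 0)
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * transfer)

# Example: propagate a small Gaussian beam by 1 mm at 633 nm.
x = np.linspace(-1e-3, 1e-3, 256)
X, Y = np.meshgrid(x, x)
field0 = np.exp(-(X ** 2 + Y ** 2) / (0.2e-3) ** 2)
field1 = angular_spectrum_propagate(field0, 633e-9, dx=x[1] - x[0], dz=1e-3)
```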

  4. Statistical Analysis of the AIAA Drag Prediction Workshop CFD Solutions

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Hemsch, Michael J.

    2007-01-01

    The first AIAA Drag Prediction Workshop (DPW), held in June 2001, evaluated the results from an extensive N-version test of a collection of Reynolds-Averaged Navier-Stokes CFD codes. The code-to-code scatter was more than an order of magnitude larger than desired for design and experimental validation of cruise conditions for a subsonic transport configuration. The second AIAA Drag Prediction Workshop, held in June 2003, emphasized the determination of installed pylon-nacelle drag increments and grid refinement studies. The code-to-code scatter was significantly reduced compared to the first DPW, but still larger than desired. However, grid refinement studies showed no significant improvement in code-to-code scatter with increasing grid refinement. The third AIAA Drag Prediction Workshop, held in June 2006, focused on the determination of installed side-of-body fairing drag increments and grid refinement studies for clean attached flow on wing alone configurations and for separated flow on the DLR-F6 subsonic transport model. This report compares the transonic cruise prediction results of the second and third workshops using statistical analysis.

  5. Estimating the Subsurface Basement Topography of Dodge County, Wisconsin Using Three Dimensional Modeling of Gravity and Aeromagnetic Data

    NASA Astrophysics Data System (ADS)

    MacAlister, E.; Skalbeck, J.; Stewart, E.

    2016-12-01

    Since the late 1800s, geologic studies have been completed in Wisconsin in pursuit of understanding the basement topography and locating economically viable mineral resources. The doubly plunging Baraboo Syncline located in Columbia and Sauk Counties provides a classic record of Precambrian deformation. A similar buried structure is thought to exist in adjacent Dodge County based on a prominent aeromagnetic anomaly. For this study, 3-D modeling of gravity and aeromagnetic survey data was used to approximate the structure of the Precambrian basement topography beneath Dodge County, Wisconsin. The aim of the research was to determine a suitable basement topography grid using potential field data and then use this grid as the base for groundwater flow models. Geosoft Oasis Montaj GM-SYS 3D modeling software was used to build grids of subsurface layers, and the model was constrained by well records of basement rock elevations located throughout the county. The study demonstrated that there is a complex network of crystalline basement structures that have been folded through tectonic activity during the Precambrian. A thick layer of iron-rich sedimentary material was deposited on top of the basement rocks, causing a distinct magnetic signature that outlined the basement structure in the magnetic survey. Preliminary results reveal an iron layer with a density of 3.7 g/cm³ and magnetic susceptibility of 8000 × 10⁻⁶ cgs that is approximately 500 feet thick and ranges in elevation from 300 meters below to 400 meters above sea level. The 3-D model depths are consistent with depths from recent core drilling operations performed by the Wisconsin Geological and Natural History Survey. Knowing the depth to and structure of basement rock throughout Dodge County and Wisconsin plays an important role in understanding the geologic history of the region. Also, better resolution of the basement topography can enhance the accuracy of future groundwater flow models.

  6. Al-rich Chondrules: Petrologic Basis for Their Diversity, and Relation to Type C CAIs

    NASA Technical Reports Server (NTRS)

    MacPherson, G. J.; Huss, G. R.

    2003-01-01

    Al-rich chondrules share mineralogical and chemical properties with, and are intermediate in a volatility sense between, CAIs and ferromagnesian chondrules. In some way they must be petrogenetic links between the two. A recent upsurge of interest in Al-rich chondrules is due to their constituent plagioclase feldspar and Al-rich glass being amenable to successful ion microprobe searches for radiogenic Mg-26, the decay product of Al-26 (t_{1/2} = 720,000 y). This has allowed estimates to be made of the time duration between CAI formation and the onset of Al-rich (and possibly, by extension, ferromagnesian) chondrule formation, on the order of 1.5-2.5 million years.

  7. Decentralized control of units in smart grids for the support of renewable energy supply

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonnenschein, Michael, E-mail: Michael.Sonnenschein@Uni-Oldenburg.DE; Lünsdorf, Ontje, E-mail: Ontje.Luensdorf@OFFIS.DE; Bremer, Jörg, E-mail: Joerg.Bremer@Uni-Oldenburg.DE

    Due to the significant environmental impact of power production from fossil fuels and nuclear fission, future energy systems will increasingly rely on distributed and renewable energy sources (RES). The electrical feed-in from photovoltaic (PV) systems and wind energy converters (WEC) varies greatly both over short and long time periods (from minutes to seasons), and (not only) by this effect the supply of electrical power from RES and the demand for electrical power are not per se matching. In addition, with a growing share of generation capacity especially in distribution grids, the top-down paradigm of electricity distribution is gradually replaced by a bottom-up power supply. This altogether leads to new problems regarding the safe and reliable operation of power grids. In order to address these challenges, the notion of Smart Grids has been introduced. The inherent flexibilities, i.e. the set of feasible power schedules, of distributed power units have to be controlled in order to support demand–supply matching as well as stable grid operation. Controllable power units are e.g. combined heat and power plants, power storage systems such as batteries, and flexible power consumers such as heat pumps. By controlling the flexibilities of these units we are particularly able to optimize the local utilization of RES feed-in in a given power grid by integrating both supply and demand management measures with special respect to the electrical infrastructure. In this context, decentralized systems, autonomous agents and the concept of self-organizing systems will become key elements of the ICT based control of power units. In this contribution, we first show how a decentralized load management system for battery charging/discharging of electrical vehicles (EVs) can increase the locally used share of supply from PV systems in a low voltage grid. For a reliable demand side management of large sets of appliances, dynamic clustering of these appliances into uniformly controlled appliance sets is necessary. We introduce a method for self-organized clustering for this purpose and show how control of such clusters can affect load peaks in distribution grids. Subsequently, we give a short overview on how we are going to expand the idea of self-organized clusters of units into creating a virtual control center for dynamic virtual power plants (DVPP) offering products at a power market. For an efficient organization of DVPPs, the flexibilities of units have to be represented in a compact and easy to use manner. We give an introduction to how the problem of representing a set of possibly 10^100 feasible schedules can be solved by a machine-learning approach. In summary, this article provides an overall impression of how we use agent based control techniques and methods of self-organization to support the further integration of distributed and renewable energy sources into power grids and energy markets. - Highlights: • Distributed load management for electrical vehicles supports local supply from PV. • Appliances can self-organize into so called virtual appliances for load control. • Dynamic VPPs can be controlled by extensively decentralized control centers. • Flexibilities of units can efficiently be represented by support-vector descriptions.

  8. A Roadmap for caGrid, an Enterprise Grid Architecture for Biomedical Research

    PubMed Central

    Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Hong, Neil Chue

    2012-01-01

    caGrid is a middleware system which combines the Grid computing, the service oriented architecture, and the model driven architecture paradigms to support development of interoperable data and analytical resources and federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG™) program. This program is established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from biomedical research, Grid computing, and high performance computing communities. PMID:18560123

  9. A roadmap for caGrid, an enterprise Grid architecture for biomedical research.

    PubMed

    Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Chue Hong, Neil

    2008-01-01

    caGrid is a middleware system which combines the Grid computing, the service oriented architecture, and the model driven architecture paradigms to support development of interoperable data and analytical resources and federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG) program. This program is established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from biomedical research, Grid computing, and high performance computing communities.

  10. Reliability analysis in interdependent smart grid systems

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems and studying the underlying network model, its interactions and relationships, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. In addition, based on percolation theory, we study the cascading failure effect and provide a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results also show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse. We also determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
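    A minimal sketch of the percolation-style reliability measure, simplified to a single (not interdependent) network: remove a random fraction of nodes and track the relative size of the giant functioning component. The random-graph stand-in and all parameters are assumptions for illustration only.

```python
# Minimal sketch: giant-component size under random node failures.
import random
import networkx as nx

def giant_component_fraction(G, failure_fraction, trials=20):
    sizes = []
    n = G.number_of_nodes()
    k = int(failure_fraction * n)
    for _ in range(trials):
        H = G.copy()
        H.remove_nodes_from(random.sample(list(H.nodes()), k))
        if H.number_of_nodes() == 0:
            sizes.append(0.0)
            continue
        largest = max(nx.connected_components(H), key=len)
        sizes.append(len(largest) / n)
    return sum(sizes) / len(sizes)

G = nx.erdos_renyi_graph(1000, 0.004)   # stand-in for a power-grid topology
for p in (0.1, 0.3, 0.5, 0.7):
    print(p, giant_component_fraction(G, p))
```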

  11. Vector-based navigation using grid-like representations in artificial agents.

    PubMed

    Banino, Andrea; Barry, Caswell; Uria, Benigno; Blundell, Charles; Lillicrap, Timothy; Mirowski, Piotr; Pritzel, Alexander; Chadwick, Martin J; Degris, Thomas; Modayil, Joseph; Wayne, Greg; Soyer, Hubert; Viola, Fabio; Zhang, Brian; Goroshin, Ross; Rabinowitz, Neil; Pascanu, Razvan; Beattie, Charlie; Petersen, Stig; Sadik, Amir; Gaffney, Stephen; King, Helen; Kavukcuoglu, Koray; Hassabis, Demis; Hadsell, Raia; Kumaran, Dharshan

    2018-05-01

    Deep neural networks have achieved impressive successes in fields ranging from object recognition to complex games such as Go [1,2]. Navigation, however, remains a substantial challenge for artificial agents, with deep neural networks trained by reinforcement learning [3-5] failing to rival the proficiency of mammalian spatial behaviour, which is underpinned by grid cells in the entorhinal cortex [6]. Grid cells are thought to provide a multi-scale periodic representation that functions as a metric for coding space [7,8] and is critical for integrating self-motion (path integration) [6,7,9] and planning direct trajectories to goals (vector-based navigation) [7,10,11]. Here we set out to leverage the computational functions of grid cells to develop a deep reinforcement learning agent with mammal-like navigational abilities. We first trained a recurrent network to perform path integration, leading to the emergence of representations resembling grid cells, as well as other entorhinal cell types [12]. We then showed that this representation provided an effective basis for an agent to locate goals in challenging, unfamiliar, and changeable environments, optimizing the primary objective of navigation through deep reinforcement learning. The performance of agents endowed with grid-like representations surpassed that of an expert human and comparison agents, with the metric quantities necessary for vector-based navigation derived from grid-like units within the network. Furthermore, grid-like representations enabled agents to conduct shortcut behaviours reminiscent of those performed by mammals. Our findings show that emergent grid-like representations furnish agents with a Euclidean spatial metric and associated vector operations, providing a foundation for proficient navigation. As such, our results support neuroscientific theories that see grid cells as critical for vector-based navigation [7,10,11], demonstrating that the latter can be combined with path-based strategies to support navigation in challenging environments.

  12. Parallel Adaptive Mesh Refinement Library

    NASA Technical Reports Server (NTRS)

    Mac-Neice, Peter; Olson, Kevin

    2005-01-01

    Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.

  13. The equal load-sharing model of cascade failures in power grids

    NASA Astrophysics Data System (ADS)

    Scala, Antonio; De Sanctis Lucentini, Pier Giorgio

    2016-11-01

    Electric power-systems are one of the most important critical infrastructures. In recent years, they have been exposed to extreme stress due to the increasing power demand, the introduction of distributed renewable energy sources, and the development of extensive interconnections. We investigate the phenomenon of abrupt breakdown of an electric power-system under two scenarios: load growth (mimicking the ever-increasing customer demand) and power fluctuations (mimicking the effects of renewable sources). Our results indicate that increasing the system size causes breakdowns to become more abrupt; in fact, mapping the system to a solvable statistical-physics model indicates the occurrence of a first order transition in the large size limit. Such an enhancement for the systemic risk failures (black-outs) with increasing network size is an effect that should be considered in the current projects aiming to integrate national power-grids into "super-grids".

  14. Abruptness of Cascade Failures in Power Grids

    NASA Astrophysics Data System (ADS)

    Pahwa, Sakshi; Scoglio, Caterina; Scala, Antonio

    2014-01-01

    Electric power-systems are one of the most important critical infrastructures. In recent years, they have been exposed to extreme stress due to the increasing demand, the introduction of distributed renewable energy sources, and the development of extensive interconnections. We investigate the phenomenon of abrupt breakdown of an electric power-system under two scenarios: load growth (mimicking the ever-increasing customer demand) and power fluctuations (mimicking the effects of renewable sources). Our results on real, realistic and synthetic networks indicate that increasing the system size causes breakdowns to become more abrupt; in fact, mapping the system to a solvable statistical-physics model indicates the occurrence of a first order transition in the large size limit. Such an enhancement for the systemic risk failures (black-outs) with increasing network size is an effect that should be considered in the current projects aiming to integrate national power-grids into ``super-grids''.

  15. Abruptness of cascade failures in power grids.

    PubMed

    Pahwa, Sakshi; Scoglio, Caterina; Scala, Antonio

    2014-01-15

    Electric power-systems are one of the most important critical infrastructures. In recent years, they have been exposed to extreme stress due to the increasing demand, the introduction of distributed renewable energy sources, and the development of extensive interconnections. We investigate the phenomenon of abrupt breakdown of an electric power-system under two scenarios: load growth (mimicking the ever-increasing customer demand) and power fluctuations (mimicking the effects of renewable sources). Our results on real, realistic and synthetic networks indicate that increasing the system size causes breakdowns to become more abrupt; in fact, mapping the system to a solvable statistical-physics model indicates the occurrence of a first order transition in the large size limit. Such an enhancement for the systemic risk failures (black-outs) with increasing network size is an effect that should be considered in the current projects aiming to integrate national power-grids into "super-grids".

  16. Partitioning medical image databases for content-based queries on a Grid.

    PubMed

    Montagnat, J; Breton, V; E Magnin, I

    2005-01-01

    In this paper we study the impact of executing a medical image database query application on the grid. For lowering the total computation time, the image database is partitioned into subsets to be processed on different grid nodes. A theoretical model of the application complexity and estimates of the grid execution overhead are used to efficiently partition the database. We show results demonstrating that smart partitioning of the database can lead to significant improvements in terms of total computation time. Grids are promising for content-based image retrieval in medical databases.
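    A sketch of the kind of cost model such partitioning relies on: the sequential job-submission overhead and per-image processing time below are illustrative assumptions, not the paper's measured values, but they show why there is an optimal number of subsets rather than "one job per image".

```python
# Illustrative cost model only: jobs are submitted sequentially with overhead
# t_submit each, then run in parallel, so splitting the database into more
# subsets helps only while the saved compute time exceeds the added overhead.
def makespan(n_images, n_partitions, t_per_image, t_submit):
    images_per_job = -(-n_images // n_partitions)            # ceiling division
    return n_partitions * t_submit + images_per_job * t_per_image

def best_partition_count(n_images, max_nodes, t_per_image, t_submit):
    return min(range(1, max_nodes + 1),
               key=lambda p: makespan(n_images, p, t_per_image, t_submit))

p = best_partition_count(n_images=20000, max_nodes=64, t_per_image=0.8, t_submit=5.0)
print(p, makespan(20000, p, t_per_image=0.8, t_submit=5.0))
```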

  17. Developing High-resolution Soil Database for Regional Crop Modeling in East Africa

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A. V. M.

    2014-12-01

    The most readily available soil data for regional crop modeling in Africa is the World Inventory of Soil Emission potentials (WISE) dataset, which has 1125 soil profiles for the world, but does not extensively cover Ethiopia, Kenya, Uganda, and Tanzania in East Africa. Another dataset available is the HC27 (Harvest Choice by IFPRI) in a gridded format (10 km), but it is composed of generic soil profiles based on only three criteria (texture, rooting depth, and organic carbon content). In this paper, we present the development and application of a high-resolution (1 km), gridded soil database for regional crop modeling in East Africa. Basic soil information is extracted from the Africa Soil Information Service (AfSIS), which provides essential soil properties (bulk density, soil organic carbon, soil pH, and percentages of sand, silt and clay) for 6 different standardized soil layers (5, 15, 30, 60, 100 and 200 cm) at 1 km resolution. Soil hydraulic properties (e.g., field capacity and wilting point) are derived from the AfSIS soil dataset using well-proven pedo-transfer functions and are customized for DSSAT-CSM soil data requirements. The crop model is used to evaluate crop yield forecasts using the new high-resolution soil database and compared with WISE and HC27. We will also present the results of DSSAT loosely coupled with a hydrologic model (VIC) to assimilate root-zone soil moisture. Creating a grid-based soil database, which provides a consistent soil input for two different models (DSSAT and VIC), is a critical part of this work. The created soil database is expected to contribute to future applications of DSSAT crop simulation in East Africa, where food security is highly vulnerable.

  18. A Variational Formulation of Macro-Particle Algorithms for Kinetic Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Shadwick, B. A.

    2013-10-01

    Macro-particle-based simulation methods are in widespread use in plasma physics; their computational efficiency and intuitive nature are largely responsible for their longevity. In the main, these algorithms are formulated by approximating the continuous equations of motion. For systems governed by a variational principle (such as collisionless plasmas), approximating the equations of motion is known to introduce anomalous behavior, especially in system invariants. We present a variational formulation of particle algorithms for plasma simulation based on a reduction of the distribution function onto a finite collection of macro-particles. As in the usual Particle-In-Cell (PIC) formulation, these macro-particles have a definite momentum and are spatially extended. The primary advantage of this approach is the preservation of the link between symmetries and conservation laws. For example, nothing in the reduction introduces explicit time dependence to the system and, therefore, the continuous-time equations of motion exactly conserve energy; thus, these models are free of grid-heating. In addition, the variational formulation allows for constructing models of arbitrary spatial and temporal order. In contrast, the overall accuracy of the usual PIC algorithm is at most second order due to the nature of the force interpolation between the gridded field quantities and the (continuous) particle position. Again in contrast to the usual PIC algorithm, here the macro-particle shape is arbitrary; the spatial extent is completely decoupled from both the grid-size and the ``smoothness'' of the shape; smoother particle shapes are not necessarily larger. For simplicity, we restrict our discussion to one-dimensional, non-relativistic, un-magnetized, electrostatic plasmas. We comment on the extension to the electromagnetic case. Supported by the US DoE under contract numbers DE-FG02-08ER55000 and DE-SC0008382.

  19. A 3-Year Climatology of Cloud and Radiative Properties Derived from GOES-8 Data Over the Southern Great Plains

    NASA Technical Reports Server (NTRS)

    Khaiyer, M. M.; Rapp, A. D.; Doelling, D. R.; Nordeen, M. L.; Minnis, P.; Smith, W. L., Jr.; Nguyen, L.

    2001-01-01

    While the various instruments maintained at the Atmospheric Radiation Measurement (ARM) Program Southern Great Plains (SGP) Central Facility (CF) provide detailed cloud and radiation measurements for a small area, satellite cloud property retrievals provide a means of examining the large-scale properties of the surrounding region over an extended period of time. Seasonal and inter-annual climatological trends can be analyzed with such a dataset. For this purpose, monthly datasets of cloud and radiative properties from December 1996 through November 1999 over the SGP region have been derived using the layered bispectral threshold method (LBTM). The properties derived include cloud optical depths (ODs), temperatures and albedos, and are produced on two grids of lower (0.5 deg) and higher resolution (0.3 deg) centered on the ARM SGP CF. The extensive time period and high resolution of the inner grid of this dataset allow for comparison with the suite of instruments located at the ARM CF. In particular, Whole-Sky Imager (WSI) and Active Remote Sensing of Clouds (ARSCL) cloud products can be compared to the cloud amounts and heights of the LBTM 0.3 deg grid box encompassing the CF site. The WSI provides cloud fraction, and the ARSCL computes cloud fraction, base, and top heights using the algorithms by Clothiaux et al. (2001) with a combination of Belfort Laser Ceilometer (BLC), Millimeter Wave Cloud Radar (MMCR), and Micropulse Lidar (MPL) data. This paper summarizes the results of the LBTM analysis for 3 years of GOES-8 data over the SGP and examines the differences between surface and satellite-based estimates of cloud fraction.

  20. Deriving the species richness distribution of Geotrupinae (Coleoptera: Scarabaeoidea) in Mexico from the overlap of individual model predictions.

    PubMed

    Trotta-Moreu, Nuria; Lobo, Jorge M

    2010-02-01

    Predictions from individual distribution models for Mexican Geotrupinae species were overlaid to obtain a total species richness map for this group. A database (GEOMEX) that compiles available information from the literature and from several entomological collections was used. A Maximum Entropy method (MaxEnt) was applied to estimate the distribution of each species, taking into account 19 climatic variables as predictors. For each species, suitability values ranging from 0 to 100 were calculated for each grid cell on the map, and 21 different thresholds were used to convert these continuous suitability values into binary ones (presence-absence). By summing all of the individual binary maps, we generated a species richness prediction for each of the considered thresholds. The number of species and faunal composition thus predicted for each Mexican state were subsequently compared with those observed in a preselected set of well-surveyed states. Our results indicate that the sum of individual predictions tends to overestimate species richness but that the selection of an appropriate threshold can reduce this bias. Even under the most optimistic prediction threshold, the mean species richness error is 61% of the observed species richness, with commission errors being significantly more common than omission errors (71 ± 29% versus 18 ± 10%). The estimated distribution of Geotrupinae species richness in Mexico is discussed, although our conclusions are preliminary and contingent on the scarce and probably biased available data.
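    A minimal sketch of the stacking step described here: per-species suitability surfaces are thresholded to presence/absence and summed per grid cell, one richness map per candidate threshold. Random suitability values stand in for the MaxEnt output; all parameters are illustrative.

```python
# Minimal sketch: stacked species richness from thresholded suitability maps.
import numpy as np

rng = np.random.default_rng(0)
n_species, n_rows, n_cols = 30, 40, 50
suitability = rng.uniform(0, 100, size=(n_species, n_rows, n_cols))  # stand-in for MaxEnt output

def richness_map(suitability, threshold):
    presence = suitability >= threshold           # binary presence/absence per species
    return presence.sum(axis=0)                   # predicted species count per grid cell

for threshold in (10, 50, 90):
    r = richness_map(suitability, threshold)
    print(threshold, r.mean(), r.max())
```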

  1. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and identifying the potential of utilizing the superior storage capacities and computational power of grid infrastructures for geospatial applications while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces for Web Mapping (WMS), Feature access (Web Feature Service), Coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by the Stapelfeldt Ingenieurgesellschaft mbH located in Dortmund, adapting their LimA software to utilize grid resources. Noise mapping of e.g. traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. Input data comprise the road network and traffic, terrain, buildings and noise protection screens, as well as population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on local geometry. For each of these segments, the propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation. This immensely computation-intensive calculation needs to be performed for a major part of the European landscape. A LINUX version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-GRID computer network. Results and performance indicators will be presented. The presentation is an extension to last year's presentation "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project" that described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid Computing and Spatial Data Infrastructures. Results from the GDI-Grid project are incorporated into the OGC-OGF (Open Grid Forum) collaboration efforts as well as the OGC WPS 2.0 standards working group developing the next major version of the WPS specification.

  2. Sino/American cooperation for rural electrification in China

    NASA Astrophysics Data System (ADS)

    Wallace, William L.; Tsuo, Y. Simon

    1997-02-01

    Rapid growth in economic development, coupled with the absence of an electric grid in large areas of the rural countryside, has created a need for new energy sources both in urban centers and rural areas in China. There is a very large need for new sources of energy for rural electrification in China, as represented by 120 million people in remote regions who do not have access to an electric grid and by over 300 coastal islands in China that are unelectrified. In heavily populated regions in China where there is an electric grid, there are still severe shortages of electric power and limited access to the grid by village populations. In order to meet energy demands in rural China, renewable energy in the form of solar, wind, and biomass resources is being utilized as a cost-effective alternative to grid extension and use of diesel and gasoline generators. An Energy Efficiency and Renewable Energy Protocol Agreement was signed by the U.S. Department of Energy with the Chinese State Science and Technology Commission in Beijing in February 1995. Under this agreement, projects using photovoltaics for rural electrification are being conducted in Gansu Province in western China and Inner Mongolia in northern China, providing the basis for much wider deployment and use of photovoltaics for meeting the growing rural energy demands of China.

  3. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.
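    Purely as an illustration of route-based scoring (the aggregation rule below is an assumption, not the AVQS formula), one can combine per-hop device vulnerability and link security scores along an attack route so that longer, better-protected routes score lower:

```python
# Illustrative sketch only: hypothetical aggregation of per-hop scores along
# an attack route; not the scoring scheme defined in the paper.
def route_vulnerability(hops):
    """hops: list of (device_vuln, link_security) pairs, each in [0, 1]."""
    score = 1.0
    for device_vuln, link_security in hops:
        score *= device_vuln * (1.0 - link_security)   # chance the hop is compromised
    return score

# Hypothetical AMI attack route: meter -> collector -> head-end.
route = [(0.8, 0.2), (0.6, 0.5), (0.4, 0.7)]
print(route_vulnerability(route))
```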

  4. High density grids

    DOEpatents

    Cohen, Aina E.; Baxter, Elizabeth L.

    2018-01-16

    An X-ray data collection grid device is provided that includes a magnetic base that is compatible with robotic sample mounting systems used at synchrotron beamlines, a grid element fixedly attached to the magnetic base, where the grid element includes at least one sealable sample window disposed through a planar synchrotron-compatible material, where the planar synchrotron-compatible material includes at least one automated X-ray positioning and fluid handling robot fiducial mark.

  5. Landscape patterns in rainforest phylogenetic signal: isolated islands of refugia or structured continental distributions?

    PubMed

    Kooyman, Robert M; Rossetto, Maurizio; Sauquet, Hervé; Laffan, Shawn W

    2013-01-01

    Identify patterns of change in species distributions, diversity, concentrations of evolutionary history, and assembly of Australian rainforests. We used the distribution records of all known rainforest woody species in Australia across their full continental extent. These were analysed using measures of species richness, phylogenetic diversity (PD), phylogenetic endemism (PE) and phylogenetic structure (net relatedness index; NRI). Phylogenetic structure was assessed using both continental and regional species pools. To test the influence of growth-form, freestanding and climbing plants were analysed independently, and in combination. Species richness decreased along two generally orthogonal continental axes, corresponding with wet to seasonally dry and tropical to temperate habitats. The PE analyses identified four main areas of substantially restricted phylogenetic diversity, including parts of Cape York, Wet Tropics, Border Ranges, and Tasmania. The continental pool NRI results showed evenness (species less related than expected by chance) in groups of grid cells in coastally aligned areas of species rich tropical and sub-tropical rainforest, and in low diversity moist forest areas in the south-east of the Great Dividing Range and in Tasmania. Monsoon and drier vine forests, and moist forests inland from upland refugia showed phylogenetic clustering, reflecting lower diversity and more relatedness. Signals for evenness in Tasmania and clustering in northern monsoon forests weakened in analyses using regional species pools. For climbing plants, values for NRI by grid cell showed strong spatial structuring, with high diversity and PE concentrated in moist tropical and subtropical regions. Concentrations of rainforest evolutionary history (phylo-diversity) were patchily distributed within a continuum of species distributions. Contrasting with previous concepts of rainforest community distribution, our findings of continuous distributions and continental connectivity have significant implications for interpreting rainforest evolutionary history and current day ecological processes, and for managing rainforest diversity in changing circumstances.
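    The net relatedness index (NRI) used here standardises the observed mean pairwise phylogenetic distance (MPD) of a grid cell's assemblage against null assemblages drawn from the species pool; a minimal sketch with a toy distance matrix (all values illustrative):

```python
# Minimal sketch of the net relatedness index (NRI) for one assemblage.
# Positive NRI indicates phylogenetic clustering, negative NRI evenness.
import numpy as np

def mpd(dist, idx):
    """Mean pairwise phylogenetic distance among the species in idx."""
    sub = dist[np.ix_(idx, idx)]
    iu = np.triu_indices(len(idx), k=1)
    return sub[iu].mean()

def nri(dist, assemblage_idx, n_null=999, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    pool = np.arange(dist.shape[0])
    obs = mpd(dist, assemblage_idx)
    null = np.array([mpd(dist, rng.choice(pool, size=len(assemblage_idx), replace=False))
                     for _ in range(n_null)])
    return -(obs - null.mean()) / null.std()

# Toy symmetric distance matrix for a 20-species pool.
rng = np.random.default_rng(1)
d = rng.uniform(1, 10, size=(20, 20))
d = (d + d.T) / 2.0
np.fill_diagonal(d, 0.0)
print(nri(d, assemblage_idx=[0, 1, 2, 3, 4], rng=rng))
```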

  6. Final Report: Closeout of the Award NO. DE-FG02-98ER62618 (M.S. Fox-Rabinovitz, P.I.)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox-Rabinovitz, M. S.

    The final report describes the study aimed at exploring the variable-resolution stretched-grid (SG) approach to decadal regional climate modeling using advanced numerical techniques. The obtained results have shown that variable-resolution SG-GCMs using stretched grids with fine resolution over the area(s) of interest, is a viable established approach to regional climate modeling. The developed SG-GCMs have been extensively used for regional climate experimentation. The SG-GCM simulations are aimed at studying the U.S. regional climate variability with an emphasis on studying anomalous summer climate events, the U.S. droughts and floods.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsell, J.N.

    Mexico's new president faces an economic crisis of high inflation, corruption, and overextended borrowing to finance an ambitious development program based on anticipated oil revenues. The decline of world oil prices in 1981 and the depressed prices of other export commodities led to extensive borrowing from foreign banks, which saw Mexico's oil riches as a good risk. Despite its overpopulation and related socio-economic problems, Mexico's oil resources will be the foundation of economic development even though the government must move first to restore public confidence and diversify the economy. (DCK)

  8. Cracks in a Debris Apron

    NASA Image and Video Library

    2015-03-11

    This observation from NASA Mars Reconnaissance Orbiter shows the nature of large fissures in a smooth apron around a mound in the Phlegra region. The apron could be (or could have been) ice-rich, so one possibility is that the fissures are related to ice loss. Based on radar data from MRO combined with studies of the region's geology from other orbiters, scientists think that extensive glaciers covered this region several hundred million years ago. http://photojournal.jpl.nasa.gov/catalog/PIA19307

  9. Bird communities of natural and modified habitats in Panama

    USGS Publications Warehouse

    Petit, L.J.; Petit, D.R.; Christian, D.G.; Powell, Hugo D.W.

    1999-01-01

    Only a small proportion of land can realistically be protected as nature reserves and thus conservation efforts also must focus on the ecological value of agroecosystems and developed areas surrounding nature reserves. In this study, avian communities were surveyed in 11 habitat types in central Panama, across a gradient from extensive forest to intensive agricultural land uses, to examine patterns of species richness and abundance and community composition. Wooded habitats, including extensive and fragmented forests, shade coffee plantations, and residential areas supported the most species and individuals. Nearctic-Neotropical migratory species were most numerous in lowland forest fragments, shade coffee, and residential areas. Introduced Pinus caribbea and sugar cane plantations supported the fewest species compared to all other habitats. Cattle pastures left fallow for less than two years supported more than twice as many total species as actively grazed pastures, such that species richness in fallow pastures was similar to that found in wooded habitats. Community similarities were relatively low among all habitat types (none exceeding the observed 65% similarity between extensive and fragmented lowland forests), but communities in shade coffee and residential areas were 43% and 54% similar to lowland forest fragments, respectively. Fallow pastures and residential areas shared 60% of their species. Bird communities in shade coffee and residential areas were characterized by higher proportions of frugivorous and nectarivorous species than in native forests. These same guilds also were better represented in fallow than in grazed pastures. Raptors and piscivorous species were most prevalent in cattle pastures and rice fields. These results, though based upon only species richness and abundance, demonstrate that many human-altered habitats have potential ecological value for birds, and conservation efforts in tropical areas should focus greater attention on enhancement of agricultural and developed lands as wildlife habitat. To understand the true conservation value of these modified lands will require examination not only of numbers but also of the types of species supported by these habitats, their reproductive output and survival rates.

  10. Coexisting shortening and extension along the "Africa-Eurasia" plate boundary in southern Italy

    NASA Astrophysics Data System (ADS)

    Cuffaro, M.; Riguzzi, F.; Scrocca, D.; Doglioni, C.

    2009-04-01

    We performed geodetic strain rate field analyses along the "Africa (Sicily microplate)"-"Eurasia (Tyrrhenian microplate)" plate boundary in Sicily (southern Italy), using new GPS velocities from a data set spanning at most ten years (1998-2007). Data from GPS permanent stations maintained by different institutions and the recent RING network, established in Italy in the last five years by the Istituto Nazionale di Geofisica e Vulcanologia, were included in the analysis. Two-dimensional strain and rotation rate fields were estimated by the distance-weighted approach on a regularly spaced grid (30 × 30 km): the strain at each node is estimated using all stations, with data from each station weighted by its distance from the grid node; a constant a = 70 km specifies how the effect of a station decays with distance from the node. Results show that most of the shortening of the Africa-Eurasia relative motion is distributed on the northwestern side offshore Sicily, whereas the extension becomes comparable with shortening on the western border of the Capo d'Orlando basin, and greater on the northeastern side, offshore Sicily, as directly provided by GPS velocities which show a larger E-ward component of sites located in Calabria with respect to those located either in northern Sicily or in the Ustica-Aeolian islands. Moreover, where shortening and extension have mostly a similar order of magnitude, two rotation rate fields can be detected, CCW in the northwestern side of Sicily, and CW in the northeastern one, respectively. Also, the 2-D dilatation field shows a similar pattern, with negative values (shortening) in the northwestern area of Sicily close to the Ustica island, and positive values (extension) in the northeastern and southeastern ones, respectively. Principal shortening and extension rate axes are consistent with long-term geological features: seismic reflection profiles acquired in the southern Tyrrhenian seismogenic belt show active extensional faults affecting Pleistocene strata and deforming the seafloor in the western sector of the Cefalù Basin, on both NE-SW and W-E trending faults. Combining geodetic data and geological features contributes to the knowledge of the active deformation along the Africa-Eurasia plate boundary, suggesting coexisting, independent geodynamic processes, i.e., active E-W backarc spreading in the hangingwall of the Apennines subduction zone, and shortening of the southern margin of the Tyrrhenian backarc basin operated by the "Africa" NW-motion relative to "Europe".
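    A minimal sketch of a distance-weighted strain-rate estimate at one grid node: a linear velocity field is fit to the GPS velocities with weights decaying away from the node. The Gaussian form of the decay with a = 70 km and the toy station data are assumptions for illustration, not the exact weighting used in the study.

```python
# Minimal sketch: distance-weighted 2-D strain-rate components at a grid node
# from a weighted least-squares fit of a linear velocity field.
import numpy as np

def strain_rate_at_node(node_xy, station_xy, station_v, a=70e3):
    dx = station_xy - node_xy                     # east/north offsets in metres
    d = np.hypot(dx[:, 0], dx[:, 1])
    w = np.exp(-(d ** 2) / (2.0 * a ** 2))        # assumed Gaussian distance decay
    sw = np.sqrt(w)
    A = np.column_stack([np.ones(len(dx)), dx[:, 0], dx[:, 1]])
    cx, *_ = np.linalg.lstsq(sw[:, None] * A, sw * station_v[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(sw[:, None] * A, sw * station_v[:, 1], rcond=None)
    exx, eyy = cx[1], cy[2]                       # normal strain-rate components
    exy = 0.5 * (cx[2] + cy[1])                   # shear strain rate
    rotation = 0.5 * (cy[1] - cx[2])              # rotation rate
    return exx, eyy, exy, rotation

# Toy example: three stations around one node, velocities in m/yr.
node = np.array([0.0, 0.0])
xy = np.array([[50e3, 0.0], [-40e3, 30e3], [10e3, -60e3]])
v = np.array([[2e-3, 0.0], [-1e-3, 0.5e-3], [0.5e-3, -1e-3]])
print(strain_rate_at_node(node, xy, v))
```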

  11. Probabilistic, sediment-geochemical parameterisation of the groundwater compartment of the Netherlands for spatially distributed, reactive transport modelling

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Gunnink, Jan; van Vliet, Marielle; Goldberg, Tanya; Griffioen, Jasper

    2017-04-01

    Pollution of groundwater aquifers with contaminants such as nitrate is a common problem. Reactive transport models are useful to predict the fate of such contaminants and to characterise the efficiency of mitigating or preventive measures. Parameterisation of a groundwater transport model on reaction capacity is a necessary step in building the model. Two Dutch, national programs are combined to establish a methodology for building a probabilistic model on reaction capacity of the groundwater compartment at the national scale: the Geological Survey program and the NHI Netherlands Hydrological Instrument program. Reaction capacity is considered as a series of geochemical characteristics that control acid/base condition, redox condition and sorption capacity. Five primary reaction capacity variables are characterised: 1. pyrite, 2. non-pyrite, reactive iron (oxides, siderite and glauconite), 3. clay fraction, 4. organic matter and 5. Ca-carbonate. Important reaction capacity variables that are determined by more than one solid compound are also deduced: 1. potential reduction capacity (PRC) by pyrite and organic matter, 2. cation-exchange capacity (CEC) by organic matter and clay content, 3. carbonate buffering upon pyrite oxidation (CPBO) by carbonate and pyrite. Statistical properties of these variables are established based on c. 16,000 sediment geochemical analyses. The first tens of meters are characterised based on 25 regions using combinations of lithological class and geological formation as strata. Because of both less data and more geochemical uniformity, the deeper subsurface is characterised in a similar way based on 3 regions. The statistical data are used as input to an algorithm that probabilistically calculates the reaction capacity per grid cell. First, the cumulative frequency distribution (cfd) functions are calculated from the statistical data for the geochemical strata. Second, all voxel cells are classified into the geochemical strata. Third, the cfd functions are used to put random reaction capacity variables into the hydrological voxel model. Here, the distribution can be conditioned on two variables. Two important variables are clay content and depth. The first is valid because denser data are available for clay content than for geochemical variables such as pyrite, and probabilistic lithological models are also built at TNO Geological Survey. The second is important to account for locally different depths at which the redox cline between NO3-rich and Fe(II)-rich groundwater occurs within the first tens of meters of the subsurface. An extensive data-set of groundwater quality analyses is used to derive criteria for depth variability of the redox cline. The result is a unique algorithm for obtaining heterogeneous geochemical reaction capacity models of the entire groundwater compartment of the Netherlands.
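    A minimal sketch of the probabilistic fill step: the measured values in a stratum define an empirical cumulative frequency distribution, and each voxel assigned to that stratum draws its value by inverse-transform sampling. The pyrite contents below are hypothetical placeholders, not values from the national data set.

```python
# Minimal sketch: inverse-transform sampling from an empirical cfd per stratum.
import numpy as np

def empirical_cdf_sampler(measurements, rng):
    values = np.sort(np.asarray(measurements, dtype=float))
    probs = np.arange(1, len(values) + 1) / len(values)
    def draw(n):
        u = rng.uniform(0, 1, size=n)
        return np.interp(u, probs, values)          # invert the empirical cfd
    return draw

rng = np.random.default_rng(42)
# Hypothetical pyrite contents (wt%) measured in one lithology/formation stratum.
pyrite_measurements = [0.02, 0.05, 0.05, 0.10, 0.30, 0.75, 1.20]
draw = empirical_cdf_sampler(pyrite_measurements, rng)
voxel_values = draw(100000)                         # one value per voxel in the stratum
print(voxel_values.mean(), np.percentile(voxel_values, [5, 50, 95]))
```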

  12. Application of the FUN3D Unstructured-Grid Navier-Stokes Solver to the 4th AIAA Drag Prediction Workshop Cases

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, Elizabeth M.; Hammond, Dana P.; Nielsen, Eric J.; Pirzadeh, S. Z.; Rumsey, Christopher L.

    2010-01-01

    FUN3D Navier-Stokes solutions were computed for the 4th AIAA Drag Prediction Workshop grid convergence study, downwash study, and Reynolds number study on a set of node-based mixed-element grids. All of the baseline tetrahedral grids were generated with the VGRID (developmental) advancing-layer and advancing-front grid generation software package following the gridding guidelines developed for the workshop. With maximum grid sizes exceeding 100 million nodes, the grid convergence study was particularly challenging for the node-based unstructured grid generators and flow solvers. At the time of the workshop, the super-fine grid with 105 million nodes and 600 million elements was the largest grid known to have been generated using VGRID. FUN3D Version 11.0 has a completely new pre- and post-processing paradigm that has been incorporated directly into the solver and functions entirely in a parallel, distributed memory environment. This feature allowed for practical pre-processing and solution times on the largest unstructured-grid size requested for the workshop. For the constant-lift grid convergence case, the convergence of total drag is approximately second-order on the finest three grids. The variation in total drag between the finest two grids is only 2 counts. At the finest grid levels, only small variations in wing and tail pressure distributions are seen with grid refinement. Similarly, a small wing side-of-body separation also shows little variation at the finest grid levels. Overall, the FUN3D results compare well with the structured-grid code CFL3D. The FUN3D downwash study and Reynolds number study results compare well with the range of results shown in the workshop presentations.
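    The grid-convergence bookkeeping mentioned here (an observed order of convergence and the drag variation between the finest grids) follows from a standard Richardson-type analysis; the sketch below uses made-up drag counts and a refinement ratio of 1.5 purely for illustration, not DPW data.

```python
# Minimal sketch: observed order of convergence and Richardson extrapolation
# from drag values on three systematically refined grids (finest first).
import math

def observed_order_and_extrapolation(f_fine, f_medium, f_coarse, r):
    """r is the grid refinement ratio between successive grid levels (e.g. 1.5)."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return p, f_exact

# Hypothetical total drag in counts on fine, medium, and coarse grids.
p, f_exact = observed_order_and_extrapolation(270.0, 272.0, 276.5, r=1.5)
print(round(p, 2), round(f_exact, 1))   # roughly second-order, extrapolated value
```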

  13. 76 FR 36909 - Commission Information Collection Activities (FERC-549B); Comment Request; Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-23

    ... grid, the Commission revised its capacity release regulations regarding scheduling, segmentation and... market as well as to improve shipper's and the Commission's ability to monitor the market for potential... in a competitive market as well as improve shippers' and the Commission's ability to monitor...

  14. SURFACE WATER FLOW IN LANDSCAPE MODELS: 1. EVERGLADES CASE STUDY. (R824766)

    EPA Science Inventory

    Many landscape models require extensive computational effort using a large array of grid cells that represent the landscape. The number of spatial cells may be in the thousands and millions, while the ecological component run in each of the cells to account for landscape dynamics...

  15. UAV Inspection of Electrical Transmission Infrastructure with Path Conformance Autonomy and Lidar-Based Geofences NASA Report on UTM Reference Mission Flights at Southern Company Flights November 2016

    NASA Technical Reports Server (NTRS)

    Moore, Andrew J.; Schubert, Matthew; Rymer, Nicholas; Balachandran, Swee; Consiglio, Maria; Munoz, Cesar; Smith, Joshua; Lewis, Dexter; Schneider, Paul

    2017-01-01

    Flights at low altitudes in close proximity to electrical transmission infrastructure present serious navigational challenges: GPS and radio communication quality is variable and yet tight position control is needed to measure defects while avoiding collisions with ground structures. To advance unmanned aerial vehicle (UAV) navigation technology while accomplishing a task with economic and societal benefit, a high voltage electrical infrastructure inspection reference mission was designed. An integrated air-ground platform was developed for this mission and tested in two days of experimental flights to determine whether navigational augmentation was needed to successfully conduct a controlled inspection experiment. The airborne component of the platform was a multirotor UAV built from commercial off-the-shelf hardware and software, and the ground component was a commercial laptop running open source software. A compact ultraviolet sensor mounted on the UAV can locate 'hot spots' (potential failure points in the electric grid), so long as the UAV flight path adequately samples the airspace near the power grid structures. To improve navigation, the platform was supplemented with two navigation technologies: lidar-to-polyhedron preflight processing for obstacle demarcation and inspection distance planning, and trajectory management software to enforce inspection standoff distance. Both navigation technologies were essential to obtaining useful results from the hot spot sensor in this obstacle-rich, low-altitude airspace. Because the electrical grid extends into crowded airspaces, the UAV position was tracked with NASA unmanned aerial system traffic management (UTM) technology. The following results were obtained: (1) Inspection of high-voltage electrical transmission infrastructure to locate 'hot spots' of ultraviolet emission requires navigation methods that are not broadly available and are not needed at higher altitude flights above ground structures. (2) The sensing capability of a novel airborne UV detector was verified with a standard ground-based instrument. Flights with this sensor showed that UAV measurement operations and recording methods are viable. With improved sensor range, UAVs equipped with compact UV sensors could serve as the detection elements in a self-diagnosing power grid. (3) Simplification of rich lidar maps to polyhedral obstacle maps reduces data volume by orders of magnitude, so that computation with the resultant maps in real time is possible. This enables real-time obstacle avoidance autonomy. Stable navigation may be feasible in the GPS-deprived environment near transmission lines by a UAV that senses ground structures and compares them to these simplified maps. (4) A new, formally verified path conformance software system that runs onboard a UAV was demonstrated in flight for the first time. It successfully maneuvered the aircraft after a sudden lateral perturbation that models a gust of wind, and processed lidar-derived polyhedral obstacle maps in real time. (5) Tracking of the UAV in the national airspace using the NASA UTM technology was a key safety component of this reference mission, since the flights were conducted beneath the landing approach to a heavily used runway. Comparison to autopilot tracking showed that UTM tracking accurately records the UAV position throughout the flight path.
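
    As a rough illustration of the lidar-to-polyhedron idea (collapsing a dense point cloud into a handful of obstacle volumes so that standoff checks can run in real time), the sketch below uses axis-aligned bounding boxes as the simplified obstacles. The clustering input and standoff value are assumptions; the actual NASA preflight processing and path-conformance logic are not reproduced here.

      import numpy as np

      def clusters_to_boxes(points, labels):
          """Collapse each labeled lidar cluster to an axis-aligned bounding box.

          points: (N, 3) array of x, y, z returns; labels: (N,) cluster ids.
          Returns {cluster id: (min corner, max corner)} -- a tiny fraction of the raw data.
          """
          boxes = {}
          for cid in np.unique(labels):
              pts = points[labels == cid]
              boxes[cid] = (pts.min(axis=0), pts.max(axis=0))
          return boxes

      def violates_standoff(position, boxes, standoff_m):
          """True if the UAV position is closer than the standoff distance to any box."""
          p = np.asarray(position, dtype=float)
          for lo, hi in boxes.values():
              # Per-axis distance outside the box, zero on axes where p lies inside.
              excess = np.maximum(0.0, np.maximum(lo - p, p - hi))
              if np.linalg.norm(excess) < standoff_m:
                  return True
          return False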

  16. Grid Transmission Expansion Planning Model Based on Grid Vulnerability

    NASA Astrophysics Data System (ADS)

    Tang, Quan; Wang, Xi; Li, Ting; Zhang, Quanming; Zhang, Hongli; Li, Huaqiang

    2018-03-01

    Based on grid vulnerability and uniformity theory, a global network-structure and state vulnerability factor model is proposed to measure and compare different grid configurations. A multi-objective power grid planning model is established that considers global power network vulnerability, economy and grid security constraints. An improved genetic algorithm with chaos-based crossover and mutation is used to search for the optimal plan. Because the objectives in the multi-objective optimization have different dimensions, the weights are not easily assigned; a principal component analysis (PCA) method is therefore used for the comprehensive assessment of the population in every generation, making the assessment results more objective and credible. The feasibility and effectiveness of the proposed model are validated by simulation results for the Garver 6-bus and Garver 18-bus systems.
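
    A PCA-based assessment of this kind is commonly realized by standardising the objective values of each candidate plan and scoring plans by their projection onto the leading principal component, which supplies data-driven weights in place of subjectively chosen ones. The sketch below shows that generic idea only; it is not the authors' exact formulation.

      import numpy as np

      def pca_composite_score(objectives):
          """Composite score per plan from the first principal component.

          objectives: (n_plans, n_objectives) array, e.g. vulnerability, cost and
          security margin for every plan in the current GA generation (illustrative).
          """
          z = (objectives - objectives.mean(axis=0)) / objectives.std(axis=0)
          cov = np.cov(z, rowvar=False)
          _, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
          w = eigvecs[:, -1]                 # weights = leading eigenvector
          return z @ w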

  17. Gridded Data in the Arctic; Benefits and Perils of Publicly Available Grids

    NASA Astrophysics Data System (ADS)

    Coakley, B.; Forsberg, R.; Gabbert, R.; Beale, J.; Kenyon, S. C.

    2015-12-01

    Our understanding of the Arctic Ocean has been hugely advanced by release of gridded bathymetry and potential field anomaly grids. The Arctic Gravity Project grid achieves excellent, near-isotropic coverage of the earth north of 64˚N by combining land, satellite, airborne, submarine, surface ship and ice set-out measurements of gravity anomalies. Since the release of the V 2.0 grid in 2008, there has been extensive icebreaker activity across the Amerasia Basin due to mapping of the Arctic coastal nations' Extended Continental Shelves (ECS). While grid resolution has been steadily improving over time, addition of higher resolution and better navigated data highlights some distortions in the grid that may influence interpretation. In addition to the new ECS data sets, gravity anomaly data has been collected from other vessels; notably the Korean Icebreaker Araon, the Japanese icebreaker Mirai and the German icebreaker Polarstern. Also, the GRAV-D project of the US National Geodetic Survey has flown airborne surveys over much of Alaska. These data will be included in the new AGP grid, which will result in a much improved product when version 3.0 is released in 2015. To make use of these measurements, it is necessary to compile them into a continuous spatial representation. Compilation is complicated by differences in survey parameters, gravimeter sensitivity and reduction methods. Cross-over errors are the classic means to assess repeatability of track measurements. Prior to the introduction of near-universal GPS positioning, positional uncertainty was evaluated by cross-over analysis. GPS positions can be treated as more or less true, enabling evaluation of differences due to contrasting sensitivity, reference and reduction techniques. For the most part, cross-over errors for tracks of gravity anomaly data collected since 2008 are less than 0.5 mGals, supporting the compilation of these data with only slight adjustments. Given the different platforms used for various Arctic Ocean surveys, registration between bathymetric and gravity anomaly grids cannot be assumed. Inverse methods, which assume co-registration of data, sometimes produce surprising results when well-constrained gravity grid values are inverted against interpolated bathymetry.
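
    Cross-over analysis of the kind described above reduces to comparing anomaly values where two survey tracks intersect. The sketch below approximates the crossover by the closest pair of samples in two tracks (a rigorous implementation would intersect the track segments and interpolate along them); the tolerance and track layout are illustrative assumptions.

      import numpy as np

      def crossover_error(track_a, track_b, tol_deg=0.01):
          """Approximate cross-over error (mGal) between two gravity survey tracks.

          Each track is an (N, 3) array of (lon, lat, anomaly). The crossover is
          taken as the closest pair of samples; distances are plain degree offsets,
          which is adequate only as a rough screening of short track segments.
          """
          d = np.linalg.norm(track_a[:, None, :2] - track_b[None, :, :2], axis=2)
          i, j = np.unravel_index(np.argmin(d), d.shape)
          if d[i, j] > tol_deg:
              return None                 # the tracks do not cross within tolerance
          return track_a[i, 2] - track_b[j, 2]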

  18. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    NASA Astrophysics Data System (ADS)

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models lead to an increased interest in open source approaches to grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra-high voltage), we dissolve the common distinction between transmission and distribution grids in energy system models and utilize a single, integrated model instead. An open data set, primarily for Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by performing exemplary power flow optimizations. Based on a marginal-cost-driven power plant dispatch, subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in ongoing research.

  19. Valuation of Electric Power System Services and Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kintner-Meyer, Michael C. W.; Homer, Juliet S.; Balducci, Patrick J.

    Accurate valuation of existing and new technologies and grid services has been recognized to be important to stimulate investment in grid modernization. Clear, transparent, and accepted methods for estimating the total value (i.e., total benefits minus cost) of grid technologies and services are necessary for decision makers to make informed decisions. This applies to home owners interested in distributed energy technologies, as well as to service providers offering new demand response services, and utility executives evaluating best investment strategies to meet their service obligation. However, current valuation methods lack consistency, methodological rigor, and often the capabilities to identify and quantify multiple benefits of grid assets or new and innovative services. Distributed grid assets often have multiple benefits that are difficult to quantify because of the locational context in which they operate. The value is temporally, operationally, and spatially specific. It varies widely by distribution systems, transmission network topology, and the composition of the generation mix. The Electric Power Research Institute (EPRI) recently established a benefit-cost framework that proposes a process for estimating multiple benefits of distributed energy resources (DERs) and the associated cost. This document proposes an extension of this endeavor that offers a generalizable framework for valuation that quantifies the broad set of values for a wide range of technologies (including energy efficiency options, distributed resources, transmission, and generation) as well as policy options that affect all aspects of the entire generation and delivery system of the electricity infrastructure. The extension includes a comprehensive valuation framework of monetizable and non-monetizable benefits of new technologies and services beyond the traditional reliability objectives. The benefits are characterized into the following categories: sustainability, affordability, security, flexibility, and resilience. This document defines the elements of a generic valuation framework and process as well as system properties and metrics by which value streams can be derived. The valuation process can be applied to determine the value on the margin of incremental system changes. This process is typically performed when estimating the value of a particular project (e.g., value of a merchant generator, or a distributed photovoltaic (PV) rooftop installation). Alternatively, the framework can be used when a widespread change in the grid operation, generation mix, or transmission topology is to be valued. In this case a comprehensive system analysis is required.

  20. Design and Implementation of Real-Time Off-Grid Detection Tool Based on FNET/GridEye

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jiahui; Zhang, Ye; Liu, Yilu

    2014-01-01

    Real-time situational awareness tools are of critical importance to power system operators, especially during emergencies. The availability of electric power has become a linchpin of most post-disaster response efforts as it is the primary dependency for public and private sector services, as well as individuals. Knowledge of the scope and extent of facilities impacted, as well as the duration of their dependence on backup power, enables emergency response officials to plan for contingencies and provide better overall response. Based on real-time data acquired by Frequency Disturbance Recorders (FDRs) deployed in the North American power grid, a real-time detection method is proposed. This method monitors critical electrical loads and detects the transition of these loads from an on-grid state, where the loads are fed by the power grid, to an off-grid state, where the loads are fed by an Uninterrupted Power Supply (UPS) or a backup generation system. The details of the proposed detection algorithm are presented, and some case studies and off-grid detection scenarios are also provided to verify the effectiveness and robustness. Meanwhile, the algorithm has already been implemented based on the Grid Solutions Framework (GSF) and has effectively detected several off-grid situations.
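
    One plausible way to frame such a detector is to compare the frequency measured at the monitored facility against a reference frequency from the surrounding interconnection and to flag a sustained divergence as an off-grid (islanding) event. The sketch below is a schematic criterion along those lines; the window length, threshold and detection logic are assumptions, not the published FNET/GridEye algorithm.

      import numpy as np

      def off_grid_flags(local_freq_hz, grid_freq_hz, window=30, threshold_hz=0.05):
          """Flag samples where a monitored load appears separated from the bulk grid.

          local_freq_hz: frequency at the facility; grid_freq_hz: reference frequency
          from nearby FDRs. A deviation that persists for a full window of samples is
          treated as an off-grid (backup-power) state rather than a transient swing.
          """
          diff = np.abs(np.asarray(local_freq_hz) - np.asarray(grid_freq_hz))
          exceed = (diff > threshold_hz).astype(float)
          # Rolling count of threshold exceedances over the window.
          counts = np.convolve(exceed, np.ones(window), mode="same")
          return counts >= window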

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Besse, Nicolas; Latu, Guillaume; Ghizzo, Alain

    In this paper we present a new method for the numerical solution of the relativistic Vlasov-Maxwell system on a phase-space grid using an adaptive semi-Lagrangian method. The adaptivity is performed through a wavelet multiresolution analysis, which gives a powerful and natural refinement criterion based on the local measurement of the approximation error and regularity of the distribution function. Therefore, the multiscale expansion of the distribution function makes it possible to obtain a sparse representation of the data and thus save memory space and CPU time. We apply this numerical scheme to reduced Vlasov-Maxwell systems arising in laser-plasma physics. Interaction of relativistically strong laser pulses with overdense plasma slabs is investigated. These Vlasov simulations revealed a rich variety of phenomena associated with the fast particle dynamics induced by electromagnetic waves, such as electron trapping, particle acceleration, and electron plasma wavebreaking. However, the wavelet-based adaptive method that we developed here does not yield significant improvements compared to Vlasov solvers on a uniform mesh, due to the substantial overhead that the method introduces. Nonetheless it might be a first step towards more efficient adaptive solvers based on different ideas for the grid refinement or on a more efficient implementation. Here the Vlasov simulations are performed in a two-dimensional phase-space where the development of thin filaments, strongly amplified by relativistic effects, requires an important increase of the total number of points of the phase-space grid as they get finer as time goes on. The adaptive method could be more useful in cases where the thin filaments that need to be resolved are a very small fraction of the hyper-volume, which arises in higher dimensions because of the surface-to-volume scaling and the essentially one-dimensional structure of the filaments. Moreover, the main way to improve the efficiency of the adaptive method is to increase the local character in phase-space of the numerical scheme, by considering multiscale reconstructions with more compact support and by replacing the semi-Lagrangian method with more local (in space) numerical schemes such as compact finite difference schemes, the discontinuous Galerkin method or finite element residual schemes, which are well suited for parallel domain decomposition techniques.

  2. Advanced CFD Methods for Hypervelocity Wind Tunnels

    DTIC Science & Technology

    2011-03-10

    Mach 14 nozzle produces non-uniformities in the test section flow that are not desirable [1,2]. Calibration runs with Pitot pressure rakes suggest...A novel grid generation scheme for hypersonic nozzle flows is presented. The grid is based on the characteristic lines of the supersonic regions of the flow. This allows for grid alignment and clustering...

  3. SARS Grid--an AG-based disease management and collaborative platform.

    PubMed

    Hung, Shu-Hui; Hung, Tsung-Chieh; Juang, Jer-Nan

    2006-01-01

    This paper describes the development of the NCHC's Severe Acute Respiratory Syndrome (SARS) Grid project, an Access Grid (AG)-based disease management and collaborative platform that allowed SARS patients' medical data to be dynamically shared and discussed between hospitals and doctors using AG's video teleconferencing (VTC) capabilities. During the height of the SARS epidemic in Asia, SARS Grid and the SARShope website significantly curbed the spread of SARS by helping doctors manage the in-hospital and in-home care of quarantined SARS patients through medical data exchange and the monitoring of the patients' symptoms. Now that the SARS epidemic has ended, the primary function of the SARS Grid project is that of a web-based informatics tool to increase public awareness of SARS and other epidemic diseases. Additionally, the SARS Grid project can be viewed and further studied as an outstanding model of epidemic disease prevention and/or containment.

  4. Equilibrium composition of interphase boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wynblatt, P.

    1990-01-01

    Two modeling approaches have been used to investigate segregation effects at interphase boundaries. The first approach is based on the nearest neighbor bond model, used in conjunction with the regular solution approximation, and is an extension of an earlier framework developed to address segregation phenomena at free surfaces. In order to model a semicoherent interphase boundary, we have employed a second modeling approach, based on Monte Carlo simulation, in conjunction with the embedded atom method (EAM). The EAM is a powerful new method for describing interatomic interactions in metallic systems. It includes certain many-body interactions that depend on the local environment of an atom. The Monte Carlo approach has been applied to semicoherent interphase boundaries in Cu-Ag-Au alloys dilute in Au. These alloys consist of coexisting Cu-rich and Ag-rich phases, which differ in lattice constant by about 12%, such that good matching across the interface occurs when nine structural units of the Cu-rich phase are opposed to eight structural units of the Ag-rich phase. Thus far, interfaces with two different orientations have been studied: {001}-Cu//{001}-Ag, ⟨110⟩-Cu//⟨110⟩-Ag; and {111}-Cu//{111}-Ag, ⟨110⟩-Cu//⟨110⟩-Ag. These two interfaces will be referred to as the (001) and (111) interphase boundaries, for short. 18 refs.

  5. Time-domain analysis of planar microstrip devices using a generalized Yee-algorithm based on unstructured grids

    NASA Technical Reports Server (NTRS)

    Gedney, Stephen D.; Lansing, Faiza

    1993-01-01

    The generalized Yee-algorithm is presented for the temporal full-wave analysis of planar microstrip devices. This algorithm has the significant advantage over the traditional Yee-algorithm in that it is based on unstructured and irregular grids. The robustness of the generalized Yee-algorithm lies in the fact that structures containing curved conductors or complex three-dimensional geometries can be modeled more accurately, and much more conveniently, using standard automatic grid generation techniques. The generalized Yee-algorithm is based on the time-marching solution of the discrete form of Maxwell's equations in their integral form. To this end, the electric and magnetic fields are discretized over a dual, irregular, and unstructured grid. The primary grid is assumed to be composed of general fitted polyhedra distributed throughout the volume. The secondary grid (or dual grid) is built up of the closed polyhedra whose edges connect the centroids of adjacent primary cells, penetrating shared faces. Faraday's law and Ampere's law are used to update the fields normal to the primary and secondary grid faces, respectively. Subsequently, a correction scheme is introduced to project the normal fields onto the grid edges. It is shown that this scheme is stable, maintains second-order accuracy, and preserves the divergenceless nature of the flux densities. Finally, for computational efficiency the algorithm is structured as a series of sparse matrix-vector multiplications. Based on this scheme, the generalized Yee-algorithm has been implemented on vector and parallel high performance computers in a highly efficient manner.

  6. Investigation of the effects of external current systems on the MAGSAT data utilizing grid cell modeling techniques

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M. (Principal Investigator)

    1982-01-01

    The feasibility of modeling magnetic fields due to certain electrical currents flowing in the Earth's ionosphere and magnetosphere was investigated. A method was devised to carry out forward modeling of the magnetic perturbations that arise from space currents. The procedure utilizes a linear current element representation of the distributed electrical currents. The finite thickness elements are combined into loops which are in turn combined into cells having their base in the ionosphere. In addition to the extensive field modeling, additional software was developed for the reduction and analysis of the MAGSAT data in terms of the external current effects. Direct comparisons between the models and the MAGSAT data are possible.

  7. Managing and Querying Image Annotation and Markup in XML.

    PubMed

    Wang, Fusheng; Pan, Tony; Sharma, Ashish; Saltz, Joel

    2010-01-01

    Proprietary approaches for representing annotations and image markup are serious barriers for researchers to share image data and knowledge. The Annotation and Image Markup (AIM) project is developing a standard based information model for image annotation and markup in health care and clinical trial environments. The complex hierarchical structures of AIM data model pose new challenges for managing such data in terms of performance and support of complex queries. In this paper, we present our work on managing AIM data through a native XML approach, and supporting complex image and annotation queries through native extension of XQuery language. Through integration with xService, AIM databases can now be conveniently shared through caGrid.

  8. Managing and Querying Image Annotation and Markup in XML

    PubMed Central

    Wang, Fusheng; Pan, Tony; Sharma, Ashish; Saltz, Joel

    2010-01-01

    Proprietary approaches for representing annotations and image markup are serious barriers for researchers to share image data and knowledge. The Annotation and Image Markup (AIM) project is developing a standard based information model for image annotation and markup in health care and clinical trial environments. The complex hierarchical structures of AIM data model pose new challenges for managing such data in terms of performance and support of complex queries. In this paper, we present our work on managing AIM data through a native XML approach, and supporting complex image and annotation queries through native extension of XQuery language. Through integration with xService, AIM databases can now be conveniently shared through caGrid. PMID:21218167

  9. On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Ibraheem, S. O.; Demuren, A. O.

    1994-01-01

    A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving Euler and Navier-Stokes equations. Model problems based on the convection, diffusion and Burgers' equations are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind difference based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central difference based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.

  10. Enhanced Product Generation at NASA Data Centers Through Grid Technology

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.; Hinke, Thomas H.; Gavali, Shradha; Seufzer, William J.

    2003-01-01

    This paper describes how grid technology can support the ability of NASA data centers to provide customized data products. A combination of grid technology and commodity processors is proposed to provide the bandwidth necessary to perform customized processing of data, with customized data subsetting providing the initial example. This customized subsetting engine can be used to support a new type of subsetting, called phenomena-based subsetting, where data is subsetted based on its association with some phenomena, such as mesoscale convective systems or hurricanes. This concept is expanded to allow the phenomena to be detected in one type of data, with the subsetting requirements transmitted to the subsetting engine to subset a different type of data. The subsetting requirements are generated by a data mining system and transmitted to the subsetter in the form of an XML feature index that describes the spatial and temporal extent of the phenomena. For this work, a grid-based mining system called the Grid Miner is used to identify the phenomena and generate the feature index. This paper discusses the value of grid technology in facilitating the development of high-performance customized product processing and the coupling of a grid mining system to support phenomena-based subsetting.
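
    To make the idea of an XML feature index concrete, the sketch below builds a minimal index describing a detected phenomenon's temporal and spatial extent. The element and attribute names are illustrative assumptions; the actual schema exchanged between the Grid Miner and the subsetting engine is not reproduced here.

      import xml.etree.ElementTree as ET

      def build_feature_index(phenomenon, t_start, t_end, lat_min, lat_max, lon_min, lon_max):
          """Return a minimal XML feature index for one detected phenomenon."""
          root = ET.Element("featureIndex", phenomenon=phenomenon)
          ET.SubElement(root, "temporalExtent", start=t_start, end=t_end)
          spatial = ET.SubElement(root, "spatialExtent")
          ET.SubElement(spatial, "latitude", min=str(lat_min), max=str(lat_max))
          ET.SubElement(spatial, "longitude", min=str(lon_min), max=str(lon_max))
          return ET.tostring(root, encoding="unicode")

      # Hypothetical hurricane event; a subsetter would clip any dataset to this extent.
      print(build_feature_index("hurricane", "2003-09-10T00:00Z", "2003-09-12T00:00Z",
                                20.0, 35.0, -80.0, -60.0))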

  11. Modelling GIC Flow in New Zealand's Electrical Transmission Grid

    NASA Astrophysics Data System (ADS)

    Divett, T.; Thomson, A. W. P.; Ingham, M.; Rodger, C. J.; Beggan, C.; Kelly, G.

    2016-12-01

    Transformers in Transpower New Zealand Ltd's electrical grid have been impacted by geomagnetically induced currents (GIC) during geomagnetic storms, for example in November 2001. In this study we have developed an initial model of the South Island's power grid to advance understanding of the impact of GIC on New Zealand's (NZ) grid. NZ's latitude and island setting mean that modelling approaches successfully used in the UK in the past can be used. Vasseur and Weidelt's thin sheet model is applied to model the electric field as a function of magnetic field and conductance. However, the 4 km deep ocean near NZ's coast, compared to the UK's relatively shallow continental shelf waters, restricts the range of frequencies and spatial grids that can be used, due to assumptions in the thin sheet model. Some early consequences of these restrictions will be discussed. Lines carrying 220 kV, 110 kV and 66 kV make up NZ's electrical transmission grid, with multiple earthing nodes at each substation. Transpower have measured DC earth currents at 17 nodes in NZ's South Island grid for 15 years, including observations at multiple transformers for some substations. Different transformers at the same substation can experience quite different GIC during space weather events. Therefore we have initially modelled each transformer in some substations separately to compare directly with measured currents. Ultimately this study aims to develop a validated modelling tool that will be used to strengthen NZ's grid against the risks of space weather. Further, mitigation tactics which could be used to reduce the threat to the electrical grid will be evaluated. In particular we will focus on the transformer level, where the risk lies, and not at the substation level as has been commonly done to date. As we will validate our model against the extensive Transpower observations, this will be a valuable confirmation of the approaches used by the wider international community.

  12. On Market-Based Coordination of Thermostatically Controlled Loads With User Preference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

    2014-12-15

    This paper presents a market-based control framework to coordinate a group of autonomous Thermostatically Controlled Loads (TCLs) to achieve system-level objectives with pricing incentives. The problem is formulated as maximizing the social welfare subject to a feeder power constraint. It allows the coordinator to affect the aggregated power of a group of dynamical systems, and creates an interactive market where the users and the coordinator cooperatively determine the optimal energy allocation and energy price. The optimal pricing strategy is derived, which maximizes social welfare while respecting the feeder power constraint. The bidding strategy is also designed to compute the optimal price in real time (e.g., every 5 minutes) based on local device information. The coordination framework is validated with realistic simulations in GridLab-D. Extensive simulation results demonstrate that the proposed approach effectively maximizes the social welfare and decreases power congestion at key times.
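
    A common way to realize such a constrained market clearing is a uniform-price auction: device bids are admitted in order of willingness to pay until the feeder limit binds, and the marginal bid sets the price. The sketch below is a schematic stand-in under that assumption, not the paper's actual pricing strategy.

      import numpy as np

      def clear_feeder_market(bid_prices, bid_powers, feeder_limit_kw):
          """Uniform-price clearing of TCL bids subject to a feeder power constraint.

          bid_prices: willingness to pay per device; bid_powers: requested power (kW).
          Devices are admitted highest bid first until the feeder limit would be
          exceeded; the first rejected bid sets the clearing price (zero if all fit).
          """
          order = np.argsort(bid_prices)[::-1]
          accepted = np.zeros(len(bid_prices), dtype=bool)
          total_kw, price = 0.0, 0.0
          for i in order:
              if total_kw + bid_powers[i] <= feeder_limit_kw:
                  total_kw += bid_powers[i]
                  accepted[i] = True
              else:
                  price = bid_prices[i]
                  break
          return accepted, price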

  13. Ab initio molecular simulations with numeric atom-centered orbitals

    NASA Astrophysics Data System (ADS)

    Blum, Volker; Gehrke, Ralf; Hanke, Felix; Havu, Paula; Havu, Ville; Ren, Xinguo; Reuter, Karsten; Scheffler, Matthias

    2009-11-01

    We describe a complete set of algorithms for ab initio molecular simulations based on numerically tabulated atom-centered orbitals (NAOs) to capture a wide range of molecular and materials properties from quantum-mechanical first principles. The full algorithmic framework described here is embodied in the Fritz Haber Institute "ab initio molecular simulations" (FHI-aims) computer program package. Its comprehensive description should be relevant to any other first-principles implementation based on NAOs. The focus here is on density-functional theory (DFT) in the local and semilocal (generalized gradient) approximations, but an extension to hybrid functionals, Hartree-Fock theory, and MP2/GW electron self-energies for total energies and excited states is possible within the same underlying algorithms. An all-electron/full-potential treatment that is both computationally efficient and accurate is achieved for periodic and cluster geometries on equal footing, including relaxation and ab initio molecular dynamics. We demonstrate the construction of transferable, hierarchical basis sets, allowing the calculation to range from qualitative tight-binding like accuracy to meV-level total energy convergence with the basis set. Since all basis functions are strictly localized, the otherwise computationally dominant grid-based operations scale as O(N) with system size N. Together with a scalar-relativistic treatment, the basis sets provide access to all elements from light to heavy. Both low-communication parallelization of all real-space grid based algorithms and a ScaLapack-based, customized handling of the linear algebra for all matrix operations are possible, guaranteeing efficient scaling (CPU time and memory) up to massively parallel computer systems with thousands of CPUs.

  14. Power Flow Simulations of a More Renewable California Grid Utilizing Wind and Solar Insolation Forecasting

    NASA Astrophysics Data System (ADS)

    Hart, E. K.; Jacobson, M. Z.; Dvorak, M. J.

    2008-12-01

    Time series power flow analyses of the California electricity grid are performed with extensive addition of intermittent renewable power. The study focuses on the effects of replacing non-renewable and imported (out-of-state) electricity with wind and solar power on the reliability of the transmission grid. Simulations are performed for specific days chosen throughout the year to capture seasonal fluctuations in load, wind, and insolation. Wind farm expansions and new wind farms are proposed based on regional wind resources and time-dependent wind power output is calculated using a meteorological model and the power curves of specific wind turbines. Solar power is incorporated both as centralized and distributed generation. Concentrating solar thermal plants are modeled using local insolation data and the efficiencies of pre-existing plants. Distributed generation from rooftop PV systems is included using regional insolation data, efficiencies of common PV systems, and census data. The additional power output of these technologies offsets power from large natural gas plants and is balanced for the purposes of load matching largely with hydroelectric power and by curtailment when necessary. A quantitative analysis of the effects of this significant shift in the electricity portfolio of the state of California on power availability and transmission line congestion, using a transmission load-flow model, is presented. A sensitivity analysis is also performed to determine the effects of forecasting errors in wind and insolation on load-matching and transmission line congestion.

  15. DPM: Future Proof Storage

    NASA Astrophysics Data System (ADS)

    Alvarez, Alejandro; Beche, Alexandre; Furano, Fabrizio; Hellmich, Martin; Keeble, Oliver; Rocha, Ricardo

    2012-12-01

    The Disk Pool Manager (DPM) is a lightweight solution for grid enabled disk storage management. Operated at more than 240 sites, it has the widest distribution of all grid storage solutions in the WLCG infrastructure. It provides an easy way to manage and configure disk pools, and exposes multiple interfaces for data access (rfio, xroot, nfs, gridftp and http/dav) and control (srm). During the last year we have been working on providing stable, high-performance data access to our storage system using standard protocols, while extending the storage management functionality and adapting both configuration and deployment procedures to reuse commonly used building blocks. In this contribution we cover in detail the extensive evaluation we have performed of our new HTTP/WebDAV and NFS 4.1 frontends, in terms of functionality and performance. We summarize the issues we faced and the solutions we developed to turn them into valid alternatives to the existing grid protocols - namely the additional work required to provide multi-stream transfers for high performance wide area access, support for third party copies, credential delegation or the required changes in the experiment and fabric management frameworks and tools. We describe new functionality that has been added to ease system administration, such as different filesystem weights and a faster disk drain, and new configuration and monitoring solutions based on the industry standards Puppet and Nagios. Finally, we explain some of the internal changes we had to do in the DPM architecture to better handle the additional load from the analysis use cases.

  16. Self-similar grid patterns in free-space shuffle-exchange networks

    NASA Astrophysics Data System (ADS)

    Haney, Michael W.

    1993-12-01

    Self-similar grid patterns are proposed as an alternative to rectangular grids for arraying the optoelectronic sources and detectors of smart pixels. For shuffle-based multistage interconnection networks, it is suggested that smart pixels should not be arrayed on a rectangular grid and that the smart pixel unit cell should be the kernel of a self-similar grid pattern.

  17. Feature combination analysis in smart grid based using SOM for Sudan national grid

    NASA Astrophysics Data System (ADS)

    Bohari, Z. H.; Yusof, M. A. M.; Jali, M. H.; Sulaima, M. F.; Nasir, M. N. M.

    2015-12-01

    In the investigation of power grid security, cascading failure under multi-contingency situations has been a challenge because of its topological complexity and computational expense. Both system analyses and load ranking methods have their limits. In this project, an integrated methodology based on clustering with Self-Organizing Maps (SOM) is used, combining spatial-feature (distance)-based grouping with electrical attributes (load) to evaluate the vulnerability and cascading impact of various component sets in the power grid. Using the clustering result from SOM, sets of heavily loaded initial victim nodes are selected to construct attack schemes and assess the subsequent cascading effects of their failures; this SOM-based approach identifies vulnerable sets of substations more effectively than conventional load ranking and other clustering strategies. The robustness of power grids is a central topic in the design of the so-called "smart grid". In this paper, measures of the importance of the nodes in a power grid under cascading failure are analyzed. With these efforts, the most vulnerable nodes can be distinguished and protected, improving the safety of the power grid, and it can also be assessed whether a given structure is suitable for power grids.

  18. A coarse-grid-projection acceleration method for finite-element incompressible flow computations

    NASA Astrophysics Data System (ADS)

    Kashefi, Ali; Staples, Anne; FiN Lab Team

    2015-11-01

    Coarse grid projection (CGP) methodology provides a framework for accelerating computations by performing some part of the computation on a coarsened grid. We apply CGP to pressure projection methods for finite element-based incompressible flow simulations. In this approach, the predicted velocity field data are restricted to a coarsened grid, the pressure is determined by solving the Poisson equation on the coarse grid, and the resulting data are prolonged to the preset fine grid. The contributions of the CGP method to the pressure correction technique are twofold: first, it substantially lessens the computational cost devoted to the Poisson equation, which is the most time-consuming part of the simulation process. Second, it preserves the accuracy of the velocity field. The velocity and pressure spaces are approximated by a Galerkin spectral element method using piecewise linear basis functions. A restriction operator is designed so that fine data are directly injected into the coarse grid. The Laplacian and divergence matrices are derived by taking inner products of coarse grid shape functions. Linear interpolation is implemented to construct a prolongation operator. A study of the data accuracy and the CPU time for the CGP-based versus non-CGP computations is presented.
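
    The restrict/solve/prolong cycle described above can be illustrated in one dimension: the divergence of the predicted velocity is injected onto a coarse grid, a small Poisson problem is solved there, and the pressure is interpolated back to the fine grid. The sketch below is a schematic of that idea only (direct injection, linear interpolation, a dense 1-D Laplacian with zero boundary values); it is not the authors' finite element implementation.

      import numpy as np

      def restrict(fine_values):
          """Direct injection: keep every second fine-grid nodal value."""
          return fine_values[::2]

      def prolong(coarse_values, n_fine):
          """Linear interpolation of coarse-grid values back onto the fine grid."""
          x_coarse = np.linspace(0.0, 1.0, coarse_values.size)
          x_fine = np.linspace(0.0, 1.0, n_fine)
          return np.interp(x_fine, x_coarse, coarse_values)

      def solve_poisson_1d(rhs, h):
          """Solve u'' = rhs on a uniform 1-D grid with zero values at both ends."""
          n = rhs.size
          lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
                 + np.diag(np.ones(n - 1), -1)) / h**2
          return np.linalg.solve(lap, rhs)

      # One CGP pressure step on a hypothetical divergence field of the predicted velocity.
      n_fine = 129
      div_u_star = np.sin(np.pi * np.linspace(0.0, 1.0, n_fine))
      rhs_coarse = restrict(div_u_star)
      p_coarse = solve_poisson_1d(rhs_coarse, h=1.0 / (rhs_coarse.size - 1))
      p_fine = prolong(p_coarse, n_fine)        # used to correct the fine-grid velocity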

  19. Towards an improved soil moisture retrieval for organic-rich soils from SMOS passive microwave L-band observations

    NASA Astrophysics Data System (ADS)

    Bircher, Simone; Richaume, Philippe; Mahmoodi, Ali; Mialon, Arnaud; Fernandez-Moran, Roberto; Wigneron, Jean-Pierre; Demontoux, François; Jonard, François; Weihermüller, Lutz; Andreasen, Mie; Rautiainen, Kimmo; Ikonen, Jaakko; Schwank, Mike; Drusch, Mattias; Kerr, Yann H.

    2017-04-01

    From the passive L-band microwave radiometer onboard the Soil Moisture and Ocean Salinity (SMOS) space mission global surface soil moisture data is retrieved every 2 - 3 days. Thus far, the empirical L-band Microwave Emission of the Biosphere (L-MEB) radiative transfer model applied in the SMOS soil moisture retrieval algorithm is exclusively calibrated over test sites in dry and temperate climate zones. Furthermore, the included dielectric mixing model relating soil moisture to relative permittivity accounts only for mineral soils. However, soil moisture monitoring over the higher Northern latitudes is crucial since these regions are especially sensitive to climate change. A considerable positive feedback is expected if thawing of these extremely organic soils supports carbon decomposition and release to the atmosphere. Due to differing structural characteristics and thus varying bound water fractions, the relative permittivity of organic material is lower than that of the most mineral soils at a given water content. This assumption was verified by means of L-band relative permittivity laboratory measurements of organic and mineral substrates from various sites in Denmark, Finland, Scotland and Siberia using a resonant cavity. Based on these data, a simple empirical dielectric model for organic soils was derived and implemented in the SMOS Soil Moisture Level 2 Prototype Processor (SML2PP). Unfortunately, the current SMOS retrieved soil moisture product seems to show unrealistically low values compared to in situ soil moisture data collected from organic surface layers in North America, Europe and the Tibetan Plateau so that the impact of the dielectric model for organic soils cannot really be tested. A simplified SMOS processing scheme yielding higher soil moisture levels has recently been proposed and is presently under investigation. Furthermore, recalibration of the model parameters accounting for vegetation and roughness effects that were thus far only evaluated using the default dielectric model for mineral soils is ongoing for the "organic" L-MEB version. Additionally, in order to decide where a soil moisture retrieval using the "organic" dielectric model should be triggered, information on soil organic matter content in the soil surface layer has to be considered in the retrieval algorithm. For this purpose, SoilGrids (www.soilgrids.org) providing soil organic carbon content (SOCC) in g/kg is under study. A SOCC threshold based on the relation between the SoilGrids' SOCC and the presence of organic soil surface layers (relevant to alter the microwave L-band emissions from the land surface) in the SoilGrids' source soil profile information has to be established. In this communication, we present the current status of the above outlined studies with the objective to advance towards an improved soil moisture retrieval for organic-rich soils from SMOS passive microwave L-band observations.

  20. Experimental Assessment of the Emissions Control Potential of a Rich/Quench/Lean Combustor for High Speed Civil Transport Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Rosfjord, T. J.; Padget, F. C.; Tacina, Robert R. (Technical Monitor)

    2001-01-01

    In support of Pratt & Whitney efforts to define the Rich burn/Quick mix/Lean burn (RQL) combustor for the High Speed Civil Transport (HSCT) aircraft engine, UTRC conducted a flametube-scale study of the RQL concept. Extensive combustor testing was performed at the Supersonic Cruise (SSC) condition of an HSCT engine cycle. Data obtained from probe traverses near the exit of the mixing section confirmed that the mixing section was the critical component in controlling combustor emissions. Circular-hole configurations, which produced rapidly-, highly-penetrating jets, were most effective in limiting NOx. The spatial profiles of NOx and CO at the mixer exit were not directly interpretable using a simple flow model based on jet penetration, and a greater understanding of the flow and chemical processes in this section are required to optimize it. Neither the rich-combustor equivalence ratio nor its residence time was a direct contributor to the exit NOx. Based on this study, it was also concluded that (1) While NOx formation in both the mixing section and the lean combustor contribute to the overall emission, the NOx formation in the mixing section dominates. The gas composition exiting the rich combustor can be reasonably represented by the equilibrium composition corresponding to the rich combustor operating condition. Negligible NOx exits the rich combustor. (2) At the SSC condition, the oxidation processes occurring in the mixing section consume 99 percent of the CO exiting the rich combustor. Soot formed in the rich combustor is also highly oxidized, with combustor exit SAE Smoke Number <3. (3) Mixing section configurations which demonstrated enhanced emissions control at SSC also performed better at part-power conditions. Data from mixer exit traverses reflected the expected mixing behavior for off-design jet to crossflow momentum-flux ratios. (4) Low power operating conditions require that the RQL combustor operate as a lean-lean combustor to achieve low CO and high efficiency. (5) An RQL combustor can achieve the emissions goal of EINOX = 5 at the Supersonic Cruise operating condition for an HSCT engine.

  1. Experimental Assessment of the Emissions Control Potential of a Rich/Quench/ Lean Combustor for High Speed Civil Transport Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Tacina, Robert R. (Technical Monitor); Rosfjord, T. J.; Padget, F. C.

    2001-01-01

    In support of Pratt & Whitney efforts to define the Rich burn/Quick mix/Lean burn (RQL) combustor for the High Speed Civil Transport (HSCT) aircraft engine, UTRC conducted a flametube-scale study of the RQL concept. Extensive combustor testing was performed at the Supersonic Cruise (SSC) condition of an HSCT engine cycle. Data obtained from probe traverses near the exit of the mixing section confirmed that the mixing section was the critical component in controlling combustor emissions. Circular-hole configurations, which produced rapidly-, highly-penetrating jets, were most effective in limiting NO(x). The spatial profiles of NO(x) and CO at the mixer exit were not directly interpretable using a simple flow model based on jet penetration, and a greater understanding of the flow and chemical processes in this section are required to optimize it. Neither the rich-combustor equivalence ratio nor its residence time was a direct contributor to the exit NO(x). Based on this study, it was also concluded that: (1) While NO(x) formation in both the mixing section and the lean combustor contribute to the overall emission, the NOx formation in the mixing section dominates. The gas composition exiting the rich combustor can be reasonably represented by the equilibrium composition corresponding to the rich combustor operating condition. Negligible NO(x) exits the rich combustor. (2) At the SSC condition, the oxidation processes occurring in the mixing section consume 99 percent of the CO exiting the rich combustor. Soot formed in the rich combustor is also highly oxidized, with combustor exit SAE Smoke Number <3. (3) Mixing section configurations which demonstrated enhanced emissions control at SSC also performed better at part-power conditions. Data from mixer exit traverses reflected the expected mixing behavior for off-design jet to crossflow momentum-flux ratios. (4) Low power operating conditions require that the RQL combustor operate as a lean-lean combustor to achieve low CO and high efficiency. (5) An RQL combustor can achieve the emissions goal of EINO(x) = 5 at the Supersonic Cruise operating condition for an HSCT engine.

  2. Efficient grid-based techniques for density functional theory

    NASA Astrophysics Data System (ADS)

    Rodriguez-Hernandez, Juan Ignacio

    Understanding the chemical and physical properties of molecules and materials at a fundamental level often requires quantum-mechanical models for these substances' electronic structure. This type of many-body quantum mechanics calculation is computationally demanding, hindering its application to substances with more than a few hundred atoms. The supreme goal of many researchers in quantum chemistry---and the topic of this dissertation---is to develop more efficient computational algorithms for electronic structure calculations. In particular, this dissertation develops two new numerical integration techniques for computing molecular and atomic properties within conventional Kohn-Sham Density Functional Theory (KS-DFT) of molecular electronic structure. The first of these grid-based techniques is based on the transformed sparse grid construction. In this construction, a sparse grid is generated in the unit cube and then mapped to real space according to the pro-molecular density using the conditional distribution transformation. The transformed sparse grid was implemented in the program deMon2k, where it is used as the numerical integrator for the exchange-correlation energy and potential in the KS-DFT procedure. We tested our grid by computing ground state energies, equilibrium geometries, and atomization energies. The accuracy of these test calculations shows that our grid is more efficient than some previous integration methods: our grids use fewer points to obtain the same accuracy. The transformed sparse grids were also tested for integrating, interpolating and differentiating in different dimensions (n = 1,2,3,6). The second technique is a grid-based method for computing atomic properties within QTAIM. It was also implemented in deMon2k. The performance of the method was tested by computing QTAIM atomic energies, charges, dipole moments, and quadrupole moments. For medium accuracy, our method is the fastest one we know of.

  3. An Improved Compressive Sensing and Received Signal Strength-Based Target Localization Algorithm with Unknown Target Population for Wireless Local Area Networks.

    PubMed

    Yan, Jun; Yu, Kegen; Chen, Ruizhi; Chen, Liang

    2017-05-30

    In this paper a two-phase compressive sensing (CS) and received signal strength (RSS)-based target localization approach is proposed to improve position accuracy by dealing with the unknown target population and the effect of grid dimensions on position error. In the coarse localization phase, by formulating target localization as a sparse signal recovery problem, grids with recovery vector components greater than a threshold are chosen as the candidate target grids. In the fine localization phase, by partitioning each candidate grid, the target position in a grid is iteratively refined by using the minimum residual error rule and the least-squares technique. When all the candidate target grids are iteratively partitioned and the measurement matrix is updated, the recovery vector is re-estimated. Threshold-based detection is employed again to determine the target grids and hence the target population. As a consequence, both the target population and the position estimation accuracy can be significantly improved. Simulation results demonstrate that the proposed approach achieves the best accuracy among all the algorithms compared.
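
    The fine localization phase can be pictured as follows: a candidate grid cell is partitioned into sub-cells, the RSS predicted at each sub-cell centre (here with a generic log-distance path-loss model) is compared with the measurements, and the centre with the minimum residual is kept. The path-loss parameters, cell geometry and splitting factor below are illustrative assumptions, and the coarse CS recovery and measurement-matrix update steps are omitted.

      import numpy as np

      def predicted_rss(target_xy, sensor_xy, p0_dbm=-40.0, path_exp=3.0):
          """Log-distance path-loss prediction of RSS (dBm) at each sensor."""
          d = np.linalg.norm(sensor_xy - target_xy, axis=1)
          return p0_dbm - 10.0 * path_exp * np.log10(np.maximum(d, 1e-3))

      def refine_in_cell(rss_meas, sensor_xy, cell_center, cell_size, splits=4):
          """Fine phase: pick the sub-cell centre with the minimum RSS residual."""
          offsets = (np.arange(splits) + 0.5) / splits - 0.5
          best_xy, best_err = np.asarray(cell_center, dtype=float), np.inf
          for dx in offsets:
              for dy in offsets:
                  cand = cell_center + cell_size * np.array([dx, dy])
                  err = np.linalg.norm(rss_meas - predicted_rss(cand, sensor_xy))
                  if err < best_err:
                      best_xy, best_err = cand, err
          return best_xy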

  4. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923

  5. [A wavelet-transform-based method for the automatic detection of late-type stars].

    PubMed

    Liu, Zhong-tian; Zhao, Rrui-zhen; Zhao, Yong-heng; Wu, Fu-chao

    2005-07-01

    The LAMOST project, the world's largest sky survey project, urgently needs an automatic late-type star detection system. However, to our knowledge, no effective methods for automatic late-type star detection have been reported in the literature up to now. The present work is intended to explore possible ways to deal with this issue. Here, by "late-type stars" we mean those stars with strong molecular absorption bands, including oxygen-rich M, L and T type stars and carbon-rich C stars. Based on experimental results, the authors find that after a wavelet transform with 5 scales on the late-type star spectra, the frequency spectrum of the transformed coefficients at the 5th scale consistently manifests a unimodal distribution, and the energy of the frequency spectrum is largely concentrated in a small neighborhood centered around the unique peak. However, for the spectra of other celestial bodies, the corresponding frequency spectrum is multimodal and the energy of the frequency spectrum is dispersed. Based on such a finding, the authors present a wavelet-transform-based automatic late-type star detection method. The proposed method is shown by extensive experiments to be practical and of good robustness.
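
    As a rough illustration of the detection criterion, the sketch below replaces the paper's 5-scale wavelet transform with repeated Haar-style pairwise averaging and scores how strongly the Fourier magnitude spectrum of the smoothed flux concentrates around a single peak; molecular-band (late-type) spectra would be expected to score high. All numerical choices here are illustrative, not the published algorithm.

      import numpy as np

      def unimodality_score(flux, levels=5, half_width=2):
          """Fraction of spectral energy near the dominant peak of the scale-`levels` signal."""
          a = np.asarray(flux, dtype=float)
          for _ in range(levels):
              if a.size % 2:                   # drop a trailing sample if the length is odd
                  a = a[:-1]
              a = 0.5 * (a[0::2] + a[1::2])    # Haar-style approximation coefficients
          mag = np.abs(np.fft.rfft(a - a.mean()))
          k = int(np.argmax(mag))
          lo, hi = max(0, k - half_width), min(mag.size, k + half_width + 1)
          return float(mag[lo:hi].sum() / mag.sum())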

  6. SACRB-MAC: A High-Capacity MAC Protocol for Cognitive Radio Sensor Networks in Smart Grid

    PubMed Central

    Yang, Zhutian; Shi, Zhenguo; Jin, Chunlin

    2016-01-01

    The Cognitive Radio Sensor Network (CRSN) is considered as a viable solution to enhance various aspects of the electric power grid and to realize a smart grid. However, several challenges for CRSNs are generated due to the harsh wireless environment in a smart grid. As a result, throughput and reliability become critical issues. On the other hand, the spectrum aggregation technique is expected to play an important role in CRSNs in a smart grid. By using spectrum aggregation, the throughput of CRSNs can be improved efficiently, so as to address the unique challenges of CRSNs in a smart grid. In this regard, we proposed Spectrum Aggregation Cognitive Receiver-Based MAC (SACRB-MAC), which employs the spectrum aggregation technique to improve the throughput performance of CRSNs in a smart grid. Moreover, SACRB-MAC is a receiver-based MAC protocol, which can provide a good reliability performance. Analytical and simulation results demonstrate that SACRB-MAC is a promising solution for CRSNs in a smart grid. PMID:27043573

  7. SACRB-MAC: A High-Capacity MAC Protocol for Cognitive Radio Sensor Networks in Smart Grid.

    PubMed

    Yang, Zhutian; Shi, Zhenguo; Jin, Chunlin

    2016-03-31

    The Cognitive Radio Sensor Network (CRSN) is considered as a viable solution to enhance various aspects of the electric power grid and to realize a smart grid. However, several challenges for CRSNs are generated due to the harsh wireless environment in a smart grid. As a result, throughput and reliability become critical issues. On the other hand, the spectrum aggregation technique is expected to play an important role in CRSNs in a smart grid. By using spectrum aggregation, the throughput of CRSNs can be improved efficiently, so as to address the unique challenges of CRSNs in a smart grid. In this regard, we proposed Spectrum Aggregation Cognitive Receiver-Based MAC (SACRB-MAC), which employs the spectrum aggregation technique to improve the throughput performance of CRSNs in a smart grid. Moreover, SACRB-MAC is a receiver-based MAC protocol, which can provide a good reliability performance. Analytical and simulation results demonstrate that SACRB-MAC is a promising solution for CRSNs in a smart grid.

  8. Grid Work

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Pointwise Inc.'s, Gridgen Software is a system for the generation of 3D (three dimensional) multiple block, structured grids. Gridgen is a visually-oriented, graphics-based interactive code used to decompose a 3D domain into blocks, distribute grid points on curves, initialize and refine grid points on surfaces and initialize volume grid points. Gridgen is available to U.S. citizens and American-owned companies by license.

  9. A Simple Algebraic Grid Adaptation Scheme with Applications to Two- and Three-dimensional Flow Problems

    NASA Technical Reports Server (NTRS)

    Hsu, Andrew T.; Lytle, John K.

    1989-01-01

    An algebraic adaptive grid scheme based on the concept of arc equidistribution is presented. The scheme locally adjusts the grid density based on gradients of selected flow variables from either finite difference or finite volume calculations. A user-prescribed grid stretching can be specified such that control of the grid spacing can be maintained in areas of known flowfield behavior. For example, the grid can be clustered near a wall for boundary layer resolution and made coarse near the outer boundary of an external flow. A grid smoothing technique is incorporated into the adaptive grid routine, which is found to be more robust and efficient than the weight function filtering technique employed by other researchers. Since the present algebraic scheme requires no iteration or solution of differential equations, the computer time needed for grid adaptation is trivial, making the scheme useful for three-dimensional flow problems. Applications to two- and three-dimensional flow problems show that a considerable improvement in flowfield resolution can be achieved by using the proposed adaptive grid scheme. Although the scheme was developed with steady flow in mind, it is a good candidate for unsteady flow computations because of its efficiency.
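
    A one-dimensional sketch of the equidistribution idea, assuming NumPy: points are redistributed so that a gradient-based weight integrates to equal amounts between neighbours. The weight definition, smoothing pass, and parameters are placeholders, not the exact formulation of the scheme described above.

        import numpy as np

        def equidistribute(x, u, alpha=1.0, n_smooth=2):
            """Redistribute a monotonically increasing 1D grid x so w = 1 + alpha*|du/dx| is equidistributed."""
            w = 1.0 + alpha * np.abs(np.gradient(u, x))
            for _ in range(n_smooth):                       # simple smoothing of the weight function
                w[1:-1] = 0.25 * w[:-2] + 0.5 * w[1:-1] + 0.25 * w[2:]
            # Cumulative weighted arc length (trapezoidal rule), then equal increments of it.
            s = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
            s_new = np.linspace(0.0, s[-1], x.size)
            return np.interp(s_new, s, x)                   # invert the mapping to get the new grid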

  10. Detailed analysis of stem I and its 5' and 3' neighbor regions in the trans-acting HDV ribozyme.

    PubMed Central

    Nishikawa, F; Roy, M; Fauzi, H; Nishikawa, S

    1999-01-01

    To determine the stem I structure of the human hepatitis delta virus (HDV) ribozyme, which is related to the substrate sequence in the trans-acting system, we kinetically studied stem I length and sequences. Stem I extension from 7 to 8 or 9 bp caused a loss of activity and a low amount of active complex with 9 bp in the trans-acting system. In a previous report, we presented cleavage in a 6 bp stem I. The observed reaction rates indicate that the original 7 bp stem I is in the most favorable location for catalytic reaction among the possible 6-8 bp stems. To test base specificity, we replaced the original GC-rich sequence in stem I with AU-rich sequences containing six AU or UA base pairs with the natural +1G·U wobble base pair at the cleavage site. The cis-acting AU-rich molecules demonstrated similar catalytic activity to that of the wild-type. In trans-acting molecules, due to stem I instability, reaction efficiency strongly depended on the concentration of the ribozyme-substrate complex and reaction temperature. Multiple turnover was observed at 37°C, strongly suggesting that stem I has no base specificity and more efficient activity can be expected under multiple turnover conditions by substituting several UA or AU base pairs into stem I. We also studied the substrate damaging sequences linked to both ends of stem I for its development in therapeutic applications and confirmed the functions of the unique structure. PMID:9862958

  11. Using Grid Benchmarks for Dynamic Scheduling of Grid Applications

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert

    2003-01-01

    Navigation or dynamic scheduling of applications on computational grids can be improved through the use of an application-specific characterization of grid resources. Current grid information systems provide a description of the resources, but do not contain any application-specific information. We define a GridScape as the dynamic state of the grid resources. We measure the dynamic performance of these resources using grid benchmarks. Then we use the GridScape for automatic assignment of the tasks of a grid application to grid resources. The scalability of the system is achieved by limiting the navigation overhead to a few percent of the application resource requirements. Our task submission and assignment protocol guarantees that the navigation system does not cause grid congestion. On a synthetic data mining application we demonstrate that GridScape-based task assignment reduces the application turnaround time.
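
    A hedged sketch of benchmark-driven task assignment in the spirit described above; the GridScape layout (per-resource benchmark rates plus current load) and the greedy rule are illustrative choices, not the actual submission and assignment protocol.

        def assign_tasks(tasks, gridscape):
            """Greedily assign each task to the resource expected to finish it soonest.

            tasks: list of (task_id, work_units, kind), e.g. kind in {"compute", "io"}
            gridscape: {resource: {"compute": rate, "io": rate, "load": pending_work}}
            """
            schedule = {}
            for task_id, work, kind in sorted(tasks, key=lambda t: -t[1]):   # largest tasks first
                best = min(gridscape,
                           key=lambda r: (gridscape[r]["load"] + work) / gridscape[r][kind])
                gridscape[best]["load"] += work
                schedule[task_id] = best
            return schedule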

  12. MAG3D and its application to internal flowfield analysis

    NASA Technical Reports Server (NTRS)

    Lee, K. D.; Henderson, T. L.; Choo, Y. K.

    1992-01-01

    MAG3D (multiblock adaptive grid, 3D) is a 3D solution-adaptive grid generation code which redistributes grid points to improve the accuracy of a flow solution without increasing the number of grid points. The code is applicable to structured grids with a multiblock topology. It is independent of the original grid generator and the flow solver. The code uses the coordinates of an initial grid and the flow solution interpolated onto the new grid. MAG3D uses a numerical mapping and potential theory to modify the grid distribution based on properties of the flow solution on the initial grid. The adaptation technique is discussed, and the capability of MAG3D is demonstrated with several internal flow examples. Advantages of using solution-adaptive grids are also shown by comparing flow solutions on adaptive grids with those on initial grids.

  13. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  14. Towards Dynamic Authentication in the Grid — Secure and Mobile Business Workflows Using GSet

    NASA Astrophysics Data System (ADS)

    Mangler, Jürgen; Schikuta, Erich; Witzany, Christoph; Jorns, Oliver; Ul Haq, Irfan; Wanek, Helmut

    Until now, the research community has mainly focused on the technical aspects of Grid computing and neglected commercial issues. Recently, however, the community has come to accept that the success of the Grid is crucially based on commercial exploitation. In our vision, Foster's and Kesselman's statement "The Grid is all about sharing." has to be extended by "... and making money out of it!". To allow for the realization of this vision, the trustworthiness of the underlying technology needs to be ensured. This can be achieved by the use of gSET (Gridified Secure Electronic Transaction) as a basic technology for trust management and secure accounting in the presented Grid-based workflow. We present a framework, conceptually and technically, from the area of the Mobile-Grid, which justifies the Grid infrastructure as a viable platform to enable commercially successful business workflows.

  15. A Damping Grid Strapdown Inertial Navigation System Based on a Kalman Filter for Ships in Polar Regions.

    PubMed

    Huang, Weiquan; Fang, Tao; Luo, Li; Zhao, Lin; Che, Fengzhu

    2017-07-03

    The grid strapdown inertial navigation system (SINS) used in polar navigation exhibits the same three kinds of periodic oscillation errors as a common SINS based on a geographic coordinate system. For ships that have external information available to conduct a system reset regularly, suppressing the Schuler periodic oscillation is an effective way to enhance navigation accuracy. In this paper, a Kalman filter based on the grid SINS error model applicable to ships is established. The errors of the grid-level attitude angles can be accurately estimated when the external velocity contains a constant error, and correcting these errors through feedback correction can effectively dampen the Schuler periodic oscillation. The simulation results show that, with the aid of an external reference velocity, the proposed external level damping algorithm based on the Kalman filter can suppress the Schuler periodic oscillation effectively. Compared with the traditional external level damping algorithm based on a damping network, the proposed algorithm reduces the overshoot errors when the grid SINS switches from the non-damping state to the damping state, which effectively improves the navigation accuracy of the system.
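
    For reference, a generic discrete Kalman filter predict/update step, assuming NumPy; the state vector, matrices, and measurement here stand in for the paper's grid-SINS error model and external reference velocity, which are not reproduced.

        import numpy as np

        def kalman_step(x, P, F, Q, H, R, z):
            """One predict/update cycle of a discrete Kalman filter (placeholder matrices)."""
            # Predict
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Update with the external reference measurement z (e.g. a velocity observation)
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
            return x_new, P_new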

  16. Quality Assurance Framework for Mini-Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baring-Gould, Ian; Burman, Kari; Singh, Mohit

    Providing clean and affordable energy services to the more than 1 billion people globally who lack access to electricity is a critical driver for poverty reduction, economic development, improved health, and social outcomes. More than 84% of populations without electricity are located in rural areas where traditional grid extension may not be cost-effective; therefore, distributed energy solutions such as mini-grids are critical. To address some of the root challenges of providing safe, quality, and financially viable mini-grid power systems to remote customers, the U.S. Department of Energy (DOE) teamed with the National Renewable Energy Laboratory (NREL) to develop a Quality Assurance Framework (QAF) for isolated mini-grids. The QAF for mini-grids aims to address some root challenges of providing safe, quality, and affordable power to remote customers via financially viable mini-grids through two key components: (1) Levels of service: Defines a standard set of tiers of end-user service and links them to technical parameters of power quality, power availability, and power reliability. These levels of service span the entire energy ladder, from basic energy service to high-quality, high-reliability, and high-availability service (often considered 'grid parity'); (2) Accountability and performance reporting framework: Provides a clear process of validating power delivery by providing trusted information to customers, funders, and/or regulators. The performance reporting protocol can also serve as a robust monitoring and evaluation tool for mini-grid operators and funding organizations. The QAF will provide a flexible alternative to rigid top-down standards for mini-grids in energy access contexts, outlining tiers of end-user service and linking them to relevant technical parameters. In addition, data generated through implementation of the QAF will provide the foundation for comparisons across projects, assessment of impacts, and greater confidence that will drive investment and scale-up in this sector. The QAF implementation process also defines a set of implementation guidelines that help the deployment of mini-grids on a regional or national scale, helping to ensure successful rapid deployment of these relatively new remote energy options. Note that the QAF is technology agnostic, addressing both alternating current (AC) and direct current (DC) mini-grids, and is also applicable to renewable, fossil-fuel, and hybrid systems.

  17. On- and off-grid operation of hybrid renewable power plants: When are the economics favorable?

    NASA Astrophysics Data System (ADS)

    Petrakopoulou, F.; Santana, D.

    2016-12-01

    Hybrid renewable energy conversion systems offer a good alternative to conventional systems in locations where the extension of the electrical grid is difficult or not economical or where the cost of electricity is high. However, stand-alone operation implies net energy output restrictions (limited to exclusively serve the energy demand of a region), capacity oversizing and large storage facilities. In interconnected areas, on the other hand, the operational restrictions of the power stations change significantly and the efficiencies and costs of renewable technologies become more favorable. In this paper, the operation of three main renewable technologies (CSP, PV and wind) is studied assuming both hybrid and individual operation for both autonomous and inter-connected operation. The case study used is a Mediterranean island of ca. 3,000 inhabitants. Each system is optimized to fully cover the energy demand of the community. In addition, in the on-grid operation cases, it is required that the annual energy generated from the renewable sources is net positive (i.e., the island generates at least as much energy as it uses). It is found that when connected to the grid, hybridization of more than one technology is not required to satisfy the energy demand, as expected. Each of the renewable technologies investigated can satisfy the annual energy demand individually, without significant complications. In addition, the cost of electricity generated with the three studied technologies drops significantly for on-grid applications, when compared to off-grid operation. However, when compared to business-as-usual scenarios in both the on- and off-grid cases, both investigated hybrid and single-technology renewable scenarios are found to be economically viable. A sensitivity analysis reveals the limits of the acceptable costs that make the technologies favorable when compared to conventional alternatives.
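
    A simple way to compare such scenarios is a levelized cost of electricity (LCOE) based on the standard capital recovery factor; the sketch below uses placeholder costs and yields, not the island case-study data.

        def lcoe(capex, opex_per_year, annual_energy_kwh, rate=0.06, lifetime_years=25):
            """Levelized cost of electricity (currency/kWh) via the capital recovery factor."""
            crf = rate * (1 + rate) ** lifetime_years / ((1 + rate) ** lifetime_years - 1)
            return (capex * crf + opex_per_year) / annual_energy_kwh

        # Placeholder comparison: oversized off-grid plant with storage vs. grid-connected plant.
        print(lcoe(capex=4.0e6, opex_per_year=8.0e4, annual_energy_kwh=1.8e6))  # off-grid
        print(lcoe(capex=1.5e6, opex_per_year=3.0e4, annual_energy_kwh=1.8e6))  # on-grid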

  18. Tuned grid generation with ICEM CFD

    NASA Technical Reports Server (NTRS)

    Wulf, Armin; Akdag, Vedat

    1995-01-01

    ICEM CFD is a CAD-based grid generation package that supports multiblock structured, unstructured tetrahedral, and unstructured hexahedral grids. Major development efforts have been spent to extend ICEM CFD's multiblock structured and hexahedral unstructured grid generation capabilities. The modules added are: a parametric grid generation module and a semi-automatic hexahedral grid generation module. A fully automatic version of the hexahedral grid generation module for grids around a set of predefined objects in rectilinear enclosures has been developed. These modules will be presented, the procedures used will be described, and examples will be discussed.

  19. Commissioning of a CERN Production and Analysis Facility Based on xrootd

    NASA Astrophysics Data System (ADS)

    Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim

    2011-12-01

    The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and exporting to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper will describe the xrootd-based CERN production and analysis facility for the ATLAS experiment and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.

  20. Development of an Aeroelastic Code Based on an Euler/Navier-Stokes Aerodynamic Solver

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Srivastava, Rakesh; Keith, Theo G., Jr.; Stefko, George L.; Janus, Mark J.

    1996-01-01

    This paper describes the development of an aeroelastic code (TURBO-AE) based on an Euler/Navier-Stokes unsteady aerodynamic analysis. A brief review of the relevant research in the area of propulsion aeroelasticity is presented. The paper briefly describes the original Euler/Navier-Stokes code (TURBO) and then details the development of the aeroelastic extensions. The aeroelastic formulation is described. The modeling of the dynamics of the blade using a modal approach is detailed, along with the grid deformation approach used to model the elastic deformation of the blade. The work-per-cycle approach used to evaluate aeroelastic stability is described. Representative results used to verify the code are presented. The paper concludes with an evaluation of the development thus far, and some plans for further development and validation of the TURBO-AE code.

  1. Spatial application of WEPS for estimating wind erosion in the Pacific Northwest

    USDA-ARS?s Scientific Manuscript database

    The Wind Erosion Prediction System (WEPS) is used to simulate soil erosion on croplands and was originally designed to run field scale simulations. This research is an extension of the WEPS model to run on multiple fields (grids) covering a larger region. We modified the WEPS source code to allow it...

  2. The Interplay of Surface Mount Solder Joint Quality and Reliability of Low Volume SMAs

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.

    1997-01-01

    Spacecraft electronics including those used at the Jet Propulsion Laboratory (JPL), demand production of highly reliable assemblies. JPL has recently completed an extensive study, funded by NASA's code Q, of the interplay between manufacturing defects and reliability of ball grid array (BGA) and surface mount electronic components.

  3. Stability analysis of cylinders with circular cutouts

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.; Marlowe, M. B.

    1973-01-01

    The stability of axially compressed cylinders with circular cutouts is analyzed numerically. An extension of the finite-difference method is used which removes the requirement that displacement components be defined in the directions of the grid lines. The results of this nonlinear analysis are found to be in good agreement with earlier experimental results.

  4. Downstream Benefits of Energy Management Systems

    DTIC Science & Technology

    2015-12-01

    ...efficiency, including some advanced demonstration projects for EMSs, microgrids, extensive solar photovoltaic (PV) generation capacity, and others ...approach to reducing consumption, maintaining mission assurance, and providing reliable power to critical loads. (Deputy Undersecretary of Defense

  5. Shear-lag analysis about an internally-dropped ply

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vizzini, A.J.

    1995-12-31

    The region around a terminated ply is modeled as several elastic layers separated by shear regions. A shear-lag analysis is then performed allowing for the thickness of the elastic and shear layers to vary. Boundary conditions, away from the ply drop, are based on the deflections determined by a finite element model. The interlaminar stresses are compared against those generated by the finite element model for tapered laminates under pure extension, pure bending, and extension-bending coupling. The shear-lag analysis predicts the interlaminar shear at and near the ply drop for pure extension and in cases involving bending if the deflections due to bending are removed. The interlaminar shear stress and force equilibrium are used to determine the interlaminar normal stress. The trends in the interlaminar normal stress shown by the finite element model are partially captured by the shear-lag analysis. This simple analysis indicates that the mechanism for load transfer about a ply drop is primarily due to shear transfer through the resin-rich areas.

  6. Algorithm and code development for unsteady three-dimensional Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Obayashi, Shigeru

    1991-01-01

    A streamwise upwind algorithm for solving the unsteady 3-D Navier-Stokes equations was extended to handle the moving grid system. It is noted that the finite volume concept is essential to extend the algorithm. The resulting algorithm is conservative for any motion of the coordinate system. Two extensions to an implicit method were considered and the implicit extension that makes the algorithm computationally efficient is implemented into Ames's aeroelasticity code, ENSAERO. The new flow solver has been validated through the solution of test problems. Test cases include three-dimensional problems with fixed and moving grids. The first test case shown is an unsteady viscous flow over an F-5 wing, while the second test considers the motion of the leading edge vortex as well as the motion of the shock wave for a clipped delta wing. The resulting algorithm has been implemented into ENSAERO. The upwind version leads to higher accuracy in both steady and unsteady computations than the previously used central-difference method does, while the increase in the computational time is small.

  7. LSPRAY-IV: A Lagrangian Spray Module

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    2012-01-01

    LSPRAY-IV is a Lagrangian spray solver developed for application with parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and/or Monte Carlo Probability Density Function (PDF) solvers. The solver accommodates the use of an unstructured mesh with mixed elements of either triangular, quadrilateral, and/or tetrahedral type for the gas flow grid representation. It is mainly designed to predict the flow, thermal and transport properties of a rapidly vaporizing spray. Some important research areas covered as a part of the code development are: (1) the extension of combined CFD/scalar-Monte- Carlo-PDF method to spray modeling, (2) the multi-component liquid spray modeling, and (3) the assessment of various atomization models used in spray calculations. The current version contains the extension to the modeling of superheated sprays. The manual provides the user with an understanding of various models involved in the spray formulation, its code structure and solution algorithm, and various other issues related to parallelization and its coupling with other solvers.

  8. Spaceflight Operations Services Grid (SOSG) Project

    NASA Technical Reports Server (NTRS)

    Bradford, Robert; Lisotta, Anthony

    2004-01-01

    The motivation, goals, and objectives of the Space Operations Services Grid Project (SOSG) are covered in this viewgraph presentation. The goals and objectives of SOSG include: 1) Developing a grid-enabled prototype providing space-based ground operations end-user services through a collaborative effort between NASA, academia, and industry to assess the technical and cost feasibility of implementing Grid technologies in the space operations arena; 2) Providing space operations organizations and processes, through a single secure portal(s), with access to all the information technology (Grid- and Web-based) services necessary for program/project development, operations, and the ultimate creation of new processes, information, and knowledge.

  9. Grid-Based Projector Augmented Wave (GPAW) Implementation of Quantum Mechanics/Molecular Mechanics (QM/MM) Electrostatic Embedding and Application to a Solvated Diplatinum Complex.

    PubMed

    Dohn, A O; Jónsson, E Ö; Levi, G; Mortensen, J J; Lopez-Acevedo, O; Thygesen, K S; Jacobsen, K W; Ulstrup, J; Henriksen, N E; Møller, K B; Jónsson, H

    2017-12-12

    A multiscale density functional theory-quantum mechanics/molecular mechanics (DFT-QM/MM) scheme is presented, based on an efficient electrostatic coupling between the electronic density obtained from a grid-based projector augmented wave (GPAW) implementation of density functional theory and a classical potential energy function. The scheme is implemented in a general fashion and can be used with various choices for the descriptions of the QM or MM regions. Tests on H2O clusters, ranging from dimer to decamer, show that no systematic energy errors are introduced by the coupling that exceed the differences in the QM and MM descriptions. Over 1 ns of liquid water, Born-Oppenheimer QM/MM molecular dynamics (MD) are sampled combining 10 parallel simulations, showing consistent liquid water structure over the QM/MM border. The method is applied in extensive parallel MD simulations of an aqueous solution of the diplatinum [Pt2(P2O5H2)4]4- complex (PtPOP), spanning a total time period of roughly half a nanosecond. An average Pt-Pt distance deviating only 0.01 Å from experimental results, and a ground-state Pt-Pt oscillation frequency deviating by <2% from experimental results were obtained. The simulations highlight a remarkable harmonicity of the Pt-Pt oscillation, while also showing clear signs of Pt-H hydrogen bonding and directional coordination of water molecules along the Pt-Pt axis of the complex.
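
    The electrostatic-embedding coupling term itself is simple to write down: the interaction of the QM electron density, sampled on a real-space grid, with the MM point charges. The sketch below (atomic units, assuming NumPy) illustrates that generic term only and is not the GPAW/ASE implementation; a real code would also smear or cut off the 1/r singularity near the charges.

        import numpy as np

        def embedding_energy(density, grid_points, voxel_volume, mm_positions, mm_charges):
            """E = sum_j q_j * integral of rho_e(r) * (-1) / |r - R_j| dr on a real-space grid (Hartree)."""
            energy = 0.0
            for R, q in zip(mm_positions, mm_charges):
                r = np.linalg.norm(grid_points - R, axis=1)         # distances grid point -> MM charge
                energy += -q * np.sum(density * voxel_volume / r)   # electrons carry charge -1
            return energy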

  10. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
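
    As an example of the kind of "commonly used operation" such a reanalysis service exposes, the snippet below computes a monthly-mean climatology from a reanalysis NetCDF file with xarray; the file and variable names are placeholders, and this does not call the actual CDS API or WPS endpoints.

        import xarray as xr

        # Placeholder file/variable names; any CF-compliant reanalysis NetCDF works the same way.
        ds = xr.open_dataset("reanalysis_t2m.nc")
        monthly_climatology = ds["t2m"].groupby("time.month").mean("time")
        monthly_climatology.to_netcdf("t2m_monthly_climatology.nc")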

  11. Design and power management of an offshore medium voltage DC microgrid realized through high voltage power electronics technologies and control

    NASA Astrophysics Data System (ADS)

    Grainger, Brandon Michael

    The growth in the electric power industry's portfolio of Direct Current (DC) based generation and loads has captured the attention of many leading research institutions. Opportunities for using DC based systems have been explored in electric ship design and have been a proven, reliable solution for transmitting bulk power onshore and offshore. To integrate many of the renewable resources into our existing AC grid, a number of power conversions through power electronics are required to condition the equipment for direct connection. Within the power conversion stages, there is always a requirement to convert to or from DC. The AC microgrid is a conceptual solution proposed for integrating various types of renewable generation resources. The fundamental microgrid requirements include the capability of operating in islanding mode and/or grid connected modes. The technical challenges associated with microgrids include (1) operation modes and transitions that comply with IEEE1547 without extensive custom engineering and (2) control architecture and communication. The Medium Voltage DC (MVDC) architecture, explored by the University of Pittsburgh, can be visualized as a special type of DC microgrid. This dissertation is multi-faceted, focused on many design aspects of an offshore DC microgrid. The discussion focuses on optimized high power, high frequency magnetic material performance in electric machines, transformers, and DC/DC power converters---all components found within offshore power system architectures. A new controller design based upon model reference control is proposed and shown to stabilize the electric motor drives (modeled as constant power loads), which serve as the largest power consuming entities in the microgrid. The design and simulation of a state-of-the-art multilevel converter for High Voltage DC (HVDC) is discussed and a component sensitivity analysis on fault current peaks is explored. A power management routine is proposed and evaluated as the DC microgrid is disturbed through various mode transitions. Finally, two communication protocols are described for the microgrid---one to minimize communication overhead inside the microgrid and another to provide robust and scalable intra-grid communication. The work presented is supported by Asea Brown Boveri (ABB) Corporate Research Center within the Active Grid Infrastructure program, the Advanced Research Project Agency - Energy (ARPA-E) through the Solar ADEPT program, and Mitsubishi Electric Corporation (MELCO).

  12. Design and implementation of GRID-based PACS in a hospital with multiple imaging departments

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Jin, Jin; Sun, Jianyong; Zhang, Jianguo

    2008-03-01

    Usually, there are multiple clinical departments providing imaging-enabled healthcare services in an enterprise healthcare environment, such as radiology, oncology, pathology, and cardiology. The picture archiving and communication system (PACS) is now required not only to support radiology-based image display and workflow and data flow management, but also to provide more specialized image processing and management tools for other departments offering imaging-guided diagnosis and therapy, and there is an urgent demand to integrate the multiple PACSs to provide patient-oriented imaging services for enterprise collaborative healthcare. In this paper, we give the design method and implementation strategy for developing a grid-based PACS (Grid-PACS) for a hospital with multiple imaging departments or centers. The Grid-PACS functions as middleware between the traditional PACS archiving servers and the workstations or image-viewing clients and provides DICOM image communication and WADO services to the end users. The images can be stored in distributed multiple archiving servers but managed in a centralized mode. The grid-based PACS has automatic image backup and disaster recovery services and can provide the best image retrieval path to image requesters based on optimal algorithms. The designed grid-based PACS has been implemented in Shanghai Huadong Hospital and has been running smoothly for two years.
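
    A hedged sketch of the WADO side of such a service: a client retrieving one DICOM object via a WADO-URI request (DICOM PS3.18). The endpoint URL and UIDs are placeholders, not the hospital's actual configuration.

        import requests

        # Placeholder endpoint and UIDs; WADO-URI query parameters follow DICOM PS3.18.
        params = {
            "requestType": "WADO",
            "studyUID": "1.2.840.113619.2.55.3.1",
            "seriesUID": "1.2.840.113619.2.55.3.1.1",
            "objectUID": "1.2.840.113619.2.55.3.1.1.1",
            "contentType": "application/dicom",
        }
        resp = requests.get("http://pacs.example.org/wado", params=params, timeout=30)
        resp.raise_for_status()
        with open("image.dcm", "wb") as f:
            f.write(resp.content)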

  13. UA-ICON - A non-hydrostatic global model for studying gravity waves from the troposphere to the thermosphere

    NASA Astrophysics Data System (ADS)

    Borchert, Sebastian; Zängl, Günther; Baldauf, Michael; Zhou, Guidi; Schmidt, Hauke; Manzini, Elisa

    2017-04-01

    In numerical weather prediction as well as climate simulations, there are ongoing efforts to raise the upper model lid, acknowledging the possible influence of middle and upper atmosphere dynamics on tropospheric weather and climate. As the momentum deposition of gravity waves (GWs) is responsible for key features of the large scale flow in the middle and upper atmosphere, the upward model extension has put GWs in the focus of atmospheric research needs. The Max Planck Institute for Meteorology (MPI-M) and the German Weather Service (DWD) have been developing jointly the non-hydrostatic global model ICON (Zängl et al, 2015) which features a new dynamical core based on an icosahedral grid. The extension of ICON beyond the mesosphere, where most GWs deposit their momentum, requires, e.g., relaxing the shallow-atmosphere and other traditional approximations as well as implementing additional physical processes that are important to the upper atmosphere. We would like to present aspects of the model development and its evaluation, and first results from a simulation of a period of the DEEPWAVE campaign in New Zealand in 2014 (Fritts et al, 2016) using grid nesting up to a horizontal mesh size of about 1.25 km. This work is part of the research unit: Multi-Scale Dynamics of Gravity Waves (MS-GWaves: sub-project GWING, https://ms-gwaves.iau.uni-frankfurt.de/index.php), funded by the German Research Foundation. Fritts, D.C. and Coauthors, 2016: "The Deep Propagating Gravity Wave Experiment (DEEPWAVE): An airborne and ground-based exploration of gravity wave propagation and effects from their sources throughout the lower and middle atmosphere". Bull. Amer. Meteor. Soc., 97, 425 - 453, doi:10.1175/BAMS-D-14-00269.1 Zängl, G., Reinert, D., Ripodas, P., Baldauf, M., 2015: "The ICON (ICOsahedral Non-hydrostatic) modelling framework of DWD and MPI-M: Description of the non-hydrostatic dynamical core". Quart. J. Roy. Met. Soc., 141, 563 - 579, doi:10.1002/qj.2378

  14. Spatiotemporal modelling of viral infection dynamics

    NASA Astrophysics Data System (ADS)

    Beauchemin, Catherine

    Viral kinetics have been studied extensively in the past through the use of ordinary differential equations describing the time evolution of the diseased state in a spatially well-mixed medium. However, emerging spatial structures such as localized populations of dead cells might affect the spread of infection, similar to the manner in which a counter-fire can stop a forest fire from spreading. In the first phase of the project, a simple two-dimensional cellular automaton model of viral infections was developed. It was validated against clinical immunological data for uncomplicated influenza A infections and shown to be accurate enough to adequately model them. In the second phase of the project, the simple two-dimensional cellular automaton model was used to investigate the effects of relaxing the well-mixed assumption on viral infection dynamics. It was shown that grouping the initially infected cells into patches rather than distributing them uniformly on the grid reduced the infection rate, as only cells on the perimeter of the patch have healthy neighbours to infect. Use of a local epithelial cell regeneration rule, where dead cells are replaced by healthy cells when an immediate neighbour divides, was found to result in more extensive damage of the epithelium and yielded a better fit to experimental influenza A infection data than a global regeneration rule based on the division rate of healthy cells. Finally, the addition of immune cells at the site of infection was found to be a better strategy at low infection levels, while addition at random locations on the grid was the better strategy at high infection levels. In the last project, the movement of T cells within lymph nodes in the absence of antigen was investigated. Based on individual T cell track data captured by two-photon microscopy experiments in vivo, a simple model was proposed for the motion of T cells. This is the first step towards the implementation of a more realistic spatiotemporal model of HIV than those proposed thus far.
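
    A toy version of such a cellular automaton, assuming NumPy: healthy, infected, and dead cells on a 2D lattice with 4-neighbour infection and local regeneration. The update rules and probabilities are illustrative only, not the calibrated influenza A model.

        import numpy as np

        HEALTHY, INFECTED, DEAD = 0, 1, 2

        def step(grid, p_infect=0.3, p_die=0.1, p_regen=0.05, rng=None):
            """One synchronous update of the toy infection automaton (4-neighbour coupling)."""
            if rng is None:
                rng = np.random.default_rng()
            infected = (grid == INFECTED)
            # Count infected neighbours by shifting the lattice in the four directions.
            n_inf = sum(np.roll(infected, s, axis=a) for a in (0, 1) for s in (1, -1))
            new = grid.copy()
            new[(grid == HEALTHY) & (rng.random(grid.shape) < 1 - (1 - p_infect) ** n_inf)] = INFECTED
            new[infected & (rng.random(grid.shape) < p_die)] = DEAD
            # Local regeneration: a dead cell may be replaced when a healthy neighbour divides.
            n_healthy = sum(np.roll(grid == HEALTHY, s, axis=a) for a in (0, 1) for s in (1, -1))
            new[(grid == DEAD) & (n_healthy > 0) & (rng.random(grid.shape) < p_regen)] = HEALTHY
            return new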

  15. Homogeneity and EPR metrics for assessment of regular grids used in CW EPR powder simulations.

    PubMed

    Crăciun, Cora

    2014-08-01

    CW EPR powder spectra may be approximated numerically using a spherical grid and a Voronoi tessellation-based cubature. For a given spin system, the quality of simulated EPR spectra depends on the grid type, size, and orientation in the molecular frame. In previous work, the grids used in CW EPR powder simulations have been compared mainly from a geometric perspective. However, some grids with a similar degree of homogeneity generate simulated spectra of different quality. This paper evaluates the grids from an EPR perspective, by defining two metrics depending on the spin system characteristics and the grid Voronoi tessellation. The first metric determines if the grid points are EPR-centred in their Voronoi cells, based on the resonance magnetic field variations inside these cells. The second metric verifies if the adjacent Voronoi cells of the tessellation are EPR-overlapping, by computing the common range of their resonance magnetic field intervals. Besides a series of well-known regular grids, the paper investigates a modified ZCW grid and a Fibonacci spherical code, which are new in the context of EPR simulations. For the investigated grids, the EPR metrics bring more information than the homogeneity quantities and are better related to the grids' EPR behaviour, for different spin system symmetries. The metrics' efficiency and limits are finally verified for grids generated from the initial ones, by using the original or magnetic field-constrained variants of the Spherical Centroidal Voronoi Tessellation method. Copyright © 2014 Elsevier Inc. All rights reserved.
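
    The second metric can be illustrated in a few lines: given the resonance-field interval of each Voronoi cell and the cell adjacency, count how many adjacent pairs share a common field range. This is a schematic reading of the metric, not the paper's implementation.

        def epr_overlap_fraction(intervals, adjacency):
            """Fraction of adjacent Voronoi cell pairs whose resonance-field intervals overlap.

            intervals: {cell_id: (b_min, b_max)}; adjacency: iterable of (cell_i, cell_j) pairs.
            """
            pairs = list(adjacency)
            overlapping = sum(
                min(intervals[i][1], intervals[j][1]) >= max(intervals[i][0], intervals[j][0])
                for i, j in pairs
            )
            return overlapping / len(pairs)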

  16. Convergence of the Bouguer-Beer law for radiation extinction in particulate media

    NASA Astrophysics Data System (ADS)

    Frankel, A.; Iaccarino, G.; Mani, A.

    2016-10-01

    Radiation transport in particulate media is a common physical phenomenon in natural and industrial processes. Developing predictive models of these processes requires a detailed model of the interaction between the radiation and the particles. Resolving the interaction between the radiation and the individual particles in a very large system is impractical, whereas continuum-based representations of the particle field lend themselves to efficient numerical techniques based on the solution of the radiative transfer equation. We investigate radiation transport through discrete and continuum-based representations of a particle field. Exact solutions for radiation extinction are developed using a Monte Carlo model in different particle distributions. The particle distributions are then projected onto a concentration field with varying grid sizes, and the Bouguer-Beer law is applied by marching across the grid. We show that the continuum-based solution approaches the Monte Carlo solution under grid refinement, but quickly diverges as the grid size approaches the particle diameter. This divergence is attributed to the homogenization error of an individual particle across a whole grid cell. We remark that the concentration energy spectrum of a point-particle field does not approach zero, and thus the concentration variance must also diverge under infinite grid refinement, meaning that no grid-converged solution of the radiation transport is possible.
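
    A one-dimensional sketch of the continuum side of this comparison, assuming NumPy: the particles in each cell are homogenized into a concentration and the Bouguer-Beer law is marched cell by cell. The cross-section, cell size, and particle counts are placeholder values.

        import numpy as np

        def beer_lambert_march(concentration, dx, sigma):
            """Transmitted fraction after marching I_{k+1} = I_k * exp(-sigma * c_k * dx) across the grid."""
            intensity = 1.0
            for c in concentration:
                intensity *= np.exp(-sigma * c * dx)
            return intensity

        # Homogenize n particles per cell onto the grid: concentration = n / (A * dx).
        n_per_cell = np.array([3, 0, 5, 2, 4])
        area, dx, sigma = 1.0e-4, 1.0e-3, 2.0e-9        # placeholder units
        print(beer_lambert_march(n_per_cell / (area * dx), dx, sigma))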

  17. WPS mediation: An approach to process geospatial data on different computing backends

    NASA Astrophysics Data System (ADS)

    Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas

    2012-10-01

    The OGC Web Processing Service (WPS) specification allows generating information by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities, and various problems emerge when trying to use them in data- and computing-intensive domains such as environmental sciences. These problems are usually not or only partially solvable using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing methods and technologies (e.g., Grids, Clouds). Currently, there is no commonly agreed approach to grid-enabling WPS. No implementation allows one to seamlessly execute a geoprocessing calculation following user requirements on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept by mediating different geospatial and Grid software packages, and by proposing an extension of the WPS specification through two optional parameters. The applicability of this approach is demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits and issues that need to be further investigated to improve performance.

  18. Weather Research and Forecasting Model with Vertical Nesting Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-01

    The Weather Research and Forecasting (WRF) model with vertical nesting capability is an extension of the WRF model, which is available in the public domain from www.wrf-model.org. The new code modifies the nesting procedure, which passes lateral boundary conditions between computational domains in the WRF model. Previously, the same vertical grid was required on all domains, while the new code allows different vertical grids to be used on concurrently run domains. This new functionality improves WRF's ability to produce high-resolution simulations of the atmosphere by allowing a wider range of scales to be efficiently resolved and more accurate lateral boundary conditions to be provided through the nesting procedure.

  19. Ferrous iron- and ammonium-rich diffuse vents support habitat-specific communities in a shallow hydrothermal field off the Basiluzzo Islet (Aeolian Volcanic Archipelago).

    PubMed

    Bortoluzzi, G; Romeo, T; La Cono, V; La Spada, G; Smedile, F; Esposito, V; Sabatino, G; Di Bella, M; Canese, S; Scotti, G; Bo, M; Giuliano, L; Jones, D; Golyshin, P N; Yakimov, M M; Andaloro, F

    2017-09-01

    Ammonium- and Fe(II)-rich fluid flows, known from deep-sea hydrothermal systems, have been extensively studied in the last decades and are considered sites with high microbial diversity and activity. Their shallow-submarine counterparts, despite their easier accessibility, have so far been under-investigated, and as a consequence, much less is known about microbial communities inhabiting these ecosystems. A field of shallow expulsion of hydrothermal fluids has been discovered at depths of 170-400 meters off the base of the Basiluzzo Islet (Aeolian Volcanic Archipelago, Southern Tyrrhenian Sea). This area consists predominantly of both actively diffusing and inactive 1-3 meter-high structures in the form of vertical pinnacles, steeples and mounds covered by thick orange to brown crust deposits hosting rich benthic fauna. Integrated morphological, mineralogical, and geochemical analyses revealed that, above all, these crusts are formed by ferrihydrite-type Fe3+ oxyhydroxides. Two cruises in 2013 allowed us to monitor and sample this novel ecosystem, which is certainly interesting as a shallow-water iron-rich site. The main objective of this work was to characterize the composition of extant communities of iron microbial mats in relation to the environmental setting and the observed patterns of macrofaunal colonization. We demonstrated that iron-rich deposits contain complex and stratified microbial communities with a high proportion of prokaryotes akin to ammonium- and iron-oxidizing chemoautotrophs, belonging to Thaumarchaeota, Nitrospira, and Zetaproteobacteria. Colonizers of iron-rich mounds, while composed of the common macrobenthic grazers, predators, filter-feeders, and tube-dwellers with no representatives of vent endemic fauna, differed from the surrounding populations. Thus, it is very likely that reduced electron donors (Fe2+ and NH4+) are important energy sources in supporting primary production in microbial mats, which form a habitat-specific trophic base of the whole Basiluzzo hydrothermal ecosystem, including macrobenthic fauna. © 2017 John Wiley & Sons Ltd.

  20. Site-specific strong ground motion prediction using 2.5-D modelling

    NASA Astrophysics Data System (ADS)

    Narayan, J. P.

    2001-08-01

    An algorithm was developed using the 2.5-D elastodynamic wave equation, based on the displacement-stress relation. One of the most significant advantages of the 2.5-D simulation is that the 3-D radiation pattern can be generated using double-couple point shear-dislocation sources in the 2-D numerical grid. A parsimonious staggered grid scheme was adopted instead of the standard staggered grid scheme, since this is the only scheme suitable for computing the dislocation. This new 2.5-D numerical modelling avoids the extensive computational cost of 3-D modelling. The significance of this exercise is that it makes it possible to simulate the strong ground motion (SGM), taking into account the energy released, 3-D radiation pattern, path effects and local site conditions at any location around the epicentre. The slowness vector (py) was used in the supersonic region for each layer, so that all the components of the inertia coefficient are positive. The double-couple point shear-dislocation source was implemented in the numerical grid using the moment tensor components as the body-force couples. The moment per unit volume was used in both the 3-D and 2.5-D modelling. A good agreement in the 3-D and 2.5-D responses for different grid sizes was obtained when the moment per unit volume was further reduced by a factor equal to the finite-difference grid size in the case of the 2.5-D modelling. The components of the radiation pattern were computed in the xz-plane using 3-D and 2.5-D algorithms for various focal mechanisms, and the results were in good agreement. A comparative study of the amplitude behaviour of the 3-D and 2.5-D wavefronts in a layered medium reveals the spatial and temporal damped nature of the 2.5-D elastodynamic wave equation. 3-D and 2.5-D simulated responses at a site using a different strike direction reveal that strong ground motion (SGM) can be predicted just by rotating the strike of the fault counter-clockwise by the same amount as the azimuth of the site with respect to the epicentre. This adjustment is necessary since the response is computed keeping the epicentre, focus and the desired site in the same xz-plane, with the x-axis pointing in the north direction.
