Sample records for grid visualization tools

  1. A Data-Driven Approach to Interactive Visualization of Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jun

    Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to adopt advanced tools that assist grid operators in performing mission-critical tasks and enable quick, accurate decisions. The emerging field of visual analytics holds tremendous promise for improving business practices in today's electric power industry. Our investigation, however, revealed that existing commercial power grid visualization tools rely heavily on human designers, hindering users' ability to discover. Additionally, for a large grid, building and maintaining pre-designed visual displays is labor-intensive and costly. This project proposes a data-driven approach to overcome these common challenges. The proposed approach relies on developing powerful data manipulation algorithms that create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on the fly: it requires no visualization design stage, eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project was divided into two phases. The first phase (Phase I & II) focused on developing data-driven techniques for visualization of the power grid and its operation; various techniques were investigated, including pattern recognition for auto-generation of one-line diagrams and fuzzy-model-based rich data visualization for situational awareness. The second phase (Phase IIB) focused on enhancing the prototyped data-driven visualization tool based on gathered requirements and use cases, with the goal of evolving it into a commercial-grade product. We use one of the identified application areas as an example to demonstrate how research results from this project were successfully applied to an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven promising for building the next generation of power grid visualization tools. Its application has resulted in a state-of-the-art commercial tool currently used by more than 60 utility organizations in North America and Europe.

  2. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.
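
    As an illustration of the kind of product such a tool ingests, the sketch below reads a time-varying, gridded NetCDF data set with the netCDF4 package and extracts one global snapshot and one per-location time series. The file and variable names are hypothetical, and this is not the Carbon Data Explorer's own ingest code.

```python
from netCDF4 import Dataset  # assumes the netCDF4 package is installed

# Hypothetical file and variable names; any level-III style product with
# (time, lat, lon) dimensions and ascending coordinate axes would look similar.
with Dataset("carbon_flux.nc") as ds:
    lats = ds.variables["lat"][:]
    lons = ds.variables["lon"][:]
    flux = ds.variables["carbon_flux"][:]   # shape: (time, lat, lon)

    first_step = flux[0, :, :]              # one global snapshot
    # Time series at the grid cell nearest to 45 N, 90 W.
    series = flux[:, lats.searchsorted(45.0), lons.searchsorted(-90.0)]
    print(first_step.shape, series.shape)
```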

  3. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis using the Chimera overset-grid method. For modeling viscous flows about geometrically complex bodies in relative motion, the Chimera overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve the equations of flow on the grids, and post-processing flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and post-processing tools (including tools for generating animated images of flows and calculating the forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration; scripts that follow the grid-generation process can then be constructed to largely automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and with a flow-solver program.

  4. A Virtual World of Visualization

    NASA Technical Reports Server (NTRS)

    1998-01-01

    In 1990, Sterling Software, Inc., developed the Flow Analysis Software Toolkit (FAST) for NASA Ames on contract. FAST is a workstation-based modular analysis and visualization tool. It is used to visualize and animate grids and grid-oriented data, typically generated by finite difference, finite element, and other analytical methods. FAST is now available through COSMIC, NASA's software storehouse.

  5. Graphical Contingency Analysis for the Nation's Electric Grid

    ScienceCinema

    Zhenyu (Henry) Huang

    2017-12-09

    PNNL has developed a new tool to manage the electric grid more effectively, helping prevent blackouts and brownouts--and possibly avoiding millions of dollars in fines for system violations. The Graphical Contingency Analysis tool monitors grid performance, shows prioritized lists of problems, provides visualizations of potential consequences, and helps operators identify the most effective courses of action. This technology yields faster, better decisions and a more stable and reliable power grid.

  6. A Unified Overset Grid Generation Graphical Interface and New Concepts on Automatic Gridding Around Surface Discontinuities

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Akien, Edwin (Technical Monitor)

    2002-01-01

    For many years, generation of overset grids for complex configurations has required the use of a number of independently developed software utilities. Results created by each step were then visualized using a separate visualization tool before moving on to the next step. A new software tool called OVERGRID was developed that allows the user to perform all the grid generation steps and visualization in one environment. OVERGRID provides grid diagnostic functions such as surface tangent and normal checks, as well as grid manipulation functions such as extraction, extrapolation, concatenation, redistribution, smoothing, and projection. Moreover, it contains hyperbolic surface and volume grid generation modules that are specifically suited for overset grid generation. This is the first time that such a unified interface has existed for the creation of overset grids for complex geometries. New concepts for automatic overset surface grid generation around surface discontinuities are also briefly presented. Special control curves on the surface, such as intersection curves, sharp edges, and open boundaries, are called seam curves. The seam curves are first automatically extracted from a multiple-panel-network description of the surface. Points where three or more seam curves meet are automatically identified and are called seam corners. Seam corner surface grids are automatically generated using a singular axis topology. Hyperbolic surface grids are then grown from the seam curves, which are automatically trimmed away from the seam corners.
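
    The seam-corner rule above reduces to counting how many seam curves terminate at (nearly) the same point. A minimal sketch of that step, assuming seam curves arrive as polylines of (x, y, z) points; the function name, data layout, and rounding tolerance are illustrative assumptions, not OVERGRID's API.

```python
from collections import defaultdict

def find_seam_corners(seam_curves, decimals=6):
    """Return points where three or more seam curves meet.

    seam_curves: list of polylines, each a sequence of (x, y, z) tuples;
    only a curve's two endpoints can participate in a seam corner.
    """
    counts = defaultdict(int)
    for curve in seam_curves:
        for point in (curve[0], curve[-1]):
            # Round so nearly coincident endpoints hash to the same key.
            counts[tuple(round(c, decimals) for c in point)] += 1
    return [p for p, n in counts.items() if n >= 3]

# Three curves meeting at the origin yield one seam corner.
curves = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0)],
]
print(find_seam_corners(curves))  # [(0.0, 0.0, 0.0)]
```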

  7. Accessing eSDO Solar Image Processing and Visualization through AstroGrid

    NASA Astrophysics Data System (ADS)

    Auden, E.; Dalla, S.

    2008-08-01

    The eSDO project is funded by the UK's Science and Technology Facilities Council (STFC) to integrate Solar Dynamics Observatory (SDO) data, algorithms, and visualization tools with the UK's Virtual Observatory project, AstroGrid. In preparation for the SDO launch in January 2009, the eSDO team has developed nine algorithms covering coronal behaviour, feature recognition, and global / local helioseismology. Each of these algorithms has been deployed as an AstroGrid Common Execution Architecture (CEA) application so that they can be included in complex VO workflows. In addition, the PLASTIC-enabled eSDO "Streaming Tool" online movie application allows users to search multi-instrument solar archives through AstroGrid web services and visualise the image data through galleries, an interactive movie viewing applet, and QuickTime movies generated on-the-fly.

  8. 3D Feature Extraction for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Silver, Deborah

    1996-01-01

    Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.
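
    A minimal sketch of the kind of region extraction described above, assuming cells are given as node-index tuples and two cells are adjacent when they share a node. This is a generic flood fill under an invented threshold rule, not the article's exact algorithm.

```python
from collections import defaultdict, deque

def extract_regions(cells, values, threshold):
    """Group above-threshold cells of an unstructured grid into connected regions.

    cells: list of node-index tuples (one tuple per cell, any element type)
    values: per-cell scalar values
    Two cells are adjacent if they share at least one node.
    """
    # Map each node to the cells that use it.
    node_to_cells = defaultdict(list)
    for ci, cell in enumerate(cells):
        for node in cell:
            node_to_cells[node].append(ci)

    seen, regions = set(), []
    for start in range(len(cells)):
        if start in seen or values[start] < threshold:
            continue
        region, queue = [], deque([start])
        seen.add(start)
        while queue:  # breadth-first flood fill over qualifying neighbors
            ci = queue.popleft()
            region.append(ci)
            for node in cells[ci]:
                for nb in node_to_cells[node]:
                    if nb not in seen and values[nb] >= threshold:
                        seen.add(nb)
                        queue.append(nb)
        regions.append(region)
    return regions

# Two triangles sharing an edge form one region; the third cell is below threshold.
cells = [(0, 1, 2), (1, 2, 3), (4, 5, 6)]
print(extract_regions(cells, values=[1.0, 0.9, 0.2], threshold=0.5))  # [[0, 1]]
```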

  9. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.

    This report describes an initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator, and Mode Meter/Mode Shapes) developed by PNNL, and presents proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  10. The global lambda visualization facility: An international ultra-high-definition wide-area visualization collaboratory

    USGS Publications Warehouse

    Leigh, J.; Renambot, L.; Johnson, Aaron H.; Jeong, B.; Jagodic, R.; Schwarz, N.; Svistula, D.; Singh, R.; Aguilera, J.; Wang, X.; Vishwanath, V.; Lopez, B.; Sandin, D.; Peterka, T.; Girado, J.; Kooima, R.; Ge, J.; Long, L.; Verlo, A.; DeFanti, T.A.; Brown, M.; Cox, D.; Patterson, R.; Dorn, P.; Wefel, P.; Levy, S.; Talandis, J.; Reitzer, J.; Prudhomme, T.; Coffin, T.; Davis, B.; Wielinga, P.; Stolk, B.; Bum, Koo G.; Kim, J.; Han, S.; Corrie, B.; Zimmerman, T.; Boulanger, P.; Garcia, M.

    2006-01-01

    The research outlined in this paper marks an initial global cooperative effort between visualization and collaboration researchers to build a persistent virtual visualization facility linked by ultra-high-speed optical networks. The goal is to enable the comprehensive and synergistic research and development of the necessary hardware, software and interaction techniques to realize the next generation of end-user tools for scientists to collaborate on the global Lambda Grid. This paper outlines some of the visualization research projects that were demonstrated at the iGrid 2005 workshop in San Diego, California.

  11. Flowfield computer graphics

    NASA Technical Reports Server (NTRS)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and its flow-field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flow Analysis Software Toolkit (FAST), for both the adaptive grid code (SAGE) and the flow-field ray tracing code (CISS).

  12. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory, and modeling, and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network-available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure, which allows for secure authentication and resource authorization and lets stakeholders control their own resources such as computers, data, and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from implementation details so that transparency and ease of use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring, and management have been developed and shared among a wide user base. This approach saves user sites the laborious effort of maintaining such a large and complex code while reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool, and capabilities for sharing displays and analysis tools over local and wide-area networks.

  13. An Integrated Software Package to Enable Predictive Simulation Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang

    The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new software package that integrates HPC applications and a web-based visualization tool on top of a middleware framework that supports data communication between the different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.

  14. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction in the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented along the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  15. Accessing and Visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, Bruce G.; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids. These tools do one or more of the following tasks: visualize local data sets for local users, visualize local data sets for remote users, and access and visualize remote data sets. The tools are used for various types of data, including remotely sensed image data, digital elevation models, astronomical surveys, etc. The paper attempts to pull some common elements out of these tools that may be useful for others who have to work with similarly large data sets.

  16. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, the European infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access to, and analysis of, observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and a stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  17. Adapting the iSNOBAL model for improved visualization in a GIS environment

    NASA Astrophysics Data System (ADS)

    Johansen, W. J.; Delparte, D.

    2014-12-01

    Snowmelt is a primary source of crucial water resources in much of the western United States. Researchers are developing models that estimate snowmelt to aid in water resource management. One such model is the image snowcover energy and mass balance (iSNOBAL) model, which uses input climate grids to simulate the development and melting of snowpack in mountainous regions. This study applies the model to the Reynolds Creek Experimental Watershed in southwestern Idaho, using novel approaches that incorporate geographic information systems (GIS). To improve visualization of the iSNOBAL model, we have adapted it to run in a GIS environment, which is suited to both input grid creation and visualization of results. The data used for input grid creation can be stored locally or on a web server. Kriging interpolation embedded within Python scripts is used to create air temperature, soil temperature, humidity, and precipitation grids, while built-in GIS and existing tools are used to create solar radiation and wind grids. Additional Python scripting is then used to perform the model calculations. The final product is a user-friendly and accessible version of the iSNOBAL model, including the ability to easily visualize and interact with model results, all within a web- or desktop-based GIS environment. This environment allows interactive manipulation of model parameters and visualization of the resulting input grids for the model calculations. Future work is moving toward adapting the model for use in a 3D gaming engine for improved visualization and interaction.
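
    A hedged sketch of that gridding step, using the third-party pykrige package as a stand-in for the kriging embedded in the project's own Python scripts (which are not shown). The station coordinates and temperatures are invented.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # stand-in; not the paper's actual scripts

# Invented station observations: x (easting), y (northing), air temperature (C).
x = np.array([100.0, 250.0, 400.0, 300.0])
y = np.array([150.0, 300.0, 100.0, 450.0])
temp = np.array([2.1, -0.5, 3.4, -1.2])

# Ordinary kriging with a linear variogram, evaluated on a regular grid,
# mirroring how an iSNOBAL input grid (air temperature here) could be built.
ok = OrdinaryKriging(x, y, temp, variogram_model="linear")
gridx = np.arange(0.0, 500.0, 50.0)
gridy = np.arange(0.0, 500.0, 50.0)
zgrid, variance = ok.execute("grid", gridx, gridy)
print(zgrid.shape)  # (10, 10) interpolated temperature grid
```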

  18. Hydrological Scenario Using Tools and Applications Available in enviroGRIDS Portal

    NASA Astrophysics Data System (ADS)

    Bacu, V.; Mihon, D.; Stefanut, T.; Rodila, D.; Cau, P.; Manca, S.; Soru, C.; Gorgan, D.

    2012-04-01

    Nowadays, not only decision makers but also citizens are concerned with the sustainability and vulnerability of land management practices in various respects, in particular their effects on water quality and quantity in complex watersheds. The Black Sea Catchment is an important watershed in Central and Eastern Europe. The FP7 project enviroGRIDS [1] developed a Web Portal that incorporates different tools and applications focused on geospatial data management; hydrologic model calibration, execution, and visualization; and training activities. This presentation highlights, from the end-user point of view, the scenario related to hydrological models using the tools and applications available in the enviroGRIDS Web Portal [2]. The development of SWAT (Soil and Water Assessment Tool) hydrological models is a well-known procedure for hydrological specialists [3]. Starting from the primary data (information related to weather, soil properties, topography, vegetation, and land management practices of the particular watershed) that are used to develop SWAT hydrological models, through to specific reports about the water quality in the studied watershed, the hydrological specialist will use different applications available in the enviroGRIDS portal. The tools and applications available through the enviroGRIDS portal do not deal with building up the SWAT hydrological models themselves. They are mainly focused on: the calibration procedure (gSWAT [4]), which uses the Grid computational infrastructure to speed up the calibration process; development of specific scenarios (BASHYT [5]), which starts from an already calibrated SWAT hydrological model and defines new scenarios; execution of scenarios (gSWATSim [6]), which executes the scenarios exported from BASHYT; and visualization (BASHYT), which displays charts, tables, and maps. Each application is built up as a stack of functional layers. We combine different layers of applications by vertical interoperability in order to build the desired complex functionality. On the other hand, the applications can collaborate at the same architectural levels, which represents horizontal interoperability. Both horizontal and vertical interoperability are accomplished by services and by exchanging data. The calibration procedure requires huge computational resources, which are provided by the Grid infrastructure. On the other hand, scenario development through BASHYT requires a flexible way of interacting with the SWAT model in order to easily change the input model. The large user community of SWAT, from the enviroGRIDS consortium or outside, may greatly benefit from the tools and applications related to the calibration process, scenario development, and execution available in the enviroGRIDS portal. [1] enviroGRIDS project, http://envirogrids.net/ [2] Gorgan D., Abbaspour K., Cau P., Bacu V., Mihon D., Giuliani G., Ray N., Lehmann A., Grid Based Data Processing Tools and Applications for Black Sea Catchment Basin. IDAACS 2011 - The 6th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications, 15-17 September 2011, Prague. IEEE Computer Press, pp. 223-228 (2011). [3] Soil and Water Assessment Tool, http://www.brc.tamus.edu/swat/index.html [4] Bacu V., Mihon D., Rodila D., Stefanut T., Gorgan D., Grid Based Architectural Components for SWAT Model Calibration. HPCS 2011 - International Conference on High Performance Computing and Simulation, 4-8 July 2011, Istanbul, Turkey, ISBN 978-1-61284-381-0, doi: 10.1109/HPCSim.2011.5999824, pp. 193-198 (2011). [5] Manca S., Soru C., Cau P., Meloni G., Fiori M., A multi model and multiscale, GIS oriented Web framework based on the SWAT model to face issues of water and soil resource vulnerability. Presentation at the 5th International SWAT Conference, August 3-7, 2009, http://www.brc.tamus.edu/swat/4thswatconf/docs/rooma/session5/Cau-Bashyt.pdf [6] Bacu V., Mihon D., Stefanut T., Rodila D., Gorgan D., Cau P., Manca S., Grid Based Services and Tools for Hydrological Model Processing and Visualization. SYNASC 2011 - 13th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (in press).

  19. Repertory Grid Technique in Early Childhood as a Tool for Reflective Conversations in Arts Education.

    ERIC Educational Resources Information Center

    Karppinen, Seija

    The repertory grid is a technique used in psychological and behavioral studies to elicit individuals' personal constructs. This pilot study examined the feasibility of using this technique with young children in early childhood education settings with regard to visual art education, based on the view that listening to, and appreciating children's…

  20. New Multibeam Bathymetry Mosaic at NOAA/NCEI

    NASA Astrophysics Data System (ADS)

    Varner, J. D.; Cartwright, J.; Rosenberg, A. M.; Amante, C.; Sutherland, M.; Jencks, J. H.

    2017-12-01

    NOAA's National Centers for Environmental Information (NCEI) maintains an ever-growing archive of multibeam bathymetric data acquired from U.S. and international government and academic sources. The data are partitioned into the individual survey files in which they were originally received, and are stored in various formats not directly accessible by popular analysis and visualization tools. In order to improve the discoverability and accessibility of the data, NCEI created a new Multibeam Bathymetry Mosaic. Each survey was gridded at a 3-arc-second cell size and organized in an ArcGIS mosaic dataset, which was published as a set of standards-based web services usable in desktop GIS and web clients. In addition to providing a "seamless" grid of all surveys, a filter can be applied to isolate individual surveys. Both depth values in meters and shaded-relief visualizations are available. The product represents the current state of the archive; no QA/QC was performed on the data before incorporation, and the mosaic will be updated incrementally as new surveys are added to the archive. We expect the mosaic will address customer needs for visualization and extraction that existing tools (e.g., NCEI's AutoGrid) are unable to meet, and will also assist data managers in identifying problem surveys, missing data, quality control issues, etc. This project complements existing efforts such as the Global Multi-Resolution Topography Data Synthesis (GMRT) at LDEO. Comprehensive visual displays of bathymetric data holdings are invaluable tools for seafloor mapping initiatives, such as Seabed 2030, that will aid in minimizing data collection redundancies and ensuring that valuable data are made available to the broadest community.
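
    A toy sketch of the gridding step, assuming point soundings arrive as lon/lat/depth arrays and using a mean-depth aggregation per 3-arc-second cell; NCEI's actual gridding pipeline is not described beyond the cell size, so the function below is illustrative only.

```python
import numpy as np

def grid_soundings(lon, lat, depth, cell=3.0 / 3600.0):
    """Average point soundings onto a regular grid; cell size is in degrees
    (3 arc-seconds = 3/3600 degree). Returns the grid and its origin."""
    lon0, lat0 = lon.min(), lat.min()
    col = ((lon - lon0) / cell).astype(int)
    row = ((lat - lat0) / cell).astype(int)
    total = np.zeros((row.max() + 1, col.max() + 1))
    count = np.zeros_like(total)
    np.add.at(total, (row, col), depth)   # accumulate depth sums per cell
    np.add.at(count, (row, col), 1)       # and the number of soundings
    mean = np.where(count > 0, total / np.maximum(count, 1), np.nan)
    return mean, (lon0, lat0)

# Three invented soundings; the first two fall in the same 3-arc-second cell.
lon = np.array([-70.00010, -70.00015, -69.99900])
lat = np.array([41.00010, 41.00012, 41.00080])
grid, origin = grid_soundings(lon, lat, np.array([-120.0, -122.0, -98.0]))
print(grid.shape)   # small grid covering the soundings' bounding box
```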

  21. MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, C. M.; Boyle, K. L.; Reagan, M.

    2013-09-30

    Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and those that do have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi-tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
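
    The core idea can be illustrated with scipy.spatial.Voronoi rather than MeshVoro itself (whose API is not shown): each seed point becomes a grid block, and each Voronoi ridge is the interface between two neighboring blocks, which is the connection information a TOUGH-style integral-finite-difference code consumes. A 2-D toy example:

```python
import numpy as np
from scipy.spatial import Voronoi

# Seed points play the role of grid-block centers; the Voronoi cell around
# each seed becomes one (unstructured) simulation block.
rng = np.random.default_rng(0)
seeds = rng.uniform(0.0, 100.0, size=(20, 2))   # 2-D for brevity; Voro is 3-D
vor = Voronoi(seeds)

# Each ridge is the face shared by two neighboring blocks -- exactly the
# kind of connection list an integral-finite-difference code needs.
for (a, b) in vor.ridge_points[:5]:
    print(f"block {a} <-> block {b}")
```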

  22. Visualization of GPM Standard Products at the Precipitation Processing System (PPS)

    NASA Astrophysics Data System (ADS)

    Kelley, O.

    2010-12-01

    Many of the standard data products for the Global Precipitation Measurement (GPM) constellation of satellites will be generated at and distributed by the Precipitation Processing System (PPS) at NASA Goddard. PPS will provide several means to visualize these data products. These visualization tools will be used internally by PPS analysts to investigate potential anomalies in the data files, and these tools will also be made available to researchers. Currently, a free data viewer called THOR, the Tool for High-resolution Observation Review, can be downloaded and installed on Linux, Windows, and Mac OS X systems. THOR can display swath and grid products, and to a limited degree, the low-level data packets that the satellite itself transmits to the ground system. Observations collected since the 1997 launch of the Tropical Rainfall Measuring Mission (TRMM) satellite can be downloaded from the PPS FTP archive, and in the future, many of the GPM standard products will also be available from this FTP site. To provide easy access to this 80 terabyte and growing archive, PPS currently operates an on-line ordering tool called STORM that provides geographic and time searches, browse-image display, and the ability to order user-specified subsets of standard data files. Prior to the anticipated 2013 launch of the GPM core satellite, PPS will expand its visualization tools by integrating an on-line version of THOR within STORM to provide on-the-fly image creation of any portion of an archived data file at a user-specified degree of magnification. PPS will also provide OpenDAP access to the data archive and OGC WMS image creation of both swath and gridded data products. During the GPM era, PPS will continue to provide realtime globally-gridded 3-hour rainfall estimates to the public in a compact binary format (3B42RT) and in a GIS format (2-byte TIFF images + ESRI WorldFiles).
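
    As a small aside on the GIS format mentioned above (2-byte TIFF plus ESRI WorldFile), the standard six-line world file fully determines the pixel-to-geographic mapping. A sketch, with an invented file and grid geometry:

```python
def read_world_file(path):
    """Parse the standard six-line ESRI world file: x pixel size, row rotation,
    column rotation, y pixel size (negative for north-up), and the x, y
    coordinates of the center of the upper-left pixel."""
    with open(path) as f:
        a, d, b, e, c, f_ = (float(line) for line in f)
    return a, d, b, e, c, f_

def pixel_to_geo(col, row, params):
    """Apply the world-file affine transform to a pixel (col, row)."""
    a, d, b, e, c, f_ = params
    return a * col + b * row + c, d * col + e * row + f_

# Hypothetical parameters for a global 0.25-degree rainfall grid.
params = (0.25, 0.0, 0.0, -0.25, -179.875, 59.875)
print(pixel_to_geo(0, 0, params))  # (-179.875, 59.875): upper-left pixel center
```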

  23. Singularity classification as a design tool for multiblock grids

    NASA Technical Reports Server (NTRS)

    Jones, Alan K.

    1992-01-01

    A major stumbling block in interactive design of 3-D multiblock grids is the difficulty of visualizing the design as a whole. One way to make this visualization task easier is to focus, at least in early design stages, on an aspect of the grid which is inherently easy to present graphically and to conceptualize mentally, namely the nature and location of singularities in the grid. The topological behavior of a multiblock grid design is determined by what happens at its edges and vertices. Only a few of these are in any way exceptional. The exceptional behaviors lie along a singularity graph, which is a 1-D construct embedded in 3-D space. The varieties of singular behavior are limited enough to make useful symbology on a graphics device possible. Furthermore, some forms of block design manipulation that appear appropriate to the early conceptual-modeling phase can be accomplished at this level of abstraction. An overview of a proposed singularity classification scheme and selected examples of corresponding manipulation techniques are presented.

  24. Association rule mining on grid monitoring data to detect error sources

    NASA Astrophysics Data System (ADS)

    Maier, Gerhild; Schiffers, Michael; Kranzlmueller, Dieter; Gaidioz, Benjamin

    2010-04-01

    Error handling is a crucial task in an infrastructure as complex as a grid. Several monitoring tools are in place that report failing grid jobs, including their exit codes. However, the exit codes do not always denote the actual fault that caused the job failure. Human time and knowledge are required to manually trace errors back to the real underlying fault. We perform association rule mining on grid job monitoring data to automatically retrieve knowledge about the grid components' behavior by taking dependencies between grid job characteristics into account. In this way, problematic grid components are located automatically, and this information, expressed as association rules, is visualized in a web interface. This work decreases the time needed for fault recovery and improves the grid's reliability.
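
    A self-contained toy version of the mining step: compute support and confidence for one-to-one rules over invented job-monitoring records. The attribute names and thresholds are illustrative; the paper's actual attributes and mining implementation are not given.

```python
from itertools import combinations
from collections import Counter

# Invented monitoring records: one set of attributes per grid job.
jobs = [
    {"site=CERN", "exit=137", "status=failed"},
    {"site=CERN", "exit=137", "status=failed"},
    {"site=FNAL", "exit=0", "status=ok"},
    {"site=CERN", "exit=1", "status=failed"},
]

def rules(transactions, min_support=0.4, min_confidence=0.8):
    """Yield (antecedent, consequent, support, confidence) one-to-one rules."""
    n = len(transactions)
    item_count = Counter(i for t in transactions for i in t)
    pair_count = Counter(p for t in transactions
                         for p in combinations(sorted(t), 2))
    for (a, b), c in pair_count.items():
        for ant, con in ((a, b), (b, a)):
            support = c / n
            confidence = c / item_count[ant]
            if support >= min_support and confidence >= min_confidence:
                yield ant, con, support, confidence

for rule in rules(jobs):
    print(rule)  # e.g. ('exit=137', 'site=CERN', 0.5, 1.0) points at a component
```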

  25. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but little work has been published on how to quickly post-process the large volume of contingency outputs. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool, helping power engineers improve their work efficiency through faster information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
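
    A minimal sketch of such parallel post-processing with Python's multiprocessing module. The per-contingency file layout and the voltage-limit rule are invented for illustration and are not from the paper.

```python
import glob
from multiprocessing import Pool

LOW, HIGH = 0.95, 1.05  # illustrative per-unit voltage limits

def scan_contingency(path):
    """Return (file, buses violating voltage limits) for one contingency."""
    violations = []
    with open(path) as f:
        for line in f:
            bus, volt = line.split()      # hypothetical "bus_id voltage_pu" rows
            if not LOW <= float(volt) <= HIGH:
                violations.append(bus)
    return path, violations

if __name__ == "__main__":
    files = glob.glob("ca_output/ctg_*.txt")  # one output file per contingency
    with Pool() as pool:                      # one worker per CPU core by default
        for path, bad in pool.map(scan_contingency, files):
            if bad:
                print(f"{path}: {len(bad)} voltage violations")
```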

  26. Automatic structured grid generation using Gridgen (some restrictions apply)

    NASA Technical Reports Server (NTRS)

    Chawner, John R.; Steinbrenner, John P.

    1995-01-01

    The authors have noticed in the recent grid generation literature an emphasis on the automation of structured grid generation. The motivation behind such work is clear; grid generation is easily the most despised task in the grid-analyze-visualize triad of computational analysis (CA). However, because grid generation is closely coupled to both the design and analysis software and because quantitative measures of grid quality are lacking, 'push button' grid generation usually results in a compromise between speed, control, and quality. Overt emphasis on automation obscures the substantive issues of providing users with flexible tools for generating and modifying high quality grids in a design environment. In support of this paper's tongue-in-cheek title, many features of the Gridgen software are described. Gridgen is by no stretch of the imagination an automatic grid generator. Despite this fact, the code does utilize many automation techniques that permit interesting regenerative features.

  27. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data and images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data, and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.
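
    Loading WMS content, as mentioned above, comes down to issuing standard OGC requests. A sketch that composes a WMS 1.1.1 GetMap URL; the endpoint and layer name are hypothetical, not GeoMapApp's.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512):
    """Build a standard OGC WMS 1.1.1 GetMap URL for a lon/lat bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

# Hypothetical server and layer name.
print(wms_getmap_url("https://example.org/wms", "bathymetry",
                     (-75.0, 35.0, -65.0, 45.0)))
```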

  28. AWE: Aviation Weather Data Visualization

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Lodha, Suresh K.

    2001-01-01

    The two official sources for aviation weather reports both require the pilot to mentally visualize the provided information. In contrast, our system, the Aviation Weather Environment (AWE), presents aviation-specific weather to pilots in an easy-to-visualize form. We start with a computer-generated textual briefing for a specific area. We map this briefing onto a grid specific to the pilot's route that includes only information relevant to his flight as defined by route, altitude, true airspeed, and proposed departure time. By modifying various parameters, the pilot can use AWE as a planning tool as well as a weather briefing tool.

  29. Evaluation of Statistical Methodologies Used in U. S. Army Ordnance and Explosive Work

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, G

    2000-02-14

    Oak Ridge National Laboratory was tasked by the U.S. Army Engineering and Support Center (Huntsville, AL) to evaluate the mathematical basis of existing software tools used to assist the Army with the characterization of sites potentially contaminated with unexploded ordnance (UXO). These software tools are collectively known as SiteStats/GridStats. The first purpose of the software is to guide sampling of underground anomalies to estimate a site's UXO density. The second purpose is to delineate areas of homogeneous UXO density that can be used in the formulation of response actions. It was found that SiteStats/GridStats does adequately guide the sampling so that the UXO density estimator for a sector is unbiased. However, the software's techniques for delineating homogeneous areas perform less well than visual inspection, which is frequently used to override the software in the overall sectorization methodology. The main problems with the software lie in the criteria used to detect nonhomogeneity and those used to recommend the number of homogeneous subareas. SiteStats/GridStats is not a decision-making tool in the classical sense. Although it provides information to decision makers, it does not require a decision based on that information. SiteStats/GridStats provides information that is supplemented by visual inspections, land-use plans, and risk estimates prior to making any decisions. Although the sector UXO density estimator is unbiased regardless of UXO density variation within a sector, its variability increases with increased sector density variation. For this reason, the current practice of visual inspection of individual sampled grid densities (as provided by SiteStats/GridStats) is necessary to ensure approximate homogeneity, particularly at sites with medium to high UXO density. Together with SiteStats/GridStats override capabilities, this provides a sufficient mechanism for homogeneous sectorization and thus yields representative UXO density estimates. Objections raised by various parties to the use of a numerical "discriminator" in SiteStats/GridStats were likely due to the fact that the statistical technique concerned is customarily applied for a different purpose, and due to poor documentation. The "discriminator" in SiteStats/GridStats is a "tuning parameter" for the sampling process, and it affects the precision of the grid density estimates through changes in the required sample size. It is recommended that sector characterization in terms of a map showing contour lines of constant UXO density, with an expressed uncertainty or confidence level, is a better basis for remediation decisions than a sector UXO density point estimate. A number of spatial density estimation techniques could be adapted to the UXO density estimation problem.

  30. Early detection of glaucoma by means of a novel 3D computer‐automated visual field test

    PubMed Central

    Nazemi, Paul P; Fink, Wolfgang; Sadun, Alfredo A; Francis, Brian; Minckler, Donald

    2007-01-01

    Purpose A recently devised 3D computer‐automated threshold Amsler grid test was used to identify early and distinctive defects in people with suspected glaucoma. Further, the location, shape and depth of these field defects were characterised. Finally, the visual fields were compared with those obtained by standard automated perimetry. Patients and methods Glaucoma suspects were defined as those having elevated intraocular pressure (>21 mm Hg) or cup‐to‐disc ratio of >0.5. 33 patients and 66 eyes with risk factors for glaucoma were examined. 15 patients and 23 eyes with no risk factors were tested as controls. The recently developed 3D computer‐automated threshold Amsler grid test was used. The test exhibits a grid on a computer screen at a preselected greyscale and angular resolution, and allows patients to trace those areas on the grid that are missing in their visual field using a touch screen. The 5‐minute test required that the patients repeatedly outline scotomas on a touch screen with varied displays of contrast while maintaining their gaze on a central fixation marker. A 3D depiction of the visual field defects was then obtained that was further characterised by the location, shape and depth of the scotomas. The exam was repeated three times per eye. The results were compared to Humphrey visual field tests (ie, achromatic standard or SITA standard 30‐2 or 24‐2). Results In this pilot study 79% of the eyes tested in the glaucoma‐suspect group repeatedly demonstrated visual field loss with the 3D perimetry. The 3D depictions of visual field loss associated with these risk factors were all characteristic of or compatible with glaucoma. 71% of the eyes demonstrated arcuate defects or a nasal step. Constricted visual fields were shown in 29% of the eyes. No visual field changes were detected in the control group. Conclusions The 3D computer‐automated threshold Amsler grid test may demonstrate visual field abnormalities characteristic of glaucoma in glaucoma suspects with normal achromatic Humphrey visual field testing. This test may be used as a screening tool for the early detection of glaucoma. PMID:17504855

  31. VisIVO: A Library and Integrated Tools for Large Astrophysical Dataset Exploration

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Costa, A.; Ersotelos, N.; Krokos, M.; Massimino, P.; Petta, C.; Vitello, F.

    2012-09-01

    VisIVO provides an integrated suite of tools and services that can be used in many scientific fields. VisIVO's development started within the Virtual Observatory framework. VisIVO allows users to meaningfully visualize highly complex, large-scale datasets and to create movies of these visualizations based on distributed infrastructures. VisIVO supports high-performance, multi-dimensional visualization of large-scale astrophysical datasets. Users can rapidly obtain meaningful visualizations while preserving full and intuitive control of the relevant parameters. VisIVO consists of VisIVO Desktop, a stand-alone application for interactive visualization on standard PCs; VisIVO Server, a platform for high-performance visualization; VisIVO Web, a custom-designed web portal; VisIVOSmartphone, an application that exploits the VisIVO Server functionality; and the latest VisIVO feature, the VisIVO Library, which allows a job running on a computational system (grid, HPC, etc.) to produce movies directly from the code's internal data arrays without the need to produce intermediate files. This is particularly important when running on large computational facilities, where the user wants to look at the results during the data production phase. For example, in grid computing facilities, images can be produced directly in the grid catalogue while the user code is running on a system that cannot be directly accessed by the user (a worker node). The deployment of VisIVO on the DG and gLite is carried out with the support of the EDGI and EGI-InSPIRE projects. Depending on the structure and size of the datasets under consideration, the data exploration process can take several hours of CPU time to create customized views, and the production of movies can potentially last several days. For this reason, an MPI-parallel version of VisIVO could play a fundamental role in increasing performance; for example, it could be automatically deployed on nodes that are MPI-aware. A central concept in our development is thus to produce unified code that can run either on serial nodes or in parallel using HPC-oriented grid nodes. Another important aspect, to obtain as high performance as possible, is the integration of VisIVO processes with grid nodes where GPUs are available. We have selected CUDA for implementing a range of computationally heavy modules. VisIVO is supported by the EGI-InSPIRE, EDGI, and SCI-BUS projects.

  32. Cyberinfrastructure for End-to-End Environmental Explorations

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Kumar, S.; Song, C.; Zhao, L.; Govindaraju, R.; Niyogi, D.

    2007-12-01

    The design and implementation of a cyberinfrastructure for End-to-End Environmental Exploration (C4E4) is presented. The C4E4 framework addresses the need for an integrated data/computation platform for studying broad environmental impacts by combining heterogeneous data resources with state-of-the-art modeling and visualization tools. With Purdue being a TeraGrid Resource Provider, C4E4 builds on top of the Purdue TeraGrid data management system and Grid resources, and integrates them through a service-oriented workflow system. It allows researchers to construct environmental workflows for data discovery, access, transformation, modeling, and visualization. Using the C4E4 framework, we have implemented an end-to-end SWAT simulation and analysis workflow that connects our TeraGrid data and computation resources. It enables researchers to conduct comprehensive studies on the impact of land management practices in the St. Joseph watershed using data from various sources in hydrologic, atmospheric, agricultural, and other related disciplines.

  33. The Overgrid Interface for Computational Simulations on Overset Grids

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    Computational simulations using overset grids typically involve multiple steps and a variety of software modules. A graphical interface called OVERGRID has been specially designed for such purposes. Data required and created by the different steps include geometry, grids, domain connectivity information, and flow solver input parameters. The interface provides a unified environment for the visualization, processing, generation, and diagnosis of such data. General modules are available for the manipulation of structured grids and unstructured surface triangulations. Modules more specific to the overset approach include surface curve generators, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, Cartesian box grid generators, and domain connectivity pre-processing tools. An interface provides automatic selection and viewing of flow solver boundary conditions and various other flow solver inputs. For problems involving multiple components in relative motion, a module is available to build the component/grid relationships and to prescribe and animate the dynamics of the different components.

  34. AstroGrid: Taverna in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on AstroGrid's implementation of the Taverna workbench, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data and requires computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.

  35. Grid Visualization Tool

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steven

    2005-01-01

    The Grid Visualization Tool (GVT) is a computer program for displaying the path of a mobile robotic explorer (rover) on a terrain map. The GVT reads a map-data file in either portable graymap (PGM) or portable pixmap (PPM) format, representing a gray-scale or color map image, respectively. The GVT also accepts input from path-planning and activity-planning software. From these inputs, the GVT generates a map overlaid with one or more rover path(s), waypoints, locations of targets to be explored, and/or target-status information (indicating success or failure in exploring each target). The display can also indicate different types of paths or path segments, such as the path actually traveled versus a planned path or the path traveled to the present position versus planned future movement along a path. The program provides for updating of the display in real time to facilitate visualization of progress. The size of the display and the map scale can be changed as desired by the user. The GVT was written in the C++ language using the Open Graphics Library (OpenGL) software. It has been compiled for both Sun Solaris and Linux operating systems.
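
    The map-data formats GVT reads are simple. Below is a minimal reader for the binary (P5) PGM variant, assuming a comment-free header and a maxval of at most 255 so each pixel is one byte; this is a generic PGM parser written for illustration, not GVT's code.

```python
def read_pgm(path):
    """Minimal reader for binary (P5) PGM images.

    Assumes a comment-free header and maxval <= 255 (one byte per pixel),
    with the pixel block occupying the last width*height bytes of the file.
    """
    with open(path, "rb") as f:
        data = f.read()
    magic, width, height, maxval = data.split(None, 4)[:4]
    assert magic == b"P5" and int(maxval) <= 255
    w, h = int(width), int(height)
    pixels = data[-w * h:]                    # raw gray values, row-major
    return [list(pixels[r * w:(r + 1) * w]) for r in range(h)]

# Round-trip a tiny 2x2 demo image.
with open("demo.pgm", "wb") as f:
    f.write(b"P5\n2 2\n255\n" + bytes([0, 64, 128, 255]))
print(read_pgm("demo.pgm"))  # [[0, 64], [128, 255]]
```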

  36. Monitoring Areal Snow Cover Using NASA Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Harshburger, Brian J.; Blandford, Troy; Moore, Brandon

    2011-01-01

    The objective of this project is to develop products and tools to assist in the hydrologic modeling process, including tools to help prepare inputs for hydrologic models and improved methods for the visualization of streamflow forecasts. In addition, this project will facilitate the use of NASA satellite imagery (primarily snow cover imagery) by other federal and state agencies with operational streamflow forecasting responsibilities. A GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts is being developed. This toolkit will be packaged as multiple extensions for ArcGIS 9.x and an open-source GIS software package. The toolkit will provide users with a means for ingesting NASA EOS satellite imagery (snow cover analysis), preparing hydrologic model inputs, and visualizing streamflow forecasts. Primary products include a software tool for predicting the presence of snow under clouds in satellite images; a software tool for producing gridded temperature and precipitation forecasts; and a suite of tools for visualizing hydrologic model forecasting results. The toolkit will be an expert system designed for operational users who need to generate accurate streamflow forecasts in a timely manner. The Remote Sensing of Snow Cover Toolbar will ingest snow cover imagery from multiple sources, including the MODIS operational snow cover data, and convert it to gridded datasets that can be readily used. Statistical techniques will then be applied to the gridded snow cover data to predict the presence of snow under cloud cover. The toolbar can ingest both binary and fractional snow cover data: binary mapping techniques use a set of thresholds to determine whether a pixel contains snow or no snow, while fractional mapping techniques provide information on the percentage of each pixel that is covered with snow. After the imagery has been ingested, physiographic data are attached to each cell in the snow cover image; these data can be obtained from a digital elevation model (DEM) for the area of interest.
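
    A toy numpy sketch of the snow-under-cloud idea: classify cloud-obscured pixels by comparing DEM elevations against a snowline estimated from the clear pixels. The classification rule here is invented for illustration; the project's actual statistical techniques are not specified.

```python
import numpy as np

# Codes in a toy snow-cover image: 0 = no snow, 1 = snow, 2 = cloud.
snow = np.array([[1, 1, 2],
                 [0, 2, 1],
                 [0, 0, 2]])
dem = np.array([[2900, 2700, 2800],     # elevations (m) from a DEM
                [1400, 2600, 2750],
                [1200, 1100, 1500]])

# Invented physiographic rule: estimate the snowline as the midpoint between
# the mean elevation of clear snow pixels and clear snow-free pixels, then
# classify each cloud pixel by its elevation relative to that snowline.
snowline = 0.5 * (dem[snow == 1].mean() + dem[snow == 0].mean())
filled = snow.copy()
filled[(snow == 2) & (dem >= snowline)] = 1
filled[(snow == 2) & (dem < snowline)] = 0
print(f"snowline ~ {snowline:.0f} m")
print(filled)
```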

  37. Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT

    NASA Technical Reports Server (NTRS)

    Maxwell, Thomas

    2012-01-01

    Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UV-CDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high-performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.

  38. An Advanced User Interface Approach for Complex Parameter Study Process Specification in the Information Power Grid

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob; Yan, Jerry C. (Technical Monitor)

    2000-01-01

    The creation of parameter study suites has recently become a more challenging problem as the parameter studies have now become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are now seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers great resource opportunity but at the expense of great difficulty of use. We present an approach to this problem which stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.

  19. Geographic Visualization of Power-Grid Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.

    2015-06-18

    The visualization enables the simulation analyst to see changes in the frequency through time and space. With this technology, the analyst has a bird's eye view of the frequency at loads and generators as the simulated power system responds to the loss of a generator, spikes in load, and other contingencies. The significance of a contingency to the operation of an electrical power system depends critically on how the resulting transients evolve in time and space. Consequently, these dynamic events can only be understood when seen in their proper geographic context. This understanding is indispensable to engineers working on the next generation of distributed sensing and control systems for the smart grid. By making possible a natural and intuitive presentation of dynamic behavior, our new visualization technology is a situational-awareness tool for power-system engineers.

  20. Synergy Between Archives, VO, and the Grid at ESAC

    NASA Astrophysics Data System (ADS)

    Arviset, C.; Alvarez, R.; Gabriel, C.; Osuna, P.; Ott, S.

    2011-07-01

    Over the years, in support of the Science Operations Centers at ESAC, we have set up two Grid infrastructures. These have been built: 1) to facilitate daily research for scientists at ESAC, 2) to provide high computing capabilities for project data processing pipelines (e.g., Herschel), and 3) to support science operations activities (e.g., calibration monitoring). Furthermore, closer collaboration between the science archives, the Virtual Observatory (VO), and data processing activities has led to another Grid use case: the Remote Interface to XMM-Newton SAS Analysis (RISA). This web service-based system allows users to launch SAS tasks transparently on the Grid, save results on HTTP-based storage, and visualize them through VO tools. This paper presents real and operational use cases of Grid usage in these contexts.

  1. Wide-area situation awareness in electric power grid

    NASA Astrophysics Data System (ADS)

    Greitzer, Frank L.

    2010-04-01

    Two primary elements of US energy policy are demand management and efficiency, and renewable sources. Major objectives are clean energy transmission and integration, reliable energy transmission, and grid cyber security. Development of the Smart Grid seeks to achieve these goals by lowering energy costs for consumers, achieving energy independence, and reducing greenhouse gas emissions. The Smart Grid is expected to enable real-time wide-area situation awareness (SA) for operators. Requirements for wide-area SA have been identified among interoperability standards proposed by the Federal Energy Regulatory Commission and the National Institute of Standards and Technology to ensure smart-grid functionality. Wide-area SA and enhanced decision support and visualization tools are key elements in the transformation to the Smart Grid. This paper discusses human factors research to promote SA in the electric power grid and the Smart Grid. Topics discussed include the role of human factors in meeting US energy policy goals, the impact and challenges for Smart Grid development, and cyber security challenges.

  2. Discovery of Marine Datasets and Geospatial Metadata Visualization

    NASA Astrophysics Data System (ADS)

    Schwehr, K. D.; Brennan, R. T.; Sellars, J.; Smith, S.

    2009-12-01

    NOAA's National Geophysical Data Center (NGDC) provides the deep archive of US multibeam sonar hydrographic surveys. NOAA stores the data as Bathymetric Attributed Grids (BAG; http://www.opennavsurf.org/), which are HDF5-formatted files containing gridded bathymetry, gridded uncertainty, and XML metadata. While NGDC provides the deep store and a basic ESRI ArcIMS interface to the data, additional tools need to be created to increase the frequency with which researchers discover hydrographic surveys that might be beneficial for their research. Using open-source tools, we have created a draft of a Google Earth visualization of NOAA's complete collection of BAG files as of March 2009. Each survey is represented as a bounding box, an optional preview image of the survey data, and a pop-up placemark. The placemark contains a brief summary of the metadata and links to directly download the BAG survey files and the complete metadata file. Each survey is time-tagged so that users can search both in space and time for surveys that meet their needs. By creating this visualization, we aim to make the entire process of data discovery, validation of relevance, and download much more efficient for research scientists who may not be familiar with NOAA's hydrographic survey efforts or the BAG format. In the process of creating this demonstration, we have identified a number of improvements that can be made to the hydrographic survey process in order to make the results easier to use, especially with respect to metadata generation. With the combination of the NGDC deep archiving infrastructure, a Google Earth virtual globe visualization, and GeoRSS feeds of updates, we hope to increase the utilization of these high-quality gridded bathymetry data. This workflow applies equally well to LIDAR topography and bathymetry. Additionally, with proper referencing and geotagging in journal publications, we hope to close the loop and help the community create a true “Geospatial Scholar” infrastructure.
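
    The bounding-box, time-tagged placemark generation described above can be sketched with the simplekml Python package; the survey record, coordinates, and URL below are hypothetical placeholders, and this is not the project's actual workflow.

    ```python
    import simplekml

    # Hypothetical survey record: bounding box, time range, and download URL.
    survey = {
        "name": "H12345",
        "west": -70.9, "east": -70.6, "south": 41.2, "north": 41.5,
        "begin": "2009-03-01", "end": "2009-03-15",
        "bag_url": "http://example.gov/bags/H12345.bag",
    }

    kml = simplekml.Kml()
    pol = kml.newpolygon(name=survey["name"])
    pol.outerboundaryis = [
        (survey["west"], survey["south"]), (survey["east"], survey["south"]),
        (survey["east"], survey["north"]), (survey["west"], survey["north"]),
        (survey["west"], survey["south"]),
    ]
    # Time tag so the Google Earth time slider can filter surveys.
    pol.timespan.begin = survey["begin"]
    pol.timespan.end = survey["end"]
    # Pop-up description with a direct download link to the BAG file.
    pol.description = 'Download: <a href="%s">BAG</a>' % survey["bag_url"]
    kml.save("surveys.kml")
    ```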

  3. ESMPy and OpenClimateGIS: Python Interfaces for High Performance Grid Remapping and Geospatial Dataset Manipulation

    NASA Astrophysics Data System (ADS)

    O'Kuinghttons, Ryan; Koziol, Benjamin; Oehmke, Robert; DeLuca, Cecelia; Theurich, Gerhard; Li, Peggy; Jacob, Joseph

    2016-04-01

    The Earth System Modeling Framework (ESMF) Python interface (ESMPy) supports analysis and visualization in Earth system modeling codes by providing access to a variety of tools for data manipulation. ESMPy started as a Python interface to the ESMF grid remapping package, which provides mature, robust, high-performance, and scalable grid remapping between 2D and 3D logically rectangular and unstructured grids and sets of unconnected data. ESMPy now also interfaces with OpenClimateGIS (OCGIS), a package that performs subsetting, reformatting, and computational operations on climate datasets. ESMPy exposes a subset of ESMF grid remapping utilities. This includes bilinear, finite element patch recovery, first-order conservative, and nearest-neighbor grid remapping methods. There are also options to ignore unmapped destination points, mask points on source and destination grids, and provide grid structure in the polar regions. Grid remapping on the sphere takes place in 3D Cartesian space, so the pole problem is not an issue as it can be with other grid remapping software. Remapping can be done between any combination of 2D and 3D logically rectangular and unstructured grids with overlapping domains. Grid pairs where one side of the regridding is represented by an appropriate set of unconnected data points, as is commonly found with observational data streams, are also supported. There is a developing interoperability layer between ESMPy and OCGIS. OCGIS is a pure Python, open-source package designed for geospatial manipulation, subsetting, and computation on climate datasets stored in local NetCDF files or accessible remotely via the OPeNDAP protocol. Interfacing with OCGIS has brought GIS-like functionality to ESMPy (e.g., subsetting and coordinate transformations) as well as additional file output formats (e.g., CSV and ESRI Shapefile). ESMPy is distinguished by its strong emphasis on open source, community governance, and distributed development. The user base has grown quickly, and the package is integrating with several other software tools and frameworks. These include the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), Iris, PyFerret, cf-python, and the Community Surface Dynamics Modeling System (CSDMS). ESMPy minimum requirements include Python 2.6, NumPy 1.6.1, and an ESMF installation. Optional dependencies include NetCDF and the OCGIS-related dependencies GDAL, Shapely, and Fiona. ESMPy is regression-tested nightly and supported on Darwin, Linux, and Cray systems with the GNU compiler suite and MPI communications. OCGIS is supported on Linux and also undergoes nightly regression testing. Both packages are installable from Anaconda channels. Upcoming development plans for ESMPy involve a higher-order conservative grid remapping method. Future OCGIS development will focus on mesh and location stream interoperability and streamlined access to ESMPy's MPI implementation.
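
    As a concrete illustration of the bilinear remapping pathway described above, a hedged ESMPy sketch follows; the grid sizes, coordinate fill-in, and test data are arbitrary, and exact class and enum names may differ between ESMPy releases (newer releases import as esmpy).

    ```python
    import numpy as np
    import ESMF  # newer releases are imported as "esmpy"

    def make_grid(nlon, nlat):
        """Create a 2D logically rectangular spherical grid and fill its
        cell-center coordinates from a uniform lat-lon layout."""
        grid = ESMF.Grid(np.array([nlon, nlat]),
                         staggerloc=[ESMF.StaggerLoc.CENTER],
                         coord_sys=ESMF.CoordSys.SPH_DEG)
        lon = grid.get_coords(0)
        lat = grid.get_coords(1)
        lon[...] = np.linspace(0.0, 360.0, nlon, endpoint=False)[:, np.newaxis]
        lat[...] = np.linspace(-90.0, 90.0, nlat)[np.newaxis, :]
        return grid

    src_field = ESMF.Field(make_grid(72, 46), name="src")
    dst_field = ESMF.Field(make_grid(144, 91), name="dst")
    src_field.data[...] = 2.0  # arbitrary test data

    # Bilinear remapping, ignoring destination points with no source cell.
    regrid = ESMF.Regrid(src_field, dst_field,
                         regrid_method=ESMF.RegridMethod.BILINEAR,
                         unmapped_action=ESMF.UnmappedAction.IGNORE)
    dst_field = regrid(src_field, dst_field)
    ```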

  4. Graphic Representations as Tools for Decision Making.

    ERIC Educational Resources Information Center

    Howard, Judith

    2001-01-01

    Focuses on the use of graphic representations to enable students to improve their decision making skills in the social studies. Explores three visual aids used in assisting students with decision making: (1) the force field; (2) the decision tree; and (3) the decision making grid. (CMK)

  5. Code IN Exhibits - Supercomputing 2000

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob F.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    The creation of parameter study suites has recently become a more challenging problem as the parameter studies have become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers immense resource opportunities but at the expense of great difficulty of use. We present ILab, an advanced graphical user interface approach to this problem. Our novel strategy stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.

  6. Remote Visualization and Remote Collaboration On Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    A new technology has been developed for remote visualization that provides remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as fluid dynamics simulations or measurements). Based on this technology, some World Wide Web sites on the Internet are providing fluid dynamics data for educational or testing purposes. This technology is also being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics and wind tunnel testing. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit).

  7. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma; Kiliccote, Sila; McParland, Charles

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid's increasingly complex loads that include features such as large volumes of distributed generation (DG). Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power-flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve and ensure accuracy, providing information on normally estimated values such as underground conductor impedance and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.
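
    A small sketch of the core quantity discussed above, the voltage phase-angle difference between two measurement points; the synthetic streams and the wrapping convention are illustrative assumptions, not the report's method.

    ```python
    import numpy as np

    def phase_angle_difference(theta_a_deg: np.ndarray, theta_b_deg: np.ndarray) -> np.ndarray:
        """Angle difference between two time-aligned synchrophasor streams
        (degrees), wrapped into [-180, 180) to avoid branch-cut artifacts."""
        diff = theta_a_deg - theta_b_deg
        return (diff + 180.0) % 360.0 - 180.0

    # Example: two synthetic, time-aligned angle streams.
    t = np.linspace(0.0, 1.0, 120)
    theta_a = 30.0 + 5.0 * np.sin(2 * np.pi * t)
    theta_b = 28.0 + 5.0 * np.sin(2 * np.pi * t - 0.1)
    delta = phase_angle_difference(theta_a, theta_b)
    ```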

  8. Exploring New Methods of Displaying Bit-Level Quality and Other Flags for MODIS Data

    NASA Technical Reports Server (NTRS)

    Khalsa, Siri Jodha Singh; Weaver, Ron

    2003-01-01

    The NASA Distributed Active Archive Center (DAAC) at the National Snow and Ice Data Center (NSIDC) archives and distributes snow and sea ice products derived from the MODerate resolution Imaging Spectroradiometer (MODIS) on board NASA's Terra and Aqua satellites. All MODIS standard products are in the Earth Observing System version of the Hierarchical Data Format (HDF-EOS). The MODIS science team has packed a wealth of information into each HDF-EOS file. In addition to the science data arrays containing the geophysical product, there are often pixel-level Quality Assurance arrays which are important for understanding and interpreting the science data. Currently, researchers are limited in their ability to access and decode information stored as individual bits in many of the MODIS science products. Commercial and public domain utilities give users access, in varying degrees, to the elements inside MODIS HDF-EOS files. However, when attempting to visualize the data, users are confronted with the fact that many of the elements actually represent eight different 1-bit arrays packed into a single byte array. This project addressed the need for researchers to access bit-level information inside MODIS data files. In a previous NASA-funded project (ESDIS Prototype ID 50.0) we developed a visualization tool tailored to polar gridded HDF-EOS data sets. This tool, called PHDIS, allows researchers to access, geolocate, visualize, and subset data that originate from different sources and have different spatial resolutions but which are placed on a common polar grid. The bit-level visualization function developed under this project was added to PHDIS, resulting in a versatile tool that serves a variety of needs. We call this the EOS Imaging Tool.
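
    To make the bit-packing issue concrete: eight 1-bit flag planes stored in a single byte array can be recovered with shifts and masks. A minimal NumPy sketch, with the bit assignment as a hypothetical example rather than the actual MODIS QA layout:

    ```python
    import numpy as np

    def unpack_bit_plane(packed_qa: np.ndarray, bit: int) -> np.ndarray:
        """Extract the 1-bit flag stored at position `bit` (0 = least
        significant) from a uint8 QA array, as a 0/1 array."""
        return (packed_qa >> bit) & 1

    # Example: a 2x2 packed QA byte array.
    qa = np.array([[0b00000101, 0b00000001],
                   [0b00000100, 0b00000000]], dtype=np.uint8)
    cloud_flag = unpack_bit_plane(qa, 0)  # hypothetical: bit 0 = cloud
    ```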

  9. Web-based visualization of gridded datasets using OceanBrowser

    NASA Astrophysics Data System (ADS)

    Barth, Alexander; Watelet, Sylvain; Troupin, Charles; Beckers, Jean-Marie

    2015-04-01

    OceanBrowser is a web-based visualization tool for gridded oceanographic data sets. Those data sets are typically four-dimensional (longitude, latitude, depth, and time). OceanBrowser allows one to visualize horizontal sections at a given depth and time to examine the horizontal distribution of a given variable. It also offers the possibility to display the results on an arbitrary vertical section. To study the evolution of the variable in time, the horizontal and vertical sections can also be animated. Vertical sections can be generated along a path at a fixed distance from the coast or along a fixed ocean depth. The user can customize the plot by changing the color map, the range of the color bar, and the type of the plot (linearly interpolated color, simple contours, filled contours), and can download the current view as a simple image or as a Keyhole Markup Language (KML) file for visualization in applications such as Google Earth. The data products can also be accessed as NetCDF files and through OPeNDAP. Third-party layers from a web map service can also be integrated. OceanBrowser is used within the SeaDataNet project (http://gher-diva.phys.ulg.ac.be/web-vis/) and EMODNET Chemistry (http://oceanbrowser.net/emodnet/) to distribute gridded data sets interpolated from in situ observations using DIVA (Data-Interpolating Variational Analysis).
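
    The horizontal-section operation described above amounts to fixing depth and time in a four-dimensional array. A hedged sketch with the xarray package (the file, variable, and coordinate names are hypothetical; OceanBrowser itself is a web application, not this code):

    ```python
    import xarray as xr

    ds = xr.open_dataset("analysis.nc")  # hypothetical 4D gridded product

    # Horizontal section: fix depth and time, keep longitude/latitude.
    surface = ds["temperature"].sel(depth=0.0, time="2015-01-15", method="nearest")

    # Vertical section: fix longitude, keep depth/latitude, at one time step.
    transect = ds["temperature"].sel(lon=0.0, time="2015-01-15", method="nearest")
    ```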

  10. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  11. Differences in Visual-Spatial Input May Underlie Different Compression Properties of Firing Fields for Grid Cell Modules in Medial Entorhinal Cortex

    PubMed Central

    Raudies, Florian; Hasselmo, Michael E.

    2015-01-01

    Firing fields of grid cells in medial entorhinal cortex show compression or expansion after manipulations of the location of environmental barriers. This compression or expansion could be selective for individual grid cell modules with particular properties of spatial scaling. We present a model for differences in the response of modules to barrier location that arise from different mechanisms for the influence of visual features on the computation of location that drives grid cell firing patterns. These differences could arise from differences in the position of visual features within the visual field. When location was computed from the movement of visual features on the ground plane (optic flow) in the ventral visual field, this resulted in grid cell spatial firing that was not sensitive to barrier location in modules modeled with small spacing between grid cell firing fields. In contrast, when location was computed from static visual features on walls of barriers, i.e. in the more dorsal visual field, this resulted in grid cell spatial firing that compressed or expanded based on the barrier locations in modules modeled with large spacing between grid cell firing fields. This indicates that different grid cell modules might have differential properties for computing location based on visual cues, or the spatial radius of sensitivity to visual cues might differ between modules. PMID:26584432

  12. Collaboration tools and techniques for large model datasets

    USGS Publications Warehouse

    Signell, R.P.; Carniel, S.; Chiggiato, J.; Janekovic, I.; Pullen, J.; Sherwood, C.R.

    2008-01-01

    In MREA and many other marine applications, it is common to have multiple models running with different grids, run by different institutions. Techniques and tools are described for low-bandwidth delivery of data from large multidimensional datasets, such as those from meteorological and oceanographic models, directly into generic analysis and visualization tools. Output is stored using the NetCDF CF Metadata Conventions, and then delivered to collaborators over the web via OPeNDAP. OPeNDAP datasets served by different institutions are then organized via THREDDS catalogs. Tools and procedures are then used which enable scientists to explore data on the original model grids using tools they are familiar with. It is also low-bandwidth, enabling users to extract just the data they require, an important feature for access from ship or remote areas. The entire implementation is simple enough to be handled by modelers working with their webmasters - no advanced programming support is necessary. © 2007 Elsevier B.V. All rights reserved.
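
    A short sketch of the low-bandwidth access pattern described above: opening an OPeNDAP endpoint is lazy, so only the requested subset crosses the network. The URL, variable, and dimension names are hypothetical placeholders.

    ```python
    import xarray as xr

    # Opening an OPeNDAP URL transfers metadata only, not the data itself.
    url = "http://example.org/thredds/dodsC/roms/output"  # hypothetical endpoint
    ds = xr.open_dataset(url)

    # Only this small slice is actually fetched from the remote server.
    surface_salt = ds["salt"].isel(time=-1, s_rho=-1)[100:200, 100:200].load()
    ```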

  13. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  14. Time-dependent transition density matrix for visualizing charge-transfer excitations in photoexcited organic donor-acceptor systems

    NASA Astrophysics Data System (ADS)

    Li, Yonghui; Ullrich, Carsten

    2013-03-01

    The time-dependent transition density matrix (TDM) is a useful tool to visualize and interpret the induced charges and electron-hole coherences of excitonic processes in large molecules. Combined with time-dependent density functional theory on a real-space grid (as implemented in the octopus code), the TDM is a computationally viable visualization tool for optical excitation processes in molecules. It provides real-time maps of particles and holes, which give information on excitations, in particular those that have charge-transfer character, that cannot be obtained from the density alone. Some illustrations of the TDM and comparisons with standard density-difference plots will be shown for photoexcited organic donor-acceptor molecules. This work is supported by NSF Grant DMR-1005651.

  15. NASTRAN data generation and management using interactive graphics

    NASA Technical Reports Server (NTRS)

    Smootkatow, M.; Cooper, B. M.

    1972-01-01

    A method of using an interactive graphics device to generate a large portion of the input bulk data with visual checks of the structure and the card images is described. The generation starts from GRID and PBAR cards. The visual checks result from a three-dimensional display of the model in any rotated position. By detailing the steps, the time saving and cost effectiveness of this method may be judged, and its potential as a useful tool for the structural analyst may be established.

  16. CRT--Cascade Routing Tool to define and visualize flow paths for grid-based watershed models

    USGS Publications Warehouse

    Henson, Wesley R.; Medina, Rose L.; Mayers, C. Justin; Niswonger, Richard G.; Regan, R.S.

    2013-01-01

    The U.S. Geological Survey Cascade Routing Tool (CRT) is a computer application for watershed models that include the coupled Groundwater and Surface-water FLOW model, GSFLOW, and the Precipitation-Runoff Modeling System (PRMS). CRT generates output to define cascading surface and shallow subsurface flow paths for grid-based model domains. CRT requires a land-surface elevation for each hydrologic response unit (HRU) of the model grid; these elevations can be derived from a Digital Elevation Model raster data set of the area containing the model domain. Additionally, a list is required of the HRUs containing streams, swales, lakes, and other cascade termination features along with indices that uniquely define these features. Cascade flow paths are determined from the altitudes of each HRU. Cascade paths can cross any of the four faces of an HRU to a stream or to a lake within or adjacent to an HRU. Cascades can terminate at a stream, lake, or HRU that has been designated as a watershed outflow location.
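
    The cascade rule described above (flow may leave an HRU across any of its four faces toward a lower neighbor, terminating at a designated feature) can be sketched in a few lines of NumPy; this is an illustrative toy, not the USGS CRT algorithm itself.

    ```python
    import numpy as np

    # D4 neighbor offsets: up, down, left, right (the four HRU faces).
    FACES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    def downhill_neighbor(elev: np.ndarray, r: int, c: int):
        """Return the lowest of the four face-adjacent cells if it is lower
        than (r, c); otherwise None (a local low / termination point)."""
        best, best_z = None, elev[r, c]
        for dr, dc in FACES:
            rr, cc = r + dr, c + dc
            if 0 <= rr < elev.shape[0] and 0 <= cc < elev.shape[1] and elev[rr, cc] < best_z:
                best, best_z = (rr, cc), elev[rr, cc]
        return best

    def cascade_path(elev: np.ndarray, start, terminals: set):
        """Follow downhill cells from `start` until a terminal feature
        (stream, lake, outflow) or a local low is reached."""
        path, cell = [start], start
        while cell not in terminals:
            nxt = downhill_neighbor(elev, *cell)
            if nxt is None:
                break
            path.append(nxt)
            cell = nxt
        return path
    ```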

  17. PLOT3D Export Tool for Tecplot

    NASA Technical Reports Server (NTRS)

    Alter, Stephen

    2010-01-01

    The PLOT3D export tool for Tecplot solves the problem that modified data could not be output from Tecplot for use by another computational science solver. The PLOT3D Exporter add-on lets engineers use the most commonly available visualization tools while still producing output in a standard format. The exportation of PLOT3D data from Tecplot has far-reaching effects because it allows for grid and solution manipulation within a graphical user interface (GUI) that is easily customized with macro language-based and user-developed GUIs. The add-on also enables the use of Tecplot as an interpolation tool for solution conversion between different grids of different types. This one add-on enhances the functionality of Tecplot so significantly that it offers the ability to incorporate Tecplot into a general suite of tools for computational science applications as a 3D graphics engine for visualization of all data. Within the PLOT3D Export add-on are several functions that enhance the operations and effectiveness of the add-on. Unlike Tecplot output functions, the PLOT3D Export add-on enables the use of the zone selection dialog in Tecplot to choose which zones are to be written, by offering three distinct options: output of active, inactive, or all zones (grid blocks). As the user modifies the zones to output with the zone selection dialog, the zones to be written are similarly updated. This enables the use of Tecplot to create multiple configurations of a geometry being analyzed. For example, if an aircraft is loaded with multiple deflections of flaps, then by activating and deactivating different zones for a specific flap setting, new specific configurations of that aircraft can be easily generated by writing out only specific zones. Thus, if ten flap settings are loaded into Tecplot, the PLOT3D Export software can output ten different configurations, one for each flap setting.

  18. Structured grid technology to enable flow simulation in an integrated system environment

    NASA Astrophysics Data System (ADS)

    Remotigue, Michael Gerard

    An application-driven Computational Fluid Dynamics (CFD) environment needs flexible and general tools to effectively solve complex problems in a timely manner. In addition, reusable, portable, and maintainable specialized libraries will aid in rapidly developing integrated systems or procedures. The presented structured grid technology enables flow simulation for complex geometries by addressing grid generation, grid decomposition/solver setup, solution, and interpretation. Grid generation is accomplished with the graphical, arbitrarily-connected, multi-block structured grid generation software system (GUM-B) developed and presented here. GUM-B is an integrated system comprised of specialized libraries for the graphical user interface and graphical display coupled with a solid-modeling data structure that utilizes a structured grid generation library and a geometric library based on Non-Uniform Rational B-Splines (NURBS). A presented modification of the solid-modeling data structure provides the capability for arbitrarily-connected regions between the grid blocks. The presented grid generation library provides algorithms that are reliable and accurate. GUM-B has been utilized to generate numerous structured grids for complex geometries in hydrodynamics, propulsors, and aerodynamics. The versatility of the libraries that compose GUM-B is also displayed in a prototype to automatically regenerate a grid for a free-surface solution. Grid decomposition and solver setup are accomplished with the graphical grid manipulation and repartition software system (GUMBO) developed and presented here. GUMBO is an integrated system comprised of specialized libraries for the graphical user interface and graphical display coupled with a structured grid-tools library. The described functions within the grid-tools library reduce the possibility of human error during decomposition and setup for the numerical solver by accounting for boundary conditions and connectivity. GUMBO is linked, through a flow solver interface, to the parallel UNCLE code to provide load-balancing tools and solver setup. Weeks of boundary condition and connectivity specification and validation have been reduced to hours. The UNCLE flow solver is utilized for the solution of the flow field. To accelerate convergence toward a quick engineering answer, a full multigrid (FMG) approach coupled with UNCLE, which is a full approximation scheme (FAS), is presented. The prolongation operators used in the FMG-FAS method are compared. The procedure is demonstrated on a marine propeller in incompressible flow. Interpretation of the solution is accomplished by vortex feature detection. Regions of "Intrinsic Swirl" are located by interrogating the velocity gradient tensor for complex eigenvalues. The "Intrinsic Swirl" parameter is visualized on a solution of a marine propeller to determine whether any vortical features are captured. The libraries and the structured grid technology presented herein are flexible and general enough to tackle a variety of complex applications. This technology has significantly enhanced the capability of ERC personnel to effectively calculate solutions for complex geometries.
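
    The "Intrinsic Swirl" step above interrogates the velocity gradient tensor for complex eigenvalues; a hedged NumPy sketch of that test at a single grid point follows (the example tensor and tolerance are illustrative assumptions).

    ```python
    import numpy as np

    def has_swirl(velocity_gradient: np.ndarray, tol: float = 1e-12) -> bool:
        """True if the 3x3 velocity gradient tensor J[i, j] = du_i/dx_j has a
        complex-conjugate eigenvalue pair, indicating locally swirling flow."""
        eigvals = np.linalg.eigvals(velocity_gradient)
        return bool(np.any(np.abs(eigvals.imag) > tol))

    # Example: a solid-body-like rotation in the x-y plane plus weak strain.
    J = np.array([[0.01, -1.0, 0.0],
                  [1.0,  0.01, 0.0],
                  [0.0,  0.0,  0.02]])
    print(has_swirl(J))  # True: eigenvalues 0.01 +/- 1j and 0.02
    ```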

  19. iMeteo: a web-based weather visualization tool

    NASA Astrophysics Data System (ADS)

    Tuni San-Martín, Max; San-Martín, Daniel; Cofiño, Antonio S.

    2010-05-01

    iMeteo is a web-based weather visualization tool. Designed with an extensible J2EE architecture, it is capable of displaying information from heterogeneous data sources such as gridded data from numerical models (in NetCDF format) or databases of local predictions. All this information is presented in a user-friendly way, with the user being able to choose the specific tool to display data (maps, graphs, information tables) and customize it to desired locations. *Modular Display System* Visualization of the data is achieved through a set of mini-tools called widgets. A user can add them at will and arrange them around the screen easily with a drag-and-drop movement. They can be of various types, and each can be configured separately, forming a really powerful and configurable system. The "Map" is the most complex widget, since it can show several variables simultaneously (either gridded or point-based) through a layered display. Other useful widgets are the "Histogram", which generates a graph with the frequency characteristics of a variable, and the "Timeline", which shows the time evolution of a variable at a given location in an interactive way. *Customization and security* Following the trends in web development, the user can easily customize the way data is displayed. Because the client side is programmed with technologies like AJAX, interaction with the application is similar to that of desktop applications thanks to rapid response times. If a user is registered, he can also save his settings in the database, allowing access to his particular setup from any system with Internet access. There is particular emphasis on application security. The administrator can define a set of user profiles, which may have associated restrictions on access to certain data sources, geographic areas, or time intervals.

  20. Synchronized Phasor Data for Analyzing Wind Power Plant Dynamic Behavior and Model Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Y. H.

    2013-01-01

    The U.S. power industry is undertaking several initiatives that will improve the operations of the power grid. One of those is the implementation of 'wide area measurements' using phasor measurement units (PMUs) to dynamically monitor the operations and the status of the network and provide advanced situational awareness and stability assessment. This project seeks to obtain PMU data from wind power plants and grid reference points and develop software tools to analyze and visualize synchrophasor data for the purpose of better understanding wind power plant dynamic behaviors under normal and contingency conditions.

  1. GRIDVIEW: Recent Improvements in Research and Education Software for Exploring Mars Topography

    NASA Technical Reports Server (NTRS)

    Roark, J. H.; Frey, H. V.

    2001-01-01

    We have developed an Interactive Data Language (IDL) scientific visualization software tool called GRIDVIEW that can be used in research and education to explore and study the most recent Mars Orbiter Laser Altimeter (MOLA) gridded topography of Mars (http://denali.gsfc.nasa.gov/mola_pub/gridview). Additional information is contained in the original extended abstract.

  2. A Sensemaking Perspective on Situation Awareness in Power Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Schur, Anne; Paget, Mia L.

    2008-07-21

    With increasing complexity and interconnectivity of the electric power grid, the scope and complexity of grid operations continue to grow. New paradigms are needed to guide research to improve operations by enhancing situation awareness of operators. Research on human factors/situation awareness is described within a taxonomy of tools and approaches that address different levels of cognitive processing. While user interface features and visualization approaches represent the predominant focus of human factors studies of situation awareness, this paper argues that a complementary level, sensemaking, deserves further consideration by designers of decision support systems for power grid operations. A sensemaking perspective on situation awareness may reveal new insights that complement ongoing human factors research, where the focus of the investigation of errors is to understand why the decision makers experienced the situation the way they did, or why what they saw made sense to them at the time.

  3. [Comparison of Preferential Hyperacuity Perimeter (PHP) test and Amsler grid test in the diagnosis of different stages of age-related macular degeneration].

    PubMed

    Kampmeier, J; Zorn, M M; Lang, G K; Botros, Y T; Lang, G E

    2006-09-01

    Age-related macular degeneration (ARMD) is the leading cause of blindness in people over 65 years of age. A rapid loss of vision occurs especially in cases with choroidal neovascularisation. Early detection of ARMD and timely treatment are mandatory. We prospectively studied the results of two diagnostic self-tests for the early detection of metamorphopsia and scotoma, the PHP test and the Amsler grid test, in different stages of ARMD. Patients with ARMD and best corrected visual acuity of 6/30 or better (Snellen charts) were examined with a standardised protocol, including supervised Amsler grid examination and PHP, a new device for metamorphopsia or scotoma measurement based on the hyperacuity phenomenon in the central 14 degrees of the visual field. The stages of ARMD were independently graded in a masked fashion by stereoscopic ophthalmoscopy, stereoscopic fundus colour photographs, fluorescein angiography, and OCT. The patients were subdivided into three non-neovascular groups [early, late (RPE atrophy > 175 µm), and geographic atrophy], a neovascular group (classic and occult CNV), and an age-matched control group (healthy volunteers). 140 patients, with ages ranging from 50 to 90 years (median 68 years), were included in the study. Best corrected visual acuity ranged from 6/30 to 6/6 with a median of 6/12. 95 patients were diagnosed as non-neovascular ARMD. Thirty eyes had early ARMD (9 tested positive by the PHP test and 9 by the Amsler grid test), and 50 had late ARMD (positive: PHP test 23, Amsler grid test 26). The group with geographic atrophy consisted of 15 eyes (positive: PHP test 13, Amsler grid test 10). Forty-five patients presented with neovascular ARMD (positive: PHP test 38, Amsler grid test 36); 34 volunteers served as the control group (positive: PHP test 1, Amsler grid test 5). The PHP and Amsler grid tests revealed comparable results in detecting metamorphopsia and scotoma in early ARMD (30 vs. 30%) and late ARMD (46 vs. 52%). However, the PHP test more often revealed disease-related functional changes in the groups with geographic atrophy (87 vs. 67%) and neovascular ARMD (84 vs. 80%). This implies that the PHP and Amsler grid self-tests are useful tools for the detection of ARMD and that the PHP test has greater sensitivity in the groups with geographic atrophy and neovascular ARMD.

  4. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory, and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure, so that stakeholders can control their own resources and fair use of resources is ensured. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP, which included tools for run preparation, submission, monitoring, and management. This approach saves user sites the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  5. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts, and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open-source ones. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum, and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error (RMSE). Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
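
    The comparison statistics named above reduce to a few array operations once both DEMs share a grid; a minimal NumPy sketch, with array names and NoData handling as assumed details:

    ```python
    import numpy as np

    def dem_difference_stats(test_dem: np.ndarray, ref_dem: np.ndarray, nodata=-9999.0):
        """Min/max/mean difference and RMSE between a test DEM and a
        reference DEM on the same grid, ignoring NoData cells."""
        valid = (test_dem != nodata) & (ref_dem != nodata)
        diff = test_dem[valid] - ref_dem[valid]
        return {
            "min": float(diff.min()),
            "max": float(diff.max()),
            "mean": float(diff.mean()),
            "rmse": float(np.sqrt(np.mean(diff ** 2))),
        }
    ```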

  6. Scientific Visualization Made Easy for the Scientist

    NASA Astrophysics Data System (ADS)

    Westerhoff, M.; Henderson, B.

    2002-12-01

    amira is an application program used in creating 3D visualizations and geometric models of 3D image data sets from various application areas, e.g. medicine, biology, biochemistry, chemistry, physics, and engineering. It has demonstrated significant adoption in the marketplace since becoming commercially available in 2000. The rapid adoption has expanded the features being requested by the user base and broadened the scope of the amira product offering. The amira product offering includes amira Standard; amiraDev, used to extend the product capabilities by users; amiraMol, used for molecular visualization; amiraDeconv, used to improve the quality of image data; and amiraVR, used in immersive VR environments. amira allows the user to construct a visualization tailored to his or her needs without requiring any programming knowledge. It also allows 3D objects to be represented as grids suitable for numerical simulations, notably as triangular surfaces and volumetric tetrahedral grids. The amira application also provides methods to generate such grids from voxel data representing an image volume, and it includes a general-purpose interactive 3D viewer. amiraDev provides an application-programming interface (API) that allows the user to add new components by C++ programming. amira supports many import formats, including a 'raw' format allowing immediate access to your native uniform data sets. amira uses the power and speed of the OpenGL and Open Inventor graphics libraries and 3D graphics accelerators to allow you to access over 145 modules, enabling you to process, probe, analyze, and visualize your data. The amiraMol extension adds powerful tools for molecular visualization to the existing amira platform. amiraMol contains support for standard molecular file formats, and tools for visualization and analysis of static molecules as well as molecular trajectories (time series). amiraDeconv adds tools for the deconvolution of 3D microscopic images. Deconvolution is the process of increasing image quality and resolution by computationally compensating artifacts of the recording process. amiraDeconv supports 3D wide-field microscopy as well as 3D confocal microscopy. It offers both non-blind and blind image deconvolution algorithms. Non-blind deconvolution uses an individually measured point spread function, while blind algorithms work on the basis of only a few recording parameters (such as numerical aperture or zoom factor). amiraVR is a specialized and extended version of the amira visualization system which is dedicated for use in immersive installations, such as large-screen stereoscopic projections, CAVE, or Holobench systems. Among others, it supports multi-threaded multi-pipe rendering, head-tracking, advanced 3D interaction concepts, and 3D menus allowing interaction with any amira object in the same way as on the desktop. With its unique set of features, amiraVR represents both a VR (Virtual Reality) ready application for scientific and medical visualization in immersive environments, and a development platform that allows building VR applications.

  7. RE Data Explorer: Informing Variable Renewable Energy Grid Integration for Low Emission Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L

    The RE Data Explorer, developed by the National Renewable Energy Laboratory, is an innovative web-based analysis tool that utilizes geospatial and spatiotemporal renewable energy data to visualize, execute, and support analysis of renewable energy potential under various user-defined scenarios. This analysis can inform high-level prospecting, integrated planning, and policy making to enable low emission development.

  8. AWE: Aviation Weather Data Visualization Environment

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Lodha, Suresh K.; Norvig, Peter (Technical Monitor)

    2000-01-01

    Weather is one of the major causes of aviation accidents. General aviation (GA) flights account for 92% of all aviation accidents. In spite of all the official and unofficial sources of weather visualization tools available to pilots, there is an urgent need for visualizing several kinds of weather-related data tailored to general aviation pilots. Our system, the Aviation Weather Data Visualization Environment (AWE), presents graphical displays of meteorological observations, terminal area forecasts, and winds aloft forecasts onto a cartographic grid specific to the pilot's area of interest. Decisions regarding the graphical display and design are made based on careful consideration of user needs. The integral visual display of these elements of weather reports is designed for use by GA pilots as a weather briefing and route selection tool. AWE provides linking of the weather information to the flight's path and schedule. The pilot can interact with the system to obtain aviation-specific weather for the entire area or for his specific route, to explore what-if scenarios, and to make "go/no-go" decisions. The system, as evaluated by some pilots at NASA Ames Research Center, was found to be useful.

  9. SU-E-J-92: CERR: New Tools to Analyze Image Registration Precision.

    PubMed

    Apte, A; Wang, Y; Oh, J; Saleh, Z; Deasy, J

    2012-06-01

    To present new tools in CERR (the Computational Environment for Radiotherapy Research) to analyze image registration, along with other software updates and additions. CERR continues to be a key environment (cited more than 129 times to date) for numerous RT-research studies involving outcomes modeling, prototyping algorithms for segmentation and registration, experiments with phantom dosimetry, IMRT research, etc. Image registration is one of the key technologies required in many research studies. CERR has been interfaced with popular image registration frameworks like Plastimatch and ITK. Once the images have been auto-registered, CERR provides tools to analyze the accuracy of registration using the following innovative approaches: (1) Distance Discordance Histograms (DDH), described in detail in a separate paper, and (2) 'MirrorScope', explained as follows: for any view plane the 2D image is broken up into a 2D grid of medium-sized squares. Each square contains a right half, which is the reference image, and a left half, which is the mirror-flipped version of the overlay image. The user can increase or decrease the size of this grid to control the resolution of the analysis. Other updates to CERR include tools to extract image and dosimetric features programmatically and store them in a central database, and tools to interface with statistical analysis software like SPSS and the Matlab Statistics toolbox. MirrorScope was compared on various examples, including 'perfect' registration examples and 'artificially translated' registrations. For 'perfect' registration, the patterns obtained within each circle are symmetric and are easily, visually recognized as aligned. For registrations that are off, the circles located in the regions of imperfection show unsymmetrical patterns that are easily recognized. The new updates to CERR further increase its utility for RT-research. MirrorScope is a visually intuitive method of monitoring the accuracy of image registration that improves on the visual confusion of standard methods. © 2012 American Association of Physicists in Medicine.
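
    A hypothetical re-implementation sketch of the MirrorScope tiling idea described above; CERR itself is MATLAB-based, so this NumPy version, its tile size, and its handling of tile halves are illustrative assumptions.

    ```python
    import numpy as np

    def mirrorscope(reference: np.ndarray, overlay: np.ndarray, tile: int = 64) -> np.ndarray:
        """Compose a grid of tiles whose right half shows the reference image
        and whose left half shows the horizontally mirrored overlay; with
        perfect registration each tile looks left-right symmetric."""
        assert reference.shape == overlay.shape
        out = np.empty_like(reference)
        h, w = reference.shape
        for y in range(0, h, tile):
            for x in range(0, w, tile):
                ref = reference[y:y + tile, x:x + tile]
                ovl = overlay[y:y + tile, x:x + tile][:, ::-1]  # mirror flip
                half = ref.shape[1] // 2
                block = out[y:y + tile, x:x + tile]
                block[:, half:] = ref[:, half:]   # right half: reference
                block[:, :half] = ovl[:, :half]   # left half: mirrored overlay
        return out
    ```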

  10. The Parallel System for Integrating Impact Models and Sectors (pSIMS)

    NASA Technical Reports Server (NTRS)

    Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian

    2014-01-01

    We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.

  11. NASA Exhibits

    NASA Technical Reports Server (NTRS)

    Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick

    2001-01-01

    A series of NASA presentations for the Supercomputing 2001 conference are summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) DeBakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.

  12. New trends in the virtualization of hospitals--tools for global e-Health.

    PubMed

    Graschew, Georgi; Roelofs, Theo A; Rakowsky, Stefan; Schlag, Peter M; Heinzlreiter, Paul; Kranzlmüller, Dieter; Volkert, Jens

    2006-01-01

    The development of virtual hospitals and digital medicine helps to bridge the digital divide between different regions of the world and enables equal access to high-level medical care. Pre-operative planning, intra-operative navigation and minimally-invasive surgery require a digital and virtual environment supporting the perception of the physician. As data and computing resources in a virtual hospital are distributed over many sites the concept of the Grid should be integrated with other communication networks and platforms. A promising approach is the implementation of service-oriented architectures for an invisible grid, hiding complexity for both application developers and end-users. Examples of promising medical applications of Grid technology are the real-time 3D-visualization and manipulation of patient data for individualized treatment planning and the creation of distributed intelligent databases of medical images.

  13. "Tools For Analysis and Visualization of Large Time- Varying CFD Data Sets"

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; vanGelder, Allen

    1999-01-01

    During the four years of this grant (including the one-year extension), we have explored many aspects of the visualization of large CFD (Computational Fluid Dynamics) datasets. These have included new direct volume rendering approaches, hierarchical methods, volume decimation, error metrics, parallelization, hardware texture mapping, and methods for analyzing and comparing images. First, we implemented an extremely general direct volume rendering approach that can be used to render rectilinear, curvilinear, or tetrahedral grids, including overlapping multiple zone grids, and time-varying grids. Next, we developed techniques for associating the sample data with a k-d tree, a simple hierarchical data model to approximate samples in the regions covered by each node of the tree, and an error metric for the accuracy of the model. We also explored a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH (Association for Computing Machinery Special Interest Group on Computer Graphics) '96. In our initial implementation, we automatically image the volume from 32 approximately evenly distributed positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation.

  14. Global Static Indexing for Real-Time Exploration of Very Large Regular Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pascucci, V; Frank, R

    2001-07-23

    In this paper we introduce a new indexing scheme for progressive traversal and visualization of large regular grids. We demonstrate the potential of our approach by providing a tool that displays at interactive rates planar slices of scalar field data with very modest computing resources. We obtain unprecedented results both in terms of absolute performance and, more importantly, in terms of scalability. On a laptop computer we provide real-time interaction with a 2048³ grid (8 giga-nodes) using only 20 MB of memory. On an SGI Onyx we slice interactively an 8192³ grid (1/2 tera-nodes) using only 60 MB of memory. The scheme relies simply on the determination of an appropriate reordering of the rectilinear grid data and a progressive construction of the output slice. The reordering minimizes the amount of I/O performed during the out-of-core computation. The progressive and asynchronous computation of the output provides flexible quality/speed tradeoffs and a time-critical and interruptible user interface.
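
    The paper's key ingredient is a static reordering of the rectilinear grid. A widely used hierarchical reordering of this kind is bit-interleaved Z-order (Morton) indexing, sketched below as an illustration; it is not necessarily the exact scheme introduced in the paper.

    ```python
    def z_order_index(x: int, y: int, z: int, bits: int = 11) -> int:
        """Interleave the bits of (x, y, z) into a single Morton index, so
        that cells close in 3D space tend to be close in the 1D ordering
        (and hence in the same disk block during out-of-core traversal)."""
        idx = 0
        for b in range(bits):
            idx |= ((x >> b) & 1) << (3 * b)
            idx |= ((y >> b) & 1) << (3 * b + 1)
            idx |= ((z >> b) & 1) << (3 * b + 2)
        return idx

    # Example: index of cell (3, 5, 1) in a 2048^3 grid (11 bits per axis).
    print(z_order_index(3, 5, 1))
    ```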

  15. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (third year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for the visualization, analysis, and archiving of flow data. We report first on our work on the development of numerical methods for tangent curve computation.
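
    Tangent curves are the integral curves of the vector field, so computing them reduces to numerical integration through an interpolated field. A generic fixed-step fourth-order Runge-Kutta sketch, not the specific methods developed in this project, looks like the following.

        import numpy as np

        def tangent_curve(velocity, seed, h=0.01, steps=500):
            """Trace a tangent curve (streamline) of a steady vector field.

            velocity: callable mapping a position to a velocity vector.
            Classic fixed-step RK4; production codes adapt the step size and
            handle cell lookup in curvilinear grids, which is omitted here.
            """
            points = [np.asarray(seed, dtype=float)]
            for _ in range(steps):
                p = points[-1]
                k1 = velocity(p)
                k2 = velocity(p + 0.5 * h * k1)
                k3 = velocity(p + 0.5 * h * k2)
                k4 = velocity(p + h * k3)
                points.append(p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
            return np.array(points)

        # Example: circular flow around the origin.
        curve = tangent_curve(lambda p: np.array([-p[1], p[0]]), seed=[1.0, 0.0])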

  16. IN31A-1734 Development and Evaluation of a Gridded CrIS/ATMS Visualization for Operational Forecasting

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Smith, Nadia; Dostalek, Jack; Stevens, Eric; Nelson, Kristine; Weisz, Elisabeth; Berndt, Emily; Line, Bill; Barnet, Chris; Gambacorta, Antonia

    2016-01-01

    A collaborative effort between SPoRT, CIMSS, CIRA, GINA, and NOAA has produced a unique gridded visualization of real-time CrIS/ATMS sounding products. This product uses the NUCAPS retrieval algorithm and polar2grid software to generate plan-view and cross-section visualizations for forecast challenges associated with cold air aloft and convective potential. Forecasters at select partner offices have been able to view the Gridded NUCAPS products in AWIPS alongside other operational data products, with generally favorable feedback.

  17. Psyplot: Visualizing rectangular and triangular Climate Model Data with Python

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp

    2016-04-01

    The development and use of climate models often requires the visualization of geo-referenced data. Creating visualizations should be fast, attractive, flexible, easily applicable, and easily reproducible. There is a wide range of software tools available for visualizing raster data, but they are often inaccessible to many users (e.g., because they are difficult to use in a script or have low flexibility). In order to facilitate easy visualization of geo-referenced data, we developed a new framework called "psyplot," which can aid earth system scientists with their daily work. It is written purely in the programming language Python and primarily built upon the python packages matplotlib, cartopy, and xray. The package can visualize data stored on the hard disk (e.g., NetCDF, GeoTIFF, or any other file format supported by the xray package), or directly from memory or Climate Data Operators (CDOs). Furthermore, data can be visualized on a rectangular grid (following or not following the CF conventions) and on a triangular grid (following the CF or UGRID conventions). Psyplot visualizes 2D scalar and vector fields, enabling the user to easily manage and format multiple plots at the same time, and to export the plots into all common picture formats and movies covered by the matplotlib package. The package can currently be used in an interactive python session or in python scripts, and will soon be extended with a graphical user interface (GUI). Finally, the psyplot framework enables flexible configuration, allows easy integration into other scripts that use matplotlib, and provides a flexible foundation for further development.
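
    For readers unfamiliar with this kind of workflow, the following generic sketch shows the kind of plot psyplot automates. It is plain xarray/matplotlib code with a hypothetical file and variable name, not psyplot's own interface.

        # Hypothetical file and variable names; generic xarray/matplotlib code
        # illustrating the kind of plot psyplot automates, not psyplot's API.
        import xarray as xr
        import matplotlib.pyplot as plt

        ds = xr.open_dataset("t2m_monthly.nc")    # any CF-conformant NetCDF file
        field = ds["t2m"].isel(time=0)            # first time step of a 2D field
        field.plot()                              # pcolormesh with labeled axes
        plt.savefig("t2m_first_timestep.png")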

  18. 3D Voronoi grid dedicated software for modeling gas migration in deep layered sedimentary formations with TOUGH2-TMGAS

    NASA Astrophysics Data System (ADS)

    Bonduà, Stefano; Battistelli, Alfredo; Berry, Paolo; Bortolotti, Villiam; Consonni, Alberto; Cormio, Carlo; Geloni, Claudio; Vasini, Ester Maria

    2017-11-01

    As is known, a fully three-dimensional (3D) unstructured grid permits a great degree of flexibility when performing accurate numerical reservoir simulations. However, when the Integral Finite Difference Method (IFDM) is used for spatial discretization, constraints (arising from the required orthogonality between the segment connecting block nodes and the interface area between blocks) pose difficulties in the creation of grids with irregularly shaped blocks. The fully 3D Voronoi approach guarantees respect of the IFDM constraints and allows the generation of grids that conform to geological formations and structural objects while providing higher grid resolution in volumes of interest. In this work, we present dedicated pre- and post-processing gridding software tools for the TOUGH family of numerical reservoir simulators, developed by the Geothermal Research Group of the DICAM Department, University of Bologna. VORO2MESH is a new software coded in C++, based on the voro++ library, allowing computation of the 3D Voronoi tessellation for a given domain and the creation of a ready-to-use TOUGH2 MESH file. If a set of geological surfaces is available, the software can directly generate the set of Voronoi seed points used for the tessellation. In order to reduce the number of connections and so decrease computation time, VORO2MESH can produce a mixed grid with regular blocks (orthogonal prisms) and irregular blocks (polyhedral Voronoi blocks) at the points of contact between different geological formations. In order to visualize 3D Voronoi grids together with the results of numerical simulations, the functionality of the TOUGH2Viewer post-processor has been extended. We describe an application of VORO2MESH and TOUGH2Viewer to validate the two tools. The case study deals with the simulation of the migration of gases in deep layered sedimentary formations at basin scale using TOUGH2-TMGAS. A comparison between the simulation performances of unstructured and structured grids is presented.
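
    The reason Voronoi grids suit the IFDM is worth spelling out: each cell face lies on the perpendicular bisector of the segment joining the two seed nodes, so the node-to-node connection is orthogonal to the interface by construction. A small 2D check of this property with scipy (illustrative only; VORO2MESH itself works in 3D on the voro++ library):

        import numpy as np
        from scipy.spatial import Voronoi

        rng = np.random.default_rng(0)
        seeds = rng.random((30, 2))
        vor = Voronoi(seeds)

        # For each pair of adjacent cells, the shared Voronoi edge lies on the
        # perpendicular bisector of the seed-to-seed segment, so the node
        # connection is orthogonal to the interface (the IFDM requirement).
        for (p, q), ridge in zip(vor.ridge_points, vor.ridge_vertices):
            if -1 in ridge:               # skip unbounded ridges
                continue
            edge = vor.vertices[ridge[1]] - vor.vertices[ridge[0]]
            link = seeds[q] - seeds[p]
            assert abs(np.dot(edge, link)) < 1e-9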

  19. Absence of Visual Input Results in the Disruption of Grid Cell Firing in the Mouse.

    PubMed

    Chen, Guifen; Manson, Daniel; Cacucci, Francesca; Wills, Thomas Joseph

    2016-09-12

    Grid cells are spatially modulated neurons within the medial entorhinal cortex whose firing fields are arranged at the vertices of tessellating equilateral triangles [1]. The exquisite periodicity of their firing has led to the suggestion that they represent a path integration signal, tracking the organism's position by integrating speed and direction of movement [2-10]. External sensory inputs are required to reset any errors that the path integrator would inevitably accumulate. Here we probe the nature of the external sensory inputs required to sustain grid firing, by recording grid cells as mice explore familiar environments in complete darkness. The absence of visual cues results in a significant disruption of grid cell firing patterns, even when the quality of the directional information provided by head direction cells is largely preserved. Darkness alters the expression of velocity signaling within the entorhinal cortex, with changes evident in grid cell firing rate and the local field potential theta frequency. Short-term (<1.5 s) spike timing relationships between grid cell pairs are preserved in the dark, indicating that network patterns of excitatory and inhibitory coupling between grid cells exist independently of visual input and of spatially periodic firing. However, we find no evidence of preserved hexagonal symmetry in the spatial firing of single grid cells at comparable short timescales. Taken together, these results demonstrate that visual input is required to sustain grid cell periodicity and stability in mice and suggest that grid cells in mice cannot perform accurate path integration in the absence of reliable visual cues. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  20. Unsteady flow simulations around complex geometries using stationary or rotating unstructured grids

    NASA Astrophysics Data System (ADS)

    Sezer-Uzol, Nilay

    In this research, the computational analysis of three-dimensional, unsteady, separated, vortical flows around complex geometries is studied by using stationary or moving unstructured grids. Two main engineering problems are investigated. The first is the unsteady simulation of a ship airwake, in which helicopter operations become even more challenging, using stationary unstructured grids. The second is the unsteady simulation of wind turbine rotor flow fields using moving unstructured grids that rotate with the whole three-dimensional rigid rotor geometry. The three-dimensional, unsteady, parallel, unstructured, finite volume flow solver PUMA2 is used for the computational fluid dynamics (CFD) simulations considered in this research. The code is modified to have a moving-grid capability in order to perform three-dimensional, time-dependent rotor simulations. An instantaneous log-law wall model for Large Eddy Simulation is also implemented in PUMA2 to investigate the very high Reynolds number flow fields of rotating blades. To verify the code modifications, several sample test cases are also considered. In addition, interdisciplinary studies aiming to provide new tools and insights to the aerospace and wind energy scientific communities are conducted during this research, focusing on the coupling of ship airwake CFD simulations with helicopter flight dynamics and control analysis, the coupling of wind turbine rotor CFD simulations with aeroacoustic analysis, and the analysis of these time-dependent, large-scale CFD simulations with the help of a computational monitoring, steering, and visualization tool, POSSE.

  1. ProfileGrids: a sequence alignment visualization paradigm that avoids the limitations of Sequence Logos.

    PubMed

    Roca, Alberto I

    2014-01-01

    The 2013 BioVis Contest provided an opportunity to evaluate different paradigms for visualizing protein multiple sequence alignments. Such data sets are becoming extremely large and thus taxing current visualization paradigms. Sequence Logos represent consensus sequences but have limitations for protein alignments. As an alternative, ProfileGrids are a new protein sequence alignment visualization paradigm that represents an alignment as a color-coded matrix of the residue frequency occurring at every homologous position in the aligned protein family. The JProfileGrid software program was used to analyze the BioVis contest data sets to generate figures for comparison with the Sequence Logo reference images. The ProfileGrid representation allows for the clear and effective analysis of protein multiple sequence alignments. This includes both a general overview of the conservation and diversity sequence patterns as well as the interactive ability to query the details of the protein residue distributions in the alignment. The JProfileGrid software is free and available from http://www.ProfileGrid.org.
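
    The core data structure is simple to state: a ProfileGrid is essentially a residue-frequency matrix over alignment columns. A minimal sketch of that computation (illustrative only, not JProfileGrid's implementation):

        from collections import Counter

        def profile_grid(alignment, alphabet="ACDEFGHIKLMNPQRSTVWY-"):
            """Residue-frequency matrix: one row per alignment position,
            one column per symbol. A minimal sketch of the ProfileGrid idea."""
            n_seqs = len(alignment)
            matrix = []
            for col in zip(*alignment):          # iterate homologous positions
                counts = Counter(col)
                matrix.append([counts.get(a, 0) / n_seqs for a in alphabet])
            return matrix  # matrix[pos][residue_index] = frequency

        # Toy alignment of four sequences.
        freqs = profile_grid(["MKV-", "MKI-", "MRV-", "MKVA"])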

  2. ModuleRole: a tool for modulization, role determination and visualization in protein-protein interaction networks.

    PubMed

    Li, Guipeng; Li, Ming; Zhang, Yiwei; Wang, Dong; Li, Rong; Guimerà, Roger; Gao, Juntao Tony; Zhang, Michael Q

    2014-01-01

    Rapidly increasing amounts of (physical and genetic) protein-protein interaction (PPI) data are produced by various high-throughput techniques, and interpretation of these data remains a major challenge. In order to gain insight into the organization and structure of the resultant large complex networks formed by interacting molecules, we developed ModuleRole, a user-friendly web server tool that uses simulated annealing, a method based on node connectivity, to find modules in a PPI network, define the role of every node, and produce files for visualization in Cytoscape and Pajek. For given proteins, it analyzes the PPI network from the BioGRID database, finds and visualizes the modules these proteins form, and then defines the role every node plays in this network, based on two topological parameters, the Participation Coefficient and the Z-score. This is the first program to provide an interactive and friendly interface for biologists to find and visualize modules and the roles of proteins in a PPI network. It can be tested online at http://www.bioinfo.org/modulerole/index.php, which is free and open to all users with no login requirement; demo data are provided under "User Guide" in the Help menu. A non-server version of this program is recommended for high-throughput data with more than 200 nodes or for users' own interaction datasets. Users are able to bookmark the web link to the result page and access it at a later time. As an interactive and highly customizable application, ModuleRole requires no expert knowledge in graph theory on the user side and can be used on both Linux and Windows systems, making it a very useful tool for biologists to analyze and visualize PPI networks from databases such as BioGRID. ModuleRole is implemented in Java and C, and is freely available at http://www.bioinfo.org/modulerole/index.php. Supplementary information (user guide, demo data) is also available at this website. The API used for ModuleRole can be obtained upon request.
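
    The two role-defining parameters follow the within-module degree Z-score and the participation coefficient of Guimerà and Amaral. A hedged sketch of their computation, assuming a module assignment has already been obtained (not ModuleRole's own code):

        from collections import Counter
        import networkx as nx
        import numpy as np

        def node_roles(G, modules):
            """Within-module degree z-score and participation coefficient
            for each node, given modules as a {node: module_id} mapping."""
            z, P = {}, {}
            for m in set(modules.values()):
                members = [n for n in G if modules[n] == m]
                kin = {n: sum(1 for nb in G[n] if modules[nb] == m) for n in members}
                mu, sigma = np.mean(list(kin.values())), np.std(list(kin.values()))
                for n in members:
                    z[n] = (kin[n] - mu) / sigma if sigma > 0 else 0.0
            for n in G:
                k = G.degree(n)
                shares = Counter(modules[nb] for nb in G[n])
                P[n] = 1.0 - sum((c / k) ** 2 for c in shares.values()) if k else 0.0
            return z, P

        # Example with a toy graph and a trivial two-module split.
        G = nx.karate_club_graph()
        mods = {n: 0 if n < 17 else 1 for n in G}
        z, P = node_roles(G, mods)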

  3. Analysis, Mining and Visualization Service at NCSA

    NASA Astrophysics Data System (ADS)

    Wilhelmson, R.; Cox, D.; Welge, M.

    2004-12-01

    NCSA's goal is to create a balanced system that fully supports high-end computing as well as: 1) high-end data management and analysis; 2) visualization of massive, highly complex data collections; 3) large databases; 4) geographically distributed Grid computing; and 5) collaboratories, all based on a secure computational environment and driven with workflow-based services. To this end, NCSA has defined a new technology path that includes the integration and provision of cyberservices in support of data analysis, mining, and visualization. NCSA has begun to develop and apply a data mining system, NCSA Data-to-Knowledge (D2K), in conjunction with both the application and research communities. NCSA D2K will enable the formation of model-based application workflows and visual programming interfaces for rapid data analysis. The Java-based D2K framework, which integrates analytical data mining methods with data management, data transformation, and information visualization tools, will be configurable from the cyberservices (web and grid services, tools, etc.) viewpoint to solve a wide range of important data mining problems. This effort will use modules such as new classification methods for the detection of high-risk geoscience events, as well as existing D2K data management, machine learning, and information visualization modules. A D2K cyberservices interface will be developed to seamlessly connect client applications with remote back-end D2K servers, providing computational resources for data mining and integration with local or remote data stores. This work is being coordinated with SDSC's data and services efforts. The new NCSA Visualization embedded workflow environment (NVIEW) will be integrated with D2K functionality to tightly couple informatics and scientific visualization with the data analysis and management services. Visualization services will access and filter disparate data sources, simplifying tasks such as fusing related data from distinct sources into a coherent visual representation. This approach enables collaboration among geographically dispersed researchers via portals and front-end clients, and the coupling with data management services enables recording associations among datasets and building annotation systems into visualization tools and portals, giving scientists a persistent, shareable, virtual lab notebook. To facilitate provision of these cyberservices to the national community, NCSA will be providing a computational environment for large-scale data assimilation, analysis, mining, and visualization. This will initially be implemented on the new 512-processor shared-memory SGI systems recently purchased by NCSA. In addition to standard batch capabilities, NCSA will provide on-demand capabilities for those projects requiring rapid response (e.g., development of severe weather, earthquake events) for decision makers. It will also be used for non-sequential interactive analysis of data sets where it is important to have access to large data volumes over space and time.

  4. Integrated multidisciplinary CAD/CAE environment for micro-electro-mechanical systems (MEMS)

    NASA Astrophysics Data System (ADS)

    Przekwas, Andrzej J.

    1999-03-01

    Computational design of MEMS involves several strongly coupled physical disciplines, including fluid mechanics, heat transfer, stress/deformation dynamics, electronics, electro/magneto statics, calorics, biochemistry, and others. CFDRC is developing a new generation of multi-disciplinary CAD systems for MEMS using high-fidelity field solvers on unstructured, solution-adaptive grids for a full range of disciplines. The software system, ACE + MEMS, includes all essential CAD tools: geometry/grid generation for multi-discipline, multi-equation solvers; a GUI; tightly coupled, configurable 3D field solvers for FVM, FEM, and BEM; and a 3D visualization/animation tool. The flow/heat transfer/calorics/chemistry equations are solved with an unstructured adaptive FVM solver, stress/deformation is computed with a FEM STRESS solver, and a FAST BEM solver is used to solve linear heat transfer, electro/magnetostatics, and elastostatics equations on adaptive polygonal surface grids. Tight multidisciplinary coupling and automatic interoperability between the tools were achieved by designing a comprehensive database structure and APIs for complete model definition. The virtual model definition is implemented in a data transfer facility, a publicly available tool described in this paper. The paper presents an overall description of the software architecture and the MEMS design flow in ACE + MEMS. It describes the current status, ongoing effort, and future plans for the software. The paper also discusses new concepts of mixed-level and mixed-dimensionality capability, in which 1D microfluidic networks are simulated concurrently with 3D high-fidelity models of discrete components.

  5. U.S. Electric System Operating Data

    EIA Publications

    EIA provides hourly electricity operating data, including actual and forecast demand, net generation, and the power flowing between electric systems. EIA's new U.S. Electric System Operating Data tool provides near-real-time demand data, plus analysis and visualizations of hourly, daily, and weekly electricity supply and demand on a national and regional level for all 66 electric system balancing authorities that make up the U.S. electric grid.

  6. Neural Network Design on the SRC-6 Reconfigurable Computer

    DTIC Science & Technology

    2006-12-01

    fingerprint identification. In this field, automatic identification methods are used to save time, especially for the purpose of fingerprint matching in...grid widths and lengths and therefore was useful in producing an accurate canvas with which to create sample training images. The added benefit of...tools available free of charge and readily accessible on the computer, it was simple to design bitmap data files visually on a canvas and then

  7. Making sense of sparse rating data in collaborative filtering via topographic organization of user preference patterns.

    PubMed

    Polcicová, Gabriela; Tino, Peter

    2004-01-01

    We introduce topographic versions of two latent class models (LCM) for collaborative filtering. Latent classes are topologically organized on a square grid. Topographic organization of latent classes makes orientation in the rating/preference patterns captured by the latent classes easier and more systematic. The variation in film rating patterns is modelled by multinomial and binomial distributions with varying independence assumptions. In the first stage of topographic LCM construction, self-organizing maps with a neural field organized according to the LCM topology are employed. We apply our system to a large collection of user ratings for films. The system can provide useful visualization plots unveiling user preference patterns buried in the data, without losing its potential to be a good recommender model. It appears that the multinomial distribution is most adequate if the model is regularized by tight grid topologies. Since we deal with probabilistic models of the data, we can readily use tools from probability and information theory to interpret and visualize the information extracted by our system.

  8. SNPversity: a web-based tool for visualizing diversity

    PubMed Central

    Schott, David A; Vinnakota, Abhinav G; Portwood, John L; Andorf, Carson M

    2018-01-01

    Many stand-alone desktop software suites exist to visualize single nucleotide polymorphism (SNP) diversity, but web-based software that can be easily implemented and used for biological databases is absent. SNPversity was created to answer this need by building an open-source visualization tool that can be implemented on a Unix-like machine and served through a web browser that can be accessible worldwide. SNPversity consists of an HDF5 database back-end for SNPs, a data exchange layer powered by TASSEL libraries that represent data in JSON format, and an interface layer using PHP to visualize SNP information. SNPversity displays data in real-time through a web browser in grids that are color-coded according to a given SNP's allelic status and mutational state. SNPversity is currently available at MaizeGDB, the maize community's database, and will soon be available at GrainGenes, the clade-oriented database for Triticeae and Avena species, including wheat, barley, rye, and oat. The code and documentation are uploaded to GitHub, and they are freely available to the public. We expect that the tool will be highly useful for other biological databases with a similar need to display SNP diversity through their web interfaces. Database URL: https://www.maizegdb.org/snpversity PMID:29688387
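
    As an illustration of the back-end idea, the sketch below reads a SNP call matrix from HDF5 and assigns a display color per grid cell. The dataset layout, file name, and color choices are hypothetical, not SNPversity's actual schema.

        # Hypothetical HDF5 layout (not SNPversity's actual schema): a 2D
        # dataset "calls" of shape (n_samples, n_sites) holding
        # 0 = reference, 1 = alternate, -1 = missing.
        import h5py
        import numpy as np

        with h5py.File("snps.h5", "r") as f:
            calls = f["calls"][:]

        colors = {0: "#2c7bb6", 1: "#d7191c", -1: "#cccccc"}  # per allelic status
        grid = np.vectorize(colors.get)(calls)                # color per grid cell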

  9. Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra

    2004-01-01

    The Spaceflight Operations Services Grid (SOSG) is focusing on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new, evolvable methods to conduct science in a distributed environment. The SOSG will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost-effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing to schools the same tools used by NASA, with the ability of the schools to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video, and telemetry processing and display to data mining, high-level processing, and visualization tools, all accessible from a single portal. In this environment, users would not require high-end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The SOSG will focus on four tools that are currently used by the ISS Payload community along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users: a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO, and an Education and Public Outreach VO. The user-based services will be implemented to replicate the operational voice, video, telemetry, and commanding systems. Once the user-based services are in place, they will be analyzed to establish the feasibility of Grid enabling; if feasible, each user-based service will be Grid enabled. The remaining non-Grid services, if not already Web enabled, will be so enabled. In the end, four portals will be developed, one for each VO. Each portal will contain the appropriate user-based services required for that VO to operate.

  10. VisIVO: A Tool for the Virtual Observatory and Grid Environment

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Comparato, M.; Costa, A.; Larsson, B.; Gheller, C.; Pasian, F.; Smareglia, R.

    2007-10-01

    We present the new features of VisIVO, software for the visualization and analysis of astrophysical data, which can be retrieved from the Virtual Observatory framework or produced by cosmological simulations; it runs on both Windows and GNU/Linux platforms. VisIVO is compliant with VO standards and supports the most important astronomical data formats such as FITS, HDF5, and VOTables. It is free software and can be downloaded from the web site http://visivo.cineca.it. VisIVO can interoperate with other VO-compliant astronomical tools through PLASTIC (PLatform for AStronomical Tool InterConnection). This feature allows VisIVO to share data with many other astronomical packages to further analyze the loaded data.

  11. User Manual for the PROTEUS Mesh Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Micheal A.; Shemon, Emily R

    2016-09-19

    PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real-world problems, experience has shown that using a commercial mesh generator to create rather simple input geometries is overly complex and slow. As a consequence, significant effort has been put into creating multiple codes that assist in mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFMESH input allows for simple assembly mesh generation, while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured grid geometry options. NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given an input mesh format acceptable to PROTEUS, we have constructed several tools which allow further mesh and geometry construction (e.g., mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and the output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and MT_RadialLattice.x codes. The former allows conversion between most mesh types handled by PROTEUS, while the latter allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that the input specific to a given mesh tool (such as .axial or .merge) can be used as "mesh" input for any of the mesh tools discussed in this manual.

  12. Visualization of Octree Adaptive Mesh Refinement (AMR) in Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Labadens, M.; Chapon, D.; Pomaréde, D.; Teyssier, R.

    2012-09-01

    Computer simulations are important in current cosmological research. These simulations run in parallel on thousands of processors and produce huge amounts of data. Adaptive mesh refinement (AMR) is used to reduce the computing cost while keeping good numerical accuracy in regions of interest. RAMSES is a cosmological code developed by the Commissariat à l'énergie atomique et aux énergies alternatives (English: Atomic Energy and Alternative Energies Commission) which uses octree adaptive mesh refinement. Compared to grid-based AMR, octree AMR has the advantage of fitting the adaptive resolution of the grid very precisely to the local problem complexity. However, this specific octree data type needs specific software to be visualized, as generic visualization tools work on Cartesian grid data. This is why the PYMSES software has also been developed by our team. It relies on the Python scripting language to ensure modular and easy access for exploring these specific data. In order to take advantage of the high-performance computers which run the RAMSES simulations, it also uses MPI and multiprocessing to run parallel code. We present our PYMSES software in more detail, with some performance benchmarks. PYMSES currently has two visualization techniques which work directly on the AMR: the first is a splatting technique, and the second is a custom ray-tracing technique. Both have their own advantages and drawbacks. We have also compared two parallel programming techniques, the Python multiprocessing library versus the use of MPI runs. The load-balancing strategy has to be smartly defined in order to achieve a good speed-up in our computations. Results obtained with this software are illustrated in the context of a massive, 9000-processor parallel simulation of a Milky Way-like galaxy.
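
    Why generic Cartesian tools cannot read such data directly is easy to see: octree leaves live at different refinement levels. The following deliberately naive Python sketch resamples leaf cells onto a uniform grid at the finest level; it only illustrates the data-model mismatch and is not PYMSES's splatting or ray-tracing kernel.

        import numpy as np

        def octree_to_uniform(leaves, level_max):
            """Resample octree AMR leaves onto a uniform grid at the finest level.

            leaves: iterable of (level, (i, j, k), value), where (i, j, k)
            indexes the cell at its own refinement level. A coarse cell at
            level l covers a 2^(level_max - l) cube of fine cells.
            """
            n = 2 ** level_max
            grid = np.zeros((n, n, n))
            for level, (i, j, k), value in leaves:
                s = 2 ** (level_max - level)      # fine cells per coarse cell
                grid[i*s:(i+1)*s, j*s:(j+1)*s, k*s:(k+1)*s] = value
            return grid

        # Two leaves: one coarse (level 1) cell and one fine (level 2) cell.
        uniform = octree_to_uniform([(1, (0, 0, 0), 1.0), (2, (2, 2, 2), 5.0)], 2)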

  13. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and the associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically, so that once the problem is set up, one can modify the surfaces and the entire set of points, curves, and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked list, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources, and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which will be discussed later.

  14. A Micro-Grid Simulator Tool (SGridSim) using Effective Node-to-Node Complex Impedance (EN2NCI) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udhay Ravishankar; Milos Manic

    2013-08-01

    This paper presents a micro-grid simulator tool useful for implementing and testing multi-agent controllers (SGridSim). As a common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents the SGridSim micro-grid simulator tool, which simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of components that typically make up a micro-grid. The term EN2NCI models means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are 1) simulation of a micro-grid is performed strictly in the complex domain; 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
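
    A hedged sketch of the complex-domain solve that such impedance models imply: build a nodal admittance matrix from node-to-node impedances and solve for node voltages. This is illustrative only; SGridSim's internals are not described beyond the abstract, and the example values are hypothetical.

        import numpy as np

        def solve_node_voltages(n_nodes, branches, injections):
            """Solve Y V = I for a network of node-to-node complex impedances.

            branches: list of (node_a, node_b, impedance), impedance complex.
            injections: complex current injection per node; node 0 is taken
            as the reference (0 V) so the reduced system is non-singular.
            """
            Y = np.zeros((n_nodes, n_nodes), dtype=complex)
            for a, b, z in branches:
                y = 1.0 / z                      # branch admittance
                Y[a, a] += y; Y[b, b] += y
                Y[a, b] -= y; Y[b, a] -= y
            V = np.zeros(n_nodes, dtype=complex)
            I = np.asarray(injections, dtype=complex)
            V[1:] = np.linalg.solve(Y[1:, 1:], I[1:])
            return V

        # Example: three nodes, two branches, 1 A injected at node 2.
        V = solve_node_voltages(3, [(0, 1, 1 + 2j), (1, 2, 0.5 + 1j)], [0, 0, 1])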

  15. A Unified Air-Sea Visualization System: Survey on Gridding Structures

    NASA Technical Reports Server (NTRS)

    Anand, Harsh; Moorhead, Robert

    1995-01-01

    The goal is to develop a Unified Air-Sea Visualization System (UASVS) to enable the rapid fusion of observational, archival, and model data for verification and analysis. To design and develop UASVS, modelers were polled to determine the gridding structures and visualization systems used, and their needs with respect to visual analysis. A basic UASVS requirement is to allow a modeler to explore multiple data sets within a single environment, or to interpolate multiple datasets onto one unified grid. From this survey, the UASVS should be able to: visualize 3D scalar/vector fields; render isosurfaces; visualize arbitrary slices of the 3D data; visualize data defined on spectral element grids with the minimum number of interpolation stages; render contours; produce 3D vector plots and streamlines; provide unified visualization of satellite images, observations, and model output overlays; display the visualization on a projection of the user's choice; implement functions so the user can derive diagnostic values; animate the data to see the time evolution; animate ocean and atmosphere at different rates; store the record of cursor movement, smooth the path, and animate a window around the moving path; repeatedly start and stop the visual time-stepping; generate VHS tape animations; work on a variety of workstations; and allow visualization across clusters of workstations and scalable high-performance computer systems.

  16. Topography Analysis and Visualization Software Supports a Guided Comparative Planetology Education Exhibit at the Smithsonian's Air and Space Museum

    NASA Technical Reports Server (NTRS)

    Roark, J. H.; Masuoka, C. M.; Frey, H. V.; Keller, J.; Williams, S.

    2005-01-01

    The Planetary Geodynamics Laboratory (http://geodynamics.gsfc.nasa.gov) of NASA's Goddard Space Flight Center designed, produced, and recently delivered a "museum-friendly" version of GRIDVIEW, a grid visualization and analysis application, to the Smithsonian's National Air and Space Museum, where it will be used in a guided comparative planetology education exhibit. The software was designed to enable museum visitors to interact with the same Earth and Mars topographic data and tools typically used by planetary scientists, and to experience the thrill of discovery while learning about the geologic differences between Earth and Mars.

  17. Visualization of Atmospheric Water Vapor Data for SAGE

    NASA Technical Reports Server (NTRS)

    Kung, Mou-Liang; Chu, W. P. (Technical Monitor)

    2000-01-01

    The goal of this project was to develop visualization tools to study water vapor dynamics using the Stratospheric Aerosol and Gas Experiment II (SAGE II) water vapor data. During the past years, we completed the development of a visualization tool called EZSAGE, as well as various gridded water vapor plots, tools deployed on the web to provide users with new insight into water vapor dynamics. Results and experiences from this project, including papers, tutorials, and reviews, were published on the main Web page. An additional publishing effort has been initiated to package the EZSAGE software for CD production and distribution. There have been some major personnel changes since Fall 1998. Dr. Mou-Liang Kung, a Professor of Computer Science, assumed the PI position vacated by Dr. Waldo Rodriguez, who was on leave. However, former PI Dr. Rodriguez continued to serve as a research adviser to this project to ensure a smooth transition and project completion. Typically, in each semester five student research assistants were hired and trained. Weekly group meetings were held to discuss problems, progress, new research directions, and activity planning. Other small group meetings were also held regularly for the different objectives of this project. All student research assistants were required to submit reports for conference submission.

  18. Considering the Spatial Layout Information of Bag of Features (BoF) Framework for Image Classification.

    PubMed

    Mu, Guangyu; Liu, Ying; Wang, Limin

    2015-01-01

    The spatial pooling method, such as spatial pyramid matching (SPM), is very crucial in the bag-of-features model used in image classification. SPM partitions the image into a set of regular grids and assumes that the spatial layout of all visual words obeys a uniform distribution over these regular grids. However, in practice, we consider that different visual words should obey different spatial layout distributions. To improve SPM, we develop a novel spatial pooling method, namely spatial distribution pooling (SDP). The proposed SDP method uses an extension of the Gaussian mixture model to estimate the spatial layout distributions of the visual vocabulary. For each visual word type, SDP can generate a set of flexible grids rather than the regular grids of traditional SPM. Furthermore, we can compute the grid weights for visual word tokens according to their spatial coordinates. The experimental results demonstrate that SDP outperforms the traditional spatial pooling methods, and is competitive with the state-of-the-art classification accuracy on several challenging image datasets.
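
    For contrast, the regular-grid baseline that SDP improves on is easy to write down. The sketch below implements plain SPM-style pooling (per-cell visual-word histograms over fixed grids), which SDP replaces with per-word learned spatial distributions; it is a generic illustration, not the paper's code.

        import numpy as np

        def spm_pool(coords, words, vocab_size, levels=(1, 2, 4)):
            """Regular-grid spatial pooling (the SPM baseline SDP improves on).

            coords: (n, 2) token positions normalized to [0, 1); words: (n,)
            integer visual-word ids. Returns concatenated per-cell histograms.
            """
            coords = np.asarray(coords, dtype=float)
            words = np.asarray(words)
            feats = []
            for g in levels:
                cells = np.floor(coords * g).astype(int).clip(0, g - 1)
                cell_id = cells[:, 0] * g + cells[:, 1]
                hist = np.zeros((g * g, vocab_size))
                np.add.at(hist, (cell_id, words), 1.0)  # accumulate token counts
                feats.append(hist.ravel())
            return np.concatenate(feats)

        # Toy example: five tokens from a ten-word vocabulary.
        rng = np.random.default_rng(1)
        f = spm_pool(rng.random((5, 2)), rng.integers(0, 10, 5), vocab_size=10)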

  19. ProfileGrids: a sequence alignment visualization paradigm that avoids the limitations of Sequence Logos

    PubMed Central

    2014-01-01

    Background The 2013 BioVis Contest provided an opportunity to evaluate different paradigms for visualizing protein multiple sequence alignments. Such data sets are becoming extremely large and thus taxing current visualization paradigms. Sequence Logos represent consensus sequences but have limitations for protein alignments. As an alternative, ProfileGrids are a new protein sequence alignment visualization paradigm that represents an alignment as a color-coded matrix of the residue frequency occurring at every homologous position in the aligned protein family. Results The JProfileGrid software program was used to analyze the BioVis contest data sets to generate figures for comparison with the Sequence Logo reference images. Conclusions The ProfileGrid representation allows for the clear and effective analysis of protein multiple sequence alignments. This includes both a general overview of the conservation and diversity sequence patterns as well as the interactive ability to query the details of the protein residue distributions in the alignment. The JProfileGrid software is free and available from http://www.ProfileGrid.org. PMID:25237393

  20. A data grid for imaging-based clinical trials

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Chao, Sander S.; Lee, Jasper; Liu, Brent; Documet, Jorge; Huang, H. K.

    2007-03-01

    Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis through visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) a well-defined, rigorous clinical trial protocol; 2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administer and quickly distribute information to participating radiologists/clinicians worldwide. The Data Grid can satisfy the aforementioned requirements of imaging-based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.

  1. [Grid laser photocoagulation in diffuse diabetic macular edema].

    PubMed

    Degenring, Robert F; Hugger, Philipp; Sauder, Gangolf; Jonas, Jost B

    2004-01-01

    To evaluate the clinical outcome of macular grid laser photocoagulation in the treatment of diffuse diabetic macular oedema. The retrospective study included 30 consecutive patients (41 eyes) who were treated by macular argon green grid laser photocoagulation for diffuse diabetic macular oedema. Follow-up time was 31.4 +/- 19.6 weeks. Visual acuity decreased from 0.25 +/- 0.18 (range, 0.03 - 0.8) to 0.20 +/- 0.18 (range, 0.02 - 0.8) (P = 0.045), representing a change of -0.9 +/- 2.32 lines. 5 (12.2%) eyes gained in visual acuity, visual acuity remained unchanged for 23 (56.1%) eyes, and 13 (31.7%) eyes showed a visual loss of more than one line. In eyes with a baseline visual acuity > or = 0.2 (N = 24), visual acuity dropped from 0.36 +/- 0.15 (0.2 - 0.8; median 0.3) to 0.29 +/- 0.19 (0.05 - 0.8; median 0.2) (P = 0.038). 3 eyes (12.5%) gained > or = 2 lines, 11 eyes (45.8%) lost > or = 2 lines, and 10 eyes (41.7%) remained stable. Mean loss was -1.63 +/- 2.53 lines. Eyes with a baseline visual acuity < or = 0.2 did not change significantly. In the present study, mean visual acuity decreased in the whole population, and especially in the subgroup with a baseline visual acuity of > or = 0.2, after macular grid laser photocoagulation for diffuse diabetic macular oedema. Mean visual loss was just below the predefined 2 lines. In view of these results and upcoming new pharmacological and surgical treatment modalities, the significance of grid laser photocoagulation should be re-discussed.

  2. Earth Observation oriented teaching materials development based on OGC Web services and Bashyt generated reports

    NASA Astrophysics Data System (ADS)

    Stefanut, T.; Gorgan, D.; Giuliani, G.; Cau, P.

    2012-04-01

    Creating e-Learning materials in the Earth Observation domain is a difficult task, especially for non-technical specialists who have to deal with distributed repositories, large amounts of information, and intensive processing requirements. Furthermore, due to the lack of specialized applications for developing teaching resources, technical knowledge is also required for defining data presentation structures or for developing and customizing user interaction techniques for better teaching results. As a response to these issues, during the GiSHEO FP7 project [1] and later in the EnviroGRIDS FP7 project [2], we developed the eGLE e-Learning Platform [3], a tool-based application that provides dedicated functionalities to Earth Observation specialists for developing teaching materials. The proposed architecture is built around a client-server design that provides the core functionalities (e.g., user management, tool integration, teaching material settings) and has been extended with a distributed component implemented through the tools that are integrated into the platform, as described further. Our approach to dealing with multiple transfer protocol types, heterogeneous data formats, and various user interaction techniques involves the development and integration of very specialized elements (tools) that can be customized by the trainers in a visual manner through simple user interfaces. In our concept, each tool is dedicated to a specific data type and implements optimized mechanisms for searching, retrieving, visualizing, and interacting with it. At the same time, any number of tools can be integrated into each learning resource through drag-and-drop interaction, allowing the teacher to retrieve pieces of data of various types (e.g., images, charts, tables, text, videos) from different sources (e.g., OGC web services, charts created through the Bashyt application) through different protocols (e.g., WMS, the BASHYT API, FTP, HTTP) and to display them all together in a unitary manner using the same visual structure [4]. Addressing the high-performance computing requirements met while processing environmental data, our platform can be easily extended through tools that connect to Grid infrastructures, WCS web services, the Bashyt API (for creating specialized hydrological reports), or any other specialized services (e.g., graphics cluster visualization) that can be reached over the Internet. At run time, each tool is launched on the trainee's computer in an asynchronous running mode and connects to the data source established by the teacher, retrieving and displaying the information to the user. The data transfer is accomplished directly between the trainee's computer and the corresponding services (e.g., OGC, Bashyt API) without passing through the core server platform. In this manner, the eGLE application can provide better and more responsive connections to a large number of users.
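
    The OGC side of this data retrieval is standardized, so a sketch of the WMS GetMap request a tool of this kind issues behind the scenes is straightforward; the endpoint and layer name below are hypothetical.

        # Sketch of a standard OGC WMS GetMap request such a tool issues
        # behind the scenes; the endpoint and layer name are hypothetical.
        from urllib.parse import urlencode

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "danube:land_cover", "styles": "",
            "srs": "EPSG:4326", "bbox": "22.0,43.0,30.0,48.0",
            "width": 800, "height": 500, "format": "image/png",
        }
        url = "https://example.org/geoserver/wms?" + urlencode(params)
        # Fetching `url` (e.g. with urllib.request) returns a PNG map image.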

  3. NCAR's Research Data Archive: OPeNDAP Access for Complex Datasets

    NASA Astrophysics Data System (ADS)

    Dattore, R.; Worley, S. J.

    2014-12-01

    Many datasets have complex structures, including hundreds of parameters and numerous vertical levels, grid resolutions, and temporal products. Making these data accessible is a challenge for a data provider. OPeNDAP is a powerful protocol for delivering, in real time, multi-file datasets that can be ingested by many analysis and visualization tools, but for these datasets there are too many choices about how to aggregate. Simple aggregation schemes can fail to support, or at least make very challenging, many potential studies based on complex datasets. We address this issue by using a rich file-content metadata collection to create a real-time, customized OPeNDAP service that matches the full suite of access possibilities for complex datasets. The Climate Forecast System Reanalysis (CFSR) and its extension, the Climate Forecast System Version 2 (CFSv2), produced by the National Centers for Environmental Prediction (NCEP) and hosted by the Research Data Archive (RDA) at the Computational and Information Systems Laboratory (CISL) at NCAR, are examples of complex datasets that are difficult to aggregate with existing data server software. CFSR and CFSv2 contain 141 distinct parameters on 152 vertical levels, six grid resolutions, and 36 products (analyses, n-hour forecasts, multi-hour averages, etc.), where not all parameter/level combinations are available at all grid resolution/product combinations. These data are archived in the RDA with the data structure provided by the producer; no additional reorganization or aggregation has been applied. Since 2011, users have been able to request customized subsets (e.g., temporal, parameter, spatial) of CFSR/CFSv2, which are processed in delayed mode and then downloaded to a user's system. Until now, the complexity has made it difficult to provide real-time OPeNDAP access to the data. We have developed a service that leverages the already-existing subsetting interface and allows users to create a virtual dataset with its own structure (das, dds). The user receives a URL to the customized dataset that can be used by existing tools to ingest, analyze, and visualize the data. This presentation will detail the metadata system and OPeNDAP server that enable user-customized real-time access and show an example of how a visualization tool can access the data.
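
    From the user's side, such a virtual dataset behaves like any OPeNDAP endpoint. A sketch of client access with xarray, with a hypothetical URL and variable name:

        # The URL and variable name are hypothetical; any OPeNDAP-aware client
        # (xarray shown here) can open the customized virtual dataset this way.
        import xarray as xr

        ds = xr.open_dataset("https://rda.example.edu/opendap/my-custom-cfsr")
        subset = ds["TMP_L100"].sel(time=slice("1990-01", "1990-12"))  # lazy
        monthly_mean = subset.mean(dim="time").load()  # data fetched only here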

  4. Towards a Comprehensive Computational Simulation System for Turbomachinery

    NASA Technical Reports Server (NTRS)

    Shih, Ming-Hsin

    1994-01-01

    The objective of this work is to develop algorithms associated with a comprehensive computational simulation system for turbomachinery flow fields. This development is accomplished in a modular fashion. These modules include grid generation, visualization, network, simulation, toolbox, and flow modules. An interactive grid generation module is customized to facilitate the grid generation process associated with complicated turbomachinery configurations. With its user-friendly graphical user interface, the user may interactively manipulate the default settings to obtain a quality grid within a fraction of the time usually required for building a grid about the same geometry with a general-purpose grid generation code. Non-Uniform Rational B-Spline formulations are utilized in the algorithm to maintain geometry fidelity while redistributing grid points on the solid surfaces. The Bezier curve formulation is used to allow interactive construction of inner boundaries, and it is also utilized to allow interactive point distribution. Cascade surfaces are transformed from three-dimensional surfaces of revolution into two-dimensional parametric planes for easy manipulation. Such a transformation allows these manipulated plane grids to be mapped to surfaces of revolution by any generatrix definition. A sophisticated visualization module is developed to allow visualization of both grids and flow solutions, steady or unsteady. A network module is built to allow data transfer in a heterogeneous environment. A flow module is integrated into this system, using an existing turbomachinery flow code. A simulation module is developed to combine the network, flow, and visualization modules to achieve near-real-time flow simulation about turbomachinery geometries. A toolbox module is developed to support the overall task. A batch version of the grid generation module is developed to allow portability and has been extended to allow dynamic grid generation for pitch-changing turbomachinery configurations. Various applications with different characteristics are presented to demonstrate the success of this system.
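
    The Bezier machinery behind this kind of interactive curve construction can be summarized in a few lines. A generic de Casteljau evaluation sketch, not the module's actual code:

        import numpy as np

        def de_casteljau(control_points, t):
            """Evaluate a Bezier curve at parameter t by repeated interpolation.

            The same recurrence underlies interactive curve construction:
            moving a control point re-evaluates the curve in real time.
            """
            pts = np.asarray(control_points, dtype=float)
            while len(pts) > 1:
                pts = (1.0 - t) * pts[:-1] + t * pts[1:]
            return pts[0]

        # Cubic Bezier sampled at its midpoint.
        mid = de_casteljau([(0, 0), (1, 2), (3, 2), (4, 0)], 0.5)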

  5. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about: (1) a framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP); (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2; (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, script library); and (4) future work.

  6. Knowledge Discovery for Smart Grid Operation, Control, and Situation Awareness -- A Big Data Visualization Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, Yi; Jiang, Huaiguang; Zhang, Yingchen

    In this paper, a big data visualization platform is designed to discover the hidden useful knowledge for smart grid (SG) operation, control, and situation awareness. The proliferation of smart sensors at both the grid side and the customer side can provide large volumes of heterogeneous data that collect information across all time spectrums. Extracting useful knowledge from this big-data pool is still challenging. In this paper, Apache Spark, an open-source cluster computing framework, is used to process the big data to effectively discover the hidden knowledge. A high-speed communication architecture utilizing the Open Systems Interconnection (OSI) model is designed to transmit the data to a visualization platform. This visualization platform uses Google Earth, a global geographic information system (GIS), to link the geological information with the SG knowledge and visualize the information in a user-defined fashion. The University of Denver's campus grid is used as an SG test bench and several demonstrations are presented for the proposed platform.
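
    A minimal sketch of the kind of Spark aggregation such a platform runs (PySpark; the file path, column names, and window size are hypothetical, not the paper's pipeline):

        # Minimal PySpark sketch; the CSV path and schema are hypothetical.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("sg-knowledge").getOrCreate()
        readings = spark.read.csv("pmu_readings.csv", header=True,
                                  inferSchema=True)

        # Per-substation voltage statistics over one-minute windows.
        stats = (readings
                 .groupBy("substation", F.window("timestamp", "1 minute"))
                 .agg(F.avg("voltage").alias("v_avg"),
                      F.stddev("voltage").alias("v_std")))
        stats.write.json("voltage_stats")  # hand off to the visualization layer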

  7. AWE: Aviation Weather Data Visualization Environment

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly; Lodha, Suresh K.

    2000-01-01

    The two official sources for aviation weather reports both provide weather information to a pilot in a textual format. A number of systems have recently become available to help pilots with the visualization task by providing much of the data graphically. However, two types of aviation weather data are still not being presented graphically. These are airport-specific current weather reports (known as meteorological observations, or METARs) and forecast weather reports (known as terminal area forecasts, or TAFs). Our system, Aviation Weather Environment (AWE), presents intuitive graphical displays for both METARs and TAFs, as well as winds aloft forecasts. We start with a computer-generated textual aviation weather briefing. We map this briefing onto a cartographic grid specific to the pilot's area of interest. The pilot is able to obtain aviation-specific weather for the entire area or for his specific route. The route, altitude, true airspeed, and proposed departure time can each be modified in AWE. Integral visual display of these three elements of weather reports makes AWE a useful planning tool, as well as a weather briefing tool.

  8. An Adaptive Flow Solver for Air-Borne Vehicles Undergoing Time-Dependent Motions/Deformations

    NASA Technical Reports Server (NTRS)

    Singh, Jatinder; Taylor, Stephen

    1997-01-01

    This report describes a concurrent Euler flow solver for flows around complex 3-D bodies. The solver is based on a cell-centered finite volume methodology on 3-D unstructured tetrahedral grids. In this algorithm, spatial discretization of the inviscid convective term is accomplished using an upwind scheme. A localized, second-order-accurate reconstruction is done for the flow variables. Evolution in time is accomplished using an explicit three-stage Runge-Kutta method with second-order temporal accuracy. This is adapted for concurrent execution using another proven methodology based on concurrent graph abstraction. The solver operates on heterogeneous network architectures, which may include a broad variety of UNIX workstations and PCs running Windows NT, symmetric multiprocessors, and distributed-memory multicomputers. The unstructured grid is generated using commercial grid generation tools. The grid is automatically partitioned using a concurrent algorithm based on heat diffusion, which results in memory requirements that are inversely proportional to the number of processors. The solver uses automatic granularity control and resource management techniques, again based on heat diffusion, both to balance load and communication requirements and to deal with differing memory constraints. Results are subsequently combined for visualization and analysis using commercial CFD tools. Flow simulation results are demonstrated for a constant-section wing at subsonic, transonic, and supersonic conditions. These results are compared with experimental data and the numerical results of other researchers. Performance results are under way for a variety of network topologies.

  9. Unit Planning Grids for Visual Arts--Grade 9-12 Advanced.

    ERIC Educational Resources Information Center

    Delaware State Dept. of Education, Dover.

    This planning grid for teaching visual arts (advanced) in grades 9-12 in Delaware outlines the following six standards for students to complete: (1) students will select and use form, media, techniques, and processes to create works of art and communicate meaning; (2) students will create ways to use visual, spatial, and temporal concepts in…

  10. Pycortex: an interactive surface visualizer for fMRI

    PubMed Central

    Gao, James S.; Huth, Alexander G.; Lescroart, Mark D.; Gallant, Jack L.

    2015-01-01

    Surface visualizations of fMRI provide a comprehensive view of cortical activity. However, surface visualizations are difficult to generate, and most common visualization techniques rely on unnecessary interpolation, which limits the fidelity of the resulting maps. Furthermore, it is difficult to understand the relationship between flattened cortical surfaces and the underlying 3D anatomy using currently available tools. To address these problems we have developed pycortex, a Python toolbox for interactive surface mapping and visualization. Pycortex exploits the power of modern graphics cards to sample volumetric data on a per-pixel basis, allowing dense and accurate mapping of the voxel grid across the surface. Anatomical and functional information can be projected onto the cortical surface. The surface can be inflated and flattened interactively, aiding interpretation of the correspondence between the anatomical surface and the flattened cortical sheet. The output of pycortex can be viewed using WebGL, a technology compatible with modern web browsers. This allows complex fMRI surface maps to be distributed broadly online without requiring installation of complex software. PMID:26483666
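
    Pycortex performs its per-pixel sampling on the GPU; purely to illustrate the underlying idea, the sketch below does coarse nearest-voxel sampling of a functional volume at surface vertex positions through a NIfTI-style affine. All names and data here are invented, and this is not pycortex's API.

        import numpy as np

        def sample_volume_on_surface(volume, vertices, affine):
            """Nearest-voxel sampling of a functional volume at surface vertices.
            `affine` maps voxel indices to scanner space, as in a NIfTI header."""
            inv = np.linalg.inv(affine)
            homo = np.c_[vertices, np.ones(len(vertices))]      # Nx4 homogeneous coords
            ijk = np.rint(homo @ inv.T)[:, :3].astype(int)      # voxel indices
            ijk = np.clip(ijk, 0, np.array(volume.shape) - 1)   # stay inside the grid
            return volume[ijk[:, 0], ijk[:, 1], ijk[:, 2]]

        # Toy data: random volume, identity affine, three vertices
        vol = np.random.rand(32, 32, 32)
        verts = np.array([[10.2, 4.9, 7.1], [0.0, 0.0, 0.0], [31.0, 31.0, 31.0]])
        print(sample_volume_on_surface(vol, verts, np.eye(4)))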

  11. Wide-area, real-time monitoring and visualization system

    DOEpatents

    Budhraja, Vikram S.; Dyer, James D.; Martinez Morales, Carlos A.

    2013-03-19

    A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a database, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.

  12. Wide-area, real-time monitoring and visualization system

    DOEpatents

    Budhraja, Vikram S [Los Angeles, CA; Dyer, James D [La Mirada, CA; Martinez Morales, Carlos A [Upland, CA

    2011-11-15

    A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a database, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.

  13. The GeoClaw software for depth-averaged flows with adaptive refinement

    USGS Publications Warehouse

    Berger, M.J.; George, D.L.; LeVeque, R.J.; Mandli, Kyle T.

    2011-01-01

    Many geophysical flow or wave propagation problems can be modeled with two-dimensional depth-averaged equations, of which the shallow water equations are the simplest example. We describe the GeoClaw software that has been designed to solve problems of this nature, consisting of open source Fortran programs together with Python tools for the user interface and flow visualization. This software uses high-resolution shock-capturing finite volume methods on logically rectangular grids, including latitude-longitude grids on the sphere. Dry states are handled automatically to model inundation. The code incorporates adaptive mesh refinement to allow the efficient solution of large-scale geophysical problems. Examples are given illustrating its use for modeling tsunamis and dam-break flooding problems. Documentation and download information are available at www.clawpack.org/geoclaw.
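
    For readers unfamiliar with the governing equations, the sketch below advances the 1-D shallow water equations with a simple finite volume scheme. It uses the very diffusive Lax-Friedrichs flux for brevity; GeoClaw itself uses high-resolution wave-propagation (Riemann) solvers, so this is only a conceptual stand-in.

        import numpy as np

        g = 9.81

        def flux(q):
            """Physical flux of the 1-D shallow water equations, q = [h, hu]."""
            h, hu = q
            return np.array([hu, hu**2 / np.where(h > 0, h, 1.0) + 0.5 * g * h**2])

        def lax_friedrichs_step(q, dx, dt):
            """One finite volume step with the Lax-Friedrichs interface flux."""
            qm, qp = np.roll(q, 1, axis=1), np.roll(q, -1, axis=1)
            f_left = 0.5 * (flux(qm) + flux(q)) - 0.5 * (dx / dt) * (q - qm)
            f_right = 0.5 * (flux(q) + flux(qp)) - 0.5 * (dx / dt) * (qp - q)
            return q - (dt / dx) * (f_right - f_left)

        # Dam-break initial condition on a periodic toy grid
        x = np.linspace(0, 1, 200)
        q = np.vstack([np.where(x < 0.5, 2.0, 1.0), np.zeros_like(x)])
        for _ in range(50):
            q = lax_friedrichs_step(q, dx=x[1] - x[0], dt=0.0005)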

  14. Enabling online studies of conceptual relationships between medical terms: developing an efficient web platform.

    PubMed

    Albin, Aaron; Ji, Xiaonan; Borlawsky, Tara B; Ye, Zhan; Lin, Simon; Payne, Philip Ro; Huang, Kun; Xiang, Yang

    2014-10-07

    The Unified Medical Language System (UMLS) contains many important ontologies in which terms are connected by semantic relations. For many studies on the relationships between biomedical concepts, the use of transitively associated information from ontologies and the UMLS has been shown to be effective. Although there are a few tools and methods available for extracting transitive relationships from the UMLS, they usually have major restrictions on the length of transitive relations or on the number of data sources. Our goal was to design an online platform that enables efficient studies of the conceptual relationships between any medical terms. To overcome the restrictions of available methods and to facilitate such studies, we developed a Web platform, onGrid, that supports efficient transitive queries and conceptual relationship studies using the UMLS. This framework uses the latest techniques for converting natural language queries into UMLS concepts, performs efficient transitive queries, and visualizes the resulting paths. It also dynamically builds a relationship matrix for two sets of input biomedical terms, enabling effective studies of conceptual relationships between medical terms based on their relationship matrix. The advantage of onGrid is that it can be applied to study the relations between any two sets of biomedical concepts, as well as the relations within one set of biomedical concepts. We used onGrid to study the disease-disease relationships in Online Mendelian Inheritance in Man (OMIM). By cross-validating our results with an external database, the Comparative Toxicogenomics Database (CTD), we demonstrated that onGrid is effective for the study of conceptual relationships between medical terms. onGrid is an efficient tool for querying the UMLS for transitive relations, studying the relationships between medical terms, and generating hypotheses.
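
    The core operation, finding transitive relation paths between two concepts, can be sketched as a bounded breadth-first search. The toy graph and concept IDs below are hypothetical stand-ins for UMLS relations, not onGrid's actual algorithm or data.

        from collections import deque

        def transitive_paths(graph, source, target, max_len=4):
            """Breadth-first search for transitive relation paths between two
            concepts, bounded in length. `graph` maps a concept ID to the set
            of concept IDs it is directly related to."""
            queue, paths = deque([[source]]), []
            while queue:
                path = queue.popleft()
                if path[-1] == target and len(path) > 1:
                    paths.append(path)
                    continue
                if len(path) <= max_len:
                    for nxt in graph.get(path[-1], ()):
                        if nxt not in path:          # avoid cycles
                            queue.append(path + [nxt])
            return paths

        # Toy ontology fragment (hypothetical concept IDs)
        g = {"C1": {"C2", "C3"}, "C2": {"C4"}, "C3": {"C4"}, "C4": {"C5"}}
        print(transitive_paths(g, "C1", "C5"))   # two 3-hop paths, via C2 and via C3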

  15. ARC SDK: A toolbox for distributed computing and data applications

    NASA Astrophysics Data System (ADS)

    Skou Andersen, M.; Cameron, D.; Lindemann, J.

    2014-06-01

    Grid middleware suites provide tools to perform the basic tasks of job submission and retrieval and data access; however, these tools tend to be low-level, operating on individual jobs or files and lacking higher-level concepts. User communities therefore generally develop their own application-layer software, catering to their specific communities' needs, on top of the Grid middleware. It is thus important for the Grid middleware to provide a friendly, well documented and simple-to-use interface for the applications to build upon. The Advanced Resource Connector (ARC), developed by NorduGrid, provides a Software Development Kit (SDK) which enables applications to use the middleware for job and data management. This paper presents the architecture and functionality of the ARC SDK along with an example graphical application developed with the SDK. The SDK consists of a set of libraries accessible through Application Programming Interfaces (APIs) in several languages. It contains extensive documentation and example code and is available on multiple platforms. The libraries provide generic interfaces and rely on plugins to support a given technology or protocol, and this modular design makes it easy to add a new plugin if the application requires support for additional technologies. The ARC Graphical Clients package is a graphical user interface built on top of the ARC SDK and the Qt toolkit, and it is presented here as a fully functional example of an application. It provides a graphical interface to enable job submission and management at the click of a button, and allows data on any Grid storage system to be manipulated using a visual file system hierarchy, as if it were a regular file system.

  16. Visualization tool for the world ocean surface currents

    NASA Astrophysics Data System (ADS)

    Kasyanov, S.; Nikitin, O.

    2003-04-01

    Fortran-based software for visualizing the world ocean surface currents, running on the Windows platform (95 and higher), has been developed. The software works with the global interpolated drifting buoy data set (1979-2002) from the WOCE Surface Velocity Program and the global bottom relief five-minute resolution data set (ETOPO5). These data sets, loaded in binary form into the operative memory of a PC (256 MB or more), together with the software compose the world ocean surface currents visualization tool. The tool allows researchers to process data on-line in any region of the world ocean, display data in different visualization forms, calculate current velocity statistics, and save chosen images as graphic files. It provides displays of buoy movement (animation), maps of buoy trajectories, averaged (by prescribed time and space grid intervals) current vector and modulus fields, fields of current mean and eddy kinetic energies and their ratio, current steadiness coefficient, and sea surface temperature. Any trajectory may be selected simply by clicking it on any summary map of trajectories (or by a given buoy number). It may then be viewed and analyzed in detail, while graphs of velocity (components, modulus and vector) and water temperature variations along this trajectory may be displayed. The description of the previous version of the tool and some screen shots are available at http://zhurnal.ape.relarn.ru/articles/2001/154.pdf (in Russian) and will be available (in English) at http://csit.ugatu.ac.ru (CSIT '2001, Proceedings, v.2, p. 32-41, Nikitin O.P. et al).
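
    As a rough sketch of the gridded statistics the tool computes, the following bins drifter velocity samples into one-degree boxes and derives mean and eddy kinetic energy per box. The data, box size, and function names are illustrative assumptions, not the tool's Fortran internals.

        import numpy as np

        def eddy_statistics(lon, lat, u, v, grid_step=1.0):
            """Bin drifter velocity samples into grid_step-degree boxes and
            return per-box mean kinetic energy of the time-mean flow,
            MKE = (ubar^2 + vbar^2)/2, and eddy kinetic energy,
            EKE = (var(u) + var(v))/2."""
            ix = np.floor(lon / grid_step).astype(int)
            iy = np.floor(lat / grid_step).astype(int)
            mke, eke = {}, {}
            for key in set(zip(ix, iy)):
                m = (ix == key[0]) & (iy == key[1])
                ubar, vbar = u[m].mean(), v[m].mean()
                mke[key] = 0.5 * (ubar**2 + vbar**2)
                eke[key] = 0.5 * (u[m].var() + v[m].var())
            return mke, eke

        # Toy sample: 1000 drifter fixes with a mean eastward current plus noise
        rng = np.random.default_rng(0)
        lon, lat = rng.uniform(0, 5, 1000), rng.uniform(0, 5, 1000)
        u = 0.3 + 0.1 * rng.standard_normal(1000)
        v = 0.1 * rng.standard_normal(1000)
        mke, eke = eddy_statistics(lon, lat, u, v)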

  17. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for infrastructure to remotely browse, visualize, and analyze the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
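
    Because the server is OGC-compliant, any WMS client can request rendered maps. A minimal sketch of a WMS 1.1.1 GetMap request follows; the endpoint URL and layer name are hypothetical, while the request parameters themselves are standard OGC.

        import requests

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "tops:gpp",              # hypothetical layer name
            "styles": "",
            "srs": "EPSG:4326",
            "bbox": "-125,32,-114,42",         # lon/lat box over California
            "width": 512, "height": 512,
            "format": "image/png",
        }
        # Hypothetical endpoint; any OGC WMS server accepts this parameter set
        r = requests.get("https://example.nasa.gov/tops/wms", params=params, timeout=30)
        with open("gpp.png", "wb") as f:
            f.write(r.content)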

  18. Grid Stability Awareness System (GSAS) Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuerborn, Scott; Ma, Jian; Black, Clifton

    The project team developed a software suite named Grid Stability Awareness System (GSAS) for power system near-real-time stability monitoring and analysis based on synchrophasor measurements. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with both real-time or near-real-time stability status of a power grid and historical information about system stability status. These tools are being considered for real-time use in the operation environment.
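
    As an illustration of what an oscillation monitoring tool does at its core, the sketch below picks the dominant spectral peak of a synchrophasor trace in the inter-area band via an FFT. The band limits, reporting rate, and toy signal are assumptions, not GSAS internals.

        import numpy as np

        def dominant_oscillation(signal, fs, band=(0.1, 1.0)):
            """Return (frequency, amplitude) of the strongest spectral peak
            within `band` Hz, the range where inter-area oscillations
            typically live. fs is the PMU reporting rate (e.g. 30 Hz)."""
            sig = signal - signal.mean()
            spec = np.abs(np.fft.rfft(sig)) / len(sig)
            freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
            m = (freqs >= band[0]) & (freqs <= band[1])
            k = np.argmax(spec[m])
            return freqs[m][k], 2 * spec[m][k]

        # Toy PMU trace: a 0.3 Hz inter-area mode buried in measurement noise
        fs = 30.0
        t = np.arange(0, 60, 1 / fs)
        trace = 0.02 * np.sin(2 * np.pi * 0.3 * t) + 0.005 * np.random.randn(t.size)
        print(dominant_oscillation(trace, fs))   # approximately (0.3, 0.02)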

  19. Enabling Efficient Intelligence Analysis in Degraded Environments

    DTIC Science & Technology

    2013-06-01

    … evolution analysis; a Magnets Grid widget for multidimensional information exploration; and a record browser of Visual Summary Cards widget for fast visual identification of … attention and inattentional blindness. It also explores and develops various techniques to represent information in a salient way and provide efficient …

  20. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs to calculations, and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that utilize ontological descriptions of SHA software and data to validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data, including GMT-based map making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern program. The UseIT Intern Program provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.

  1. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets, such as those coming from precipitation satellites, viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit with the data-order interface. A version of this tool also focuses on high-impact weather events; it enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We have also used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interacting with the data.
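
    One convenient way to feed such data to CesiumJS from a processing pipeline is to emit CZML, the JSON format Cesium loads natively. The sketch below writes a minimal CZML document of precipitation point samples; the IDs, values, and color ramp are invented for illustration and are not the PPS/STORM schema.

        import json

        # (lon, lat, height_m, rain_rate_mm_hr) -- hypothetical samples
        samples = [(-80.2, 25.8, 12000.0, 8.5), (-80.0, 26.1, 10000.0, 3.2)]

        packets = [{"id": "document", "name": "gpm-demo", "version": "1.0"}]
        for i, (lon, lat, height_m, rate) in enumerate(samples):
            packets.append({
                "id": f"bin-{i}",
                "position": {"cartographicDegrees": [lon, lat, height_m]},
                "point": {
                    "pixelSize": 10,
                    # crude ramp: heavier rain -> more opaque red
                    "color": {"rgba": [255, 0, 0, min(255, int(rate * 25))]},
                },
            })

        with open("gpm_demo.czml", "w") as f:
            json.dump(packets, f, indent=2)
        # Load in CesiumJS with:
        # viewer.dataSources.add(Cesium.CzmlDataSource.load('gpm_demo.czml'))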

  2. Navigation in Grid Space with the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a navigational tool for computational grids. The navigational process is based on measuring the grid characteristics with the NAS Grid Benchmarks (NGB) and using the measurements to assign tasks of a grid application to the grid machines. The tool allows the user to explore the grid space and to navigate the execution of a grid application to minimize its turnaround time. We introduce the notion of gridscape as a user view of the grid and show how it can be measured by NGB. Then we demonstrate how the gridscape can be used with two different schedulers to navigate a grid application through a rudimentary grid.
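
    To illustrate how benchmark-derived measurements can drive task placement, here is a greedy list-scheduling sketch that puts each task on the machine that would finish it earliest, given measured relative speeds. It is a generic stand-in with hypothetical host names and costs, not either of the paper's schedulers.

        def assign_tasks(task_costs, machines):
            """Greedy list scheduling: largest tasks first, each placed on the
            machine that finishes it earliest given per-machine speeds
            (e.g. measured by a benchmark run)."""
            finish = {m: 0.0 for m in machines}          # current load per machine
            plan = {}
            for task, work in sorted(task_costs.items(), key=lambda kv: -kv[1]):
                best = min(machines, key=lambda m: finish[m] + work / machines[m])
                finish[best] += work / machines[best]
                plan[task] = best
            return plan, max(finish.values())            # schedule, turnaround time

        # Hypothetical gridscape: relative speeds from benchmark measurements
        machines = {"hostA": 1.0, "hostB": 2.5, "hostC": 0.8}
        tasks = {"BT": 100.0, "SP": 60.0, "LU": 80.0, "MG": 20.0}
        print(assign_tasks(tasks, machines))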

  3. Correlation of a scanning laser derived oedema index and visual function following grid laser treatment for diabetic macular oedema.

    PubMed

    Hudson, C; Flanagan, J G; Turner, G S; Chen, H C; Young, L B; McLeod, D

    2003-04-01

    To correlate change of an oedema index derived by scanning laser tomography with change of visual function in patients undergoing grid laser photocoagulation for clinically significant diabetic macular oedema (DMO). The sample comprised 24 diabetic patients with retinal thickening within 500 µm of the fovea. Inclusion criteria included a logMAR visual acuity of 0.25 or better. Patients were assessed twice before a single session of grid laser treatment and within 1 week of, and at 1, 2, 4, and 12 weeks after, treatment. At each visit, patients underwent logMAR visual acuity testing, conventional and short wavelength automated perimetry (SWAP), and scanning laser tomography. Each visual function parameter was correlated with the mean oedema index, defined as the z-profile signal width divided by the maximum reflectance intensity (arbitrary units). A Pearson correlation coefficient (Bonferroni corrected) was computed for the data set of each patient. Thirteen patients exhibited significant correlation of the mean oedema index and at least one measure of visual function for the 10 degrees x 10 degrees scan field, while 10 patients showed correlation for the 20 degrees x 20 degrees scan field. Seven patients demonstrated correlation for both scan fields. Laser photocoagulation typically resulted in an immediate loss of perimetric sensitivity, whereas the oedema index changed over a period of weeks. Localised oedema did not impact upon visual acuity or letter contrast sensitivity when situated extrafoveally. Correlation of change of the oedema index and of visual function following grid laser photocoagulation was not found in all patients. An absence of correlation can be explained by the localised distribution of DMO in this sample of patients, as well as by differences in the time course of change of the oedema index and visual function. The study has objectively documented change in the magnitude and distribution of DMO following grid laser treatment and has established the relation of this change to the change in visual function.

  4. The Climate-G testbed: towards a large scale data sharing environment for climate change

    NASA Astrophysics Data System (ADS)

    Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.

    2009-04-01

    The Climate-G testbed provides an experimental large scale data environment for climate change, addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists, and it currently involves several partners: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both P2P and grid technologies (the GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, the Live Access Server, and the OGC-compliant Web Map Service, and they can be downloaded, visualized, and accessed through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2 TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: a wide data access/sharing environment for climate change; a P2P/grid metadata approach; a production-level Climate-G DDC; high quality tools for data visualization; metadata search/discovery across several countries and institutions; and an open environment for climate change data sharing.
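
    A typical client-side use of the OPeNDAP services looks like the sketch below, which lazily opens a remote dataset and subsets it before any data cross the network. The endpoint URL and variable name are hypothetical placeholders.

        import xarray as xr

        # Hypothetical OPeNDAP endpoint; the access pattern is what
        # OPeNDAP/THREDDS services enable
        url = "https://example.climate-g.eu/thredds/dodsC/ensembles/tas_monthly.nc"
        ds = xr.open_dataset(url)                  # lazy: nothing downloaded yet

        # Subset first -- only the requested slab is transferred
        tas_europe = ds["tas"].sel(lat=slice(35, 70), lon=slice(-10, 40))
        climatology = tas_europe.groupby("time.month").mean("time")
        climatology.to_netcdf("tas_climatology.nc")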

  5. First responder tracking and visualization for command and control toolkit

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Petrov, Plamen; Meisinger, Roger

    2010-04-01

    In order for First Responder Command and Control personnel to visualize incidents at urban building locations, DHS sponsored a small business research program to develop a tool to visualize 3D building interiors and movement of First Responders on site. 21st Century Systems, Inc. (21CSI), has developed a toolkit called Hierarchical Grid Referenced Normalized Display (HiGRND). HiGRND utilizes three components to provide a full spectrum of visualization tools to the First Responder. First, HiGRND visualizes the structure in 3D. Utilities in the 3D environment allow the user to switch between views (2D floor plans, 3D spatial, evacuation routes, etc.) and manually edit fast changing environments. HiGRND accepts CAD drawings and 3D digital objects and renders these in the 3D space. Second, HiGRND has a First Responder tracker that uses the transponder signals from First Responders to locate them in the virtual space. We use the movements of the First Responder to map the interior of structures. Finally, HiGRND can turn 2D blueprints into 3D objects. The 3D extruder extracts walls, symbols, and text from scanned blueprints to create the 3D mesh of the building. HiGRND increases the situational awareness of First Responders and allows them to make better, faster decisions in critical urban situations.

  6. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; databases of projects (model validation studies), historic events, and power plants; advanced visualization capabilities; and automatic report generation.
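
    Conceptually, play-in validation replays recorded boundary signals through the model and compares the simulated response against the recording. The sketch below scores such a comparison with a normalized RMS error; the metric and the toy signals are illustrative assumptions, not the PPMV acceptance criterion.

        import numpy as np

        def validation_error(measured_mw, simulated_mw):
            """Normalized RMS error between a recorded plant response and the
            model's play-in response (hypothetical acceptance metric)."""
            err = simulated_mw - measured_mw
            return np.sqrt(np.mean(err**2)) / (measured_mw.max() - measured_mw.min())

        # Toy disturbance: measured vs simulated active power after a grid event
        t = np.linspace(0, 10, 500)
        measured = 100 + 5.0 * np.exp(-0.40 * t) * np.sin(2 * np.pi * 0.70 * t)
        simulated = 100 + 4.6 * np.exp(-0.45 * t) * np.sin(2 * np.pi * 0.72 * t)
        print(f"NRMSE = {validation_error(measured, simulated):.3f}")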

  7. Information Power Grid (IPG) Tutorial 2003

    NASA Technical Reports Server (NTRS)

    Meyers, George

    2003-01-01

    For NASA and the general community today, Grid middleware: a) provides tools to access and use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); and c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The middleware provides tools to enable the building of frameworks for applications, and provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways.

  8. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, Yuri; Subbarao, Kris

    RUT software is designed for use by Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. Such a deficiency of balancing resources can pose serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty, including wind, solar and load forecast errors. The tool evaluates the generation required for a worst-case scenario, with a user-specified confidence level.
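
    A stripped-down version of the probabilistic assessment can be phrased as quantiles of the combined forecast-error distribution, as sketched below. The error samples and the simple additive model are assumptions for illustration, not RUT's actual algorithm.

        import numpy as np

        def balancing_requirement(load_err, wind_err, solar_err, confidence=0.95):
            """Required up/down balancing capacity as quantiles of the combined
            net-load forecast error (MW); positive error means more net load
            than forecast."""
            net_err = load_err - wind_err - solar_err
            up = np.quantile(net_err, confidence)        # capacity-deficit risk
            down = np.quantile(net_err, 1 - confidence)  # over-generation risk
            return up, down

        rng = np.random.default_rng(1)
        up, down = balancing_requirement(
            load_err=rng.normal(0, 120, 5000),   # MW, hypothetical error samples
            wind_err=rng.normal(0, 200, 5000),
            solar_err=rng.normal(0, 80, 5000),
        )
        print(f"need +{up:.0f} MW up / {down:.0f} MW down at 95% confidence")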

  9. Panoramic Night Vision Goggle Testing For Diagnosis and Repair

    DTIC Science & Technology

    2000-01-01

    Visual Acuity (Marasco & Task, 1999) measures how well a human observer can see high contrast targets at specified light levels through … a grid through the PNVG in-board and out-board channels simultaneously and comparing the defects to the size of grid features (Marasco & Task, 1999).

  10. Sampling Scattered Data Onto Rectangular Grids for Volume Visualization

    DTIC Science & Technology

    1989-12-01

    Contents excerpts: 4.4 Building A Rectangular Grid; 4.5 Sampling Methods. … Methods for displaying three-dimensional data have been developed recently. In computational fluid flow analysis, methods for constructing three-dimensional numerical grids are … structure of rectangular grids. Because finite element analysis is useful in fields other than fluid flow analysis and the numerical grid has promising …

  11. Comparison of pediatric radiation dose and vessel visibility on angiographic systems using piglets as a surrogate: antiscatter grid removal vs. lower detector air kerma settings with a grid — a preclinical investigation

    PubMed Central

    Racadio, John M.; Abruzzo, Todd A.; Johnson, Neil D.; Patel, Manish N.; Kukreja, Kamlesh U.; den Hartog, Mark J. H.; Hoornaert, Bart P.A.; Nachabe, Rami A.

    2015-01-01

    The purpose of this study was to reduce pediatric doses while maintaining or improving image quality scores without removing the grid from the X-ray beam. This study was approved by the Institutional Animal Care and Use Committee. Three piglets (5, 14, and 20 kg) were imaged using six different selectable detector air kerma (Kair) per frame values (100%, 70%, 50%, 35%, 25%, 17.5%) with and without the grid. The number of distal branches visualized with diagnostic confidence relative to the injected vessel defined the image quality score. Five pediatric interventional radiologists evaluated all images. Image quality score and piglet Kair were statistically compared using analysis of variance and receiver operating characteristic curve analysis to define the preferred dose setting and use of grid for visibility of 2nd and 3rd order vessel branches. Grid removal reduced both dose to subject and image quality by 26%. Third order branches could only be visualized with the grid present; 100% detector Kair was required for the smallest pig, while 70% detector Kair was adequate for the two larger pigs. Second order branches could be visualized with the grid at 17.5% detector Kair for all three pig sizes. Without the grid, 50%, 35%, and 35% detector Kair were required for the smallest to largest pig, respectively. Grid removal reduces both dose and image quality score. Image quality scores can be maintained with less dose to the subject with the grid in the beam as opposed to removed. Smaller anatomy requires more dose to the detector to achieve the same image quality score. PACS numbers: 87.53.Bn, 87.57.N‐, 87.57.cj, 87.59.cf, 87.59.Dj PMID:26699297

  12. Data Fusion and Visualization with the OpenEarth Framework (OEF)

    NASA Astrophysics Data System (ADS)

    Nadeau, D. R.; Baru, C.; Fouch, M. J.; Crosby, C. J.

    2010-12-01

    Data fusion is an increasingly important problem to solve as we strive to integrate data from multiple sources and build better models of the complex processes operating at the Earth’s surface and in its interior. These data are often large, multi-dimensional, and subject to differing conventions for file formats, data structures, coordinate spaces, units of measure, and metadata organization. When visualized, these data require differing, and often conflicting, conventions for visual representations, dimensionality, icons, color schemes, labeling, and interaction. These issues make the visualization of fused Earth science data particularly difficult. The OpenEarth Framework (OEF) is an open-source data fusion and visualization suite of software being developed at the San Diego Supercomputer Center at the University of California, San Diego. Funded by the NSF, the project is leveraging virtual globe technology from NASA’s WorldWind to create interactive 3D visualization tools that combine layered data from a variety of sources to create a holistic view of features at, above, and beneath the Earth’s surface. The OEF architecture is cross-platform, multi-threaded, modular, and based upon Java. The OEF’s modular approach yields a collection of compatible mix-and-match components for assembling custom applications. Available modules support file format handling, web service communications, data management, data filtering, user interaction, and 3D visualization. File parsers handle a variety of formal and de facto standard file formats. Each one imports data into a general-purpose data representation that supports multidimensional grids, topography, points, lines, polygons, images, and more. From there these data may be manipulated, merged, filtered, reprojected, and visualized. Visualization features support conventional and new visualization techniques for looking at topography, tomography, maps, and feature geometry. 3D grid data such as seismic tomography may be sliced by multiple oriented cutting planes and isosurfaced to create 3D skins that trace feature boundaries within the data. Topography may be overlaid with satellite imagery along with data such as gravity and magnetics measurements. Multiple data sets may be visualized simultaneously using overlapping layers and a common 3D+time coordinate space. Data management within the OEF handles and hides the quirks of differing file formats, web protocols, storage structures, coordinate spaces, and metadata representations. Derived data are computed automatically to support interaction and visualization while the original data are left unchanged in their original form. Data are cached for better memory and network efficiency, and all visualization is accelerated by the 3D graphics hardware found on today’s computers. The OpenEarth Framework project is currently prototyping the software for use in the visualization and integration of continental-scale geophysical data being produced by EarthScope-related research in the Western US. The OEF is providing researchers with new ways to display and interrogate their data and is anticipated to be a valuable tool for future EarthScope-related research.

  13. Rapid Airplane Parametric Input Design (RAPID)

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1995-01-01

    RAPID is a methodology and software system to define a class of airplane configurations and directly evaluate surface grids, volume grids, and grid sensitivity on and about the configurations. A distinguishing characteristic which separates RAPID from other airplane surface modellers is that the output grids and grid sensitivity are directly applicable in CFD analysis. A small set of design parameters and grid control parameters govern the process, which is incorporated into interactive software for 'real time' visual analysis and into batch software for the application of optimization technology. The computed surface grids and volume grids are suitable for a wide range of Computational Fluid Dynamics (CFD) simulation. The general airplane configuration has wing, fuselage, horizontal tail, and vertical tail components. The double-delta wing and tail components are generated by solving a fourth order partial differential equation (PDE) subject to Dirichlet and Neumann boundary conditions. The design parameters are incorporated into the boundary conditions and therefore govern the shapes of the surfaces. The PDE solution yields a smooth transition between boundaries. Surface grids suitable for CFD calculation are created by establishing an H-type topology about the configuration and incorporating grid spacing functions in the PDE for the lifting components and the fuselage definition equations. User specified grid parameters govern the location and degree of grid concentration. A two-block volume grid about a configuration is calculated using the Control Point Form (CPF) technique. The interactive software, which runs on Silicon Graphics IRIS workstations, allows design parameters to be continuously varied and the resulting surface grid to be observed in real time. The batch software computes both the surface and volume grids and also computes the sensitivity of the output grid with respect to the input design parameters by applying the precompiler tool ADIFOR to the grid generation program. The output of ADIFOR is a new source code containing the old code plus expressions for derivatives of specified dependent variables (grid coordinates) with respect to specified independent variables (design parameters). The RAPID methodology and software provide a means of rapidly defining numerical prototypes, grids, and grid sensitivity for a class of airplane configurations. This technology and software are highly useful for CFD research, preliminary design, and optimization processes.

  14. Grid Work

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Pointwise, Inc.'s Gridgen software is a system for the generation of 3D (three-dimensional) multiple-block, structured grids. Gridgen is a visually-oriented, graphics-based interactive code used to decompose a 3D domain into blocks, distribute grid points on curves, initialize and refine grid points on surfaces, and initialize volume grid points. Gridgen is available to U.S. citizens and American-owned companies by license.

  15. Fluid Simulation in the Movies: Navier and Stokes Must Be Circulating in Their Graves

    NASA Astrophysics Data System (ADS)

    Tessendorf, Jerry

    2010-11-01

    Fluid simulations based on the Incompressible Navier-Stokes equations are commonplace computer graphics tools in the visual effects industry. These simulations mostly come from custom C++ code written by the visual effects companies. Their significant impact in films was recognized in 2008 with Academy Awards to four visual effects companies for their technical achievement. However artists are not fluid dynamicists, and fluid dynamics simulations are expensive to use in a deadline-driven production environment. As a result, the simulation algorithms are modified to limit the computational resources, adapt them to production workflow, and to respect the client's vision of the film plot. Eulerian solvers on fixed rectangular grids use a mix of momentum solvers, including Semi-Lagrangian, FLIP, and QUICK. Incompressibility is enforced with FFT, Conjugate Gradient, and Multigrid methods. For liquids, a levelset field tracks the free surface. Smooth Particle Hydrodynamics is also used, and is part of a hybrid Eulerian-SPH liquid simulator. Artists use all of them in a mix and match fashion to control the appearance of the simulation. Specially designed forces and boundary conditions control the flow. The simulation can be an input to artistically driven procedural particle simulations that enhance the flow with more detail and drama. Post-simulation processing increases the visual detail beyond the grid resolution. Ultimately, iterative simulation methods that fit naturally in the production workflow are extremely desirable but not yet successful. Results from some efforts for iterative methods are shown, and other approaches motivated by the history of production are proposed.
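
    Of the momentum solvers named above, semi-Lagrangian advection is the easiest to sketch: trace each grid point backwards along the velocity field and interpolate the advected quantity there. A minimal 2-D version follows; the unit grid spacing and the toy smoke field are assumptions for illustration.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def advect(field, u, v, dt):
            """One semi-Lagrangian advection step on a unit-spaced 2-D grid:
            backtrace each cell along (u, v) and bilinearly interpolate.
            Unconditionally stable, which is why production solvers favor it."""
            ny, nx = field.shape
            y, x = np.mgrid[0:ny, 0:nx].astype(float)
            coords = np.array([y - dt * v, x - dt * u])   # departure points
            return map_coordinates(field, coords, order=1, mode="nearest")

        # Toy smoke blob in a uniform diagonal wind
        smoke = np.zeros((64, 64))
        smoke[28:36, 28:36] = 1.0
        u = np.full_like(smoke, 2.0)
        v = np.full_like(smoke, 1.0)
        for _ in range(10):
            smoke = advect(smoke, u, v, dt=0.5)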

  16. Image-based red cell counting for wild animals blood.

    PubMed

    Mauricio, Claudio R M; Schneider, Fabio K; Dos Santos, Leonilda Correia

    2010-01-01

    An image-based red blood cell (RBC) automatic counting system is presented for wild animal blood analysis. Images with 2048×1536-pixel resolution, acquired on an optical microscope using Neubauer chambers, are used to evaluate RBC counting for three animal species (Leopardus pardalis, Cebus apella and Nasua nasua), and the error found using the proposed method is similar to that obtained with the inter-observer visual counting method, i.e., around 10%. Smaller errors (e.g., 3%) can be obtained in regions with fewer grid artifacts. These promising results allow the use of the proposed method either as a fully automatic counting tool in laboratories for wild animal blood analysis or as a first counting stage in a semi-automatic counting tool.
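
    A minimal sketch of the image-based counting idea, using Otsu thresholding and connected components from OpenCV. The input file name and area bounds are hypothetical and would need tuning per species, magnification, and chamber; touching cells would also need extra handling (e.g., watershed splitting), which the paper's method presumably addresses.

        import cv2
        import numpy as np

        def count_rbc(image_path, min_area=80, max_area=2000):
            """Count red blood cells by Otsu thresholding plus connected
            components, keeping only plausibly cell-sized blobs."""
            gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            blur = cv2.GaussianBlur(gray, (5, 5), 0)
            # Cells are darker than background in brightfield Neubauer images
            _, mask = cv2.threshold(blur, 0, 255,
                                    cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
            areas = stats[1:, cv2.CC_STAT_AREA]          # skip background label 0
            return int(np.sum((areas >= min_area) & (areas <= max_area)))

        print(count_rbc("neubauer_field.png"))           # hypothetical input image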

  17. Is Fourier analysis performed by the visual system or by the visual investigator.

    PubMed

    Ochs, A L

    1979-01-01

    A numerical Fourier transform was made of the pincushion grid illusion and the spectral components orthogonal to the illusory lines were isolated. Their inverse transform creates a picture of the illusion. The spatial-frequency response of cortical, simple receptive field neurons similarly filters the grid. A complete set of these neurons thus approximates a two-dimensional Fourier analyzer. One cannot conclude, however, that the brain actually uses frequency-domain information to interpret visual images.
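
    The paper's manipulation, isolating spectral components of a given orientation and inverting the transform, can be sketched directly with a 2-D FFT. The toy grating and the angular window width below are illustrative assumptions.

        import numpy as np

        def keep_orientation(image, angle_deg, half_width_deg=10):
            """Keep only spatial-frequency components whose orientation lies
            within +/- half_width_deg of angle_deg, then invert the transform."""
            F = np.fft.fftshift(np.fft.fft2(image))
            ny, nx = image.shape
            ky, kx = np.mgrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
            theta = np.degrees(np.arctan2(ky, kx)) % 180.0
            # angular distance with 180-degree wraparound
            mask = np.abs((theta - angle_deg + 90) % 180 - 90) <= half_width_deg
            mask[ny // 2, nx // 2] = True                # keep the DC term
            return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

        grid = (np.indices((128, 128)).sum(0) % 16 < 2).astype(float)  # toy oblique grating
        filtered = keep_orientation(grid, angle_deg=45)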

  18. The art and science of hyperbolic tessellations.

    PubMed

    Van Dusen, B; Taylor, R P

    2013-04-01

    The visual impact of hyperbolic tessellations has captured artists' imaginations ever since M.C. Escher generated his Circle Limit series in the 1950s. The scaling properties generated by hyperbolic geometry are different to the fractal scaling properties found in nature's scenery. Consequently, prevalent interpretations of Escher's art emphasize the lack of connection with nature's patterns. However, a recent collaboration between the two authors proposed that Escher's motivation for using hyperbolic geometry was as a method to deliberately distort nature's rules. Inspired by this hypothesis, this year's cover artist, Ben Van Dusen, embeds natural fractals such as trees, clouds and lightning into a hyperbolic scaling grid. The resulting interplay of visual structure at multiple size scales suggests that hybridizations of fractal and hyperbolic geometries provide a rich compositional tool for artists.

  19. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    NASA Technical Reports Server (NTRS)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces. It runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree so that approximate, fast renderings can be created. We have extended our software to handle time-varying data where the data changes but the grid does not, and we are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy. We explored this alternative approach as being more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique is at present limited to regular grids, we are pursuing algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation. We have created new tools for exploring the differences between images produced by various rendering methods. Images created by our software can be stored in the SGI RGB format. Our idtools software reads in a pair of images and compares them using various metrics. The differences between the images using the RGB, HSV, and HSL color models can be calculated and shown. We can also calculate the auto-correlation function and the Fourier transform of the image and image differences. We will explore how these image differences compare in order to find useful metrics for quantifying the success of various visualization approaches. In general, progress was consistent with our research plan for the second year of the grant.
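
    A few of the image-difference metrics described can be sketched for two renderings given as float RGB arrays in [0, 1]. The luminance weights and the FFT-magnitude comparison are common generic choices, assumed here for illustration, and not necessarily what idtools implements.

        import numpy as np

        def image_differences(a, b):
            """Simple difference metrics between two same-shaped RGB renderings."""
            rgb_mae = np.mean(np.abs(a - b))                 # mean absolute RGB error
            rgb_rmse = np.sqrt(np.mean((a - b) ** 2))
            # Spectral difference: compare FFT magnitudes of the luminance channels
            lum = lambda img: img @ np.array([0.299, 0.587, 0.114])
            spec = lambda img: np.abs(np.fft.fft2(lum(img)))
            fft_mae = np.mean(np.abs(spec(a) - spec(b)))
            return {"rgb_mae": rgb_mae, "rgb_rmse": rgb_rmse, "fft_mae": fft_mae}

        rng = np.random.default_rng(2)
        exact = rng.random((64, 64, 3))
        approx = np.clip(exact + 0.02 * rng.standard_normal(exact.shape), 0, 1)
        print(image_differences(exact, approx))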

  20. Modelling effects on grid cells of sensory input during self‐motion

    PubMed Central

    Raudies, Florian; Hinman, James R.

    2016-01-01

    Abstract The neural coding of spatial location for memory function may involve grid cells in the medial entorhinal cortex, but the mechanism of generating the spatial responses of grid cells remains unclear. This review describes some current theories and experimental data concerning the role of sensory input in generating the regular spatial firing patterns of grid cells, and changes in grid cell firing fields with movement of environmental barriers. As described here, the influence of visual features on spatial firing could involve either computations of self‐motion based on optic flow, or computations of absolute position based on the angle and distance of static visual cues. Due to anatomical selectivity of retinotopic processing, the sensory features on the walls of an environment may have a stronger effect on ventral grid cells that have wider spaced firing fields, whereas the sensory features on the ground plane may influence the firing of dorsal grid cells with narrower spacing between firing fields. These sensory influences could contribute to the potential functional role of grid cells in guiding goal‐directed navigation. PMID:27094096

  1. Efficient computation paths for the systematic analysis of sensitivities

    NASA Astrophysics Data System (ADS)

    Greppi, Paolo; Arato, Elisabetta

    2013-01-01

    A systematic sensitivity analysis requires computing the model on all points of a multi-dimensional grid covering the domain of interest, defined by the ranges of variability of the inputs. The issues to efficiently perform such analyses on algebraic models are handling solution failures within and close to the feasible region and minimizing the total iteration count. Scanning the domain in the obvious order is sub-optimal in terms of total iterations and is likely to cause many solution failures. The problem of choosing a better order can be translated geometrically into finding Hamiltonian paths on certain grid graphs. This work proposes two paths, one based on a mixed-radix Gray code and the other, a quasi-spiral path, produced by a novel heuristic algorithm. Some simple, easy-to-visualize examples are presented, followed by performance results for the quasi-spiral algorithm and the practical application of the different paths in a process simulation tool.
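
    The reflected construction of a mixed-radix Gray code, one natural reading of the Gray-code path above, is short enough to sketch: visiting grid points in this order changes exactly one input coordinate by one step per model evaluation. (The quasi-spiral heuristic is separate and not reproduced here.)

        def mixed_radix_gray(radices):
            """Reflected mixed-radix Gray code: enumerate every point of a grid
            with the given per-dimension sizes so that consecutive points
            differ in exactly one coordinate, by +/- 1."""
            if not radices:
                yield ()
                return
            inner = list(mixed_radix_gray(radices[1:]))
            for digit in range(radices[0]):
                # reverse every other block so adjacent blocks join smoothly
                block = inner if digit % 2 == 0 else reversed(inner)
                for rest in block:
                    yield (digit,) + rest

        # A 3 x 2 x 2 parameter grid: 12 model evaluations, each one step apart
        for point in mixed_radix_gray([3, 2, 2]):
            print(point)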

  2. Earth Observations for Early Detection of Agricultural Drought: Contributions of the Famine Early Warning Systems Network (FEWS NET)

    NASA Astrophysics Data System (ADS)

    Budde, M. E.; Funk, C.; Husak, G. J.; Peterson, P.; Rowland, J.; Senay, G. B.; Verdin, J. P.

    2016-12-01

    The U.S. Geological Survey (USGS) has a long history of supporting the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET) program. The use of remote sensing and crop modeling to address food security threats in the form of drought, floods, pests, and changing climatic regimes has been a core activity in monitoring FEWS NET countries. In recent years, it has become a requirement that FEWS NET apply monitoring and modeling frameworks at global scales to assess emerging crises in regions that FEWS NET does not traditionally monitor. USGS FEWS NET, in collaboration with the University of California, Santa Barbara, has developed a number of new global applications of satellite observations, derived products, and efficient tools for visualization and analyses to address these requirements. (1) A 35-year quasi-global (+/- 50 degrees latitude) time series of gridded rainfall estimates, the Climate Hazards Infrared Precipitation with Stations (CHIRPS) dataset, based on infrared satellite imagery and station observations. Data are available as 5-day (pentadal) accumulations at 0.05 degree spatial resolution. (2) Global actual evapotranspiration data based on application of the Simplified Surface Energy Balance (SSEB) model using 10-day MODIS Land Surface Temperature composites at 1-km resolution. (3) Production of global expedited MODIS (eMODIS) 10-day NDVI composites updated every 5 days. (4) Development of an updated Early Warning eXplorer (EWX) tool for data visualization, analysis, and sharing. (5) Creation of stand-alone tools for enhancement of gridded rainfall data and trend analyses. (6) Establishment of an agro-climatology analysis tool and knowledge base for more than 90 countries of interest to FEWS NET. In addition to these new products and tools, FEWS NET has partnered with the GEOGLAM community to develop a Crop Monitor for Early Warning (CM4EW) which brings together global expertise in agricultural monitoring to reach consensus on growing season status of "countries at risk". Such engagements will result in enhanced capabilities for extending our monitoring efforts globally.

  3. CAPRI: Using a Geometric Foundation for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2002-01-01

    CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: surface generation, usually by employing a Computer Aided Design (CAD) system; grid generation, preparing the volume for the simulation; flow solving, producing the results at the specified operational point; and post-processing visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go into IGES files. The output from grid generators and solvers does not really have standards, though there are a couple of file formats that can be used for a subset of the gridding data (e.g., PLOT3D and the upcoming CGNS). The user would have to patch up the data or translate from one format to another to move to the next step; sometimes this could take days. Instead of this serial approach to analysis, CAPRI takes a geometry-centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) a simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.

  4. Tree Cover Mapping Tool—Documentation and user manual

    USGS Publications Warehouse

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2016-06-02

    The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.
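
    Generically, a systematic sample grid is just a regular lattice of sample-unit centers over a bounding box, as sketched below. The extent and spacing are placeholder values, and the actual tool works inside an ArcMap interface rather than as standalone code.

        import numpy as np

        def sample_grid(xmin, ymin, xmax, ymax, spacing):
            """Centers of a systematic sample grid over a bounding box
            (map units), one center per spacing-by-spacing sample unit."""
            xs = np.arange(xmin + spacing / 2, xmax, spacing)
            ys = np.arange(ymin + spacing / 2, ymax, spacing)
            xx, yy = np.meshgrid(xs, ys)
            return np.column_stack([xx.ravel(), yy.ravel()])

        # 500 m sample units over a hypothetical 10 km x 10 km farming landscape
        centers = sample_grid(0, 0, 10_000, 10_000, spacing=500)
        print(len(centers))   # 400 sample units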

  5. GIS-MODFLOW: A Small Open-Source Tool for Linking GIS Data to MODFLOW

    NASA Astrophysics Data System (ADS)

    Gossel, Wolfgang

    2013-06-01

    The numerical model MODFLOW (Harbaugh 2005) is an efficient and up-to-date tool for groundwater flow modelling. Geo-Information Systems (GIS), in turn, provide useful tools for data preparation and visualization that can also be incorporated into numerical groundwater modelling, so an interface between the two is useful for many hydrogeological investigations. To date, several integrated stand-alone tools have been developed that rely on MODFLOW, MODPATH and transport modelling tools. Simultaneously, several open-source GIS codes have been developed with improved functionality and ease of use. These GIS tools can be used as pre- and post-processors for the numerical model MODFLOW via a suitable interface. Here we present GIS-MODFLOW, an open-source tool that provides a new universal interface by using the ESRI ASCII GRID data format, which can be converted into MODFLOW input data; the tool can also treat MODFLOW results. Such a combination of MODFLOW and open-source GIS opens new possibilities for making groundwater flow modelling, and its simulation results, available to a wider circle of hydrogeologists.
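
    The ESRI ASCII GRID format that serves as the interface is simple enough to parse directly, as the sketch below shows. The six header keywords are fixed by the format (NODATA_value is assumed present here), while the input file name is hypothetical.

        import numpy as np

        def read_esri_ascii(path):
            """Read an ESRI ASCII GRID into (header, 2-D array). Rows run
            north to south, which is also how MODFLOW usually numbers them."""
            header = {}
            with open(path) as f:
                for _ in range(6):                 # ncols, nrows, xllcorner,
                    key, value = f.readline().split()  # yllcorner, cellsize,
                    header[key.lower()] = float(value) # NODATA_value
                data = np.loadtxt(f)
            nodata = header.get("nodata_value")
            if nodata is not None:
                data = np.where(data == nodata, np.nan, data)
            return header, data

        header, top = read_esri_ascii("model_top.asc")   # hypothetical input file
        print(int(header["nrows"]), int(header["ncols"]))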

  6. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background: Contemporary informatics and genomics research requires efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results: This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools and interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions: The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  7. OVERGRID: A Unified Overset Grid Generation Graphical Interface

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Akien, Edwin W. (Technical Monitor)

    1999-01-01

    This paper presents a unified graphical interface and gridding strategy for performing overset grid generation. The interface, called OVERGRID, has been specifically designed to follow an efficient overset gridding strategy, and contains general grid manipulation capabilities as well as modules that are specifically suited for overset grids. General grid utilities include functions for grid redistribution, smoothing, concatenation, extraction, extrapolation, projection, and many others. Modules specially tailored for overset grids include a seam curve extractor, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, and a Cartesian box grid generator. Grid visualization is achieved using OpenGL, while widgets are constructed with Tcl/Tk. The software is portable between various platforms, from UNIX workstations to personal computers.

  8. Demonstrating the Superiority of the FCB Grid as a Tool for Students To Write Effective Advertising Strategy.

    ERIC Educational Resources Information Center

    Yssel, Johan C.

    Although the FCB (Foote, Cone, & Belding) grid was never intended to serve as an educational tool, it can be applied successfully in advertising classes to address the three areas that S. E. Moriarty considers to be the minimum for writing strategy. To demonstrate the superiority of the FCB grid as a pedagogical tool, a study analyzed…

  9. Geophysical data analysis and visualization using the Grid Analysis and Display System

    NASA Technical Reports Server (NTRS)

    Doty, Brian E.; Kinter, James L., III

    1995-01-01

    Several problems posed by the rapidly growing volume of geophysical data are described, and a selected set of existing solutions to these problems is outlined. A recently developed desktop software tool called the Grid Analysis and Display System (GrADS) is presented. The GrADS user interface is a natural extension of the standard procedures scientists apply to their geophysical data analysis problems. The basic GrADS operations have defaults that naturally map to data analysis actions, and there is a programmable interface for customizing data access and manipulation. The fundamental concept of the GrADS dimension environment, which defines both the space in which the geophysical data reside and the 'slice' of data being analyzed at a given time, is expressed. The GrADS data storage and access model is described. An argument is made in favor of describable data formats rather than standard data formats. The manner in which GrADS users may perform operations on their data and display the results is also described. It is argued that two-dimensional graphics provides a powerful quantitative data analysis tool whose value is underestimated in the current development environment, which emphasizes three-dimensional structure modeling.

  10. PNNL Data-Intensive Computing for a Smarter Energy Grid

    ScienceCinema

    Carol Imhoff; Zhenyu (Henry) Huang; Daniel Chavarria

    2017-12-09

    The Middleware for Data-Intensive Computing (MeDICi) Integration Framework, an integrated platform to solve data analysis and processing needs, supports PNNL research on the U.S. electric power grid. MeDICi is enabling the development of visualizations of grid operations and vulnerabilities, with the goal of near real-time analysis to aid operators in preventing and mitigating grid failures.

  11. A Debugger for Computational Grid Applications

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation gives an overview of a debugger for computational grid applications. Details are given on NAS parallel tools groups (including parallelization support tools, evaluation of various parallelization strategies, and distributed and aggregated computing), debugger dependencies, scalability, initial implementation, the process grid, and information on Globus.

  12. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate finite element models of grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it would also benefit the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  13. Integration and validation of a data grid software

    NASA Astrophysics Data System (ADS)

    Carenton-Madiec, Nicolas; Berger, Katharina; Cofino, Antonio

    2014-05-01

    The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) is a software infrastructure for the management, dissemination, and analysis of model output and observational data. The ESGF grid is composed of several types of nodes with different roles. About 40 data nodes host model outputs and datasets using THREDDS catalogs. About 25 compute nodes offer remote visualization and analysis tools. About 15 index nodes crawl data node catalogs and implement faceted and federated search in a web interface. About 15 identity provider nodes manage accounts, authentication and authorization. Here we present a full-scale test federation spread across different institutes in different countries, together with a Python test suite, both started in December 2013. The first objective of the test suite is to provide a simple tool that helps to test and validate a single data node and its closest index, compute and identity provider peers. The next objective will be to run this test suite on every data node of the federation and thereby test and validate every single node of the whole federation. The suite already uses the nosetests, requests, myproxy-logon, subprocess, selenium and fabric Python libraries in order to test web front ends, back ends and security services. The goal of this project is to improve the quality of deliverables in the context of a small team of developers who are widely spread around the world and work collaboratively and without hierarchy. This kind of working organization highlighted the need for a federated integration test and validation process.
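
    As an illustration of what such a suite checks, here is a minimal smoke test in the style described (nosetests-compatible functions using requests). The host names are placeholders, and the exact endpoints a given deployment exposes should be verified against its configuration; this is a sketch, not the project's actual test code.

```python
import requests

# Placeholder hosts; point these at the actual federation nodes under test.
DATA_NODE = "https://data-node.example.org"
INDEX_NODE = "https://index-node.example.org"

def test_thredds_catalog_is_served():
    """A data node should serve its top-level THREDDS catalog."""
    r = requests.get(DATA_NODE + "/thredds/catalog.xml", timeout=30)
    assert r.status_code == 200
    assert "<catalog" in r.text

def test_index_search_responds():
    """An index node's search API should answer a trivial query."""
    r = requests.get(INDEX_NODE + "/esg-search/search",
                     params={"limit": 1, "format": "application/solr+json"},
                     timeout=30)
    assert r.status_code == 200
    assert r.json()["response"]["numFound"] >= 0
```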

  14. Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems

    NASA Astrophysics Data System (ADS)

    Dogan, Firat; Atilgan, Yasemin

    Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capacities at the information technology level. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization and all the other strong aspects of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade are now serving individual needs inside "tiny smart boxes" for reasonable prices. However, resistance to learning new computerized environments, insufficient training and other old habits prevent effective utilization of IT resources by the specialists of the health sector. In this paper, all aspects of former and current developments in surgery planning and simulation tools are presented, and future directions and expectations are investigated for better electronic health care systems.

  15. Personal Visual Aids for Aircrew.

    DTIC Science & Technology

    1981-06-01

    [OCR fragment from a scanned report; only partial content is recoverable.] The text discusses retinal conditions (oedema and areas of pigmentation or depigmentation, such as central serous retinopathy or focal choroiditis of differing aetiology), heat-induced lenticular damage, and Amsler grids illustrating visual defects. Fig 11: a. Normal grid pattern; b. Pincushion distortion; c. Astigmatic distortion; d. Paracentral scotoma. French index terms (translated): ophthalmology, eye disease, astigmatism, cornea, prosthesis, pilots, vision, spectacles; item 46: On flying and the visual correction of presbyopes.

  16. Response Grids: Practical Ways to Display Large Data Sets with High Visual Impact

    ERIC Educational Resources Information Center

    Gates, Simon

    2013-01-01

    Spreadsheets are useful for large data sets but they may be too wide or too long to print as conventional tables. Response grids offer solutions to the challenges posed by any large data set. They have wide application throughout science and for every subject and context where visual data displays are designed, within education and elsewhere.…

  17. GIS characterization of spatially distributed lifeline damage

    USGS Publications Warehouse

    Toprak, Selcuk; O'Rourke, Thomas; Tutuncu, Ilker

    1999-01-01

    This paper describes the visualization of spatially distributed water pipeline damage following an earthquake using geographical information systems (GIS). Pipeline damage is expressed as a repair rate (RR). Repair rate contours are developed with GIS by dividing the study area into grid cells (n × n), determining the number of pipeline repairs in each grid cell, and dividing the number of repairs by the length of pipeline in each cell area. The resulting contour plot is a two-dimensional visualization of point-source damage. High damage zones are defined herein as areas with an RR value greater than the mean RR for the entire study area of interest. A hyperbolic relationship between the visual display of high pipeline damage zones and grid size, n, was developed. The relationship is expressed in terms of two dimensionless parameters, threshold area coverage (TAC) and dimensionless grid size (DGS). The relationship is valid over a wide range of map scales, spanning approximately 1,200 km2 for the largest portion of the Los Angeles water distribution system to 1 km2 for the Marina in San Francisco. This relationship can help GIS users produce sufficiently refined, but easily visualized, maps of damage patterns.
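
    The repair-rate computation described above is straightforward to sketch. A hypothetical numpy version (function name and inputs are illustrative, assuming repair locations and per-cell pipeline lengths are already available) might look like:

```python
import numpy as np

def repair_rate_grid(repair_xy, pipe_length_per_cell, x_edges, y_edges):
    """Repair rate (repairs per unit pipeline length) in each grid cell.

    repair_xy            : (N, 2) array of repair coordinates
    pipe_length_per_cell : 2-D array of pipeline length (e.g., km) per cell,
                           shape (len(x_edges) - 1, len(y_edges) - 1)
    x_edges, y_edges     : cell edges defining the n x n grid
    """
    repairs, _, _ = np.histogram2d(repair_xy[:, 0], repair_xy[:, 1],
                                   bins=[x_edges, y_edges])
    with np.errstate(divide="ignore", invalid="ignore"):
        rr = np.where(pipe_length_per_cell > 0,
                      repairs / pipe_length_per_cell, np.nan)
    # High-damage zones: cells whose RR exceeds the study-area mean RR.
    high_damage = rr > np.nanmean(rr)
    return rr, high_damage
```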

  18. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three-dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video formats (transmitting pixel information), such as video conferencing or MPEG (Moving Picture Experts Group) movies on the Internet. The concept behind this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over video formats are: (1) The visualization is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video formats transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. (5) The scenes can be viewed in 3D using stereo vision. (6) The network bandwidth for the visualization using this new technology is much smaller than when using video formats. (The measured peak bandwidth was 1 Kbit/sec, whereas the measured bandwidth for a small video picture was 500 Kbits/sec.) This talk will illustrate the use of these new technologies and present a proposal for using them to improve science education.

  19. SigWin-detector: a Grid-enabled workflow for discovering enriched windows of genomic features related to DNA sequences.

    PubMed

    Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M

    2008-08-08

    Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
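
    The core windowing step can be illustrated with a deliberately naive moving-median sketch in Python; the paper's dedicated algorithm is substantially faster, so this is only a reference implementation of the idea, with invented example values.

```python
from statistics import median

def moving_medians(values, window):
    """Median of each full sliding window over `values`.

    Naive O(n * w log w) version for clarity; SigWin-detector
    uses a faster dedicated moving-median algorithm.
    """
    return [median(values[i:i + window])
            for i in range(len(values) - window + 1)]

# Windows whose median exceeds a significance threshold would be
# reported as enriched (RIDGE-like) regions.
expression = [1, 2, 9, 8, 7, 2, 1, 1, 8, 9]
print(moving_medians(expression, 3))   # [2, 8, 8, 7, 2, 1, 1, 8]
```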

  20. Power Grid Maintenance Scheduling Intelligence Arrangement Supporting System Based on Power Flow Forecasting

    NASA Astrophysics Data System (ADS)

    Xie, Chang; Wen, Jing; Liu, Wenying; Wang, Jiaming

    With the development of intelligent dispatching, the intelligence level of network control center services urgently needs to be raised. Because maintenance scheduling is an important part of the daily work of a network control center, applying intelligent maintenance scheduling arrangement to achieve high-quality, safe operation of the power grid is very important. By analyzing the shortcomings of traditional maintenance scheduling software, this paper designs a power grid maintenance scheduling intelligence arrangement supporting system based on power flow forecasting, which uses advanced maintenance scheduling technologies such as artificial intelligence, online security checking, and intelligent visualization. It implements online security checking of maintenance schedules based on power flow forecasting, and power flow adjustment based on visualization, in order to make maintenance scheduling arrangements more intelligent and visual.

  1. PyPathway: Python Package for Biological Network Analysis and Visualization.

    PubMed

    Xu, Yang; Luo, Xiao-Chun

    2018-05-01

    Life science studies represent one of the biggest generators of large data sets, mainly because of rapid sequencing technological advances. Biological networks, including interaction networks and human-curated pathways, are essential to understanding these high-throughput data sets. Biological network analysis offers a method to explore systematically not only the molecular complexity of a particular disease but also the molecular relationships among apparently distinct phenotypes. Currently, several packages have been developed for the Python community, such as BioPython and Goatools. However, tools to perform comprehensive network analysis and visualization are still needed. Here, we have developed PyPathway, an extensible, free, and open-source Python package for functional enrichment analysis, network modeling, and network visualization. The network process module supports various interaction network and pathway databases such as Reactome, WikiPathways, STRING, and BioGRID. The network analysis module implements overrepresentation analysis, gene set enrichment analysis, network-based enrichment, and de novo network modeling. Finally, the visualization and data publishing modules enable users to share their analyses by using an easy web application. For package availability, see the first Reference.
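
    The overrepresentation analysis that such enrichment modules implement is, at its core, a hypergeometric test. A generic sketch with scipy (this is the underlying statistic, not PyPathway's actual API; names and example numbers are invented):

```python
from scipy.stats import hypergeom

def overrepresentation_p(study_hits, study_size, pop_hits, pop_size):
    """P-value that an annotation is overrepresented in a study gene list.

    Survival function of the hypergeometric distribution: probability of
    drawing at least `study_hits` annotated genes in `study_size` draws
    from `pop_size` genes, of which `pop_hits` carry the annotation.
    """
    return hypergeom.sf(study_hits - 1, pop_size, pop_hits, study_size)

# e.g. 12 of 200 study genes carry an annotation held by 300 of 20000 genes
print(overrepresentation_p(12, 200, 300, 20000))
```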

  2. A package for 3-D unstructured grid generation, finite-element flow solution and flow field visualization

    NASA Technical Reports Server (NTRS)

    Parikh, Paresh; Pirzadeh, Shahyar; Loehner, Rainald

    1990-01-01

    A set of computer programs for 3-D unstructured grid generation, fluid flow calculations, and flow field visualization was developed. The grid generation program, called VGRID3D, generates grids over complex configurations using the advancing front method; in this method, point and element generation is accomplished simultaneously. VPLOT3D is an interactive, menu-driven pre- and post-processor graphics program for interpolation and display of unstructured grid data. The flow solver, VFLOW3D, is an Euler equation solver based on an explicit, two-step, Taylor-Galerkin algorithm which uses the Flux Corrected Transport (FCT) concept for a wiggle-free solution. Using these programs, increasingly complex 3-D configurations of interest to the aerospace community were gridded, including a complete Space Transportation System comprised of the Space Shuttle orbiter, the solid rocket boosters, and the external tank. Flow solutions were obtained for various configurations in the subsonic, transonic, and supersonic flow regimes.

  3. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.
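
    For orientation, the object such diagnostic iterations instrument is the classical two-grid cycle: pre-smooth, solve the residual equation exactly on a coarser grid, correct, post-smooth. The sketch below is a generic 1-D Poisson two-grid cycle in Python for illustration, not the paper's Idealized Coarse Grid or Idealized Relaxation tools themselves.

```python
import numpy as np

def weighted_jacobi(u, f, h, sweeps, w=2/3):
    """Damped Jacobi relaxation for -u'' = f with zero Dirichlet BCs."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def two_grid(u, f, h, nu=3):
    """One two-grid cycle: pre-smooth, exact coarse correction, post-smooth."""
    u = weighted_jacobi(u, f, h, nu)
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)  # residual
    rc = r[::2].copy()            # restrict by injection (grid count must halve)
    hc, nc = 2 * h, rc.size
    # Exact solve of the coarse residual equation A_c e_c = r_c.
    A = (np.diag(2 * np.ones(nc - 2))
         - np.diag(np.ones(nc - 3), 1)
         - np.diag(np.ones(nc - 3), -1)) / (hc * hc)
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    e = np.zeros_like(u)          # prolong by linear interpolation
    e[::2] = ec
    e[1:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    return weighted_jacobi(u + e, f, h, nu)

# Usage: -u'' = pi^2 sin(pi x) on [0, 1], exact solution u = sin(pi x).
n = 64
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi ** 2 * np.sin(np.pi * x)
u = np.zeros_like(x)
for _ in range(10):
    u = two_grid(u, f, x[1] - x[0])
print(np.abs(u - np.sin(np.pi * x)).max())   # down to discretization error
```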

  4. State of the Oceans: A Satellite Data Processing System for Visualizing Near Real-Time Imagery on Google Earth

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Bingham, A. W.; Hall, J. R.; Alarcon, C.; Plesea, L.; Henderson, M. L.; Levoe, S.

    2011-12-01

    The State of the Oceans (SOTO) web tool was developed at NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory (JPL) as an interactive means for users to visually explore and assess ocean-based geophysical parameters extracted from the latest archived data products. The SOTO system consists of four extensible modules: a data polling tool, a preparation and imaging package, image server software, and the graphical user interface. Together, these components support multi-resolution visualization of swath (Level 2) and gridded (Level 3/4) data products as either raster- or vector-based KML layers on Google Earth. These layers are automatically updated periodically throughout the day. Current parameters available include sea surface temperature, chlorophyll concentration, ocean winds, sea surface height anomaly, and sea surface temperature anomaly. SOTO also supports mash-ups, allowing KML feeds from other sources, such as hurricane tracks and buoy data, to be overlaid directly onto Google Earth. A version of the SOTO software has also been installed at Goddard Space Flight Center (GSFC) to support the Land Atmosphere Near real-time Capability for EOS (LANCE). The State of the Earth (SOTE) tool has similar functionality to SOTO but supports different data sets, among them the MODIS 250m data product.

  5. Weather from 250 Miles Up: Visualizing Precipitation Satellite Data (and Other Weather Applications) Using CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matt

    2017-01-01

    Geospatial weather visualization remains a predominantly two-dimensional endeavor. Even popular advanced tools like the Nullschool Earth display 2-dimensional fields on a 3-dimensional globe. Yet much of the observational data and model output contains detailed three-dimensional fields. In 2014, NASA and JAXA (the Japan Aerospace Exploration Agency) launched the Global Precipitation Measurement (GPM) satellite. Its two instruments, the Dual-frequency Precipitation Radar (DPR) and the GPM Microwave Imager (GMI), observe much of the Earth's atmosphere between 65 degrees North latitude and 65 degrees South latitude. As part of the analysis and visualization tools developed by the Precipitation Processing System (PPS) Group at NASA Goddard, a series of CesiumJS-based globe viewers [using the Cesium Markup Language (CZML), JavaScript (JS) and JavaScript Object Notation (JSON)] have been developed to improve data acquisition decision making and to enhance scientific investigation of the satellite data. Other demos have also been built to illustrate the capabilities of CesiumJS in presenting atmospheric data, including model forecasts of hurricanes, observed surface radar data, and gridded analyses of global precipitation. This talk will present these websites and the various workflows used to convert binary satellite and model data into a form easily integrated with CesiumJS.
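
    CZML itself is a JSON document format: an array of packets, the first of which identifies the document. A minimal hand-built example in Python is shown below; the coordinates and identifiers are invented for illustration, not taken from PPS products.

```python
import json

# A minimal CZML document: a time-tagged ground track rendered as a
# moving point on the Cesium globe.
czml = [
    {"id": "document", "name": "GPM overpass (illustrative)", "version": "1.0"},
    {
        "id": "gpm-core",
        "availability": "2017-01-01T00:00:00Z/2017-01-01T00:02:00Z",
        "position": {
            "epoch": "2017-01-01T00:00:00Z",
            # samples as [seconds, longitude, latitude, height(m)]
            "cartographicDegrees": [0, -80.0, 10.0, 407000,
                                    60, -78.5, 13.0, 407000,
                                    120, -77.0, 16.0, 407000],
        },
        "point": {"pixelSize": 10},
    },
]
with open("overpass.czml", "w") as f:
    json.dump(czml, f, indent=2)
```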

  6. Guasom Analysis Of The Alhambra Survey

    NASA Astrophysics Data System (ADS)

    Garabato, Daniel; Manteiga, Minia; Dafonte, Carlos; Álvarez, Marco A.

    2017-10-01

    GUASOM is a data mining tool designed for knowledge discovery in large astronomical spectrophotometric archives, developed in the framework of the Gaia DPAC (Data Processing and Analysis Consortium). Our tool is based on a type of unsupervised artificial neural network called the self-organizing map (SOM). SOMs permit the grouping and visualization of large amounts of data for which there is no a priori knowledge, and hence they are very useful for analyzing the huge amount of information present in modern spectrophotometric surveys. SOMs are used to organize the information into clusters of objects, as homogeneous as possible according to their spectral energy distributions, and to project them onto a 2D grid where the data structure can be visualized. Each cluster has a representative, called a prototype, which is a virtual pattern that best represents or resembles the set of input patterns belonging to that cluster. Prototypes make it easier to determine the physical nature and properties of the objects populating each cluster. Our algorithm has been tested on the ALHAMBRA survey spectrophotometric observations; here we present our results concerning survey segmentation, visualization of the data structure, separation between types of objects (stars and galaxies), data homogeneity of neurons, cluster prototypes, redshift distribution, and crossmatching with other databases (Simbad).
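
    The underlying SOM update rule is compact enough to sketch. The following is a minimal generic numpy training loop (rectangular map, Gaussian neighbourhood, exponential decay); GUASOM's own implementation surely differs in details, so treat names and parameters here as assumptions.

```python
import numpy as np

def train_som(data, rows, cols, iters=10_000, lr0=0.5, sigma0=None, seed=0):
    """Train a minimal rectangular SOM; returns (rows*cols, dim) prototypes.

    Classic online SOM: find the best-matching unit (BMU) for a random
    sample, then pull the BMU and its map-space neighbours toward that
    sample with decaying learning rate and neighbourhood radius.
    """
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((rows * cols, dim))
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    sigma0 = sigma0 or max(rows, cols) / 2
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)   # map-space distances
        h = np.exp(-d2 / (2 * sigma ** 2))           # neighbourhood kernel
        weights += lr * h[:, None] * (x - weights)
    return weights
```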

  7. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  8. TetrUSS Capabilities for S and C Applications

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Parikh, Paresh

    2004-01-01

    TetrUSS is a suite of loosely coupled computational fluid dynamics software that is packaged into a complete flow analysis system. The system components consist of tools for geometry setup, grid generation, flow solution, visualization, and various utilities. Development began in 1990, and it has evolved into a proven and stable system for Euler and Navier-Stokes analysis and design of unconventional configurations. It (1) is well developed and validated, (2) has a broad base of support, and (3) is presently a workhorse code because of the level of confidence that has been established through wide use. The entire system can now run on Linux or Mac architectures. In the following slides, I will highlight more of the features of the VGRID and USM3D codes.

  9. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general-purpose framework for building archival data services, real-time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real-time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  10. Recombination imaging of III-V solar cells

    NASA Technical Reports Server (NTRS)

    Virshup, G. F.

    1987-01-01

    An imaging technique based on the radiative recombination of minority carriers in forward-biased solar cells has been developed for characterization of III-V solar cells. When used in mapping whole wafers, it has helped identify three independent loss mechanisms (broken grid lines, shorting defects, and direct-to-indirect bandgap transitions), all of which resulted in lower efficiencies. The imaging has also led to improvements in processing techniques to reduce the occurrence of broken gridlines as well as surface defects. The ability to visualize current mechanisms in solar cells is an intuitive tool which is powerful in its simplicity.

  11. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  12. The 2002 NASA Faculty Fellowship Program Research Reports

    NASA Technical Reports Server (NTRS)

    Bland, J. (Compiler)

    2003-01-01

    Contents include the following: System Identification of X-33 Neural Network. Advanced Ceramic Technology for Space Applications at NASA MSFC. Developing a MATLAB-Based Tool for Visualization and Transformation. Subsurface Stress Fields in Single Crystal (Anisotropic) Contacts. Our Space Future: A Challenge to the Conceptual Artist - Concept Art for Presentation and Education. Identification and Characterization of Extremophile Microorganisms Significant to Astrobiology. Mathematical Investigation of Gamma Ray and Neutron Absorption Grid Patterns for Homeland Defense-Related Fourier Imaging Systems. The Potential of Microwave Radiation for Processing Martian Soil. Fuzzy Logic Trajectory Design and Guidance for Terminal Area.

  13. An Excel-based tool for evaluating and visualizing geothermobarometry data

    NASA Astrophysics Data System (ADS)

    Hora, John Milan; Kronz, Andreas; Möller-McNett, Stefan; Wörner, Gerhard

    2013-07-01

    Application of geothermobarometry based on equilibrium exchange of chemical components between two mineral phases in natural samples frequently leads to the dilemma of either: (1) relying on relatively few measurements where there is a high likelihood of equilibrium, or (2) using many analysis pairs, where a significant proportion may not be useful and must be filtered out. The second approach leads to the challenges of (1) evaluation of equilibrium for large numbers of analysis pairs, (2) finding patterns in the dataset where multiple populations exist, and (3) visualizing relationships between calculated temperatures and compositional and textural parameters. Given the limitations of currently-used thermobarometry spreadsheets, we redesign them in a way that eliminates tedium by automating data importing, quality control and calculations, while making all results visible in a single view. Rather than using a traditional spreadsheet layout, we array the calculations in a grid. Each color-coded grid node contains the calculated temperature result corresponding to the intersection of two analyses given in the corresponding column and row. We provide Microsoft Excel templates for some commonly-used thermometers, that can be modified for use with any geothermometer or geobarometer involving two phases. Conditional formatting and ability to sort according to any chosen parameter simplifies pattern recognition, while tests for equilibrium can be incorporated into grid calculations. A case study of rhyodacite domes at Parinacota volcano, Chile, indicates a single population of Fe-Ti oxide temperatures, despite Mg-Mn compositional variability. Crystal zoning and differing thermal histories are, however, evident as a bimodal population of plagioclase-amphibole temperatures. Our approach aids in identification of suspect analyses and xenocrysts and visualization of links between temperature and phase composition. This facilitates interpretation of whether heat transfer was accompanied by bulk mass transfer, and to what degree diffusion has homogenized calculated temperature results in hybrid magmas.
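
    The grid-of-pairs layout translates naturally out of the spreadsheet. Here is a sketch of the same idea in Python, where `temperature_grid` and `thermometer` are hypothetical names and the latter stands in for whichever published two-phase calibration is being applied:

```python
import numpy as np
import pandas as pd

def temperature_grid(phase_a, phase_b, thermometer):
    """Grid node (i, j) holds the temperature computed from phase-A
    analysis i paired with phase-B analysis j, mimicking the template's
    color-coded grid layout.

    phase_a, phase_b : DataFrames of mineral analyses (one row each)
    thermometer      : callable returning a temperature for one pair;
                       equilibrium tests could be applied per node too
    """
    grid = np.full((len(phase_a), len(phase_b)), np.nan)
    for i, (_, a) in enumerate(phase_a.iterrows()):
        for j, (_, b) in enumerate(phase_b.iterrows()):
            grid[i, j] = thermometer(a, b)
    return pd.DataFrame(grid, index=phase_a.index, columns=phase_b.index)
```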

  14. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.

  15. Large-scale ground motion simulation using GPGPU

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.

    2012-12-01

    Huge computational resources are required to perform large-scale ground motion simulations using the 3-D finite difference method (FDM) for realistic and complex models with high accuracy. Furthermore, thousands of different simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumptions of the source models for future earthquakes. To overcome the problem of restricted computational resources, we introduced the use of GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation that has traditionally been conducted by the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented the functions for GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (a parameter generation tool) and postprocessor tools (a filter tool, a visualization tool, and so on). The computational model is decomposed in the two horizontal directions, and each decomposed model is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME2.0, one of Japan's fastest supercomputers, operated by the Tokyo Institute of Technology. First, we performed a strong scaling test using a model with about 22 million grid points and achieved speed-ups of 3.2 and 7.3 times using 4 and 16 GPUs, respectively. Next, we examined a weak scaling test where the model sizes (numbers of grid points) were increased in proportion to the degree of parallelism (number of GPUs). The result showed almost perfect linearity up to a simulation with 22 billion grid points using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number of cores. Finally, we applied GPU calculation to the simulation of the 2011 Tohoku-oki earthquake. The model was constructed using a slip model from inversion of strong motion data (Suzuki et al., 2012) and a geological- and geophysical-based velocity structure model comprising the entire Tohoku and Kanto regions as well as the large source area, consisting of about 1.9 billion grid points. The overall characteristics of observed velocity seismograms for periods longer than 8 s were successfully reproduced (Maeda et al., 2012 AGU meeting). The turnaround time for the 50 thousand-step calculation (corresponding to 416 s of seismograms) using 100 GPUs was 52 minutes, which is fairly short, especially considering that this is the performance for a realistic and complex model.
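
    From the reported strong-scaling figures one can read off the parallel efficiency directly; this quick check uses only the numbers quoted above.

```python
# Parallel efficiency implied by the reported strong-scaling speed-ups
# (efficiency = speedup / number of GPUs, relative to a 1-GPU run).
for gpus, speedup in [(4, 3.2), (16, 7.3)]:
    print(f"{gpus:3d} GPUs: speedup {speedup:4.1f}x, "
          f"efficiency {speedup / gpus:.0%}")
# ->   4 GPUs: speedup  3.2x, efficiency 80%
# ->  16 GPUs: speedup  7.3x, efficiency 46%
```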

  16. AF-GEOSpace Version 2.0: Space Environment Software Products for 2002

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Tautz, M.

    2002-05-01

    AF-GEOSpace Version 2.0 (release 2002 on WindowsNT/2000/XP) is a graphics-intensive software program developed by AFRL with space environment models and applications. It has grown steadily to become a development tool for automated space weather visualization products and helps with a variety of tasks: orbit specification for radiation hazard avoidance; satellite design assessment and post-event analysis; solar disturbance effects forecasting; frequency and antenna management for radar and HF communications; determination of link outage regions for active ionospheric conditions; and physics research and education. The object-oriented C++ code is divided into five module classes. Science Modules control science models to give output data on user-specified grids. Application Modules manipulate these data and provide orbit generation and magnetic field line tracing capabilities. Data Modules read and assist with the analysis of user-generated data sets. Graphics Modules enable the display of features such as plane slices, magnetic field lines, line plots, axes, the Earth, stars, and satellites. Worksheet Modules provide commonly requested coordinate transformations and calendar conversion tools. Common input data archive sets, application modules, and 1-, 2-, and 3-D visualization tools are provided to all models. The code documentation includes detailed examples with click-by-click instructions for investigating phenomena that have well-known effects on communications and spacecraft systems. AF-GEOSpace Version 2.0 builds on the success of its predecessors. The first release (Version 1.21, 1996/IRIX on SGI) contained radiation belt particle flux and dose models derived from CRRES satellite data, an aurora model, an ionosphere model, and ionospheric HF ray tracing capabilities. Next (Version 1.4, 1999/IRIX on SGI), science modules were added related to cosmic rays and solar protons, low-Earth orbit radiation dosages, single event effects probability maps, ionospheric scintillation, and shock propagation models. New application modules for estimating linear energy transfer (LET) and single event upset (SEU) rates in solid-state devices, and graphic modules for visualizing radar fans, communication domes, and satellite detector cones and links were added. Automated FTP scripts permitted users to update their global input parameter set directly from NOAA/SEC. What's New? Version 2.0 includes the first true dynamic run capabilities and offers new and enhanced graphical and data visualization tools such as 3-D volume rendering and eclipse umbra and penumbra determination. Animations of all model results can now be displayed together in all dimensions. There is a new realistic day-to-day ionospheric scintillation simulation generator (IONSCINT), an upgrade to the WBMOD scintillation code, a simplified HF ionospheric ray tracing module, and applications built on the NASA AE-8 and AP-8 radiation belt models. User-generated satellite data sets can now be visualized along with their orbital ephemeris. A prototype tool for visualizing MHD model results stored in structured grids provides a hint of where future space weather model development efforts are headed. A new graphical user interface (GUI) with improved module tracking and renaming features greatly simplifies software operation. AF-GEOSpace is distributed by the Space Weather Center of Excellence in the Space Vehicles Directorate of AFRL. The software has recently been released for WindowsNT/2000/XP; versions for the UNIX and LINUX operating systems will follow shortly. To obtain AF-GEOSpace Version 2.0, please send an e-mail request to the first author.

  17. Visual Analytics for Power Grid Contingency Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Huang, Zhenyu; Chen, Yousu

    2014-01-20

    Contingency analysis is the process of employing different measures to model scenarios, analyze them, and then derive the best response to remove the threats. This application paper focuses on a class of contingency analysis problems found in the power grid management system. A power grid is a geographically distributed interconnected transmission network that transmits and delivers electricity from generators to end users. The power grid contingency analysis problem is increasingly important because of both the growing size of the underlying raw data that need to be analyzed and the urgency to deliver working solutions in an aggressive timeframe. Failure to do so may bring significant financial, economic, and security impacts to all parties involved and society at large. The paper presents a scalable visual analytics pipeline that transforms about 100 million contingency scenarios to a manageable size and form for grid operators to examine different scenarios and come up with preventive or mitigation strategies to address the problems in a predictive and timely manner. Great attention is given to the computational scalability, information scalability, visual scalability, and display scalability issues surrounding the data analytics pipeline. Most of the large-scale computation requirements of our work are conducted on a Cray XMT multi-threaded parallel computer. The paper demonstrates a number of examples using western North American power grid models and data.

  18. FUN3D Grid Refinement and Adaptation Studies for the Ares Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Vasta, Veer; Carlson, Jan-Renee; Park, Mike; Mineck, Raymond E.

    2010-01-01

    This paper presents grid refinement and adaptation studies performed in conjunction with computational aeroelastic analyses of the Ares crew launch vehicle (CLV). The unstructured grids used in this analysis were created with GridTool and VGRID while the adaptation was performed using the Computational Fluid Dynamic (CFD) code FUN3D with a feature based adaptation software tool. GridTool was developed by ViGYAN, Inc. while the last three software suites were developed by NASA Langley Research Center. The feature based adaptation software used here operates by aligning control volumes with shock and Mach line structures and by refining/de-refining where necessary. It does not redistribute node points on the surface. This paper assesses the sensitivity of the complex flow field about a launch vehicle to grid refinement. It also assesses the potential of feature based grid adaptation to improve the accuracy of CFD analysis for a complex launch vehicle configuration. The feature based adaptation shows the potential to improve the resolution of shocks and shear layers. Further development of the capability to adapt the boundary layer and surface grids of a tetrahedral grid is required for significant improvements in modeling the flow field.

  19. ESIF 2016: Modernizing Our Grid and Energy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Becelaere, Kimberly

    This 2016 annual report highlights work conducted at the Energy Systems Integration Facility (ESIF) in FY 2016, including grid modernization, high-performance computing and visualization, and INTEGRATE projects.

  20. Daymet: Daily Surface Weather Data on a 1-km Grid for North America, Version 2.

    NASA Astrophysics Data System (ADS)

    Devarakonda, R.

    2014-12-01

    Daymet: Daily Surface Weather Data and Climatological Summaries provides gridded estimates of daily weather parameters for North America, including daily continuous surfaces of minimum and maximum temperature, precipitation occurrence and amount, humidity, shortwave radiation, snow water equivalent, and day length. The current data product (Version 2) covers the period January 1, 1980 to December 31, 2013 [1]. Data are available on a daily time step at a 1-km x 1-km spatial resolution in the Lambert Conformal Conic projection, with a spatial extent that covers North America as meteorological station density allows. Daymet data can be downloaded from 1) the ORNL Distributed Active Archive Center (DAAC) search and order tools (http://daac.ornl.gov/cgi-bin/cart/add2cart.pl?add=1219) or directly from the DAAC FTP site (http://daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1219) and 2) the Single Pixel Tool (http://daymet.ornl.gov/singlepixel.html) and the THREDDS (Thematic Real-time Environmental Data Services) Data Server (TDS) (http://daymet.ornl.gov/thredds_mosaics.html). The Single Pixel Data Extraction Tool [2] allows users to enter a single geographic point by latitude and longitude in decimal degrees. A routine is executed that translates the (lon, lat) coordinates into projected Daymet (x, y) coordinates, which are then used to access the Daymet database of daily-interpolated surface weather variables. The Single Pixel Data Extraction Tool also provides the option to download multiple coordinates programmatically. The ORNL DAAC's TDS provides customized visualization of and access to Daymet time series of North American mosaics. Users can subset and download Daymet data via a variety of community standards, including OPeNDAP, the NetCDF Subset Service, and Open Geospatial Consortium (OGC) Web Map/Coverage Services. References: [1] Thornton, P. E., Thornton, M. M., Mayer, B. W., Wilhelmi, N., Wei, Y., Devarakonda, R., & Cook, R. (2012). "Daymet: Daily surface weather on a 1 km grid for North America, 1980-2008". Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center for Biogeochemical Dynamics (DAAC), 1. [2] Devarakonda, R., et al. 2012. Daymet: Single Pixel Data Extraction Tool. Available at http://daymet.ornl.gov/singlepixel.html.
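
    The (lon, lat) to projected Daymet (x, y) translation that the Single Pixel Tool performs can be reproduced with pyproj. The Lambert Conformal Conic parameters below are taken from the Daymet product documentation but should be treated as assumptions to verify against the current specification.

```python
from pyproj import Transformer

# Daymet Lambert Conformal Conic (assumed parameters: standard parallels
# 25 and 60 deg N, origin 42.5 deg N / 100 deg W, WGS84 ellipsoid).
DAYMET_LCC = ("+proj=lcc +lat_1=25 +lat_2=60 +lat_0=42.5 +lon_0=-100 "
              "+x_0=0 +y_0=0 +ellps=WGS84 +units=m")
to_daymet = Transformer.from_crs("EPSG:4326", DAYMET_LCC, always_xy=True)

x, y = to_daymet.transform(-84.29, 35.96)   # (lon, lat) near Oak Ridge, TN
print(x, y)   # projected (x, y) in metres, used to index the 1-km grid
```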

  1. Statistical Considerations of Data Processing in Giovanni Online Tool

    NASA Technical Reports Server (NTRS)

    Suhung, Shen; Leptoukh, G.; Acker, J.; Berrick, S.

    2005-01-01

    The GES DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni) is a web-based interface for the rapid visualization and analysis of gridded data from a number of remote sensing instruments. The GES DISC currently employs several Giovanni instances to analyze various products, such as Ocean-Giovanni for ocean products from SeaWiFS and MODIS-Aqua, TOMS & OMI Giovanni for atmospheric chemical trace gases from TOMS and OMI, and MOVAS for aerosols from MODIS, etc. (http://giovanni.gsfc.nasa.gov). Foremost among the Giovanni statistical functions is data averaging. Two aspects of this function are addressed here. The first deals with the accuracy of averaging gridded mapped products vs. averaging the ungridded Level 2 data. Some mapped products contain mean values only; others contain additional statistics, such as the number of pixels (NP) in each grid cell, the standard deviation, etc. Since NP varies spatially and temporally, averaging with or without weighting by NP will give different results. In this paper, we address the differences between various weighting algorithms for some datasets utilized in Giovanni. The second aspect relates to how different averaging methods affect data quality and interpretation for data with a non-normal distribution. The present study demonstrates results of different spatial averaging methods using gridded SeaWiFS Level 3 mapped monthly chlorophyll a data. Spatial averages were calculated using three different methods: the arithmetic mean (AVG), the geometric mean (GEO), and the maximum likelihood estimator (MLE). Biogeochemical data, such as chlorophyll a, are usually considered to have a log-normal distribution. The study determined that differences between methods tend to increase with the increasing size of a selected coastal area, with no significant differences in most open oceans. The GEO method consistently produces values lower than AVG and MLE. The AVG method produces values larger than MLE in some cases, but smaller in others. Further studies indicated that significant differences between the AVG and MLE methods occurred in coastal areas where data have large spatial variations and a log-bimodal distribution instead of a log-normal distribution.
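
    The three estimators compared in the study are easy to state concretely. A short numpy sketch for a sample assumed log-normal (the MLE plugs the log-space mean and variance into exp(mu + sigma^2/2)); the example values are invented to show the qualitative ordering reported above.

```python
import numpy as np

def spatial_means(chl):
    """Arithmetic mean, geometric mean, and log-normal MLE of the mean."""
    chl = np.asarray(chl, dtype=float)
    logs = np.log(chl)
    avg = chl.mean()                              # AVG: arithmetic mean
    geo = np.exp(logs.mean())                     # GEO: geometric mean
    mle = np.exp(logs.mean() + logs.var() / 2)    # MLE: exp(mu + sigma^2/2)
    return avg, geo, mle

# A skewed, roughly log-normal coastal-like sample: GEO falls below
# both AVG and MLE, as the study reports.
print(spatial_means([0.1, 0.12, 0.15, 0.2, 1.8]))
```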

  2. Integrating TITAN2D Geophysical Mass Flow Model with GIS

    NASA Astrophysics Data System (ADS)

    Namikawa, L. M.; Renschler, C.

    2005-12-01

    TITAN2D simulates geophysical mass flows over natural terrain using depth-averaged granular flow models and requires spatially distributed parameter values to solve differential equations. Since a Geographical Information System's (GIS) main task is the integration and manipulation of data covering a geographic region, using a GIS to implement simulations of complex, physically-based models such as TITAN2D seems a natural choice. However, simulation of geophysical flows requires computationally intensive operations that need unique optimizations, such as adaptive grids and parallel processing. Thus a GIS developed for general use cannot provide an effective environment for complex simulations, and the solution is to develop a linkage between the GIS and the simulation model. The present work presents the solution used for TITAN2D, where the data structure of a GIS is accessed by the simulation code through an Application Program Interface (API). GRASS is an open-source GIS with published data formats; thus the GRASS data structure was selected. TITAN2D requires elevation, slope, curvature, and base material information at every cell to be computed. Results from the simulation are visualized by a system developed to handle the large amount of output data and to support a realistic dynamic 3-D display of flow dynamics, which requires elevation and texture, usually from a remote sensor image. Data required by the simulation are in raster format, using regular rectangular grids. The GRASS format for regular grids is based on a data file (a binary file storing data either uncompressed or compressed by grid row), a header file (a text file with information about georeferencing, data extents, and grid cell resolution), and support files (text files with information about the color table and category names). The implemented API provides access to the original data (elevation, base material, and texture from imagery) and to slope and curvature derived from the elevation data. Of several existing methods to estimate slope and curvature from elevation, the selected one is based on a third-order finite difference method, which has been shown to perform better than, or differ minimally from, more computationally expensive methods. Derivatives are estimated using a weighted sum of the 8 grid neighbor values. The method was implemented, simulation results were compared to derivatives estimated by a simplified version of the method (which uses only 4 neighbor cells), and the full method proved to perform better. TITAN2D uses an adaptive mesh, where resolution (grid cell size) is not constant, and the visualization tools also use textures with varying resolutions for efficient display. The API supports different resolutions, applying bilinear interpolation when elevation, slope and curvature are required at a resolution higher (smaller cell size) than the original, and using a nearest-cell approach for elevations at a resolution lower (larger cell size) than the original. For material information the nearest-neighbor method is used, since interpolation on categorical data has no meaning. The low-fidelity character of visualization allows use of the nearest-neighbor method for texture as well. Bilinear interpolation estimates the value at a point as the distance-weighted average of the values at the closest four cell centers; its performance is only slightly inferior to more computationally expensive methods such as bicubic interpolation and kriging.
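
    The 8-neighbor weighted-difference slope estimate described above (a scheme often attributed to Horn) is a few lines of numpy. This sketch is illustrative rather than the API's actual code, and it returns slope for interior cells only.

```python
import numpy as np

def horn_slope(z, cellsize):
    """Slope (radians) by the 8-neighbour weighted finite-difference scheme.

    For the 3x3 window  a b c / d e f / g h i  centred on each cell:
        dz/dx = ((c + 2f + i) - (a + 2d + g)) / (8 * cellsize)
        dz/dy = ((g + 2h + i) - (a + 2b + c)) / (8 * cellsize)
    """
    a, b, c = z[:-2, :-2], z[:-2, 1:-1], z[:-2, 2:]
    d, f    = z[1:-1, :-2], z[1:-1, 2:]
    g, h, i = z[2:, :-2], z[2:, 1:-1], z[2:, 2:]
    dzdx = ((c + 2 * f + i) - (a + 2 * d + g)) / (8 * cellsize)
    dzdy = ((g + 2 * h + i) - (a + 2 * b + c)) / (8 * cellsize)
    return np.arctan(np.hypot(dzdx, dzdy))   # interior cells only
```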

  3. A software platform for continuum modeling of ion channels based on unstructured mesh

    NASA Astrophysics Data System (ADS)

    Tu, B.; Bai, S. Y.; Chen, M. X.; Xie, Y.; Zhang, L. B.; Lu, B. Z.

    2014-01-01

    Most traditional continuum molecular modeling has adopted finite difference or finite volume methods based on a structured mesh (grid). Unstructured meshes have only occasionally been used, but an increasing number of applications are emerging in molecular simulations. To facilitate the continuum modeling of biomolecular systems based on unstructured meshes, we are developing a software platform with tools that are particularly beneficial to those approaches. This work describes the software system specifically for the simulation of a typical, complex molecular procedure: ion transport through a three-dimensional channel system that consists of a protein and a membrane. The platform contains three parts: a meshing tool chain for ion channel systems, a parallel finite element solver for the Poisson-Nernst-Planck equations describing the electrodiffusion process of ion transport, and a visualization program for continuum molecular modeling. The meshing tool chain in the platform, which consists of a set of mesh generation tools, is able to generate high-quality surface and volume meshes for ion channel systems. The parallel finite element solver in our platform is based on the parallel adaptive finite element package PHG, which was developed by one of the authors [1]. As a featured component of the platform, a new visualization program, VCMM, has been developed specifically for continuum molecular modeling, with an emphasis on providing useful facilities for unstructured mesh-based methods and for their output analysis and visualization. VCMM provides a graphical user interface and consists of three modules: a molecular module, a meshing module and a numerical module. A demonstration of the platform is provided with a study of two real proteins, the connexin 26 and hemolysin ion channels.
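
    For reference, the equations the parallel solver targets are commonly written in the following steady-state form; this is a standard textbook statement, and the platform's own formulation may differ in boundary treatment and coupling details.

```latex
% Steady-state Poisson--Nernst--Planck (PNP) system: phi is the electrostatic
% potential, c_i and q_i the concentration and charge of ion species i,
% D_i its diffusivity, and rho^f the fixed (protein/membrane) charge density.
\nabla \cdot \left( \epsilon(\mathbf{r}) \, \nabla \phi \right)
    = - \rho^{f}(\mathbf{r}) - \sum_{i} q_i \, c_i(\mathbf{r}),
\qquad
\nabla \cdot \left[ D_i(\mathbf{r}) \left( \nabla c_i
    + \frac{q_i \, c_i}{k_B T} \, \nabla \phi \right) \right] = 0 .
```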

  4. Observations of solar-cell metallization corrosion

    NASA Technical Reports Server (NTRS)

    Mon, G. R.

    1983-01-01

    The Engineering Sciences Area of the Jet Propulsion Laboratory (JPL) Flat-Plate Solar Array Project is performing long-term environmental tests on photovoltaic modules at Wyle Laboratories in Huntsville, Alabama. Some modules have been exposed to 85 C/85% RH and 40 C/93% RH for up to 280 days. Other modules undergoing temperature-only exposures (<3% RH) at 85 C and 100 C have been tested for more than 180 days. At least two modules of each design type are exposed to each environment: one with, and the other without, a 100-mA forward bias. Degradation is both visually observed and electrically monitored. Visual observations of changes in appearance are recorded at each inspection time. Significant visual observations relating to metallization corrosion (and/or metallization-induced corrosion) include discoloration (yellowing and browning) of grid lines, migration of grid line material into the encapsulation (blossoming), the appearance of rainbow-like diffraction patterns on the grid lines, and brown spots on collectors and grid lines. All of these observations were recorded for electrically biased modules in the 280-day tests with humidity.

  5. RIMS: An Integrated Mapping and Analysis System with Applications to Earth Sciences and Hydrology

    NASA Astrophysics Data System (ADS)

    Proussevitch, A. A.; Glidden, S.; Shiklomanov, A. I.; Lammers, R. B.

    2011-12-01

    A web-based information and computational system for the analysis of spatially distributed Earth system, climate, and hydrologic data has been developed. The system allows visualization, data exploration, querying, manipulation, and arbitrary calculations with any loaded gridded or vector polygon dataset. The system's acronym, RIMS, stands for its core functionality as a Rapid Integrated Mapping System. The system can be deployed for global-scale projects as well as for regional hydrology and climatology studies. In particular, the Water Systems Analysis Group of the University of New Hampshire has developed global and regional (Northern Eurasia, pan-Arctic) versions of the system with different map projections and specific data. The system has demonstrated its potential for applications in other fields of Earth sciences and education. The key web server/client components of the framework include (a) a visualization engine built on open-source libraries (GDAL, PROJ.4, etc.) that are utilized in MapServer; (b) multi-level data querying tools built on XML server-client communication protocols that allow downloading map data on the fly to a client web browser; and (c) data manipulation and grid-cell-level calculation tools that mimic desktop GIS software functionality via a web interface. Server-side data management is designed around a simple database of dataset metadata, making it easy to mount new data to the system and maintain existing data. RIMS contains "built-in" river network data that allows on-demand queries of upstream areas, which can be used for spatial data aggregation and analysis of sub-basin areas. RIMS is an ongoing effort and is currently being used to serve a number of websites hosting a suite of hydrologic, environmental, and other GIS data.
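
    The on-demand upstream-area query mentioned above can be illustrated with a standard flow-direction traversal: invert the drainage network, then walk upstream from the outlet. The encoding below (each cell stores the offset of its downstream neighbor) is an assumption for the sketch, not RIMS's actual data format.

    ```python
    from collections import deque

    def upstream_cells(flowdir, outlet):
        """Return the set of cells draining to `outlet`. `flowdir[r][c]`
        holds the (drow, dcol) offset of the downstream neighbor, or None
        for a sink -- a hypothetical encoding, not RIMS's format."""
        nrows, ncols = len(flowdir), len(flowdir[0])
        inflow = {}  # cell -> list of cells that flow directly into it
        for r in range(nrows):
            for c in range(ncols):
                d = flowdir[r][c]
                if d is not None:
                    inflow.setdefault((r + d[0], c + d[1]), []).append((r, c))
        basin, queue = {outlet}, deque([outlet])
        while queue:            # breadth-first walk up the drainage tree
            for up in inflow.get(queue.popleft(), []):
                if up not in basin:
                    basin.add(up)
                    queue.append(up)
        return basin
    ```

    Aggregating any gridded variable over the returned cell set then yields the sub-basin statistics described in the abstract.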

  6. Distributed data mining on grids: services, tools, and applications.

    PubMed

    Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo

    2004-12-01

    Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools, starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed.

  7. Teaching Fractions and Decimals: Fun with Picture Grids.

    ERIC Educational Resources Information Center

    Stix, Andi

    In these two companion papers, a learning activity is introduced that teaches students how mathematics works through the visual aid of picture grids. The picture grids are composed of 60 different acrylic stained-glass window overlays. Each fractional part is represented by a different color: fifths are green, quarters are yellow, etc. The square…

  8. QAPgrid: A Two Level QAP-Based Approach for Large-Scale Data Analysis and Visualization

    PubMed Central

    Inostroza-Ponta, Mario; Berretta, Regina; Moscato, Pablo

    2011-01-01

    Background The visualization of large volumes of data is a computationally challenging task that often promises rewarding new insights. There is great potential in the application of new algorithms and models from combinatorial optimisation. Datasets often contain “hidden regularities” and a combined identification and visualization method should reveal these structures and present them in a way that helps analysis. While several methodologies exist, including those that use non-linear optimization algorithms, severe limitations exist even when working with only a few hundred objects. Methodology/Principal Findings We present a new data visualization approach (QAPgrid) that reveals patterns of similarities and differences in large datasets of objects for which a similarity measure can be computed. Objects are assigned to positions on an underlying square grid in a two-dimensional space. We use the Quadratic Assignment Problem (QAP) as a mathematical model to provide an objective function for the assignment of objects to positions on the grid. We employ a Memetic Algorithm (a powerful metaheuristic) to tackle the large instances of this NP-hard combinatorial optimization problem, and we show its performance on the visualization of real data sets. Conclusions/Significance Overall, the results show that the QAPgrid algorithm is able to produce a layout that represents the relationships between objects in the data set. Furthermore, it also represents the relationships between clusters that are fed into the algorithm. We apply QAPgrid to an instance of 84 Indo-European languages, producing a near-optimal layout. Next, we produce a layout of 470 world universities that shows a high degree of correlation with the scores of the Shanghai Jiao Tong University Academic Ranking of World Universities, without the need for an ad hoc weighting of attributes. Finally, our Gene Ontology-based study on Saccharomyces cerevisiae fully demonstrates the scalability and precision of our method as a novel alternative tool for functional genomics. PMID:21267077
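
    To make the QAP formulation concrete: with a similarity s_ij between objects i and j and a distance between their assigned grid cells, a layout is scored by the sum of similarity-weighted distances, so minimizing the score pulls similar objects together. The sketch below evaluates that objective; names and conventions are illustrative, not the authors' code.

    ```python
    import numpy as np

    def qap_cost(sim, assignment, grid_width):
        """Score an assignment of objects to cells of a square grid:
        sum over object pairs of similarity times grid distance (lower
        is better, so similar objects should sit close together).
        `assignment` is an integer NumPy array mapping each object to a
        cell index. Sketch of the objective only, not QAPgrid itself."""
        rows = assignment // grid_width   # grid row of each object's cell
        cols = assignment % grid_width    # grid column of each object's cell
        n, cost = len(assignment), 0.0
        for i in range(n):
            for j in range(i + 1, n):
                dist = np.hypot(rows[i] - rows[j], cols[i] - cols[j])
                cost += sim[i, j] * dist
        return cost
    ```

    The memetic algorithm in the paper searches over such assignments; exhaustive search is hopeless because the underlying QAP is NP-hard.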

  9. QAPgrid: a two level QAP-based approach for large-scale data analysis and visualization.

    PubMed

    Inostroza-Ponta, Mario; Berretta, Regina; Moscato, Pablo

    2011-01-18

    The visualization of large volumes of data is a computationally challenging task that often promises rewarding new insights. There is great potential in the application of new algorithms and models from combinatorial optimisation. Datasets often contain "hidden regularities" and a combined identification and visualization method should reveal these structures and present them in a way that helps analysis. While several methodologies exist, including those that use non-linear optimization algorithms, severe limitations exist even when working with only a few hundred objects. We present a new data visualization approach (QAPgrid) that reveals patterns of similarities and differences in large datasets of objects for which a similarity measure can be computed. Objects are assigned to positions on an underlying square grid in a two-dimensional space. We use the Quadratic Assignment Problem (QAP) as a mathematical model to provide an objective function for the assignment of objects to positions on the grid. We employ a Memetic Algorithm (a powerful metaheuristic) to tackle the large instances of this NP-hard combinatorial optimization problem, and we show its performance on the visualization of real data sets. Overall, the results show that the QAPgrid algorithm is able to produce a layout that represents the relationships between objects in the data set. Furthermore, it also represents the relationships between clusters that are fed into the algorithm. We apply QAPgrid to an instance of 84 Indo-European languages, producing a near-optimal layout. Next, we produce a layout of 470 world universities that shows a high degree of correlation with the scores of the Shanghai Jiao Tong University Academic Ranking of World Universities, without the need for an ad hoc weighting of attributes. Finally, our Gene Ontology-based study on Saccharomyces cerevisiae fully demonstrates the scalability and precision of our method as a novel alternative tool for functional genomics.

  10. A Job Monitoring and Accounting Tool for the LSF Batch System

    NASA Astrophysics Data System (ADS)

    Sarkar, Subir; Taneja, Sonia

    2011-12-01

    This paper presents a web-based job monitoring and group-and-user accounting tool for the LSF Batch System. The user-oriented job monitor displays a simple and compact quasi-real-time overview of the batch farm for both local and Grid jobs; for Grid jobs the Distinguished Name (DN) of the Grid user is shown. The overview monitor provides the most up-to-date status of a batch farm at any time. The accounting tool works with the LSF accounting log files. The accounting information is shown for a few pre-defined time periods by default; however, one can also compute the same information for any arbitrary time window. The tool has already proved to be an extremely useful means of validating the more extensive accounting tools available in the Grid world. Several sites are already using the present tool, and more sites running the LSF batch system have shown interest. We discuss the various aspects that make the tool essential for site administrators and end-users alike and outline the current status of development as well as future plans.
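
    The arbitrary-time-window accounting described above reduces to filtering finished-job records by timestamp and summing per group and user. The record layout below is a simplified stand-in for parsed LSF accounting log entries, not the real lsb.acct format.

    ```python
    from collections import defaultdict

    def account(records, t_start, t_end):
        """Sum CPU usage per (group, user) over an arbitrary time window.
        Each record is a (finish_time, user, group, cpu_seconds) tuple --
        a simplified stand-in for a parsed LSF accounting entry."""
        usage = defaultdict(float)
        for finish_time, user, group, cpu_seconds in records:
            if t_start <= finish_time < t_end:
                usage[(group, user)] += cpu_seconds
        return dict(usage)
    ```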

  11. Context-dependent spatially periodic activity in the human entorhinal cortex

    PubMed Central

    Nguyen, T. Peter; Török, Ágoston; Shen, Jason Y.; Briggs, Deborah E.; Modur, Pradeep N.; Buchanan, Robert J.

    2017-01-01

    The spatially periodic activity of grid cells in the entorhinal cortex (EC) of the rodent, primate, and human provides a coordinate system that, together with the hippocampus, informs an individual of its location relative to the environment and encodes the memory of that location. Among the most defining features of grid-cell activity are the 60° rotational symmetry of grids and the preservation of grid scale across environments. Grid cells, however, do display a limited degree of adaptation to environments. It remains unclear whether this level of environment invariance generalizes to human grid-cell analogs, where the relative contribution of visual input to the multimodal sensory input of the EC is significantly larger than in rodents. Patients diagnosed with intractable epilepsy and implanted with entorhinal cortical electrodes, who performed virtual navigation tasks to memorized locations, enabled us to investigate associations between grid-like patterns and the environment. Here, we report that the activity of human entorhinal cortical neurons exhibits adaptive scaling in grid period, grid orientation, and rotational symmetry in close association with changes in environment size, shape, and visual cues, suggesting scale invariance of the frequency, rather than the wavelength, of spatially periodic activity. Our results demonstrate that neurons in the human EC represent space with an enhanced flexibility relative to neurons in rodents because they are endowed with adaptive scalability and context dependency. PMID:28396399

  12. Optimal Sizing Tool for Battery Storage in Grid Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-09-24

    The battery storage sizing tool developed at Pacific Northwest National Laboratory can be used to evaluate economic performance and determine the optimal size of battery storage in different use cases considering multiple power system applications. The considered use cases include i) utility-owned battery storage and ii) battery storage behind the customer meter. The power system applications of energy storage include energy arbitrage, balancing services, T&D deferral, outage mitigation, demand charge reduction, etc. Most existing solutions consider only one or two grid services simultaneously, such as balancing service and energy arbitrage. ES-Select, developed by Sandia and KEMA, is able to consider multiple grid services, but it stacks the grid services by priority instead of co-optimizing them. This tool is the first to provide co-optimization across system-level and local grid services.
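
    To make the co-optimization idea concrete, even a single service (energy arbitrage) can be posed as a linear program over hourly charge/discharge decisions; co-optimizing additional services adds terms and constraints to the same program instead of stacking services by priority. A minimal sketch, with all parameters hypothetical:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def arbitrage_lp(prices, capacity_mwh, power_mw, eff=0.9):
        """Hourly energy-arbitrage schedule for a battery as a linear
        program: maximize discharge revenue minus charging cost subject
        to power and state-of-charge limits. Illustrative sketch, not
        the PNNL sizing tool. Decision vector x = [charge; discharge]."""
        T = len(prices)
        prices = np.asarray(prices, dtype=float)
        # linprog minimizes: pay prices when charging, earn when discharging.
        c = np.concatenate([prices, -prices])
        # State of charge after hour t: cumulative eff*charge - discharge/eff.
        L = np.tril(np.ones((T, T)))
        soc = np.hstack([eff * L, -L / eff])
        # Enforce 0 <= SOC <= capacity at every hour.
        A_ub = np.vstack([soc, -soc])
        b_ub = np.concatenate([np.full(T, capacity_mwh), np.zeros(T)])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0.0, power_mw)] * (2 * T))
        return res.x[:T], res.x[T:], -res.fun  # charge, discharge, revenue
    ```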

  13. Vortex Filaments in Grids for Scalable, Fine Smoke Simulation.

    PubMed

    Meng, Zhang; Weixin, Si; Yinling, Qian; Hanqiu, Sun; Jing, Qin; Heng, Pheng-Ann

    2015-01-01

    Vortex modeling can produce attractive visual effects of dynamic fluids, which are widely applicable to dynamic media, computer games, special effects, and virtual reality systems. However, it is challenging to efficiently simulate intense, finely detailed fluids such as smoke when the numbers of vortex filaments and smoke particles grow rapidly. The authors propose a novel vortex-filaments-in-grids scheme in which uniform grids dynamically bridge the vortex filaments and smoke particles for scalable, fine smoke simulation with macroscopic vortex structures. Using the vortex model, their approach supports a trade-off between simulation speed and scale of detail. After the full velocity field is computed, external control can easily be exerted on the embedded grid to guide the vortex-based smoke motion. The experimental results demonstrate the efficiency of the proposed scheme for visually plausible smoke simulation with macroscopic vortex structures.
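
    The velocity that filament methods transfer to a bridging grid comes from the Biot-Savart law. The sketch below evaluates a regularized Biot-Savart kernel for a polyline filament at arbitrary points (for instance, grid nodes); it is a generic textbook evaluation, not the authors' scheme.

    ```python
    import numpy as np

    def biot_savart(points, filament, gamma, core=0.1):
        """Velocity induced at `points` (N, 3) by a vortex filament given
        as a polyline (M, 3) with circulation `gamma`, using a smoothing
        core to avoid the singularity. Generic sketch, not the paper's
        vortex-filaments-in-grids scheme."""
        vel = np.zeros_like(points, dtype=float)
        for a, b in zip(filament[:-1], filament[1:]):
            dl = b - a                          # segment vector
            r = points - 0.5 * (a + b)          # offsets from segment midpoint
            r2 = np.sum(r * r, axis=1) + core**2
            # dv = gamma/(4*pi) * (dl x r) / |r|^3, regularized by `core`
            vel += gamma / (4 * np.pi) * np.cross(dl, r) / r2[:, None]**1.5
        return vel
    ```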

  14. Using Flow Charts to Visualize the Decision-Making Process in Space Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Aung, M. T. Y.; Myat, T.; Zheng, Y.; Mays, M. L.; Ngwira, C.; Damas, M. C.

    2016-12-01

    Our society today relies heavily on technological systems such as satellites, navigation systems, power grids, and aviation. These systems are very sensitive to space weather disturbances. When Earth-directed space weather driven by the Sun arrives at the Earth, it causes changes to the Earth's radiation environment and the magnetosphere. Strong disturbances in the Earth's magnetosphere are responsible for geomagnetic storms that can last from hours to days depending on their strength. Geomagnetic storms can severely impact critical infrastructure on Earth, such as the electric power grid, while solar energetic particles can endanger life in outer space. How can we lessen these adverse effects? They can be lessened through early warning signals sent by space weather forecasters before a CME or high-speed stream arrives. A space weather forecaster's duty is to send predicted notifications to high-tech industries and NASA missions so that they can take extra protective measures. NASA space weather forecasters make prediction decisions by following certain steps and processes from the time an event occurs at the Sun all the way to the impact locations. Until now, however, there has been no tool to help forecasters visualize this decision process. We created a flow chart to do exactly that. The flow chart provides basic knowledge of space weather and can be used to train future space weather forecasters; it also helps shorten the training period and increase consistency in forecasting. The flow chart is also a useful reference for people who are already familiar with space weather.

  15. Distributed data analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  16. Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher R. Johnson, Charles D. Hansen

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah, and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and to demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology, and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, "Computational Grids," and CAVE technologies and to add these to the teams that had developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  17. A visual identification key utilizing both gestalt and analytic approaches to identification of Carices present in North America (Plantae, Cyperaceae)

    PubMed Central

    2013-01-01

    Abstract Images are a critical part of the identification process because they enable direct, immediate and relatively unmediated comparisons between a specimen being identified and one or more reference specimens. The Carices Interactive Visual Identification Key (CIVIK) is a novel tool for identification of North American Carex species, the largest vascular plant genus in North America, and two less numerous closely-related genera, Cymophyllus and Kobresia. CIVIK incorporates 1288 high-resolution tiled image sets that allow users to zoom in to view minute structures that are crucial at times for identification in these genera. Morphological data are derived from the earlier Carex Interactive Identification Key (CIIK) which in turn used data from the Flora of North America treatments. In this new iteration, images can be viewed in a grid or histogram format, allowing multiple representations of data. In both formats the images are fully zoomable. PMID:24723777

  18. Schlieren technique in soap film flows

    NASA Astrophysics Data System (ADS)

    Auliel, M. I.; Hebrero, F. Castro; Sosa, R.; Artana, G.

    2017-05-01

    We propose the use of the Schlieren technique as a tool to analyse flows in soap film tunnels. The technique makes it possible to visualize perturbations of the film produced by the interposition of an object in the flow. The variations in image intensity are produced as a consequence of the deviation of the light beam traversing the deformed surfaces of the film. The quality of the Schlieren image is compared to images produced by the conventional interferometric technique. The analysis of Schlieren images of a cylinder wake flow indicates that this technique enables easy visualization of vortex centers. Post-processing of pairs of successive images of a grid turbulent flow with a dense motion estimator is used to derive the velocity fields. The results obtained with this self-seeded flow show good agreement with the statistical properties of 2D turbulent flows reported in the literature.
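
    Dense motion estimators of this kind are available off the shelf; for example, Farneback optical flow in OpenCV returns a per-pixel displacement field from two successive frames, which becomes a velocity field after scaling by the pixel size and frame interval. The parameter values below are common defaults, not those used in the paper.

    ```python
    import cv2

    def velocity_field(frame1, frame2, dt, m_per_px):
        """Estimate a 2D velocity field (H, W, 2) in m/s from two
        successive grayscale images using Farneback dense optical flow.
        Illustrative parameter choices, not those of the paper."""
        flow = cv2.calcOpticalFlowFarneback(
            frame1, frame2, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        return flow * m_per_px / dt   # pixels per frame -> meters per second
    ```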

  19. Analysis of Ligand-Receptor Association and Intermediate Transfer Rates in Multienzyme Nanostructures with All-Atom Brownian Dynamics Simulations.

    PubMed

    Roberts, Christopher C; Chang, Chia-En A

    2016-08-25

    We present the second-generation GeomBD Brownian dynamics software for determining interenzyme intermediate transfer rates and substrate association rates in biomolecular complexes. Substrate and intermediate association rates for a series of enzymes or biomolecules can be compared between the freely diffusing disorganized configuration and various colocalized or complexed arrangements for kinetic investigation of enhanced intermediate transfer. In addition, enzyme engineering techniques, such as synthetic protein conjugation, can be computationally modeled and analyzed to better understand changes in substrate association relative to native enzymes. Tools are provided to determine nonspecific ligand-receptor association residence times, and to visualize common sites of nonspecific association of substrates on receptor surfaces. To demonstrate features of the software, interenzyme intermediate substrate transfer rate constants are calculated and compared for all-atom models of DNA origami scaffold-bound bienzyme systems of glucose oxidase and horseradish peroxidase. Also, a DNA-conjugated horseradish peroxidase enzyme was analyzed for its propensity to increase substrate association rates and substrate local residence times relative to the unmodified enzyme. We also demonstrate the rapid determination and visualization of common sites of nonspecific ligand-receptor association by using HIV-1 protease and an inhibitor, XK263. GeomBD2 accelerates simulations by precomputing van der Waals potential energy grids and electrostatic potential grid maps, and has flexible and extensible support for all-atom and coarse-grained force fields. Simulation software is written in C++ and utilizes modern parallelization techniques for potential grid preparation and Brownian dynamics simulation processes. Analysis scripts, written in the Python scripting language, are provided for quantitative simulation analysis. GeomBD2 is applicable to the fields of biophysics, bioengineering, and enzymology in both predictive and explanatory roles.
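
    The core of a Brownian dynamics propagator of this kind is the overdamped (Ermak-McCammon) position update, with forces looked up from precomputed grids rather than recomputed pairwise. A schematic single step, with all names and the force-lookup interface assumed for illustration:

    ```python
    import numpy as np

    def bd_step(pos, force_lookup, diff_coef, kT, dt, rng):
        """One overdamped Brownian dynamics step: deterministic drift
        along the interpolated force plus Gaussian diffusion.
        `force_lookup(pos)` stands in for trilinear interpolation of the
        precomputed potential-gradient grids. Schematic sketch only."""
        drift = (diff_coef / kT) * force_lookup(pos) * dt
        noise = np.sqrt(2.0 * diff_coef * dt) * rng.standard_normal(3)
        return pos + drift + noise
    ```

    In such codes, association is typically scored by testing after each step whether the trajectory has entered a predefined binding region near the receptor.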

  20. An Integrated Multivariable Visualization Tool for Marine Sanctuary Climate Assessments

    NASA Astrophysics Data System (ADS)

    Shein, K. A.; Johnston, S.; Stachniewicz, J.; Duncan, B.; Cecil, D.; Ansari, S.; Urzen, M.

    2012-12-01

    The comprehensive development and use of ecological climate impact assessments by ecosystem managers can be limited by data access and visualization methods that require a priori knowledge about the various large and complex climate data products necessary to those impact assessments. In addition, it can be difficult to geographically and temporally integrate climate and ecological data to fully characterize climate-driven ecological impacts. To address these considerations, we have enhanced and extended the functionality of the NOAA National Climatic Data Center's Weather and Climate Toolkit (WCT). The WCT is a freely available Java-based tool designed to access and display NCDC's georeferenced climate data products (e.g., satellite, radar, and reanalysis gridded data). However, the WCT requires that users already know how to obtain the data products, which products are preferred for a given variable, and which products are most relevant to their needs. Developed in cooperation with research and management customers at the Gulf of the Farallones National Marine Sanctuary, the Integrated Marine Protected Area Climate Tools (IMPACT) modification to the WCT simplifies or eliminates these requirements while adding core analytical functionality to the tool. Designed for use by marine ecosystem managers, WCT-IMPACT accesses a suite of data products that have been identified as relevant to marine ecosystem climate impact assessments, such as NOAA's Climate Data Records. WCT-IMPACT regularly crops these products to the geographic boundaries of each included marine protected area (MPA), and the clipped regions are processed to produce MPA-specific analytics. The tool retrieves the most appropriate data files based on the user's selection of MPA, environmental variable(s), and time frame. Once the data are loaded, they may be visualized, explored, analyzed, and exported to other formats (e.g., Google KML). Multiple variables may be simultaneously visualized using a 4-panel display and compared via a variety of statistics such as difference, probability, or correlation maps. [Figure: NCDC's Weather and Climate Toolkit image of NARR-A non-convective cloud cover (%) over the Pacific Coast on June 17, 2012 at 09:00 GMT.]
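
    The per-cell statistics mentioned above (difference, probability, or correlation maps) reduce to elementwise operations along the time axis of the clipped grids. Below is a sketch of a correlation map between two variables over an MPA mask; names and shapes are illustrative, not WCT-IMPACT internals.

    ```python
    import numpy as np

    def correlation_map(a, b, mask):
        """Per-cell Pearson correlation between two (T, H, W) stacks of
        gridded data; cells outside the protected area (mask == False)
        are returned as NaN. Illustrative sketch, not WCT-IMPACT code."""
        a = a - a.mean(axis=0)
        b = b - b.mean(axis=0)
        num = (a * b).sum(axis=0)
        den = np.sqrt((a * a).sum(axis=0) * (b * b).sum(axis=0))
        with np.errstate(invalid="ignore", divide="ignore"):
            r = num / den
        return np.where(mask, r, np.nan)
    ```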

  1. Open source 3D visualization and interaction dedicated to hydrological models

    NASA Astrophysics Data System (ADS)

    Richard, Julien; Giangola-Murzyn, Agathe; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2014-05-01

    Climate change and surface urbanization strongly modify the hydrological cycle in urban areas, increasing the consequences of extreme events such as floods or droughts. These issues led to the development of the Multi-Hydro model at the Ecole des Ponts ParisTech (A. Giangola-Murzyn et al., 2012). This fully distributed model computes the hydrological response of urban and peri-urban areas. Unfortunately, such models are seldom user friendly: generating the inputs before launching a new simulation is usually a tricky task, and understanding and interpreting the outputs remain specialist tasks not accessible to the wider public. MH-AssimTool was developed to overcome these issues. To enable an easier and improved understanding of the model outputs, we decided to convert the raw output data (grid files in ASCII format) to a 3D display. Some commercial models provide 3D visualization, but because of the cost of their licenses such tools may not be accessible to the stakeholders most concerned. We are therefore developing a new tool based on C++ for the computation, Qt for the graphical user interface, QGIS for the geographical side, and OpenGL for the 3D display. All these languages and libraries are open source and multi-platform. We will discuss some preprocessing issues in the conversion from 2.5D to 3D. Indeed, the GIS data are 2.5D (i.e., a 2D polygon plus one height), and transforming them for 3D display involves a number of algorithms. For example, to visualize a building in 3D, each point needs coordinates and an elevation consistent with the topography, and new points must be created to represent the walls (see the sketch below). Finally, we will discuss the interactions between the model and stakeholders through this new interface, and how this helps convert a research tool into an efficient operational decision tool. This ongoing research on the improvement of visualization methods is supported by the KIC-Climate Blue Green Dream project.
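
    A minimal sketch of that 2.5D-to-3D step: drape the footprint on the terrain and emit two triangles per wall segment. The flat-roof assumption and the `ground_elev` sampling interface are illustrative, not MH-AssimTool's actual algorithm.

    ```python
    def extrude_building(footprint, height, ground_elev):
        """Turn a 2.5D building (2D polygon + one height) into 3D wall
        triangles. `footprint` is a list of (x, y) vertices and
        `ground_elev(x, y)` samples the terrain. Generic sketch only."""
        walls, n = [], len(footprint)
        for k in range(n):
            (x1, y1), (x2, y2) = footprint[k], footprint[(k + 1) % n]
            z1, z2 = ground_elev(x1, y1), ground_elev(x2, y2)
            top = max(z1, z2) + height            # flat roof line
            a, b = (x1, y1, z1), (x2, y2, z2)     # base follows the terrain
            c, d = (x2, y2, top), (x1, y1, top)
            walls += [(a, b, c), (a, c, d)]       # two triangles per wall
        return walls
    ```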

  2. The Rules Grid: Helping Children with Social Communication and Interaction Needs Manage Social Complexity

    ERIC Educational Resources Information Center

    Devlin, Niall

    2009-01-01

    This article introduces a new practical visual approach, the Rules Grid, to support children who have social communication and interaction needs. The Rules Grid involves a system whereby behaviours of concern can be broken down into smaller behavioural manifestations which in turn lead not only to problem identification and specification, but…

  3. On a learning curve for shared decision making: Interviews with clinicians using the knee osteoarthritis Option Grid.

    PubMed

    Elwyn, Glyn; Rasmussen, Julie; Kinsey, Katharine; Firth, Jill; Marrin, Katy; Edwards, Adrian; Wood, Fiona

    2018-02-01

    Tools used in clinical encounters to illustrate to patients the risks and benefits of treatment options have been shown to increase shared decision making. However, we do not have good information about how these tools are viewed by clinicians and how clinicians think patients would react to their use. Our aim was to examine clinicians' views about the possible and actual use of tools designed to support patients and clinicians to collaborate and deliberate about treatment options, namely, Option Grid decision aids. We conducted a thematic analysis of qualitative interviews embedded in the intervention phase of a trial of an Option Grid decision aid for osteoarthritis of the knee. Interviews were conducted with 6 participating clinicians before they used the tool and again after clinicians had used the tool with 6 patients. In the first interview, clinicians voiced concerns that the tool would lead to an increase in encounter duration, patient resistance regarding involvement in decision making, and potential information overload. At the second interview, after minimal training, the clinicians reported that the tool had changed their usual way of communicating, and it was generally acceptable and helpful to integrate it into practice. After experiencing the use of Option Grids, clinicians became more willing to use the tools in their clinical encounters with patients. How best to introduce Option Grids to clinicians and adopt their use into practice will need careful consideration of context, workflow, and clinical pathways. © 2016 John Wiley & Sons, Ltd.

  4. An open source tool for automatic spatiotemporal assessment of calcium transients and local ‘signal-close-to-noise’ activity in calcium imaging data

    PubMed Central

    Martin, Corinna; Jablonka, Sibylle

    2018-01-01

    Local and spontaneous calcium signals play important roles in neurons and neuronal networks. Spontaneous or cell-autonomous calcium signals may be difficult to assess because they appear in an unpredictable spatiotemporal pattern and in very small neuronal loci of axons or dendrites. We developed an open-source bioinformatics tool for an unbiased assessment of calcium signals in x,y-t imaging series. The tool bases its algorithm on continuous wavelet transform-guided peak detection to identify calcium signal candidates. The highly sensitive calcium event definition is based on the identification of peaks in 1D data through analysis of a 2D wavelet transform surface. For spatial analysis, the tool uses a grid to separate the x,y-image field into independently analyzed grid windows. A document containing a graphical summary of the data is automatically created and displays the loci of activity for a wide range of signal intensities. Furthermore, the number of activity events is summed to create an estimated total activity value, which can be used to compare different experimental situations, such as calcium activity before and after an experimental treatment. All traces and data of active loci are documented. The tool can also compute the signal variance in a sliding window to visualize activity-dependent signal fluctuations. We applied the calcium signal detector to monitor activity states of cultured mouse neurons. Our data show that both the total activity value and the variance area created by a sliding window can distinguish experimental manipulations of neuronal activity states. Notably, the tool is powerful enough to compute local calcium events and ‘signal-close-to-noise’ activity in small loci of distal neurites of neurons, which persist during pharmacological blockade of neuronal activity with inhibitors such as tetrodotoxin (to block action potential firing) or inhibitors of ionotropic glutamate receptors. The tool can also offer information about local homeostatic calcium activity events in neurites. PMID:29601577
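
    Wavelet-guided peak detection of this kind is available off the shelf in SciPy, and the grid-window separation is a simple loop over image tiles. The window size and wavelet widths below are placeholders, not the published settings.

    ```python
    import numpy as np
    from scipy.signal import find_peaks_cwt

    def events_per_window(stack, win=16, widths=np.arange(2, 20)):
        """Count calcium-event candidates per grid window of a (T, H, W)
        imaging series: each window's mean-intensity trace is scanned
        with continuous wavelet transform-guided peak detection.
        Placeholder parameters, not the published tool's settings."""
        T, H, W = stack.shape
        counts = {}
        for r in range(0, H - win + 1, win):
            for c in range(0, W - win + 1, win):
                trace = stack[:, r:r + win, c:c + win].mean(axis=(1, 2))
                counts[(r, c)] = len(find_peaks_cwt(trace, widths))
        return counts
    ```

    Summing the per-window counts gives the kind of total-activity estimate the abstract uses to compare experimental conditions.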

  5. Overview of Transonic to Hypersonic Stage Separation Tool Development for Multi-Stage-to-Orbit Concepts

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Buning, Pieter G.; Pamadi, Bandu N.; Scallion, William I.; Jones, Kenneth M.

    2004-01-01

    An overview of research efforts at NASA in support of the stage separation and ascent aerothermodynamics research program is presented. The objective of this work is to develop a synergistic suite of experimental, computational, and engineering tools and methods to apply to vehicle separation across the transonic to hypersonic speed regimes. Proximity testing of a generic bimese wing-body configuration is ongoing in the transonic (Mach numbers 0.6, 1.05, and 1.1), supersonic (Mach numbers 2.3, 3.0, and 4.5), and hypersonic (Mach numbers 6 and 10) speed regimes in four wind tunnel facilities at the NASA Langley Research Center. An overset-grid Navier-Stokes flow solver has been enhanced and demonstrated on a matrix of proximity cases and on a dynamic separation simulation of the bimese configuration. Steady-state predictions with this solver were in excellent agreement with wind tunnel data at Mach 3, as were predictions via a Cartesian-grid Euler solver. Experimental and computational data have been used to evaluate multi-body enhancements to the widely used Aerodynamic Preliminary Analysis System, an engineering methodology, and to develop a new software package, SepSim, for the simulation and visualization of vehicle motions in a stage separation scenario. Web-based software will be used for archiving information generated from this research program into a database accessible to the user community. Thus, a framework has been established to study stage separation problems using coordinated experimental, computational, and engineering tools.

  6. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means by which a geometry definition based either on a discretized curve set or on a rectangular set can be quickly processed toward the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible by implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user-friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  7. Development of a large scale Chimera grid system for the Space Shuttle Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Pearce, Daniel G.; Stanley, Scott A.; Martin, Fred W., Jr.; Gomez, Ray J.; Le Beau, Gerald J.; Buning, Pieter G.; Chan, William M.; Chiu, Ing-Tsau; Wulf, Armin; Akdag, Vedat

    1993-01-01

    The application of CFD techniques to large problems has dictated the need for large team efforts. This paper offers an opportunity to examine the motivations, goals, needs, and problems, as well as the methods, tools, and constraints, that defined NASA's development of a 111-grid, 16-million-point grid system model for the Space Shuttle Launch Vehicle. The Chimera approach used for domain decomposition encouraged separation of the complex geometry into several major components, each of which was modeled by an autonomous team. ICEM-CFD, a CAD-based grid generation package, simplified the geometry and grid topology definition by providing mature CAD tools and patch-independent meshing. The resulting grid system has, on average, a four-inch resolution along the surface.

  8. Using a Community-Engaged Research (CEnR) approach to develop and pilot a photo grid method to gain insights into early child health and development in a socio-economic disadvantaged community.

    PubMed

    Lowrie, Emma; Tyrrell-Smith, Rachel

    2017-01-01

    This paper reports on the use of a Community-Engaged Research (CEnR) approach to develop a new research tool to involve members of the community in thinking about priorities for early child health and development in a deprived area of the UK. The CEnR approach involves researchers, professionals, and members of the public working together during all stages of research and development. Researchers used a phased approach to the development of a Photo Grid tool, including reviewing tools which could be used for community engagement and testing the new tool based on feedback from workshops with local early-years professionals and parents of young children. The Photo Grid tool is a flat square grid on which photo cards can be placed. Participants were asked to place at the top of the grid the photos they considered most important for early child health and development, working down to the less important ones at the bottom. The findings showed that the resulting Photo Grid tool was a useful and successful method of engaging with the local community, evidenced by the high numbers of participants who completed a pilot study and who provided feedback on the method. By involving community members throughout the research process, it was possible to develop a method that would be acceptable to the local population, thus decreasing the likelihood of a lack of engagement. The success of the tool is therefore particularly encouraging as it engages "seldom heard voices," such as those with low literacy. The aim of this research was to consult with professionals and parents to develop a new research toolkit (Photo Grid) to understand community assets and priorities in relation to early child health and development in Blackpool, a socio-economically disadvantaged community. A Community-Engaged Research (CEnR) approach was used to consult with community members, and this paper describes the process of using that approach in developing the Photo Grid toolkit. A phased CEnR approach was used to design, test, and pilot the tool. Members of the Blackpool community (parents with children aged 0-4 years, health professionals, members of the early-years workforce, and community development workers) were involved in the development of the research tool at various stages; they were recruited opportunistically via a venue-based time-space sampling method. In total, 213 parents and 18 professionals engaged in the research process. Using a CEnR approach allowed effective engagement with the local community and professionals, evidenced by high levels of engagement throughout the development process. This approach improved the acceptability and usability of the resulting Photo Grid toolkit, which community members found accessible, engaging, useful, and thought provoking in an area of high social deprivation, complex problems, and low literacy. The Photo Grid is an adaptable tool which can be used in other areas of socio-economic disadvantage to engage with the community to understand a wide variety of complex topics.

  9. Electromagnetic sensing for deterministic finishing gridded domes

    NASA Astrophysics Data System (ADS)

    Galbraith, Stephen L.

    2013-06-01

    Electromagnetic sensing is a promising technology for precisely locating conductive grid structures that are buried in optical ceramic domes. Burying grid structures directly in the ceramic makes gridded dome construction easier, but a practical sensing technology is required to locate the grid relative to the dome surfaces. This paper presents a novel approach being developed for locating mesh grids that are physically thin (on the order of a mil), curved, and 75% to 90% open space. Non-contact location sensing takes place over a distance of 1/2 inch. A non-contact approach was required because the presence of the ceramic material precludes touching the grid with a measurement tool. Furthermore, the ceramic, which may be opaque or transparent, is invisible to the sensing technology, which is advantageous for calibration. The paper first details the physical principles being exploited. Next, sensor impedance response is discussed for thin, open-mesh grids versus thick, solid metal conductors. Finally, the technology approach is incorporated into a practical field tool for use in inspecting gridded domes.

  10. RF Models for Plasma-Surface Interactions in VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables DC and RF sheath effects to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.

  11. PBEQ-Solver for online visualization of electrostatic potential of biomolecules.

    PubMed

    Jo, Sunhwan; Vargyas, Miklos; Vasko-Szedlar, Judit; Roux, Benoît; Im, Wonpil

    2008-07-01

    PBEQ-Solver provides a web-based graphical user interface to read biomolecular structures, solve the Poisson-Boltzmann (PB) equations and interactively visualize the electrostatic potential. PBEQ-Solver calculates (i) electrostatic potential and solvation free energy, (ii) protein-protein (DNA or RNA) electrostatic interaction energy and (iii) pKa of a selected titratable residue. All the calculations can be performed in both aqueous solvent and membrane environments (with a cylindrical pore in the case of membrane). PBEQ-Solver uses the PBEQ module in the biomolecular simulation program CHARMM to solve the finite-difference PB equation of molecules specified by users. Users can interactively inspect the calculated electrostatic potential on the solvent-accessible surface as well as iso-electrostatic potential contours using a novel online visualization tool based on MarvinSpace molecular visualization software, a Java applet integrated within CHARMM-GUI (http://www.charmm-gui.org). To reduce the computational time on the server, and to increase the efficiency in visualization, all the PB calculations are performed with coarse grid spacing (1.5 Å before and 1 Å after focusing). PBEQ-Solver suggests various physical parameters for PB calculations and users can modify them if necessary. PBEQ-Solver is available at http://www.charmm-gui.org/input/pbeqsolver.
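
    The finite-difference machinery behind such solvers can be illustrated with the simplest case: Jacobi iteration for the Poisson equation on a uniform grid (the linearized PB equation adds dielectric maps and an ionic screening term to the same stencil, and focusing reruns the solve on a finer grid). A toy sketch, not the CHARMM PBEQ module:

    ```python
    import numpy as np

    def poisson_jacobi(rho, h, n_iter=500):
        """Solve nabla^2 phi = -rho on a uniform 3D grid with spacing h
        by Jacobi iteration, with phi = 0 on the boundary. Toy
        illustration of the finite-difference approach only."""
        phi = np.zeros_like(rho)
        core = (slice(1, -1),) * 3
        for _ in range(n_iter):
            # Average of the six neighbors plus the source term.
            phi[core] = (phi[2:, 1:-1, 1:-1] + phi[:-2, 1:-1, 1:-1] +
                         phi[1:-1, 2:, 1:-1] + phi[1:-1, :-2, 1:-1] +
                         phi[1:-1, 1:-1, 2:] + phi[1:-1, 1:-1, :-2] +
                         h * h * rho[core]) / 6.0
        return phi
    ```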

  12. Advances in Visualization of 3D Time-Dependent CFD Solutions

    NASA Technical Reports Server (NTRS)

    Lane, David A.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Numerical simulations of complex 3D time-dependent (unsteady) flows are becoming increasingly feasible because of the progress in computing systems. Unfortunately, many existing flow visualization systems were developed for time-independent (steady) solutions and do not adequately depict solutions from unsteady flow simulations. Furthermore, most systems only handle one time step of the solutions individually and do not consider the time-dependent nature of the solutions. For example, instantaneous streamlines are computed by tracking the particles using one time step of the solution. However, for streaklines and timelines, particles need to be tracked through all time steps. Streaklines can reveal quite different information about the flow than those revealed by instantaneous streamlines. Comparisons of instantaneous streamlines with dynamic streaklines are shown. For a complex 3D flow simulation, it is common to generate a grid system with several millions of grid points and to have tens of thousands of time steps. The disk requirement for storing the flow data can easily be tens of gigabytes. Visualizing solutions of this magnitude is a challenging problem with today's computer hardware technology. Even interactive visualization of one time step of the flow data can be a problem for some existing flow visualization systems because of the size of the grid. Current approaches for visualizing complex 3D time-dependent CFD solutions are described. The flow visualization system developed at NASA Ames Research Center to compute time-dependent particle traces from unsteady CFD solutions is described. The system computes particle traces (streaklines) by integrating through the time steps. This system has been used by several NASA scientists to visualize their CFD time-dependent solutions. The flow visualization capabilities of this system are described, and visualization results are shown.
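
    The streamline/streakline distinction is easy to state in code: an instantaneous streamline integrates a single frozen velocity field, whereas a streakline continuously releases particles at a fixed seed point and advects all of them through every time step. A minimal streakline sketch, with `v(x, t)` an assumed interface to the time-dependent solution:

    ```python
    import numpy as np

    def streakline(v, seed, t0, t1, dt):
        """Compute a streakline: at each step a new particle is released
        at `seed` and every live particle is advected with the unsteady
        velocity field v(x, t) using a midpoint (RK2) step. The returned
        positions, in release order, trace the streakline at time t1."""
        particles, t = [], t0
        while t < t1:
            particles.append(np.array(seed, dtype=float))
            for p in particles:
                k = v(p, t)                       # predictor velocity
                p += dt * v(p + 0.5 * dt * k, t + 0.5 * dt)
            t += dt
        return particles
    ```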

  13. Bringing "Scientific Expeditions" Into the Schools

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as simulations or measurements of fluid dynamics). The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics (CFD) and wind tunnel testing. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: 1. The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. 2. The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). 3. A rich variety of guided expeditions through the data can be included easily. 4. A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. 5. The scenes can be viewed in 3D using stereo vision. 6. The network bandwidth used for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec whereas the measured bandwidth for a small video picture was 500 Kbits/sec.)

  14. Visual supports for shared reading with young children: the effect of static overlay design.

    PubMed

    Wood Jackson, Carla; Wahlquist, Jordan; Marquis, Cassandra

    2011-06-01

    This study examined the effects of two types of static overlay design (visual scene display and grid display) on 39 children's use of a speech-generating device during shared storybook reading with an adult. This pilot project included two groups: preschool children with typical communication skills (n = 26) and with complex communication needs (n = 13). All participants engaged in shared reading with two books using each visual layout on a speech-generating device (SGD). The children averaged a greater number of activations when presented with a grid display during introductory exploration and free play. There was a large effect of the static overlay design on the number of silent hits, evidencing more silent hits with visual scene displays. On average, the children demonstrated relatively few spontaneous activations of the speech-generating device while the adult was reading, regardless of overlay design. When responding to questions, children with communication needs appeared to perform better when using visual scene displays, but the effect of display condition on the accuracy of responses to wh-questions was not statistically significant. In response to an open ended question, children with communication disorders demonstrated more frequent activations of the SGD using a grid display than a visual scene. Suggestions for future research as well as potential implications for designing AAC systems for shared reading with young children are discussed.

  15. Patients' views on the use of an Option Grid for knee osteoarthritis in physiotherapy clinical encounters: An interview study.

    PubMed

    Kinsey, Katharine; Firth, Jill; Elwyn, Glyn; Edwards, Adrian; Brain, Katherine; Marrin, Katy; Nye, Alan; Wood, Fiona

    2017-12-01

    Patient decision support tools have been developed as a means of providing accurate and accessible information in order for patients to make informed decisions about their care. Option Grids™ are a type of decision support tool specifically designed to be used during clinical encounters. To explore patients' views of the Option Grid encounter tool used in clinical consultations with physiotherapists, in comparison with usual care, within a patient population who are likely to be disadvantaged by age and low health literacy. Semi-structured interviews with 72 patients (36 who had been given an Option Grid in their consultation and 36 who had not). Thematic analysis explored patients' understanding of treatment options, perceptions of involvement, and readability and utility of the Option Grid. Interviews suggested that the Option Grid facilitated more detailed discussion about the risks and benefits of a wider range of treatment options for osteoarthritis of the knee. Participants indicated that the Option Grid was clear and aided their understanding of a structured progression of the options as their condition advanced, although it was not clear whether the Option Grid facilitated greater engagement in shared decision making. The Option Grid for osteoarthritis of the knee was well received by patient participants who reported that it helped them to understand their options, and made the notion of choice explicit. Use of Option Grids should be considered within routine consultations. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  16. A grid layout algorithm for automatic drawing of biochemical networks.

    PubMed

    Li, Weijiang; Kurata, Hiroyuki

    2005-05-01

    Visualization is indispensable in the research of complex biochemical networks. Available graph layout algorithms are not adequate for drawing such networks satisfactorily. New methods are required to automatically visualize the topological architectures and facilitate the understanding of the functions of the networks. We propose a novel layout algorithm to draw complex biochemical networks. A network is modeled as a system of interacting nodes on squared grids. A discrete cost function between each node pair is designed based on the topological relation and the geometric positions of the two nodes. Layouts are produced by minimizing the total cost. We design a fast algorithm to minimize the discrete cost function, by which candidate layouts can be produced efficiently. A simulated annealing procedure is used to choose better candidates. Our algorithm demonstrates its ability to exhibit cluster structures clearly in relatively compact layout areas without any prior knowledge. We developed Windows software implementing the algorithm for CADLIVE. All materials can be freely downloaded from http://kurata21.bio.kyutech.ac.jp/grid/grid_layout.htm and http://www.cadlive.jp/.
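
    The minimization loop described above pairs a fast local step with simulated annealing over candidate layouts. A stripped-down sketch of the annealing half, where `cost(layout)` is a placeholder for the paper's discrete pairwise cost function:

    ```python
    import math
    import random

    def anneal_layout(layout, cost, n_steps=100_000, t0=1.0, cooling=0.9999):
        """Simulated annealing over grid layouts: swap two nodes' grid
        cells, keep the swap if it lowers the total cost, or with
        Boltzmann probability exp(-delta/T) otherwise. `cost` is a
        placeholder for the paper's discrete cost function."""
        layout = list(layout)                  # layout[i] = cell of node i
        cur = best = cost(layout)
        best_layout, temp = list(layout), t0
        for _ in range(n_steps):
            i, j = random.sample(range(len(layout)), 2)
            layout[i], layout[j] = layout[j], layout[i]
            new = cost(layout)
            if new < cur or random.random() < math.exp((cur - new) / temp):
                cur = new                      # accept the swap
                if new < best:
                    best, best_layout = new, list(layout)
            else:
                layout[i], layout[j] = layout[j], layout[i]  # undo the swap
            temp *= cooling                    # geometric cooling schedule
        return best_layout, best
    ```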

  17. Real-Time 3D Visualization

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Butler Hine, former director of the Intelligent Mechanism Group (IMG) at Ames Research Center, and five others partnered to start Fourth Planet, Inc., a visualization company that specializes in the intuitive visual representation of dynamic, real-time data over the Internet and Intranet. Over a five-year period, the then NASA researchers performed ten robotic field missions in harsh climes to mimic the end-to-end operations of automated vehicles trekking across another world under control from Earth. The core software technology for these missions was the Virtual Environment Vehicle Interface (VEVI). Fourth Planet has released VEVI4, the fourth generation of the VEVI software, and NetVision. VEVI4 is a cutting-edge computer graphics simulation and remote control applications tool. The NetVision package allows large companies to view and analyze in virtual 3D space such things as the health or performance of their computer network or locate a trouble spot on an electric power grid. Other products are forthcoming. Fourth Planet is currently part of the NASA/Ames Technology Commercialization Center, a business incubator for start-up companies.

  18. An interactive multi-block grid generation system

    NASA Technical Reports Server (NTRS)

    Kao, T. J.; Su, T. Y.; Appleby, Ruth

    1992-01-01

    A grid generation procedure combining interactive and batch grid generation programs was put together to generate multi-block grids for complex aircraft configurations. The interactive section provides the tools for 3D geometry manipulation, surface grid extraction, boundary domain construction for 3D volume grid generation, and block-block relationships and boundary conditions for flow solvers. The procedure improves the flexibility and quality of grid generation to meet the design/analysis requirements.

  19. Introducing FNCS: Framework for Network Co-Simulation

    ScienceCinema

    None

    2018-06-07

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  20. Introducing FNCS: Framework for Network Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-10-23

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  1. Kwf-Grid workflow management system for Earth science applications

    NASA Astrophysics Data System (ADS)

    Tran, V.; Hluchy, L.

    2009-04-01

    In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features, such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within the "Knowledge-based Workflow System for Grid Applications" project under the 6th Framework Programme. The workflow management system was intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge contained in the information by means of intelligent agents; and finally reuse the joined knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g., GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing it to manage and execute gLite jobs in the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite allows EGEE users to use the system and benefit from its advanced features. The system is primarily tested and evaluated with applications from the ES clusters.
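
    Middleware specifics aside (GT4 and gLite are not sketched here), the core of such a workflow engine is dependency-ordered job execution. A minimal Python sketch of that idea follows; the job names and the submit function are hypothetical stand-ins.

        from graphlib import TopologicalSorter  # Python 3.9+

        def submit(job):
            # Hypothetical stand-in for dispatching a job to Grid middleware
            # (e.g. a GRAM job or a gLite job in the real system).
            print(f"running {job}")

        # Workflow: each job maps to the jobs it depends on.
        workflow = {
            "extract": [],
            "model_run": ["extract"],
            "postprocess": ["model_run"],
            "visualize": ["postprocess"],
        }

        ts = TopologicalSorter(workflow)
        for job in ts.static_order():  # dependency-respecting execution order
            submit(job)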

  2. Parametric Geometry, Structured Grid Generation, and Initial Design Study for REST-Class Hypersonic Inlets

    NASA Technical Reports Server (NTRS)

    Ferlemann, Paul G.; Gollan, Rowan J.

    2010-01-01

    Computational design and analysis of three-dimensional hypersonic inlets with shape transition has been a significant challenge due to the complex geometry and grid required for three-dimensional viscous flow calculations. Currently, the design process utilizes an inviscid design tool to produce initial inlet shapes by streamline tracing through an axisymmetric compression field. However, the shape is defined by a large number of points rather than a continuous surface and lacks important features such as blunt leading edges. Therefore, a design system has been developed to parametrically construct true CAD geometry and link the topology of a structured grid to the geometry. The Adaptive Modeling Language (AML) constitutes the underlying framework that is used to build the geometry and grid topology. Parameterization of the CAD geometry allows the inlet shapes produced by the inviscid design tool to be generated, but also allows a great deal of flexibility to modify the shape to account for three-dimensional viscous effects. By linking the grid topology to the parametric geometry, the GridPro grid generation software can be used efficiently to produce a smooth hexahedral multiblock grid. To demonstrate the new capability, a matrix of inlets was designed by varying four geometry parameters in the inviscid design tool. The goals of the initial design study were to explore inviscid design tool geometry variations with a three-dimensional analysis approach, demonstrate a solution rate which would enable the use of high-fidelity viscous three-dimensional CFD in future design efforts, process the results for important performance parameters, and perform a sample optimization.

  3. GIS embedded hydrological modeling: the SID&GRID project

    NASA Astrophysics Data System (ADS)

    Borsi, I.; Rossetto, R.; Schifani, C.

    2012-04-01

    The SID&GRID research project, started April 2010 and funded by Regione Toscana (Italy) under the POR FSE 2007-2013, aims to develop a Decision Support System (DSS) for water resource management and planning based on open source and public domain solutions. In order to quantitatively assess water availability in space and time and to support the planning decision processes, the SID&GRID solution consists of hydrological models (coupling 3D existing and newly developed surface- and ground-water and unsaturated zone modeling codes) embedded in a GIS interface, applications and library, where all the input and output data are managed by means of a DataBase Management System (DBMS). A graphical user interface (GUI) to manage, analyze and run the SID&GRID hydrological models based on the open source gvSIG GIS framework (Asociación gvSIG, 2011) and a Spatial Data Infrastructure to share and interoperate with distributed geographical data is being developed. Such a GUI is conceived as a "master control panel" able to guide the user through pre-processing spatial and temporal data, running the hydrological models, and analyzing the outputs. To achieve the above-mentioned goals, the following codes have been selected and are being integrated: 1. PostgreSQL/PostGIS (PostGIS, 2011) for the geo database management system; 2. gvSIG with Sextante (Olaya, 2011) geo-algorithm library capabilities and GRASS tools (GRASS Development Team, 2011) for the desktop GIS; 3. GeoServer and GeoNetwork to share and discover spatial data on the web according to Open Geospatial Consortium standards; 4. new tools based on the Sextante GeoAlgorithm framework; 5. the MODFLOW-2005 (Harbaugh, 2005) groundwater modeling code; 6. MODFLOW-LGR (Mehl and Hill, 2005) for local grid refinement; 7. VSF (Thoms et al., 2006) for the variably saturated flow component; 8. newly developed routines for overland flow; 9. new algorithms in Jython integrated in gvSIG to compute the net rainfall rate reaching the soil surface, as input for the unsaturated/saturated flow model. At this stage of the research (which will end April 2013), two primary components of the master control panel are being developed: i. a SID&GRID toolbar integrated into the gvSIG map context; ii. a new set of Sextante geo-algorithms to pre- and post-process the spatial data and run the hydrological models. The groundwater part of the code has been fully integrated and tested, and 3D visualization tools are being developed. The LGR capability has been extended to the 3D solution of the Richards' equation in order to solve the unsaturated zone in detail where required. To stay updated about the project, please follow us at the website: http://ut11.isti.cnr.it/SIDGRID/
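
    Item 9 above mentions routines that compute the net rainfall rate reaching the soil surface. The project's implementation is in Jython inside gvSIG; purely as an illustration of the kind of cell-by-cell pre-processing involved, here is a minimal NumPy sketch, where the simple interception/evapotranspiration water balance is an assumption rather than the SID&GRID algorithm.

        import numpy as np

        def net_rainfall(precip, interception, pet):
            # Cell-by-cell net rainfall reaching the soil surface (mm/day).
            # precip, interception, pet: 2D arrays on the model grid.
            # Crude water balance (assumption): subtract canopy interception,
            # then let potential evapotranspiration consume what remains.
            after_canopy = np.maximum(precip - interception, 0.0)
            return np.maximum(after_canopy - pet, 0.0)

        # 3x3 toy grid (mm/day)
        p = np.array([[10.0, 8.0, 0.0], [5.0, 12.0, 3.0], [0.0, 2.0, 7.0]])
        i = np.full_like(p, 1.5)
        e = np.full_like(p, 2.0)
        print(net_rainfall(p, i, e))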

  4. Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline

    PubMed Central

    Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur

    2010-01-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408

  5. A Randomized Trial Comparing Intravitreal Triamcinolone Acetonide and Focal/Grid Photocoagulation for Diabetic Macular Edema

    PubMed Central

    2009-01-01

    Objective: To evaluate the efficacy and safety of 1 mg and 4 mg doses of preservative-free intravitreal triamcinolone in comparison with focal/grid photocoagulation for the treatment of diabetic macular edema (DME). Design: Multi-center randomized clinical trial. Participants: 840 study eyes of 693 subjects with DME involving the fovea and visual acuity 20/40 to 20/320. Methods: Eyes were randomized to focal/grid photocoagulation (N=330), 1 mg intravitreal triamcinolone (N=256), or 4 mg intravitreal triamcinolone (N=254). Retreatment was given for persistent or new edema at 4-month intervals. The primary outcome was evaluated at 2 years. Main Outcome Measures: Visual acuity measured with the Electronic Early Treatment Diabetic Retinopathy Study (E-ETDRS) method (primary), optical coherence tomography (OCT)-measured retinal thickness (secondary), and safety. Results: At 4 months, mean visual acuity was better in the 4 mg triamcinolone group than in either the laser group (P<0.001) or the 1 mg triamcinolone group (P=0.001). By 1 year, there were no significant differences among groups in mean visual acuity. At the 16-month visit and extending through the primary outcome visit at 2 years, mean visual acuity was better in the laser group than in the other two groups (at 2 years, P=0.02 comparing the laser and 1 mg groups, P=0.002 comparing the laser and 4 mg groups, and P=0.49 comparing the 1 mg and 4 mg groups). Treatment group differences in the visual acuity outcome could not be attributed solely to cataract formation. OCT results generally paralleled the visual acuity results. Intraocular pressure was increased from baseline by ≥10 mm Hg at any visit in 4%, 16%, and 33% of eyes in the three treatment groups, respectively, and cataract surgery was performed in 13%, 23%, and 51% of eyes in the three treatment groups, respectively. Conclusions: Over a 2-year period, focal/grid photocoagulation is more effective and has fewer side effects than 1 mg or 4 mg doses of preservative-free intravitreal triamcinolone for most patients with DME who have characteristics similar to the cohort in this clinical trial. The results of this study also support that focal/grid photocoagulation currently should be the benchmark against which other treatments are compared in clinical trials of DME. PMID:18662829

  6. A randomized trial comparing intravitreal triamcinolone acetonide and focal/grid photocoagulation for diabetic macular edema.

    PubMed

    2008-09-01

    To evaluate the efficacy and safety of 1-mg and 4-mg doses of preservative-free intravitreal triamcinolone in comparison with focal/grid photocoagulation for the treatment of diabetic macular edema (DME). Multicenter, randomized clinical trial. Eight hundred forty study eyes of 693 subjects with DME involving the fovea and with visual acuity of 20/40 to 20/320. Eyes were randomized to focal/grid photocoagulation (n = 330), 1 mg intravitreal triamcinolone (n = 256), or 4 mg intravitreal triamcinolone (n = 254). Retreatment was given for persistent or new edema at 4-month intervals. The primary outcome was evaluated at 2 years. Visual acuity measured with the electronic Early Treatment Diabetic Retinopathy Study method (primary), optical coherence tomography-measured retinal thickness (secondary), and safety. At 4 months, mean visual acuity was better in the 4-mg triamcinolone group than in either the laser group (P<0.001) or the 1-mg triamcinolone group (P = 0.001). By 1 year, there were no significant differences among groups in mean visual acuity. At the 16-month visit and extending through the primary outcome visit at 2 years, mean visual acuity was better in the laser group than in the other 2 groups (at 2 years, P = 0.02 comparing the laser and 1-mg groups, P = 0.002 comparing the laser and 4-mg groups, and P = 0.49 comparing the 1-mg and 4-mg groups). Treatment group differences in the visual acuity outcome could not be attributed solely to cataract formation. Optical coherence tomography results generally paralleled the visual acuity results. Intraocular pressure increased from baseline by 10 mmHg or more at any visit in 4%, 16%, and 33% of eyes in the 3 treatment groups, respectively, and cataract surgery was performed in 13%, 23%, and 51% of eyes in the 3 treatment groups, respectively. Over a 2-year period, focal/grid photocoagulation is more effective and has fewer side effects than 1-mg or 4-mg doses of preservative-free intravitreal triamcinolone for most patients with DME who have characteristics similar to the cohort in this clinical trial. The results of this study also support that focal/grid photocoagulation currently should be the benchmark against which other treatments are compared in clinical trials of DME.

  7. Novel grid-based optical Braille conversion: from scanning to wording

    NASA Astrophysics Data System (ADS)

    Yoosefi Babadi, Majid; Jafari, Shahram

    2011-12-01

    Grid-based optical Braille conversion (GOBCO) is explained in this article. The grid-fitting technique involves processing scanned images taken from old hard-copy Braille manuscripts, recognising and converting them into English ASCII text documents inside a computer. The resulting words are verified using the relevant dictionary to provide the final output. The algorithms employed in this article can be easily modified to be implemented on other visual pattern recognition systems and text extraction applications. This technique has several advantages, including simplicity of the algorithm, high speed of execution, the ability to help visually impaired and blind people to work with fax machines and the like, and the ability to help sighted people with no prior knowledge of Braille to understand hard-copy Braille manuscripts.
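
    The paper's grid-fitting pipeline is not reproduced here, but the decoding step that any such system ends with, mapping a 2x3 grid of detected dots to a character, can be sketched briefly in Python; the dot-pattern table (standard Braille letters a-j) and the intensity threshold are illustrative assumptions.

        # Map a 2x3 Braille cell (raised dots) to a character.
        # Dots are indexed in the standard Braille order:
        #   1 4
        #   2 5
        #   3 6
        BRAILLE = {
            (1,): "a", (1, 2): "b", (1, 4): "c", (1, 4, 5): "d", (1, 5): "e",
            (1, 2, 4): "f", (1, 2, 4, 5): "g", (1, 2, 5): "h",
            (2, 4): "i", (2, 4, 5): "j",
        }

        def decode_cell(dot_intensities, threshold=0.5):
            # dot_intensities: 6 values sampled at the grid positions of one cell.
            raised = tuple(i + 1 for i, v in enumerate(dot_intensities) if v > threshold)
            return BRAILLE.get(raised, "?")  # "?" for patterns outside the table

        print(decode_cell([0.9, 0.1, 0.0, 0.8, 0.7, 0.2]))  # dots 1,4,5 -> "d"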

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; Parashar, Manu; Lewis, Nancy Jo

    The Real Time System Operations (RTSO) 2006-2007 project focused on two parallel technical tasks: (1) Real-Time Applications of Phasors for Monitoring, Alarming and Control; and (2) Real-Time Voltage Security Assessment (RTVSA) Prototype Tool. The overall goal of the phasor applications project was to accelerate adoption and foster greater use of new, more accurate, time-synchronized phasor measurements by conducting research and prototyping applications on California ISO's phasor platform - Real-Time Dynamics Monitoring System (RTDMS) -- that provide previously unavailable information on the dynamic stability of the grid. Feasibility assessment studies were conducted on potential application of this technology for small-signal stability monitoring, validating/improving existing stability nomograms, conducting frequency response analysis, and obtaining real-time sensitivity information on key metrics to assess grid stress. Based on study findings, prototype applications for real-time visualization and alarming, small-signal stability monitoring, measurement based sensitivity analysis and frequency response assessment were developed, factory- and field-tested at the California ISO and at BPA. The goal of the RTVSA project was to provide California ISO with a prototype voltage security assessment tool that runs in real time within California ISO's new reliability and congestion management system. CERTS conducted a technical assessment of appropriate algorithms, developed a prototype incorporating state-of-art algorithms (such as the continuation power flow, direct method, boundary orbiting method, and hyperplanes) into a framework most suitable for an operations environment. Based on study findings, a functional specification was prepared, which the California ISO has since used to procure a production-quality tool that is now a part of a suite of advanced computational tools that is used by California ISO for reliability and congestion management.

  9. CASAS: Cancer Survival Analysis Suite, a web based application

    PubMed Central

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a Shiny R-based tool for interactive survival analysis and visualization of results. The tool provides a web-based one-stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user-specified variable(s) simultaneously, the results of which are displayed in a single table that includes log-rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946

  10. CASAS: Cancer Survival Analysis Suite, a web based application.

    PubMed

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a Shiny R-based tool for interactive survival analysis and visualization of results. The tool provides a web-based one-stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user-specified variable(s) simultaneously, the results of which are displayed in a single table that includes log-rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
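
    CASAS itself is an R/Shiny application. For readers who want a feel for the kind of computation it wraps, the sketch below performs a basic grouped Kaplan-Meier fit and log-rank test in Python with the lifelines package; this is an analogy on toy data, not CASAS code.

        import pandas as pd
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        # Toy survival data: time in months, event = 1 if the event was observed.
        df = pd.DataFrame({
            "time":  [6, 7, 9, 10, 13, 15, 19, 23, 30, 36],
            "event": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
            "group": ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
        })

        kmf = KaplanMeierFitter()
        for name, g in df.groupby("group"):
            kmf.fit(g["time"], event_observed=g["event"], label=name)
            print(kmf.survival_function_)  # CASAS shows interactive plots; tables here

        # Log-rank p-value, analogous to the p-values in the CASAS summary table.
        a, b = df[df.group == "A"], df[df.group == "B"]
        res = logrank_test(a["time"], b["time"], a["event"], b["event"])
        print("log-rank p =", res.p_value)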

  11. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.

  12. Recent Developments in OVERGRID, OVERFLOW-2 and Chimera Grid Tools Scripts

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    OVERGRID and OVERFLOW-2 feature easy to use multiple-body dynamics. The new features of OVERGRID include a preliminary chemistry interface, standard atmosphere and mass properties calculators, a simple unsteady solution viewer, and a debris tracking interface. Script library development in Chimera Grid Tools has applications in turbopump grid generation. This viewgraph presentation profiles multiple component dynamics, validation test cases for a sphere, cylinder, and oscillating airfoil, and debris analysis.

  13. Cartograms Facilitate Communication of Climate Change Risks and Responsibilities

    NASA Astrophysics Data System (ADS)

    Döll, Petra

    2017-12-01

    Communication of climate change (CC) risks is challenging, in particular if global-scale spatially resolved quantitative information is to be conveyed. Typically, visualization of CC risks, which arise from the combination of hazard, exposure and vulnerability, is confined to showing only the hazards in the form of global thematic maps. This paper explores the potential of contiguous value-by-area cartograms, that is, distorted density-equalizing maps, for improving communication of CC risks and the countries' differentiated responsibilities for CC. Two global-scale cartogram sets visualize, as an example, groundwater-related CC risks in 0.5° grid cells, another one the correlation of (cumulative) fossil-fuel carbon dioxide emissions with the countries' population and gross domestic product. Viewers of the latter set visually recognize the lack of global equity and that the countries' wealth has been built on harmful emissions. I recommend that CC risks be communicated by bivariate gridded cartograms showing the hazard in color and population, or a combination of population and a vulnerability indicator, by distortion of grid cells. Gridded cartograms are also appropriate for visualizing the availability of natural resources to humans. For communicating complex information, sets of cartograms should be carefully designed instead of presenting single cartograms. Inclusion of a conventionally distorted map enhances the viewers' capability to take up the information represented by distortion. Empirical studies about the capability of global cartograms to convey complex information and to trigger moral emotions should be conducted, with a special focus on risk communication.

  14. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of Geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications comes to solve important performance issues and can spread from task parallelism to data parallelism as well. Parallel and distributed architectures such as Grid, Cloud, Multicore, etc. seem to offer the necessary functionalities to solve important problems in the Earth Science domain: storing, distribution, management, processing and security of Geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the Geospatial data concerning this area. For achieving these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, Geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures based on application characteristics and user requirements through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of Geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current focus is to integrate the Cloud infrastructure into the proposed platform; Cloud computing is still a paradigm with critical problems to be solved despite great efforts and investments. Cloud computing comes as a new way of delivering resources while using a large set of old as well as new technologies and tools for providing the necessary functionalities. The main challenges in Cloud computing, most of them also identified in the Open Cloud Manifesto 2009, concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, etc. We propose a platform able to execute different Geospatial applications on different parallel and distributed architectures such as Grid, Cloud, Multicore, etc., with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, required performance, cost constraints, etc. Execution redirection to a selected architecture is realized through a specialized component and aims to offer a flexible way of achieving the best performance under the existing constraints.
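
    The "specialized component" that redirects execution to a chosen architecture can be pictured as a simple dispatcher. The toy Python sketch below is in that spirit; the selection heuristics and the AppProfile fields are invented for illustration and are not the enviroGRIDS platform's actual rules.

        from dataclasses import dataclass

        @dataclass
        class AppProfile:
            data_gb: float        # input data volume
            task_parallel: bool   # many independent tasks?
            deadline_hours: float

        def select_backend(app: AppProfile) -> str:
            # Invented heuristics standing in for the platform's selection logic.
            if app.task_parallel and app.data_gb > 100:
                return "grid"        # high-throughput, data staged on storage elements
            if app.deadline_hours < 1:
                return "multicore"   # lowest startup latency
            return "cloud"           # elastic capacity for everything else

        print(select_backend(AppProfile(data_gb=500, task_parallel=True, deadline_hours=24)))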

  15. Potential of knowledge discovery using workflows implemented in the C3Grid

    NASA Astrophysics Data System (ADS)

    Engel, Thomas; Fink, Andreas; Ulbrich, Uwe; Schartner, Thomas; Dobler, Andreas; Fritzsch, Bernadette; Hiller, Wolfgang; Bräuer, Benny

    2013-04-01

    With the increasing number of climate simulations, reanalyses and observations, new infrastructures to search and analyse distributed data are necessary. In recent years, the Grid architecture became an important technology to fulfill these demands. For the German project "Collaborative Climate Community Data and Processing Grid" (C3Grid), computer scientists and meteorologists developed a system that offers its users a web interface to search and download climate data and use implemented analysis tools (called workflows) to further investigate them. In this contribution, two workflows that are implemented in the C3Grid architecture are presented: the Cyclone Tracking (CT) and Stormtrack workflows. They serve as an example of how to perform numerous investigations of midlatitude winter storms on a large amount of analysis and climate model data without insight into the data source or program code and with a low-to-moderate understanding of the theoretical background. CT is based on the work of Murray and Simmonds (1991) to identify and track local minima in the mean sea level pressure (MSLP) field of the selected dataset. Adjustable thresholds for the curvature of the isobars as well as the minimum lifetime of a cyclone allow the distinction of weak subtropical heat low systems and stronger midlatitude cyclones, e.g. in the North Atlantic. The user gets the resulting track data, including statistics about the track density, average central pressure, average central curvature, cyclogenesis and cyclolysis, as well as pre-built visualizations of these results. Stormtrack calculates the 2.5-6 day bandpass-filtered standard deviation of the geopotential height on a selected pressure level. Although this workflow needs much less computational effort than CT, it shows structures that are in good agreement with the track density of the CT workflow, allowing investigation of the extent to which changes in the mid-level tropospheric storm track are reflected in the density and intensity of surface cyclones. A specific feature of C3Grid is the flexible Workflow Scheduling Service (WSS), which also allows for automated nightly analysis runs of CT, Stormtrack, etc. with different input parameter sets. The statistical results of these workflows can be accumulated afterwards by a scheduled final analysis step, thereby providing a tool for data-intensive analytics on the massive amounts of climate model data accessible through C3Grid. First tests with these automated analysis workflows show promising results in speeding up the investigation of high-volume modeling data. This example is relevant to the thorough analysis of future changes in storminess in Europe and is just one example of the potential of knowledge discovery using automated workflows implemented in the C3Grid architecture.
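
    The Stormtrack diagnostic, the 2.5-6 day bandpass-filtered standard deviation of geopotential height, is compact enough to sketch directly. Below is one plausible Python implementation with SciPy; the Butterworth filter design and its order are assumptions, not the C3Grid code.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def storm_track(z500, dt_hours=6.0, order=4):
            # Std. dev. of 2.5-6 day bandpass-filtered geopotential height.
            # z500: array of shape (time, lat, lon), e.g. 500 hPa heights.
            fs = 24.0 / dt_hours                     # samples per day
            low, high = 1.0 / 6.0, 1.0 / 2.5         # cycles per day
            b, a = butter(order, [low, high], btype="band", fs=fs)
            filtered = filtfilt(b, a, z500, axis=0)  # zero-phase filtering in time
            return filtered.std(axis=0)              # (lat, lon) storm-track map

        # Synthetic demo: 90 days of 6-hourly data on a 10x20 grid.
        t = np.arange(360) * 0.25                    # time in days
        z = 50.0 * np.sin(2 * np.pi * t / 4.0)[:, None, None]  # 4-day wave
        z = z + np.random.randn(360, 10, 20)         # background noise
        print(storm_track(z).mean())  # ~35, since the 4-day wave lies in the band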

  16. Chimera grids in the simulation of three-dimensional flowfields in turbine-blade-coolant passages

    NASA Technical Reports Server (NTRS)

    Stephens, M. A.; Rimlinger, M. J.; Shih, T. I.-P.; Civinskas, K. C.

    1993-01-01

    When computing flows inside geometrically complex turbine-blade coolant passages, the structure of the grid system used can affect significantly the overall time and cost required to obtain solutions. This paper addresses this issue while evaluating and developing computational tools for the design and analysis of coolant-passages, and is divided into two parts. In the first part, the various types of structured and unstructured grids are compared in relation to their ability to provide solutions in a timely and cost-effective manner. This comparison shows that the overlapping structured grids, known as Chimera grids, can rival and in some instances exceed the cost-effectiveness of unstructured grids in terms of both the man hours needed to generate grids and the amount of computer memory and CPU time needed to obtain solutions. In the second part, a computational tool utilizing Chimera grids was used to compute the flow and heat transfer in two different turbine-blade coolant passages that contain baffles and numerous pin fins. These computations showed the versatility and flexibility offered by Chimera grids.

  17. Residential scene classification for gridded population sampling in developing countries using deep convolutional neural networks on satellite imagery.

    PubMed

    Chew, Robert F; Amer, Safaa; Jones, Kasey; Unangst, Jennifer; Cajka, James; Allpress, Justine; Bruhn, Mark

    2018-05-09

    Conducting surveys in low- and middle-income countries is often challenging because many areas lack a complete sampling frame, have outdated census information, or have limited data available for designing and selecting a representative sample. Geosampling is a probability-based, gridded population sampling method that addresses some of these issues by using geographic information system (GIS) tools to create logistically manageable area units for sampling. GIS grid cells are overlaid to partition a country's existing administrative boundaries into area units that vary in size from 50 m × 50 m to 150 m × 150 m. To avoid sending interviewers to unoccupied areas, researchers manually classify grid cells as "residential" or "nonresidential" through visual inspection of aerial images. "Nonresidential" units are then excluded from sampling and data collection. This process of manually classifying sampling units has drawbacks since it is labor-intensive, prone to human error, and creates the need for simplifying assumptions during calculation of design-based sampling weights. In this paper, we discuss the development of a deep learning classification model to predict whether aerial images are residential or nonresidential, thus reducing manual labor and eliminating the need for simplifying assumptions. On our test sets, the model performs comparably to a human-level baseline in both Nigeria (94.5% accuracy) and Guatemala (96.4% accuracy), and outperforms baseline machine learning models trained on crowdsourced or remote-sensed geospatial features. Additionally, our findings suggest that this approach can work well in new areas with relatively modest amounts of training data. Gridded population sampling methods like geosampling are becoming increasingly popular in countries with outdated or inaccurate census data because of their timeliness, flexibility, and cost. Using deep learning models directly on satellite images, we provide a novel method for sample frame construction that identifies residential gridded aerial units. In cases where manual classification of satellite images is used to (1) correct for errors in gridded population data sets or (2) classify grids where population estimates are unavailable, this methodology can help reduce annotation burden with comparable quality to human analysts.
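
    The published model is a deep convolutional network trained on aerial tiles. A deliberately small PyTorch sketch of the same binary residential/nonresidential setup is given below; the architecture, tile size, and training loop are illustrative, not the authors' model.

        import torch
        import torch.nn as nn

        class TileClassifier(nn.Module):
            # Tiny CNN for binary residential/nonresidential tile labels.
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(32, 1)  # one logit: P(residential)

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        model = TileClassifier()
        loss_fn = nn.BCEWithLogitsLoss()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)

        # One fake training step on random 64x64 RGB tiles.
        x = torch.randn(8, 3, 64, 64)
        y = torch.randint(0, 2, (8, 1)).float()
        loss = loss_fn(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
        print(float(loss))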

  18. Partners | Energy Systems Integration Facility | NREL

    Science.gov Websites

  19. Cost benefit analysis for smart grid projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; He, Gang; Mauzey, J

    The U.S. is unusual in that a definition of the term “smart grid” was written into legislation, appearing in the Energy Independence and Security Act (2007). When the recession called for stimulus spending and the American Recovery and Reinvestment Act (ARRA, 2009) was passed, a framework already existed for identification of smart grid projects. About $4.5B of the U.S. Department of Energy’s (U.S. DOE’s) $37B allocation from ARRA was directed to smart grid projects of two types, investment grants and demonstrations. Matching funds from other sources more than doubled the total value of ARRA-funded smart grid projects. The Smart Grid Investment Grant Program (SGIG) consumed all but $620M of the ARRA funds, which was available for the 32 projects in the Smart Grid Demonstration Program (SGDP, or demonstrations). Given the economic potential of these projects and the substantial investments required, there was keen interest in estimating the benefits of the projects (i.e., quantifying and monetizing the performance of smart grid technologies). Common method development and application, data collection, and analysis to calculate and publicize the benefits were central objectives of the program. For this purpose standard methods and a software tool, the Smart Grid Computational Tool (SGCT), were developed by U.S. DOE and a spreadsheet model was made freely available to grantees and other analysts. The methodology was intended to define smart grid technologies or assets, the mechanisms by which they generate functions, their impacts and, ultimately, their benefits. The SGCT and its application to the Demonstration Projects are described, and actual projects in Southern California and in China are selected to test and illustrate the tool. The usefulness of the methodology and tool for international analyses is then assessed.
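
    The SGCT methodology chains assets to functions, impacts, and finally monetized benefits discounted over the project life. The arithmetic of that last step is sketched below in Python; the benefit stream, discount rate, and capital cost are hypothetical numbers, and this is not the SGCT itself.

        def npv(cash_flows, rate=0.07):
            # Net present value of yearly benefits (year 0 = first entry).
            return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

        # Hypothetical annual benefits ($M) from a smart grid project:
        # reduced outage costs, deferred capacity, avoided meter reading.
        annual_benefits = [0.0, 2.5, 3.0, 3.2, 3.2, 3.2]
        capital_cost = 10.0

        print("NPV of benefits:", round(npv(annual_benefits), 2))
        print("Benefit-cost ratio:", round(npv(annual_benefits) / capital_cost, 2))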

  20. Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sulakhe, D.; Rodriguez, A.; Wilde, M.

    2008-03-01

    Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of problems involved or the lessons learned in using individual Grid resources as it has already been published in our paper on genome analysis research environment (GNARE) and will focus primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.

  1. Accessing and visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, G. Bruce; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia; et al.

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids.

  2. Runway Texture and Grid Pattern Effects on Rate-of-Descent Perception

    NASA Technical Reports Server (NTRS)

    Schroeder, J. A.; Dearing, M. G.; Sweet, B. T.; Kaiser, M. K.; Rutkowski, Mike (Technical Monitor)

    2001-01-01

    To date, perceptual errors occur in determining descent rate from a computer-generated image in flight simulation. Pilots tend to touch down twice as hard in simulation as in flight, and more training time is needed in simulation before reaching steady-state performance. Barnes suggested that recognition of range may be the culprit, and he cited that problems such as collimated objects, binocular vision, and poor resolution lead to poor estimation of the velocity vector. Brown's study essentially ruled out the lack of binocular vision as the problem. Dorfel added specificity to the problem by showing that pilots underestimated range in simulated scenes by 50% when 800 ft from the runway threshold. Palmer and Petitt showed that pilots are able to distinguish between a 1.7 ft/sec and a 2.9 ft/sec sink rate when passively observing sink rates in a night scene. Platform motion also plays a role, as previous research has shown that the addition of substantial platform motion improves pilot estimates of vertical velocity and results in simulated touchdown rates more closely resembling flight. This experiment examined how some specific variations in the visual scene properties affect a pilot's perception of sink rate. It extended another experiment that focused on the visual and motion cues necessary for helicopter autorotations. In that experiment, pilots performed steep approaches to a runway. The visual content of the runway and its surroundings varied in two ways: texture and rectangular grid spacing. Four textures, including a no-texture case, were evaluated. Three grid spacings, including a no-grid case, were evaluated. The results showed that pilots better controlled their vertical descent rates when good texture cues were present. No significant differences were found for the grid manipulation. Using those visual scenes, a simple psychophysics experiment was performed. The purpose was to determine whether the variations in the visual scenes allowed pilots to better perceive vertical velocity. To determine that answer, pilots passively viewed a particular visual scene in which the vehicle was descending at two different rates. Pilots had to select which of the two rates they thought was the faster rate. The difference between the two rates changed using a staircase method, depending on whether or not the pilot was correct, until a minimum threshold between the two descent rates was reached. This process was repeated for all of the visual scenes to determine whether the visual scenes allowed pilots to perceive vertical velocity better among them. All of the data have yet to be analyzed; however, neither the effects of grid nor texture revealed any statistically significant trends. On further examination of the staircase method employed, a possibility exists that the lack of an evident trend may be due to the exit criterion used during the study. As such, the experiment will be repeated with an improved exit criterion in February. Results of this study will be presented in the submitted paper.
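
    The staircase method mentioned above is a standard adaptive procedure in psychophysics. As a hedged illustration, the following Python sketch implements one common variant, a 1-up/2-down rule converging near the 70.7%-correct point; the study's exact rule and exit criterion are not specified here, so all parameters are assumptions.

        import random

        def simulated_pilot(delta, jnd=0.6):
            # Probability of a correct response grows with the rate difference.
            return random.random() < 0.5 + 0.5 * min(delta / (2 * jnd), 1.0)

        def staircase(start=2.0, step=0.2, floor=0.05, reversals_needed=8):
            # 1-up/2-down staircase on the descent-rate difference (ft/sec).
            delta, correct_streak, direction, reversals = start, 0, -1, []
            while len(reversals) < reversals_needed:
                if simulated_pilot(delta):
                    correct_streak += 1
                    if correct_streak == 2:          # two correct -> harder trial
                        correct_streak = 0
                        if direction == +1:
                            reversals.append(delta)  # direction flipped: a reversal
                        direction = -1
                        delta = max(delta - step, floor)
                else:                                # one wrong -> easier trial
                    correct_streak = 0
                    if direction == -1:
                        reversals.append(delta)
                    direction = +1
                    delta += step
            return sum(reversals) / len(reversals)   # threshold estimate

        print("estimated threshold:", round(staircase(), 2), "ft/sec")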

  3. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  4. Applicability of computer-aided comprehensive tool (LINDA: LINeament Detection and Analysis) and shaded digital elevation model for characterizing and interpreting morphotectonic features from lineaments

    NASA Astrophysics Data System (ADS)

    Masoud, Alaa; Koike, Katsuaki

    2017-09-01

    Detection and analysis of linear features related to surface and subsurface structures have been deemed necessary in natural resource exploration and earth surface instability assessment. Subjectivity in choosing control parameters required in conventional methods of lineament detection may cause unreliable results. To reduce this ambiguity, we developed LINDA (LINeament Detection and Analysis), an integrated tool with graphical user interface in Visual Basic. This tool automates processes of detection and analysis of linear features from grid data of topography (digital elevation model; DEM), gravity and magnetic surfaces, as well as data from remote sensing imagery. A simple interface with five display windows forms a user-friendly interactive environment. The interface facilitates grid data shading, detection and grouping of segments, lineament analyses for calculating strike and dip and estimating fault type, and interactive viewing of lineament geometry. Density maps of the center and intersection points of linear features (segments and lineaments) are also included. A systematic analysis of test DEMs and Landsat 7 ETM+ imagery datasets in the North and South Eastern Deserts of Egypt is implemented to demonstrate the capability of LINDA and correct use of its functions. Linear features from the DEM are superior to those from the imagery in terms of frequency, but both linear features agree with location and direction of V-shaped valleys and dykes and reference fault data. Through the case studies, LINDA applicability is demonstrated to highlight dominant structural trends, which can aid understanding of geodynamic frameworks in any region.
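
    LINDA itself is a compiled Visual Basic tool. To give a concrete sense of the segment-detection step it automates, here is a minimal Python alternative using scikit-image's Canny edge detector and probabilistic Hough transform on a synthetic DEM; this is a stand-in, not the LINDA algorithm.

        import numpy as np
        from skimage.feature import canny
        from skimage.transform import probabilistic_hough_line

        # Synthetic "DEM": a smooth surface cut by a linear, fault-like scarp.
        x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
        dem = np.sin(4 * x) + np.cos(3 * y)
        dem[y > x + 0.1] -= 1.0                      # sharp offset along a line

        edges = canny(dem, sigma=2.0)                # edge pixels on the grid
        segments = probabilistic_hough_line(edges, threshold=10,
                                            line_length=40, line_gap=5)

        print(len(segments), "segments detected")
        for (x0, y0), (x1, y1) in segments[:5]:
            strike = np.degrees(np.arctan2(y1 - y0, x1 - x0)) % 180
            print(f"segment ({x0},{y0})-({x1},{y1}), strike ~ {strike:.0f} deg")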

  5. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
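
    A small taste of the workflow described above, remote standards-based access to CF-compliant model output, can be given with xarray; in the sketch below the OPeNDAP URL and the variable and coordinate names are placeholders, not a live service.

        import xarray as xr

        # Hypothetical OPeNDAP endpoint serving CF-1.6 compliant model output.
        url = "http://example.org/thredds/dodsC/ocean_model/aggregated"

        ds = xr.open_dataset(url)                        # lazy, remote access
        sst = ds["temp"].sel(depth=0, method="nearest")  # surface layer
        subset = sst.sel(time="2013-06-01",
                         lon=slice(-75, -60), lat=slice(35, 45))
        subset.plot()                                    # quick-look map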

  6. Spatial cell firing during virtual navigation of open arenas by head-restrained mice.

    PubMed

    Chen, Guifen; King, John Andrew; Lu, Yi; Cacucci, Francesca; Burgess, Neil

    2018-06-18

    We present a mouse virtual reality (VR) system which restrains head movements to horizontal rotations, compatible with multi-photon imaging. This system allows expression of the spatial navigation and neuronal firing patterns characteristic of real open arenas (R). Comparing VR to R: place and grid, but not head-direction, cell firing had broader spatial tuning; place, but not grid, cell firing was more directional; theta frequency increased less with running speed; whereas increases in firing rates with running speed and place and grid cells' theta phase precession were similar. These results suggest that the omni-directional place cell firing in R may require local cues unavailable in VR, and that the scale of grid and place cell firing patterns, and theta frequency, reflect translational motion inferred from both virtual (visual and proprioceptive) and real (vestibular translation and extra-maze) cues. By contrast, firing rates and theta phase precession appear to reflect visual and proprioceptive cues alone. © 2018, Chen et al.

  7. Batch mode grid generation: An endangered species

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    1992-01-01

    Non-interactive grid generation schemes should thrive as emphasis shifts from development of numerical analysis and design methods to application of these tools to real engineering problems. A strong case is presented for the continued development and application of non-interactive geometry modeling methods. Guidelines, strategies, and techniques for developing and implementing these tools are presented using current non-interactive grid generation methods as examples. These schemes play an important role in the development of multidisciplinary analysis methods and some of these applications are also discussed.

  8. Real-time performance monitoring and management system

    DOEpatents

    Budhraja, Vikram S [Los Angeles, CA; Dyer, James D [La Mirada, CA; Martinez Morales, Carlos A [Upland, CA

    2007-06-19

    A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a data base, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.

  9. Electricity Capacity Expansion Modeling, Analysis, and Visualization. A Summary of High-Renewable Modeling Experience for China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, Nate; Zhou, Ella; Getman, Dan

    2015-10-01

    Mathematical and computational models are widely used for the analysis and design of both physical and financial systems. Modeling the electric grid is of particular importance to China for three reasons. First, power-sector assets are expensive and long-lived, and they are critical to any country's development. China's electric load, transmission, and other energy-related infrastructure are expected to continue to grow rapidly; therefore it is crucial to understand and help plan for the future in which those assets will operate (NDRC ERI 2015). Second, China has dramatically increased its deployment of renewable energy (RE), and is likely to continue further accelerating such deployment over the coming decades. Careful planning and assessment of the various aspects (technical, economic, social, and political) of integrating a large amount of renewables on the grid is required. Third, companies need the tools to develop a strategy for their own involvement in the power market China is now developing, and to enable a possible transition to an efficient and high RE future.

  10. Integrated Visualization of Multi-sensor Ocean Data across the Web

    NASA Astrophysics Data System (ADS)

    Platt, F.; Thompson, C. K.; Roberts, J. T.; Tsontos, V. M.; Hin Lam, C.; Arms, S. C.; Quach, N.

    2017-12-01

    Whether for research or operational decision support, oceanographic applications rely on the visualization of multivariate in situ and remote sensing data as an integral part of analysis workflows. However, given their inherently 3D-spatial and temporally dynamic nature, the visual representation of marine in situ data in particular poses a challenge. The Oceanographic In situ data Interoperability Project (OIIP) is a collaborative project funded under the NASA/ACCESS program that seeks to leverage and enhance higher TRL (technology readiness level) informatics technologies to address key data interoperability and integration issues associated with in situ ocean data, including the dearth of effective web-based visualization solutions. Existing web tools for the visualization of key in situ data types - point, profile, trajectory series - are limited in their support for integrated, dynamic and coordinated views of the spatiotemporal characteristics of the data. Via the extension of the JPL Common Mapping Client (CMC) software framework, OIIP seeks to provide improved visualization support for oceanographic in situ data sets. More specifically, this entails improved representation of both horizontal and vertical aspects of these data, which inherently are depth resolved and time referenced, as well as the visual synchronization with relevant remotely-sensed gridded data products, such as sea surface temperature and salinity. Electronic tagging datasets, which are a focal use case for OIIP, provide a representative, if somewhat complex, visualization challenge in this regard. Critical to the achievement of these development objectives has been compilation of a well-rounded set of visualization use cases and requirements based on a series of end-user consultations aimed at understanding their satellite-in situ visualization needs. Here we summarize progress on aspects of the technical work and our approach.

  11. A System of Systems Approach to Integrating Global Sea Level Change Application Programs

    NASA Astrophysics Data System (ADS)

    Bambachus, M. J.; Foster, R. S.; Powell, C.; Cole, M.

    2005-12-01

    The global sea level change application community has numerous disparate models used to make predictions over various regional and temporal scales. These models have typically been focused on limited sets of data and optimized for specific areas or questions of interest. Increasingly, decision makers at the national, international, and local/regional levels require access to these application data models and want to be able to integrate large disparate data sets, with new ubiquitous sensor data, and use these data across models from multiple sources. These requirements will force the Global Sea Level Change application community to take a new system-of-systems approach to their programs. We present a new technical architecture approach to the global sea level change program that provides external access to the vast stores of global sea level change data, provides a collaboration forum for the discussion and visualization of data, and provides a simulation environment to evaluate decisions. This architectural approach will provide the tools to support multi-disciplinary decision making. A conceptual system of systems approach is needed to address questions around the multiple approaches to tracking and predicting Sea Level Change. A system-of-systems approach would include (1) a forum of data providers, modelers, and users, (2) a service-oriented architecture including interoperable web services with a backbone of Grid computing capability, and (3) discovery and access functionality to the information developed through this structure. Each of these three areas would be clearly designed to maximize communication, data use for decision making and flexibility and extensibility for evolution of technology and requirements. In contemplating a system-of-systems approach, it is important to highlight common understanding and coordination as foundational to success across the multiple systems. The workflow of science in different applications is often conceptually similar but different in the details. These differences can discourage the potential for collaboration. Resources that are not inherently shared (or do not spring from a common authority) must be explicitly coordinated to avoid disrupting the collaborative research workflow. This includes tools which make the interaction of systems (and users with systems, and administrators of systems) more conceptual and higher-level than is typically done today. Such tools all appear under the heading of Grid, within a larger idea of metacomputing. We present an approach for successful collaboration and shared use of distributed research resources. The real advances in research throughput that are occurring through the use of large computers are occurring less as a function of progress in a given discrete algorithm and much more as a function of model and data coupling. Complexity normally reduces the ability of the human mind to understand and work with this kind of coupling. Intuitive Grid-based computational resources simultaneously reduce the effect of this complexity on the scientist/decision maker, and increase the ability to rationalize complexity. Research progress can even be achieved before full understanding of complexity has been reached, by modeling and experimenting and providing more data to think about. Analytic engines provided via the Grid can help digest this data and make it tractable through visualization and exploration tools. 
We present a rationale for increasing research throughput by leveraging more complex model and data interaction.

  12. Hemodynamics model of fluid–solid interaction in internal carotid artery aneurysms

    PubMed Central

    Fu-Yu, Wang; Lei, Liu; Xiao-Jun, Zhang; Hai-Yue, Ju

    2010-01-01

    The objective of this study is to present a relatively simple method to reconstruct cerebral aneurysms as 3D numerical grids. The method accurately duplicates the geometry to provide computer simulations of the blood flow. Initial images were obtained by using CT angiography and 3D digital subtraction angiography in DICOM format. The images were processed using MIMICS software, and the 3D fluid model (blood flow) and 3D solid model (wall) were generated. The output was then exported to the ANSYS Workbench software to generate the volumetric mesh for further hemodynamic study. The fluid model was defined and simulated in CFX software while the solid model was calculated in ANSYS software. The force data calculated first in the CFX software were transferred to the ANSYS software, and after receiving the force data, total mesh displacement data were calculated in the ANSYS software. The mesh displacement data were then transferred back to the CFX software; the data exchange was handled by the Workbench software. The results of the simulation could be visualized in CFX-Post. Two examples of grid reconstruction and blood flow simulation for patients with internal carotid artery aneurysms are presented. The wall shear stress, wall total pressure, and von Mises stress could be visualized. This method seems to be relatively simple and suitable for direct use by neurosurgeons or neuroradiologists, and may be a practical tool for planning treatment and follow-up of patients after neurosurgical or endovascular interventions with 3D angiography. PMID:20812022

  13. Hemodynamics model of fluid-solid interaction in internal carotid artery aneurysms.

    PubMed

    Bai-Nan, Xu; Fu-Yu, Wang; Lei, Liu; Xiao-Jun, Zhang; Hai-Yue, Ju

    2011-01-01

    The objective of this study is to present a relatively simple method to reconstruct cerebral aneurysms as 3D numerical grids. The method accurately duplicates the geometry to provide computer simulations of the blood flow. Initial images were obtained by using CT angiography and 3D digital subtraction angiography in DICOM format. The images were processed using MIMICS software, and the 3D fluid model (blood flow) and 3D solid model (wall) were generated. The output was then exported to the ANSYS Workbench software to generate the volumetric mesh for further hemodynamic study. The fluid model was defined and simulated in CFX software while the solid model was calculated in ANSYS software. The force data calculated first in the CFX software were transferred to the ANSYS software, and after receiving the force data, total mesh displacement data were calculated in the ANSYS software. The mesh displacement data were then transferred back to the CFX software; the data exchange was handled by the Workbench software. The results of the simulation could be visualized in CFX-Post. Two examples of grid reconstruction and blood flow simulation for patients with internal carotid artery aneurysms are presented. The wall shear stress, wall total pressure, and von Mises stress could be visualized. This method seems to be relatively simple and suitable for direct use by neurosurgeons or neuroradiologists, and may be a practical tool for planning treatment and follow-up of patients after neurosurgical or endovascular interventions with 3D angiography.
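    The staggered force/displacement exchange described in these two records can be summarized in a short sketch. The code below is only a schematic of the two-way coupling loop, with toy linear stand-ins for the CFX (fluid) and ANSYS (solid) solves; the function names and relaxation constants are hypothetical, not the paper's implementation.

```python
import numpy as np

# Toy stand-ins for the CFX (fluid) and ANSYS (solid) solves; the real
# solvers exchange wall forces and mesh displacements through Workbench.
def solve_fluid(displacement):
    # Pretend the wall pressure load relaxes as the wall deforms.
    return 1.0 - 0.5 * displacement

def solve_solid(wall_force):
    # Pretend the wall deflects linearly with the applied load.
    return 0.4 * wall_force

def coupled_fsi_step(tol=1e-8, max_iters=100):
    """Schematic two-way coupling loop for one time step."""
    displacement = np.zeros(16)  # interface displacement field
    for _ in range(max_iters):
        wall_force = solve_fluid(displacement)      # fluid solve -> wall forces
        new_displacement = solve_solid(wall_force)  # solid solve -> displacements
        if np.linalg.norm(new_displacement - displacement) < tol:
            return new_displacement                 # interface has converged
        displacement = new_displacement             # hand back to the fluid solve
    raise RuntimeError("coupling iteration did not converge")

print(coupled_fsi_step()[0])  # converges to 1/3 for these toy models
```

    The fixed-point iteration per time step mirrors the abstract's description: forces flow one way, displacements flow back, and the step is accepted once the interface stops moving.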

  14. Visualization of membrane protein crystals in lipid cubic phase using X-ray imaging

    PubMed Central

    Warren, Anna J.; Armour, Wes; Axford, Danny; Basham, Mark; Connolley, Thomas; Hall, David R.; Horrell, Sam; McAuley, Katherine E.; Mykhaylyk, Vitaliy; Wagner, Armin; Evans, Gwyndaf

    2013-01-01

    The focus in macromolecular crystallography is moving towards even more challenging target proteins that often crystallize on much smaller scales and are frequently mounted in opaque or highly refractive materials. It is therefore essential that X-ray beamline technology develops in parallel to accommodate such difficult samples. In this paper, the use of X-ray microradiography and microtomography is reported as a tool for crystal visualization, location and characterization on the macromolecular crystallography beamlines at the Diamond Light Source. The technique is particularly useful for microcrystals and for crystals mounted in opaque materials such as lipid cubic phase. X-ray diffraction raster scanning can be used in combination with radiography to allow informed decision-making at the beamline prior to diffraction data collection. It is demonstrated that the X-ray dose required for a full tomography measurement is similar to that for a diffraction grid-scan, but for sample location and shape estimation alone just a few radiographic projections may be required. PMID:23793151

  15. Visualization of membrane protein crystals in lipid cubic phase using X-ray imaging.

    PubMed

    Warren, Anna J; Armour, Wes; Axford, Danny; Basham, Mark; Connolley, Thomas; Hall, David R; Horrell, Sam; McAuley, Katherine E; Mykhaylyk, Vitaliy; Wagner, Armin; Evans, Gwyndaf

    2013-07-01

    The focus in macromolecular crystallography is moving towards even more challenging target proteins that often crystallize on much smaller scales and are frequently mounted in opaque or highly refractive materials. It is therefore essential that X-ray beamline technology develops in parallel to accommodate such difficult samples. In this paper, the use of X-ray microradiography and microtomography is reported as a tool for crystal visualization, location and characterization on the macromolecular crystallography beamlines at the Diamond Light Source. The technique is particularly useful for microcrystals and for crystals mounted in opaque materials such as lipid cubic phase. X-ray diffraction raster scanning can be used in combination with radiography to allow informed decision-making at the beamline prior to diffraction data collection. It is demonstrated that the X-ray dose required for a full tomography measurement is similar to that for a diffraction grid-scan, but for sample location and shape estimation alone just a few radiographic projections may be required.

  16. Real Time Monitor of Grid job executions

    NASA Astrophysics Data System (ADS)

    Colling, D. J.; Martyniak, J.; McGough, A. S.; Křenek, A.; Sitera, J.; Mulač, M.; Dvořák, F.

    2010-04-01

    In this paper we describe the architecture and operation of the Real Time Monitor (RTM), developed by the Grid team in the HEP group at Imperial College London. It is arguably the most popular dissemination tool within the EGEE [1] Grid, having been used on many occasions including the GridFest and LHC inauguration events held at CERN in October 2008. The RTM gathers information from EGEE sites hosting Logging and Bookkeeping (LB) services. Information is cached locally on a dedicated server at Imperial College London and made available to clients in near real time. The system consists of three main components: the RTM server, the enquirer, and an Apache web server that is queried by clients. The RTM server queries the LB servers at fixed time intervals, collecting job-related information and storing it in a local database. Job-related data include not only job state (i.e., Scheduled, Waiting, Running, or Done) and timing information but also other attributes such as Virtual Organization and Computing Element (CE) queue, if known. The job data stored in the RTM database are read by the enquirer every minute and converted to an XML format that is stored on the web server. This decouples the RTM server database from the clients, removing the bottleneck caused by many clients simultaneously accessing the database. The information can be visualized through either a 2D or 3D Java-based client, with live job data either overlaid on a two-dimensional map of the world or rendered in three dimensions over a globe using OpenGL.
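    The enquirer's decoupling role is easy to sketch. The database path, table schema, and XML element names below are hypothetical stand-ins; the point is only the pattern the paper describes: read job states from the local database once a minute and publish a static XML snapshot so web clients never query the database directly.

```python
import sqlite3
import xml.etree.ElementTree as ET

DB_PATH = "rtm_jobs.db"          # hypothetical local RTM database
XML_PATH = "jobs_snapshot.xml"   # static file served by the web server

def export_snapshot():
    """Read current job states from the local database and write an XML
    snapshot, so web clients never touch the database directly."""
    conn = sqlite3.connect(DB_PATH)
    rows = conn.execute(
        "SELECT job_id, state, vo, ce_queue FROM jobs").fetchall()
    conn.close()
    root = ET.Element("jobs")
    for job_id, state, vo, ce_queue in rows:
        ET.SubElement(root, "job", id=str(job_id), state=state,
                      vo=vo or "", ce=ce_queue or "")
    ET.ElementTree(root).write(XML_PATH, encoding="utf-8")

if __name__ == "__main__":
    # Seed a demo job so the sketch runs end to end; the real RTM server
    # fills this table from the LB services at fixed intervals.
    conn = sqlite3.connect(DB_PATH)
    conn.execute("CREATE TABLE IF NOT EXISTS jobs "
                 "(job_id INTEGER, state TEXT, vo TEXT, ce_queue TEXT)")
    conn.execute("INSERT INTO jobs VALUES (1, 'Running', 'cms', 'ce01.example')")
    conn.commit()
    conn.close()
    export_snapshot()  # the enquirer would call this once a minute
```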

  17. FitEM2EM—Tools for Low Resolution Study of Macromolecular Assembly and Dynamics

    PubMed Central

    Frankenstein, Ziv; Sperling, Joseph; Sperling, Ruth; Eisenstein, Miriam

    2008-01-01

    Studies of the structure and dynamics of macromolecular assemblies often involve comparison of low resolution models obtained using different techniques such as electron microscopy or atomic force microscopy. We present new computational tools for comparing (matching) and docking low resolution structures, based on shape complementarity. The matched or docked objects are represented by three-dimensional grids where the value of each grid point depends on its position with regard to the interior, surface, or exterior of the object. The grids are correlated using fast Fourier transforms, producing either matches of related objects or docking models, depending on the details of the grid representations. The procedures incorporate thickening and smoothing of the surfaces of the objects, which effectively compensates for differences in the resolution of the matched/docked objects, circumventing the need for resolution modification. The presented matching tool FitEM2EMin successfully fitted electron microscopy structures obtained at different resolutions, different conformers of the same structure, and partial structures, ranking correct matches at the top in every case. The differences between the grid representations of the matched objects can be used to study conformational differences or to characterize the size and shape of substructures. The presented low-to-low docking tool FitEM2EMout ranked the expected models at the top. PMID:18974836
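    The central trick, scoring all relative translations of two grids at once with a fast Fourier transform, can be sketched in a few lines of NumPy. This is a minimal illustration of FFT-based grid correlation, not the FitEM2EM code; the interior/surface/exterior weighting of the real tool is replaced here by a plain 0/1 occupancy grid.

```python
import numpy as np

def best_overlap(grid_a, grid_b):
    """Score all relative translations of two 3-D occupancy grids at once
    via FFT-based cross-correlation, the core trick behind grid matching."""
    # Correlation theorem: corr(a, b) = IFFT(FFT(a) * conj(FFT(b))).
    corr = np.fft.ifftn(np.fft.fftn(grid_a) * np.conj(np.fft.fftn(grid_b))).real
    shift = np.unravel_index(np.argmax(corr), corr.shape)
    return shift, corr.max()

# Toy example: a small cube matched against a shifted copy of itself.
a = np.zeros((32, 32, 32))
a[8:16, 8:16, 8:16] = 1.0
b = np.roll(a, (5, 0, 3), axis=(0, 1, 2))
print(best_overlap(b, a))  # recovers the (5, 0, 3) shift
```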

  18. Visual Environment for Rich Data Interpretation (VERDI) program for environmental modeling systems

    EPA Pesticide Factsheets

    VERDI is a flexible, modular, Java-based program used for visualizing multivariate gridded meteorology, emissions and air quality modeling data created by environmental modeling systems such as the CMAQ model and WRF.

  19. The State of NASA's Information Power Grid

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation transfers the concept of the power grid to information sharing in the NASA community. An information grid of this sort would comprise tools, middleware, and services that facilitate interoperability, distribution of new technologies, human collaboration, and data management. While such a grid would increase the capacity for information sharing, it would not mandate it; the onus of utilizing the grid would rest with the users.

  20. Op art and visual perception.

    PubMed

    Wade, N J

    1978-01-01

    An attempt is made to list the visual phenomena exploited in op art. These include moiré fringes, afterimages, Hermann grid effects, Gestalt grouping principles, blurring and movement due to astigmatic fluctuations in accommodation, scintillation and streaming possibly due to eye movements, and visual persistence. The historical origins of these phenomena are also noted.

  1. 21 CFR 886.1330 - Amsler grid.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 886.1330 - Amsler grid. Food and Drug Administration, Department of Health and Human Services: Medical Devices... the patient and intended to rapidly detect central and paracentral irregularities in the visual field...

  2. Grid Generation for Multidisciplinary Design and Optimization of an Aerospace Vehicle: Issues and Challenges

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    The purpose of this paper is to discuss grid generation issues and to challenge the grid generation community to develop tools suitable for automated multidisciplinary analysis and design optimization of aerospace vehicles. Special attention is given to the grid generation issues of computational fluid dynamics and computational structural mechanics disciplines.

  3. Modeling RF-induced Plasma-Surface Interactions with VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, David N.; Pankin, Alexei Y.; Roark, Christine M.; Stoltz, Peter H.; Zhou, Sean C.-D.; Kruger, Scott E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath dynamics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath (e.g. sputtering), can thus be simulated in complex, experimentally relevant geometries. Simulations of RF sheath-enhanced impurity production near surfaces of the C-Mod field-aligned ICRF antenna are presented to illustrate the model; impurity mitigation techniques are also explored. Model extensions to capture the physics of secondary electron emission and of multispecies plasmas are summarized, together with a discussion of improved tools for plasma chemistry and IEDF/EEDF visualization and modeling. The latter tools are also highly relevant for commercial plasma processing applications. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling fusion and industrial plasma processes. Supported by U.S. DoE SBIR Phase I/II Award DE-SC0009501.

  4. CTG Analyzer: A graphical user interface for cardiotocography.

    PubMed

    Sbrollini, Agnese; Agostinelli, Angela; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most commonly used test for establishing the good health of the fetus during pregnancy and labor. CTG consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions (UC; mmHg). FHR is characterized by baseline, baseline variability, tachycardia, bradycardia, accelerations, and decelerations, while the UC signal is characterized by the presence and period of contractions. Such parameters are usually evaluated by visual inspection. However, visual analysis of CTG recordings has a well-demonstrated poor reproducibility, owing to the complexity of the physiological phenomena affecting fetal heart rhythm and to its dependence on the clinician's experience. Computerized tools in support of clinicians represent a possible solution for improving correctness in CTG interpretation. This paper proposes CTG Analyzer as a graphical tool for automatic and objective analysis of CTG tracings. CTG Analyzer was developed in MATLAB® and offers a very intuitive and user-friendly graphical user interface. The FHR time series and UC signal are displayed one under the other, on a grid with reference lines, as usually done for CTG reports printed on paper. Colors help identify FHR and UC features. Automatic analysis is based on fixed feature definitions provided by the FIGO guidelines, plus other settings whose default values can be changed by the user. Finally, CTG Analyzer provides a report file listing all the quantitative results of the analysis. Thus, CTG Analyzer represents a potentially useful graphical tool for automatic and objective analysis of CTG tracings.
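    As an illustration of the kind of rule-based analysis such a tool automates, the sketch below estimates an FHR baseline with a moving median and applies the standard FIGO normality range for the baseline (110-160 bpm). This is a simplified stand-in, not CTG Analyzer's algorithm; the sampling rate, window length, and classification rule are assumptions.

```python
import numpy as np

def fhr_baseline_flags(fhr_bpm, fs_hz=4.0, win_min=10.0):
    """Crude FHR baseline estimate plus FIGO-style classification.
    The baseline is a 10-minute moving median; 110-160 bpm is the
    standard FIGO normality range for the baseline."""
    win = int(win_min * 60 * fs_hz)
    pad = np.pad(fhr_bpm, (win // 2, win - win // 2 - 1), mode="edge")
    baseline = np.array([np.median(pad[i:i + win])
                         for i in range(len(fhr_bpm))])
    label = np.where(baseline > 160, "tachycardia",
                     np.where(baseline < 110, "bradycardia", "normal"))
    return baseline, label

# Synthetic 20-minute trace sampled at 4 Hz that drifts upward.
rng = np.random.default_rng(0)
t = np.arange(0, 20 * 60, 0.25)
fhr = 140 + 25 * (t / t.max()) + rng.normal(0, 2, t.size)
baseline, label = fhr_baseline_flags(fhr)
print(label[0], label[-1])  # expected: normal at the start, tachycardia at the end
```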

  5. Power Systems Design and Studies | Grid Modernization | NREL

    Science.gov Websites

    NREL develops new tools and algorithms for power systems design and studies, including market design and performance evaluations as well as planning, operations, and protection studies.

  6. Augmented Visual Experience of Simulated Solar Phenomena

    NASA Astrophysics Data System (ADS)

    Tucker, A. O., IV; Berardino, R. A.; Hahne, D.; Schreurs, B.; Fox, N. J.; Raouafi, N.

    2017-12-01

    The Parker Solar Probe (PSP) mission will explore the Sun's corona, studying the solar wind, flares, and coronal mass ejections. The effects of these phenomena can impact the technology that we use in ways that are not readily apparent, including affecting satellite communications and power grids. Determining the structure and dynamics of coronal magnetic fields, tracing the flow of energy that heats the corona, and exploring dusty plasma near the Sun to understand its influence on solar wind and energetic particle formation requires a suite of sensors on board the PSP spacecraft that are engineered to observe specific phenomena. Using models of these sensors and simulated observational data, we can visualize what the PSP spacecraft will "see" during its multiple passes around the Sun. Augmented reality (AR) technologies enable convenient user access to massive data sets. We are developing an application that allows users to experience environmental data from the point of view of the PSP spacecraft in AR using the Microsoft HoloLens. Observational data, including imagery, magnetism, temperature, and density, are visualized in 4D within the user's immediate environment. Our application provides an educational tool for comprehending the complex relationships of observational data, which aids in our understanding of the Sun.

  7. Numerical computation of complex multi-body Navier-Stokes flows with applications for the integrated Space Shuttle launch vehicle

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    1993-01-01

    An enhanced grid system for the Space Shuttle Orbiter was built by integrating CAD definitions from several sources and then generating the surface and volume grids. The new grid system contains geometric components not modeled previously, plus significant enhancements to geometry that had been modeled in the old grid system. The new orbiter grids were then integrated with new grids for the rest of the launch vehicle. Enhancements were made to the hyperbolic grid generator HYPGEN, and new tools were developed for grid projection, manipulation, and modification; Cartesian box grid and far-field grid generation; and post-processing of flow solver data.

  8. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an HTML file for archival purposes. The current implementation has been targeted at the OVERFLOW flow solver, which is used to obtain flow solutions on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  9. Visualization of grid-generated turbulence in He II using PTV

    NASA Astrophysics Data System (ADS)

    Mastracci, B.; Guo, W.

    2017-12-01

    Due to its low viscosity, cryogenic He II has potential use for simulating large-scale, high Reynolds number turbulent flow in a compact and efficient apparatus. To realize this potential, the behavior of the fluid in the simplest cases, such as turbulence generated by flow past a mesh grid, must be well understood. We have designed, constructed, and commissioned an apparatus to visualize the evolution of turbulence in the wake of a mesh grid towed through He II. Visualization is accomplished using the particle tracking velocimetry (PTV) technique, where μm-sized tracer particles are introduced to the flow, illuminated with a planar laser sheet, and recorded by a scientific imaging camera; the particles move with the fluid, and tracking their motion with a computer algorithm results in a complete map of the turbulent velocity field in the imaging region. In our experiment, this region is inside a carefully designed He II filled cast acrylic channel measuring approximately 16 × 16 × 330 mm. One of three different grids, which have mesh numbers M = 3, 3.75, or 5 mm, can be attached to the pulling system which moves it through the channel with constant velocity up to 600 mm/s. The consequent motion of the solidified deuterium tracer particles is used to investigate the energy statistics, effective kinematic viscosity, and quantized vortex dynamics in turbulent He II.
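    The core of a PTV measurement, linking particle positions between consecutive frames to obtain velocity vectors, can be sketched simply. Real PTV codes use multi-frame predictors and outlier rejection; the greedy nearest-neighbour matcher below, with its hypothetical displacement cap, shows only the minimal idea.

```python
import numpy as np

def link_particles(frame_a, frame_b, max_disp):
    """Greedy nearest-neighbour linking of particle positions between two
    frames, the simplest core of a PTV algorithm. Positions are (N, 2)
    arrays in mm; max_disp caps the plausible displacement per frame.
    Returns matched index pairs and the implied per-frame displacements."""
    pairs, used = [], set()
    for i, p in enumerate(frame_a):
        d = np.linalg.norm(frame_b - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_disp and j not in used:
            pairs.append((i, j))
            used.add(j)
    idx = np.array(pairs)
    velocities = frame_b[idx[:, 1]] - frame_a[idx[:, 0]]
    return idx, velocities

# Toy check: ten particles drifting by (0.5, 0.1) mm between frames.
rng = np.random.default_rng(0)
a = rng.uniform(0, 16, size=(10, 2))
b = a + np.array([0.5, 0.1])
print(link_particles(a, b, max_disp=1.0)[1].mean(axis=0))  # ~[0.5, 0.1]
```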

  10. Modeling and measuring the visual detection of ecologically relevant motion by an Anolis lizard.

    PubMed

    Pallus, Adam C; Fleishman, Leo J; Castonguay, Philip M

    2010-01-01

    Motion in the visual periphery of lizards, and other animals, often causes a shift of visual attention toward the moving object. This behavioral response must be tuned more strongly to relevant motion (predators, prey, conspecifics) than to irrelevant motion (windblown vegetation). Early stages of visual motion detection rely on simple local circuits known as elementary motion detectors (EMDs). We presented a computer model consisting of a grid of correlation-type EMDs with videos of natural motion patterns, including prey, predators, and windblown vegetation. We systematically varied the model parameters and quantified the relative response to the different classes of motion. We carried out behavioral experiments with the lizard Anolis sagrei and determined that its visual response could be modeled with a grid of correlation-type EMDs with a spacing parameter of 0.3 degrees of visual angle and a time constant of 0.1 s. The model with these parameters gave substantially stronger responses to relevant motion patterns than to windblown vegetation under equivalent conditions. However, the model is sensitive to local contrast and viewer-object distance. Therefore, additional neural processing is probably required for the visual system to reliably distinguish relevant from irrelevant motion under a full range of natural conditions.
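    A correlation-type (Hassenstein-Reichardt) EMD is compact enough to write out directly. The sketch below is a generic textbook EMD, not the paper's exact model; it uses the 0.1 s time constant reported above, a first-order low-pass filter as the delay element, and an opponent subtraction that makes the output direction-selective.

```python
import numpy as np

def reichardt_emd(signal_left, signal_right, dt=0.01, tau=0.1):
    """Correlation-type elementary motion detector for two neighbouring
    photoreceptor signals. Each input is low-pass filtered (time constant
    tau, here the 0.1 s found for Anolis) and cross-multiplied with the
    undelayed neighbour; subtracting the mirror arm gives direction
    selectivity."""
    def lowpass(x):
        y = np.zeros_like(x)
        alpha = dt / (tau + dt)  # first-order RC filter coefficient
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y
    return lowpass(signal_left) * signal_right - lowpass(signal_right) * signal_left

# A pattern moving left-to-right gives a positive mean response.
t = np.arange(0, 2, 0.01)
left = np.sin(2 * np.pi * 1.0 * t)
right = np.sin(2 * np.pi * 1.0 * (t - 0.05))  # same signal, delayed 50 ms
print(reichardt_emd(left, right).mean())       # > 0 for this direction
```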

  11. Characterization of Slosh Damping for Ortho-Grid and Iso-Grid Internal Tank Structures

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; Sansone, Marco D.; Eberhart, Chad J.; West, Jeffrey S.

    2016-01-01

    Grid-stiffened tank structures such as Ortho-Grid and Iso-Grid are widely used in cryogenic tanks for stiffening the tank while reducing mass, compared to tank walls of constant cross-section. If the structure is internal to the tank, it will positively affect the fluid dynamic behavior of the liquid propellant with regard to fluid slosh damping. As NASA and commercial companies endeavor to explore the solar system, vehicles will by necessity become more mass efficient, and design margin will be reduced where possible. Therefore, if the damping characteristics of Ortho-Grid and Iso-Grid structures are understood, their positive damping effect can be taken into account in the systems design process. Historically, damping by internal structures has been characterized by rules of thumb, and for Ortho-Grid, empirical design tools intended for slosh baffles of much larger cross-section have been used. Little or no information is available to characterize the slosh behavior of Iso-Grid internal structure. Therefore, to take advantage of these structures for their positive damping effects, additional data and characterization tools are much needed. Recently, the NASA Marshall Space Flight Center conducted both sub-scale testing and computational fluid dynamics (CFD) simulations of slosh damping for cylindrical Ortho-Grid and Iso-Grid tanks containing water. Enhanced grid meshing techniques were applied to the geometrically detailed and complex Ortho-Grid and Iso-Grid structures. The Loci-STREAM CFD program, with its Volume of Fluid method module for tracking and locating the water-air interface, was used to conduct the simulations. The CFD simulations were validated with the test data, and new empirical models for predicting the damping and frequency of Ortho-Grid and Iso-Grid structures were generated.
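    Slosh damping from a sub-scale free-decay test is conventionally reduced with the logarithmic decrement. The sketch below shows that generic textbook reduction, not MSFC's specific procedure, and all numbers are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks

def damping_from_decay(wave_height, fs_hz):
    """Estimate slosh damping ratio from a free-decay wave-height record
    using the logarithmic decrement delta = ln(a_n / a_{n+1})."""
    peaks, _ = find_peaks(wave_height)
    amps = wave_height[peaks]
    delta = np.log(amps[:-1] / amps[1:]).mean()       # per-cycle decrement
    zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)   # damping ratio
    freq = fs_hz / np.diff(peaks).mean()              # slosh frequency, Hz
    return zeta, freq

# Synthetic decaying slosh mode: 1.2 Hz with 2 % damping.
fs = 200.0
t = np.arange(0, 20, 1 / fs)
h = np.exp(-0.02 * 2 * np.pi * 1.2 * t) * np.cos(2 * np.pi * 1.2 * t)
print(damping_from_decay(h, fs))  # ~ (0.02, 1.2)
```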

  12. High-Performance Computing and Visualization of Tsunamis and Wind-Driven Waves

    NASA Astrophysics Data System (ADS)

    Liu, Y. S.; Zhang, H.; Yuen, D. A.; Wang, M.

    2005-12-01

    The Sumatran earthquake and the tsunami waves it produced have awakened great scientific interest in wave propagation over undulating bottom topography and along complicated coastlines. The recent hurricane Katrina has also called our attention to shorter-period waves near the coast. Analytical approximations are valid over long wavelengths in the far field; for near-field regions with complex geography and other complications, such as islands and harbors, numerical simulations must be employed to obtain accurate predictions in time and space. Nowadays, using 10**7 to 10**8 grid points has become quite routine with massively parallel computers and large RAM and disk memories. Besides tsunamis, river discharges from upstream events and waves driven by hurricanes are also of societal relevance, especially in central China and now also in the U.S.A. Using automatic grid generation methods, we have devised a finite-element based code for the three stages, culminating with the use of the augmented Lagrangian method for the run-up process, as well as the Arbitrary Lagrange-Euler configuration method to tackle the free surface problem near the seashore. This formulation allows the wave surface to be self-consistently determined within a linearized framework and is computationally very fast. Our continuing efforts are focused on seeking novel algorithms and state-of-the-art techniques in order to unravel the mysteries associated with tsunami wave propagation and wind-driven waves in 3-D. We have cast the Navier-Stokes equations within the framework of a compressible model with an equation of state for sea water. Our formulation allows the tracking and simulation of three stages, principally the formation, propagation, and run-up stages of tsunamis and waves coming ashore. The sequential version of this code can run on a workstation with 4 Gbyte memory in less than 2 minutes per time step for one million grid points. The code has also been parallelized with MPI-2 and has good scaling properties, with nearly linear speedup, as tested on a 32-node PC cluster. We have employed actual ocean seafloor topographical data to construct the oceanic volume and attempted to construct the coastline as realistically as possible, using 11 levels of structured meshes in the radial direction of the Earth. In order to understand the intricate dynamics of the wave interactions, we have implemented a visualization overlay based on Amira, a 3-D volume rendering visualization tool, for massive data post-processing. The ability to visualize the large data sets remotely is an important objective we are aiming for, as international collaboration is one of the top aims of this research.

  13. Cyberinfrastructure for Atmospheric Discovery

    NASA Astrophysics Data System (ADS)

    Wilhelmson, R.; Moore, C. W.

    2004-12-01

    Each year across the United States, floods, tornadoes, hail, strong winds, lightning, hurricanes, and winter storms cause hundreds of deaths, routinely disrupt transportation and commerce, and result in billions of dollars in annual economic losses. MEAD and LEAD are two recent efforts aimed at developing the cyberinfrastructure for studying and forecasting these events through collection, integration, and analysis of observational data coupled with numerical simulation, data mining, and visualization. MEAD (Modeling Environment for Atmospheric Discovery) has been funded for two years as an NCSA (National Center for Supercomputing Applications) Alliance Expedition. The goal of this expedition has been the development and adaptation of cyberinfrastructure that will enable research simulations, data mining, machine learning, and visualization of hurricanes and storms utilizing high performance computing environments including the TeraGrid. Portal, grid, and web infrastructure are being tested that will enable the launching of hundreds of individual WRF (Weather Research and Forecasting) simulations. In a similar way, multiple Regional Ocean Modeling System (ROMS) or WRF/ROMS simulations can be carried out. Metadata and the resulting large volumes of data will then be made available for further study and for educational purposes using analysis, mining, and visualization services. Initial coupling of the ROMS and WRF codes has been completed, and parallel I/O is being implemented for these models. Management of these activities (services) is being enabled through Grid workflow technologies (e.g., OGCE). LEAD (Linked Environments for Atmospheric Discovery) is a recently funded 5-year, large NSF ITR grant that involves 9 institutions who are developing a comprehensive national cyberinfrastructure in mesoscale meteorology, particularly one that can interoperate with others being developed. LEAD is addressing the fundamental information technology (IT) research challenges needed to create an integrated, scalable framework for identifying, accessing, preparing, assimilating, predicting, managing, analyzing, mining, and visualizing a broad array of meteorological data and model output, independent of format and physical location. A transforming element of LEAD is Workflow Orchestration for On-Demand, Real-Time, Dynamically-Adaptive Systems (WOORDS), which allows the use of analysis tools, forecast models, and data repositories as dynamically adaptive, on-demand, Grid-enabled systems that can (a) change configuration rapidly and automatically in response to weather; (b) continually be steered by new data; (c) respond to decision-driven inputs from users; (d) initiate other processes automatically; and (e) steer remote observing technologies to optimize data collection for the problem at hand. Although LEAD efforts are primarily directed at mesoscale meteorology, the IT services being developed have general applicability to other geoscience and environmental sciences. Integration of traditional and new data sources is a crucial component in LEAD for data analysis and assimilation, for integration (ensemble mining) of data from sets of simulations, and for comparing results to observational data. As part of the integration effort, LEAD is creating the myLEAD metadata catalog service: a personal metacatalog that extends the Globus MCS system and is built on top of the OGSA-DAI system developed at the National e-Science Centre in Edinburgh, Scotland.

  14. Implementation of visual data mining for unsteady blood flow field in an aortic aneurysm.

    PubMed

    Morizawa, Seiichiro; Shimoyama, Koji; Obayashi, Shigeru; Funamoto, Kenichi; Hayase, Toshiyuki

    2011-12-01

    This study was performed to determine the relations between the features of wall shear stress and aneurysm rupture. For this purpose, visual data mining was applied to unsteady blood flow simulation data for an aortic aneurysm. The time-series data of wall shear stress given at each grid point were converted to spatial and temporal indices, and the grid points were sorted using a self-organizing map based on the similarity of these indices. Next, the results of the cluster analysis were mapped onto the real space of the aortic aneurysm to specify the regions that may lead to aneurysm rupture. With reference to previous reports regarding aneurysm rupture, the visual data mining suggested specific hemodynamic features that cause aneurysm rupture.
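    A self-organizing map of per-grid-point wall-shear-stress features can be sketched compactly. The feature choices (temporal mean and oscillation amplitude), map size, and training schedule below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def train_som(features, grid_shape=(8, 8), epochs=20, lr=0.5, sigma=2.0):
    """Minimal self-organizing map: each wall grid point is a feature
    vector, and similar points end up in nearby map cells, which is the
    clustering step the paper maps back onto the aneurysm geometry."""
    rng = np.random.default_rng(1)
    rows, cols = grid_shape
    weights = rng.normal(size=(rows * cols, features.shape[1]))
    yy, xx = np.mgrid[0:rows, 0:cols]
    coords = np.column_stack([yy.ravel(), xx.ravel()]).astype(float)
    for epoch in range(epochs):
        decay = np.exp(-epoch / epochs)
        for f in rng.permutation(features):
            winner = np.argmin(np.linalg.norm(weights - f, axis=1))
            dist2 = np.sum((coords - coords[winner]) ** 2, axis=1)
            h = np.exp(-dist2 / (2 * (sigma * decay) ** 2))  # neighbourhood
            weights += (lr * decay) * h[:, None] * (f - weights)
    # Assign every grid point to its best-matching map cell.
    bmu = np.array([np.argmin(np.linalg.norm(weights - f, axis=1))
                    for f in features])
    return weights, bmu

# Toy WSS-like features: two clusters of (mean, oscillation-amplitude) pairs.
rng = np.random.default_rng(0)
feats = np.vstack([rng.normal([1, 0.2], 0.05, (100, 2)),
                   rng.normal([4, 1.0], 0.05, (100, 2))])
weights, bmu = train_som(feats)
print(len(set(bmu[:100])), len(set(bmu[100:])))  # clusters land on distinct cells
```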

  15. Design and Implementation of Real-Time Off-Grid Detection Tool Based on FNET/GridEye

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jiahui; Zhang, Ye; Liu, Yilu

    2014-01-01

    Real-time situational awareness tools are of critical importance to power system operators, especially during emergencies. The availability of electric power has become a linchpin of most post-disaster response efforts, as it is the primary dependency for public and private sector services, as well as individuals. Knowledge of the scope and extent of facilities impacted, as well as the duration of their dependence on backup power, enables emergency response officials to plan for contingencies and provide better overall response. Based on real-time data acquired by Frequency Disturbance Recorders (FDRs) deployed in the North American power grid, a real-time detection method is proposed. This method monitors critical electrical loads and detects the transition of these loads from an on-grid state, where the loads are fed by the power grid, to an off-grid state, where the loads are fed by an Uninterrupted Power Supply (UPS) or a backup generation system. The details of the proposed detection algorithm are presented, and some case studies and off-grid detection scenarios are also provided to verify its effectiveness and robustness. The algorithm has already been implemented based on the Grid Solutions Framework (GSF) and has effectively detected several off-grid situations.
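    A generic illustration of the underlying signal idea: a load that islands onto backup generation no longer tracks the interconnection frequency. The threshold, hold time, and comparison against a wide-area reference below are illustrative assumptions only; the deployed FNET/GridEye algorithm is more sophisticated and is not reproduced here.

```python
import numpy as np

def off_grid_flag(local_freq, system_freq, threshold_hz=0.05, hold_samples=30):
    """Flag a monitored load as islanded when its measured frequency
    deviates from the wide-area reference by more than threshold_hz for
    hold_samples consecutive samples. All constants are illustrative."""
    deviation = np.abs(np.asarray(local_freq) - np.asarray(system_freq))
    run = 0
    for i, above in enumerate(deviation > threshold_hz):
        run = run + 1 if above else 0
        if run >= hold_samples:
            return i - hold_samples + 1  # first sample of the island event
    return None

# 10 samples/s: the load transfers to a backup generator at t = 60 s.
t = np.arange(0, 120, 0.1)
system = 60.0 + 0.005 * np.sin(0.1 * t)
local = system.copy()
local[t >= 60] = 59.7 + 0.02 * np.sin(0.5 * t[t >= 60])  # generator drift
print(off_grid_flag(local, system))  # 600 -> detection near t = 60 s
```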

  16. Energy Systems Integration News - November 2016 | Energy Systems

    Science.gov Websites

    NREL Study Finds Integrated Utility Control Can Improve Grid Voltage Regulation. When a large solar photovoltaic (PV) system is connected to the electric grid and more PV power is being fed into the line than is being used, voltage control issues can arise; a centralized control system can improve regulation.

  17. Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics

    NASA Astrophysics Data System (ADS)

    Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.

    2017-01-01

    The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
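    As a flavor of the component-coupling workflow described above, the minimal sketch below builds a raster grid, attaches an elevation field, and runs Landlab's linear hillslope diffusion component. It assumes the Landlab 2.x API (RasterModelGrid, add_zeros, LinearDiffuser); consult the Landlab documentation for the current signatures.

```python
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

# Build a 25 x 25 raster grid with 10 m node spacing and attach a random
# initial topography field at the grid's nodes.
grid = RasterModelGrid((25, 25), xy_spacing=10.0)
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.rand(grid.number_of_nodes)

# Couple one process component, linear hillslope diffusion, and run it.
diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)  # m^2/yr
for _ in range(1000):
    diffuser.run_one_step(dt=10.0)                        # 10-yr steps

print(z.min(), z.max())  # relief smooths toward a flat surface
```

    Additional components (flow routing, stream power erosion, ecohydrology) couple the same way: each operates on fields attached to the shared grid and advances via run_one_step.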

  18. BrainBrowser: distributed, web-based neurological data visualization.

    PubMed

    Sherif, Tarek; Kassis, Nicolas; Rousseau, Marc-Étienne; Adalat, Reza; Evans, Alan C

    2014-01-01

    Recent years have seen massive, distributed datasets become the norm in neuroimaging research, and the methodologies used to analyze them have, in response, become more collaborative and exploratory. Tools and infrastructure are continuously being developed and deployed to facilitate research in this context: grid computation platforms to process the data, distributed data stores to house and share them, high-speed networks to move them around and collaborative, often web-based, platforms to provide access to and sometimes manage the entire system. BrainBrowser is a lightweight, high-performance JavaScript visualization library built to provide easy-to-use, powerful, on-demand visualization of remote datasets in this new research environment. BrainBrowser leverages modern web technologies, such as WebGL, HTML5 and Web Workers, to visualize 3D surface and volumetric neuroimaging data in any modern web browser without requiring any browser plugins. It is thus trivial to integrate BrainBrowser into any web-based platform. BrainBrowser is simple enough to produce a basic web-based visualization in a few lines of code, while at the same time being robust enough to create full-featured visualization applications. BrainBrowser can dynamically load the data required for a given visualization, so no network bandwidth needs to be wasted on data that will not be used. BrainBrowser's integration into the standardized web platform also allows users to consider using 3D data visualization in novel ways, such as for data distribution, data sharing and dynamic online publications. BrainBrowser is already being used in two major online platforms, CBRAIN and LORIS, and has been used to make the 1TB MACACC dataset openly accessible.

  19. BrainBrowser: distributed, web-based neurological data visualization

    PubMed Central

    Sherif, Tarek; Kassis, Nicolas; Rousseau, Marc-Étienne; Adalat, Reza; Evans, Alan C.

    2015-01-01

    Recent years have seen massive, distributed datasets become the norm in neuroimaging research, and the methodologies used to analyze them have, in response, become more collaborative and exploratory. Tools and infrastructure are continuously being developed and deployed to facilitate research in this context: grid computation platforms to process the data, distributed data stores to house and share them, high-speed networks to move them around and collaborative, often web-based, platforms to provide access to and sometimes manage the entire system. BrainBrowser is a lightweight, high-performance JavaScript visualization library built to provide easy-to-use, powerful, on-demand visualization of remote datasets in this new research environment. BrainBrowser leverages modern web technologies, such as WebGL, HTML5 and Web Workers, to visualize 3D surface and volumetric neuroimaging data in any modern web browser without requiring any browser plugins. It is thus trivial to integrate BrainBrowser into any web-based platform. BrainBrowser is simple enough to produce a basic web-based visualization in a few lines of code, while at the same time being robust enough to create full-featured visualization applications. BrainBrowser can dynamically load the data required for a given visualization, so no network bandwidth needs to be wasted on data that will not be used. BrainBrowser's integration into the standardized web platform also allows users to consider using 3D data visualization in novel ways, such as for data distribution, data sharing and dynamic online publications. BrainBrowser is already being used in two major online platforms, CBRAIN and LORIS, and has been used to make the 1TB MACACC dataset openly accessible. PMID:25628562

  20. Expandable Grids: A User Interface Visualization Technique and a Policy Semantics to Support Fast, Accurate Security and Privacy Policy Authoring

    DTIC Science & Technology

    2008-07-01

    The dropout rate amongst Grid participants suggests participants found the Grid more frustrating to use, and subjective satisfaction scores show... Abstract: ...greatly affect whether policies match their authors' intentions; a bad user interface can lead to policies with many errors, while a good user interface...

  1. ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows

    NASA Technical Reports Server (NTRS)

    McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush

    2004-01-01

    With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what the scientists need is an environment that will allow them to specify their application runs at a high organizational level and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project: the visual editor and the runtime workflow engine.

  2. Influence of grid control and object detection on radiation exposure and image quality using mobile C-arms - first results.

    PubMed

    Gosch, D; Ratzmer, A; Berauer, P; Kahn, T

    2007-09-01

    The objective of this study was to examine the extent to which the image quality on mobile C-arms can be improved by an innovative exposure rate control system (grid control). In addition, the possible dose reduction in the pulsed fluoroscopy mode using 25 pulses/sec produced by automatic adjustment of the pulse rate through motion detection was to be determined. As opposed to conventional exposure rate control systems, which use a measuring circle in the center of the field of view, grid control is based on a fine mesh of square cells which are overlaid on the entire fluoroscopic image. The system uses only those cells for exposure control that are covered by the object to be visualized. This is intended to ensure optimally exposed images, regardless of the size, shape and position of the object to be visualized. The system also automatically detects any motion of the object. If a pulse rate of 25 pulses/sec is selected and no changes in the image are observed, the pulse rate used for pulsed fluoroscopy is gradually reduced. This may decrease the radiation exposure. The influence of grid control on image quality was examined using an anthropomorphic phantom. The dose reduction achieved with the help of object detection was determined by evaluating the examination data of 146 patients from 5 different countries. The image of the static phantom made with grid control was always optimally exposed, regardless of the position of the object to be visualized. The average dose reduction when using 25 pulses/sec resulting from object detection and automatic down-pulsing was 21 %, and the maximum dose reduction was 60 %. Grid control facilitates C-arm operation, since optimum image exposure can be obtained independently of object positioning. Object detection may lead to a reduction in radiation exposure for the patient and operating staff.
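    The cell-based exposure logic is easy to illustrate. The sketch below tiles an image into square cells and derives the exposure control value only from cells judged to lie on the object. The object-detection rule (a simple darkness threshold, since the object attenuates the beam) and all constants are stand-ins; the vendor's actual logic is not described beyond what the abstract states.

```python
import numpy as np

def grid_exposure_value(image, cell=32, coverage_threshold=0.6):
    """Tile the fluoroscopic image into square cells, keep only cells
    judged to lie on the object (here: cells darker than a fraction of the
    global mean), and drive the exposure from those cells alone."""
    h, w = image.shape
    means = image[:h - h % cell, :w - w % cell] \
        .reshape(h // cell, cell, w // cell, cell).mean(axis=(1, 3))
    object_cells = means < coverage_threshold * image.mean()
    if not object_cells.any():           # fall back to whole-image control
        return means.mean()
    return means[object_cells].mean()    # exposure control value

# Bright background with a dark "anatomy" region placed off-centre.
img = np.full((512, 512), 0.9)
img[300:480, 60:300] = 0.25
print(round(grid_exposure_value(img), 3))  # follows the object, not the centre
```

    The contrast with a conventional centered measuring circle is the point: the control value here tracks the object wherever it sits in the field of view.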

  3. Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.

    2004-12-01

    From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services, as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end-to-end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization, coupled with technologies from others in the community, are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, an NSF-funded multi-institutional large Information Technology Research effort. The goal of LEAD is to create an integrated and scalable framework for identifying, accessing, preparing, assimilating, predicting, managing, analyzing, mining, and visualizing a broad array of meteorological data and model output, independent of format and physical location. To that end, LEAD will create a series of interconnected, heterogeneous Grid environments to provide a complete framework for mesoscale research, including a set of integrated Grid and Web services. This talk will focus on the transition from today's end-to-end systems into the types of systems that the LEAD project envisions and the multidisciplinary research problems they will enable.

  4. [Assessment of a supervision grid being used in the laboratories of cutaneous leishmaniasis in Morocco].

    PubMed

    El Mansouri, Bouchra; Amarir, Fatima; Hajli, Yamina; Fellah, Hajiba; Sebti, Faiza; Delouane, Bouchra; Sadak, Abderrahim; Adlaoui, El Bachir; Rhajaoui, Mohammed

    2017-01-01

    The aim of our study was to assess a standardized supervisory grid as a new supervision tool for laboratories diagnosing leishmaniasis. We conducted a pilot trial to evaluate the ongoing performance of seven provincial laboratories in four provinces of Morocco, with assessments in 2006 and 2014. This study details the situation in the provincial laboratories before and after the implementation of the supervisory grid. A total of twenty-one grids were analyzed. In 2006, the results clearly showed poor laboratory performance: need for training (41.6%), staff performing skin biopsy (25%), shortage of materials and reagents (65%), non-compliant documentation and local management (85%). Several corrective actions were conducted by the National Reference Laboratory (LNRL) of Leishmaniasis during the study period. In 2014, the LNRL recorded a clear improvement in the laboratories' performance. The needs for training, biopsy quality, and the supply of tools and reagents were met, and effective coordination was established between the LNRL and the provincial laboratories. This trial shows the effectiveness of the grid as a high quality supervisory tool and as a cornerstone of progress in control programs against leishmaniasis.

  5. The GEON Integrated Data Viewer (IDV) for Exploration of Geoscience Data With Visualizations

    NASA Astrophysics Data System (ADS)

    Wier, S.; Meertens, C.

    2008-12-01

    The GEON Integrated Data Viewer (GEON IDV) is a fully interactive, research-level, true 3D and 4D (latitude, longitude, depth or altitude, and time) tool to display and explore almost any data located on the Earth, inside the Earth, or above the Earth's surface. Although the GEON IDV makes impressive 3D displays, it is primarily designed for data exploration and analysis. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience questions anywhere on earth. The GEON IDV supports simultaneous displays of data sets of differing sources and data type or character, with complete control over map projection and area, time animation, vertical scale, and color schemes. The GEON IDV displays gridded and point data, images, GIS shape files, and other types of data, from files, HTTP servers, OPeNDAP catalogs, RSS feeds, and web map servers. GEON IDV displays include images and geology maps on 3D topographic relief surfaces, vertical geologic cross sections in their correct depth extent, tectonic plate boundaries and plate motion vectors including time animation, GPS velocity vectors and error ellipses, GPS time series at a station, earthquake locations in depth optionally colored and sized by magnitude, earthquake focal mechanisms 'beachballs,' 2D grids of gravity or magnetic anomalies, 2D grids of crustal strain imagery, seismic raypaths, seismic tomography model 3D grids as vertical and horizontal cross sections and isosurfaces, 3D grids of crust and mantle structure for any property, and time animation of 3D grids of mantle convection models as cross sections and isosurfaces. The IDV can also show tracks of aircraft, ships, drifting buoys and marine animals, colored observed values, borehole soundings, and vertical probes of 3D grids. The GEON IDV can drive a GeoWall or other 3D stereo system. IDV output files include imagery, movies, and KML files for Google Earth. The IDV has built-in analysis capabilities with user-created Python language routines, and with automatic conversion of data sources with differing units and grid structures. The IDV can be scripted to create display images on user request or automatically on data arrival, offering the use of the IDV as a back end to support image generation in a data portal. Examples of GEON IDV use in seismology, geodesy, geodynamics and other fields will be shown.

  6. HappyFace as a generic monitoring tool for HEP experiments

    NASA Astrophysics Data System (ADS)

    Kawamura, Gen; Magradze, Erekle; Musheghyan, Haykuhi; Quadt, Arnulf; Rzehorz, Gerhard

    2015-12-01

    The importance of monitoring HEP grid computing systems is growing due to a significant increase in their complexity. Computer scientists and administrators have been studying and building effective ways to gather information on, and clarify the status of, each local grid infrastructure. The HappyFace project aims at making the above-mentioned workflow possible. It aggregates, processes, and stores the information and status of different HEP monitoring resources in the common HappyFace database, and displays the information and status through a single interface. However, this model of HappyFace relied on monitoring resources that are always under development in the HEP experiments. Consequently, HappyFace needed direct access methods to the grid application and grid service layers in the different HEP grid systems. To cope with this issue, we use a reliable HEP software repository, the CernVM File System. We propose a new implementation and architecture of HappyFace, the so-called grid-enabled HappyFace. It allows the basic framework to connect directly to grid user applications and grid collective services, without involving the monitoring resources in the HEP grid systems. This approach gives HappyFace several advantages: portability, to provide an independent and generic monitoring system among the HEP grid systems; functionality, to allow users to run various diagnostic tools in the individual HEP grid systems and grid sites; and flexibility, to make HappyFace beneficial and open for the various distributed grid computing environments. Different grid-enabled modules, to connect to the Ganga job monitoring system and to check the performance of grid transfers among the grid sites, have been implemented. The new HappyFace system has been successfully integrated, and it now displays the information and status of both the monitoring resources and the direct access to grid user applications and grid collective services.

  7. The Use of Smart phones in Ophthalmology.

    PubMed

    Zvornicanin, Edita; Zvornicanin, Jasmin; Hadziefendic, Bahrudin

    2014-06-01

    Smart phones are being increasingly used among health professionals. Ophthalmological applications are widely available and can turn smart phones into sophisticated medical devices. Smart phones can be useful instruments for the practice of evidence-based medicine, professional education, mobile clinical communication, patient education, disease self-management, and remote patient monitoring, or serve as powerful administrative tools. Several applications are available for different ophthalmological examinations that can assess visual acuity, color vision, astigmatism, pupil size, the Amsler grid test, and more. Smart phones can be useful ophthalmic devices for taking images of the anterior and posterior eye segments. Professional literature and educational material for patients are easily available with the use of smart phones. Smart phones can store large amounts of information and are useful for long-term monitoring, although patient confidentiality requires caution. The use of smart phones, especially as diagnostic tools, is not standardized, and results should be interpreted carefully. The innovative role of smartphone technology and its use in research, education, and information sharing makes smart phones part of the future of ophthalmology and medicine.

  8. Advanced Analysis and Visualization of Space Weather Phenomena

    NASA Astrophysics Data System (ADS)

    Murphy, Joshua J.

    As the world becomes more technologically reliant, society as a whole becomes more susceptible to adverse interactions with the Sun. This "space weather" can produce significant effects on modern technology, from interrupting satellite service to causing serious damage to Earth-side power grids. These concerns have, over the past several years, prompted an outpouring of research in an attempt to understand the processes governing, and to provide a means of forecasting, space weather events. The research presented in this thesis builds on current work aimed at understanding coronal mass ejections (CMEs) and their influence on the evolution of Earth's magnetic field and the associated Van Allen radiation belts. To aid in the analysis of how these solar wind transients affect Earth's magnetic field, a system named the Geospace/Heliosphere Observation & Simulation Tool-kit (GHOSTkit), along with its Python analysis tools, GHOSTpy, has been devised to calculate the adiabatic invariants of trapped particle motion within Earth's magnetic field. These invariants aid scientists in ordering observations of the radiation belts, providing a more natural presentation of data, but can be computationally expensive to calculate. The GHOSTpy system, in the phase presented here, is aimed at providing invariant calculations based on LFM magnetic field simulation data. This research first examines an ideal dipole application to gain understanding of system performance. Following this, the challenges of applying the algorithms to gridded LFM MHD data are examined. Performance profiles are then presented, followed by a real-world application of the system.

  9. Inverse current source density method in two dimensions: inferring neural activation from multielectrode recordings.

    PubMed

    Łęski, Szymon; Pettersen, Klas H; Tunstall, Beth; Einevoll, Gaute T; Gigg, John; Wójcik, Daniel K

    2011-12-01

    The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets.
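
    The core inverse step can be illustrated with a one-dimensional toy: assume the CSD is a sum of known basis sources, build the forward matrix mapping source amplitudes to electrode potentials, and invert it. The authors' toolbox works on 2-D grids with Gaussian source profiles; the conductivity, spacings and source profile below are assumptions for illustration only.

        # Toy inverse CSD: forward-model matrix F, then solve F c = phi for c.
        import numpy as np

        sigma = 0.3                      # S/m, assumed tissue conductivity
        z = np.linspace(0, 1e-3, 8)      # electrode positions (m)
        h = 1e-4                         # source-to-electrode offset (m)

        # Forward matrix: point-source potentials phi_j = sum_i F[j, i] * c[i]
        F = 1.0 / (4 * np.pi * sigma *
                   np.sqrt((z[:, None] - z[None, :])**2 + h**2))

        true_csd = np.exp(-((z - 0.5e-3) / 2e-4)**2)   # synthetic source profile
        phi = F @ true_csd                             # simulated recordings
        est_csd = np.linalg.solve(F, phi)              # inverse CSD estimate

        print(np.allclose(est_csd, true_csd))          # exact for noise-free data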

  10. Spaceflight Operations Services Grid (SOSG)

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.

    2004-01-01

    In an effort to adapt existing space flight operations services to new emerging Grid technologies, we are developing a prototype space flight operations Grid. This prototype is based on the operational services being provided to the International Space Station's payload operations located at the Marshall Space Flight Center, Alabama. The prototype services will be Grid- or Web-enabled and provided to four user communities through portal technology. Users will have the opportunity to assess the value and feasibility of Grid technologies in their specific areas or disciplines. In this presentation, descriptions of the prototype development, User-based services, Grid-based services and the status of the project will be presented. Expected benefits, findings and observations (if any) to date will also be discussed. The focus of the presentation will be on the project in general, status to date and future plans. The end-use services to be included in the prototype are voice, video, telemetry, commanding, collaboration tools and visualization, among others. Security is addressed throughout the project and is being designed into the Grid technologies and standards development. The project is divided into three phases. Phase One establishes the baseline User-based services required for space flight operations listed above. Phase Two involves applying Grid/web technologies to the User-based services and developing portals for access by users. Phase Three will allow NASA and end users to evaluate the services and determine the future of the technology as applied to space flight operational services. Although Phase One, which includes the development of the quasi-operational User-based services of the prototype, will be completed by March 2004, the application of Grid technologies to these services will have just begun. We will provide the status of the Grid technologies for the individual User-based services. This effort will result in an extensible environment that incorporates existing and new spaceflight services into a standards-based framework, providing current and future NASA programs with cost savings and new and evolvable methods to conduct science. This project will demonstrate how the use of new programming paradigms such as web and grid services can provide three significant benefits to the cost-effective delivery of spaceflight services. They will enable applications to operate more efficiently by utilizing pooled resources. They will also permit the reuse of common services to rapidly construct new and more powerful applications. Finally, they will permit easy and secure access to services, via a combination of grid and portal technology, by a distributed user community consisting of NASA operations centers, scientists, the educational community and even the general population as outreach. The approach will be to deploy existing mission support applications such as the Telescience Resource Kit (TReK) and new applications under development, such as the Grid Video Distribution System (GViDS), together with existing grid applications and services such as the high-performance computing and visualization services provided by NASA's Information Power Grid (IPG), in the MSFC's Payload Operations Integration Center (POIC) HOSC Annex. Once the initial applications have been moved to the grid, a process will begin to apply the new programming paradigms to integrate them where possible.
    For example, with GViDS, instead of viewing the distribution service as an application that must run on a single node, the new approach is to build it so that it can be dispatched across a pool of resources in response to dynamic loads. To make this a reality, reusable services will be critical, such as a brokering service to locate appropriate resources within the pool. This brokering service can then be used by other applications such as TReK. To expand further, if the GViDS application is constructed using a services-based model, then other applications such as the Video Auditorium can use GViDS as a service to easily incorporate these video streams into a collaborative conference. Finally, as these applications are re-factored into this new services-based paradigm, the construction of portals to integrate them will be a simple process. As a result, portals can be tailored to meet the requirements of specific user communities.

  11. Personal Constructions of Biological Concepts--The Repertory Grid Approach

    ERIC Educational Resources Information Center

    McCloughlin, Thomas J. J.; Matthews, Philip S. C.

    2017-01-01

    This work discusses repertory grid analysis as a tool for investigating the structures of students' representations of biological concepts. Repertory grid analysis provides the researcher with a variety of techniques that are not associated with standard methods of concept mapping for investigating conceptual structures. It can provide valuable…

  12. GRAPEVINE: Grids about anything by Poisson's equation in a visually interactive networking environment

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Mccann, Karen

    1992-01-01

    A proven 3-D multiple-block elliptic grid generator, designed to run in 'batch mode' on a supercomputer, is improved by the creation of a modern graphical user interface (GUI) running on a workstation. The two parts are connected in real time by a network. The resultant system offers a significant speedup in the process of preparing and formatting input data and the ability to watch the grid solution converge by replotting the grid at each iteration step. The result is a reduction in user time and CPU time required to generate the grid and an enhanced understanding of the elliptic solution process. This software system, called GRAPEVINE, is described, and certain observations are made concerning the creation of such software.
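
    A minimal sketch of the elliptic idea behind such a generator, reduced to 2-D Laplace smoothing (the zero-forcing case of the Poisson system) relaxed with Jacobi sweeps on a toy channel geometry, is given below. The production code solves the full 3-D multiple-block system with control functions; the geometry and iteration counts here are illustrative only.

        # Elliptic grid generation sketch: relax interior nodes toward the
        # solution of Laplace's equation given fixed boundary points.
        import numpy as np

        ni, nj = 21, 11
        x = np.zeros((ni, nj)); y = np.zeros((ni, nj))

        # Boundary: a simple curved channel (illustrative geometry).
        s = np.linspace(0.0, 1.0, ni)
        x[:, 0],  y[:, 0]  = s, 0.1 * np.sin(np.pi * s)        # bottom wall
        x[:, -1], y[:, -1] = s, 1.0 + 0.1 * np.sin(np.pi * s)  # top wall
        t = np.linspace(0.0, 1.0, nj)
        x[0, :],  y[0, :]  = 0.0, t                            # inflow
        x[-1, :], y[-1, :] = 1.0, t                            # outflow

        for it in range(500):   # Jacobi sweeps; replot each step to "watch" it
            x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] +
                                    x[1:-1, 2:] + x[1:-1, :-2])
            y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] +
                                    y[1:-1, 2:] + y[1:-1, :-2])

        # x, y now hold a smooth structured grid over the channel.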

  13. SmaggIce User Guide. 1.0

    NASA Technical Reports Server (NTRS)

    Baez, Marivell; Vickerman, Mary; Choo, Yung

    2000-01-01

    SmaggIce (Surface Modeling And Grid Generation for Iced Airfoils) is one of NASA's aircraft icing research codes developed at the Glenn Research Center. It is a software toolkit used in the process of aerodynamic performance prediction of iced airfoils. It includes tools which complement the 2D grid-based Computational Fluid Dynamics (CFD) process: geometry probing; surface preparation for gridding; and smoothing and re-discretization of geometry. Future releases will also include support for all aspects of gridding: domain decomposition; perimeter discretization; and grid generation and modification.

  14. CAGI: Computer Aided Grid Interface. A work in progress

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.; Yu, Tzu-Yi; Vaughn, David

    1992-01-01

    Progress realized in the development of a Computer Aided Grid Interface (CAGI) software system, integrating CAD/CAM geometric system output and/or Initial Graphics Exchange Specification (IGES) files, geometry manipulations associated with grid generation, and robust grid generation methodologies, is presented. CAGI is being developed in a modular fashion and will offer a fast, efficient and economical response to geometry/grid preparation, allowing basic geometry to be upgraded step by step, interactively and under permanent visual control, while minimizing the differences between the actual hardware surface descriptions and the corresponding numerical analog. The computer code GENIE is used as a basis. The Non-Uniform Rational B-Splines (NURBS) representation of sculptured surfaces is utilized for surface grid redistribution. The computer aided analysis system PATRAN is adapted as a CAD/CAM system. The progress realized in NURBS surface grid generation, the development of an IGES transformer, and geometry adaptation using PATRAN will be presented, along with their applicability to grid generation associated with rocket propulsion applications.
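
    Because the surface handling here rests on NURBS, a compact illustration of the underlying machinery may help: the sketch below evaluates a point on a NURBS curve via the Cox-de Boor recursion (the surface case is the tensor product of two such bases). The example data reproduce the classic circular-arc NURBS and are not taken from CAGI.

        # NURBS curve evaluation via the Cox-de Boor B-spline basis recursion.
        import numpy as np

        def bspline_basis(i, p, u, U):
            """Value of basis function N_{i,p} at parameter u for knot vector U."""
            if p == 0:
                return 1.0 if U[i] <= u < U[i + 1] else 0.0
            left = right = 0.0
            if U[i + p] > U[i]:
                left = ((u - U[i]) / (U[i + p] - U[i])
                        * bspline_basis(i, p - 1, u, U))
            if U[i + p + 1] > U[i + 1]:
                right = ((U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1])
                         * bspline_basis(i + 1, p - 1, u, U))
            return left + right

        def nurbs_point(u, P, w, p, U):
            """Point on a NURBS curve: weighted basis combination of controls P."""
            N = np.array([bspline_basis(i, p, u, U) for i in range(len(P))])
            return (N * w) @ P / (N * w).sum()

        # Quadratic quarter-circle arc (exact with these weights):
        P = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
        w = np.array([1.0, np.sqrt(2) / 2, 1.0])
        U = [0, 0, 0, 1, 1, 1]
        print(nurbs_point(0.5, P, w, 2, U))   # ~[0.7071, 0.7071], on the unit circle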

  15. On the Uses of Full-Scale Schlieren Flow Visualization

    NASA Astrophysics Data System (ADS)

    Settles, G. S.; Miller, J. D.; Dodson-Dreibelbis, L. J.

    2000-11-01

    A lens-and-grid-type schlieren system using a very large grid as a light source was described at earlier APS/DFD meetings. With a field-of-view of 2.3x2.9 m (7.5x9.5 feet), it is the largest indoor schlieren system in the world. Still and video examples of several full-scale airflows and heat-transfer problems visualized thus far will be shown. These include: heating and ventilation airflows, flows due to appliances and equipment, the thermal plumes of people, the aerodynamics of an explosive trace detection portal, gas leak detection, shock wave motion associated with aviation security problems, and heat transfer from live crops. Planned future projects include visualizing fume-hood and grocery display freezer airflows and studying the dispersion of insect repellent plumes at full scale.

  16. Determination and representation of electric charge distributions associated with adverse weather conditions

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    1992-01-01

    Algorithms are presented for determining the size and location of the electric charges which model storm systems and lightning strikes. The analysis utilizes readings from a grid of ground-level field mills, together with geometric constraints on the parameters, to arrive at a representative set of charges. This set is used to generate three-dimensional graphical depictions of the charges as well as contour maps of the ground-level electrical environment over the grid. The composite analytic and graphic package is demonstrated and evaluated using controlled input data and archived data from a storm system. The results demonstrate the package's utility as: an operational tool for appraising adverse weather conditions; a research tool for studies of topics such as storm structure, storm dynamics, and lightning; and a tool for designing and evaluating grid systems.
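
    The inversion the abstract describes can be illustrated as a linear least-squares fit: with charge locations fixed, the ground-level vertical field is linear in the charge magnitudes. The sketch below assumes a conducting-ground image-charge model; the mill positions, charge locations and magnitudes are synthetic, not the archived storm data.

        # Fit point-charge magnitudes to ground-level field mill readings.
        import numpy as np

        eps0 = 8.854e-12
        mills = np.array([[0.0, 0.0], [3e3, 0.0], [0.0, 4e3], [5e3, 5e3]])  # m
        charges_xyz = np.array([[1e3, 2e3, 6e3], [1e3, 2e3, 9e3]])          # m

        def design_matrix(mills, charges):
            """A[j, i]: vertical field at mill j per coulomb of charge i."""
            dx = mills[:, None, 0] - charges[None, :, 0]
            dy = mills[:, None, 1] - charges[None, :, 1]
            h = charges[None, :, 2]
            R3 = (dx**2 + dy**2 + h**2) ** 1.5
            return -2.0 * h / (4 * np.pi * eps0 * R3)  # image charge doubles E_z

        A = design_matrix(mills, charges_xyz)
        true_q = np.array([-20.0, 10.0])               # C, illustrative storm dipole
        readings = A @ true_q                          # simulated mill readings
        q_est, *_ = np.linalg.lstsq(A, readings, rcond=None)
        print(q_est)                                   # recovers true_q exactly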

  17. Haptic over visual information in the distribution of visual attention after tool-use in near and far space.

    PubMed

    Park, George D; Reed, Catherine L

    2015-10-01

    Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice, for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing the tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both the open- and hidden-tool groups showed a tool-end attentional bias (faster RTs toward the tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. The absence of corresponding effects in the short-tool group suggested that the hidden-tool results were specific to haptic input. In conclusion, (1) the allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when it is visually obscured, suggesting that haptic input provides sufficient information for directing attention along the tool.

  18. NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data

    USGS Publications Warehouse

    Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.

    2005-01-01

    NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, JAVA, VISUAL FORTRAN, VISUAL BASIC, VISUAL C++, and DELPHI, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) The data conversion part is designed to convert binary raw data to and from NetCDF data. It can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) The visualization part is designed for displaying grid map series (playing forward or backward) with simple map legend, and displaying temporal trend curves for data on individual map pixels; and 3) The modeling interface is designed for environmental model development by which a set of integrated NetCDF functions is provided for processing NetCDF data. To demonstrate that the NCWin can easily extend the functions of some current GIS software and the Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint for showing NetCDF map animations are given.
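
    For comparison with NCWin's read-and-display workflow, here is a minimal sketch of the same steps in Python using the netCDF4 and matplotlib libraries. The file name and variable name are placeholders, not part of NCWin.

        # Read one time step of a gridded NetCDF variable and display it.
        from netCDF4 import Dataset
        import matplotlib.pyplot as plt

        with Dataset("example.nc") as nc:          # hypothetical file
            field = nc.variables["temperature"]    # hypothetical variable
            frame = field[0, :, :]                 # first time step of the series

        plt.imshow(frame, origin="lower")
        plt.colorbar(label="temperature")
        plt.title("First time step")
        plt.show()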

  19. Cloud computing for energy management in smart grid - an application survey

    NASA Astrophysics Data System (ADS)

    Naveen, P.; Kiing Ing, Wong; Kobina Danquah, Michael; Sidhu, Amandeep S.; Abu-Siada, Ahmed

    2016-03-01

    The smart grid is an emerging energy system wherein information technology, tools and techniques are applied to make the grid run more efficiently. It possesses demand response capacity to help balance electrical consumption with supply. The challenges and opportunities of emerging and future smart grids can be addressed by cloud computing. To focus on these requirements, we provide an in-depth survey of different cloud computing applications for energy management in the smart grid architecture. In this survey, we present an outline of the current state of research on smart grid development. We also propose a model of cloud-based economic power dispatch for the smart grid.
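
    As an illustration of the economic power dispatch problem such a cloud model targets, here is a classic lambda-iteration sketch with quadratic generator costs; the generator data and limits are illustrative, and this is not the authors' cloud-based formulation.

        # Economic dispatch: with costs C_i(P) = a_i + b_i*P + c_i*P^2, equal
        # incremental cost gives P_i = (lam - b_i) / (2*c_i); bisect on lam
        # until total generation meets demand.
        import numpy as np

        b = np.array([2.0, 1.5, 1.8])      # $/MWh linear cost terms
        c = np.array([0.01, 0.02, 0.015])  # $/MW^2h quadratic terms
        p_min, p_max = 10.0, 200.0         # MW limits (same for all units here)
        demand = 350.0                     # MW

        def dispatch(lam):
            return np.clip((lam - b) / (2 * c), p_min, p_max)

        lo, hi = 0.0, 50.0                 # $/MWh bracket for the system lambda
        for _ in range(60):                # bisection on total generation
            lam = 0.5 * (lo + hi)
            if dispatch(lam).sum() < demand:
                lo = lam
            else:
                hi = lam

        print(dispatch(lam), dispatch(lam).sum())   # unit outputs, ~350 MW total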

  20. The BioGRID interaction database: 2013 update.

    PubMed

    Chatr-Aryamontri, Andrew; Breitkreutz, Bobby-Joe; Heinicke, Sven; Boucher, Lorrie; Winter, Andrew; Stark, Chris; Nixon, Julie; Ramage, Lindsay; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Breitkreutz, Ashton; Sellam, Adnane; Chen, Daici; Chang, Christie; Rust, Jennifer; Livstone, Michael; Oughtred, Rose; Dolinski, Kara; Tyers, Mike

    2013-01-01

    The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access archive of genetic and protein interactions that are curated from the primary biomedical literature for all major model organism species. As of September 2012, BioGRID houses more than 500 000 manually annotated interactions from more than 30 model organisms. BioGRID maintains complete curation coverage of the literature for the budding yeast Saccharomyces cerevisiae, the fission yeast Schizosaccharomyces pombe and the model plant Arabidopsis thaliana. A number of themed curation projects in areas of biomedical importance are also supported. BioGRID has established collaborations and/or shares data records for the annotation of interactions and phenotypes with most major model organism databases, including Saccharomyces Genome Database, PomBase, WormBase, FlyBase and The Arabidopsis Information Resource. BioGRID also actively engages with the text-mining community to benchmark and deploy automated tools to expedite curation workflows. BioGRID data are freely accessible through both a user-defined interactive interface and in batch downloads in a wide variety of formats, including PSI-MI2.5 and tab-delimited files. BioGRID records can also be interrogated and analyzed with a series of new bioinformatics tools, which include a post-translational modification viewer, a graphical viewer, a REST service and a Cytoscape plugin.
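
    As an example of programmatic access through the REST service mentioned above, the sketch below queries the BioGRID web service from Python. The endpoint, parameter names and record fields follow BioGRID's public documentation as best we can tell, but should be checked against the current docs; the access key is a placeholder that must be requested from BioGRID.

        # Query interactions for a gene via the BioGRID REST service (sketch).
        import requests

        resp = requests.get(
            "https://webservice.thebiogrid.org/interactions/",
            params={
                "accessKey": "YOUR_KEY_HERE",   # placeholder; obtain from BioGRID
                "geneList": "MDM2",
                "searchNames": "true",
                "format": "json",
                "max": 10,
            },
            timeout=30,
        )
        for rec in resp.json().values():
            print(rec["OFFICIAL_SYMBOL_A"], "-", rec["OFFICIAL_SYMBOL_B"])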

  1. On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Ibraheem, S. O.; Demuren, A. O.

    1994-01-01

    A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving Euler and Navier-Stokes equations. Model problems based on the convection, diffusion and Burgers' equations are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind difference based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central difference based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.
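
    For reference, the baseline being compared against, the smoothing factor from conventional single-grid von Neumann analysis, can be written in standard multigrid notation (a textbook formulation, not reproduced from the paper) as

        \mu = \max_{\theta \in \Theta_{\mathrm{high}}} \lvert g(\theta) \rvert,
        \qquad
        \Theta_{\mathrm{high}} = \left\{ \theta : \tfrac{\pi}{2} \le \lvert \theta \rvert \le \pi \right\},

    where g(\theta) is the amplification factor of the relaxation scheme acting on the Fourier mode e^{i\theta x/h}. Bi-grid (two-grid) analysis instead examines the spectral radius of the combined smoothing and coarse-grid-correction operator, which couples each low-frequency mode with its aliasing high-frequency harmonic; this coupling is why it tracks observed multigrid convergence rates more closely.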

  2. A Variable Resolution Stretched Grid General Circulation Model: Regional Climate Simulation

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.; Takacs, Lawrence L.; Govindaraju, Ravi C.; Suarez, Max J.

    2000-01-01

    The development of, and results obtained with, a variable-resolution stretched-grid GCM for the regional climate simulation mode are presented. The global variable-resolution stretched grid used in the study has enhanced horizontal resolution over the U.S. as the area of interest. The stretched-grid approach is an ideal tool for representing regional-to-global scale interactions. It is an alternative to the widely used nested-grid approach introduced over a decade ago as a pioneering step in regional climate modeling. The major results of the study are presented for the successful stretched-grid GCM simulation of the anomalous climate event of the 1988 U.S. summer drought. The straightforward (with no updates) two-month simulation is performed with 60 km regional resolution. The major drought fields, patterns and characteristics, such as the time-averaged 500 hPa heights, precipitation and the low-level jet over the drought area, appear to be close to the verifying analyses for the stretched-grid simulation. In other words, the stretched-grid GCM provides an efficient downscaling over the area of interest with enhanced horizontal resolution. It is also shown that the GCM skill is sustained throughout the simulation when extended to one year. The stretched-grid GCM, developed and tested in a simulation mode, is a viable tool for regional and subregional climate studies and applications.

  3. GES DAAC HDF Data Processing and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Cho, S.; Johnson, J.; Li, J.; Liu, Z.; Lu, L.; Pollack, N.; Qin, J.; Savtchenko, A.; Teng, B.

    2002-12-01

    The Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) plays a major role in enabling basic scientific research and providing access to scientific data to the general user community. Several GES DAAC Data Support Teams provide expert assistance to users in accessing data, including information on visualization tools and documentation for data products. To provide easy access to the science data, the data support teams have additionally developed many online and desktop tools for data processing and visualization. This presentation is an overview of major HDF tools implemented at the GES DAAC and aimed at optimizing access to EOS data for the Earth Sciences community. GES DAAC ONLINE TOOLS: MODIS and AIRS on-demand Channel/Variable Subsetter are web-based, on-the-fly/on-demand subsetters that perform channel/variable subsetting and restructuring for Level 1B and Level 2 data products. Users can specify criteria to subset data files with desired channels and variables and then download the subsetted file. AIRS QuickLook is a CGI/IDL combo package that allows users to view AIRS/HSB/AMSU Level-1B data online by specifying a channel prior to obtaining data. A global map is also provided along with the image to show geographic coverage of the granule and flight direction of the spacecraft. OASIS (Online data AnalySIS) is an IDL-based HTML/CGI interface for search, selection, and simple analysis of earth science data. It supports binary and GRIB formatted data, such as TOVS, Data Assimilation products, and some NCEP operational products. TRMM Online Analysis System is designed for quick exploration, analyses, and visualization of TRMM Level-3 and other precipitation products. The products consist of the daily (3B42), monthly (3B43), near-real-time (3B42RT), and Willmott's climate data. The system is also designed to be simple and easy to use - users can plot the average or accumulated rainfall over their region of interest for a given time period, or plot the time series of regional rainfall average. WebGIS is an online web application that implements the Open GIS Consortium (OGC) standards for mapping requests and rendering. It allows users access to TRMM, MODIS, SeaWiFS, and AVHRR data from several DAAC map servers, as well as externally served data such as political boundaries, population centers, lakes, rivers, and elevation. GES DAAC DESKTOP TOOLS: HDFLook-MODIS is a new, multifunctional, data processing and visualization tool for Radiometric and Geolocation, Atmosphere, Ocean, and Land MODIS HDF-EOS data. Features include (1) accessing and visualization of all swath (Levels 1 and 2) MODIS and AIRS products, and gridded (Levels 3 and 4) MODIS products; (2) re-mapping of swath data to world map; (3) geo-projection conversion; (4) interactive and batch mode capabilities; (5) subsetting and multi-granule processing; and (6) data conversion. SIMAP is an IDL-based script that is designed to read and map MODIS Level 1B (L1B) and Level 2 (L2) Ocean and Atmosphere products. It is a non-interactive, command line executed tool. The resulting maps are scaled to physical units (e.g., radiances, concentrations, brightness temperatures) and saved in binary files. TRMM HDF (in C and Fortran) reads in TRMM HDF data files and writes out user-selected SDS arrays and Vdata tables as separate flat binary files.

  4. Correlation of macular structure and function in a boy with primary foveomacular retinitis and sequence of changes over 5 years.

    PubMed

    Badhani, Anurag; Padhi, Tapas Ranjan; Panda, Gopal Krishna; Mukherjee, Sujoy; Das, Taraprasad; Jalali, Subhadra

    2017-08-01

    To describe the clinical characteristics, macular structure and function, and to document sequential changes over 5 years in a 10-year-old boy with bilateral primary foveomacular retinitis. A 10-year-old boy presented with sudden-onset scotoma in both eyes, experienced after getting up from bed on a non-eclipse day. He persistently denied direct sun-gazing. He had no significant systemic illness and was not using any medications. In addition to a detailed examination at presentation that included fundus fluorescein angiogram (FFA), electroretinogram (ERG), pattern ERG and electrooculogram (EOG), he was examined periodically for 5 years with Humphrey visual field (HVF), spectral domain optical coherence tomogram (SDOCT), Amsler grid charting and multifocal ERG. The macular structure and functions were analyzed over the years and correlated with the symptoms. All findings were bilaterally symmetrical at each visit. At presentation, his corrected visual acuity was 20/25, with a subfoveal yellow dot similar to solar retinopathy, a central scotoma with reduced foveal threshold on HVF 24-2, micropsia on the Amsler grid, two missed plates on the Ishihara color vision chart, a transfoveal full-thickness hyper-reflective band on SDOCT, an unremarkable FFA and a normal foveal peak on mfERG. The flash ERG and EOG were unremarkable. A month later, his VA improved to 20/20; he had a relative scotoma on the Amsler grid, no scotoma on HVF (10-2), restoration of the inner segment of the photoreceptors with a sharp defect involving the ellipsoid and photoreceptor interdigitation zones on SDOCT, and blunting of foveal peaks on mfERG. Three months later, his corrected VA was 20/20 with a relative scotoma on the Amsler grid, normal color vision, no scotoma on HVF 10-2 and unchanged SDOCT findings. In subsequent examinations at 6, 9, 14, 29, 39 and 60 months, he was symptomless with VA 20/20, an unremarkable fundus, normal Amsler grid and HVF (normal foveal threshold), and unchanged SDOCT findings; the reduced foveal peaks on mfERG in both eyes normalized only at 60 months. Presented here is a case of bilaterally symmetrical idiopathic foveomacular retinitis that had a clinical appearance similar to solar retinopathy. The fundus changes persisted for 4 weeks, the symptoms and changes on the Amsler grid lasted for 3 months, and the foveal threshold in visual fields normalized within 3 months. Maximum change in the SDOCT defect occurred within a month, and the extrafoveal defect in the ellipsoid and photoreceptor interdigitation line persisted despite resolution of symptoms, resolution of the visual field defect and normal distance vision. Probably, the foveal lesion detected on SDOCT was too small to cause a reduction in the distance visual acuity or to show up in the visual field and mfERG later.

  5. Developing and Delivering National-Scale Gridded Phenology Data Products

    NASA Astrophysics Data System (ADS)

    Marsh, L.; Crimmins, M.; Crimmins, T. M.; Gerst, K.; Rosemartin, A.; Switzer, J.; Weltzin, J. F.

    2016-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) is now producing and freely delivering daily maps and short-term forecasts of accumulated growing degree days and spring onset dates (based on the Extended Spring Indices) at fine spatial scale for the conterminous United States. These data products have utility for a wide range of natural resource planning and management applications, including scheduling invasive species and pest detection and control activities, determining planting dates, anticipating allergy outbreaks and planning agricultural harvest dates. Accumulated growing degree day (AGDD) maps were selected because accumulated temperature is a strong driver of phenological transitions in plants and animals, including leaf-out, flowering, fruit ripening and migration. The Extended Spring Indices (SI-x) are based on predictive climate models for lilac and honeysuckle leaf and bloom; they have been widely used to summarize changes in the timing of spring onset. The SI-x is used as a national indicator of climate change impacts by the US Global Change Research Program and the Environmental Protection Agency. The USA-NPN is a national-scale program that supports scientific advancement and decision-making by collecting, storing, and sharing phenology data and information. To best serve various audiences, the AGDD and SI-x gridded maps are available in various formats through a range of access tools, including the USA-NPN online visualization tool as well as industry standards compliant web services. We plan to expand the suite of gridded map products offered by the USA-NPN to include predictive maps of phenological transitions for additional plant and animal species at fine spatial and temporal resolution in the near future. USA-NPN invites you to use freely available daily and short-term forecast maps of accumulated growing degree days and spring onset dates at fine spatial scale for the conterminous United States.
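
    The accumulated growing degree day layer can be reproduced in a few lines. The sketch below uses the common simple-average GDD formula with an assumed base temperature; USA-NPN publishes layers for specific base temperatures, and the input temperatures here are synthetic.

        # Daily and accumulated growing degree days (GDD/AGDD), simple-average form.
        import numpy as np

        t_max = np.array([10.0, 14.0, 18.0, 21.0, 25.0])   # daily highs, deg C
        t_min = np.array([2.0, 4.0, 7.0, 10.0, 12.0])      # daily lows, deg C
        t_base = 10.0                                      # assumed base temperature

        gdd = np.maximum((t_max + t_min) / 2.0 - t_base, 0.0)  # daily GDD
        agdd = np.cumsum(gdd)                                  # accumulated GDD
        print(agdd)   # the running total behind the daily map layers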

  6. Towards an Analysis of Visual Images in School Science Textbooks and Press Articles about Science and Technology

    NASA Astrophysics Data System (ADS)

    Dimopoulos, Kostas; Koulaidis, Vasilis; Sklaveniti, Spyridoula

    2003-04-01

    This paper aims at presenting the application of a grid for the analysis of the pedagogic functions of visual images included in school science textbooks and daily press articles about science and technology. The analysis is made using the dimensions of content specialisation (classification) and social-pedagogic relationships (framing) promoted by the images as well as the elaboration and abstraction of the corresponding visual code (formality), thus combining pedagogical and socio-semiotic perspectives. The grid is applied to the analysis of 2819 visual images collected from school science textbooks and another 1630 visual images additionally collected from the press. The results show that the science textbooks in comparison to the press material: a) use ten times more images, b) use more images so as to familiarise their readers with the specialised techno-scientific content and codes, and c) tend to create a sense of higher empowerment for their readers by using the visual mode. Furthermore, as the educational level of the school science textbooks (i.e., from primary to lower secondary level) rises, the content specialisation projected by the visual images and the elaboration and abstraction of the corresponding visual code also increases. The above results have implications for the terms and conditions for the effective exploitation of visual material as the educational level rises as well as for the effective incorporation of visual images from press material into science classes.

  7. Electricity Capacity Expansion Modeling, Analysis, and Visualization: A Summary of High-Renewable Modeling Experiences (Chinese Translation) (in Chinese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, Nate; Zhou, Ella; Getman, Dan

    2015-10-01

    This is the Chinese translation of NREL/TP-6A20-64831. Mathematical and computational models are widely used for the analysis and design of both physical and financial systems. Modeling the electric grid is of particular importance to China for three reasons. First, power-sector assets are expensive and long-lived, and they are critical to any country's development. China's electric load, transmission, and other energy-related infrastructure are expected to continue to grow rapidly; therefore it is crucial to understand and help plan for the future in which those assets will operate. Second, China has dramatically increased its deployment of renewable energy (RE), and is likely to continue further accelerating such deployment over the coming decades. Careful planning and assessment of the various aspects (technical, economic, social, and political) of integrating a large amount of renewables on the grid is required. Third, companies need the tools to develop a strategy for their own involvement in the power market China is now developing, and to enable a possible transition to an efficient and high-RE future.

  8. Intelligent Patching of Conceptual Geometry for CFD Analysis

    NASA Technical Reports Server (NTRS)

    Li, Wu

    2010-01-01

    The iPatch computer code for intelligently patching surface grids was developed to convert conceptual geometry to computational fluid dynamics (CFD) geometry. It automatically uses bicubic B-splines to extrapolate (if necessary) each surface in a conceptual geometry so that all the independently defined geometric components (such as wing and fuselage) can be intersected to form a watertight CFD geometry. The software also computes the intersection curves of surface patches at any resolution (up to 10^-4 accuracy) specified by the user, and it writes the B-spline surface patches, and the corresponding boundary points, for the watertight CFD geometry in a format that can be directly used by the grid generation tool VGRID. iPatch requires that input geometry be in PLOT3D format, where each component surface is defined by a rectangular grid {(x(i,j), y(i,j), z(i,j)) : 1 ≤ i ≤ m, 1 ≤ j ≤ n} that represents a smooth B-spline surface. All surfaces in the PLOT3D file conceptually represent a watertight geometry of components of an aircraft on the half-space y ≥ 0. Overlapping surfaces are not allowed, but can be fixed by the utility code "fixp3d". The fixp3d utility code first finds the two grid lines on the two surface grids that are closest to each other in Hausdorff distance (a metric measuring the discrepancy between two sets); it then uses one of the grid lines as the transition line, extending grid lines from one grid to the other to form a merged grid. Any two connecting surfaces shall have a "visually" common boundary curve, or can be described by an intersection relationship defined in a geometry specification file. The intersection of two surfaces can be at a conceptual level. However, the intersection is directional (along either the i or j index direction), and each intersecting grid line (or its spline extrapolation) on the first surface should intersect the second surface. No two intersection relationships will result in a common intersection point of three surfaces. The output files of iPatch are IGES, d3m, and mapbc files that define the CFD geometry in VGRID format. The IGES file gives the NURBS definition of the outer mold line of the geometry. The d3m file defines how the outer mold line is broken into surface patches whose boundary curves are defined by points. The mapbc file specifies the boundary condition on each patch and the corresponding NURBS surface definition of each non-planar patch in the IGES file.
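
    The grid-line matching step described for fixp3d rests on the Hausdorff distance between two point sequences. A minimal sketch, with illustrative point data rather than iPatch code, is:

        # Symmetric Hausdorff distance between two grid lines (point sets).
        import numpy as np

        def hausdorff(a, b):
            """Max over both directions of the min point-to-set distance."""
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
            return max(d.min(axis=1).max(), d.min(axis=0).max())

        line1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
        line2 = np.array([[0.0, 0.1, 0.0], [1.0, 0.1, 0.0], [2.0, 0.2, 0.0]])
        print(hausdorff(line1, line2))   # 0.2: the larger directed distance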

  9. Finite-difference time-domain modelling of through-the-Earth radio signal propagation

    NASA Astrophysics Data System (ADS)

    Ralchenko, M.; Svilans, M.; Samson, C.; Roper, M.

    2015-12-01

    This research seeks to extend the knowledge of how a very low frequency (VLF) through-the-Earth (TTE) radio signal behaves as it propagates underground, by calculating and visualizing the strength of the electric and magnetic fields for an arbitrary geology through numeric modelling. To achieve this objective, a new software tool has been developed using the finite-difference time-domain method. This technique is particularly well suited to visualizing the distribution of electromagnetic fields in an arbitrary geology. The frequency range of TTE radio (400-9000 Hz) and geometrical scales involved (1 m resolution for domains a few hundred metres in size) involves processing a grid composed of millions of cells for thousands of time steps, which is computationally expensive. Graphics processing unit acceleration was used to reduce execution time from days and weeks, to minutes and hours. Results from the new modelling tool were compared to three cases for which an analytic solution is known. Two more case studies were done featuring complex geologic environments relevant to TTE communications that cannot be solved analytically. There was good agreement between numeric and analytic results. Deviations were likely caused by numeric artifacts from the model boundaries; however, in a TTE application in field conditions, the uncertainty in the conductivity of the various geologic formations will greatly outweigh these small numeric errors.
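
    To make the method concrete, here is a minimal one-dimensional FDTD sketch in free space. The production tool is three-dimensional, models conductive geology, and is GPU-accelerated, none of which is shown here; the grid size, source and step counts are illustrative.

        # 1-D FDTD (Yee scheme): staggered leapfrog updates of E and H.
        import numpy as np

        c0 = 3.0e8                        # m/s, free-space light speed
        eps0, mu0 = 8.854e-12, 4e-7 * np.pi
        dz = 1.0                          # m, cell size
        dt = dz / (2.0 * c0)              # Courant number 0.5, stable
        n = 400
        Ex = np.zeros(n)
        Hy = np.zeros(n)

        for step in range(600):
            # H sits half a cell and half a step away from E on the Yee grid.
            Hy[:-1] += dt / (mu0 * dz) * (Ex[1:] - Ex[:-1])
            Ex[1:] += dt / (eps0 * dz) * (Hy[1:] - Hy[:-1])
            Ex[n // 4] += np.exp(-((step - 60.0) / 20.0) ** 2)  # soft Gaussian source

        print(np.abs(Ex).max())           # field amplitude after propagation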

  10. Subsurface data visualization in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Krijnen, Robbert; Smelik, Ruben; Appleton, Rick; van Maanen, Peter-Paul

    2017-04-01

    Due to their increasing complexity and size, visualization of geological data is becoming more and more important. It enables detailed examination and review of large volumes of geological data, and it is often used as a communication tool for reporting and education to demonstrate the importance of the geology to policy makers. In the Netherlands, two types of nation-wide geological models are available: 1) layer-based models, in which the subsurface is represented by a series of tops and bases of geological or hydrogeological units, and 2) voxel models, in which the subsurface is subdivided into a regular grid of voxels that can contain different properties per voxel. The Geological Survey of the Netherlands (GSN) provides an interactive web portal that delivers maps and vertical cross-sections of such layer-based and voxel models. From this portal you can download a 3D subsurface viewer that can visualize the voxel model data of an area of 20 × 25 km with 100 × 100 × 5 meter voxel resolution on a desktop computer. Virtual Reality (VR) technology enables us to enhance the visualization of this volumetric data in a more natural way than a standard desktop, keyboard and mouse setup. The use of VR for data visualization is not new, but recent developments have made expensive hardware and complex setups unnecessary. The availability of consumer off-the-shelf VR hardware enabled us to create a new, intuitive and low-cost visualization tool. A VR viewer has been implemented using the HTC Vive headset and allows visualization and analysis of the GSN voxel model data with geological or hydrogeological units. The user can navigate freely around the voxel data (20 × 25 km), which is presented in a virtual room at a scale of 2 × 2 or 3 × 3 meters. To enable analysis, e.g. of hydraulic conductivity, the user can select filters to remove specific hydrogeological units. The user can also use slicing to cut off specific sections of the voxel data to get a closer look. This slicing can be done in any direction using a 'virtual knife'. Future plans are to further improve performance from 30 up to a 90 Hz update rate to reduce possible motion sickness, and to add more advanced filtering capabilities as well as a multi-user setup, annotation capabilities and visualization of historical data.

  11. Enabling Reanalysis Intercomparison with the CREATE-IP and CREATE-V Projects

    NASA Astrophysics Data System (ADS)

    Carriere, L.; Potter, G. L.; Hertz, J.; Shen, Y.; Britzolakis, G.; Peters, J.; Maxwell, T. P.; Li, J.; Strong, S.; Schnase, J. L.

    2016-12-01

    NASA Goddard Space Flight Center's Office of Computational and Information Sciences and Technology, the NASA Center for Climate Simulation (NCCS), and the Earth System Grid Federation (ESGF) are working together to build a uniform environment for the comparative study and use of a group of reanalysis datasets of particular importance to the research community. This effort is called the Collaborative REAnalysis Technical Environment (CREATE), and it contains two components: the CREATE-Intercomparison Project (CREATE-IP) and CREATE-V. For CREATE-IP, our target reanalyses include ECMWF ERA-Interim, NASA/GMAO MERRA and MERRA2, NOAA/NCEP CFSR, NOAA/ESRL 20CR and 20CRv2, and JMA JRA25 and JRA55. Each dataset is reformatted similarly to the models in the CMIP5 archive. Repackaging the reanalysis data into a common structure and format simplifies access, subsetting, and reanalysis comparison. Both monthly average data and a selection of high-frequency data (6-hr) relevant to investigations such as the 2016 El Niño are provided. Much of the processing workflow has been automated, and new data appear on a regular basis. In collaboration with the CLIVAR Global Synthesis and Observations Panel (GSOP), we are also processing and publishing eight ocean reanalyses, from 1980 to the present. Here, the data are regridded to a common 1° x 1° grid and vertically interpolated to the World Ocean Atlas 09 (WOA09) depths, and an ensemble is generated. CREATE-V is a web-based visualization tool that allows the user to simultaneously view four reanalyses to facilitate comparison. The addition of a backend analytics engine, based on UV-CDAT and Scala, provides the ability to generate a time series and anomaly for any given location on a map. The system enables scientists to identify data of interest and to visualize, subset, and compare data without the need to download large volumes of data for local visualization.
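
    As an illustration of the kind of point time-series-and-anomaly computation the CREATE-V backend exposes, here is a minimal sketch with xarray. The file name, variable name and coordinates are placeholders for a CMIP5-formatted reanalysis file, and this is not the UV-CDAT/Scala engine itself.

        # Anomaly time series at a grid point: subtract the monthly climatology.
        import xarray as xr

        ds = xr.open_dataset("reanalysis_tas.nc")           # hypothetical file
        series = ds["tas"].sel(lat=40.0, lon=255.0, method="nearest")

        clim = series.groupby("time.month").mean("time")    # monthly climatology
        anom = series.groupby("time.month") - clim          # anomaly time series
        print(anom.isel(time=slice(0, 12)).values)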

  12. GeoMapApp, Virtual Ocean, and other Free Data Resources for the 21st Century Classroom

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Ryan, W.; Carbotte, S.; Melkonian, A.; Coplan, J.; Arko, R.; Ferrini, V.; O'Hara, S.; Leung, A.; Bonckzowski, J.

    2008-12-01

    With funding from the U.S. National Science Foundation, the Marine Geoscience Data System (MGDS) (http://www.marine-geo.org/) is developing GeoMapApp (http://www.geomapapp.org) - a computer application that provides wide-ranging map-based visualization and manipulation options for interdisciplinary geosciences research and education. The novelty comes from the use of this visual tool to discover and explore data, with seamless links to further discovery using traditional text-based approaches. Users can generate custom maps and grids and import their own data sets. Built-in functionality allows users to readily explore a broad suite of interactive data sets and interfaces. Examples include multi-resolution global digital models of topography, gravity, sediment thickness, and crustal ages; rock, fluid, biology and sediment sample information; research cruise underway geophysical and multibeam data; earthquake events; submersible dive photos of hydrothermal vents; geochemical analyses; DSDP/ODP core logs; seismic reflection profiles; contouring, shading, profiling of grids; and many more. On-line audio-visual tutorials lead users step-by-step through GeoMapApp functionality (http://www.geomapapp.org/tutorials/). Virtual Ocean (http://www.virtualocean.org/) integrates GeoMapApp with a 3-D earth browser based upon NASA WorldWind, providing yet more powerful capabilities. The searchable MGDS Media Bank (http://media.marine-geo.org/) supports viewing of remarkable images and video from the NSF Ridge 2000 and MARGINS programs. For users familiar with Google Earth (tm), KML files are available for viewing several MGDS data sets (http://www.marine-geo.org/education/kmls.php). Examples of accessing and manipulating a range of geoscience data sets from various NSF-funded programs will be shown. GeoMapApp, Virtual Ocean, the MGDS Media Bank and KML files are free MGDS data resources and work on any type of computer. They are currently used by educators, researchers, school teachers and the general public.

  13. Measurement of velocity distribution and turbulence in a special wind tunnel using a laser Doppler velocimeter

    NASA Astrophysics Data System (ADS)

    Mueller, J.; Petersen, J. C.; Pilz, E.; Wiegand, H.

    1981-06-01

    The flow behavior in a jet-mixing visualization chamber for turbulent fuel spray mixing with air under compression, e.g., at top dead center in diesel engines, was investigated with a laser Doppler velocimeter. The measurements were performed in two cuts of the profile perpendicular to the flow direction. The range of flow conditions in the measuring chamber was tested. The measurements were conducted with and without turbulence grids and shear flow grids behind the inlet nozzle. Wire grids did not enhance the turbulence in the measuring chamber. One of the tested shear flow grids produced shear flow as expected. A turbulence grid whose design was based on experimental results produced a turbulence intensity of up to 30% over the whole measuring cross section.

  14. Systems Engineering Building Advances Power Grid Research

    ScienceCinema

    Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob

    2018-01-16

    Researchers and industry are now better equipped to tackle the nation’s most pressing energy challenges through PNNL’s new Systems Engineering Building – including challenges in grid modernization, buildings efficiency and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.

  15. Designing for Wide-Area Situation Awareness in Future Power Grid Operations

    NASA Astrophysics Data System (ADS)

    Tran, Fiona F.

    Power grid operation uncertainty and complexity continue to increase with the rise of electricity market deregulation, renewable generation, and interconnectedness between multiple jurisdictions. Human operators need appropriate wide-area visualizations to help them monitor system status to ensure reliable operation of the interconnected power grid. We observed transmission operations at a control centre, conducted critical incident interviews, and led focus group sessions with operators. The results informed a Work Domain Analysis of power grid operations, which in turn informed an Ecological Interface Design concept for wide-area monitoring. I validated design concepts through tabletop discussions and a usability evaluation with operators, earning a mean System Usability Scale score of 77 out of 90. The design concepts aim to support an operator's complete and accurate understanding of the power grid state, which operators increasingly require due to the critical nature of power grid infrastructure and growing sources of system uncertainty.

  16. PNNL Future Power Grid Initiative-developed GridOPTICS Software System (GOSS)

    ScienceCinema

    None

    2018-01-16

    The power grid is changing and evolving. One aspect of this change is the growing use of smart meters and other devices, which are producing large volumes of useful data. However, in many cases, the data can’t be translated quickly into actionable guidance to improve grid performance. There's a need for innovative tools. The GridOPTICS(TM) Software System, or GOSS, developed through PNNL's Future Power Grid Initiative, is open source and became publicly available in spring 2014. The value of this middleware is that it easily integrates grid applications with sources of data and facilitates communication between them. Such a capability provides a foundation for developing a range of applications to improve grid management.

  17. Modeling and Grid Generation of Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Hackenberg, Anthony W.; Pennline, James A.; Schilling, Herbert W.

    2007-01-01

    SmaggIce Version 2.0 is a software toolkit for geometric modeling and grid generation for two-dimensional, single- and multi-element, clean and iced airfoils. A previous version of SmaggIce was described in Preparing and Analyzing Iced Airfoils, NASA Tech Briefs, Vol. 28, No. 8 (August 2004), page 32. To recapitulate: Ice shapes make it difficult to generate quality grids around airfoils, yet these grids are essential for predicting ice-induced complex flow. This software efficiently creates high-quality structured grids with tools that are uniquely tailored for various ice shapes. SmaggIce Version 2.0 significantly enhances the previous version primarily by adding the capability to generate grids for multi-element airfoils. This version of the software is an important step in streamlining the aeronautical analysis of iced airfoils using computational fluid dynamics (CFD) tools. The user may prepare the ice shape, define the flow domain, decompose it into blocks, generate grids, modify/divide/merge blocks, and control grid density and smoothness. All these steps may be performed efficiently even for the difficult glaze and rime ice shapes. Providing the means to generate highly controlled grids near rough ice, the software includes the creation of a wrap-around block (called the "viscous sublayer block"), which is a thin, C-type block around the wake line and iced airfoil. For multi-element airfoils, the software makes use of grids that wrap around and fill in the areas between the viscous sublayer blocks for all elements that make up the airfoil. A scripting feature records the history of interactive steps, which can be edited and replayed later to produce other grids. Using this version of SmaggIce, ice shape handling and grid generation can become a practical engineering process, rather than a laborious research effort.

  18. Quantification of myocardial fibrosis by digital image analysis and interactive stereology

    PubMed Central

    2014-01-01

    Background Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist’s visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist’s visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Methods Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson’s trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist’s visual score. Results A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r > 0.9, p < 0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Conclusion Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist’s visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193 PMID:24912374

  19. Quantification of myocardial fibrosis by digital image analysis and interactive stereology.

    PubMed

    Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas

    2014-06-09

    Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r>0.9, p<0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist's visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193.
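
    For readers unfamiliar with the reference method, the sketch below illustrates stereology grid point counting on a binary fibrosis mask: overlay a regular point grid and take the fraction of grid points hitting fibrosis as the area-fraction estimate. The mask and grid spacing are synthetic illustrations, not study data.

        # Point-count area fraction on a binary mask (stereology grid count).
        import numpy as np

        rng = np.random.default_rng(0)
        mask = rng.random((1200, 1600)) < 0.14     # stand-in for a segmented slide

        step = 50                                  # grid spacing in pixels
        hits = mask[::step, ::step]                # grid points falling on fibrosis
        print(hits.mean())                         # point-count area fraction, ~0.14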

  20. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  1. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  2. A grid for a precise analysis of daily activities.

    PubMed

    Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E

    2010-01-01

    Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.

  3. Sustainable Data Evolution Technology for Power Grid Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The SDET Tool is used to create open-access power grid data sets and facilitate updates of these data sets by the community. Pacific Northwest National Laboratory (PNNL) and its power industry and software vendor partners are developing an innovative sustainable data evolution technology (SDET) to create open-access power grid datasets and facilitate updates to these datasets by the power grid community. The objective is to make this a sustained effort within and beyond the ARPA-E GRID DATA program so that the datasets can evolve over time and meet the current and future needs for power grid optimization and potentially other applications in power grid operation and planning.

  4. Understanding the Role of Hemodynamics in the Initiation, Progression, Rupture, and Treatment Outcome of Cerebral Aneurysm from Medical Image-Based Computational Studies

    PubMed Central

    Castro, Marcelo A.

    2013-01-01

    About a decade ago, the first image-based computational hemodynamic studies of cerebral aneurysms were presented. Their potential for clinical applications was the result of the right combination of medical image processing, vascular reconstruction, and grid generation techniques used to reconstruct personalized domains for computational fluid and solid dynamics solvers, together with data analysis and visualization techniques. A considerable number of studies have captivated the attention of clinicians, neurosurgeons, and neuroradiologists, who realized the ability of those tools to help in understanding the role played by hemodynamics in the natural history and management of intracranial aneurysms. This paper intends to summarize the most relevant results reported in the field in recent years. PMID:24967285

  5. IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.

    This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D™). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price-responsive load scenarios.
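
    As a rough, hedged sketch of what an MPI-based hierarchical co-simulation loop of this kind can look like (this is not IGMS code; all model functions and numbers are placeholders), consider the following mpi4py fragment, in which rank 0 stands in for the bulk-system/ISO model and the remaining ranks stand in for distribution-feeder simulators:

      # Hedged sketch of a hierarchical transmission-distribution co-simulation
      # loop: per time step, feeders report loads up to the bulk model, and the
      # bulk model broadcasts boundary conditions back down. Not IGMS code.
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      def solve_bulk_system(feeder_loads):
          # Placeholder for a wholesale-market / AC power-flow step.
          return {'voltage_pu': 1.0 - 0.001 * sum(feeder_loads)}

      def solve_feeder(boundary, t):
          # Placeholder for one distribution simulator instance.
          return 1.5 + 0.1 * t * boundary['voltage_pu']   # MW demand

      boundary = {'voltage_pu': 1.0}
      for t in range(24):                          # 24 co-simulation steps
          load = 0.0 if rank == 0 else solve_feeder(boundary, t)
          loads = comm.gather(load, root=0)        # feeders -> bulk model
          if rank == 0:
              boundary = solve_bulk_system(loads[1:])
          boundary = comm.bcast(boundary, root=0)  # bulk model -> feeders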

  6. Smart Grid Interoperability Maturity Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Levinson, Alex; Mater, J.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  7. Reservoir property grids improve with geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogt, J.

    1993-09-01

    Visualization software, reservoir simulators, and many other E&P software applications need reservoir property grids as input. Using geostatistics, as compared to other gridding methods, to produce these grids leads to the best output from the software programs. For the purposes discussed here, geostatistics amounts to two types of gridding methods. Mathematically, these methods are based on minimizing or duplicating certain statistical properties of the input data. One geostatistical method, called kriging, is used when the highest possible point-by-point accuracy is desired. The other method, called conditional simulation, is used when one wants the statistics and texture of the resulting grid to be the same as those of the input data. In the following discussion, each method is explained, compared to other gridding methods, and illustrated through example applications. Proper use of geostatistical data in flow simulations, use of geostatistical data for history matching, and situations where geostatistics has no significant advantage over other methods will also be covered.
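
    The point-estimation flavour of kriging mentioned above can be illustrated with a minimal ordinary-kriging sketch; the exponential variogram model and its parameters below are assumed for demonstration rather than fitted to data:

      # Minimal 1D ordinary-kriging sketch with an assumed exponential
      # variogram; sample locations and values are invented.
      import numpy as np

      def variogram(h, sill=1.0, rng=500.0):
          # Exponential variogram model; 'rng' is the practical range.
          return sill * (1.0 - np.exp(-3.0 * h / rng))

      x = np.array([100.0, 250.0, 400.0, 800.0])   # sample locations (m)
      z = np.array([12.0, 15.5, 14.2, 9.8])        # sample values (e.g., porosity %)

      def krige(x0):
          n = len(x)
          A = np.ones((n + 1, n + 1))
          A[:n, :n] = variogram(np.abs(x[:, None] - x[None, :]))
          A[n, n] = 0.0                            # Lagrange-multiplier row/column
          b = np.append(variogram(np.abs(x - x0)), 1.0)
          w = np.linalg.solve(A, b)                # weights sum to 1
          return w[:n] @ z                         # kriged estimate at x0

      grid = np.linspace(0, 900, 10)
      print([round(krige(g), 2) for g in grid])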

  8. Comprehension of Navigation Directions

    NASA Technical Reports Server (NTRS)

    Schneider, Vivian I.; Healy, Alice F.

    2000-01-01

    In an experiment simulating communication between air traffic controllers and pilots, subjects were given navigation instructions varying in length telling them to move in a space represented by grids on a computer screen. The subjects followed the instructions by clicking on the grids in the locations specified. Half of the subjects read the instructions, and half heard them. Half of the subjects in each modality condition repeated back the instructions before following them, and half did not. Performance was worse for the visual than for the auditory modality on the longer messages. Repetition of the instructions generally depressed performance, especially with the longer messages, which required more output than did the shorter messages, and especially with the visual modality, in which phonological recoding from the visual input to the spoken output was necessary. These results are explained in terms of the degrading effects of output interference on memory for instructions.

  9. Comprehension of Navigation Directions

    NASA Technical Reports Server (NTRS)

    Healy, Alice F.; Schneider, Vivian I.

    2002-01-01

    Subjects were shown navigation instructions varying in length directing them to move in a space represented by grids on a computer screen. They followed the instructions by clicking on the grids in the locations specified. Some subjects repeated back the instructions before following them, some did not, and others repeated back the instructions in reduced form, including only the critical words. The commands in each message were presented simultaneously for half of the subjects and sequentially for the others. For the longest messages, performance was better on the initial commands and worse on the final commands with simultaneous than with sequential presentation. Instruction repetition depressed performance, but reduced repetition removed this disadvantage. Effects of presentation format were attributed to visual scanning strategies. The advantage for reduced repetition was attributable either to enhanced visual scanning or to reduced output interference. A follow-up study with auditory presentation supported the visual scanning explanation.

  10. Using Option Grids: steps toward shared decision-making for neonatal circumcision.

    PubMed

    Fay, Mary; Grande, Stuart W; Donnelly, Kyla; Elwyn, Glyn

    2016-02-01

    To assess the impact, acceptability and feasibility of a short encounter tool designed to enhance the process of shared decision-making and parental engagement. We analyzed video-recordings of clinical encounters, half undertaken before and half after a brief intervention that trained four clinicians how to use Option Grids, using an observer-based measure of shared decision-making. We also analyzed semi-structured interviews conducted with the clinicians four weeks after their exposure to the intervention. Observer OPTION(5) scores were higher post-intervention, with a mean of 33.9 (SD=23.5) compared to a mean of 16.1 (SD=7.1) pre-intervention, a significant difference of 17.8 (95% CI: 2.4, 33.2). Prior to using the intervention, clinicians used a consent document to frame circumcision as a default practice. Encounters with the Option Grid conferred agency to both parents and clinicians, and facilitated shared decision-making. Clinicians reported recognizing the tool's positive effect on their communication process. Tools such as Option Grids have the potential to make it easier for clinicians to achieve shared decision-making. Encounter tools have the potential to change practice. More research is needed to test their feasibility in routine practice. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. MEVA--An Interactive Visualization Application for Validation of Multifaceted Meteorological Data with Multiple 3D Devices.

    PubMed

    Helbig, Carolin; Bilke, Lars; Bauer, Hans-Stefan; Böttinger, Michael; Kolditz, Olaf

    2015-01-01

    To achieve more realistic simulations, meteorologists develop and use models with increasing spatial and temporal resolution. The analyzing, comparing, and visualizing of resulting simulations becomes more and more challenging due to the growing amounts and multifaceted character of the data. Various data sources, numerous variables and multiple simulations lead to a complex database. Although a variety of software exists suited for the visualization of meteorological data, none of them fulfills all of the typical domain-specific requirements: support for quasi-standard data formats and different grid types, standard visualization techniques for scalar and vector data, visualization of the context (e.g., topography) and other static data, support for multiple presentation devices used in modern sciences (e.g., virtual reality), a user-friendly interface, and suitability for cooperative work. Instead of attempting to develop yet another new visualization system to fulfill all possible needs in this application domain, our approach is to provide a flexible workflow that combines different existing state-of-the-art visualization software components in order to hide the complexity of 3D data visualization tools from the end user. To complete the workflow and to enable the domain scientists to interactively visualize their data without advanced skills in 3D visualization systems, we developed a lightweight custom visualization application (MEVA - multifaceted environmental data visualization application) that supports the most relevant visualization and interaction techniques and can be easily deployed. Specifically, our workflow combines a variety of different data abstraction methods provided by a state-of-the-art 3D visualization application with the interaction and presentation features of a computer-games engine. Our customized application includes solutions for the analysis of multirun data, specifically with respect to data uncertainty and differences between simulation runs. In an iterative development process, our easy-to-use application was developed in close cooperation with meteorologists and visualization experts. The usability of the application has been validated with user tests. We report on how this application supports the users to prove and disprove existing hypotheses and discover new insights. In addition, the application has been used at public events to communicate research results.

  12. MEVA - An Interactive Visualization Application for Validation of Multifaceted Meteorological Data with Multiple 3D Devices

    PubMed Central

    Helbig, Carolin; Bilke, Lars; Bauer, Hans-Stefan; Böttinger, Michael; Kolditz, Olaf

    2015-01-01

    Background To achieve more realistic simulations, meteorologists develop and use models with increasing spatial and temporal resolution. The analyzing, comparing, and visualizing of resulting simulations becomes more and more challenging due to the growing amounts and multifaceted character of the data. Various data sources, numerous variables and multiple simulations lead to a complex database. Although a variety of software exists suited for the visualization of meteorological data, none of them fulfills all of the typical domain-specific requirements: support for quasi-standard data formats and different grid types, standard visualization techniques for scalar and vector data, visualization of the context (e.g., topography) and other static data, support for multiple presentation devices used in modern sciences (e.g., virtual reality), a user-friendly interface, and suitability for cooperative work. Methods and Results Instead of attempting to develop yet another new visualization system to fulfill all possible needs in this application domain, our approach is to provide a flexible workflow that combines different existing state-of-the-art visualization software components in order to hide the complexity of 3D data visualization tools from the end user. To complete the workflow and to enable the domain scientists to interactively visualize their data without advanced skills in 3D visualization systems, we developed a lightweight custom visualization application (MEVA - multifaceted environmental data visualization application) that supports the most relevant visualization and interaction techniques and can be easily deployed. Specifically, our workflow combines a variety of different data abstraction methods provided by a state-of-the-art 3D visualization application with the interaction and presentation features of a computer-games engine. Our customized application includes solutions for the analysis of multirun data, specifically with respect to data uncertainty and differences between simulation runs. In an iterative development process, our easy-to-use application was developed in close cooperation with meteorologists and visualization experts. The usability of the application has been validated with user tests. We report on how this application supports the users to prove and disprove existing hypotheses and discover new insights. In addition, the application has been used at public events to communicate research results. PMID:25915061

  13. The International Solid Earth Research Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Fox, G.; Pierce, M.; Rundle, J.; Donnellan, A.; Parker, J.; Granat, R.; Lyzenga, G.; McLeod, D.; Grant, L.

    2004-12-01

    We describe the architecture and initial implementation of the International Solid Earth Research Virtual Observatory (iSERVO). This has been prototyped within the USA as SERVOGrid, and expansion is planned to Australia, China, Japan and other countries. We base our design on a globally scalable distributed "cyber-infrastructure" or Grid built around a Web Services-based approach consistent with the extended Web Service Interoperability approach. The Solid Earth Science Working Group of NASA has identified several challenges for Earth Science research. In order to investigate these, we need to couple numerical simulation codes and data mining tools to observational data sets. These observational data are now available online in internet-accessible forms, and the quantity of this data is expected to grow explosively over the next decade. We architect iSERVO as a loosely federated Grid of Grids, with each country involved supporting a national Solid Earth Research Grid. The national Grid operations, possibly with dedicated control centers, are linked together to support iSERVO, where an international Grid control center may eventually be necessary. We address the difficult multi-administrative-domain security and ownership issues by exposing capabilities as services for which the risk of abuse is minimized. We support large-scale simulations within a single domain using service-hosted tools (mesh generation, data repository and sensor access, GIS, visualization). Simulations typically involve sequential or parallel machines in a single domain supported by cross-continent services. We use Web Services to implement a Service Oriented Architecture (SOA), using WSDL for service description and SOAP for message formats. These are augmented by UDDI, WS-Security, WS-Notification/Eventing and WS-ReliableMessaging in the WS-I+ approach. Support for the latter two capabilities will be available over the next 6 months from the NaradaBrokering messaging system. We augment these specifications with the powerful portlet architecture using WSRP and JSR168, supported by such portal containers as uPortal, WebSphere, and Apache JetSpeed2. The latter portal aggregates component user interfaces for each iSERVO service, allowing flexible customization of the user interface. We exploit the portlets produced by the NSF NMI (Middleware Initiative) OGCE activity. iSERVO also uses specifications from the Open Geographical Information Systems (GIS) Consortium (OGC), which defines a number of standards for modeling earth surface feature data and services for interacting with these data. The data models are expressed in the XML-based Geography Markup Language (GML), and the OGC service framework is being adapted to use the Web Service model. The SERVO prototype includes a GIS Grid that currently includes the core WMS and WFS (Map and Feature) services. We will follow best practice in the Grid and Web Service field and will adapt our technology as appropriate. For example, we expect to support services built on WS-RF when it is finalized, and to make use of the database interfaces OGSA-DAI and its WS-I+ versions. Finally, we review advances in Web Service scripting (such as HPSearch) and workflow systems (such as GCF) and their applications to iSERVO.
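
    As a hedged illustration of the SOAP message format such services rely on, the fragment below posts a minimal SOAP 1.1 envelope with Python's requests library; the endpoint URL, XML namespace and operation name (getFaultData) are hypothetical placeholders, not actual SERVOGrid interfaces:

      # Hedged sketch of invoking a SOAP-style grid service. Endpoint,
      # namespace and operation name are hypothetical placeholders.
      import requests

      envelope = """<?xml version="1.0" encoding="UTF-8"?>
      <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
        <soap:Body>
          <getFaultData xmlns="http://example.org/servo">
            <region>California</region>
            <startYear>1990</startYear>
          </getFaultData>
        </soap:Body>
      </soap:Envelope>"""

      response = requests.post(
          "http://example.org/servo/FaultService",   # hypothetical endpoint
          data=envelope.encode("utf-8"),
          headers={"Content-Type": "text/xml; charset=utf-8",
                   "SOAPAction": "getFaultData"},
      )
      print(response.status_code)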

  14. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
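
    The kind of rapid query of gridded climate data described above can be sketched with xarray (the successor to the "xray" package named in the abstract); the file name, variable name and coordinate bounds below are hypothetical:

      # Minimal sketch of a gridded-climate-data query of the sort the
      # Ice2Ocean backend performs. File and variable names are invented.
      import xarray as xr

      ds = xr.open_dataset("gulf_of_alaska_temps.nc")   # hypothetical NetCDF file

      # Subset a Gulf of Alaska bounding box and a melt-season window,
      # then reduce to a daily regional mean for the hydrological model.
      subset = ds["air_temperature"].sel(
          lat=slice(55, 62), lon=slice(-155, -130),
          time=slice("2014-05-01", "2014-09-30"),
      )
      daily_mean = subset.mean(dim=("lat", "lon"))
      daily_mean.to_dataframe().to_csv("regional_forcing.csv")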

  15. Specialized CFD Grid Generation Methods for Near-Field Sonic Boom Prediction

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Campbell, Richard L.; Elmiligui, Alaa; Cliff, Susan E.; Nayani, Sudheer N.

    2014-01-01

    Ongoing interest in analysis and design of low sonic boom supersonic transports requires accurate and efficient Computational Fluid Dynamics (CFD) tools. Specialized grid generation techniques are employed to predict near-field acoustic signatures of these configurations. A fundamental examination of grid properties is performed, including grid alignment with flow characteristics and element type. The issues affecting the robustness of cylindrical surface extrusion are illustrated. This study will compare three methods in the extrusion family of grid generation methods that produce grids aligned with the freestream Mach angle. These methods are applied to configurations from the First AIAA Sonic Boom Prediction Workshop.

  16. AstroGrid: the UK's Virtual Observatory Initiative

    NASA Astrophysics Data System (ADS)

    Mann, Robert G.; Astrogrid Consortium; Lawrence, Andy; Davenhall, Clive; Mann, Bob; McMahon, Richard; Irwin, Mike; Walton, Nic; Rixon, Guy; Watson, Mike; Osborne, Julian; Page, Clive; Allan, Peter; Giaretta, David; Perry, Chris; Pike, Dave; Sherman, John; Murtagh, Fionn; Harra, Louise; Bentley, Bob; Mason, Keith; Garrington, Simon

    AstroGrid is the UK's Virtual Observatory (VO) initiative. It brings together the principal astronomical data centres in the UK, and has been funded to the tune of ~£5M over the next three years, via PPARC, as part of the UK e-science programme. Its twin goals are the provision of the infrastructure and tools for the federation and exploitation of large astronomical (X-ray to radio), solar and space plasma physics datasets, and the delivery of federations of current datasets for its user communities to exploit using those tools. Whilst AstroGrid's work will be centred on existing and future (e.g. VISTA) UK datasets, it will seek solutions to generic VO problems and will contribute to the developing international virtual observatory framework: AstroGrid is a member of the EU-funded Astrophysical Virtual Observatory project, has close links to a second EU Grid initiative, the European Grid of Solar Observations (EGSO), and will seek an active role in the development of the common standards on which the international virtual observatory will rely. In this paper we shall primarily describe the concrete plans for AstroGrid's one-year Phase A study, which will centre on: (i) the definition of detailed science requirements through community consultation; (ii) the undertaking of a "functionality market survey" to test the utility of existing technologies for the VO; and (iii) a pilot programme of database federations, each addressing different aspects of the general database federation problem. Further information can be found on the AstroGrid website.

  17. GENIE(++): A Multi-Block Structured Grid System

    NASA Technical Reports Server (NTRS)

    Williams, Tonya; Nadenthiran, Naren; Thornburg, Hugh; Soni, Bharat K.

    1996-01-01

    The computer code GENIE++ is a continuously evolving grid system containing a multitude of proven geometry/grid techniques. The generation process in GENIE++ is based on an earlier version of the code. The process uses several techniques either separately or in combination to quickly and economically generate sculptured geometry descriptions and grids for arbitrary geometries. The computational mesh is formed by using an appropriate algebraic method. Grid clustering is accomplished with either exponential or hyperbolic tangent routines which allow the user to specify a desired point distribution. Grid smoothing can be accomplished by using an elliptic solver with proper forcing functions. B-spline and Non-Uniform Rational B-spline (NURBS) algorithms are used for surface definition and redistribution. The built-in sculptured geometry definition with desired distribution of points, automatic Bezier curve/surface generation for interior boundaries/surfaces, and surface redistribution is based on NURBS. Weighted Lagrange/Hermite transfinite interpolation methods, interactive geometry/grid manipulation modules, and on-line graphical visualization of the generation process are salient features of this system which result in a significant time savings for a given geometry/grid application.
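
    Hyperbolic-tangent point clustering of the sort GENIE++'s routines provide can be sketched in a few lines; the one-sided stretching function below is a standard formulation and is only assumed to resemble, not reproduce, the GENIE++ implementation:

      # Sketch of one-sided hyperbolic-tangent grid clustering: points are
      # packed toward one boundary (e.g., a wall) with a user-chosen
      # stretching parameter beta. A generic formulation, not GENIE++ code.
      import numpy as np

      def tanh_cluster(n, beta=3.0):
          """Return n points in [0, 1] clustered toward 0; larger beta = tighter."""
          eta = np.linspace(0.0, 1.0, n)      # uniform computational coordinate
          return 1.0 + np.tanh(beta * (eta - 1.0)) / np.tanh(beta)

      pts = tanh_cluster(11, beta=4.0)
      print(np.round(pts, 4))                 # fine spacing near 0, coarse near 1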

  18. Quadtree of TIN: a new algorithm of dynamic LOD

    NASA Astrophysics Data System (ADS)

    Zhang, Junfeng; Fei, Lifan; Chen, Zhen

    2009-10-01

    Currently, real-time visualization of large-scale digital elevation models mainly employs regular GRID structures based on quadtrees and triangle simplification methods based on triangulated irregular networks (TIN). Compared with GRID, TIN is a more refined means of representing the terrain surface in computing. However, the data structure of the TIN model is complex, and it is difficult to realize view-dependent level-of-detail (LOD) representation quickly. GRID is a simple method for realizing the LOD of terrain, but it requires a higher triangle count. A new algorithm, which takes full advantage of the merits of the two methods, is presented in this paper. This algorithm combines TIN with a quadtree structure to realize view-dependent LOD control over irregular sampling point sets, and it controls the level of detail through the distance to the viewpoint and the geometric error of the terrain. Experiments indicate that this approach can generate an efficient quadtree triangulation hierarchy over any irregular sampling point set and achieve dynamic, visual multi-resolution performance of large-scale terrain in real time.
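
    A toy sketch of the underlying idea, refining a quadtree node only while its geometric error scaled by viewpoint distance exceeds a screen-space tolerance, might look as follows (error values and thresholds are synthetic placeholders, not the paper's algorithm):

      # Toy quadtree LOD selection: a tile is accepted when its error,
      # divided by distance to the viewpoint, falls below a tolerance.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class QuadNode:
          x: float
          y: float
          size: float
          error: float                      # max vertical deviation of this tile
          children: List['QuadNode'] = field(default_factory=list)

          def split(self):
              h = self.size / 2
              for dx in (0, h):
                  for dy in (0, h):
                      self.children.append(
                          QuadNode(self.x + dx, self.y + dy, h, self.error / 2))

      def select_lod(node, viewpoint, tau=0.01):
          """Collect tiles whose projected error is below tolerance tau."""
          cx, cy = node.x + node.size / 2, node.y + node.size / 2
          dist = max(1e-6, ((cx - viewpoint[0])**2 + (cy - viewpoint[1])**2) ** 0.5)
          if node.error / dist <= tau or node.size < 1.0:
              return [node]                 # coarse tile is good enough here
          if not node.children:
              node.split()
          tiles = []
          for c in node.children:
              tiles += select_lod(c, viewpoint, tau)
          return tiles

      root = QuadNode(0.0, 0.0, 1024.0, error=50.0)
      print(len(select_lod(root, viewpoint=(0.0, 0.0))))   # finer near the viewer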

  19. Exploring multivariate representations of indices along linear geographic features

    NASA Astrophysics Data System (ADS)

    Bleisch, Susanne; Hollenstein, Daria

    2018-05-01

    A study of the walkability of a Swiss town required finding suitable representations of multivariate geographical data. The goal was to represent multiple indices of walkability concurrently and to visualize the data along the street network it relates to. Different indices of pedestrian friendliness were assessed for short street sections and then mapped to an overlaid grid. Basic and composite glyphs were designed using square- or triangle-areas to display one to four index values concurrently within the grid structure. Color was used to indicate different indices. Implementing visualizations for different combinations of index sets, we find that single values can be emphasized or de-emphasized by selecting the color scheme accordingly, and that different color selections allow perceiving either single values or overall trends over the evaluated area. Values for up to four indices can be displayed in combination within the resulting geovisualizations, and the underlying gridded road network references the data to its real-world locations.
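
    A minimal sketch of such a composite square-area glyph, with four hypothetical index values per grid cell drawn as quarter-squares whose area scales with the value, could look like this (colors and data are invented for illustration):

      # Composite square-area glyphs: one quarter-square per index, with
      # area proportional to the index value. Data are synthetic.
      import numpy as np
      import matplotlib.pyplot as plt

      np.random.seed(0)
      indices = np.random.rand(5, 5, 4)        # 5x5 grid, 4 indices in [0, 1]
      colors = ['#1b9e77', '#d95f02', '#7570b3', '#e7298a']
      offsets = [(0, 0), (0.5, 0), (0, 0.5), (0.5, 0.5)]   # quadrant origins

      fig, ax = plt.subplots()
      for i in range(5):
          for j in range(5):
              for k, (ox, oy) in enumerate(offsets):
                  side = 0.5 * np.sqrt(indices[i, j, k])   # area ~ index value
                  ax.add_patch(plt.Rectangle((j + ox, i + oy), side, side,
                                             color=colors[k]))
      ax.set_xlim(0, 5); ax.set_ylim(0, 5); ax.set_aspect('equal')
      plt.show()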

  20. Experimental Study of Two Phase Flow Behavior Past BWR Spacer Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ratnayake, Ruwan K.; Hochreiter, L.E.; Ivanov, K.N.

    2002-07-01

    Performance of best estimate codes used in the nuclear industry can be significantly improved by reducing the empiricism embedded in their constitutive models. Spacer grids have been found to have an important impact on the maximum allowable Critical Heat Flux within the fuel assembly of a nuclear reactor core. Therefore, incorporation of suitable spacer grid models can improve the critical heat flux prediction capability of best estimate codes. Realistic modeling of entrainment behavior of spacer grids requires understanding the different mechanisms that are involved. Since visual information pertaining to the entrainment behavior of spacer grids cannot possibly be obtained from operating nuclear reactors, experiments have to be designed and conducted for this specific purpose. Most of the spacer grid experiments available in the literature have been designed in view of obtaining quantitative data for the purpose of developing or modifying empirical formulations for heat transfer, critical heat flux or pressure drop. Very few experiments have been designed to provide fundamental information which can be used to understand spacer grid effects and phenomena involved in two-phase flow. Air-water experiments were conducted to obtain visual information on the two-phase flow behavior both upstream and downstream of Boiling Water Reactor (BWR) spacer grids. The test section was designed and constructed using prototypic dimensions such as the channel cross-section, rod diameter and other spacer grid configurations of a typical BWR fuel assembly. The test section models the flow behavior in two adjacent subchannels in the BWR core. A portion of a prototypic BWR spacer grid accounting for two adjacent channels was used with industrial mild steel rods for the purpose of representing the channel internals. Symmetry was preserved in this practice, so that the channel walls could effectively be considered as the channel boundaries. Thin films were established on the rod surfaces by injecting water through a set of perforations at the bottom ends of the rods, ensuring that the flow upstream of the bottom-most spacer grid is predominantly annular. The flow conditions were regulated such that they represent typical BWR operating conditions. Photographs taken during experiments show that the film entrainment increases significantly at the spacer grids, since the points of contact between the rods and the grids result in a peeling off of large portions of the liquid film from the rod surfaces. Decreasing the water flow resulted in eventual drying out, beginning at positions immediately upstream of the spacer grids. (authors)

  1. Connecting Swath Satellite Data With Imagery in Mapping Applications

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Hall, J. R.; Penteado, P. F.; Roberts, J. T.; Zhou, A. Y.

    2016-12-01

    Visualizations of gridded science data products (referred to as Level 3 or Level 4) typically provide a straightforward correlation between image pixels and the source science data. This direct relationship allows users to make initial inferences based on imagery values, facilitating additional operations on the underlying data values, such as data subsetting and analysis. However, that same pixel-to-data relationship for ungridded science data products (referred to as Level 2) is significantly more challenging. These products, also referred to as "swath products", are in orbital "instrument space" and raster visualization pixels do not directly correlate to science data values. Interpolation algorithms are often employed during the gridding or projection of a science dataset prior to image generation, introducing intermediary values that separate the image from the source data values. NASA's Global Imagery Browse Services (GIBS) is researching techniques for efficiently serving "image-ready" data allowing client-side dynamic visualization and analysis capabilities. This presentation will cover some GIBS prototyping work designed to maintain connectivity between Level 2 swath data and its corresponding raster visualizations. Specifically, we discuss the DAta-to-Image-SYstem (DAISY), an indexing approach for Level 2 swath data, and the mechanisms whereby a client may dynamically visualize the data in raster form.
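
    Although DAISY's internals are not detailed in the abstract, the general indexing idea, binning each ungridded Level 2 sample into a lat/lon cell so a raster pixel can be traced back to its source swath values, can be sketched as follows (all coordinates and values are invented):

      # Sketch of a swath-to-grid index: each Level 2 sample is binned to a
      # lat/lon cell, so an image pixel resolves back to its source data.
      # This illustrates the general concept only, not DAISY itself.
      import numpy as np
      from collections import defaultdict

      lat = np.array([10.02, 10.04, 10.31, 10.33])   # swath sample latitudes
      lon = np.array([20.01, 20.28, 20.05, 20.29])   # swath sample longitudes
      val = np.array([281.5, 282.1, 280.9, 283.0])   # e.g., brightness temps

      CELL = 0.25                                    # grid resolution in degrees
      index = defaultdict(list)
      for i in range(len(val)):
          key = (int(lat[i] // CELL), int(lon[i] // CELL))
          index[key].append(i)                       # cell -> swath sample ids

      # A raster pixel covering one cell resolves back to its swath data:
      cell = (int(10.02 // CELL), int(20.01 // CELL))
      print([val[i] for i in index[cell]])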

  2. Grid Data and Tools | Grid Modernization | NREL

    Science.gov Websites

    technologies and strategies, including renewable resource data sets and models of the electric power system. Renewable Resource Data: a library of resource information to inform the design of efficient, integrated

  3. Randomized trial evaluating short-term effects of intravitreal ranibizumab or triamcinolone acetonide on macular edema after focal/grid laser for diabetic macular edema in eyes also receiving panretinal photocoagulation.

    PubMed

    Googe, Joseph; Brucker, Alexander J; Bressler, Neil M; Qin, Haijing; Aiello, Lloyd P; Antoszyk, Andrew; Beck, Roy W; Bressler, Susan B; Ferris, Frederick L; Glassman, Adam R; Marcus, Dennis; Stockdale, Cynthia R

    2011-06-01

    To evaluate 14-week effects of intravitreal ranibizumab or triamcinolone in eyes receiving focal/grid laser for diabetic macular edema and panretinal photocoagulation. Three hundred and forty-five eyes with a visual acuity of 20/320 or better, center-involved diabetic macular edema receiving focal/grid laser, and diabetic retinopathy receiving prompt panretinal photocoagulation were randomly assigned to sham (n = 123), 0.5-mg ranibizumab (n = 113) at baseline and 4 weeks, and 4-mg triamcinolone at baseline and sham at 4 weeks (n = 109). Treatment was at investigator discretion from 14 weeks to 56 weeks. Mean changes (±SD) in visual acuity letter score from baseline were significantly better in the ranibizumab (+1 ± 11; P < 0.001) and triamcinolone (+2 ± 11; P < 0.001) groups compared with those in the sham group (-4 ± 14) at the 14-week visit, mirroring retinal thickening results. These differences were not maintained when study participants were followed for 56 weeks for safety outcomes. One eye (0.9%; 95% confidence interval, 0.02%-4.7%) developed endophthalmitis after receiving ranibizumab. Cerebrovascular/cardiovascular events occurred in 4%, 7%, and 3% of the sham, ranibizumab, and triamcinolone groups, respectively. The addition of 1 intravitreal triamcinolone injection or 2 intravitreal ranibizumab injections in eyes receiving focal/grid laser for diabetic macular edema and panretinal photocoagulation is associated with better visual acuity and decreased macular edema by 14 weeks. Whether continued long-term intravitreal treatment is beneficial cannot be determined from this study.

  4. Influences of the inner retinal sublayers and analytical areas in macular scans by spectral-domain OCT on the diagnostic ability of early glaucoma.

    PubMed

    Nakatani, Yusuke; Higashide, Tomomi; Ohkubo, Shinji; Sugiyama, Kazuhisa

    2014-10-23

    We investigated the influences of the inner retinal sublayers and analytical areas in macular scans by spectral-domain optical coherence tomography (OCT) on the diagnostic ability of early glaucoma. A total of 64 early (including 24 preperimetric) glaucomatous and 40 normal eyes underwent macular and peripapillary retinal nerve fiber layer (pRNFL) scans (3D-OCT-2000). The area under the receiver operating characteristic curve (AUC) for glaucoma diagnosis was determined from the average thickness of the total 100 grids (6 × 6 mm), central 44 grids (3.6 × 4.8 mm), and peripheral 56 grids (outside of the 44 grids), and for each macular sublayer: macular RNFL (mRNFL), ganglion cell layer plus inner plexiform layer (GCL/IPL), and mRNFL plus GCL/IPL (ganglion cell complex [GCC]). Correlation of OCT parameters with visual field parameters was evaluated by Spearman's rank correlation coefficients (rs). The GCC-related parameters had a significantly larger AUC (0.82-0.97) than the GCL/IPL (0.81-0.91), mRNFL-related parameters (0.72-0.94), or average pRNFL (0.88) in more than half of all comparisons. The central 44 grids had a significantly lower AUC than the other analytical areas for GCC and mRNFL thickness. Conversely, the peripheral 56 grids had a significantly lower AUC than the 100 grids for GCL/IPL inferior thickness. Inferior thickness of the GCC (rs, 0.45-0.49) and mRNFL (rs, 0.43-0.51) showed correlations with central visual field parameters comparable to those of average pRNFL thickness (rs, 0.41, 0.47), even in the central 44 grids. The diagnostic ability of macular OCT parameters for early glaucoma differed by inner retinal sublayer and also by the analytical area studied. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
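
    As a hedged, self-contained illustration of the AUC computation underlying such comparisons (the thickness values and labels below are synthetic, not study data), one could write:

      # Minimal AUC sketch for a thickness-based diagnostic parameter.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      # 1 = glaucoma, 0 = normal; thinner GCC generally indicates glaucoma,
      # so score with the *negative* thickness to orient the ROC correctly.
      labels = np.array([1, 1, 1, 1, 0, 0, 0, 0])
      gcc_thickness = np.array([78, 82, 85, 96, 98, 101, 95, 104])  # micrometers

      auc = roc_auc_score(labels, -gcc_thickness)
      print(f"AUC for average GCC thickness: {auc:.2f}")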

  5. Macular grid laser photocoagulation for branch retinal vein occlusion.

    PubMed

    Lam, Fook Chang; Chia, Seen N; Lee, Richard M H

    2015-05-11

    Branch retinal vein occlusion (BRVO) is the second most common cause of retinal vascular abnormality after diabetic retinopathy. Persistent macular oedema develops in 60% of eyes with a BRVO. Untreated, only 14% of eyes with chronic macular oedema will have a visual acuity (VA) of 20/40 or better. Macular grid laser photocoagulation is used for chronic non-ischaemic macular oedema following BRVO and has been the mainstay of treatment for over 20 years. New treatments are available and a systematic review is necessary to ensure that the most up-to-date evidence is considered objectively. To examine the effects of macular grid laser photocoagulation in the treatment of macular oedema following BRVO. We searched CENTRAL, Ovid MEDLINE, EMBASE, Web of Science Conference Proceedings Citation Index, the metaRegister of Controlled Trials (mRCT), ClinicalTrials.gov and the WHO International Clinical Trials Registry Platform. We did not use any date or language restrictions in the electronic searches for trials. We last searched the electronic databases on 21 August 2014. We included randomised controlled trials (RCTs) comparing macular grid laser photocoagulation treatment to another treatment, sham treatment or no treatment. We used standard methodological procedures expected by Cochrane. We included five studies conducted in Europe and North America. Four separate trials compared grid laser to no treatment, sham treatment, intravitreal bevacizumab and intravitreal triamcinolone. One further trial compared subthreshold to threshold laser. Two of these trials were judged to be at high risk of bias in one or more domains. In one trial of grid laser versus observation, people receiving grid laser were more likely to gain visual acuity (VA) (10 or more ETDRS letters) at 36 months (RR 1.75, 95% confidence interval (CI) 1.08 to 2.84, 78 participants, moderate-quality evidence). The effect of grid laser on loss of VA (10 or more letters) was uncertain as the results were imprecise (RR 0.68, 95% CI 0.23 to 2.04, 78 participants, moderate-quality evidence). On average, people receiving grid laser had better improvement in VA (mean difference (MD) 0.11 logMAR, 95% CI 0.05 to 0.17, high-quality evidence). In a trial of early and delayed grid laser treatment versus sham laser (n = 108, data available for 99 participants), no participant gained or lost VA (15 or more ETDRS letters). At 12 months, there was no evidence for a difference in change in VA (from baseline) between early grid laser and sham laser (MD -0.03 logMAR, 95% confidence interval (CI) -0.07 to 0.01, 68 participants, low-quality evidence) or between delayed grid laser and sham laser (MD 0.00, 95% CI -0.04 to 0.04, 66 participants, low-quality evidence). The relative effects of subthreshold and threshold laser were uncertain. In one trial, the RR for gain of VA (15 or more letters) at 12 months was 1.68 (95% CI 0.57 to 4.95, 36 participants, moderate-quality evidence); the RR for loss of VA (15 or more letters) was 0.56 (95% CI 0.06 to 5.63, moderate-quality evidence); and at 24 months the change in VA from baseline was MD 0.07 (95% CI -0.10 to 0.24, moderate-quality evidence). The relative effects of macular grid laser and intravitreal bevacizumab were uncertain. In one trial, the RR for gain of 15 or more letters at 12 months was 0.67 (95% CI 0.39 to 1.14, 30 participants, low-quality evidence). Loss of 15 or more letters was not reported. Change in VA at 12 months was MD 0.11 logMAR (95% CI -0.36 to 0.14, low-quality evidence). The relative effects of grid laser and 1 mg triamcinolone were uncertain at 12 months. RR for gain of VA (15 or more letters) was 1.13 (95% CI 0.75 to 1.71, 1 RCT, 242 participants, moderate-quality evidence); RR for loss of VA (15 or more letters) was 1.20 (95% CI 0.63 to 2.27, moderate-quality evidence); MD for change in VA was -0.03 letters (95% CI -0.12 to 0.06, moderate-quality evidence). Similar results were seen for the comparison with 4 mg triamcinolone. Beyond 12 months, the visual outcomes were in favour of grid laser at 24 months and 36 months, with people in the macular grid group gaining more VA. Four studies reported on adverse effects. Laser photocoagulation appeared to be well tolerated in the studies. One participant (out of 71) suffered a perforation of Bruch's membrane, but this did not affect visual acuity. Moderate-quality evidence from one RCT supports the use of grid laser photocoagulation to treat macular oedema following BRVO. There was insufficient evidence to support the use of early grid laser or subthreshold laser. There was insufficient evidence to show a benefit of intravitreal triamcinolone or anti-vascular endothelial growth factor (VEGF) over macular grid laser photocoagulation in BRVO. With recent interest in the use of intravitreal anti-VEGF or steroid therapy, assessment of treatment efficacy (change in visual acuity and foveal or central macular thickness using optical coherence tomography (OCT)) and the number of treatments needed for maintenance and long-term safety will be important for future studies.

  6. Synthetic perspective optical flow: Influence on pilot control tasks

    NASA Technical Reports Server (NTRS)

    Bennett, C. Thomas; Johnson, Walter W.; Perrone, John A.; Phatak, Anil V.

    1989-01-01

    One approach used to better understand the impact of visual flow on control tasks has been to use synthetic perspective flow patterns. Such patterns are the result of apparent motion across a grid or random dot display. Unfortunately, the optical flow so generated is based on a subset of the flow information that exists in the real world. The danger is that the resulting optical motions may not generate the visual flow patterns useful for actual flight control. Researchers conducted a series of studies directed at understanding the characteristics of synthetic perspective flow that support various pilot tasks. In the first of these, they examined the control of altitude over various perspective grid textures (Johnson et al., 1987). Another set of studies was directed at studying the head tracking of targets moving in a 3-D coordinate system. These studies, parametric in nature, utilized both impoverished and complex virtual worlds represented by simple perspective grids at one extreme, and computer-generated terrain at the other. These studies are part of an applied visual research program directed at understanding the design principles required for the development of instruments displaying spatial orientation information. The experiments also highlight the need for modeling the impact of spatial displays on pilot control tasks.

  7. Comparison of grid laser, intravitreal triamcinolone, and intravitreal bevacizumab in the treatment of diffuse diabetic macular edema.

    PubMed

    Sobaci, Güngör; Ozge, Gökhan; Erdurman, Cüneyt; Durukan, Hakan A; Bayraktar, Zeki M

    2012-01-01

    To compare the effects of grid laser (GL), intravitreal bevacizumab (IVB), and intravitreal triamcinolone acetonide (IVTA) in diffuse diabetic macular edema (DDME). One hundred and twenty-six patients (126 eyes) treated with GL (modified grid), IVTA (4 mg), and IVB (1.25 mg) injections, matched for best corrected visual acuity (BCVA) and OCT-based central macular thickness at presentation, were enrolled. The primary outcome measure was change in best corrected logMAR visual acuity at 1-year follow-up. Rates of visual stabilization (within ±0.2 logMAR of baseline BCVA) (71.4, 83.3, and 78.6%, respectively) were not different between the groups (p = 0.41) at 12-month follow-up. Higher rates of anatomical and functional success, however, were evident in the IVB and IVTA groups within 6 months of treatment (p < 0.05 for both). No severe adverse effects were observed, except elevated intraocular pressure (10 mm Hg above baseline) in one third (14 eyes) of the IVTA cases, of which 2 eyes (4.8%) required trabeculectomy. Intraocular injections may give favorable results within the first 6 months; after 6 months, GL results seem to be more favorable in the treatment of treatment-naïve, acute, nonischemic, and center-involving DDME. Copyright © 2011 S. Karger AG, Basel.

  8. Grist : grid-based data mining for astronomy

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden

    2004-01-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  9. Work step indication with grid-pattern projection for demented senior people.

    PubMed

    Uranishi, Yuki; Yamamoto, Goshiro; Asghar, Zeeshan; Pulli, Petri; Kato, Hirokazu; Oshiro, Osamu

    2013-01-01

    This paper proposes a work step indication method for supporting daily work with a grid-pattern projection. To support an independent life for senior people with dementia, it is desirable that an instruction be visually easy to understand and uncomplicated. The proposed method uses a range image sensor and a camera in addition to a projector. The 3D geometry of a target scene is measured by the range image sensor, and the grid pattern is projected directly onto the scene. Direct projection of the work step is more easily associated with the target objects around the assisted person, and the grid pattern provides a way to indicate spatial instructions. A prototype has been implemented and has demonstrated that the proposed grid-pattern projection shows work steps clearly.

  10. Grist: Grid-based Data Mining for Astronomy

    NASA Astrophysics Data System (ADS)

    Jacob, J. C.; Katz, D. S.; Miller, C. D.; Walia, H.; Williams, R. D.; Djorgovski, S. G.; Graham, M. J.; Mahabal, A. A.; Babu, G. J.; vanden Berk, D. E.; Nichol, R.

    2005-12-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the "hyperatlas" project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  11. Computational fluid dynamics for propulsion technology: Geometric grid visualization in CFD-based propulsion technology research

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.; Meyer, Doug

    1992-01-01

    The coordination of necessary resources, facilities, and special personnel to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology is examined. This involves the coordination of CFD activities between government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established for use in the analysis of CFD design methodologies.

  12. Evidence for Feature and Location Learning in Human Visual Perceptual Learning

    ERIC Educational Resources Information Center

    Moreno-Fernández, María Manuela; Salleh, Nurizzati Mohd; Prados, Jose

    2015-01-01

    In Experiment 1, human participants were pre-exposed to two similar checkerboard grids (AX and X) in alternation, and to a third grid (BX) in a separate block of trials. In a subsequent test, the unique feature A was better detected than the feature B when they were presented in the same location during the pre-exposure and test phases. However,…

  13. Colored floaters as a manifestation of digoxin toxicity.

    PubMed

    Shi, Lynn; Sun, Linus D; Odel, Jeffrey G

    2018-06-01

    Since its report in one patient more than 70 years ago, digitalis-induced colored muscae volitantes have not surfaced again in the literature. We report here a case of digoxin-induced colored floaters. An 89-year-old man on 0.25 mg digoxin daily developed visual hallucinations and colored floaters. He had had floaters in the past, but now they appeared in various colors including yellow, green, blue and red, though predominantly yellow. These "weirdly" shaped little particles wiggled around as if in a viscous solution and cast shadows in his vision. He also saw geometric shapes, spirals, and cross-hatch patterns of various colors that moved and undulated, especially on wallpaper. Ophthalmic examination revealed reduced visual acuity and poor color vision, especially in his left eye, along with central depression on Amsler grid and Humphrey visual field testing in his left eye. Discontinuation of digoxin resulted in complete resolution of his visual symptoms. On subsequent ophthalmic examination, the patient's visual acuity, field testing and color vision improved, and he had normal Amsler grid test results. Colored floaters may occur in patients taking cardiac glycosides, but this association has not been explored. Unlike optical illusions and visual hallucinations, floaters are entoptic phenomena casting a physical shadow upon the retina, and their coloring likely arises from retinal dysfunction. Colored floaters may be a more common visual phenomenon than realized.

  14. Integrating DICOM structure reporting (SR) into the medical imaging informatics data grid

    NASA Astrophysics Data System (ADS)

    Lee, Jasper; Le, Anh; Liu, Brent

    2008-03-01

    The Medical Imaging Informatics (MI2) Data Grid developed at the USC Image Processing and Informatics Laboratory enables medical images to be shared securely between multiple imaging centers. Current applications include an imaging-based clinical trial setting where multiple field sites perform image acquisition and a centralized radiology core performs image analysis, often using computer-aided diagnosis tools (CAD) that generate a DICOM-SR to report their findings and measurements. As more and more CAD tools are being developed in the radiology field, the generated DICOM Structure Reports (SR) holding key radiological findings and measurements that are not part of the DICOM image need to be integrated into the existing Medical Imaging Informatics Data Grid with the corresponding imaging studies. We will discuss the significance and method involved in adapting DICOM-SR into the Medical Imaging Informatics Data Grid. The result is a MI2 Data Grid repository from which users can send and receive DICOM-SR objects based on the imaging-based clinical trial application. The services required to extract and categorize information from the structured reports will be discussed, and the workflow to store and retrieve a DICOM-SR file into the existing MI2 Data Grid will be shown.

  15. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE PAGES

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  16. NAS Grid Benchmarks: A Tool for Grid Space Exploration

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We present an approach for benchmarking services provided by computational Grids. It is based on the NAS Parallel Benchmarks (NPB) and is called NAS Grid Benchmark (NGB) in this paper. We present NGB as a data flow graph encapsulating an instance of an NPB code in each graph node, which communicates with other nodes by sending/receiving initialization data. These nodes may be mapped to the same or different Grid machines. Like NPB, NGB will specify several different classes (problem sizes). NGB also specifies the generic Grid services sufficient for running the benchmark. The implementor has the freedom to choose any specific Grid environment. However, we describe a reference implementation in Java, and present some scenarios for using NGB.
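
    A toy sketch of the data-flow-graph idea, with each node wrapping a stand-in kernel and forwarding its output as the next node's initialization data, might look as follows (the kernels are placeholders, not actual NPB codes):

      # Toy data-flow graph: each node runs a kernel and passes its result
      # downstream as initialization data, mimicking the NGB structure.
      class GraphNode:
          def __init__(self, name, kernel):
              self.name, self.kernel = name, kernel
              self.successors = []

          def run(self, data):
              result = self.kernel(data)
              print(f"{self.name} done -> {result}")
              for node in self.successors:     # send init data downstream
                  node.run(result)

      # Three stand-in kernels chained like an NGB pipeline (e.g., BT -> SP -> LU).
      bt = GraphNode("BT", lambda d: d * 2)
      sp = GraphNode("SP", lambda d: d + 1)
      lu = GraphNode("LU", lambda d: d ** 2)
      bt.successors = [sp]
      sp.successors = [lu]
      bt.run(3)        # BT -> 6, SP -> 7, LU -> 49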

  17. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  18. Reliable Detection and Smart Deletion of Malassez Counting Chamber Grid in Microscopic White Light Images for Microbiological Applications.

    PubMed

    Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul

    2015-08-01

    In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.
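
    Although the authors' algorithm is not reproduced here, the general idea of detecting and erasing a counting-chamber grid can be sketched with OpenCV morphology: long, thin structuring elements respond to grid lines but not to roundish cells, and the detected lines are then inpainted away. A generic illustration (kernel lengths, threshold parameters and the inpainting radius are assumptions):

      import cv2
      import numpy as np

      img = cv2.imread("malassez.png", cv2.IMREAD_GRAYSCALE)
      binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                     cv2.THRESH_BINARY_INV, 31, 10)

      # Long thin kernels isolate horizontal and vertical grid lines.
      horiz = cv2.morphologyEx(binary, cv2.MORPH_OPEN,
                               cv2.getStructuringElement(cv2.MORPH_RECT, (45, 1)))
      vert = cv2.morphologyEx(binary, cv2.MORPH_OPEN,
                              cv2.getStructuringElement(cv2.MORPH_RECT, (1, 45)))
      grid_mask = cv2.dilate(cv2.bitwise_or(horiz, vert),
                             np.ones((3, 3), np.uint8))

      # Erase the grid without altering the cells by inpainting its pixels.
      cleaned = cv2.inpaint(img, grid_mask, 3, cv2.INPAINT_TELEA)
      cv2.imwrite("malassez_no_grid.png", cleaned)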

  19. The Climate-G Portal: a Grid Enabled Scientific Gateway for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

    2010-05-01

    Grid portals are web gateways that aim to conceal the underlying infrastructure through pervasive, transparent, user-friendly, ubiquitous and seamless access to heterogeneous and geographically spread resources (i.e. storage, computational facilities, services, sensors, networks, databases). Ultimately, they provide an enhanced problem-solving environment able to deal with modern, large-scale scientific and engineering problems. Scientific gateways can introduce a revolution in the way scientists and researchers organize and carry out their activities. Access to distributed resources, complex workflow capabilities, and community-oriented functionalities are just some of the features that such a web-based environment can provide. In the context of the EGEE NA4 Earth Science Cluster, Climate-G is a distributed testbed focusing on climate change research topics. The Euro-Mediterranean Center for Climate Change (CMCC) is actively participating in the testbed, providing the scientific gateway (Climate-G Portal) to access the entire infrastructure. The Climate-G Portal must face important and critical challenges and satisfy key requirements; the most relevant are presented and discussed in the following. Transparency: the portal has to provide transparent access to the underlying infrastructure, shielding users from low-level details and the complexity of a distributed grid environment. Security: users must be authenticated and authorized on the portal to access and exploit portal functionalities. A wide set of roles is needed to clearly assign the proper one to each user. Access to the computational grid must be completely secured, since the target infrastructure for running jobs is a production grid environment; a security infrastructure (based on X509v3 digital certificates) is strongly needed. Pervasiveness and ubiquity: access to the system must be pervasive and ubiquitous, which follows naturally from the web-based approach. Usability and simplicity: the portal has to provide simple, high-level and user-friendly interfaces to ease access to and exploitation of the entire system. Coexistence of general-purpose and domain-oriented services: along with general-purpose services (file transfer, job submission, etc.), the portal has to provide domain-based services and functionalities; subsetting of data, visualization of 2D maps around a virtual globe, and delivery of maps through OGC-compliant interfaces (i.e. Web Map Service - WMS) are just some examples. Since April 2009, about 70 users (85% from the climate change community) have been granted access to the portal. A key challenge of this work is the idea of providing users with an integrated working environment, that is, a place where scientists can find huge amounts of data, complete metadata support, a wide set of data access services, data visualization and analysis tools, easy access to the underlying grid infrastructure, and advanced monitoring interfaces.

  20. Simulation of Etching in Chlorine Discharges Using an Integrated Feature Evolution-Plasma Model

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    To better utilize its vast collection of heterogeneous resources that are geographically distributed across the United States, NASA is constructing a computational grid called the Information Power Grid (IPG). This paper describes various tools and techniques that we are developing to measure and improve the performance of a broad class of NASA applications when run on the IPG. In particular, we are investigating the areas of grid benchmarking, grid monitoring, user-level application scheduling, and decentralized system-level scheduling.

  1. Visual Perception of Touchdown Point During Simulated Landing

    ERIC Educational Resources Information Center

    Palmisano, Stephen; Gillam, Barbara

    2005-01-01

    Experiments examined the accuracy of visual touchdown point perception during oblique descents (1.5°-15°) toward a ground plane consisting of (a) randomly positioned dots, (b) a runway outline, or (c) a grid. Participants judged whether the perceived touchdown point was above or below a probe that appeared at a random position following each…

  2. Designing a Visual Factors-Based Screen Display Interface: The New Role of the Graphic Technologist.

    ERIC Educational Resources Information Center

    Faiola, Tony; DeBloois, Michael L.

    1988-01-01

    Discusses the role of the graphic technologist in preparing computer screen displays for interactive videodisc systems, and suggests screen design guidelines. Topics discussed include the grid system; typography; visual factors research; color; course mobility through branching and software menus; and a model of course integration. (22 references)…

  3. PREVIEW: Computer Assistance for Visual Management of Forested Landscapes

    Treesearch

    Erik Myklestad; J. Alan Wagar

    1976-01-01

    The PREVIEW computer program facilitates visual management of forested landscapes by generating perspective drawings that show proposed timber harvesting and regrowth throughout a rotation. Drawings show how changes would appear from selected viewing points and show landscapes as either a grid of distorted squares or by symbols representing trees, clearings, water,...

  4. Mash-up of techniques between data crawling/transfer, data preservation/stewardship and data processing/visualization technologies on a science cloud system designed for Earth and space science: a report of successful operation and science projects of the NICT Science Cloud

    NASA Astrophysics Data System (ADS)

    Murata, K. T.

    2014-12-01

    Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (1st paradigm), theoretical science (2nd paradigm) and numerical science (3rd paradigm). A science cloud is an infrastructure for this 4th methodology. The NICT Science Cloud is designed for big-data sciences of Earth, space and other fields, based on modern informatics and information technologies [1]. Data flows on the cloud through three techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented. We mash up these tools and applications on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big-data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS) and data crawling tools, NICTY/DLA and the Wide-area Observation Network Monitoring (WONM) system, respectively. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: GRID datafarm). It is effective since disaster recovery (DR) and parallel data processing are carried out simultaneously, without moving the big data from storage to storage. Data files are managed through our web application, WSDBank (World Science Data Bank). The big data on the cloud are processed via Pwrake, a workflow tool with high-bandwidth I/O. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data and STARStouch for multi-disciplinary data. There are 30 projects running on the NICT Science Cloud for Earth and space science. In 2003, 56 refereed papers were published. Finally, we introduce a couple of successful results in the Earth and space sciences obtained using these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp

  5. AMP: a science-driven web-based application for the TeraGrid

    NASA Astrophysics Data System (ADS)

    Woitaszek, M.; Metcalfe, T.; Shorrock, I.

    The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
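
    The clean separation described above can be pictured with a conventional Django model that records a run request, while a distinct grid-interface module handles submission to TeraGrid resources. A minimal, hypothetical sketch (the model, field names and the portal.gridclient module are invented for illustration, not AMP's actual code):

      from django.db import models

      class ModelRun(models.Model):
          """One asteroseismic model run requested through the portal."""
          STATUS_CHOICES = [("Q", "Queued"), ("R", "Running"), ("D", "Done")]

          star_name = models.CharField(max_length=100)
          frequencies_file = models.FileField(upload_to="observations/")
          status = models.CharField(max_length=1, choices=STATUS_CHOICES,
                                    default="Q")
          submitted_at = models.DateTimeField(auto_now_add=True)

          def submit_to_grid(self):
              # The grid interface lives behind this call, so the web tier
              # never touches scheduler details directly.
              from portal.gridclient import submit  # hypothetical module
              submit(self.pk, self.frequencies_file.path)
              self.status = "R"
              self.save()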

  6. The National Grid Project: A system overview

    NASA Technical Reports Server (NTRS)

    Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer, John; Weatherill, Nigel

    1995-01-01

    The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
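
    The distinction drawn here between structured and unstructured connectivity is easy to see in code: structured-grid neighbors come from incrementing indices, while unstructured grids need an explicit connectivity table. A small illustrative sketch:

      # Structured grid: neighbors of cell (i, j) by incrementing indices.
      def structured_neighbors(i, j, ni, nj):
          candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
          return [(a, b) for a, b in candidates if 0 <= a < ni and 0 <= b < nj]

      # Unstructured grid: a connectivity table maps each triangle to its
      # point indices; neighbor lookup requires building an adjacency map.
      triangles = [(0, 1, 2), (1, 3, 2), (2, 3, 4)]

      def point_neighbors(tris):
          adj = {}
          for tri in tris:
              for p in tri:
                  adj.setdefault(p, set()).update(q for q in tri if q != p)
          return adj

      print(structured_neighbors(0, 5, 10, 10))  # trivial index arithmetic
      print(point_neighbors(triangles))          # table-driven lookup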

  7. Real-time Social Media Data Analytics for Situational Awareness of the Electric Grid

    NASA Astrophysics Data System (ADS)

    Mao, H.; Chinthavali, S.; Lee, S.; Shankar, M.; Thiagarajan, S.

    2016-12-01

    With the increasing frequency of extreme events due to climate change, wide-area situational awareness (SA) of the electric grid has become a primary need for federal agencies such as DOE and FEMA for emergency preparedness and recovery purposes. While several sensor feeds from Genscape, GridEye, and PMUs provide a comprehensive view of the transmission grid, national-scale situational awareness tools still rely on utility websites for outage information at the distribution level. The inconsistency and variety of outage websites' data formats make this approach unreliable and also incur huge software maintenance costs. Social media has emerged as a great medium for utilities to share outage information with their customers. Despite their potential usefulness, extracting relevant data from these social media data streams is challenging due to the inherent noise and irrelevant information, such as tips to customers during storms, marketing, etc. In this study, we implement a practical and novel machine-learning-based data-analytics pipeline (Fig 1) for SA, which extracts real-time tweets from around 300 utility companies and processes these tweets using keyword filtering and a Naïve Bayes text classifier, trained using supervised learning techniques, to detect only relevant tweets. We validated the results against those identified by a human analyst over a 48-hour period, showing around 98.3% accuracy. In addition to the tweets posted by utility companies, millions of Twitter users, who can be considered human "social sensors", report power outages online. Therefore, we use the Twitter Streaming API to extract real-time tweets containing keywords such as "power outage", "blackout", and "power cuts". An advanced natural language processing technique is proposed to identify the geo-locations associated with this power outage data. The detected tweets are visualized on color-coded state- and county-level US maps based on the number of outage tweets posted. Therefore, by analyzing a large number of tweets posted by utilities and the general public, our approach can detect real-time power outages at a national scale. This framework has been integrated into existing SA tools such as VERDE, EARSS and EAGLE-I, which are deployed by the Oak Ridge National Laboratory for the DOE.
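
    The keyword-filtering plus Naïve Bayes stage of such a pipeline can be sketched with scikit-learn. This is a toy illustration with invented training tweets, not the deployed ORNL classifier:

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      KEYWORDS = ("power outage", "blackout", "power cut")

      train_tweets = [
          "Crews restoring power after outage in the north district",
          "Blackout reported, 2,000 customers affected",
          "Tips to save energy this summer",
          "Follow us for storm-season marketing updates",
      ]
      labels = [1, 1, 0, 0]  # 1 = outage-relevant, 0 = irrelevant

      clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
      clf.fit(train_tweets, labels)

      def is_relevant(tweet):
          # Cheap keyword prefilter before the statistical classifier.
          if not any(k in tweet.lower() for k in KEYWORDS):
              return False
          return bool(clf.predict([tweet])[0])

      print(is_relevant("Major power outage on the east side right now"))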

  8. The Role of Motor Learning in Spatial Adaptation near a Tool

    PubMed Central

    Brown, Liana E.; Doole, Robert; Malfait, Nicole

    2011-01-01

    Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool, the passive training group received visual experience with the tool, but no motor experience, and finally, a no-training control group received neither visual nor motor experience using the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near or far from the target display. Only the active training group detected targets more quickly when the tool was placed near, rather than far, from the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented. PMID:22174944

  9. Case-based fracture image retrieval.

    PubMed

    Zhou, Xin; Stern, Richard; Müller, Henning

    2012-05-01

    Case-based fracture image retrieval can assist surgeons in decisions regarding new cases by supplying visually similar past cases. This tool may guide fracture fixation and management through comparison of long-term outcomes in similar cases. A fracture image database collected over 10 years at the orthopedic service of the University Hospitals of Geneva was used. This database contains 2,690 fracture cases associated with 43 classes (based on the AO/OTA classification). A case-based retrieval engine was developed and evaluated using retrieval precision as a performance metric. Only cases in the same class as the query case are considered relevant. The scale-invariant feature transform (SIFT) is used for image analysis. Performance evaluation was computed in terms of mean average precision (MAP) and early precision (P10, P30). Retrieval results produced with the GNU image finding tool (GIFT) were used as a baseline. Two sampling strategies were evaluated: one used dense 40 × 40 pixel-grid sampling, and the second used the standard SIFT features. Based on dense pixel-grid sampling, three unsupervised feature selection strategies were introduced to further improve retrieval performance. With dense pixel-grid sampling, the image is divided into 1,600 (40 × 40) square blocks. The goal is to emphasize the salient regions (blocks) and ignore irrelevant regions. Regions are considered important when a high variance of the visual features is found. The first strategy is to calculate the variance of all descriptors over the global database. The second strategy is to calculate the variance of all descriptors for each case. A third strategy is to perform a thumbnail image clustering in a first step and then to calculate the variance for each cluster. Finally, a fusion between the SIFT-based system and GIFT is performed. A first comparison of sampling strategies using SIFT features shows that dense sampling using a pixel grid (MAP = 0.18) outperformed the SIFT detector-based sampling approach (MAP = 0.10). In a second step, the three unsupervised feature selection strategies were evaluated. A grid parameter search was applied to optimize parameters for feature selection and clustering. Results show that using half of the regions (700 or 800) obtains the best performance for all three strategies. Increasing the number of clusters can also improve the retrieval performance. The SIFT descriptor variance in each case gave the best indication of saliency for the regions (MAP = 0.23), better than the other two strategies (MAP = 0.20 and 0.21). Combining GIFT (MAP = 0.23) and the best SIFT strategy (MAP = 0.23) produced significantly better results (MAP = 0.27) than each system alone. A case-based fracture retrieval engine was developed and is available for online demonstration. SIFT is used to extract local features, and three feature selection strategies were introduced and evaluated. A baseline using the GIFT system was used to evaluate the salient-point-based approaches. Without supervised learning, SIFT-based systems with optimized parameters slightly outperformed the GIFT system. A fusion of the two approaches shows that the information contained in them is complementary. Supervised learning on the feature space is foreseen as the next step of this study.
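
    The dense 40 × 40 grid sampling contrasted above with SIFT's own keypoint detector amounts, in OpenCV, to computing SIFT descriptors at fixed grid positions. A generic sketch (the 40 × 40 block count comes from the paper; the per-descriptor variance below is a simplified stand-in for the paper's variance-based selection strategies):

      import cv2
      import numpy as np

      img = cv2.imread("fracture.png", cv2.IMREAD_GRAYSCALE)
      h, w = img.shape

      # Centers of a 40 x 40 grid of square blocks covering the image.
      ys = np.linspace(0, h, 40, endpoint=False) + h / 80
      xs = np.linspace(0, w, 40, endpoint=False) + w / 80
      keypoints = [cv2.KeyPoint(float(x), float(y), w / 40)
                   for y in ys for x in xs]

      sift = cv2.SIFT_create()
      _, descriptors = sift.compute(img, keypoints)  # (1600, 128) array

      # Keep the half of the regions whose descriptors vary the most,
      # echoing the "use half of the regions" finding reported above.
      variances = descriptors.var(axis=1)
      salient = np.argsort(variances)[-800:]
      print(descriptors[salient].shape)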

  10. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetration of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, despite the decreasing cost of solar PV, the cost of integrating solar power will increase as the penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices, and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby giving the grid advance warning to schedule ancillary generation more accurately, or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy, and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue requires atmospheric science and engineering research, while the second requires detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market; (b) development of a sky imager to provide short-term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools, which have seen extensive use in the atmospheric sciences, supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.

  11. Energy Management Challenges and Opportunities with Increased Intermittent Renewable Generation on the California Electrical Grid

    NASA Astrophysics Data System (ADS)

    Eichman, Joshua David

    Renewable resources, including wind, solar, geothermal, biomass, hydroelectric, wave and tidal, represent an opportunity for environmentally preferred generation of electricity that also increases energy security and independence. California is very proactive in encouraging the implementation of renewable energy, in part through legislation like Assembly Bill 32 and the development and execution of Renewable Portfolio Standards (RPS); however, renewable technologies are not without challenges. All renewable resources have some resource limitations, whether of location, capacity, cost or availability. Technologies like wind and solar are intermittent in nature but represent one of the most abundant resources for generating renewable electricity. If RPS goals are to be achieved, high levels of intermittent renewables must be considered. This work explores the effects of high penetration of renewables on a grid system with respect to resource availability, and identifies, from the grid's perspective, the key challenges of introducing these resources. The HiGRID tool was developed for this analysis because no other tool could explore grid operation, while maintaining system reliability, with a diverse set of renewable resources and a wide array of complementary technologies including energy efficiency, demand response, energy storage technologies and electric transportation. This tool resolves the hourly operation of conventional generation resources (nuclear, coal, geothermal, natural gas and hydro). The resulting behavior from introducing additional renewable resources, and the lifetime costs for each technology, are analyzed.
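
    The hourly-resolution balancing that a tool like HiGRID performs can be caricatured as a merit-order dispatch against net load (load minus renewable generation). The sketch below is a generic illustration with invented capacities, not the HiGRID algorithm:

      # Dispatchable fleet as (name, capacity_MW) in merit order.
      FLEET = [("nuclear", 2000), ("geothermal", 500),
               ("hydro", 1500), ("natural_gas", 6000)]

      def dispatch(load_mw, renewable_mw):
          """Fill hourly net load from the fleet in merit order."""
          net = max(load_mw - renewable_mw, 0.0)  # excess renewables curtailed
          schedule, remaining = {}, net
          for name, cap in FLEET:
              used = min(cap, remaining)
              schedule[name] = used
              remaining -= used
          schedule["unserved"] = remaining  # nonzero flags a reliability gap
          return schedule

      for hour, (load, wind_solar) in enumerate([(9000, 1000), (10500, 4000)]):
          print(hour, dispatch(load, wind_solar))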

  12. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services: the first level provides tools for extracting spatiotemporal knowledge from image sets, and the second level provides high-level knowledge management and reasoning services. We then present the cellular imaging markup language, an XML-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
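
    To make the idea of an XML-based spatiotemporal representation concrete, here is a hypothetical fragment in the spirit of the cellular imaging markup language, built with the Python standard library (all element and attribute names are invented; the actual schema is defined in the paper):

      import xml.etree.ElementTree as ET

      # Hypothetical encoding of one spatiotemporal event: a cell dividing.
      event = ET.Element("cellEvent", type="mitosis")
      ET.SubElement(event, "object", id="cell-42", cellType="fibroblast")
      loc = ET.SubElement(event, "location", frame="117")
      ET.SubElement(loc, "centroid", x="310.5", y="128.0")
      ET.SubElement(event, "interval", startFrame="112", endFrame="121")

      print(ET.tostring(event, encoding="unicode"))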

  13. Notes on Accuracy of Finite-Volume Discretization Schemes on Irregular Grids

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2011-01-01

    Truncation-error analysis is a reliable tool in predicting convergence rates of discretization errors on regular smooth grids. However, it is often misleading in application to finite-volume discretization schemes on irregular (e.g., unstructured) grids. Convergence of truncation errors severely degrades on general irregular grids; a design-order convergence can be achieved only on grids with a certain degree of geometric regularity. Such degradation of truncation-error convergence does not necessarily imply a lower-order convergence of discretization errors. In these notes, irregular-grid computations demonstrate that the design-order discretization-error convergence can be achieved even when truncation errors exhibit a lower-order convergence or, in some cases, do not converge at all.
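
    The two error notions the notes keep apart can be written out explicitly. With exact solution u, discrete operator L_h, discrete solution u_h satisfying L_h u_h = f_h, and a restriction I_h of continuous functions to the grid, the standard verification definitions are (notation supplied here for clarity, not taken from the notes):

      \[
      E_h \;=\; I_h u - u_h \qquad\text{(discretization error)},
      \]
      \[
      \tau_h \;=\; L_h\!\left(I_h u\right) - f_h \qquad\text{(truncation error)}.
      \]

    The observation above is that, on irregular grids, one may find E_h = O(h^p) at the design order p even though \tau_h = O(h^q) with q < p, or does not converge at all.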

  14. Investigating Time-Varying Drivers of Grid Project Emissions Impacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Emily L.; Thayer, Brandon L.; Pal, Seemita

    The emissions consequences of smart grid technologies depend heavily on their context and vary not only by geographical location, but by time of year. The same technology operated to meet the same objective may increase the emissions associated with energy generation for part of the year and decrease emissions during other times. The Grid Project Impact Quantification (GridPIQ) tool provides the ability to estimate these seasonal variations and garner insight into the time-varying drivers of grid project emissions impacts. This work leverages GridPIQ to examine the emissions implications, across years and seasons, of adding energy storage technology to reduce daily peak demand in California and New York.
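
    The season-by-season accounting such a tool performs can be illustrated with a toy calculation: the emissions impact of a project in each hour is the load change times that hour's marginal emissions rate, summed by season. The numbers below are invented placeholders, not GridPIQ's actual models:

      import numpy as np

      hours = 8760
      rng = np.random.default_rng(0)
      # Hypothetical hourly marginal emissions rate (kg CO2/MWh) and the
      # project's hourly load change (MW; charging +, discharging -).
      marginal = 400 + 150 * rng.random(hours)
      delta_load = rng.normal(0.0, 5.0, hours)

      month = np.repeat(np.arange(1, 13), 730)  # crude 730-hour "months"
      season = (month % 12) // 3                # 0=DJF, 1=MAM, 2=JJA, 3=SON

      impact = marginal * delta_load            # kg CO2 per hour
      for s, name in enumerate(["DJF", "MAM", "JJA", "SON"]):
          print(name, round(impact[season == s].sum() / 1000, 1), "t CO2")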

  15. Intraoperative visualization and assessment of electromagnetic tracking error

    NASA Astrophysics Data System (ADS)

    Harish, Vinyas; Ungi, Tamas; Lasso, Andras; MacDonald, Andrew; Nanji, Sulaiman; Fichtinger, Gabor

    2015-03-01

    Electromagnetic tracking allows for increased flexibility in designing image-guided interventions; however, it is well understood that electromagnetic tracking is prone to error. Visualization and assessment of the tracking error should take place in the operating room with minimal interference with the clinical procedure. The goal was to achieve this ideal in an open-source software implementation, in a plug-and-play manner that requires no programming from the user. We use optical tracking as a ground truth. An electromagnetic sensor and optical markers are mounted onto a stylus device, pivot calibrated for both trackers. Electromagnetic tracking error is defined as the difference in tool tip position between the electromagnetic and optical readings. Multiple measurements are interpolated into a thin-plate B-spline transform visualized in real time using 3D Slicer. All tracked devices are used in a plug-and-play manner through the open-source SlicerIGT and PLUS extensions of the 3D Slicer platform. Tracking error was measured multiple times to assess the reproducibility of the method, both with and without ferromagnetic objects placed in the workspace. Results from exhaustive grid sampling and freehand sampling were similar, indicating that a quick freehand sampling is sufficient to detect unexpected or excessive field distortion in the operating room. The software is available as a plug-in for the 3D Slicer platform. Results demonstrate the potential for visualizing electromagnetic tracking error in real time in intraoperative environments, for feasibility clinical trials in image-guided interventions.
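
    The error metric defined above (difference of tool-tip position between the two trackers) reduces to a few lines of linear algebra once each tracker reports a homogeneous tip-to-reference transform. A generic numpy sketch with invented readings, not the SlicerIGT code:

      import numpy as np

      def tip_position(transform_4x4):
          """Tool tip in reference coordinates, given a homogeneous
          transform from the pivot-calibrated tip frame."""
          tip_local = np.array([0.0, 0.0, 0.0, 1.0])  # tip is frame origin
          return (transform_4x4 @ tip_local)[:3]

      # Hypothetical simultaneous readings from both trackers (mm).
      T_em = np.array([[1, 0, 0, 10.2], [0, 1, 0, -3.1],
                       [0, 0, 1, 55.0], [0, 0, 0, 1.0]])
      T_opt = np.array([[1, 0, 0, 9.8], [0, 1, 0, -2.9],
                        [0, 0, 1, 54.1], [0, 0, 0, 1.0]])

      error_vec = tip_position(T_em) - tip_position(T_opt)
      print("EM tracking error: %.2f mm" % np.linalg.norm(error_vec))
      # Many such samples across the workspace are what get interpolated
      # into a smooth error map (e.g., thin-plate splines).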

  16. Application of Approximate Pattern Matching in Two Dimensional Spaces to Grid Layout for Biochemical Network Maps

    PubMed Central

    Inoue, Kentaro; Shimozono, Shinichi; Yoshida, Hideaki; Kurata, Hiroyuki

    2012-01-01

    Background: For visualizing large-scale biochemical network maps, it is important to calculate the coordinates of molecular nodes quickly and to enhance the understanding or traceability of them. The grid layout is effective in drawing compact, orderly, balanced network maps with node label spaces, but existing grid layout algorithms often require a high computational cost because they have to consider complicated positional constraints through the entire optimization process. Results: We propose a hybrid grid layout algorithm that consists of a non-grid, fast layout (preprocessor) algorithm and an approximate pattern matching algorithm that distributes the resultant preprocessed nodes on square grid points. To demonstrate the feasibility of the hybrid layout algorithm, it is characterized in terms of the calculation time, numbers of edge-edge and node-edge crossings, relative edge lengths, and F-measures. The proposed algorithm achieves outstanding performances compared with other existing grid layouts. Conclusions: Use of an approximate pattern matching algorithm quickly redistributes the nodes laid out by fast, non-grid algorithms onto the square grid points, while preserving the topological relationships among the nodes. The proposed algorithm is a novel use of pattern matching, thereby providing a breakthrough for grid layout. This application program can be freely downloaded from http://www.cadlive.jp/hybridlayout/hybridlayout.html. PMID:22679486

  17. Application of approximate pattern matching in two dimensional spaces to grid layout for biochemical network maps.

    PubMed

    Inoue, Kentaro; Shimozono, Shinichi; Yoshida, Hideaki; Kurata, Hiroyuki

    2012-01-01

    For visualizing large-scale biochemical network maps, it is important to calculate the coordinates of molecular nodes quickly and to enhance the understanding or traceability of them. The grid layout is effective in drawing compact, orderly, balanced network maps with node label spaces, but existing grid layout algorithms often require a high computational cost because they have to consider complicated positional constraints through the entire optimization process. We propose a hybrid grid layout algorithm that consists of a non-grid, fast layout (preprocessor) algorithm and an approximate pattern matching algorithm that distributes the resultant preprocessed nodes on square grid points. To demonstrate the feasibility of the hybrid layout algorithm, it is characterized in terms of the calculation time, numbers of edge-edge and node-edge crossings, relative edge lengths, and F-measures. The proposed algorithm achieves outstanding performances compared with other existing grid layouts. Use of an approximate pattern matching algorithm quickly redistributes the laid-out nodes by fast, non-grid algorithms on the square grid points, while preserving the topological relationships among the nodes. The proposed algorithm is a novel use of the pattern matching, thereby providing a breakthrough for grid layout. This application program can be freely downloaded from http://www.cadlive.jp/hybridlayout/hybridlayout.html.
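
    The core step shared by both versions of this record (redistributing preprocessed node positions onto square grid points while keeping their arrangement) can be sketched as a greedy nearest-free-grid-point assignment; the published approximate pattern matching algorithm is more sophisticated than this illustration:

      def snap_to_grid(points, spacing=1.0):
          """Greedily assign each laid-out node to the nearest free grid
          point, roughly preserving the topology of the input layout."""
          taken, placed = set(), []
          for x, y in points:
              gx, gy = round(x / spacing), round(y / spacing)
              for r in range(100):  # search outward until a free point
                  ring = [(gx + dx, gy + dy)
                          for dx in range(-r, r + 1)
                          for dy in range(-r, r + 1)]
                  free = [c for c in ring if c not in taken]
                  if free:
                      best = min(free,
                                 key=lambda c: (c[0] - x / spacing) ** 2
                                             + (c[1] - y / spacing) ** 2)
                      taken.add(best)
                      placed.append((best[0] * spacing, best[1] * spacing))
                      break
          return placed

      layout = [(0.2, 0.1), (0.3, 0.2), (1.7, 0.9)]  # preprocessed positions
      print(snap_to_grid(layout))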

  18. The Variable Grid Method, an Approach for the Simultaneous Visualization and Assessment of Spatial Trends and Uncertainty

    NASA Astrophysics Data System (ADS)

    Rose, K.; Glosser, D.; Bauer, J. R.; Barkhurst, A.

    2015-12-01

    The products of spatial analyses that leverage the interpolation of sparse point data to represent continuous phenomena are often presented without clear explanations of the uncertainty associated with the interpolated values. As a result, there is frequently insufficient information provided to effectively support advanced computational analyses and individual research and policy decisions utilizing these results. This highlights the need for a reliable approach capable of quantitatively producing and communicating spatial data analyses and their inherent uncertainties for a broad range of uses. To address this need, we have developed the Variable Grid Method (VGM), and an associated Python tool, a flexible approach that can be applied to a variety of analyses and use-case scenarios where users need a method to effectively study, evaluate, and analyze spatial trends and patterns while communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from data related to sample density, sample variance, interpolation error, uncertainty calculated from multiple simulations, etc. We will present examples of our research utilizing the VGM to quantify key spatial trends and patterns for subsurface data interpolations and their uncertainties, and to leverage these results to evaluate storage estimates and potential impacts associated with underground injection for CO2 storage and unconventional resource production and development. These examples show how the VGM can provide critical information about the relationship between uncertainty and spatial data that is necessary to better support their use in advanced computational analyses and to inform research, management and policy decisions.
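
    One of the uncertainty inputs mentioned above, sample density, is straightforward to compute per grid cell and can then drive the size of the displayed cells. A minimal sketch of the idea (not the released VGM Python tool):

      import numpy as np

      rng = np.random.default_rng(1)
      x, y = rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)  # sample sites

      # Count samples per coarse cell; low counts imply high uncertainty.
      counts, _, _ = np.histogram2d(x, y, bins=10, range=[[0, 100], [0, 100]])

      # Map density to display cell size: sparse areas get large cells,
      # signalling that interpolated values there are less trustworthy.
      cell_size = np.where(counts >= 0.5 * counts.max(), 1,   # fine
                  np.where(counts >= 0.2 * counts.max(), 2,   # medium
                           4))                                # coarse
      print(cell_size)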

  19. GIS Services, Visualization Products, and Interoperability at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.

    2007-12-01

    The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as model, satellite and radar data. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, the NOAA FTP server, or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be a compliant OGC specification) also provide access.
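
    Because the services are OGC-compliant, any client can pull a rendered map with a standard WMS GetMap request. The sketch below shows the request structure with a placeholder endpoint (the URL and layer name are illustrative, not NCDC's actual service addresses):

      import requests

      WMS_ENDPOINT = "https://example.gov/wms"  # placeholder endpoint

      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "radar_reflectivity",  # hypothetical layer name
          "crs": "EPSG:4326",
          "bbox": "25,-125,50,-65",        # lat/lon box over the CONUS
          "width": 1024,
          "height": 512,
          "format": "image/png",
      }

      resp = requests.get(WMS_ENDPOINT, params=params, timeout=30)
      resp.raise_for_status()
      with open("map.png", "wb") as f:
          f.write(resp.content)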

  20. Visualization of Underfill Flow in Ball Grid Array (BGA) using Particle Image Velocimetry (PIV)

    NASA Astrophysics Data System (ADS)

    Ng, Fei Chong; Abas, Aizat; Abustan, Ismail; Remy Rozainy, Z. Mohd; Abdullah, MZ; Jamaludin, Ali b.; Kon, Sharon Melissa

    2018-05-01

    This paper presents an experimental methodology using particle image velocimetry (PIV) to study the underfill process of ball grid array (BGA) chip packages. PIV is a non-intrusive approach to visualizing the flow behavior of underfill across the solder ball array. BGA models of three different configurations – perimeter, middle empty and full array – were studied in the current research. Through the PIV experimental work, the underfill velocity distributions and vector fields for each BGA model were successfully obtained. It is found that the perimeter configuration has the shortest filling time, resulting in a higher underfill velocity. It is therefore concluded that the flow behavior of underfill in BGA can be characterized thoroughly with the aid of PIV.

  1. Experimental Validation of an Ion Beam Optics Code with a Visualized Ion Thruster

    NASA Astrophysics Data System (ADS)

    Nakayama, Yoshinori; Nakano, Masakatsu

    For validation of an ion beam optics code, the behavior of ion beam optics was experimentally observed and evaluated with a two-dimensional visualized ion thruster (VIT). Since the observed beam focus positions, sheath positions and measured ion beam currents were in good agreement with the numerical results, it was confirmed that the numerical model of this code was appropriate. In addition, it was also confirmed that the beam focus position moved along the center axis of the grid hole according to the applied grid potentials, which differs from the conventional understanding/assumption. VIT operations may be useful not only for the validation of ion beam optics codes but also for a fundamental and intuitive understanding of Child Law sheath theory.
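
    For context, the Child law invoked above relates space-charge-limited ion current density to the potential across the sheath and the sheath thickness; in SI units the standard textbook form is

      \[
      J \;=\; \frac{4\varepsilon_0}{9}\,
              \sqrt{\frac{2e}{m_i}}\;
              \frac{V^{3/2}}{d^{2}},
      \]

    where \varepsilon_0 is the vacuum permittivity, e the elementary charge, m_i the ion mass, V the sheath potential, and d the sheath thickness. (Quoted as background; the paper's own numerical model is not reproduced here.)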

  2. Designing Wind and Solar Power Purchase Agreements to Support Grid Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Neill, Barbara; Chernyakhovskiy, Ilya

    Power purchase agreements (PPAs) represent one of many institutional tools that power systems can use to improve grid services from variable renewable energy (VRE) generators. This fact sheet introduces the concept of PPAs for VRE generators and provides a brief summary of key PPA components that can help VRE generators enhance grid stability and serve as a source of power system flexibility.

  3. Smagglce: Surface Modeling and Grid Generation for Iced Airfoils: Phase 1 Results

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Choo, Yung K.; Braun, Donald C.; Baez, Marivell; Gnepp, Steven

    1999-01-01

    SmaggIce (Surface Modeling and Grid Generation for Iced Airfoils) is a software toolkit used in the process of aerodynamic performance prediction of iced airfoils with grid-based Computational Fluid Dynamics (CFD). It includes tools for data probing, boundary smoothing, domain decomposition, and structured grid generation and refinement. SmaggIce provides the underlying computations to perform these functions, a GUI (Graphical User Interface) to control and interact with those functions, and graphical displays of results. It is being developed at NASA Glenn Research Center. This paper discusses the overall design of SmaggIce as well as what has been implemented in Phase 1. Phase 1 results provide two types of software tools: interactive ice shape probing and interactive ice shape control. The ice shape probing tools will provide aircraft icing engineers and scientists with an interactive means of measuring the physical characteristics of ice shapes. The ice shape control features of SmaggIce, on the other hand, will allow engineers to examine input geometry data, correct or modify any deficiencies in the geometry, and perform controlled, systematic smoothing to a level that makes the CFD process manageable.

  4. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, T. A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. This paper presents a procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  5. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, Thomas A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. A procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system is presented. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  6. Learning GIS and exploring geolocated data with the all-in-one Geolokit toolbox for Google Earth

    NASA Astrophysics Data System (ADS)

    Watlet, A.; Triantafyllou, A.; Bastin, C.

    2016-12-01

    GIS software packages are today's essential tools for gathering and visualizing geological data, applying spatial and temporal analysis and, finally, creating and sharing interactive maps for further investigation in the geosciences. Such skills are especially essential for students who go through field trips, sample collections or field experiments. However, there is generally not enough time to teach in detail all the aspects of visualizing geolocated geoscientific data. For these purposes, we developed Geolokit: a lightweight freeware dedicated to geodata visualization and written in Python, a high-level, cross-platform programming language. Geolokit is accessible through a graphical user interface designed to run in parallel with Google Earth, benefitting from its numerous interactive capabilities. It is designed as a very user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to visualize the results in the Google Earth environment using KML code, with no need for third-party software except Google Earth itself. Geolokit comes with a large number of geoscience labels, symbols, colours and placemarks and can display several types of geolocated data, including:
    - multi-point datasets;
    - automatically computed contours of multi-point datasets via several interpolation methods;
    - discrete planar and linear structural geology data in 2D or 3D, supporting a large range of structure input formats;
    - clustered stereonets and rose diagrams;
    - 2D cross-sections as vertical sections;
    - georeferenced maps and grids with user-defined coordinates;
    - field pictures using either geo-tracking metadata from a camera's built-in GPS module, or the same-day track of an external GPS.
    In the end, Geolokit is helpful for quickly visualizing and exploring data without losing too much time in the numerous capabilities of GIS software suites. We are looking for students and teachers to discover all the functionalities of Geolokit. As this project is under development and planned to be open source, we welcome discussion of particular needs or ideas, as well as contributions to the Geolokit project.
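
    Geolokit's output format, KML, is plain XML, so the mechanism is easy to picture. The sketch below writes a single placemark with the Python standard library (a generic illustration with invented sample data, not Geolokit's own code):

      import xml.etree.ElementTree as ET

      KML_NS = "http://www.opengis.net/kml/2.2"
      ET.register_namespace("", KML_NS)

      kml = ET.Element(f"{{{KML_NS}}}kml")
      doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
      pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
      ET.SubElement(pm, f"{{{KML_NS}}}name").text = "Sample GR-17"
      ET.SubElement(pm, f"{{{KML_NS}}}description").text = "granite outcrop"
      point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
      # KML coordinates are lon,lat[,alt]
      ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "4.35,50.84,0"

      ET.ElementTree(kml).write("sample.kml", xml_declaration=True,
                                encoding="UTF-8")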

  7. Direct Mask Overlay Inspection

    NASA Astrophysics Data System (ADS)

    Hsia, Liang-Choo; Su, Lo-Soun

    1983-11-01

    In this paper, we present a mask inspection methodology and procedure that involves direct X-Y measurements. A group of dice is selected for overlay measurement; four measurement targets are laid out in the kerf of each die. The measured coordinates are then fitted, in a least-squares fashion, to either a "historical" grid, which reflects the individual tool bias, or to an ideal grid. Measurements are done using a Nikon X-Y laser interferometric measurement system, which provides a reference grid. The stability of the measurement system is essential. We then apply appropriate statistics to the residuals after the fit to determine the overlay performance. Statistical methods play an important role in the product disposition. The acceptance criterion is, however, a compromise between the cost of mask making and the final device yield. In order to satisfy the demand on mask houses for high-quality masks and high volume, mixing lithographic tools in mask making has become more popular, in particular, mixing optical and E-beam tools. In this paper, we also discuss the inspection procedure for mixing different lithographic tools.
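
    Fitting measured target coordinates to a reference grid in a least-squares sense amounts to estimating a small linear transform (scale/rotation plus offset) and inspecting the residuals. A numpy sketch with invented numbers, purely to show the structure of the fit:

      import numpy as np

      # Ideal grid positions (um) and their measured counterparts.
      ideal = np.array([[0, 0], [1000, 0], [0, 1000], [1000, 1000]], float)
      measured = ideal @ [[1.0002, 0.00005], [-0.00005, 1.0002]] + [0.3, -0.2]
      measured += np.random.default_rng(2).normal(0, 0.05, measured.shape)

      # Solve measured ~ ideal @ A + t (affine fit, least squares).
      X = np.hstack([ideal, np.ones((len(ideal), 1))])   # rows are [x y 1]
      params, *_ = np.linalg.lstsq(X, measured, rcond=None)
      residuals = measured - X @ params

      print("scale/rotation:\n", params[:2])
      print("offset:", params[2])
      print("overlay residual spread (um):", residuals.std())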

  8. Sub-Second Parallel State Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.

    This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of the fast computational speed for power system applications. The test data were provided by BPA. They are two days' worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today's commercial tools. This improved computational performance can help increase the reliability value of state estimation in many aspects: (1) the shorter the time required to execute state estimation, the more time remains for operators to take appropriate actions and/or to apply automatic or manual corrective control actions, increasing the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance, so its robustness can be enhanced by repeating the execution with adaptive adjustments, including removing bad data and/or adjusting different initial conditions, to compute a better estimate within the same time as a traditional state estimator's single estimate. There are other benefits of sub-second SE: the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that currently depend on raw measurements, minimizing the impact of bad measurements and providing opportunities to enhance power grid reliability and efficiency. PSE can also enable other advanced tools that rely on SE outputs, which could further improve operators' actions and automated controls to mitigate the effects of severe events on the grid. The power grid continues to grow, and the number of measurements is increasing at an accelerated rate due to the variety of smart grid devices being introduced. A parallel state estimation implementation will have better performance than traditional, sequential state estimation by utilizing the power of high performance computing (HPC). This increased performance positions parallel state estimators as valuable tools for operating the increasingly complex power grid.
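
    The computation being parallelized here is, at its core, a weighted least-squares solve. A linearized (DC-style) toy version makes the structure clear (illustrative matrices, not BPA's network model):

      import numpy as np

      # z = H x + noise: measurements z relate to states x through H,
      # weighted by assumed meter accuracies.
      H = np.array([[1.0, 0.0],
                    [-1.0, 1.0],
                    [0.5, 0.5],
                    [0.0, 1.0]])
      z = np.array([1.02, -0.48, 0.77, 0.55])
      W = np.diag([100.0, 100.0, 50.0, 100.0])  # inverse error variances

      # Weighted least squares: x = (H^T W H)^{-1} H^T W z.
      G = H.T @ W @ H                  # gain matrix (what HPC solvers attack)
      x_hat = np.linalg.solve(G, H.T @ W @ z)

      residuals = z - H @ x_hat        # large residuals flag bad data
      print("state estimate:", x_hat)
      print("residuals:", residuals)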

  9. Development of a gridded meteorological dataset over Java island, Indonesia 1985-2014.

    PubMed

    Yanto; Livneh, Ben; Rajagopalan, Balaji

    2017-05-23

    We describe a gridded daily meteorology dataset consisting of precipitation and minimum and maximum temperature over Java Island, Indonesia, at 0.125°×0.125° (~14 km) resolution, spanning the 30 years from 1985-2014. Importantly, this data set represents a marked improvement over existing gridded data sets for Java, with higher spatial resolution and derived exclusively from ground-based observations, unlike existing satellite- or reanalysis-based products. Gap-infilling and gridding were performed via the Inverse Distance Weighting (IDW) interpolation method (radius, r, of 25 km and power of influence, α, of 3 as optimal parameters), restricted to only those stations with at least 3,650 days (~10 years) of valid data. Cross-validation against the MSWEP and CHIRPS rainfall products shows that the gridded rainfall presented here performs most reasonably. Visual inspection reveals increasing performance of the gridded precipitation from grid to watershed to island scale. The data set, stored in network common data form (NetCDF), is intended to support watershed-scale and island-scale studies of short-term and long-term climate, hydrology and ecology.
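
    The gap-infilling and gridding step is classical inverse distance weighting with the stated parameters (search radius r = 25 km, power α = 3). A compact sketch of the interpolator itself, with coordinates assumed to be already in kilometres:

      import numpy as np

      def idw(xy_obs, values, xy_grid, radius_km=25.0, alpha=3.0):
          """Inverse Distance Weighting with a fixed search radius."""
          out = np.full(len(xy_grid), np.nan)
          for k, g in enumerate(xy_grid):
              d = np.linalg.norm(xy_obs - g, axis=1)
              near = d <= radius_km
              if not near.any():
                  continue                       # no station in range
              if (d[near] == 0).any():           # exactly on a station
                  out[k] = values[near][d[near] == 0][0]
                  continue
              w = 1.0 / d[near] ** alpha         # alpha = 3 per the paper
              out[k] = np.sum(w * values[near]) / w.sum()
          return out

      stations = np.array([[0, 0], [10, 5], [20, 20]], float)
      rain = np.array([12.0, 7.5, 0.0])
      grid = np.array([[5, 2], [18, 18], [60, 60]], float)
      print(idw(stations, rain, grid))           # last point: nan (no station)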

  10. Accuracy Analysis for Finite-Volume Discretization Schemes on Irregular Grids

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2010-01-01

    A new computational analysis tool, the downscaling test, is introduced and applied for studying the convergence rates of truncation and discretization errors of finite-volume discretization schemes on general irregular (e.g., unstructured) grids. The study shows that design-order convergence of discretization errors can be achieved even when truncation errors exhibit a lower-order convergence or, in some cases, do not converge at all. The downscaling test is a general, efficient, accurate, and practical tool, enabling straightforward extension of verification and validation to general unstructured-grid formulations. It also allows separate analysis of the interior, boundaries, and singularities, which could be useful even in structured-grid settings. There are several new findings arising from the use of the downscaling test analysis. It is shown that the discretization accuracy of a common node-centered finite-volume scheme, known to be second-order accurate for inviscid equations on triangular grids, degenerates to first order for mixed grids. Alternative node-centered schemes are presented and demonstrated to provide second- and third-order accuracies on general mixed grids. The local accuracy deterioration at intersections of tangency and inflow/outflow boundaries is demonstrated using downscaling tests tailored to examining the local behavior of the boundary conditions. The discretization-error order reduction within inviscid stagnation regions is demonstrated. The accuracy deterioration is local, affecting mainly the velocity components, but applies to any order scheme.

  11. Animating Wall-Bounded Turbulent Smoke via Filament-Mesh Particle-Particle Method.

    PubMed

    Liao, Xiangyun; Si, Weixin; Yuan, Zhiyong; Sun, Hanqiu; Qin, Jing; Wang, Qiong; Heng, Pheng-Ann

    2018-03-01

    Turbulent vortices in smoke flows are crucial for a visually interesting appearance. Unfortunately, it is challenging to efficiently simulate these appealing effects in the framework of vortex filament methods. The vortex-filaments-in-grids scheme can efficiently generate turbulent smoke with macroscopic vortical structures, but suffers from projection-related dissipation, so the small-scale vortical structures below grid resolution are hard to capture. In addition, this scheme cannot be applied to wall-bounded turbulent smoke simulation, which requires efficiently handling smoke-obstacle interaction and creating vorticity at the obstacle boundary. To tackle these issues, we propose an effective filament-mesh particle-particle (FMPP) method for fast wall-bounded turbulent smoke simulation with ample detail. The filament-mesh component approximates the smooth long-range interactions by splatting vortex filaments onto a grid, solving the Poisson problem with a fast solver, and then interpolating back to smoke particles. The particle-particle component introduces a smoothed particle hydrodynamics (SPH) turbulence model for particles in the same grid cell, where interactions between particles cannot be properly captured at grid resolution. We then sample the surface of obstacles with boundary particles, allowing the interaction between smoke and obstacles to be treated as pressure forces in SPH. In addition, a vortex formation region is defined at the back of obstacles, providing smoke particles flowing past the separation particles with a vorticity force to simulate the subsequent vortex shedding phenomenon. The proposed approach can synthesize the lost small-scale vortical structures and also achieve smoke-obstacle interaction with vortex shedding at obstacle boundaries in a lightweight manner. The experimental results demonstrate that our FMPP method can achieve more appealing visual effects than the vortex-filaments-in-grids scheme by efficiently simulating more vivid, thin turbulent features.

  12. A Computing Infrastructure for Supporting Climate Studies

    NASA Astrophysics Data System (ADS)

    Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team

    2011-12-01

    Climate change is one of the major challenges facing the Earth in the 21st century. Scientists build many models to simulate the past and predict climate change for the coming decades or century. Most of the models run at low resolution, with some targeting high resolution in support of practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA Climate@Home project, which builds a virtual supercomputer based on advanced computing technologies such as cloud computing and grid computing. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server that dispatches model runs and collects results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow the latest news about the project on Twitter and Facebook. This paper introduces the latest progress of the project; the operational system will be demonstrated during the AGU fall meeting. It also discusses how this technology can become a trailblazer for other climate studies and relevant sciences, and shares how challenges in computation and software integration were solved.
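
    The dispatch-and-collect pattern described in item 2) is essentially a map/reduce over model configurations. A minimal local sketch, with a hypothetical run_model entry point standing in for the real distributed engine:

    ```python
    # Map: dispatch model runs over a pool of workers.
    # Reduce: collect results and pick the best-scoring configuration.
    from multiprocessing import Pool

    def run_model(config):
        """Hypothetical model run returning a (config, skill) pair."""
        skill = -abs(config["sensitivity"] - 3.0)   # placeholder skill metric
        return config, skill

    if __name__ == "__main__":
        configs = [{"run_id": i, "sensitivity": 1.5 + 0.5 * i} for i in range(8)]
        with Pool(processes=4) as pool:
            results = pool.map(run_model, configs)   # dispatch model runs
        best = max(results, key=lambda r: r[1])      # collect and reduce
        print("best configuration:", best[0])
    ```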

  13. Model Data Interoperability for the United States Integrated Ocean Observing System (IOOS)

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.

    2010-05-01

    Model data interoperability for the United States Integrated Ocean Observing System (IOOS) was initiated with a focused one-year project. The problem was that there were many regional and national providers of oceanographic model data; each had unique file conventions, distribution techniques, and analysis tools that made it difficult to compare model results and observational data. To solve this problem, a distributed system was built utilizing a customized middleware layer and a common data model. This allowed each model data provider to keep their existing model and data files unchanged, yet deliver model data via web services in a common form. With standards-based applications that used these web services, end users then had a common way to access data from any of the models. These applications included: (1) 2D mapping and animation in a web browser application, (2) advanced 3D visualization and animation in a desktop application, and (3) a toolkit for a common scientific analysis environment. Due to the flexibility and low impact of the approach on providers, rapid progress was made. The system was implemented in all eleven US IOOS regions and at the NOAA National Coastal Data Development Center, allowing common delivery of regional and national oceanographic model forecast and archived results that cover all US waters. The system, based heavily on software technology from the NSF-sponsored Unidata Program Center, is applicable to any structured gridded data, not just oceanographic model data. There is a clear pathway to expand the system to include unstructured grid (e.g., triangular grid) data.
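
    The practical payoff of a common data model is that any compliant dataset can be read with the same few lines of client code. A sketch using the netCDF4 Python library against a hypothetical OPeNDAP endpoint (the URL and variable names are placeholders, not an actual IOOS service):

    ```python
    # Reading remotely served gridded model output in the "common form":
    # the same client code works regardless of which provider serves the data.
    from netCDF4 import Dataset

    url = "http://example.org/thredds/dodsC/regional_ocean_model"  # hypothetical
    with Dataset(url) as ds:
        sst = ds.variables["temp"][0, 0, :, :]   # first time step, surface layer
        lat = ds.variables["lat"][:]
        lon = ds.variables["lon"][:]
        print(sst.shape, lat.min(), lon.min())
    ```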

  14. Automated CFD Database Generation for a 2nd Generation Glide-Back-Booster

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Michael J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejmil, Edward

    2003-01-01

    A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment using 13 computers located at 4 different geographical sites. Process automation and web-based access to the database greatly reduce the user workload, removing much of the tedium and tendency for user input errors. The database consists of forces, moments, and solution files obtained by varying the Mach number, angle of attack, and sideslip angle. The forces and moments compare well with experimental data. Stability derivatives are also computed using a monotone cubic spline procedure. Flow visualization and three-dimensional surface plots are used to interpret and characterize the nature of the computed flow fields.
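
    The stability-derivative step can be sketched with a monotone cubic spline in the spirit of the procedure described above; SciPy's PchipInterpolator is one such monotone scheme. The coefficient values below are invented for illustration:

    ```python
    # Differentiating a monotone cubic spline fit to tabulated database values
    # to obtain a stability derivative (here, a hypothetical dCm/dalpha).
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    alpha = np.array([-4.0, 0.0, 4.0, 8.0, 12.0])        # angle of attack, deg
    cm = np.array([0.08, 0.02, -0.05, -0.13, -0.20])     # invented pitching moments

    spline = PchipInterpolator(alpha, cm)
    dcm_dalpha = spline.derivative()
    print(dcm_dalpha(2.0))   # Cm_alpha at alpha = 2 deg (per degree)
    ```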

  15. RGLite, an interface between ROOT and gLite—proof on the grid

    NASA Astrophysics Data System (ADS)

    Malzacher, P.; Manafov, A.; Schwarz, K.

    2008-07-01

    Using the gLitePROOF package, it is possible to perform PROOF-based distributed data analysis on the gLite Grid. The LHC experiments have managed to run globally distributed Monte Carlo productions on the Grid; now the development of tools for data analysis is in the foreground. To grant access, interfaces must be provided. The ROOT/PROOF framework is used as a starting point: using abstract ROOT classes (TGrid, ...), interfaces can be implemented through which Grid access from ROOT can be accomplished. A concrete implementation exists for the ALICE Grid environment AliEn. Within the D-Grid project, an interface to gLite, the common Grid middleware of all LHC experiments, has been created. It is therefore possible to query Grid file catalogues from ROOT for the location of the data to be analysed. Grid jobs can be submitted into a gLite-based Grid, their status can be queried, and their results can be obtained.
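
    As a rough illustration of the access pattern (an assumption-laden sketch, not verified RGLite usage), the abstract TGrid interface might be driven from PyROOT as follows; the protocol string and catalogue path are placeholders:

    ```python
    import ROOT  # PyROOT; assumes a ROOT build with the RGLite plugin available

    # Protocol string and catalogue path are assumptions for illustration.
    grid = ROOT.TGrid.Connect("glite")
    if grid:
        result = grid.Query("/grid/lhc/data", "*.root")  # hypothetical path
        print(result.GetEntries(), "matching files in the catalogue")
    ```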

  16. DICOMGrid: a middleware to integrate PACS and EELA-2 grid infrastructure

    NASA Astrophysics Data System (ADS)

    Moreno, Ramon A.; de Sá Rebelo, Marina; Gutierrez, Marco A.

    2010-03-01

    Medical images provide a wealth of information for physicians, but the huge amount of data produced by medical imaging equipment in a modern health institution has yet to be explored to its full potential. Nowadays medical images are used in hospitals mostly as part of routine activities, while their intrinsic value for research is underestimated. Medical images can be used for the development of new visualization techniques, new algorithms for patient care, and new image processing techniques. These research areas usually require huge volumes of data to obtain significant results, along with enormous computing capabilities. Such qualities are characteristic of grid computing systems such as the EELA-2 infrastructure. Grid technologies allow the sharing of data on a large scale in a safe and integrated environment and offer high computing capabilities. In this paper we describe DICOMGrid, a middleware to store and retrieve medical images, properly anonymized, that can be used by researchers to test new processing techniques using the computational power offered by grid technology. A prototype of DICOMGrid is under evaluation; it permits the submission of jobs into the EELA-2 grid infrastructure while offering a simple interface that requires minimal understanding of grid operation.

  17. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation, and the use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turnaround time for specific grid generation and CFD projects. It was concluded that a single grid generation methodology is not universally suited for all CFD applications, due to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows the implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  18. Efficacy of the Amsler Grid Test in Evaluating Glaucomatous Central Visual Field Defects.

    PubMed

    Su, Daniel; Greenberg, Andrew; Simonson, Joseph L; Teng, Christopher C; Liebmann, Jeffrey M; Ritch, Robert; Park, Sung Chul

    2016-04-01

    To investigate the efficacy of the Amsler grid test in detecting central visual field (VF) defects in glaucoma. Prospective, cross-sectional study. Patients with glaucoma with reliable Humphrey 10-2 Swedish Interactive Threshold Algorithm standard VF on the date of enrollment or within the previous 3 months. Amsler grid tests were performed for each eye and were considered "abnormal" if there was any perceived scotoma with missing or blurry grid lines within the central 10 degrees ("Amsler grid scotoma"). An abnormal 10-2 VF was defined as ≥3 adjacent points at P < 0.01 with at least 1 point at P < 0.005 in the same hemifield on the pattern deviation plot. Sensitivity, specificity, and positive and negative predictive values of the Amsler grid scotoma area were calculated with the 10-2 VF as the clinical reference standard. Among eyes with an abnormal 10-2 VF, regression analyses were performed between the Amsler grid scotoma area and the 10-2 VF parameters (mean deviation [MD], scotoma extent [number of test points with P < 0.01 in the total deviation map], and scotoma mean depth [mean sensitivity of test points with P < 0.01 in the total deviation map]). Main outcome measures were the sensitivity, specificity, and positive and negative predictive values of the Amsler grid scotoma area. A total of 106 eyes (53 patients) were included (mean ± standard deviation: age, 66 ± 12 years; 24-2 MD, -9.61 ± 8.64 decibels [dB]; 10-2 MD, -9.75 ± 9.00 dB). Sensitivity, specificity, and positive and negative predictive values of the Amsler grid test were 68%, 92%, 97%, and 46%, respectively. Sensitivity was 40% in eyes with 10-2 MD better than -6 dB, 58% in eyes with 10-2 MD between -12 and -6 dB, and 92% in eyes with 10-2 MD worse than -12 dB. The area under the receiver operating characteristic curve of the Amsler grid scotoma area was 0.810 (95% confidence interval, 0.723-0.880; P < 0.001). The Amsler grid scotoma area had the strongest relationship with 10-2 MD (quadratic R² = 0.681), followed by 10-2 scotoma extent (quadratic R² = 0.611) and 10-2 scotoma mean depth (quadratic R² = 0.299) (all P < 0.001). The Amsler grid can be used to screen for moderate to severe central vision loss from glaucoma. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
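
    The reported screening statistics follow from a standard 2 × 2 table of Amsler grid result versus the 10-2 VF reference standard. A minimal sketch; the counts below are invented but chosen to be consistent with the 106 eyes and the reported percentages:

    ```python
    # Screening measures from a 2x2 contingency table
    # (test result vs. clinical reference standard).
    def screening_stats(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # Hypothetical counts consistent with 106 eyes and 68/92/97/46%:
    print(screening_stats(tp=56, fp=2, fn=26, tn=22))
    ```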

  19. Adaptive refinement tools for tetrahedral unstructured grids

    NASA Technical Reports Server (NTRS)

    Pao, S. Paul (Inventor); Abdol-Hamid, Khaled S. (Inventor)

    2011-01-01

    An exemplary embodiment providing one or more improvements includes software which is robust, efficient, and has a very fast run time for user directed grid enrichment and flow solution adaptive grid refinement. All user selectable options (e.g., the choice of functions, the choice of thresholds, etc.), other than a pre-marked cell list, can be entered on the command line. The ease of application is an asset for flow physics research and preliminary design CFD analysis where fast grid modification is often needed to deal with unanticipated development of flow details.

  20. Young-Person's Guide to Detached-Eddy Simulation Grids

    NASA Technical Reports Server (NTRS)

    Spalart, Philippe R.; Streett, Craig (Technical Monitor)

    2001-01-01

    We give the "philosophy," fairly complete instructions, a sketch, and examples of creating Detached-Eddy Simulation (DES) grids from simple to elaborate, with a priority on external flows. Although DES is not a zonal method, flow regions with widely different gridding requirements emerge and should be accommodated as far as possible if good use of grid points is to be made. This is not unique to DES. We touch on the time-step choice, on simple pitfalls, and on tools to estimate whether a simulation is well resolved.

  1. "It's Not Much of a Life": The Benefits and Ethics of Using Life History Methods With People Who Inject Drugs in Qualitative Harm Reduction Research.

    PubMed

    Harris, Magdalena; Rhodes, Tim

    2018-06-01

    A life history approach enables study of how risk or health protection is shaped by critical transitions and turning points in a life trajectory and in the context of social environment and time. We employed visual and narrative life history methods with people who inject drugs to explore how hepatitis C protection was enabled and maintained over the life course. We provide an overview of our methodological approach, with a focus on the ethics in practice of using life history timelines and life-grids with 37 participants. The life-grid evoked mixed emotions for participants: pleasure in receiving a personalized visual history and pain elicited by its contents. A minority managed this pain with additional heroin use. The methodological benefits of using life history methods and visual aids have been extensively reported. Crucial to consider are the ethical implications of this process, particularly for people who lack socially ascribed markers of a "successful life."

  2. Comprehensive Smart Grid Planning in a Regulated Utility Environment

    NASA Astrophysics Data System (ADS)

    Turner, Matthew; Liao, Yuan; Du, Yan

    2015-06-01

    This paper presents the tools and exercises used during the Kentucky Smart Grid Roadmap Initiative in a collaborative electric grid planning process involving state regulators, public utilities, academic institutions, and private interest groups. The mandate of the initiative was to assess the existing condition of smart grid deployments in Kentucky, to enhance understanding of smart grid concepts by stakeholders, and to develop a roadmap for the deployment of smart grid technologies by the jurisdictional utilities of Kentucky. Through involvement of many important stakeholder groups, the resultant Smart Grid Deployment Roadmap proposes an aggressive yet achievable strategy and timetable designed to promote enhanced availability, security, efficiency, reliability, affordability, sustainability and safety of the electricity supply throughout the state while maintaining Kentucky's nationally competitive electricity rates. The models and methods developed for this exercise can be utilized as a systematic process for the planning of coordinated smart grid deployments.

  3. ORBKIT: A modular python toolbox for cross-platform postprocessing of quantum chemical wavefunction data.

    PubMed

    Hermann, Gunter; Pohl, Vincent; Tremblay, Jean Christophe; Paulus, Beate; Hege, Hans-Christian; Schild, Axel

    2016-06-15

    ORBKIT is a toolbox for postprocessing electronic structure calculations, based on a highly modular and portable Python architecture. The program computes a multitude of electronic properties of molecular systems on arbitrary spatial grids from the basis-set representation of the electronic wavefunction, as well as several grid-independent properties. The required data can be extracted directly from the standard output of a large number of quantum chemistry programs. ORBKIT can be used as a standalone program to determine standard quantities, for example, the electron density, molecular orbitals, and derivatives thereof. The cornerstone of ORBKIT is its modular structure. The existing basic functions can be arranged in an individual way and easily extended by user-written modules to determine any other derived quantity. ORBKIT offers multiple output formats that can be processed by common visualization tools (VMD, Molden, etc.). Additionally, ORBKIT possesses routines to order molecular orbitals computed at different nuclear configurations according to their electronic character and to interpolate the wavefunction between these configurations. The program is open source under the GNU-LGPLv3 license and freely available at https://github.com/orbkit/orbkit/. This article provides an overview of ORBKIT with particular focus on its capabilities and applicability, and includes several example calculations. © 2016 Wiley Periodicals, Inc.
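
    As a self-contained illustration of the underlying task - evaluating a wavefunction-derived quantity on an arbitrary spatial grid - the sketch below computes a toy electron density from s-type Gaussian basis functions. It deliberately does not use ORBKIT's API; all exponents, coefficients, and occupations are invented:

    ```python
    # Toy electron density of occupied orbitals, evaluated on a spatial grid.
    import numpy as np

    centers = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.4]])   # bohr, H2-like
    exponents = np.array([1.0, 1.0])
    mo_coeff = np.array([[0.55, 0.55]])        # one doubly occupied MO (invented)
    occ = np.array([2.0])

    # Regular grid along the z axis through both centers
    z = np.linspace(-3.0, 4.4, 200)
    points = np.stack([np.zeros_like(z), np.zeros_like(z), z], axis=1)

    # Basis functions chi_k(r) = N_k * exp(-a_k * |r - R_k|^2)
    norm = (2.0 * exponents / np.pi) ** 0.75
    r2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    chi = norm * np.exp(-exponents * r2)       # shape (npoints, nbasis)

    phi = chi @ mo_coeff.T                     # MOs on the grid
    rho = (occ * phi**2).sum(axis=1)           # electron density
    print(rho.max())
    ```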

  4. Visualizer: 3D Gridded Data Visualization Software for Geoscience Education and Research

    NASA Astrophysics Data System (ADS)

    Harwood, C.; Billen, M. I.; Kreylos, O.; Jadamec, M.; Sumner, D. Y.; Kellogg, L. H.; Hamann, B.

    2008-12-01

    In both research and education, learning is an interactive and iterative process of exploring and analyzing data or model results. However, visualization software often presents challenges on the path to learning because it assumes the user already knows the locations and types of features of interest, instead of enabling flexible and intuitive examination of results. We present examples of research and teaching using Visualizer, software specifically designed to create an effective and intuitive environment for interactive, scientific analysis of 3D gridded data. Visualizer runs in a range of 3D virtual reality environments (e.g., GeoWall, ImmersaDesk, or CAVE), but also provides a similar level of real-time interactivity on a desktop computer. When using Visualizer in a 3D-enabled environment, the software allows the user to interact with the data images as real objects, grabbing, rotating, or walking around the data to gain insight and perspective. On the desktop, simple features, such as a set of cross-bars marking the plane of the screen, provide extra 3D spatial cues that allow the user to more quickly understand geometric relationships within the data. This platform portability allows the user to more easily integrate research results into classroom demonstrations and exercises, while the interactivity provides an engaging environment for self-directed and inquiry-based learning by students. Visualizer software is freely available for download (www.keckcaves.org) and runs on Mac OSX and Linux platforms.

  5. Strategies and Decision Support Systems for Integrating Variable Energy Resources in Control Centers for Reliable Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Lawrence E.

    This report provides findings from the field regarding the best ways in which to guide operational strategies, business processes and control room tools to support the integration of renewable energy into electrical grids.

  6. Visually Imperceptible Liquid-Metal Circuits for Transparent, Stretchable Electronics with Direct Laser Writing.

    PubMed

    Pan, Chengfeng; Kumar, Kitty; Li, Jianzhao; Markvicka, Eric J; Herman, Peter R; Majidi, Carmel

    2018-03-01

    A material architecture and laser-based microfabrication technique is introduced to produce electrically conductive films (sheet resistance = 2.95 Ω sq⁻¹; resistivity = 1.77 × 10⁻⁶ Ω m) that are soft, elastic (strain limit >100%), and optically transparent. The films are composed of a grid-like array of visually imperceptible liquid-metal (LM) lines on a clear elastomer. Unlike previous efforts in transparent LM circuitry, the current approach enables fully imperceptible electronics that have not only high optical transmittance (>85% at 550 nm) but are also invisible under typical lighting conditions and reading distances. This unique combination of properties is enabled with a laser writing technique that results in LM grid patterns with a line width and pitch as small as 4.5 and 100 µm, respectively, yielding grid-like wiring that has adequate conductivity for digital functionality but is also well below the threshold for visual perception. The electrical, mechanical, electromechanical, and optomechanical properties of the films are characterized, and it is found that high conductivity and transparency are preserved at tensile strains of ≈100%. To demonstrate their effectiveness for emerging applications in transparent displays and sensing electronics, the material architecture is incorporated into a couple of illustrative use cases related to chemical hazard warning. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, and therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free - an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce the time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
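
    The Common Tool Descriptor idea can be approximated with a simple structured record. The sketch below is illustrative only; the field names are not the actual CTD schema:

    ```python
    # A minimal, platform-free description of a command-line tool that a
    # workflow engine could consume to build concrete invocations.
    from dataclasses import dataclass, field

    @dataclass
    class ToolDescriptor:
        name: str
        version: str
        parameters: dict = field(default_factory=dict)  # name -> (type, default)
        inputs: list = field(default_factory=list)      # expected input files
        outputs: list = field(default_factory=list)     # produced output files

        def to_command(self, executable: str) -> list:
            """Render the descriptor as an argument vector."""
            args = [executable]
            for pname, (_ptype, default) in self.parameters.items():
                args += [f"--{pname}", str(default)]
            return args

    peak_picker = ToolDescriptor(
        name="PeakPicker", version="1.0",
        parameters={"signal_to_noise": ("float", 1.0)},
        inputs=["spectra.mzML"], outputs=["peaks.mzML"],
    )
    print(peak_picker.to_command("/usr/bin/peakpicker"))  # hypothetical binary
    ```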

  8. TDIGG - TWO-DIMENSIONAL, INTERACTIVE GRID GENERATION CODE

    NASA Technical Reports Server (NTRS)

    Vu, B. T.

    1994-01-01

    TDIGG is a fast and versatile program for generating two-dimensional computational grids for use with finite-difference flow-solvers. Both algebraic and elliptic grid generation systems are included. The method for grid generation by algebraic transformation is based on an interpolation algorithm and the elliptic grid generation is established by solving the partial differential equation (PDE). Non-uniform grid distributions are carried out using a hyperbolic tangent stretching function. For algebraic grid systems, interpolations in one direction (univariate) and two directions (bivariate) are considered. These interpolations are associated with linear or cubic Lagrangian/Hermite/Bezier polynomial functions. The algebraic grids can subsequently be smoothed using an elliptic solver. For elliptic grid systems, the PDE can be in the form of Laplace (zero forcing function) or Poisson. The forcing functions in the Poisson equation come from the boundary or the entire domain of the initial algebraic grids. A graphics interface procedure using the Silicon Graphics (GL) Library is included to allow users to visualize the grid variations at each iteration. This will allow users to interactively modify the grid to match their applications. TDIGG is written in FORTRAN 77 for Silicon Graphics IRIS series computers running IRIX. This package requires either MIT's X Window System, Version 11 Revision 4 or SGI (Motif) Window System. A sample executable is provided on the distribution medium. It requires 148K of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. This program was developed in 1992.
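
    The hyperbolic tangent stretching mentioned above is easy to sketch: a uniform parameter is mapped through a tanh profile so that points cluster near one boundary. A minimal one-sided (Vinokur-type) version, with the clustering strength delta as a user choice:

    ```python
    # One-sided tanh point distribution on [0, 1]: points cluster near x = 0,
    # and spacing grows smoothly toward x = 1.
    import numpy as np

    def tanh_cluster(n: int, delta: float) -> np.ndarray:
        """Map a uniform parameter eta in [0, 1] to a clustered distribution."""
        eta = np.linspace(0.0, 1.0, n)
        return 1.0 + np.tanh(delta * (eta - 1.0)) / np.tanh(delta)

    x = tanh_cluster(n=11, delta=2.5)
    print(np.round(x, 4))   # spacing grows away from x = 0
    ```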

  9. CERTS: Consortium for Electric Reliability Technology Solutions - Research Highlights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph

    2003-07-30

    Historically, the U.S. electric power industry was vertically integrated, and utilities were responsible for system planning, operations, and reliability management. As the nation moves to a competitive market structure, these functions have been disaggregated, and no single entity is responsible for reliability management. As a result, new tools, technologies, systems, and management processes are needed to manage the reliability of the electricity grid. However, a number of simultaneous trends prevent electricity market participants from pursuing development of these reliability tools: utilities are preoccupied with restructuring their businesses, research funding has declined, and the formation of Independent System Operators (ISOs) and Regional Transmission Organizations (RTOs) to operate the grid means that control of transmission assets is separate from ownership of these assets; at the same time, business uncertainty and changing regulatory policies have created a climate in which needed investment for transmission infrastructure and tools for reliability management has dried up. To address the resulting emerging gaps in reliability R&D, CERTS has undertaken much-needed public interest research on reliability technologies for the electricity grid. CERTS' vision is to: (1) Transform the electricity grid into an intelligent network that can sense and respond automatically to changing flows of power and emerging problems; (2) Enhance reliability management through market mechanisms, including transparency of real-time information on the status of the grid; (3) Empower customers to manage their energy use and reliability needs in response to real-time market price signals; and (4) Seamlessly integrate distributed technologies--including those for generation, storage, controls, and communications--to support the reliability needs of both the grid and individual customers.

  10. eBird—Using citizen-science data to help solve real-world conservation challenges (Invited)

    NASA Astrophysics Data System (ADS)

    Sullivan, B. L.; Iliff, M. J.; Wood, C. L.; Fink, D.; Kelling, S.

    2010-12-01

    eBird (www.ebird.org) is an Internet-based citizen-science project that collects bird observations worldwide. eBird is foremost a tool for birders, providing users with a resource for bird information and a way to keep track of their personal bird lists, thus establishing a model for sustained participation and new project growth. Importantly, eBird data are shared with scientists and conservationists working to save birds and their habitats. Here we highlight two different ways these data are used: as a real-time data gathering and visualization tool; and as the primary resource for developing large-scale bird distribution models that explore species-habitat associations and climate change scenarios. eBird provides data across broad temporal and spatial scales, and is a valuable tool for documenting and monitoring bird populations facing a multitude of anthropogenic and environmental impacts. For example, a focused effort to monitor birds on Gulf Coast beaches using eBird is providing essential baseline data and enabling long-term monitoring of bird populations throughout the region. Additionally, new data visualization tools that incorporate data from eBird, NOAA, and Google, are specifically designed to highlight the potential impacts of the Gulf oil spill on bird populations. Through a collaboration of partners in the DataONE network, such as the Oak Ridge National Laboratory, we will use supercomputing time from the National Science Foundation’s TeraGrid to allow Lab scientists to model bird migration phenology at the population level based on eBird data. The process involves combining bird observations with remotely sensed variables such as landcover and greening index to predict bird movements. Preliminary results of these models allow us to animate bird movements across large spatial scales, and to explore how migration timing might be affected under different climate change scenarios.

  11. Distribution and Validation of CERES Irradiance Global Data Products Via Web Based Tools

    NASA Technical Reports Server (NTRS)

    Rutan, David; Mitrescu, Cristian; Doelling, David; Kato, Seiji

    2016-01-01

    The CERES SYN1deg product provides climate-quality, 3-hourly, globally gridded, and temporally complete maps of top-of-atmosphere, in-atmosphere, and surface fluxes. This product requires efficient release to the public and validation to maintain quality assurance. The CERES team developed web tools for the distribution of both the global gridded products and the grid boxes that contain long-term validation sites maintaining high-quality flux observations at the Earth's surface. These are found at: http://ceres.larc.nasa.gov/order_data.php. In this poster we explore the various tools available to users to subset, download, and validate the SYN1deg and Surface-EBAF products against surface observations. We also analyze differences found in long-term records from well-maintained land surface sites, such as the ARM central facility, and from high-quality buoy radiometers, which due to their isolated nature cannot be maintained in the same manner as their land-based counterparts.

  12. Estimating the Impacts of Direct Load Control Programs Using GridPIQ, a Web-Based Screening Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pal, Seemita; Thayer, Brandon L.; Barrett, Emily L.

    In direct load control (DLC) programs, utilities can curtail the demand of participating loads to contractually agreed-upon levels during periods of critical peak load, thereby reducing stress on the system, generation cost, and required transmission and generation capacity. Participating customers receive financial incentives. The impacts of implementing DLC programs extend well beyond peak shaving. There may be a shift of load proportional to the interrupted load to the times before or after a DLC event, and different load shifts have different consequences. Tools that can quantify the impacts of such programs on load curves, peak demand, emissions, and fossil fuel costs are currently lacking. The Grid Project Impact Quantification (GridPIQ) screening tool includes a Direct Load Control module, which takes into account project-specific inputs as well as the larger system context in order to quantify the impacts of a given DLC program. This allows users (utilities, researchers, etc.) to test and compare different program specifications and their impacts.
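
    The curtail-and-shift logic described above can be sketched in a few lines. This is an illustrative toy, not GridPIQ's implementation; the load profile, contracted level, and rebound behavior are all hypothetical:

    ```python
    # Curtail demand above a contracted level during a DLC event, then shift
    # a proportional share of the curtailed energy to the hours after it.
    import numpy as np

    load = np.array([80, 90, 110, 130, 125, 100, 85, 80], dtype=float)  # MW by hour
    event_hours = [3, 4]          # DLC event window
    cap = 100.0                   # contractually agreed-upon level, MW
    shift_fraction = 0.6          # share of curtailed energy that reappears later

    curtailed = np.clip(load[event_hours] - cap, 0.0, None)
    adjusted = load.copy()
    adjusted[event_hours] = np.minimum(load[event_hours], cap)

    rebound_hours = [5, 6]        # load recovery after the event
    adjusted[rebound_hours] += shift_fraction * curtailed.sum() / len(rebound_hours)
    print(adjusted)
    ```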

  13. Efficient in-situ visualization of unsteady flows in climate simulation

    NASA Astrophysics Data System (ADS)

    Vetter, Michael; Olbrich, Stephan

    2017-04-01

    The simulation of climate data tends to produce very large data sets, which can hardly be processed in classical post-processing visualization applications. Typically, the visualization pipeline, consisting of the processes of data generation, visualization mapping, and rendering, is distributed into two parts over the network or separated via file transfer. Within most traditional post-processing scenarios, the simulation is done on a supercomputer whereas the data analysis and visualization are done on a graphics workstation. That way, temporary data sets with huge volume have to be transferred over the network, which leads to bandwidth bottlenecks and volume limitations. The solution to this issue is the avoidance of temporary storage, or at least a significant reduction of data complexity. Within the Climate Visualization Lab - as part of the Cluster of Excellence "Integrated Climate System Analysis and Prediction" (CliSAP) at the University of Hamburg, in cooperation with the German Climate Computing Center (DKRZ) - we develop and integrate an in-situ approach. Our software framework DSVR is based on the separation of the process chain between the mapping and the rendering processes. It couples the mapping process directly to the simulation by calling methods of a parallelized data extraction library, which creates a time-based sequence of geometric 3D scenes. This sequence is stored on a special streaming server with an interactive post-filtering option and then played out asynchronously in a separate 3D viewer application. Since the rendering is part of this viewer application, the scenes can be navigated interactively. In contrast to other in-situ approaches, where 2D images are created as part of the simulation or synchronous co-visualization takes place, our method supports interaction in 3D space and in time, as well as fixed frame rates. To integrate in-situ processing based on our DSVR framework and methods in the ICON climate model, we are continuously evolving the data structures and mapping algorithms of the framework to support the ICON model's native grid structures, since DSVR was originally designed for rectilinear grids only. We have now implemented a new output module in ICON to take advantage of the DSVR visualization. The visualization can be configured, like most output modules, by using a specific namelist and is exemplarily integrated within the non-hydrostatic atmospheric model time loop. With the integration of a DSVR-based in-situ pathline extraction within ICON, a further milestone has been reached. The pathline algorithm as well as the grid data structures have been optimized for the domain decomposition used for the parallelization of ICON based on MPI and OpenMP. The software implementation and evaluation are done on the supercomputers at DKRZ. In principle, the data complexity is reduced from O(n³) to O(m), where n is the grid resolution and m is the number of supporting points of all pathlines. The stability and scalability evaluation is done using Atmospheric Model Intercomparison Project (AMIP) runs. We will give a short introduction to our software framework, as well as a short overview of the implementation and usage of DSVR within ICON. Furthermore, we will present visualization and evaluation results of sample applications.
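
    The O(n³)-to-O(m) reduction comes from storing only pathline supporting points during the time loop. A minimal sketch of in-situ pathline extraction, with an invented analytic velocity field standing in for the model's interpolated winds:

    ```python
    # During each simulation step, advect seed points one step (midpoint rule)
    # and append the new positions; the stored data grows with the number of
    # supporting points m, not with the full field.
    import numpy as np

    def velocity(p, t):
        """Hypothetical steady swirl; a model would interpolate its grid here."""
        x, y = p
        return np.array([-y, x]) * 0.5

    seeds = [np.array([1.0, 0.0]), np.array([0.0, 1.5])]
    pathlines = [[s.copy()] for s in seeds]
    dt, nsteps = 0.05, 100

    for step in range(nsteps):
        t = step * dt
        for line in pathlines:
            p = line[-1]
            k1 = velocity(p, t)
            k2 = velocity(p + 0.5 * dt * k1, t + 0.5 * dt)
            line.append(p + dt * k2)

    print(len(pathlines), "pathlines,", len(pathlines[0]), "supporting points each")
    ```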

  14. Grid commerce, market-driven G-negotiation, and Grid resource management.

    PubMed

    Sim, Kwang Mong

    2006-12-01

    Although the management of resources is essential for realizing a computational grid, providing an efficient resource allocation mechanism is a complex undertaking. Since Grid providers and consumers may be independent bodies, negotiation among them is necessary. The contribution of this paper is to show that market-driven agents (MDAs) are appropriate tools for Grid resource negotiation. MDAs are e-negotiation agents designed with the flexibility of: 1) making adjustable amounts of concession taking into account market rivalry, outside options, and time preferences, and 2) relaxing bargaining terms in the face of intense pressure. A heterogeneous testbed consisting of several types of e-negotiation agents was developed to simulate a Grid computing environment; it compares the performance of MDAs against other e-negotiation agents (e.g., Kasbah) in a Grid-commerce environment. Empirical results show that MDAs generally achieve: 1) higher budget efficiencies in many market situations than other e-negotiation agents in the testbed, and 2) higher success rates in acquiring Grid resources under high Grid loadings.
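
    A flavor of the underlying negotiation mechanics: a time-dependent concession function moves the offer from the initial toward the reserve price as the deadline nears. The sketch below is a textbook-style illustration; MDAs additionally factor in market rivalry and outside options, which are omitted here:

    ```python
    # Time-dependent concession: e < 1 concedes late (patient),
    # e > 1 concedes early (eager).
    def offer(t: float, deadline: float, initial: float, reserve: float, e: float) -> float:
        """Offer at time t, moving from initial price toward reserve price."""
        return initial + (reserve - initial) * (t / deadline) ** (1.0 / e)

    for t in (0, 25, 50, 75, 100):
        print(t, round(offer(t, deadline=100, initial=10.0, reserve=50.0, e=2.0), 2))
    ```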

  15. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and, finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth System Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.

  16. Open source software projects of the caBIG In Vivo Imaging Workspace Software special interest group.

    PubMed

    Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence

    2007-11-01

    The Cancer Bioinformatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures for capturing measurement information and the associated data needed to define a gold standard for a given database, against which change-analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research.

  17. 20 Plus Years of Chimera Grid Development for the Space Shuttle. STS-107, Return to Flight, End of the Program

    NASA Technical Reports Server (NTRS)

    Gomez, Reynaldo J., III

    2010-01-01

    This slide presentation reviews the progress of grid development for the Space Shuttle, with particular focus on the period from the loss of STS-107 and the return to flight through the end of the program. Included are views from the current Space Shuttle Launch Vehicle (SSLV) grid system, containing 1.8 million surface points and 95+ million volume points. Charts compare wind tunnel tests with computational fluid dynamics (CFD), including CFD vs. 1A613B wing pressures and a wind tunnel comparison with CFD of the proposed ice/frost ramp configuration. Pressure-sensitive paint and particle image velocimetry were used to support debris transport tools. The creation of the grids and the use of overset CFD to assess the external tank redesign are also reviewed, and the presentation asks whether the overset tool was the right choice. It ends with a review of the work still to be done.

  18. 3-D Surface Visualization of pH Titration "Topos": Equivalence Point Cliffs, Dilution Ramps, and Buffer Plateaus

    ERIC Educational Resources Information Center

    Smith, Garon C.; Hossain, Md Mainul; MacCarthy, Patrick

    2014-01-01

    3-D topographic surfaces ("topos") can be generated to visualize how pH behaves during titration and dilution procedures. The surfaces are constructed by plotting computed pH values above a composition grid with volume of base added in one direction and overall system dilution on the other. What emerge are surface features that…

  19. Visual and x-ray inspection characteristics of eutectic and lead free assemblies

    NASA Technical Reports Server (NTRS)

    Ghaffarian, R.

    2003-01-01

    For high-reliability applications, visual inspection has been the key technique for most conventional electronic package assemblies. Now, the use of x-ray techniques has become an additional inspection requirement for quality control and for the detection of defects unique to the manufacturing of advanced electronic array packages such as ball grid arrays (BGAs) and chip scale packages (CSPs).

  20. Visual counts as an index of White-Tailed Prairie Dog density

    USGS Publications Warehouse

    Menkens, George E.; Biggins, Dean E.; Anderson, Stanley H.

    1990-01-01

    Black-footed ferrets (Mustela nigripes) depend on prairie dogs (Cynomys spp.) for food and shelter and were historically restricted to prairie dog towns (Anderson et al. 1986). Because ferrets and prairie dogs are closely associated, successful ferret management and conservation depend on successful prairie dog management. A critical component of any management program for ferrets will be monitoring prairie dog population dynamics on towns containing ferrets or on towns proposed as ferret reintroduction sites. Three techniques for estimating prairie dog population size and density are counts of plugged and reopened burrows (Tietjen and Matschke 1982), mark-recapture (Otis et al. 1978; Seber 1982, 1986; Menkens and Anderson 1989), and visual counts (Fagerstone and Biggins 1986, Knowles 1986). The technique of plugging burrows and counting the number reopened by prairie dogs is too time- and labor-intensive for population evaluation on a large number of towns or over large areas. Total burrow counts are not correlated with white-tailed prairie dog (C. leucurus) densities and thus cannot be used for population evaluation (Menkens et al. 1988). Mark-recapture requires trapping that is expensive and time- and labor-intensive. Monitoring a large number of prairie dog populations using mark-recapture would be difficult. Alternatively, a large number of populations could be monitored in short periods of time using the visual count technique (Fagerstone and Biggins 1986, Knowles 1986). However, the accuracy of visual counts has only been evaluated in a few locations. Thus, it is not known whether the relationship between counts and prairie dog density is consistent throughout the prairie dog's range. Our objective was to evaluate the potential of using visual counts as a rapid means of estimating white-tailed prairie dog density in prairie dog towns throughout Wyoming. We studied 18 white-tailed prairie dog towns in 4 white-tailed prairie dog complexes in Wyoming near Laramie (105°40'W, 41°20'N, 3 grids), Pathfinder Reservoir (106°55'W, 42°30'N, 6 grids), Shirley Basin (106°10'W, 42°20'N, 6 grids), and Meeteetse (108°10'W, 44°10'N, 3 grids). All towns were dominated by grasses, forbs, and shrubs (details in Collins and Lichvar 1986). The topography of the towns ranged from flat to gently rolling hills.

  1. Evaluation of NASA SPoRT's Pseudo-Geostationary Lightning Mapper Products in the 2011 Spring Program

    NASA Technical Reports Server (NTRS)

    Stano, Geoffrey T.; Carcione, Brian; Siewert, Christopher; Kuhlman, Kristin M.

    2012-01-01

    NASA's Short-term Prediction Research and Transition (SPoRT) program is a contributing partner with the GOES-R Proving Ground (PG), preparing forecasters to understand and utilize the unique products that will be available in the GOES-R era. This presentation emphasizes SPoRT's actions to prepare the end user community for the Geostationary Lightning Mapper (GLM). This preparation is a collaborative effort with SPoRT's National Weather Service partners, the National Severe Storms Laboratory (NSSL), and the Hazardous Weather Testbed's Spring Program. SPoRT continues to use its effective paradigm of matching capabilities to forecast problems through collaborations with our end users and working with the developers at NSSL to create effective evaluations and visualizations. Furthermore, SPoRT continues to develop software plug-ins so that these products will be available to forecasters in their own decision support system, AWIPS, and eventually AWIPS II. In 2009, the SPoRT program developed the original pseudo-geostationary lightning mapper (PGLM) flash extent product to demonstrate what forecasters may see with GLM. The PGLM replaced the previous GLM product and serves as a stepping-stone until the AWG's official GLM proxy is ready. The PGLM algorithm is simple and can be applied to any ground-based total lightning network. For 2011, the PGLM used observations from four ground-based networks (North Alabama, Kennedy Space Center, Oklahoma, and Washington D.C.). While the PGLM is not a true proxy product, it is intended as a tool to train forecasters about total lightning as well as foster discussions on product visualizations and incorporating GLM-resolution data into forecast operations. The PGLM has been used in 2010 and 2011 and is likely to remain the primary lightning training tool for the GOES-R program for the near future. This presentation emphasizes the feedback received during the 2011 Spring Program and discusses several topics. Based on feedback from the 2010 Spring Program, SPoRT created two variant PGLM products, which NSSL produced locally and provided in real-time within AWIPS for 2011. The first is the flash initiation density (FID) product, which creates a gridded display showing the number of flashes that originated in each 8 × 8 km grid box. The second product is the maximum flash density (MFD), which shows the highest PGLM value for each grid point over a specific period of time, ranging from 30 to 120 minutes. In addition to the evaluation of these two new products, the evaluation of the PGLM itself will be covered. The presentation will conclude with forecaster feedback on additional improvements requested for future evaluations, such as within the 2012 Spring Program.
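
    The two gridded products can be sketched directly: flash initiation density is a 2D histogram of flash origins over 8 km boxes, and maximum flash density is a running maximum of such grids over a time window. The flash locations below are synthetic:

    ```python
    # Gridded flash counts per 8 km x 8 km box, plus a running maximum
    # across several time frames (a maximum-flash-density-style product).
    import numpy as np

    km, box = 200.0, 8.0                 # domain size and grid box size, km
    nbins = int(km / box)
    rng = np.random.default_rng(1)

    def flash_density(xy):
        counts, _, _ = np.histogram2d(
            xy[:, 0], xy[:, 1], bins=nbins, range=[[0, km], [0, km]]
        )
        return counts

    frames = [flash_density(rng.random((50, 2)) * km) for _ in range(6)]  # 6 scans
    max_flash_density = np.maximum.reduce(frames)   # highest value per grid point
    print(max_flash_density.max())
    ```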

  2. PIQMIe: a web server for semi-quantitative proteomics data management and analysis

    PubMed Central

    Kuzniar, Arnold; Kanaar, Roland

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. PMID:24861615

  3. PIQMIe: a web server for semi-quantitative proteomics data management and analysis.

    PubMed

    Kuzniar, Arnold; Kanaar, Roland

    2014-07-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates peptide and (non-redundant) protein identifications and quantitations from multiple experiments with additional biological information on the protein entries, and makes the linked data available in the form of a light-weight relational database, which enables dedicated data analyses (e.g. in R) and user-driven queries. Using the web interface, users are presented with a concise summary of their proteomics experiments in numerical and graphical forms, as well as with a searchable protein grid and interactive visualization tools to aid in the rapid assessment of the experiments and in the identification of proteins of interest. The web server not only provides data access through a web interface but also supports programmatic access through RESTful web service. The web server is available at http://piqmie.semiqprot-emc.cloudlet.sara.nl or http://www.bioinformatics.nl/piqmie. This website is free and open to all users and there is no login requirement. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. NASA GES DISC Aerosol analysis and visualization services

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Ichoku, C. M.; Petrenko, M.; Yang, W.; Albayrak, A.; Zhao, P.; Johnson, J. E.; Kempler, S.

    2015-12-01

    Among the known atmospheric constituents, aerosols represent the greatest uncertainty in climate research. Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of the numerous satellite-borne sensors that routinely measure aerosols. There is often disagreement between similar aerosol parameters retrieved from different sensors, leaving users confused as to which sensors to trust for answering important science questions about the distribution, properties, and impacts of aerosols. Such misunderstanding may be avoided by providing satellite data with accurate pixel-level (Level 2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed multiple MAPSS applications as part of the Giovanni (Geospatial Interactive Online Visualization and Analysis Interface) data visualization and analysis tool - Giovanni-MAPSS and Giovanni-MAPSS_Explorer - since 2007. The MAPSS database provides spatio-temporal statistics for multiple spaceborne Level 2 aerosol products (MODIS Terra, MODIS Aqua, MISR, POLDER, OMI, CALIOP, SeaWiFS Deep Blue, and VIIRS) sampled over AERONET ground stations. In this presentation, I will demonstrate the improved features of Giovanni-MAPSS and introduce a new visualization service (Giovanni VizMAP) supporting various visualization and data access capabilities for satellite Level 2 data (non-aggregated and un-gridded) at high spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement), defining area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting.

  5. Iterating between Tools to Create and Edit Visualizations.

    PubMed

    Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah

    2017-01-01

    A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization, and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem, similar to when two people are editing a document and the changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.
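
    The merge framing can be made concrete with a toy reconciliation of per-element attributes: regenerated, data-driven geometry is kept, while the designer's hand-edited styling wins. Element ids, attributes, and the geometry/style split are invented for illustration:

    ```python
    # Toy merge of regenerated (data-driven) output with manual edits,
    # keyed by element id; real conflicts would need a resolution policy.
    GEOMETRY = {"x", "y", "width", "height"}

    def merge(regenerated: dict, manual: dict) -> dict:
        merged = {}
        for elem_id in regenerated.keys() | manual.keys():
            base = dict(regenerated.get(elem_id, {}))
            for attr, value in manual.get(elem_id, {}).items():
                if attr not in GEOMETRY:        # keep the designer's styling edits
                    base[attr] = value
            merged[elem_id] = base
        return merged

    regen = {"bar0": {"x": 0, "height": 42, "fill": "steelblue"}}
    edits = {"bar0": {"fill": "#c44", "height": 40}}   # styling kept, geometry not
    print(merge(regen, edits))
    ```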

  6. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data-flow paradigm.

  7. ICT-infrastructures for hydrometeorology science and natural disaster societal impact assessment: the DRIHMS project

    NASA Astrophysics Data System (ADS)

    Parodi, A.; Craig, G. C.; Clematis, A.; Kranzlmueller, D.; Schiffers, M.; Morando, M.; Rebora, N.; Trasforini, E.; D'Agostino, D.; Keil, K.

    2010-09-01

    Hydrometeorological science has made strong progress over the last decade at the European and worldwide level: new modeling tools, post-processing methodologies, observational data, and corresponding ICT (Information and Communication Technology) technologies are available. Recent European efforts in developing a platform for e-Science, such as EGEE (Enabling Grids for E-sciencE), SEEGRID-SCI (South East Europe GRID e-Infrastructure for regional e-Science), and the German C3-Grid, have demonstrated their ability to provide an ideal basis for the sharing of complex hydrometeorological data sets and tools. Despite these early initiatives, however, awareness of the potential of Grid technology as a catalyst for future hydrometeorological research is still low, and both adoption and exploitation have been astonishingly slow, not only within individual EC member states, but also on a European scale. With this background in mind, and given that European ICT-infrastructures are in the process of transferring to a sustainable and permanent service utility, as underlined by the European Grid Initiative (EGI) and the Partnership for Advanced Computing in Europe (PRACE), the Distributed Research Infrastructure for Hydro-Meteorology Study (DRIHMS, co-funded by the EC under the 7th Framework Programme) project has been initiated. The goal of DRIHMS is the promotion of Grids in particular and e-Infrastructures in general within the European hydrometeorological research (HMR) community through the diffusion of a Grid platform for e-collaboration in this earth science sector: the idea is to further boost European research excellence and competitiveness in the fields of hydrometeorological research and Grid research by bridging the gaps between these two scientific communities. Furthermore, the project is intended to transfer the results to areas beyond strict hydrometeorological science, as a support for the assessment of the effects of extreme hydrometeorological events on society and for the development of tools improving the adaptation and resilience of society to the challenges of climate change. This paper provides an overview of the DRIHMS ideas and presents the results of the DRIHMS HMR and ICT surveys.

  8. ICT-based hydrometeorology science and natural disaster societal impact assessment

    NASA Astrophysics Data System (ADS)

    Parodi, A.; Clematis, A.; Craig, G. C.; Kranzmueller, D.

    2009-09-01

    In the Lisbon strategy, the 2005 European Council identified knowledge and innovation as the engines of sustainable growth and stated that it is essential to build a fully inclusive information society. In parallel, the World Conference on Disaster Reduction (Hyogo, 2005) defined among its thematic priorities the improvement of international cooperation in hydrometeorology research activities. This was recently confirmed at the joint press conference of the Centre for Research on the Epidemiology of Disasters (CRED) with the United Nations International Strategy for Disaster Reduction (UNISDR) Secretariat, held in January 2009, where it was noted that flood and storm events are among the natural disasters that most impact human life. Hydrometeorological science has made strong progress over the last decade at the European and worldwide level: new modelling tools, post-processing methodologies and observational data are available. Recent European efforts in developing a platform for e-science, like EGEE (Enabling Grids for E-sciencE), SEE-GRID-SCI (South East Europe GRID e-Infrastructure for regional e-Science), and the German C3-Grid, provide an ideal basis for the sharing of complex hydrometeorological data sets and tools. Despite these early initiatives, however, awareness of the potential of Grid technology as a catalyst for future hydrometeorological research is still low, and both adoption and exploitation have been astonishingly slow, not only within individual EC member states but also on a European scale. With this background in mind, the goal of the Distributed Research Infrastructure for Hydro-Meteorology Study (DRIHMS) project is the promotion of the Grid culture within the European hydrometeorological research community through the diffusion of a Grid platform for e-collaboration in this earth science sector: the idea is to further boost European research excellence and competitiveness in the fields of hydrometeorological research and Grid research by bridging the gaps between these two scientific communities. Furthermore, the project is intended to transfer the results to areas beyond hydrometeorological science proper, as support for the assessment of the effects of extreme hydrometeorological events on society and for the development of tools that improve the adaptation and resilience of society to the challenges of climate change.

  9. The Watershed Deposition Tool: A Tool for Incorporating Atmospheric Deposition in Watershed Analysis

    EPA Science Inventory

    The tool for providing the linkage between air and water quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds has been developed. The Watershed Deposition Tool (WDT) takes gridded output of at...

  10. Towards Automated Screening of Two-dimensional Crystals

    PubMed Central

    Cheng, Anchi; Leung, Albert; Fellmann, Denis; Quispe, Joel; Suloway, Christian; Pulokas, James; Carragher, Bridget; Potter, Clinton S.

    2007-01-01

    Screening trials to determine the presence of two-dimensional (2D) protein crystals suitable for three-dimensional structure determination using electron crystallography is a very labor-intensive process. Methods compatible with fully automated screening have been developed for the process of crystal production by dialysis and for producing negatively stained grids of the resulting trials. Further automation via robotic handling of the EM grids, and semi-automated transmission electron microscopic imaging and evaluation of the trial grids is also possible. We, and others, have developed working prototypes for several of these tools and tested and evaluated them in a simple screen of 24 crystallization conditions. While further development of these tools is certainly required for a turn-key system, the goal of fully automated screening appears to be within reach. PMID:17977016

  11. Policy-based Distributed Data Management

    NASA Astrophysics Data System (ADS)

    Moore, R. W.

    2009-12-01

    The analysis and understanding of climate variability and change builds upon access to massive collections of observational and simulation data. The analyses involve distributed computing, both at the storage systems (which support data subsetting) and at compute engines (for assimilation of observational data into simulations). The integrated Rule Oriented Data System (iRODS) organizes the distributed data into collections to facilitate enforcement of management policies, support remote data processing, and enable development of reference collections. Currently at RENCI, the iRODS data grid is being used to manage ortho-photos and lidar data for the State of North Carolina, provide a unifying storage environment for engagement centers across the state, and support distributed access to visualizations of weather data; it is also being explored as a way to manage and disseminate collections of ensembles of meteorological and hydrological model results. In collaboration with the National Climatic Data Center, an iRODS data grid is being established to support data transmission from NCDC to ORNL, and to integrate NCDC archives with ORNL compute services. To manage the massive data transfers, parallel I/O streams are used between High Performance Storage System tape archives and the supercomputers at ORNL. Further, we are exploring the movement and management of large RADAR and in situ datasets to be used for data mining between RENCI and NCDC, and for the distributed creation of decision support and climate analysis tools. The iRODS data grid supports all phases of the scientific data life cycle, from management of data products for a project, to sharing of data between research institutions, to publication of data in a digital library, to preservation of data for use in future research projects. Each phase is characterized by a broader user community, with higher expectations for more detailed descriptions and analysis mechanisms for manipulating the data. The higher usage requirements are enforced by management policies that define the required metadata, the required data formats, and the required analysis tools. The iRODS policy-based data management system automates the creation of the community-chosen data products, validates integrity and authenticity assessment criteria, and enforces management policies across all accesses of the system.

  12. Grid Modeling Tools | Grid Modernization | NREL

    Science.gov Websites

    One tool integrates primary frequency response (turbine governor control) with secondary frequency response (automatic generation control) and simulates the power system dynamic response across the full time spectrum with variable time-step control. The model places special emphasis on electric power systems with high penetrations of renewables.

  13. A Roadmap for caGrid, an Enterprise Grid Architecture for Biomedical Research

    PubMed Central

    Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Hong, Neil Chue

    2012-01-01

    caGrid is a middleware system which combines the Grid computing, service-oriented architecture, and model-driven architecture paradigms to support the development of interoperable data and analytical resources and the federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG™) program. This program was established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research, with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from the biomedical research, Grid computing, and high performance computing communities. PMID:18560123

  14. A roadmap for caGrid, an enterprise Grid architecture for biomedical research.

    PubMed

    Saltz, Joel; Hastings, Shannon; Langella, Stephen; Oster, Scott; Kurc, Tahsin; Payne, Philip; Ferreira, Renato; Plale, Beth; Goble, Carole; Ervin, David; Sharma, Ashish; Pan, Tony; Permar, Justin; Brezany, Peter; Siebenlist, Frank; Madduri, Ravi; Foster, Ian; Shanbhag, Krishnakant; Mead, Charlie; Chue Hong, Neil

    2008-01-01

    caGrid is a middleware system which combines the Grid computing, service-oriented architecture, and model-driven architecture paradigms to support the development of interoperable data and analytical resources and the federation of such resources in a Grid environment. The functionality provided by caGrid is an essential and integral component of the cancer Biomedical Informatics Grid (caBIG) program. This program was established by the National Cancer Institute as a nationwide effort to develop enabling informatics technologies for collaborative, multi-institutional biomedical research, with the overarching goal of accelerating translational cancer research. Although the main application domain for caGrid is cancer research, the infrastructure provides a generic framework that can be employed in other biomedical research and healthcare domains. The development of caGrid is an ongoing effort, adding new functionality and improvements based on feedback and use cases from the community. This paper provides an overview of potential future architecture and tooling directions and areas of improvement for caGrid and caGrid-like systems. This summary is based on discussions at a roadmap workshop held in February with participants from the biomedical research, Grid computing, and high performance computing communities.

  15. EMI datalib - joining the best of ARC and gLite data libraries

    NASA Astrophysics Data System (ADS)

    Nilsen, J. K.; Cameron, D.; Devresse, A.; Molnar, Zs; Nagy, Zs; Salichos, M.

    2012-12-01

    To manage data in the grid, with its jungle of protocols and enormous amount of data in different storage solutions, it is important to have a strong, versatile and reliable data management library. While there are several data management tools and libraries available, they all have different strengths and weaknesses, and it can be hard to decide which tool to use for which purpose. EMI is a collaboration between the European middleware providers aiming to take the best out of each middleware to create one consolidated, all-purpose grid middleware. When EMI started there were two main tool sets for managing data: gLite had lcg_util and the GFAL library, while ARC had the ARC data tools and libarcdata2. While different in design and purpose, they both have the same goal: to manage data in the grid. The design of the new EMI datalib was ready by the end of 2011, and a first prototype is now implemented and going through a thorough testing phase. This presentation will give the latest results of the consolidated library together with an overview of the design, test plan and roadmap of EMI datalib.

  16. Development of a gridded meteorological dataset over Java island, Indonesia 1985–2014

    PubMed Central

    Yanto; Livneh, Ben; Rajagopalan, Balaji

    2017-01-01

    We describe a gridded daily meteorology dataset consisting of precipitation and minimum and maximum temperature over Java Island, Indonesia, at 0.125°×0.125° (~14 km) resolution spanning the 30 years from 1985–2014. Importantly, this data set represents a marked improvement over existing gridded data sets for Java, with higher spatial resolution and derived exclusively from ground-based observations, unlike existing satellite- or reanalysis-based products. Gap-infilling and gridding were performed via the Inverse Distance Weighting (IDW) interpolation method (radius r of 25 km and power of influence α of 3 as optimal parameters), restricted to only those stations with at least 3,650 days (~10 years) of valid data. We employed the MSWEP and CHIRPS rainfall products in the cross-validation, which shows that the gridded rainfall presented here gives the most reasonable performance. Visual inspection reveals increasing performance of the gridded precipitation from grid to watershed to island scale. The data set, stored in network common data form (NetCDF), is intended to support watershed-scale and island-scale studies of short-term and long-term climate, hydrology and ecology. PMID:28534871
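
    As a concrete illustration of the gridding step, here is a minimal IDW sketch using the paper's reported optimal parameters (r = 25 km, α = 3). The station layout and values are invented for the example; the real processing also includes gap-infilling and the 3,650-day station filter.

    ```python
    import numpy as np

    def idw(grid_xy, stn_xy, stn_val, radius_km=25.0, alpha=3.0):
        """Inverse Distance Weighting: estimate each grid cell from stations
        within `radius_km`, weighting by distance**(-alpha). r=25 km and
        alpha=3 are the optimal parameters reported in the paper."""
        est = np.full(len(grid_xy), np.nan)
        for i, g in enumerate(grid_xy):
            d = np.linalg.norm(stn_xy - g, axis=1)        # distances in km
            near = d <= radius_km
            if not near.any():
                continue                                   # no station in range
            w = 1.0 / np.maximum(d[near], 1e-6) ** alpha   # avoid divide-by-zero
            est[i] = np.sum(w * stn_val[near]) / np.sum(w)
        return est

    # toy example: three stations, two target cells (coordinates in km)
    stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    rainfall = np.array([12.0, 8.0, 20.0])                 # mm/day
    cells    = np.array([[2.0, 2.0], [50.0, 50.0]])        # 2nd is out of range
    print(idw(cells, stations, rainfall))                  # [~12.1, nan]
    ```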

  17. Validation of a Pressure-Based Combustion Simulation Tool Using a Single Element Injector Test Problem

    NASA Technical Reports Server (NTRS)

    Thakur, Siddarth; Wright, Jeffrey

    2006-01-01

    The traditional design and analysis practice for advanced propulsion systems, particularly chemical rocket engines, relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next-generation computational tools which can be used effectively and reliably in a design environment by non-CFD specialists. A computational tool, called Loci-STREAM, is being developed for this purpose. It is a pressure-based, Reynolds-averaged Navier-Stokes (RANS) solver for generalized unstructured grids, which is designed to handle all-speed flows (incompressible to hypersonic) and is particularly suitable for solving multi-species flow in fixed-frame combustion devices. Loci-STREAM integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci, which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective of the ongoing work is to develop a robust simulation capability for combustion problems in rocket engines. As an initial step towards validating this capability, a model problem is investigated in the present study which involves a gaseous oxygen/gaseous hydrogen (GO2/GH2) shear coaxial single element injector, for which experimental data are available. The sensitivity of the computed solutions to grid density, grid distribution, different turbulence models, and different near-wall treatments is investigated. A refined grid, which is clustered in the vicinity of the solid walls as well as the flame, is used to obtain a steady-state solution which may be considered the best solution attainable with the steady-state RANS methodology. From a design point of view, quick turnaround times are desirable; with this in mind, coarser grids are also employed and the resulting solutions are evaluated with respect to the fine-grid solution.

  18. Results from Evaluations of Gridded CrIS/ATMS Visualization for Operational Forecasting

    NASA Astrophysics Data System (ADS)

    Stevens, E.; Zavodsky, B.; Dostalek, J.; Berndt, E.; Hoese, D.; White, K.; Bowlan, M.; Gambacorta, A.; Wheeler, A.; Haisley, C.; Smith, N.

    2017-12-01

    For forecast challenges which require diagnosis of the three-dimensional atmosphere, current observations, such as radiosondes, may not offer enough information. Satellite data can help fill the spatial and temporal gaps between soundings. In particular, the NOAA-Unique Combined Atmospheric Processing System (NUCAPS) combines infrared soundings from the Cross-track Infrared Sounder (CrIS) with the Advanced Technology Microwave Sounder (ATMS) to retrieve profiles of temperature and moisture. NUCAPS retrievals are available in a wide swath with approximately 45-km spatial resolution at nadir and a local Equator crossing time of 1:30 A.M./P.M., enabling three-dimensional observations at asynoptic times. This abstract focuses on evaluation of a new visualization for NUCAPS within the operational National Weather Service Advanced Weather Interactive Processing System (AWIPS) decision support system that allows these data to be viewed in gridded horizontal maps or vertical cross sections. Two testbed evaluations occurred in 2017: a Cold Air Aloft (CAA) evaluation at the Alaska Center Weather Service Unit and a Convective Potential evaluation at the NOAA Hazardous Weather Testbed. For CAA, at high latitudes during the winter months, the air at altitudes used by passenger and cargo aircraft can reach temperatures cold enough (-65°C) to begin to freeze jet fuel, and Gridded NUCAPS visualization was shown to help fill in the spatial and temporal gaps in data-sparse areas across the Alaskan airspace by identifying the 3D spatial extent of cold air features. For convective potential, understanding the vertical distribution of temperature and moisture is also very important for forecasting the potential for convection related to severe weather such as lightning, large hail, and tornadoes. The Gridded NUCAPS visualization was shown to aid forecasters in understanding temperature and moisture characteristics at critical levels for determining cap strength and instability. In both cases, the products were used in conjunction with numerical output to reinforce confidence in model products or to provide an alternative observation when forecasters were not sure the model was properly representing the atmosphere.

  19. A Review of Distributed Control Techniques for Power Quality Improvement in Micro-grids

    NASA Astrophysics Data System (ADS)

    Zeeshan, Hafiz Muhammad Ali; Nisar, Fatima; Hassan, Ahmad

    2017-05-01

    A micro-grid is typically visualized as a small-scale local power supply network dependent on distributed energy resources (DERs) that can operate simultaneously with the grid as well as in a standalone manner. The distributed generator of a micro-grid system is usually a converter-inverter type topology acting as a non-linear load and injecting harmonics into the distribution feeder. Hence, the negative effects of distributed generation sources and components on power quality are clearly witnessed. In this paper, a review of distributed control approaches for power quality improvement is presented, encompassing harmonic compensation, loss mitigation and optimum power sharing in a multi-source, multi-load distributed power network. The decentralized subsystems for harmonic compensation and active-reactive power sharing accuracy have been analysed in detail. Results have been validated to be consistent with IEEE standards.

  20. Daymet: Daily Surface Weather Data on a 1-km Grid for North America, Version 2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thornton, Peter E; Thornton, Michele M; Mayer, Benjamin W

    More information: http://daymet.ornl.gov. Presenter: Ranjeet Devarakonda, Environmental Sciences Division, Oak Ridge National Laboratory (ORNL). Daymet: Daily Surface Weather Data and Climatological Summaries provides gridded estimates of daily weather parameters for North America, including daily continuous surfaces of minimum and maximum temperature, precipitation occurrence and amount, humidity, shortwave radiation, snow water equivalent, and day length. The current data product (Version 2) covers the period January 1, 1980 to December 31, 2013 [1]. The prior product (Version 1) covered only 1980-2008. Data are available on a daily time step at a 1-km x 1-km spatial resolution in a Lambert Conformal Conic projection, with a spatial extent that covers the conterminous United States, Mexico, and Southern Canada as meteorological station density allows. Daymet data can be downloaded from 1) the ORNL Distributed Active Archive Center (DAAC) search and order tools (http://daac.ornl.gov/cgi-bin/cart/add2cart.pl?add=1219) or directly from the DAAC FTP site (http://daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1219) and 2) the Single Pixel Tool [2] and the THREDDS (Thematic Real-time Environmental Data Services) Data Server [3]. The Single Pixel Data Extraction Tool allows users to enter a single geographic point by latitude and longitude in decimal degrees. A routine is executed that translates the (lon, lat) coordinates into projected Daymet (x, y) coordinates. These coordinates are used to access the Daymet database of daily-interpolated surface weather variables. Daily data from the nearest 1 km x 1 km Daymet grid cell are extracted from the database and formatted as a table with one column for each Daymet variable and one row for each day. All daily data for selected years are returned as a single (long) table, formatted for display in the browser window. At the top of this table is a link to the same data in a simple comma-separated text format, suitable for import into a spreadsheet or other data analysis software. The Single Pixel Data Extraction Tool also provides the option to download multiple coordinates programmatically. A multiple extractor script is freely available for download at http://daymet.ornl.gov/files/daymet.zip. The ORNL DAAC's THREDDS data server (TDS) provides customized visualization and access to Daymet time series of North American mosaics. Users can subset and download Daymet data via a variety of community standards, including OPeNDAP, the NetCDF Subset service, and Open Geospatial Consortium (OGC) Web Map/Coverage Services. The ORNL DAAC TDS also exposes Daymet metadata through its ncISO service to facilitate harvesting of Daymet metadata records into 3rd-party catalogs. References: [1] Thornton, P.E., M.M. Thornton, B.W. Mayer, N. Wilhelmi, Y. Wei, R. Devarakonda, and R.B. Cook. 2014. Daymet: Daily Surface Weather Data on a 1-km Grid for North America, Version 2. Data set. Available on-line [http://daac.ornl.gov] from Oak Ridge National Laboratory Distributed Active Archive Center, Oak Ridge, Tennessee, USA. [2] Devarakonda R., et al. 2012. Daymet: Single Pixel Data Extraction Tool. Available on-line [http://daymet.ornl.gov/singlepixel.html]. [3] Wei Y., et al. 2014. Daymet: Thematic Real-time Environmental Data Services. Available on-line [http://daymet.ornl.gov/thredds_tiles.html].
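
    A minimal sketch of what the Single Pixel Tool's coordinate translation amounts to, using pyproj. The Lambert Conformal Conic parameters below are the ones commonly documented for Daymet (standard parallels 25°N and 60°N, origin 42.5°N, 100°W); treat them, and the cell-center snapping convention, as assumptions of this illustration rather than the tool's exact internals.

    ```python
    from pyproj import Transformer

    # Daymet's Lambert Conformal Conic projection (parameters as commonly
    # documented for Daymet; an assumption of this sketch, not verified here).
    DAYMET_LCC = ("+proj=lcc +lat_1=25 +lat_2=60 +lat_0=42.5 +lon_0=-100 "
                  "+x_0=0 +y_0=0 +ellps=WGS84 +units=m")
    to_lcc = Transformer.from_crs("EPSG:4326", DAYMET_LCC, always_xy=True)

    def nearest_cell(lon, lat, cell_m=1000.0):
        """Translate (lon, lat) to projected (x, y) and snap to the center of
        the nearest 1 km x 1 km grid cell (cell centers assumed to sit at
        half-kilometre offsets), mirroring the Single Pixel Tool's lookup."""
        x, y = to_lcc.transform(lon, lat)
        snap = lambda v: (round(v / cell_m - 0.5) + 0.5) * cell_m
        return snap(x), snap(y)

    print(nearest_cell(-84.29, 35.96))  # a point near ORNL -> one grid cell
    ```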

  1. Real-Time Visualization of Spacecraft Telemetry for the GLAST and LRO Missions

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric T.; Shah, Neerav; Chai, Dean J.

    2010-01-01

    GlastCam and LROCam are closely-related tools developed at NASA Goddard Space Flight Center for real-time visualization of spacecraft telemetry, developed for the Gamma-Ray Large Area Space Telescope (GLAST) and Lunar Reconnaissance Orbiter (LRO) missions, respectively. Derived from a common simulation tool, they use related but different architectures to ingest real-time spacecraft telemetry and ground-predicted ephemerides, and to compute and display features of special interest to each mission in its operational environment. We describe the architectures of GlastCam and LROCam, the customizations required to fit into the mission operations environment, and the features that were found to be especially useful in early operations for their respective missions. Both tools have a primary "Cam" window depicting a three-dimensional view of the spacecraft that may be freely manipulated by the user. The scene is augmented with fields of view, pointing constraints, and other features which enhance situational awareness. Each tool also has another "Map" window showing the spacecraft's groundtrack projected onto a map of the Earth or Moon, along with useful features such as the Sun, eclipse regions, and TDRS satellite locations. Additional windows support specialized checkout tasks. One such window shows the star tracker fields of view, with tracking window locations and the mission star catalog. This view was instrumental for GLAST in quickly resolving a star tracker mounting polarity issue; visualization made the 180-deg mismatch immediately obvious. Full access to GlastCam's source code also made possible a rapid coarse star tracker mounting calibration with some on-the-fly code adjustments: adding a fine grid to measure alignment offsets, and introducing a calibration quaternion that could be adjusted within GlastCam without perturbing the flight parameters. This calibration, from concept to completion, took less than half an hour. Both GlastCam and LROCam were developed in the C language, with non-proprietary support libraries, for ease of customization and portability. This no-black-boxes aspect enables engineers to adapt quickly to unforeseen circumstances in the intense operations environment. GlastCam and LROCam were installed on multiple workstations in the operations support rooms, allowing independent use by multiple subsystems, systems engineers and managers, with negligible draw on telemetry system resources.

  2. REopt Lite Web Tool Evaluates Photovoltaics and Battery Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building on the success of the REopt renewable energy integration and optimization platform, NREL has developed a free, publicly available web version of REopt called REopt Lite. REopt Lite evaluates the economics of grid-connected photovoltaics (PV) and battery storage at a site. It allows building owners to identify the system sizes and battery dispatch strategy that minimize their life cycle cost of energy. This web tool also estimates the amount of time a PV and storage system can sustain the site's critical load during a grid outage.

  3. WegenerNet climate station network region Feldbach/Austria: From local measurements to weather and climate data products at 1 km-scale resolution

    NASA Astrophysics Data System (ADS)

    Kabas, T.; Leuprecht, A.; Bichler, C.; Kirchengast, G.

    2010-12-01

    South-eastern Austria is characterized by a rich variety of weather and climate patterns. For this reason, the county of Feldbach was selected by the Wegener Center as a focus area for a pioneering observation experiment at very high resolution: the WegenerNet climate station network (in brief, WegenerNet) comprises 151 meteorological stations within an area of about 20 km × 15 km (~1.4 km × 1.4 km station grid). All stations measure the main parameters temperature, humidity and precipitation at 5-minute sampling. Selected further stations include measurements of wind speed and direction, complemented by soil parameters as well as air pressure and net radiation. The collected data are integrated into an automatic processing system comprising data transfer, quality control, product generation, and visualization. Each station is equipped with an internet-attached data logger, and the measurements are transferred as binary files via GPRS to the WegenerNet server at 1-hour intervals. The incoming raw data files of measured parameters, as well as several operating values of the data logger, are stored in a relational database (PostgreSQL). Next, the raw data pass the Quality Control System (QCS), in which the data are checked for technical and physical plausibility (e.g., sensor specifications, temporal and spatial variability). Taking the data quality (quality flag) into account, the Data Product Generator (DPG) produces weather and climate data products on various temporal scales (from 5 min to annual) for single stations and regular grids. Gridded data are derived by vertical scaling and squared inverse distance interpolation (1 km × 1 km and 0.01° × 0.01° grids). Both subsystems (QCS and DPG) are implemented in the programming language Python. For application purposes, the resulting data products are available via the bilingual (German/English) WegenerNet data portal (www.wegenernet.org). At this time, the main interface is still an online system in which MapServer is used to import spatial data through its database interface and to generate images in static geographic formats; however, a Java applet is additionally needed to display these images on the user's local host. Furthermore, station data are visualized as time series using the scripting language PHP. Since February 2010, the visualization of gridded data products has been a first step towards a new data portal based on OpenLayers. In this GIS framework, all geographic information (e.g., OpenStreetMap) is displayed with MapServer. Furthermore, visualizations of all meteorological parameters are generated on the fly by a Python CGI script and transparently overlaid on the maps. Hence, station data and gridded data are visualized and further prepared for download in common data formats (csv, NetCDF). In conclusion, measured data and generated data products are provided with a data latency of less than 1-2 hours in standard operation (near real time). Following an introduction of the processing system along the lines above, resulting data products are presented online at the WegenerNet data portal.

  4. CANDID: Companion Analysis and Non-Detection in Interferometric Data

    NASA Astrophysics Data System (ADS)

    Gallenne, A.; Mérand, A.; Kervella, P.; Monnier, J. D.; Schaefer, G. H.; Baron, F.; Breitfelder, J.; Le Bouquin, J. B.; Roettenbacher, R. M.; Gieren, W.; Pietrzynski, G.; McAlister, H.; ten Brummelaar, T.; Sturmann, J.; Sturmann, L.; Turner, N.; Ridgway, S.; Kraus, S.

    2015-05-01

    CANDID finds faint companions around stars in interferometric data in the OIFITS format. It allows systematic searching for faint companions in OIFITS data and, if none is found, estimates the detection limit. The tool is based on model fitting and chi-squared minimization, with a grid for the starting points of the companion position. It ensures all positions are explored by estimating a posteriori whether the grid is dense enough, and provides an estimate of the optimum grid density.
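
    The grid-of-starting-points strategy is straightforward to sketch. Below, a deliberately multi-modal toy chi-squared surface stands in for the real interferometric binary-model fit (which CANDID computes from OIFITS observables); what the sketch shares with the tool is only the idea of starting a local fit from every grid node and keeping the global best.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy multi-modal chi^2 surface standing in for a binary-model fit:
    # a quadratic basin plus cosine ripples that create many local minima.
    def chi2(p):
        x, y = p
        return ((x - 12.0)**2 + (y + 7.0)**2) / 50.0 \
               + 2.0 * np.cos(x) * np.cos(y) + 2.0

    def grid_fit(half_width=30.0, step=5.0):
        """Start a local minimization from every node of a coarse grid of
        candidate companion positions and keep the global best -- the same
        strategy CANDID uses so that no local minimum traps the search."""
        nodes = np.arange(-half_width, half_width + step, step)
        best = None
        for x0 in nodes:
            for y0 in nodes:
                res = minimize(chi2, x0=[x0, y0], method="Nelder-Mead")
                if best is None or res.fun < best.fun:
                    best = res
        return best

    best = grid_fit()
    print(best.x, best.fun)   # the global best fit, not a local trap
    ```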

  5. Integrating Clinical Trial Imaging Data Resources Using Service-Oriented Architecture and Grid Computing

    PubMed Central

    Cladé, Thierry; Snyder, Joshua C.

    2010-01-01

    Clinical trials which use imaging typically require data management and workflow integration across several parties. We identify opportunities for all parties involved to realize benefits with a modular interoperability model based on service-oriented architecture and grid computing principles. We discuss middleware products for implementation of this model, and propose caGrid as an ideal candidate due to its healthcare focus; free, open source license; and mature developer tools and support. PMID:20449775

  6. Initial steps towards a production platform for DNA sequence analysis on the grid.

    PubMed

    Luyf, Angela C M; van Schaik, Barbera D C; de Vries, Michel; Baas, Frank; van Kampen, Antoine H C; Olabarriaga, Silvia D

    2010-12-14

    Bioinformatics is confronted with a new data explosion due to the availability of high-throughput DNA sequencers. Data storage and analysis become a problem on local servers, and a switch to other IT infrastructures is therefore needed. Grid and workflow technology can help to handle the data more efficiently, as well as facilitate collaborations. However, interfaces to grids are often unfriendly to novice users. In this study we reused a platform that was developed in the VL-e project for the analysis of medical images. Data transfer, workflow execution and job monitoring are operated from one graphical interface. We developed workflows for two sequence alignment tools (BLAST and BLAT) as a proof of concept. The analysis time was significantly reduced. All workflows and executables are available for the members of the Dutch Life Science Grid and the VL-e Medical virtual organizations. All components are open source and can be transported to other grid infrastructures. The availability of in-house expertise and tools facilitates the usage of grid resources by new users. Our first results indicate that this is a practical, powerful and scalable solution to address the capacity and collaboration issues raised by the deployment of next-generation sequencers. We currently adopt this methodology on a daily basis for DNA sequencing and other applications. More information and source code are available via http://www.bioinformaticslaboratory.nl/

  7. Grids in topographic maps reduce distortions in the recall of learned object locations.

    PubMed

    Edler, Dennis; Bestgen, Anne-Kathrin; Kuchinke, Lars; Dickmann, Frank

    2014-01-01

    To date, it has been shown that cognitive map representations based on cartographic visualisations are systematically distorted. The grid is a traditional element of map graphics that has rarely been considered in research on perception-based spatial distortions. Grids not only support the map reader in finding coordinates or locations of objects; they also provide a systematic structure for clustering visual map information ("spatial chunks"). The aim of this study was to examine whether different cartographic kinds of grids reduce spatial distortions and improve recall memory for object locations. Recall performance was measured as both the percentage of correctly recalled objects (hit rate) and the mean distance errors of correctly recalled objects (spatial accuracy). Different kinds of grids (continuous lines, dashed lines, crosses) were applied to topographic maps. These maps were also varied in their type of characteristic areas (LANDSCAPE) and in their information layer compositions (DENSITY) to examine the effects of map complexity. The study, involving 144 participants, shows that all experimental cartographic factors (GRID, LANDSCAPE, DENSITY) improve recall performance and spatial accuracy of learned object locations. Overlaying a topographic map with a grid significantly reduces the mean distance errors of correctly recalled map objects. The paper includes a discussion of a square grid's usefulness for object location memory, independent of whether the grid is clearly visible (continuous or dashed lines) or only indicated by crosses.

  8. Grid-based platform for training in Earth Observation

    NASA Astrophysics Data System (ADS)

    Petcu, Dana; Zaharie, Daniela; Panica, Silviu; Frincu, Marc; Neagul, Marian; Gorgan, Dorian; Stefanut, Teodor

    2010-05-01

    The GiSHEO platform [1], which provides on-demand services for training and higher education in Earth Observation, has been developed, in the frame of an ESA-funded project through its PECS programme, to respond to the need for powerful education resources in the remote sensing field. It is intended to be a Grid-based platform whose potential for experimentation and extensibility are the key benefits compared with a desktop software solution. Near-real-time applications requiring multiple simultaneous short-response-time, data-intensive tasks, as in the case of a short training event, have proved to be ideal for this platform. The platform is based on Globus Toolkit 4 facilities for security and process management, and on the clusters of four academic institutions involved in the project. Authorization uses a VOMS service. The main public services are the following: the EO processing services (represented through special WSRF-type services); the workflow service exposing a particular workflow engine; the data indexing and discovery service for accessing the data management mechanisms; and the processing services, a collection allowing easy access to the processing platform. The WSRF-type services for basic satellite image processing reuse the free image processing tools OpenCV and GDAL. New algorithms and workflows were developed to tackle challenging problems such as detecting the underground remains of old fortifications, walls or houses. More details can be found in [2]. Composed services can be specified through workflows and are easy to deploy. The workflow engine, OSyRIS (Orchestration System using a Rule based Inference Solution), is based on DROOLS, and a new rule-based workflow language, SILK (SImple Language for worKflow), has been built. Workflow creation in SILK can be done with or without visual design tools. The basics of SILK are the tasks and the relations (rules) between them. It is similar to the SCUFL language, but does not rely on XML, in order to allow the introduction of more workflow-specific features. Moreover, an event-condition-action (ECA) approach allows greater flexibility when expressing data and task dependencies, as well as the creation of adaptive workflows which can react to changes in the configuration of the Grid or in the workflow itself (see the sketch after this record). Changes inside the Grid are handled by creating specific rules which allow resource selection based on various task scheduling criteria. Modifications of the workflow are usually accomplished either by inserting or retracting, at runtime, rules belonging to it, or by modifying the executor of a task in case a better one is found. The former implies changes in the workflow's structure, while the latter does not necessarily mean a change of resource but rather a change of the algorithm used for solving the task. More details can be found in [3]. Another important platform component is the data indexing and storage service, GDIS, providing features for data storage, indexing data using a specialized RDBMS, finding data by various conditions, querying external services, and keeping track of temporary data generated by other components. The data storage component of GDIS is responsible for storing the data using available storage backends such as local disk file systems (ext3), local cluster storage (GFS) or distributed file systems (HDFS). A front-end GridFTP service is capable of interacting with the storage domains on behalf of the clients in a uniform way, and also enforces the security restrictions provided by other specialized services related to data access. The data indexing is performed by PostGIS. An advanced and flexible interface for searching the project's geographical repository is built around a custom query language (LLQL - Lisp Like Query Language) designed to provide fine-grained access to the data in the repository and to query external services (e.g., for exploiting the connection with the GENESI-DR catalog). More details can be found in [4]. The Workload Management System (WMS) provides two types of resource managers. The first is based on Condor HTC and uses Condor as a job manager for task dispatching and working nodes (for development purposes), while the second uses GT4 GRAM (for production purposes). The WMS main component, the Grid Task Dispatcher (GTD), is responsible for the interaction with other internal services, such as the composition engine, in order to facilitate access to the processing platform. Its main responsibilities are to receive tasks from the workflow engine or directly from the user interface, to use a task description language (the ClassAd meta-language in the case of Condor HTC) for job units, to submit and check the status of jobs inside the workload management system, and to retrieve job logs for debugging purposes. More details can be found in [4]. A particular component of the platform is eGLE, the eLearning environment. It provides the functionality necessary to create the visual appearance of lessons through the use of visual containers like tools, patterns and templates. The teacher uses the platform for testing already created lessons, as well as for developing new lesson resources, such as new images and workflows describing graph-based processing. The students execute the lessons or describe and experiment with new workflows or different data. The eGLE database includes several workflow-based lesson descriptions, teaching materials and lesson resources, and selected satellite and spatial data. More details can be found in [5]. A first training event using the platform was organized in September 2009 during the 11th SYNASC symposium (links to the demos, testing interface, and exercises are available on the project site [1]). The eGLE component was presented at the 4th GPC conference in May 2009. Moreover, the functionality of the platform will be presented as a demo in April 2010 at the 5th EGEE User Forum. References: [1] GiSHEO consortium, Project site, http://gisheo.info.uvt.ro [2] D. Petcu, D. Zaharie, M. Neagul, S. Panica, M. Frincu, D. Gorgan, T. Stefanut, V. Bacu, Remote Sensed Image Processing on Grids for Training in Earth Observation. In Image Processing, V. Kordic (ed.), In-Tech, January 2010. [3] M. Neagul, S. Panica, D. Petcu, D. Zaharie, D. Gorgan, Web and Grid Services for Training in Earth Observation, IDAACS 2009, IEEE Computer Press, 241-246. [4] M. Frincu, S. Panica, M. Neagul, D. Petcu, GiSHEO: On Demand Grid Service Based Platform for EO Data Processing, HiperGrid 2009, Politehnica Press, 415-422. [5] D. Gorgan, T. Stefanut, V. Bacu, Grid Based Training Environment for Earth Observation, GPC 2009, LNCS 5529, 98-109.
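
    The event-condition-action idea behind OSyRIS can be illustrated in a few lines of Python. This is not SILK syntax, and the events, rules and context fields are invented for the example; it only shows how firing rules against events yields adaptive behaviour such as re-routing a failed task.

    ```python
    # Minimal event-condition-action sketch (illustrative only, not the
    # OSyRIS engine or SILK language): rules fire when an event arrives
    # and their condition holds on the workflow context.
    rules = [
        {"event": "task_failed",
         "condition": lambda ctx: ctx["retries"] < 3,
         "action": lambda ctx: ctx.update(retries=ctx["retries"] + 1,
                                          resource="backup-cluster")},
        {"event": "task_done",
         "condition": lambda ctx: True,
         "action": lambda ctx: ctx.update(state="next_task_ready")},
    ]

    def fire(event, ctx):
        """Dispatch an event against the rule base (first matching rule wins);
        inserting/retracting entries in `rules` at runtime adapts the workflow."""
        for rule in rules:
            if rule["event"] == event and rule["condition"](ctx):
                rule["action"](ctx)
                break
        return ctx

    ctx = {"retries": 0, "resource": "cluster-a", "state": "running"}
    print(fire("task_failed", ctx))  # re-routed to backup-cluster, retries=1
    ```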

  9. The three-dimensional Multi-Block Advanced Grid Generation System (3DMAGGS)

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.; Weilmuenster, Kenneth J.

    1993-01-01

    As the size and complexity of three-dimensional volume grids increase, there is a growing need for fast and efficient 3D volumetric elliptic grid solvers. Present-day solvers are limited by computational speed, and no single code offers all the desired capabilities, such as interior volume grid clustering control, viscous grid clustering at the wall of a configuration, truncation error limiters, and convergence optimization. A new volume grid generator, 3DMAGGS (Three-Dimensional Multi-Block Advanced Grid Generation System), which is based on the 3DGRAPE code, has evolved to meet these needs. This is a manual for the usage of 3DMAGGS; it contains five sections, covering the motivations and usage, a GRIDGEN interface, a grid quality analysis tool, a sample case for verifying correct operation of the code, and a comparison to both 3DGRAPE and GRIDGEN3D. Since 3DMAGGS was derived from 3DGRAPE, this technical memorandum should be used in conjunction with the 3DGRAPE manual (NASA TM-102224).
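
    For readers unfamiliar with elliptic grid generation, the 2-D sketch below shows the kernel such solvers iterate on: point-Jacobi relaxation of the Winslow (Laplace-type) grid equations with boundaries held fixed. The control functions that give the 3DGRAPE/3DMAGGS family its interior and wall clustering control are omitted, so this is a bare-bones illustration, not the code's actual algorithm.

    ```python
    import numpy as np

    def elliptic_smooth(x, y, n_iter=500):
        """Smooth an initial 2-D grid by point-Jacobi iteration on the
        Winslow grid equations; boundary nodes stay fixed. Clustering
        control functions (as in 3DMAGGS) are omitted in this sketch."""
        for _ in range(n_iter):
            x_xi  = 0.5 * (x[1:-1, 2:] - x[1:-1, :-2])   # d/dxi (axis 1)
            y_xi  = 0.5 * (y[1:-1, 2:] - y[1:-1, :-2])
            x_eta = 0.5 * (x[2:, 1:-1] - x[:-2, 1:-1])   # d/deta (axis 0)
            y_eta = 0.5 * (y[2:, 1:-1] - y[:-2, 1:-1])
            a = x_eta**2 + y_eta**2                      # coefficient of f_xixi
            b = x_xi * x_eta + y_xi * y_eta              # mixed-term coefficient
            g = x_xi**2 + y_xi**2                        # coefficient of f_etaeta
            for f in (x, y):
                f_xx = f[1:-1, 2:] + f[1:-1, :-2]
                f_ee = f[2:, 1:-1] + f[:-2, 1:-1]
                f_xe = 0.25 * (f[2:, 2:] - f[2:, :-2] - f[:-2, 2:] + f[:-2, :-2])
                f[1:-1, 1:-1] = (a * f_xx + g * f_ee - 2 * b * f_xe) / (2 * (a + g))
        return x, y

    # toy usage: unit square with a wavy bottom boundary
    n = 17
    xi, eta = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    x, y = xi.copy(), eta.copy()
    y[0, :] = 0.15 * np.sin(2 * np.pi * x[0, :])   # perturb one boundary
    x, y = elliptic_smooth(x, y)                   # interior relaxes smoothly
    ```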

  10. Quantifying the uncertainty introduced by discretization and time-averaging in two-fluid model predictions

    DOE PAGES

    Syamlal, Madhava; Celik, Ismail B.; Benyahia, Sofiane

    2017-07-12

    The two-fluid model (TFM) has become a tool for the design and troubleshooting of industrial fluidized bed reactors. To use TFM for scale-up with confidence, the uncertainty in its predictions must be quantified. Here, we study two sources of uncertainty: discretization and time-averaging. First, we show that successive grid refinement may not yield grid-independent transient quantities, including cross-section-averaged quantities. Successive grid refinement would yield grid-independent time-averaged quantities on sufficiently fine grids. A Richardson extrapolation can then be used to estimate the discretization error, and the grid convergence index gives an estimate of the uncertainty. Richardson extrapolation may not work for industrial-scale simulations that use coarse grids. We present an alternative method for coarse grids and assess its ability to estimate the discretization error. Second, we assess two methods (autocorrelation and binning) and find that the autocorrelation method is more reliable for estimating the uncertainty introduced by time-averaging TFM data.
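
    The Richardson-extrapolation bookkeeping mentioned above fits in a few lines. The three solution values below are invented for illustration (they are not the paper's data); the formulas are the standard observed-order, extrapolation, and grid convergence index expressions for a constant refinement ratio r.

    ```python
    import numpy as np

    def gci(f1, f2, f3, r=2.0, fs=1.25):
        """Discretization uncertainty from solutions on three grids
        (f1 finest, f3 coarsest) with constant refinement ratio r:
        observed order p, Richardson-extrapolated value, fine-grid GCI."""
        p = np.log(abs(f3 - f2) / abs(f2 - f1)) / np.log(r)   # observed order
        f_exact = f1 + (f1 - f2) / (r**p - 1.0)               # extrapolation
        gci_fine = fs * abs((f1 - f2) / f1) / (r**p - 1.0)    # relative GCI
        return p, f_exact, gci_fine

    # toy example: a time-averaged quantity from fine, medium, coarse runs
    p, f_ex, u = gci(f1=0.4102, f2=0.4228, f3=0.4480)
    print(f"p={p:.2f}  extrapolated={f_ex:.4f}  GCI={100*u:.2f}%")
    # p=1.00  extrapolated=0.3976  GCI=3.84%
    ```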

  11. PLOT3D user's manual

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Buning, Pieter G.; Pierce, Larry; Elson, Patricia A.

    1990-01-01

    PLOT3D is a computer graphics program designed to visualize the grids and solutions of computational fluid dynamics. Seventy-four functions are available. Versions are available for many systems. PLOT3D can handle multiple grids with a million or more grid points, and can produce a variety of model renderings, such as wireframe or flat shaded. Output from PLOT3D can be used in animation programs. The first part of this manual is a tutorial that takes the reader, keystroke by keystroke, through a PLOT3D session. The second part of the manual contains reference chapters, including the helpfile, data file formats, advice on changing PLOT3D, and sample command files.
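
    As an illustration of the data file formats the manual documents, here is a reader sketch for the simplest case: a single-block, formatted (ASCII) 3-D grid file with a "ni nj nk" header followed by all X, then all Y, then all Z values in Fortran order. Multi-block, binary, and IBLANKed variants, which PLOT3D also supports, are deliberately not handled.

    ```python
    import numpy as np

    def read_plot3d_ascii(path):
        """Read a single-block, formatted (ASCII) 3-D PLOT3D grid file:
        one line 'ni nj nk', then all X values, all Y, all Z (i varying
        fastest, i.e. Fortran order). A sketch for the simplest variant
        only; multi-block, binary, and IBLANKed files are not handled."""
        tokens = open(path).read().split()
        ni, nj, nk = (int(t) for t in tokens[:3])
        npts = ni * nj * nk
        vals = np.array(tokens[3:3 + 3 * npts], dtype=float)
        x, y, z = (vals[i * npts:(i + 1) * npts].reshape((ni, nj, nk), order="F")
                   for i in range(3))
        return x, y, z
    ```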

  12. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  13. The Development of a Visual-Perceptual Chemistry Specific (VPCS) Assessment Tool

    ERIC Educational Resources Information Center

    Oliver-Hoyo, Maria; Sloan, Caroline

    2014-01-01

    The development of the Visual-Perceptual Chemistry Specific (VPCS) assessment tool is based on items that align to eight visual-perceptual skills considered as needed by chemistry students. This tool includes a comprehensive range of visual operations and presents items within a chemistry context without requiring content knowledge to solve…

  14. caGrid 1.0 : an enterprise Grid infrastructure for biomedical research.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oster, S.; Langella, S.; Hastings, S.

    To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results: The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: .

  15. The OpenEarth Framework (OEF) for the 3D Visualization of Integrated Earth Science Data

    NASA Astrophysics Data System (ADS)

    Nadeau, David; Moreland, John; Baru, Chaitan; Crosby, Chris

    2010-05-01

    Data integration is increasingly important as we strive to combine data from disparate sources and assemble better models of the complex processes operating at the Earth's surface and within its interior. These data are often large, multi-dimensional, and subject to differing conventions for data structures, file formats, coordinate spaces, and units of measure. When visualized, these data require differing, and sometimes conflicting, conventions for visual representations, dimensionality, symbology, and interaction. All of this makes the visualization of integrated Earth science data particularly difficult. The OpenEarth Framework (OEF) is an open-source data integration and visualization suite of applications and libraries being developed by the GEON project at the University of California, San Diego, USA. Funded by the NSF, the project is leveraging virtual globe technology from NASA's WorldWind to create interactive 3D visualization tools that combine and layer data from a wide variety of sources to create a holistic view of features at, above, and beneath the Earth's surface. The OEF architecture is open, cross-platform, modular, and based upon Java. The OEF's modular approach to software architecture yields an array of mix-and-match software components for assembling custom applications. Available modules support file format handling, web service communications, data management, user interaction, and 3D visualization. File parsers handle a variety of formal and de facto standard file formats used in the field. Each one imports data into a general-purpose common data model supporting multidimensional regular and irregular grids, topography, feature geometry, and more. Data within these data models may be manipulated, combined, reprojected, and visualized. The OEF's visualization features support a variety of conventional and new visualization techniques for looking at topography, tomography, point clouds, imagery, maps, and feature geometry. 3D data such as seismic tomography may be sliced by multiple oriented cutting planes and isosurfaced to create 3D skins that trace feature boundaries within the data. Topography may be overlaid with satellite imagery, maps, and data such as gravity and magnetics measurements. Multiple data sets may be visualized simultaneously using overlapping layers within a common 3D coordinate space. Data management within the OEF handles and hides the inevitable quirks of differing file formats, web protocols, storage structures, coordinate spaces, and metadata representations. Heuristics are used to extract necessary metadata used to guide data and visual operations. Derived data representations are computed to better support fluid interaction and visualization while the original data is left unchanged in its original form. Data is cached for better memory and network efficiency, and all visualization makes use of 3D graphics hardware support found on today's computers. The OpenEarth Framework project is currently prototyping the software for use in the visualization and integration of continental-scale geophysical data being produced by EarthScope-related research in the Western US. The OEF is providing researchers with new ways to display and interrogate their data and is anticipated to be a valuable tool for future EarthScope-related research.

  16. Blast2GO goes grid: developing a grid-enabled prototype for functional genomics analysis.

    PubMed

    Aparicio, G; Götz, S; Conesa, A; Segrelles, D; Blanquer, I; García, J M; Hernandez, V; Robles, M; Talon, M

    2006-01-01

    The vast amount and complexity of data generated in genomic research imply that new dedicated and powerful computational tools need to be developed to meet analysis requirements. Blast2GO (B2G) is a bioinformatics tool for Gene Ontology-based DNA or protein sequence annotation and function-based data mining. The application has been developed with the aim of offering an easy-to-use tool for functional genomics research. Typical B2G users are middle-sized genomics labs carrying out sequencing, EST and microarray projects, handling datasets of up to several thousand sequences. In the current version of B2G, the power and analytical potential of both annotation and function data-mining are somewhat restricted by the computational power behind each particular installation. In order to offer enhanced computational capacity within this bioinformatics application, a Grid component is being developed. A prototype has been conceived for the particular problem of speeding up the Blast searches to obtain fast results for large datasets. Many efforts have been made in the literature concerning the speeding up of Blast searches, but few of them deal with the use of large heterogeneous production Grid infrastructures. These are the infrastructures that could reach the largest number of resources and the best load balancing for data access. The Grid Service under development will analyse requests based on the number of sequences, splitting them according to the available resources. Lower-level computation will be performed through mpiBLAST. The software architecture is based on the WSRF standard.
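
    The request-splitting step can be sketched simply. The chunking rule below (proportional to per-resource capacity) is an assumption for illustration; the actual Grid Service's balancing logic, and its mpiBLAST execution of each chunk, are more involved.

    ```python
    def split_fasta(records, capacities):
        """Split `records` (a list of FASTA entries) into one chunk per grid
        resource, proportionally to each resource's capacity -- a simplified
        stand-in for the request-splitting step described above."""
        total = sum(capacities)
        chunks, start = [], 0
        for i, cap in enumerate(capacities):
            if i < len(capacities) - 1:
                n = round(len(records) * cap / total)
            else:
                n = len(records) - start       # last chunk takes the remainder
            chunks.append(records[start:start + n])
            start += n
        return chunks

    seqs = [f">seq{i}\nACGT" for i in range(10)]
    print([len(c) for c in split_fasta(seqs, capacities=[4, 4, 2])])  # [4, 4, 2]
    ```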

  17. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.

  18. On a simulation study for reliable and secured smart grid communications

    NASA Astrophysics Data System (ADS)

    Mallapuram, Sriharsha; Moulema, Paul; Yu, Wei

    2015-05-01

    Demand response is one of the key smart grid applications; it aims to reduce power generation at peak hours and maintain a balance between supply and demand. With the support of communication networks, energy consumers can become active actors in the energy management process by adjusting or rescheduling their electricity usage during peak hours based on utilities' pricing incentives. Nonetheless, the integration of communication networks exposes the smart grid to cyber-attacks. In this paper, we developed a smart grid simulation test-bed and designed evaluation scenarios. By leveraging the capabilities of the Matlab and ns-3 simulation tools, we conducted a simulation study to evaluate the impact of cyber-attacks on the demand response application. Our data shows that cyber-attacks could seriously disrupt smart grid operations, thus confirming the need for secure and resilient communication networks to support smart grid operations.
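
    A toy numerical example helps fix ideas about why forged price signals matter, though it is far simpler than the paper's Matlab/ns-3 co-simulation: flexible load responds to a peak-hour price incentive, and spoofing a single hour's price removes the peak relief exactly when it is needed. All numbers here are invented for illustration.

    ```python
    import numpy as np

    # Toy demand-response sketch (not the paper's test-bed): a price signal
    # trims flexible load at peak hours; a forged "off-peak" price injected
    # at the true peak hour removes that relief.
    hours = np.arange(24)
    base  = 60 + 30 * np.exp(-((hours - 18) ** 2) / 8.0)   # MW, evening peak
    price = np.where(base > 75, 0.30, 0.10)                # $/kWh incentive
    elastic = 0.25                                          # flexible-load share

    demand_dr = base * (1 - elastic * (price == 0.30))      # consumers shed load
    attacked = price.copy()
    attacked[18] = 0.10                                     # spoofed price at peak
    demand_attacked = base * (1 - elastic * (attacked == 0.30))

    print(f"peak demand with DR:   {demand_dr.max():.1f} MW")
    print(f"peak demand, attacked: {demand_attacked.max():.1f} MW")
    ```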

  19. Application of PMU-Based Information in the Indian Power System

    NASA Astrophysics Data System (ADS)

    Agarwal, P. K.; Agarwal, V. K.; Rathour, Harish

    2013-05-01

    The SCADA/EMS system has been the most commonly used tool for real-time power system operation and control throughout the world, and it has been found to be very useful in steady-state analysis of the power system. The ever-increasing dependence of human society and every country's economy on electrical energy calls for reliable power delivery. In order to meet these expectations, engineers across the globe have been exploring new technologies that can improve upon the limitations of SCADA and provide dynamic visibility of the power system. A breakthrough has now been achieved in the form of synchrophasor technology. Synchrophasor measurements using phasor measurement units (PMUs) deployed over a wide area facilitate dynamic state measurement and visualization of a power system, which are useful in monitoring the safety and security of the grid. The Power System Operation Corporation (POSOCO) has taken the initiative and implemented a pilot project wherein nine phasor measurement units (PMUs) along with one phasor data concentrator (PDC) were commissioned in the Northern Region (NR) of India. The primary objective of this pilot project was to comprehend the synchrophasor technology and its applications in power system operation. The data received and information derived from the pilot project have been found to be very useful and have helped in improving the performance of grid operation in several ways. The pilot project has been operational for the last two years; in the meantime, many other initiatives have also been taken in other regions by POSOCO. This article details the utilization of the data collected from the pilot projects and the application of the data in the improvement of the Indian power grid.

  20. Two variants of minimum discarded fill ordering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Azevedo, E.F.; Forsyth, P.A.; Tang, Wei-Pai

    1991-01-01

    It is well known that the ordering of the unknowns can have a significant effect on the convergence of Preconditioned Conjugate Gradient (PCG) methods. There has been considerable experimental work on the effects of ordering for regular finite difference problems. In many cases, good results have been obtained with preconditioners based on diagonal, spiral or natural row orderings. However, for finite element problems having unstructured grids or grids generated by a local refinement approach, it is difficult to define many of the orderings used for more regular problems. A recently proposed Minimum Discarded Fill (MDF) ordering technique is effective in finding high quality Incomplete LU (ILU) preconditioners, especially for problems arising from unstructured finite element grids. Testing indicates this algorithm can identify a rather complicated physical structure in an anisotropic problem and order the unknowns in the 'preferred' direction. The MDF technique may be viewed as the numerical analogue of the minimum deficiency algorithm in sparse matrix technology. At any stage of the partial elimination, the MDF technique chooses the next pivot node so as to minimize the amount of discarded fill. In this work, two efficient variants of the MDF technique are explored to produce cost-effective high-order ILU preconditioners. The Threshold MDF orderings combine MDF ideas with drop tolerance techniques to identify the sparsity pattern in the ILU preconditioners. These techniques identify an ordering that encourages fast decay of the entries in the ILU factorization. The Minimum Update Matrix (MUM) ordering technique is a simplification of the MDF ordering and is closely related to the minimum degree algorithm. The MUM ordering is especially suited to large problems arising from Navier-Stokes problems. Some interesting pictures of the orderings are presented using a visualization tool. 22 refs., 4 figs., 7 tabs.
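
    The pivot rule is easy to state concretely. A simplified dense-matrix Python sketch (illustrative only; the paper's ILU variants work on sparse structures with drop tolerances): at each step, choose the remaining node whose elimination would create the least total fill outside the current pattern, then eliminate it.

      import numpy as np

      def mdf_like_order(A, tol=1e-12):
          # greedy minimum-fill pivot selection in the spirit of MDF (dense sketch)
          A = A.astype(float).copy()
          n = A.shape[0]
          remaining, order = list(range(n)), []
          for _ in range(n):
              best, best_cost = remaining[0], np.inf
              for k in remaining:
                  if abs(A[k, k]) < tol:
                      continue
                  cost = 0.0
                  for i in remaining:
                      if i == k or A[i, k] == 0.0:
                          continue
                      for j in remaining:
                          if j != k and A[k, j] != 0.0 and A[i, j] == 0.0:
                              cost += abs(A[i, k] * A[k, j] / A[k, k])  # would-be fill
                  if cost < best_cost:
                      best, best_cost = k, cost
              order.append(best)
              remaining.remove(best)
              for i in remaining:  # eliminate the chosen pivot from the submatrix
                  if A[i, best] != 0.0 and A[best, best] != 0.0:
                      A[i, remaining] = A[i, remaining] - A[i, best] / A[best, best] * A[best, remaining]
          return order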

  1. Gaming to improve vision: 21st century self-monitoring for patients with age-related macular degeneration.

    PubMed

    Razavi, Hessom; Baglin, Elizabeth; Sharangan, Pyrawy; Caruso, Emily; Tindill, Nicole; Griffin, Susan; Guymer, Robyn

    2017-11-13

    Improved vision self-monitoring tools are required for people at risk of neovascular complications from age-related macular degeneration (AMD). The aim was to report the self-monitoring habits of participants with intermediate AMD using the Amsler grid chart, and the use of personal electronic devices and gameplay in this over-50-year-old cohort. This was a single-centre descriptive study carried out at the Centre for Eye Research Australia (CERA), Melbourne, Australia, with 140 participants over 50 years of age, with a diagnosis of intermediate AMD and best-corrected visual acuity (BCVA) of ≥6/12 in each eye. The method was a structured questionnaire survey of participants who were enrolled in natural history of AMD studies at CERA; the outcome measures were the frequency of vision self-monitoring using the Amsler grid chart, and the frequency of general use of personal electronic devices and gameplay. Of 140 participants with a mean age of 70.5 years, 83.6% used an Amsler grid chart, but only 39.3% used it once per week. Most participants (91.4%) used one or more personal electronic devices. Of these, over half (54.7%) played games on them, among whom 39% played games once a day. Of participants aged 50-69 years, 92% (95%CI 85.1-98.9) were willing to play a game to monitor their vision, compared to 78% (95%CI 69.0-87.0) of those aged 70 years and older (P < 0.05). In conclusion, a large proportion of AMD patients already use personal electronic devices. Gamification techniques are likely to increase compliance with self-monitoring, leading to earlier detection in the next generation of patients with neovascular AMD. © 2017 Royal Australian and New Zealand College of Ophthalmologists.

  2. SciFlo: Semantically-Enabled Grid Workflow for Collaborative Science

    NASA Astrophysics Data System (ADS)

    Yunck, T.; Wilson, B. D.; Raskin, R.; Manipon, G.

    2005-12-01

    SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and Grid computing standards (WS-* standards and the Globus Alliance toolkits), and enables scientists to perform multi-instrument Earth science analyses by assembling reusable SOAP services, native executables, local command-line scripts, and Python code into a distributed computing flow (a graph of operators). SciFlo's XML dataflow documents can be a mixture of concrete operators (fully bound operations) and abstract template operators (late binding via semantic lookup). All data objects and operators can be both simply typed (simple and complex types in XML schema) and semantically typed using controlled vocabularies (linked to OWL ontologies such as SWEET). By exploiting ontology-enhanced search and inference, one can discover (and automatically invoke) Web Services and operators that have been semantically labeled as performing the desired transformation, and adapt a particular invocation to the proper interface (number, types, and meaning of inputs and outputs). The SciFlo client and server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. A Visual Programming tool is also being developed, but it is not required. Once an analysis has been specified for a granule or day of data, it can easily be repeated with different control parameters and over months or years of data. SciFlo uses and preserves semantics, and also generates and infers new semantic annotations. Specifically, the SciFlo engine uses semantic metadata to understand (infer) what it is doing and potentially improve the data flow; preserves semantics by saving links to the semantics of (metadata describing) the input datasets, related datasets, and the data transformations (algorithms) used to generate downstream products; generates new metadata by allowing the user to add semantic annotations to the generated data products (or simply accept automatically generated provenance annotations); and infers new semantic metadata by understanding and applying logic to the semantics of the data and the transformations performed. Much ontology development remains to be done; nevertheless, SciFlo documents provide a substrate for using and preserving more semantics as ontologies develop. We will give a live demonstration of the growing SciFlo network using an example dataflow in which atmospheric temperature and water vapor profiles from three Earth Observing System (EOS) instruments are retrieved using SOAP (geo-location query and data access) services, co-registered, and visually and statistically compared on demand (see http://sciflo.jpl.nasa.gov for more information).
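
    The execution model (a directed graph of operators, run in dependency order) is compact to illustrate. A Python analogue of a dataflow document follows (illustrative only; SciFlo's real engine consumes XML documents and dispatches SOAP services, neither of which is modeled here, and the "airs"/"amsu" names are placeholders):

      from graphlib import TopologicalSorter  # standard library, Python 3.9+

      def run_flow(operators, inputs):
          # operators: name -> (callable, [names of inputs]); inputs: external data
          deps = {name: set(srcs) & set(operators) for name, (_, srcs) in operators.items()}
          values = dict(inputs)
          for name in TopologicalSorter(deps).static_order():
              func, srcs = operators[name]
              values[name] = func(*[values[s] for s in srcs])
          return values

      # toy flow: co-register two retrieved profiles, then difference one against the mean
      flow = {
          "coreg": (lambda a, b: [(x + y) / 2 for x, y in zip(a, b)], ["airs", "amsu"]),
          "diff":  (lambda a, c: [x - y for x, y in zip(a, c)], ["airs", "coreg"]),
      }
      print(run_flow(flow, {"airs": [1.0, 2.0, 3.0], "amsu": [3.0, 2.0, 1.0]}))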

  3. Techniques and resources for storm-scale numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Droegemeier, Kelvin; Grell, Georg; Doyle, James; Soong, Su-Tzai; Skamarock, William; Bacon, David; Staniforth, Andrew; Crook, Andrew; Wilhelmson, Robert

    1993-01-01

    The topics discussed include the following: multiscale application of the 5th-generation PSU/NCAR mesoscale model, the coupling of nonhydrostatic atmospheric and hydrostatic ocean models for air-sea interaction studies; a numerical simulation of cloud formation over complex topography; adaptive grid simulations of convection; an unstructured grid, nonhydrostatic meso/cloud scale model; efficient mesoscale modeling for multiple scales using variable resolution; initialization of cloud-scale models with Doppler radar data; and making effective use of future computing architectures, networks, and visualization software.

  4. Using Visual Simulation Tools And Learning Outcomes-Based Curriculum To Help Transportation Engineering Students And Practitioners To Better Understand And Design Traffic Signal Control Systems

    DOT National Transportation Integrated Search

    2012-06-01

    The use of visual simulation tools to convey complex concepts has become a useful tool in education as well as in research. This report describes a project that developed curriculum and visualization tools to train transportation engineering students and practitioners...

  5. Communications Effects Server (CES) Model for Systems Engineering Research

    DTIC Science & Technology

    2012-01-31

    [Architecture-diagram text recovered from the report: the CES defines logical interfaces to visualization tools, HLA tools, DIS tools, and the STK tool, plus module-level execution kernels, and interoperates with STK when running simulations; among the GUI components, the Architect provides the main network design and visualization functions, alongside third-party visualization, analysis, and text-editing tools used by HLA analysts.]

  6. Comprehensive framework for visualizing and analyzing spatio-temporal dynamics of racial diversity in the entire United States

    PubMed Central

    Netzel, Pawel

    2017-01-01

    The United States is increasingly becoming a multi-racial society. To understand the many consequences of this overall trend for our neighborhoods, we need a methodology capable of spatio-temporal analysis of racial diversity at the local level but also across the entire U.S. Furthermore, such a methodology should be accessible to stakeholders ranging from analysts to decision makers. In this paper we present a comprehensive framework for visualizing and analyzing diversity data that fulfills these requirements. The first component of our framework is a U.S.-wide, multi-year database of race sub-population grids which is freely available for download. These 30 m resolution grids have been developed using dasymetric modeling and are available for 1990, 2000, and 2010. We summarize numerous advantages of gridded population data over commonly used Census tract-aggregated data. Using these grids frees analysts from constructing their own and allows them to focus on diversity analysis. The second component of our framework is a set of U.S.-wide, multi-year diversity maps at 30 m resolution. A diversity map is our product that classifies the gridded population into 39 communities based on their degrees of diversity, dominant race, and population density. It provides spatial information on diversity in a single, easy-to-understand map that can be utilized by analysts and end users alike. Maps based on subsequent Censuses provide information about the spatio-temporal dynamics of diversity. Diversity maps are accessible through the GeoWeb application SocScape (http://sil.uc.edu/webapps/socscape_usa/) for immediate online exploration. The third component of our framework is a proposal to quantitatively analyze diversity maps using a set of landscape metrics. Because of its form, a grid-based diversity map can be thought of as a diversity “landscape” and analyzed quantitatively using landscape metrics. We give a brief summary of the most pertinent metrics and demonstrate how they can be applied to diversity maps. PMID:28358862
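
    The classification idea behind the diversity maps can be sketched in a few lines. The following Python fragment (illustrative only; SocScape's 39-class scheme also uses dominant race and population density, and the entropy thresholds here are assumed) labels grid cells by the normalized Shannon diversity of their sub-population counts:

      import numpy as np

      def diversity_labels(counts, low=0.3, high=0.7):
          # counts: (cells, groups) array of sub-population counts per grid cell
          p = counts / counts.sum(axis=1, keepdims=True)
          logp = np.log(p, where=p > 0, out=np.zeros_like(p))
          h = -(p * logp).sum(axis=1) / np.log(counts.shape[1])  # entropy scaled to [0, 1]
          dominant = counts.argmax(axis=1)
          label = np.digitize(h, [low, high])  # 0 = low, 1 = medium, 2 = high diversity
          return h, dominant, label

      cells = np.array([[90.0, 5.0, 5.0], [40.0, 35.0, 25.0], [34.0, 33.0, 33.0]])
      print(diversity_labels(cells))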

  7. Option Grids to facilitate shared decision making for patients with Osteoarthritis of the knee: protocol for a single site, efficacy trial.

    PubMed

    Marrin, Katy; Wood, Fiona; Firth, Jill; Kinsey, Katharine; Edwards, Adrian; Brain, Kate E; Newcombe, Robert G; Nye, Alan; Pickles, Timothy; Hawthorne, Kamila; Elwyn, Glyn

    2014-04-07

    Despite policy interest, an ethical imperative, and evidence of the benefits of patient decision support tools, the adoption of shared decision making (SDM) in day-to-day clinical practice remains slow and is inhibited by barriers that include culture and attitudes, and resource and time pressures. Patient decision support tools often require high levels of health and computer literacy. Option Grids are one-page evidence-based summaries of the available condition-specific treatment options, listing patients' frequently asked questions. They are designed to be brief and accessible enough to support a better dialogue between patients and clinicians during routine consultations. This paper describes a study to assess whether an Option Grid for osteoarthritis of the knee (OA of the knee) facilitates SDM, and explores the use of Option Grids by patients disadvantaged by language or poor health literacy. This will be a stepped wedge exploratory trial involving 72 patients with OA of the knee referred from primary medical care to a specialist musculoskeletal service in Oldham. Six physiotherapists will sequentially join the trial and consult with six patients each using usual care procedures. After a period of brief training in using the Option Grid, the same six physiotherapists will consult with six further patients each, using an Option Grid in the consultation. The primary outcome will be the efficacy of the Option Grid in facilitating SDM as measured by observational scores using the OPTION scale. Comparisons will be made between patients who have received the Option Grid and those who received usual care. A Decision Quality Measure (DQM) will assess quality of decision making. The health literacy of patients will be measured using the REALM-R instrument. Consultations will be observed and audio-recorded. Interviews will be conducted with the physiotherapists, patients and any interpreters present to explore their views of using the Option Grid. Option Grids offer a potential solution to the barriers to implementing traditional decision aids in routine clinical practice. The study will assess whether Option Grids can facilitate SDM in day-to-day clinical practice and explore their use with patients disadvantaged by language or poor health literacy. Current Controlled Trials ISRCTN94871417.

  8. Introducing GV: The Spacecraft Geometry Visualizer

    NASA Astrophysics Data System (ADS)

    Throop, Henry B.; Stern, S. A.; Parker, J. W.; Gladstone, G. R.; Weaver, H. A.

    2009-12-01

    GV (Geometry Visualizer) is a web-based program for planning spacecraft observations. GV is the primary planning tool used by the New Horizons science team to plan the encounter with Pluto. GV creates accurate 3D images and movies showing the position of planets, satellites, and stars as seen from an observer on a spacecraft or other body. NAIF SPICE routines are used throughout for accurate calculations of all geometry. GV includes 3D geometry rendering of all planetary bodies, lon/lat grids, ground tracks, albedo maps, stellar magnitudes, types and positions from HD and Tycho-2 catalogs, and spacecraft FOVs. It generates still images, animations, and geometric data tables. GV is accessed through an easy-to-use and flexible web interface. The web-based interface allows for uniform use from any computer and assures that all users are accessing up-to-date versions of the code and kernel libraries. Compared with existing planning tools, GV is often simpler, faster, lower-cost, and more flexible. GV was developed at SwRI to support the New Horizons mission to Pluto. It has been subsequently expanded to support multiple other missions in flight or under development, including Cassini, Messenger, Rosetta, LRO, and Juno. The system can be used to plan Earth-based observations such as occultations to high precision, and was used by the public to help plan 'Kodak Moment' observations of the Pluto system from New Horizons. Potential users of GV may contact the author for more information. Development of GV has been funded by the New Horizons, Rosetta, and LRO missions.

  9. Assessing species distribution using Google Street View: a pilot study with the Pine Processionary Moth.

    PubMed

    Rousselet, Jérôme; Imbert, Charles-Edouard; Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre

    2013-01-01

    Mapping species spatial distribution using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to what extent large-scale databases such as Google Maps and Google Street View could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available in Google Street View. We designed a standardized procedure for evaluating the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google Street View were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that the Google database might provide useful occurrence data for mapping the distribution of species whose presence can be visually evaluated, such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors, such as the coverage of the Google Street View network with regard to sampling grid size and the spatial distribution of host trees with regard to the road network, may also be determinant.
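
    The scale effect reported above (96% agreement at a 16 km mesh versus 46% at 2 km) is easy to reproduce qualitatively. A toy Python sketch (synthetic presence grids, not the study's data; the miss rate is an assumed parameter) compares agreement before and after coarsening the mesh:

      import numpy as np

      def agreement(a, b):
          # fraction of grid cells where the two surveys give the same answer
          return float((np.asarray(a, bool) == np.asarray(b, bool)).mean())

      def coarsen(grid, factor):
          # mark a coarse cell present if any of its fine sub-cells is present
          g = np.asarray(grid, bool)
          h, w = g.shape[0] // factor, g.shape[1] // factor
          return g[:h * factor, :w * factor].reshape(h, factor, w, factor).any(axis=(1, 3))

      rng = np.random.default_rng(1)
      field = rng.random((64, 64)) < 0.3            # toy field survey on a fine mesh
      gsv = field & (rng.random((64, 64)) < 0.6)    # Street View misses some nests
      print(agreement(field, gsv))                  # fine mesh: more disagreement
      print(agreement(coarsen(field, 8), coarsen(gsv, 8)))  # coarse mesh: higher agreement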

  10. Leveraging GeoTIFF Compatibility for Visualizing a New EASE-Grid 2.0 Global Satellite Passive Microwave Climate Record

    NASA Astrophysics Data System (ADS)

    Paget, A. C.; Brodzik, M. J.; Long, D. G.; Hardman, M.

    2016-02-01

    The historical record of satellite-derived passive microwave brightness temperatures comprises data from multiple imaging radiometers (SMMR, SSM/I-SSMIS, AMSR-E), spanning nearly 40 years of Earth observations from 1978 to the present. Passive microwave data are used to monitor time series of many climatological variables, including ocean wind speeds, cloud liquid water, sea ice concentrations, and ice velocity. Gridded versions of passive microwave data have been produced using various map projections (polar stereographic, Lambert azimuthal equal-area, cylindrical equal-area, quarter-degree Plate Carrée) and data formats (flat binary, HDF). However, none of the currently available versions can be rendered in the common visualization standard, GeoTIFF, without requiring cartographic reprojection. Furthermore, the reprojection details are complicated and often require expert knowledge of obscure software package options. We are producing a consistently calibrated, completely reprocessed data set of this valuable multi-sensor satellite record, using EASE-Grid 2.0, an improved equal-area projection definition that will require no reprojection for translation into GeoTIFF. Our approach has been twofold: 1) define the projection ellipsoid to match the reference datum of the satellite data, and 2) include the required file-level metadata for standard projection software to correctly render the data in the GeoTIFF standard. The Calibrated, Enhanced Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) leverages image reconstruction techniques to enhance gridded spatial resolution to 3 km and uses newly available intersensor calibrations to improve the quality of derived geophysical products. We expect that our attention to easy GeoTIFF compatibility will foster higher-quality analysis with the CETB product by enabling easy and correct intercomparison with other gridded and in situ data.
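
    The "no reprojection needed" claim boils down to writing the grid with the right coordinate reference system and file-level metadata. A Python sketch using the rasterio library (an assumed dependency), with EPSG:6933 for the EASE-Grid 2.0 global projection; the grid dimensions, corner coordinates, and nominal 25 km cell size are illustrative values:

      import numpy as np
      import rasterio
      from rasterio.transform import from_origin

      tb = np.full((584, 1388), 250.0, dtype="float32")  # placeholder brightness temperatures (K)
      transform = from_origin(-17367530.45, 7314540.83, 25025.26, 25025.26)

      with rasterio.open(
          "cetb_example.tif", "w", driver="GTiff",
          height=tb.shape[0], width=tb.shape[1], count=1,
          dtype="float32", crs="EPSG:6933", transform=transform,
      ) as dst:
          dst.write(tb, 1)  # standard GIS tools can now render this without reprojection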

  11. Assessing Species Distribution Using Google Street View: A Pilot Study with the Pine Processionary Moth

    PubMed Central

    Dekri, Anissa; Garcia, Jacques; Goussard, Francis; Vincent, Bruno; Denux, Olivier; Robinet, Christelle; Dorkeld, Franck; Roques, Alain; Rossi, Jean-Pierre

    2013-01-01

    Mapping species spatial distribution using spatial inference and prediction requires a lot of data. Occurrence data are generally not easily available from the literature and are very time-consuming to collect in the field. For that reason, we designed a survey to explore to which extent large-scale databases such as Google maps and Google street view could be used to derive valid occurrence data. We worked with the Pine Processionary Moth (PPM) Thaumetopoea pityocampa because the larvae of that moth build silk nests that are easily visible. The presence of the species at one location can therefore be inferred from visual records derived from the panoramic views available from Google street view. We designed a standardized procedure allowing evaluating the presence of the PPM on a sampling grid covering the landscape under study. The outputs were compared to field data. We investigated two landscapes using grids of different extent and mesh size. Data derived from Google street view were highly similar to field data in the large-scale analysis based on a square grid with a mesh of 16 km (96% of matching records). Using a 2 km mesh size led to a strong divergence between field and Google-derived data (46% of matching records). We conclude that Google database might provide useful occurrence data for mapping the distribution of species which presence can be visually evaluated such as the PPM. However, the accuracy of the output strongly depends on the spatial scales considered and on the sampling grid used. Other factors such as the coverage of Google street view network with regards to sampling grid size and the spatial distribution of host trees with regards to road network may also be determinant. PMID:24130675

  12. A Multi-layer, Data-driven Advanced Reasoning Tool for Intelligent Data Mining and Analysis for Smart Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ning; Du, Pengwei; Greitzer, Frank L.

    2012-12-31

    This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning “triage” of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies the highest priority actions for the human decision maker. M-DART represents a significant advancement over today's grid monitoring technologies, which apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.

  13. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of the text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
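
    The parallelization pattern (partition the corpus, process partitions independently, merge the partial results) can be shown with a single-node Python analogue; the paper's engine runs on cluster architectures, which this sketch does not model:

      from collections import Counter
      from multiprocessing import Pool

      def term_counts(doc):
          # per-document processing step; real engines do far more than tokenization
          return Counter(doc.lower().split())

      def corpus_counts(docs, workers=4):
          with Pool(workers) as pool:
              partials = pool.map(term_counts, docs)  # data-parallel over documents
          total = Counter()
          for c in partials:
              total.update(c)                         # merge the partial results
          return total

      if __name__ == "__main__":
          docs = ["grid visualization tools", "visual analytics tools for text"]
          print(corpus_counts(docs, workers=2))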

  14. A 'digital' technique for manual extraction of data from aerial photography

    NASA Technical Reports Server (NTRS)

    Istvan, L. B.; Bondy, M. T.

    1977-01-01

    The interpretation procedure described uses a grid cell approach in which a random point is also located in each cell. The procedure requires that the cell/point grid be established on a base map, and that identical grids be made to precisely match the scale of the photographic frames. The grid is then positioned on the photography by visual alignment to obvious features. Several alignments on one frame are sometimes required to make a precise match of all points to be interpreted; this inherently corrects for distortions in the photography. Interpretation is then done cell by cell. In order to meet time constraints, first-order interpretation should be maintained. The data are put onto coding forms, along with other appropriate data if desired. This 'digital' manual interpretation technique has proven to be efficient, and time and cost effective, while meeting strict requirements for data format and accuracy.
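
    The cell/point construction is straightforward to generate programmatically. The following Python sketch (the cell size and grid dimensions are arbitrary; the original procedure is manual) builds the grid-cell corners and one random point per cell, as described above:

      import numpy as np

      def cell_point_grid(x0, y0, cell, nx, ny, seed=0):
          # grid-cell corners plus one uniformly random point inside each cell
          rng = np.random.default_rng(seed)
          xs = x0 + cell * np.arange(nx)
          ys = y0 + cell * np.arange(ny)
          corners = np.stack(np.meshgrid(xs, ys), axis=-1)    # (ny, nx, 2) lower-left corners
          points = corners + rng.random((ny, nx, 2)) * cell   # one random point per cell
          return corners, points

      corners, points = cell_point_grid(0.0, 0.0, 100.0, 8, 5)
      print(points.shape)  # (5, 8, 2): interpret cell by cell, point by point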

  15. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
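
    The two atomic operators are easy to make concrete. A minimal Python sketch (the graph representation and attribute scheme are assumptions; the paper defines the algebra formally): selection keeps the nodes satisfying a predicate together with their induced edges, and aggregation merges nodes sharing a key into supernodes with edge multiplicities.

      def select(graph, pred):
          # selection: keep nodes satisfying pred, plus the edges they induce
          nodes = {n: a for n, a in graph["nodes"].items() if pred(a)}
          edges = [(u, v) for u, v in graph["edges"] if u in nodes and v in nodes]
          return {"nodes": nodes, "edges": edges}

      def aggregate(graph, key):
          # aggregation: merge nodes sharing key(attrs) into supernodes, count edges
          group = {n: key(a) for n, a in graph["nodes"].items()}
          sizes = {}
          for g in group.values():
              sizes[g] = sizes.get(g, 0) + 1
          weights = {}
          for u, v in graph["edges"]:
              e = (group[u], group[v])
              weights[e] = weights.get(e, 0) + 1
          return {"nodes": {g: {"size": s} for g, s in sizes.items()},
                  "edges": [(u, v, w) for (u, v), w in weights.items()]}

      g = {"nodes": {1: {"dept": "A"}, 2: {"dept": "A"}, 3: {"dept": "B"}},
           "edges": [(1, 2), (2, 3)]}
      print(aggregate(select(g, lambda a: True), lambda a: a["dept"]))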

  16. The Multisensory Attentional Consequences of Tool Use: A Functional Magnetic Resonance Imaging Study

    PubMed Central

    Holmes, Nicholas P.; Spence, Charles; Hansen, Peter C.; Mackay, Clare E.; Calvert, Gemma A.

    2008-01-01

    Background Tool use in humans requires that multisensory information is integrated across different locations, from objects seen to be distant from the hand, but felt indirectly at the hand via the tool. We tested the hypothesis that using a simple tool to perceive vibrotactile stimuli results in the enhanced processing of visual stimuli presented at the distal, functional part of the tool. Such a finding would be consistent with a shift of spatial attention to the location where the tool is used. Methodology/Principal Findings We tested this hypothesis by scanning healthy human participants' brains using functional magnetic resonance imaging, while they used a simple tool to discriminate between target vibrations, accompanied by congruent or incongruent visual distractors, on the same or opposite side to the tool. The attentional hypothesis was supported: BOLD response in occipital cortex, particularly in the right hemisphere lingual gyrus, varied significantly as a function of tool position, increasing contralaterally, and decreasing ipsilaterally to the tool. Furthermore, these modulations occurred despite the fact that participants were repeatedly instructed to ignore the visual stimuli, to respond only to the vibrotactile stimuli, and to maintain visual fixation centrally. In addition, the magnitude of multisensory (visual-vibrotactile) interactions in participants' behavioural responses significantly predicted the BOLD response in occipital cortical areas that were also modulated as a function of both visual stimulus position and tool position. Conclusions/Significance These results show that using a simple tool to locate and to perceive vibrotactile stimuli is accompanied by a shift of spatial attention to the location where the functional part of the tool is used, resulting in enhanced processing of visual stimuli at that location, and decreased processing at other locations. This was most clearly observed in the right hemisphere lingual gyrus. Such modulations of visual processing may reflect the functional importance of visuospatial information during human tool use. PMID:18958150

  17. Repertory Grid As a Means to Compare and Contrast Developmental Theorists

    ERIC Educational Resources Information Center

    Mayo, Joseph A.

    2004-01-01

    This article reports on the use of a repertory grid as a tool for studying conceptual systems in line with Kelly's (1955) personal construct theory. Using 7-point construct continua, students rated the positions of major developmental theorists on various bipolar constructs (e.g., nature-nurture, continuity-discontinuity) representing salient…

  18. Renewable Resource Data | Grid Modernization | NREL

    Science.gov Websites

    Measurement and resource data, models, and tools related to U.S. biomass, geothermal, solar, and wind energy resources, intended to help energy system designers, building architects and engineers, renewable energy analysts, and others accelerate the integration of renewable energy technologies on the grid.

  19. Assessment and Planning Using Portfolio Analysis

    ERIC Educational Resources Information Center

    Roberts, Laura B.

    2010-01-01

    Portfolio analysis is a simple yet powerful management tool. Programs and activities are placed on a grid with mission along one axis and financial return on the other. The four boxes of the grid (low mission, low return; high mission, low return; high return, low mission; high return, high mission) help managers identify which programs might be…

  20. Visual enhancements in pick-and-place tasks: Human operators controlling a simulated cylindrical manipulator

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Tendick, Frank; Stark, Lawrence

    1989-01-01

    A teleoperation simulator was constructed with a vector display system, joysticks, and a simulated cylindrical manipulator in order to quantitatively evaluate various display conditions. The first of two experiments investigated the effects of perspective parameter variations on human operators' pick-and-place performance using a monoscopic perspective display. The second experiment compared visual enhancements of the monoscopic perspective display, made by adding a grid and reference lines, with visual enhancements of a stereoscopic display. The results indicate that stereoscopy generally permits superior pick-and-place performance, but that monoscopy nevertheless allows equivalent performance when defined with appropriate perspective parameter values and adequate visual enhancements.

  1. A tool for optimization of the production and user analysis on the Grid

    NASA Astrophysics Data System (ADS)

    Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores

    2011-12-01

    With the LHC and ALICE entering full operation and production modes, the amount of simulation, RAW data processing, and end-user analysis computational tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data, and methods used to analyze the end result, required the development and deployment of new tools in addition to the already existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). The LPM automatically submits jobs to the Grid based on triggers and conditions, for example after the completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM provides a fully authenticated interface to the AliEn Grid catalogue for browsing and downloading files, and in the near future it will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs; a minimal sketch of the trigger-driven submission loop follows.
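
    Every name in the following Python sketch is hypothetical; the real LPM is integrated with AliEn and its run/job bookkeeping, which this loop only caricatures:

      import time

      def lpm_loop(poll_completed_runs, submit_grid_job, publish_status, interval=60):
          # poll for newly completed physics runs and submit follow-up Grid jobs
          seen = set()
          while True:
              for run in poll_completed_runs():      # hypothetical data-taking feed
                  if run["id"] not in seen:
                      job = submit_grid_job(run)     # hand off to the Grid middleware
                      publish_status(run["id"], job) # expose progress on the web
                      seen.add(run["id"])
              time.sleep(interval)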

  2. Interactive Graphics Tools for Analysis of MOLA and Other Data

    NASA Technical Reports Server (NTRS)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  3. Fertilizer Emission Scenario Tool for crop management system scenarios

    EPA Pesticide Factsheets

    The Fertilizer Emission Scenario Tool for CMAQ is a high-end computer interface that simulates daily fertilizer application information for any gridded domain. It integrates the Weather Research and Forecasting model and CMAQ.

  4. Grid Integration Research | Wind | NREL

    Science.gov Websites

    Wind power plant modeling and simulation: engineers at the National Renewable Energy Laboratory develop computer-generated simulations of wind turbines, the computer-aided engineering tool FAST, and their wind power plant simulation tool, Wind-Plant...

  5. The Human EST Ontology Explorer: a tissue-oriented visualization system for ontologies distribution in human EST collections.

    PubMed

    Merelli, Ivan; Caprera, Andrea; Stella, Alessandra; Del Corvo, Marcello; Milanesi, Luciano; Lazzari, Barbara

    2009-10-15

    The NCBI dbEST currently contains more than eight million human Expressed Sequence Tags (ESTs). This wide collection represents an important source of information for gene expression studies, provided it can be inspected according to biologically relevant criteria. EST data can be browsed using different dedicated web resources, which allow investigation of library-specific gene expression levels and comparisons among libraries, highlighting significant differences in gene expression. Nonetheless, no tool is available to examine the distribution of quantitative EST collections across Gene Ontology (GO) categories, nor to retrieve information concerning library-dependent EST involvement in metabolic pathways. In this work we present the Human EST Ontology Explorer (HEOE) http://www.itb.cnr.it/ptp/human_est_explorer, a web facility for comparison of expression levels among libraries from several healthy and diseased tissues. The HEOE provides library-dependent statistics on the distribution of sequences in the GO Directed Acyclic Graph (DAG) that can be browsed at each GO hierarchical level. The tool is based on large-scale BLAST annotation of EST sequences. Due to the huge number of input sequences, this BLAST analysis was performed with the aid of grid computing technology, which is particularly suitable for data-parallel tasks. Based on this annotation, library-specific distributions of ESTs in the GO graph were inferred. A pathway-based search interface was also implemented, for a quick evaluation of the representation of libraries in metabolic pathways. EST processing steps were integrated in a semi-automatic procedure that relies on Perl scripts and stores results in a MySQL database. A PHP-based web interface offers the possibility to simultaneously visualize, retrieve and compare data from the different libraries. Statistically significant differences in GO categories among user-selected libraries can also be computed. The HEOE provides an alternative and complementary way to inspect EST expression levels with respect to the approaches currently offered by other resources. Furthermore, BLAST computation on the whole human EST dataset was a suitable test of grid scalability in the context of large-scale bioinformatics analysis. The HEOE currently comprises sequence analyses from 70 non-normalized libraries, representing a comprehensive overview of healthy and unhealthy tissues. As the analysis procedure can be easily applied to other libraries, the number of represented tissues is intended to increase.

  6. A Lyapunov Function Based Remedial Action Screening Tool Using Real-Time Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitra, Joydeep; Ben-Idris, Mohammed; Faruque, Omar

    This report summarizes the outcome of a research project that comprised the development of a Lyapunov function based remedial action screening tool using real-time data (L-RAS). The L-RAS is an advanced computational tool that is intended to assist system operators in making real-time redispatch decisions to preserve power grid stability. The tool relies on screening contingencies using a homotopy method based on Lyapunov functions to avoid, to the extent possible, the use of time domain simulations. This enables transient stability evaluation at real-time speed without the use of massively parallel computational resources. The project combined the following components. 1. Development of a methodology for contingency screening using a homotopy method based on Lyapunov functions and real-time data. 2. Development of a methodology for recommending remedial actions based on the screening results. 3. Development of a visualization and operator interaction interface. 4. Testing of the screening tool, validation of control actions, and demonstration of project outcomes on a representative real system simulated on a Real-Time Digital Simulator (RTDS) cluster. The project was led by Michigan State University (MSU), where the theoretical models, including homotopy-based screening, trajectory correction using real-time data, and remedial action, were developed and implemented in the form of research-grade software. Los Alamos National Laboratory (LANL) contributed to the development of energy margin sensitivity dynamics, which constituted a part of the remedial action portfolio. Florida State University (FSU) and Southern California Edison (SCE) developed a model of the SCE system that was implemented on FSU's RTDS cluster to simulate real-time data; the data was streamed over the internet to MSU, where the L-RAS tool was executed, and remedial actions were communicated back to FSU to execute stabilizing controls on the simulated system. LCG Consulting developed the visualization and operator interaction interface, based on specifications provided by MSU. The project was performed from October 2012 to December 2016, at the end of which the L-RAS tool, as described above, was completed and demonstrated. The project resulted in the following innovations and contributions: (a) the L-RAS software prototype, tested on a simulated system, vetted by utility personnel, and potentially ready for wider testing and commercialization; (b) an RTDS-based test bed that can be used for future research in the field; (c) a suite of breakthrough theoretical contributions to the field of power system stability and control; and (d) a new tool for visualization of power system stability margins. While detailed descriptions of the development and implementation of the various project components have been provided in the quarterly reports, this final report provides an overview of the complete project and is demonstrated using public domain test systems commonly used in the literature. The SCE system, and demonstrations thereon, are not included in this report due to Critical Energy Infrastructure Information (CEII) restrictions.
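
    The screening idea (compare the system energy at fault clearing against a critical energy, and fall back to time-domain simulation only for marginal cases) can be illustrated with the classical single-machine infinite-bus energy function. The following textbook Python sketch shows the screening principle only; it is not the project's homotopy method:

      import numpy as np

      def transient_energy_margin(Pm, Pmax, M, delta_cl, omega_cl):
          # classical single-machine infinite-bus (SMIB) energy-function screen
          delta_s = np.arcsin(Pm / Pmax)       # stable equilibrium angle
          delta_u = np.pi - delta_s            # controlling unstable equilibrium

          def potential(d):
              return -Pm * (d - delta_s) - Pmax * (np.cos(d) - np.cos(delta_s))

          v_crit = potential(delta_u)                           # critical energy
          v_cl = 0.5 * M * omega_cl**2 + potential(delta_cl)    # energy at fault clearing
          return v_crit - v_cl   # > 0: likely stable, skip time-domain simulation

      # toy screen: Pm = 0.8 pu, Pmax = 1.8 pu, M = 0.05 pu-s^2, clearing state given
      print(transient_energy_margin(0.8, 1.8, 0.05, delta_cl=1.0, omega_cl=2.0))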

  7. Linked Environments for Atmospheric Discovery (LEAD): A Cyberinfrastructure for Mesoscale Meteorology Research and Education

    NASA Astrophysics Data System (ADS)

    Droegemeier, K.

    2004-12-01

    A new National Science Foundation Large Information Technology Research (ITR) grant - known as Linked Environments for Atmospheric Discovery (LEAD) - has been funded to facilitate the identification, access, preparation, assimilation, prediction, management, analysis, mining, and visualization of a broad array of meteorological data and model output, independent of format and physical location. A transforming element of LEAD is dynamic workflow orchestration and data management, which will allow use of analysis tools, forecast models, and data repositories as dynamically adaptive, on-demand systems that can a) change configuration rapidly and automatically in response to weather; b) continually be steered by new data; c) respond to decision-driven inputs from users; d) initiate other processes automatically; and e) steer remote observing technologies to optimize data collection for the problem at hand. Having been in operation for slightly more than a year, LEAD has created a technology roadmap and architecture for developing its capabilities and placing them within the academic and research environment. Further, much of the LEAD infrastructure being developed for the WRF model, particularly workflow orchestration, will play a significant role in the nascent WRF Developmental Test Bed Center located at NCAR. This paper updates the status of LEAD (e.g., the topics noted above), its ties with other community activities (e.g., CONDUIT, THREDDS, MADIS, NOMADS), and the manner in which LEAD technologies will be made available for general use. Each component LEAD application is being created as a standards-based Web service that can be run in stand-alone configuration or chained together to build an end-to-end environment for on-demand, real time NWP. We describe in this paper the concepts, implementation plans, and expected impacts of LEAD, the underpinning of which will be a series of interconnected, heterogeneous virtual IT "Grid environments" designed to provide a complete framework for mesoscale meteorology research and education. A set of Integrated Grid and Web Services Testbeds will maintain a rolling archive of several months of recent data, provide tools for operating on them, and serve as an infrastructure (i.e., a mini Grid) for developing distributed Web services capabilities. Education Testbeds will integrate education and outreach throughout the entire LEAD program, and will help shape LEAD research into applications that are congruent with pedagogic requirements, national standards, and evaluation metrics. Ultimately, the LEAD environments will enable researchers, educators, and students to run atmospheric models and other tools in much more realistic, real time settings than is now possible, with emphasis on the use of locally or otherwise uniquely available data.

  8. Image-guided laser projection for port placement in minimally invasive surgery.

    PubMed

    Marmurek, Jonathan; Wedlake, Chris; Pardasani, Utsav; Eagleson, Roy; Peters, Terry

    2006-01-01

    We present an application of an augmented reality laser projection system in which procedure-specific optimal incision sites, computed from pre-operative image acquisition, are superimposed on a patient to guide port placement in minimally invasive surgery. Tests were conducted to evaluate the fidelity of computed and measured port configurations, and to validate the accuracy with which a surgical tool-tip can be placed at an identified virtual target. A high resolution volumetric image of a thorax phantom was acquired using helical computed tomography imaging. Oriented within the thorax, a phantom organ with marked targets was visualized in a virtual environment. A graphical interface enabled marking the locations of target anatomy, and calculation of a grid of potential port locations along the intercostal rib lines. Optimal configurations of port positions and tool orientations were determined by an objective measure reflecting image-based indices of surgical dexterity, hand-eye alignment, and collision detection. Intra-operative registration of the computed virtual model and the phantom anatomy was performed using an optical tracking system. Initial trials demonstrated that computed and projected port placement provided direct access to target anatomy with an accuracy of 2 mm.

  9. Visualization Tools for Teaching Computer Security

    ERIC Educational Resources Information Center

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  10. Visualization and Analytics Tools for Infectious Disease Epidemiology: A Systematic Review

    PubMed Central

    Carroll, Lauren N.; Au, Alan P.; Detwiler, Landon Todd; Fu, Tsung-chieh; Painter, Ian S.; Abernethy, Neil F.

    2014-01-01

    Background A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) Identify public health user needs and preferences for infectious disease information visualization tools; (2) Identify existing infectious disease information visualization tools and characterize their architecture and features; (3) Identify commonalities among approaches applied to different data types; and (4) Describe tool usability evaluation efforts and barriers to the adoption of such tools. Methods We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. Results A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. Discussion and Conclusion As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. PMID:24747356

  11. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    PubMed

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Visual illusion of tool use recalibrates tactile perception

    PubMed Central

    Miller, Luke E.; Longo, Matthew R.; Saygin, Ayse P.

    2018-01-01

    Brief use of a tool recalibrates multisensory representations of the user’s body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and with equal magnitude as physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest visual tool-use signals play a critical role in driving tool embodiment. PMID:28196765

  13. Modernizing Distribution System Restoration to Achieve Grid Resiliency Against Extreme Weather Events: An Integrated Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Wang, Jianhui; Ton, Dan

    Recent severe power outages caused by extreme weather hazards have highlighted the importance and urgency of improving the resilience of the electric power grid. As distribution grids remain vulnerable to natural disasters, the power industry has focused on methods of restoring distribution systems after disasters in an effective and quick manner. The current distribution system restoration practice for utilities is mainly based on predetermined priorities and tends to be inefficient and suboptimal, and the lack of situational awareness after the hazard significantly delays the restoration process. As a result, customers may experience an extended blackout, which causes large economic loss. On the other hand, the emerging advanced devices and technologies enabled through grid modernization efforts have the potential to improve the distribution system restoration strategy. However, utilizing these resources to aid the utilities in better distribution system restoration decision-making in response to extreme weather events is a challenging task. Therefore, this paper proposes an integrated solution: a distribution system restoration decision support tool designed by leveraging resources developed for grid modernization. We first review the current distribution restoration practice and discuss why it is inadequate in response to extreme weather events. Then we describe how grid modernization efforts could benefit distribution system restoration, and we propose an integrated solution in the form of a decision support tool to achieve this goal. The advantages of the solution include improved situational awareness of the system damage status and better survivability for customers. The paper provides a comprehensive review of how existing methodologies in the literature could be leveraged to achieve these key advantages. The benefits of the developed system restoration decision support tool include the optimal and efficient allocation of repair crews and resources, the expediting of the restoration process, and the reduction of outage durations for customers in response to severe blackouts caused by extreme weather hazards.

  14. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent, high-frequency quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of stock market price series. In this paper, we describe a workflow and tool that allow a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis of frequent patterns in quantitative and text-oriented data using an existing a-priori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features, such as the degree of sentence nesting, noun phrase complexity, and vocabulary richness, are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence-visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains, such as data analysis of smart grids, cyber-physical systems, or the security of critical infrastructure, where the data consist of a combination of quantitative and textual time series data.
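
    The first analysis step (flagging intervals of interest in the price series, then counting what the news said on those days) is compact enough to sketch in Python; the thresholds and the token-level "feature" below are stand-ins for the paper's heuristics and a-priori mining:

      import numpy as np
      from collections import Counter

      def interesting_days(prices, window=5, thresh=0.05):
          # flag days whose relative change over `window` days exceeds the threshold
          p = np.asarray(prices, float)
          change = np.abs(p[window:] - p[:-window]) / p[:-window]
          return (np.nonzero(change > thresh)[0] + window).tolist()

      def cooccurring_terms(days, news_by_day, top=10):
          # count distinct terms in news published on the flagged days
          counts = Counter()
          for d in days:
              for doc in news_by_day.get(d, []):
                  counts.update(set(doc.lower().split()))
          return counts.most_common(top)

      prices = [100, 101, 100, 99, 100, 108, 109, 110, 109, 110]
      news = {5: ["central bank raises rates"], 7: ["earnings beat expectations"]}
      print(cooccurring_terms(interesting_days(prices), news))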

  15. SARS Grid--an AG-based disease management and collaborative platform.

    PubMed

    Hung, Shu-Hui; Hung, Tsung-Chieh; Juang, Jer-Nan

    2006-01-01

    This paper describes the development of the NCHC's Severe Acute Respiratory Syndrome (SARS) Grid project, an Access Grid (AG)-based disease management and collaborative platform that allowed SARS patients' medical data to be dynamically shared and discussed between hospitals and doctors using AG's video teleconferencing (VTC) capabilities. During the height of the SARS epidemic in Asia, SARS Grid and the SARShope website significantly curbed the spread of SARS by helping doctors manage the in-hospital and in-home care of quarantined SARS patients through medical data exchange and the monitoring of patients' symptoms. Now that the SARS epidemic has ended, the primary function of the SARS Grid project is that of a web-based informatics tool to increase public awareness of SARS and other epidemic diseases. Additionally, the SARS Grid project can be viewed and further studied as an outstanding model of epidemic disease prevention and/or containment.

  16. Methods for Data-based Delineation of Spatial Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, John E.

    In data analysis, it is often useful to delineate or segregate areas of interest from the general population of data in order to concentrate further analysis efforts on smaller areas. Three methods are presented here for automatically generating polygons around spatial data of interest. Each method addresses a distinct data type. These methods were developed for and implemented in the sample planning tool called Visual Sample Plan (VSP). Method A is used to delineate areas of elevated values in a rectangular grid of data (raster). The data used for this method are spatially related. Although VSP uses data from a kriging process for this method, it will work for any type of data that is spatially coherent and appears on a regular grid. Method B is used to surround areas of interest characterized by individual data points that are congregated within a certain distance of each other. Areas where data are “clumped” together spatially will be delineated. Method C is used to recreate the original boundary in a raster of data that separated data values from non-values. This is useful when a rectangular raster of data contains non-values (missing data) indicating that those cells were outside of some original boundary. If the original boundary is not delivered with the raster, this method will approximate the original boundary.
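
    The core of the Method A idea, grouping contiguous above-threshold cells of a regular grid, can be sketched as below. This is a minimal illustration under assumed inputs (a synthetic raster and an arbitrary threshold); VSP's actual polygon construction is more involved than labeling connected cell groups.

      # Minimal sketch of the Method A idea: threshold a regular grid and group
      # contiguous above-threshold cells into labeled regions.
      import numpy as np
      from scipy import ndimage

      def elevated_regions(raster, threshold):
          """Label connected components of cells exceeding the threshold."""
          mask = raster > threshold
          labels, n = ndimage.label(mask)  # 4-connectivity by default
          return [np.argwhere(labels == k) for k in range(1, n + 1)]

      grid = np.random.default_rng(0).gamma(2.0, size=(50, 50))  # synthetic raster
      regions = elevated_regions(grid, threshold=6.0)
      print(len(regions), "elevated regions found")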

  17. 3D data processing with advanced computer graphics tools

    NASA Astrophysics Data System (ADS)

    Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott

    2012-09-01

    Often, the 3-D raw data coming from an optical profilometer contain spiky noise and an irregular grid, which make the data difficult to analyze and, because of their enormous size, difficult to store. This paper addresses these two issues by substantially reducing the spiky noise in the 3-D raw data and by rapidly re-sampling the raw data onto regular grids at any pixel size and any orientation using advanced computer graphics tools. Experimental results will be presented to demonstrate the effectiveness of the proposed approach.
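
    One common way to treat both problems is to remove spikes with a robust median filter and then resample the scattered points onto a regular grid. The sketch below shows that generic approach; it stands in for the paper's graphics-based pipeline, which the abstract does not detail, and the filter size and deviation threshold are assumptions.

      # Despike with a median filter, then interpolate onto a regular grid.
      import numpy as np
      from scipy.ndimage import median_filter
      from scipy.interpolate import griddata

      def despike(z, size=3, k=5.0):
          """Replace points deviating more than k robust sigmas from the local median."""
          med = median_filter(z, size=size)
          mad = np.median(np.abs(z - med)) + 1e-12  # robust spread estimate
          out = z.copy()
          bad = np.abs(z - med) > k * 1.4826 * mad
          out[bad] = med[bad]
          return out

      def regrid(x, y, z, nx=200, ny=200):
          """Interpolate scattered (x, y, z) samples onto a regular nx-by-ny grid."""
          xi = np.linspace(x.min(), x.max(), nx)
          yi = np.linspace(y.min(), y.max(), ny)
          XI, YI = np.meshgrid(xi, yi)
          return XI, YI, griddata((x, y), z, (XI, YI), method="linear")

      rng = np.random.default_rng(1)  # synthetic scattered profilometer-like data
      x, y = rng.uniform(size=500), rng.uniform(size=500)
      z = despike(np.sin(6 * x) * np.cos(6 * y) + rng.normal(0, 0.02, 500))
      XI, YI, ZI = regrid(x, y, z)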

  18. Understanding Cognitive and Collaborative Work: Observations in an Electric Transmission Operations Control Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obradovich, Jodi H.

    2011-09-30

    This paper describes research that is part of an ongoing project to design tools to assist in the integration of renewable energy into the electric grid. These tools will support control room dispatchers in real-time system operations of the electric power transmission system which serves much of the Western United States. Field observations comprise the first phase of this research in which 15 operators have been observed over various shifts and times of day for approximately 90 hours. Findings describing some of the cognitive and environmental challenges of managing the dynamically changing electric grid are presented.

  19. Experimental and analytical study of close-coupled ventral nozzles for ASTOVL aircraft

    NASA Technical Reports Server (NTRS)

    Mcardle, Jack G.; Smith, C. Frederic

    1990-01-01

    Flow in a generic ventral nozzle system was studied experimentally and analytically with a block version of the PARC3D computational fluid dynamics program (a full Navier-Stokes equation solver) in order to evaluate the program's ability to predict system performance and internal flow patterns. For the experimental work, a one-third-size model tailpipe with a single large rectangular ventral nozzle mounted normal to the tailpipe axis was tested with unheated air at steady-state pressure ratios up to 4.0. The end of the tailpipe was closed to simulate a blocked exhaust nozzle. Measurements showed about 5 1/2 percent flow-turning loss, reasonable nozzle performance coefficients, and a significant aftward axial component of thrust due to flow turning more than 90 deg. Flow behavior into and through the ventral duct is discussed and illustrated with paint-streak flow visualization photographs. For the analytical work, the same ventral system configuration was modeled with two computational grids to evaluate the effect of grid density. Both grids gave good results. The finer-grid solution produced more detailed flow patterns and predicted performance parameters, such as thrust and discharge coefficient, within 1 percent of the measured values. PARC3D flow visualization images are shown for comparison with the paint-streak photographs. Modeling and computational issues encountered in the analytical work are discussed.

  20. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by Unix-based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication of each node to a central controller. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), putting task and job submission effortlessly within reach for users already using that tool for traditional Grid or local cluster job submission. We discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.
