Sample records for scientific visualization applications

  1. Standardization of Color Palettes for Scientific Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulesza, Joel A.; Spencer, Joshua Bradly; Sood, Avneet

    The purpose of this white paper is to demonstrate the importance of color palette choice in scientific visualizations and to promote an effort to convene an interdisciplinary team of researchers to study and recommend color palettes based on intended application(s) and audience(s).
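The point about palette choice can be made concrete with a small sketch (Python with NumPy and Matplotlib assumed; the luma weights and bin count are illustrative, not from the white paper): a perceptually designed palette such as viridis rises steadily in lightness, while a classic rainbow palette ("jet") brightens toward the middle and darkens again, which can create artificial features in rendered data.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless; no display needed

def luma_profile(cmap_name, n=256):
    """Approximate perceived lightness (Rec. 601 luma) of each colormap entry."""
    cmap = matplotlib.colormaps[cmap_name]
    rgb = cmap(np.linspace(0.0, 1.0, n))[:, :3]
    return 0.299 * rgb[:, 0] + 0.587 * rgb[:, 1] + 0.114 * rgb[:, 2]

viridis = luma_profile("viridis")
jet = luma_profile("jet")

print(viridis[0] < viridis[-1])               # viridis: dark to light overall
print(jet[128] > jet[0], jet[128] > jet[-1])  # jet: brightest in the middle
```

A proper study would use a perceptually uniform lightness measure (e.g. CIELAB L*) rather than luma, but the qualitative contrast between the two palettes is the same.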

  2. Integrating visualization and interaction research to improve scientific workflows.

    PubMed

    Keefe, Daniel F

    2010-01-01

    Scientific-visualization research is, nearly by necessity, interdisciplinary. In addition to their collaborators in application domains (for example, cell biology), researchers regularly build on close ties with disciplines related to visualization, such as graphics, human-computer interaction, and cognitive science. One of these ties is the connection between visualization and interaction research. This isn't a new direction for scientific visualization (see the "Early Connections" sidebar). However, momentum recently seems to be increasing toward integrating visualization research (for example, effective visual presentation of data) with interaction research (for example, innovative interactive techniques that facilitate manipulating and exploring data). We see evidence of this trend in several places, including the visualization literature and conferences.

  3. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk; Maynard, Robert

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from the predominant DOE projects for visualization on accelerators, combining their respective features into a new visualization toolkit called VTK-m.

  4. Application of Andrew's Plots to Visualization of Multidimensional Data

    ERIC Educational Resources Information Center

    Grinshpun, Vadim

    2016-01-01

    Importance: The article raises the point of visual representation of big data, recently in demand for many scientific and real-life applications, and analyzes particulars of the visualization of multi-dimensional data, giving examples of visual analytics-related problems. Objectives: The purpose of this paper is to study application…
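As a sketch of the technique this record discusses: an Andrews plot maps each d-dimensional observation x = (x1, ..., xd) to a single curve f(t) = x1/√2 + x2·sin(t) + x3·cos(t) + x4·sin(2t) + ..., so similar observations trace similar curves. A minimal version (Python with NumPy assumed; the sample points are made up for illustration):

```python
import numpy as np

def andrews_curve(x, t):
    """Andrews transform of one d-dimensional point x at angles t:
    f(t) = x1/sqrt(2) + x2*sin(t) + x3*cos(t) + x4*sin(2t) + ..."""
    x = np.asarray(x, dtype=float)
    t = np.asarray(t, dtype=float)
    f = np.full_like(t, x[0] / np.sqrt(2.0))
    for i, xi in enumerate(x[1:], start=1):
        k = (i + 1) // 2                               # harmonic: 1, 1, 2, 2, 3, ...
        f += xi * (np.sin(k * t) if i % 2 else np.cos(k * t))
    return f

# Two nearby points yield nearby curves; a distant point stands apart.
t = np.linspace(-np.pi, np.pi, 200)
a = andrews_curve([1.0, 2.0, 0.5], t)
b = andrews_curve([1.1, 2.1, 0.4], t)
c = andrews_curve([5.0, -3.0, 2.0], t)
print(np.abs(a - b).max() < np.abs(a - c).max())   # True
```

Plotting one such curve per observation over t in [-π, π] gives the Andrews plot; clusters in the data appear as bundles of similar curves.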

  5. Enhancements to VTK enabling Scientific Visualization in Immersive Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick; Jhaveri, Sankhesh; Chaudhary, Aashish

    Modern scientific, engineering and medical computational simulations, as well as experimental and observational data sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch, a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has been attempted with only varying degrees of success. In this paper, we demonstrate two new approaches to simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both Vrui and OpenVR immersive environments in example applications.

  6. Porting the AVS/Express scientific visualization software to Cray XT4.

    PubMed

    Leaver, George W; Turner, Martin J; Perrin, James S; Mummery, Paul M; Withers, Philip J

    2011-08-28

    Remote scientific visualization, where rendering services are provided by larger scale systems than are available on the desktop, is becoming increasingly important as dataset sizes increase beyond the capabilities of desktop workstations. Uptake of such services relies on access to suitable visualization applications and the ability to view the resulting visualization in a convenient form. We consider five rules from the e-Science community to meet these goals with the porting of a commercial visualization package to a large-scale system. The application uses message-passing interface (MPI) to distribute data among data processing and rendering processes. The use of MPI in such an interactive application is not compatible with restrictions imposed by the Cray system being considered. We present details, and performance analysis, of a new MPI proxy method that allows the application to run within the Cray environment yet still support MPI communication required by the application. Example use cases from materials science are considered.

  7. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented using the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  8. Integrating advanced visualization technology into the planetary Geoscience workflow

    NASA Astrophysics Data System (ADS)

    Huffman, John; Forsberg, Andrew; Loomis, Andrew; Head, James; Dickson, James; Fassett, Caleb

    2011-09-01

    Recent advances in computer visualization have allowed us to develop new tools for analyzing the data gathered during planetary missions, which is important, since these data sets have grown exponentially in recent years to tens of terabytes in size. As part of the Advanced Visualization in Solar System Exploration and Research (ADVISER) project, we utilize several advanced visualization techniques created specifically with planetary image data in mind. The Geoviewer application allows real-time active stereo display of images, which in aggregate have billions of pixels. The ADVISER desktop application platform allows fast three-dimensional visualization of planetary images overlain on digital terrain models. Both applications include tools for easy data ingest and real-time analysis in a programmatic manner. Incorporation of these tools into our everyday scientific workflow has proved important for scientific analysis, discussion, and publication, and enabled effective and exciting educational activities for students from high school through graduate school.

  9. Visual Data Exploration and Analysis - Report on the Visualization Breakout Session of the SCaLeS Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Frank, Randy; Fulcomer, Sam

    Scientific visualization is the transformation of abstract information into images, and it plays an integral role in the scientific process by facilitating insight into observed or simulated phenomena. Visualization as a discipline spans many research areas, from computer science and cognitive psychology to art. Yet the most successful visualization applications are created when close synergistic interactions with domain scientists are part of the algorithmic design and implementation process, leading to visual representations with clear scientific meaning. Visualization is used to explore, to debug, to gain understanding, and as an analysis tool. Visualization is literally everywhere--images are present in this report, on television, on the web, in books and magazines--the common theme is the ability to present information visually that is rapidly assimilated by human observers and transformed into understanding or insight. As an indispensable part of a modern science laboratory, visualization is akin to the biologist's microscope or the electrical engineer's oscilloscope. Whereas the microscope is limited to small specimens and the use of optics to focus light, the power of scientific visualization is virtually limitless: visualization provides the means to examine data at galactic or atomic scales, or at any size in between. Unlike the traditional scientific tools for visual inspection, visualization offers the means to "see the unseeable." Trends in demographics or changes in levels of atmospheric CO2 as a function of greenhouse gas emissions are familiar examples of such unseeable phenomena. Over time, visualization techniques evolve in response to scientific need. Each scientific discipline has its "own language," verbal and visual, used for communication. The visual language for depicting electrical circuits is much different than the visual language for depicting theoretical molecules or trends in the stock market. There is no "one visualization tool" that can serve as a panacea for all science disciplines. Instead, visualization researchers work hand in hand with domain scientists as part of the scientific research process to define, create, adapt and refine software that "speaks the visual language" of each scientific domain.

  10. Virtual Environments in Scientific Visualization

    NASA Technical Reports Server (NTRS)

    Bryson, Steve; Lisinski, T. A. (Technical Monitor)

    1994-01-01

    Virtual environment technology is a new way of approaching the interface between computers and humans. Emphasizing display and user control that conforms to the user's natural ways of perceiving and thinking about space, virtual environment technologies enhance the ability to perceive and interact with computer-generated graphic information. This enhancement potentially has a major effect on the field of scientific visualization. Current examples of this technology include the Virtual Windtunnel being developed at NASA Ames Research Center. Other major institutions such as the National Center for Supercomputing Applications and SRI International are also exploring this technology. This talk describes several implementations of virtual environments for use in scientific visualization. Examples include the visualization of unsteady fluid flows (the virtual windtunnel), the visualization of geodesics in curved spacetime, surface manipulation, and examples developed at various laboratories.

  11. Scientific Visualization Using the Flow Analysis Software Toolkit (FAST)

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Kelaita, Paul G.; Mccabe, R. Kevin; Merritt, Fergus J.; Plessel, Todd C.; Sandstrom, Timothy A.; West, John T.

    1993-01-01

    Over the past few years the Flow Analysis Software Toolkit (FAST) has matured into a useful tool for visualizing and analyzing scientific data on high-performance graphics workstations. Originally designed for visualizing the results of fluid dynamics research, FAST has demonstrated its flexibility by being used in several other areas of scientific research. These research areas include earth and space sciences, acid rain and ozone modelling, and automotive design, to name a few. This paper describes the current status of FAST, including the basic concepts, architecture, existing functionality and features, and some of the known applications for which FAST is being used. A few of the applications, by both NASA and non-NASA agencies, are outlined in more detail. Described in these outlines are the goals of each visualization project, the techniques or 'tricks' used to produce the desired results, and any custom modifications to FAST done to further enhance the analysis. Some of the future directions for FAST are also described.

  12. The 3D widgets for exploratory scientific visualization

    NASA Technical Reports Server (NTRS)

    Herndon, Kenneth P.; Meyer, Tom

    1995-01-01

    Computational fluid dynamics (CFD) techniques are used to simulate flows of fluids like air or water around such objects as airplanes and automobiles. These techniques usually generate very large amounts of numerical data which are difficult to understand without using graphical scientific visualization techniques. There are a number of commercial scientific visualization applications available today which allow scientists to control visualization tools via textual and/or 2D user interfaces. However, these user interfaces are often difficult to use. We believe that 3D direct-manipulation techniques for interactively controlling visualization tools will provide opportunities for powerful and useful interfaces with which scientists can more effectively explore their datasets. A few systems have been developed which use these techniques. In this paper, we will present a variety of 3D interaction techniques for manipulating parameters of visualization tools used to explore CFD datasets, and discuss in detail various techniques for positioning tools in a 3D scene.

  13. Visualization and Enabling Science at PO.DAAC

    NASA Astrophysics Data System (ADS)

    Tauer, E.; To, C.

    2017-12-01

    Facilitating the identification of appropriate data for scientific inquiry is important for efficient progress, but mechanisms for that identification vary, as does the effectiveness of those mechanisms. Appropriately crafted visualizations provide the means to quickly assess science data and scientific features, but providing the right visualization to the right application can present challenges. Even greater is the challenge of generating and/or re-constituting visualizations on the fly, particularly for large datasets. One avenue to mitigate the challenge is to arrive at an optimized intermediate data format that is tuned for rapid processing without sacrificing the provenance trace back to the original source data. This presentation will discuss the results of a trade study of several current approaches to an intermediate data format, and suggest a list of key attributes that will facilitate rapid visualization, and in the process, facilitate the identification of the right data for a given application.

  14. Accessing and visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, G. Bruce; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids.

  15. Visualizing inequality

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2016-07-01

    The study of socioeconomic inequality is of substantial importance, scientific and general alike. The graphic visualization of inequality is commonly conveyed by Lorenz curves. While Lorenz curves are a highly effective statistical tool for quantifying the distribution of wealth in human societies, they are a less effective tool for the visual depiction of socioeconomic inequality. This paper introduces an alternative to Lorenz curves: the hill curves. On the one hand, the hill curves are a potent scientific tool: they provide detailed scans of the rich-poor gaps in human societies under consideration, and are capable of accommodating infinitely many degrees of freedom. On the other hand, the hill curves are a powerful infographic tool: they visualize inequality in a most vivid and tangible way, with no quantitative skills required to grasp the visualization. The application of hill curves extends far beyond socioeconomic inequality. Indeed, the hill curves are highly effective 'hyperspectral' measures of statistical variability that are applicable in the context of size distributions at large. This paper establishes the notion of hill curves, analyzes them, and describes their application in the context of general size distributions.
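For context on the baseline this paper builds on: a Lorenz curve plots the cumulative share of total wealth held by the poorest fraction of the population, and the Gini coefficient is twice the area between that curve and the equality diagonal. A minimal sketch (Python with NumPy assumed; the sample values are illustrative):

```python
import numpy as np

def lorenz_curve(wealth):
    """Cumulative wealth share of the poorest k/n of the population,
    for k = 0..n (starts at 0, ends at 1)."""
    w = np.sort(np.asarray(wealth, dtype=float))
    cum = np.cumsum(w)
    return np.insert(cum / cum[-1], 0, 0.0)

def gini(wealth):
    """Twice the area between the equality line and the Lorenz curve."""
    L = lorenz_curve(wealth)
    n = len(L) - 1
    area = np.sum((L[1:] + L[:-1]) / 2.0) / n   # trapezoid rule
    return 1.0 - 2.0 * area

print(gini([1, 1, 1, 1]))   # 0.0  (perfect equality)
print(gini([0, 0, 0, 1]))   # 0.75 (one person holds everything)
```

The hill curves the paper proposes are a different construction, but this is the standard visualization they are contrasted against.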

  16. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework integrates MATLAB with the visualization, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics.
    To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
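The index/query idea behind systems like FastBit can be illustrated with a toy bitmap index (Python with NumPy assumed; all values below are made up, and FastBit itself uses compressed bitmaps and is far more sophisticated): each value bin gets a boolean mask over the records, and a range query reduces to bitwise operations over the masks instead of a scan with per-record comparisons.

```python
import numpy as np

# Toy particle data: energies of 10 particles (hypothetical values).
energy = np.array([0.1, 5.2, 9.8, 3.3, 7.7, 0.4, 8.9, 2.2, 6.1, 9.1])

# Build a tiny bitmap index: one boolean "bitmap" per energy bin.
edges = [0.0, 2.5, 5.0, 7.5, 10.0]
bitmaps = [(energy >= lo) & (energy < hi)
           for lo, hi in zip(edges[:-1], edges[1:])]

# The range query "E >= 5.0" becomes a bitwise OR of the covering bitmaps.
selection = bitmaps[2] | bitmaps[3]
print(np.flatnonzero(selection))   # [1 2 4 6 8 9]
```

Precomputing the bitmaps once lets many such queries run over billions of particles without touching the raw floating-point data, which is the property that makes interactive beam selection feasible.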

  17. Parallelization and Visual Analysis of Multidimensional Fields: Application to Ozone Production, Destruction, and Transport in Three Dimensions

    NASA Technical Reports Server (NTRS)

    Schwan, Karsten

    1997-01-01

    This final report has four sections. We first describe the actual scientific results attained by our research team, followed by a description of the high performance computing research enhancing those results and prompted by the scientific tasks being undertaken. Next, we describe our research in data and program visualization motivated by the scientific research and also enabling it. Last, we comment on the indirect effects this research effort has had on our work, in terms of follow up or additional funding, student training, etc.

  18. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations:
    • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
    • To develop an extensible library that can combine data from multiple sources and render them using multiple backends
    • To build a library that works well with existing scientific visualization tools such as VTK
    We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.

  19. Architectural Visualization of C/C++ Source Code for Program Comprehension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panas, T; Epperly, T W; Quinlan, D

    2006-09-01

    Structural and behavioral visualization of large-scale legacy systems to aid program comprehension is still a major challenge. The challenge is even greater when applications are implemented in flexible and expressive languages such as C and C++. In this paper, we consider visualization of static and dynamic aspects of large-scale scientific C/C++ applications. For our investigation, we reuse and integrate specialized analysis and visualization tools. Furthermore, we present a novel layout algorithm that permits a compressive architectural view of a large-scale software system. Our layout is unique in that it allows traditional program visualizations, i.e., graph structures, to be seen in relation to the application's file structure.

  20. Final Report. Institute for Ultrascale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to deliver advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunity to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  1. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  2. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  3. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  4. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  5. Effectiveness of the use of question-driven levels of inquiry based instruction (QD-LOIBI) assisted visual multimedia supported teaching material on enhancing scientific explanation ability senior high school students

    NASA Astrophysics Data System (ADS)

    Suhandi, A.; Muslim; Samsudin, A.; Hermita, N.; Supriyatman

    2018-05-01

    In this study, the effectiveness of Question-Driven Levels of Inquiry Based Instruction (QD-LOIBI) assisted by visual-multimedia-supported teaching materials in enhancing senior high school students' scientific explanation ability has been studied. QD-LOIBI was designed following the five levels of inquiry proposed by Wenning. Visual multimedia used in the teaching materials included images (photos), virtual simulations and videos of phenomena. The QD-LOIBI teaching materials supported by visual multimedia were tried out on senior high school students at one high school in one district in West Java. A quasi-experimental method with one experimental group (n = 31) and one control group (n = 32) was used. The experimental group was given QD-LOIBI teaching material supported by visual multimedia, whereas the control group was given QD-LOIBI teaching materials not supported by visual multimedia. Data on scientific explanation ability in both groups were collected by a scientific explanation ability test in essay form concerning the kinetic theory of gases. The results showed that the number of students in the experimental class whose category and quality of scientific explanation increased is greater than in the control class. These results indicate that the use of multimedia-supported instructional materials developed for implementation of QD-LOIBI can improve students’ ability to provide explanations supported by scientific evidence gained from practicum activities and applicable concepts, laws, principles or theories.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, W.

    Building something which could be called "virtual reality" (VR) is something of a challenge, particularly when nobody really seems to agree on a definition of VR. The author wanted to combine scientific visualization with VR, resulting in an environment useful for assisting scientific research, and demonstrates the combination in a prototype application. The VR application consists of a dataflow-based system for performing scientific visualization (AVS), extensions to that system to support VR input devices, and a numerical simulation ported into the dataflow environment. The VR system includes two inexpensive, off-the-shelf VR devices and some custom code; a working system was assembled with about two man-months of effort. The system allows the user to specify parameters for a chemical flooding simulation, as well as some viewing parameters, using VR input devices, and to view the output using VR output devices. In chemical flooding, a subsurface region contains chemicals which are to be removed; secondary oil recovery and environmental remediation are typical applications. The process assumes one or more injection wells and one or more production wells: chemicals or water are pumped into the ground, mobilizing and displacing hydrocarbons or contaminants. The placement of the production and injection wells, and other parameters of the wells, are the most important variables in the simulation.
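
    The AVS-style dataflow model described in this record, where modules are wired source-to-sink and data is pushed downstream on update, can be sketched in a few lines. The classes and pipeline below are illustrative only and are not the AVS API:

```python
# Minimal sketch of an AVS-style dataflow pipeline: modules are nodes
# connected source-to-sink, and data is pushed downstream on update.
# All class and function names here are illustrative, not AVS APIs.

class Module:
    def __init__(self, func):
        self.func = func          # transformation applied by this module
        self.downstream = []      # modules fed by this one

    def connect(self, other):
        self.downstream.append(other)
        return other              # allow chained wiring

    def push(self, data):
        result = self.func(data)
        for mod in self.downstream:
            mod.push(result)
        return result

# Wire a three-stage pipeline: simulate -> filter -> render
collected = []
simulate = Module(lambda params: [params["wells"] * i for i in range(4)])
smooth   = Module(lambda values: [v / 2 for v in values])
render   = Module(lambda values: collected.extend(values) or values)

simulate.connect(smooth).connect(render)
simulate.push({"wells": 3})
print(collected)  # [0.0, 1.5, 3.0, 4.5]
```

    Extending such a network with VR input devices, as the record describes, amounts to adding source modules that emit device state into the same push mechanism.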

  7. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

    The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE's application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years, members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware's contributions over the last 5 years (February 2012 – February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  8. 77 FR 12240 - Application(s) for Duty-Free Entry of Scientific Instruments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... experiments will consist of direct visual observations of fluorescently tagged DNA and DNA-bound protein... including animal tissues, bacteria, insects and parasites, involving the examination of their morphological...

  9. Three-dimensional user interfaces for scientific visualization

    NASA Technical Reports Server (NTRS)

    VanDam, Andries (Principal Investigator)

    1996-01-01

    The focus of this grant was to experiment with novel user interfaces for scientific visualization applications using both desktop and virtual reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past three years, and subsumes all prior reports.

  10. Scientific Computing Paradigm

    NASA Technical Reports Server (NTRS)

    VanZandt, John

    1994-01-01

    The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.

  11. The social computing room: a multi-purpose collaborative visualization environment

    NASA Astrophysics Data System (ADS)

    Borland, David; Conway, Michael; Coposky, Jason; Ginn, Warren; Idaszak, Ray

    2010-01-01

    The Social Computing Room (SCR) is a novel collaborative visualization environment for viewing and interacting with large amounts of visual data. The SCR consists of a square room with 12 projectors (3 per wall) used to display a single 360-degree desktop environment that provides a large physical real estate for arranging visual information. The SCR was designed to be cost-effective, collaborative, configurable, widely applicable, and approachable for naive users. Because the SCR displays a single desktop, a wide range of applications is easily supported, making it possible for a variety of disciplines to take advantage of the room. We provide a technical overview of the room and highlight its application to scientific visualization, arts and humanities projects, research group meetings, and virtual worlds, among other uses.
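
    The SCR's single 360-degree desktop spread across 12 projectors (3 per wall) can be illustrated with a small coordinate mapping; the per-projector resolution below is an assumption for the sketch, not a figure from the paper:

```python
# Illustrative mapping from a global desktop x-coordinate to the wall and
# projector that display it, for a 4-wall room with 3 projectors per wall.
# The per-projector width (1024 px) is an assumption for this sketch.
PROJ_WIDTH = 1024
PROJ_PER_WALL = 3
WALLS = ["north", "east", "south", "west"]

def locate(x):
    """Return (wall, projector index on that wall, local x) for desktop x."""
    total = PROJ_WIDTH * PROJ_PER_WALL * len(WALLS)
    x %= total                      # the desktop wraps 360 degrees around the room
    proj = x // PROJ_WIDTH          # global projector index, 0..11
    wall = WALLS[proj // PROJ_PER_WALL]
    return wall, proj % PROJ_PER_WALL, x % PROJ_WIDTH

print(locate(0))      # ('north', 0, 0)
print(locate(5000))   # ('east', 1, 904)
```

    Because a single desktop spans all projectors, any unmodified application window can be dragged anywhere on this wrapped surface, which is what makes the room broadly applicable.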

  12. Visualizing planetary data by using 3D engines

    NASA Astrophysics Data System (ADS)

    Elgner, S.; Adeli, S.; Gwinner, K.; Preusker, F.; Kersten, E.; Matz, K.-D.; Roatsch, T.; Jaumann, R.; Oberst, J.

    2017-09-01

    We examined 3D gaming engines for their usefulness in visualizing large planetary image data sets. These tools allow us to include recent developments in the field of computer graphics in our scientific visualization systems and to present data products interactively and in higher quality than before. We have started to set up the first applications that will make use of virtual reality (VR) equipment.

  13. Mapping scientific frontiers : the quest for knowledge visualization.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyack, Kevin W.

    Visualization of scientific frontiers is a relatively new field, yet it has a long history and many predecessors. The application of science to science itself has been undertaken for decades with notable early contributions by Derek Price, Thomas Kuhn, Diana Crane, Eugene Garfield, and many others. What is new is the field of information visualization and the application of its techniques to help us understand the process of science in the making. In his new book, Chaomei Chen takes us on a journey through this history, touching on predecessors, and then leading us firmly into the new world of Mapping Scientific Frontiers. Building on the foundation of his earlier book, Information Visualization and Virtual Environments, Chen's new offering is much less a tutorial in how to do information visualization, and much more a conceptual exploration of why and how the visualization of science can change the way we do science, amplified by real examples. Chen's stated intents for the book are: (1) to focus on principles of visual thinking that enable the identification of scientific frontiers; (2) to introduce a way to systematize the identification of scientific frontiers (or paradigms) through visualization techniques; and (3) to stimulate interdisciplinary research between information visualization and information science researchers. On all these counts, he succeeds. Chen's book can be broken into two parts which focus on the first two purposes stated above. The first, consisting of the initial four chapters, covers history and predecessors. Kuhn's theory of normal science punctuated by periods of revolution, now commonly known as paradigm shifts, motivates the work. Relevant predecessors outside the traditional field of information science such as cartography (both terrestrial and celestial), mapping the mind, and principles of visual association and communication, are given ample coverage. 
Chen also describes enabling techniques known to information scientists, such as multi-dimensional scaling, advanced dimensional reduction, social network analysis, Pathfinder network scaling, and landscape visualizations. No algorithms are given here; rather, these techniques are described from the point of view of enabling 'visual thinking'. The Generalized Similarity Analysis (GSA) technique used by Chen in his recent published papers is also introduced here. Information and computer science professionals would be wise not to skip through these early chapters. Although principles of gestalt psychology, cartography, thematic maps, and association techniques may be outside their technology comfort zone, or interest, these predecessors lay a groundwork for the 'visual thinking' that is required to create effective visualizations. Indeed, the great challenge in information visualization is to transform the abstract and intangible into something visible, concrete, and meaningful to the user. The second part of the book, covering the final three chapters, extends the mapping metaphor into the realm of scientific discovery through the structuring of literatures in a way that enables us to see scientific frontiers or paradigms. Case studies are used extensively to show the logical progression that has been made in recent years to get us to this point. Homage is paid to giants of the last 20 years including Michel Callon for co-word mapping, Henry Small for document co-citation analysis and specialty narratives (charting a path linking the different sciences), and Kate McCain for author co-citation analysis, whose work has led to the current state-of-the-art. The last two chapters finally answer the question - 'What does a scientific paradigm look like?' The visual answer given is specific to the GSA technique used by Chen, but does satisfy the intent of the book - to introduce a way to visually identify scientific frontiers. 
A variety of case studies, mostly from Chen's previously published work - supermassive black holes, cross-domain applications of Pathfinder networks, mass extinction debates, impact of Don Swanson's work, and mad cow disease and vCJD in humans - succeed in explaining how visualization can be used to show the development of, competition between, and eventual acceptance (or replacement) of scientific paradigms. Although not addressed specifically, Chen's work nonetheless makes the persuasive argument that visual maps alone are not sufficient to explain 'the making of science' to a non-expert in a particular field. Rather, expert knowledge is still required to interpret these maps and to explain the paradigms. This combination of visual maps and expert knowledge, used jointly to good effect in the book, becomes a potent means for explaining progress in science to the expert and non-expert alike. Work to extend the GSA technique to explore latent domain knowledge (important work that falls below the citation thresholds typically used in GSA) is also explored here.
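
    Document co-citation analysis, one of the enabling techniques credited above to Henry Small, can be sketched concisely; the citation data here is invented for illustration:

```python
from itertools import combinations
from collections import Counter

# Toy citing records: each paper lists the references it cites.
papers = {
    "P1": {"A", "B", "C"},
    "P2": {"A", "B"},
    "P3": {"B", "C"},
    "P4": {"A", "C"},
}

# Co-citation count: the number of papers citing both references together.
cocitation = Counter()
for refs in papers.values():
    for pair in combinations(sorted(refs), 2):
        cocitation[pair] += 1

print(cocitation[("A", "B")])  # 2  (cited together by P1 and P2)
print(cocitation[("B", "C")])  # 2  (P1 and P3)
```

    Mapping tools of the kind Chen describes treat these pairwise counts as edge weights in a network, which is then laid out and pruned (e.g. by Pathfinder scaling) to reveal specialty structure.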

  14. Geoscience Through the Lens of Art: a collaborative course of science and art for undergraduates of various disciplines

    NASA Astrophysics Data System (ADS)

    Ellins, K. K.; Eriksson, S. C.; Samsel, F.; Lavier, L.

    2017-12-01

    A new undergraduate, upper-level geoscience course was developed and taught by faculty and staff of the UT Austin Jackson School of Geosciences, the Center for Agile Technology, and the Texas Advanced Computing Center. The course examined the role of the visual arts in placing the scientific process and knowledge in a broader context and introduced students to innovations in the visual arts that promote scientific investigation through collaboration between geoscientists and artists. The course addressed (1) the role of the visual arts in teaching geoscience concepts and promoting geoscience learning; (2) the application of innovative visualization and artistic techniques to large volumes of geoscience data to enhance scientific understanding and to move scientific investigation forward; and (3) the illustrative power of art to communicate geoscience to the public. In-class activities and discussions, computer lab instruction on the application of ParaView software, reading assignments, lectures, and group projects with presentations comprised the two-credit, semester-long "special topics" course, which was taken by geoscience, computer science, and engineering students. Assessment of student learning was carried out by the instructors, and course evaluation was done by an external evaluator using rubrics, Likert-scale surveys, and focus groups. The course achieved its goal of teaching students the concepts and techniques of the visual arts. The final projects demonstrated this, along with the communication of geologic concepts using what the students had learned in the course. The basic skill of sketching for learning and the use of best practices in visual communication were used extensively and, in most cases, very effectively. The use of an advanced visualization tool, ParaView, was received with mixed reviews because of the lack of time to really learn the tool and the fact that it is not a tool used routinely in geoscience. 
Those senior students with advanced computer skills saw the importance of this tool. Students worked in teams, more or less effectively, and made suggestions for improving future offerings of the course.

  15. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    NASA Astrophysics Data System (ADS)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    2017-10-01

    ParaView is a high-performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware that involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories, and it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves its speed and efficiency with state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. We describe connecting ParaView to the Fermilab art framework and discuss the capabilities it brings.

  16. AI applications to conceptual aircraft design

    NASA Technical Reports Server (NTRS)

    Chalfan, Kathryn M.

    1990-01-01

    This paper presents in viewgraph form several applications of artificial intelligence (AI) to the conceptual design of aircraft, including: an access manager for automated data management, AI techniques applied to optimization, and virtual reality for scientific visualization of the design prototype.

  17. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.; Silva, Claudio

    2013-09-30

    For the past three years, a large analysis and visualization effort funded by the Department of Energy's Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA) has brought together a wide variety of industry-standard scientific computing libraries and applications to create the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT), serving the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.

  18. A knowledge based system for scientific data visualization

    NASA Technical Reports Server (NTRS)

    Senay, Hikmet; Ignatius, Eve

    1992-01-01

    A knowledge-based system, called visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations which decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set using principles of visual perception, the system also allows users to interactively modify the design and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques applicable in earth and space sciences, although it may easily be extended to include techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis, and medical imaging.
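
    The VISTA design sequence described above (partition the data, then map partitions to visualization primitives) can be illustrated with a toy rule base; the rules and attribute names here are hypothetical and are not drawn from the actual system:

```python
# Minimal sketch of a VISTA-style rule base: data characteristics are
# matched against rules that suggest visualization primitives. The rules
# and attribute names are illustrative, not taken from the actual system.
RULES = [
    ({"type": "scalar", "dims": 2}, "pseudocolor image"),
    ({"type": "scalar", "dims": 3}, "isosurface"),
    ({"type": "vector", "dims": 2}, "arrow glyphs"),
    ({"type": "vector", "dims": 3}, "streamlines"),
]

def suggest(partition):
    """Return primitives whose conditions all match the data partition."""
    return [prim for cond, prim in RULES
            if all(partition.get(k) == v for k, v in cond.items())]

# A data set decomposed into partitions, each mapped independently; the
# per-partition suggestions would then be combined into one composite design.
dataset = [{"type": "scalar", "dims": 3}, {"type": "vector", "dims": 2}]
design = [suggest(p) for p in dataset]
print(design)  # [['isosurface'], ['arrow glyphs']]
```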

  19. Virtual reality at work

    NASA Technical Reports Server (NTRS)

    Brooks, Frederick P., Jr.

    1991-01-01

    The utility of virtual reality computer graphics in telepresence applications is not hard to grasp and promises to be great. When the virtual world is entirely synthetic, as opposed to real but remote, the utility is harder to establish. Vehicle simulators for aircraft, vessels, and motor vehicles are proving their worth every day. Entertainment applications such as Disney World's StarTours are technologically elegant, good fun, and economically viable. Nevertheless, some of us have no real desire to spend our lifework serving the entertainment craze of our sick culture; we want to see this exciting technology put to work in medicine and science. The topics covered include the following: testing a force display for scientific visualization -- molecular docking; and testing a head-mounted display for scientific and medical visualization.


  1. Web-based visualization of very large scientific astronomy imagery

    NASA Astrophysics Data System (ADS)

    Bertin, E.; Pillay, R.; Marmo, C.

    2015-04-01

    Visualizing and navigating through large astronomy images from a remote location with current astronomy display tools can be a frustrating experience in terms of speed and ergonomics, especially on mobile devices. In this paper, we present a high-performance, versatile, and robust client-server system for remote visualization and analysis of extremely large scientific images. Applications of this work include survey image quality control, interactive data query and exploration, citizen science, and public outreach. The proposed software is entirely open source and is designed to be generic and applicable to a variety of datasets. It provides access to floating-point data at terabyte scales, with the ability to precisely adjust image settings in real time. The proposed clients are light-weight, platform-independent web applications built on standard HTML5 web technologies and compatible with both touch and mouse-based devices. We put the system to the test, assess its performance, and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.
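
    Two ingredients of such a tiled remote-viewing system can be sketched: sizing a power-of-two tile pyramid for a very large image, and a real-time contrast stretch that maps floating-point pixel values to displayable 8-bit values. The tile size and sample values are assumptions for illustration, not parameters from the paper:

```python
# Sketch of two ingredients of a tiled remote-image viewer: the number of
# zoom levels needed for a large image, and a contrast stretch mapping
# 32-bit float pixel values to displayable 8-bit values.
import math

TILE = 256  # assumed tile edge length in pixels

def zoom_levels(width, height):
    """Levels in a power-of-two pyramid until the image fits in one tile."""
    return max(0, math.ceil(math.log2(max(width, height) / TILE))) + 1

def stretch(pixels, lo, hi):
    """Linearly map [lo, hi] to 0..255, clipping values outside the range."""
    scale = 255.0 / (hi - lo)
    return [min(255, max(0, round((p - lo) * scale))) for p in pixels]

print(zoom_levels(65536, 65536))           # 9 levels for a 64k x 64k survey image
print(stretch([0.0, 0.5, 2.0], 0.0, 1.0))  # [0, 128, 255]
```

    Adjusting image settings "in real time" then amounts to re-running only the cheap stretch step server-side, without touching the tile pyramid.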

  2. Data-proximate Visualization via Unidata Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.; Oxelson Ganter, J.; Weber, J.

    2016-12-01

    The rise of cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge then becomes creating tools which are cloud-ready. That challenge is addressed by Application Streaming, a technology that allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work examines the challenges associated with adapting the IDV to an application-streaming platform and includes a brief discussion of the underlying technologies involved.
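
    The containerized deployment the abstract describes might look like the following hypothetical Dockerfile. The base image, paths, and launch script are invented for illustration, and the actual streaming layer (not shown) would sit in front of the container:

```dockerfile
# Hypothetical container for a legacy Java visualization tool; all names
# and paths are illustrative, not Unidata's actual deployment.
FROM eclipse-temurin:11-jre
COPY idv/ /opt/idv/
# A virtual framebuffer lets the GUI render without a physical display.
RUN apt-get update && apt-get install -y xvfb && rm -rf /var/lib/apt/lists/*
EXPOSE 5900
CMD ["xvfb-run", "/opt/idv/runIDV"]
```

    The point of the pattern is that the legacy application itself is unmodified: the container supplies the display environment, and the streaming framework relays rendered frames and input events to whatever client the user holds.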

  3. Cloud-based data-proximate visualization and analysis

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2017-04-01

    The rise of cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge then becomes creating tools which are cloud-ready. That challenge is addressed by Application Streaming, a technology that allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work examines the challenges associated with adapting the IDV to an application-streaming platform and includes a brief discussion of the underlying technologies involved.

  4. VESL: The Virtual Earth System Laboratory for Ice Sheet Modeling and Visualization

    NASA Astrophysics Data System (ADS)

    Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.

    2017-12-01

    We present the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal. It allows for the dissemination of data, simulation of physical processes, and promotion of climate literacy. The current iteration leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Laboratory and UC Irvine. We use the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundle pre- and post-processing JavaScript scripts compatible with the existing ISSM Python/Matlab API. Researchers using VESL can effectively present their work for public dissemination with little to no additional post-processing. Moreover, the portal allows for real-time visualization and editing of models, cloud-based computational simulation, and downloads of relevant data, enabling faster publication in peer-reviewed journals and adaptation of results for educational applications. Through application of this concept to multiple aspects of the Earth system, VESL can broaden data applications in the geosciences and beyond. At this stage, we still seek feedback from the scientific and public outreach communities regarding VESL's ease of use and feature set. As we plan its expansion, we aim to achieve more rapid communication and presentation of scientific results.

  5. New Perspective on Visual Communication Design Education: An Empirical Study of Applying Narrative Theory to Graphic Design Courses

    ERIC Educational Resources Information Center

    Yang, Chao-Ming; Hsu, Tzu-Fan

    2017-01-01

    Visual communication design (VCD) is a form of nonverbal communication. The application of relevant linguistic or semiotic theories to VCD education renders graphic design an innovative and scientific discipline. In this study, actual teaching activities were examined to verify the feasibility of applying narrative theory to graphic design…

  6. Visiting Scholars Program Application | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Below are the scientific areas and programs in which the Frederick National Lab is actively seeking scholars to participate: Data Science and Information Technology (including Bioinformatics, Visualization, etc.) and Advanced Preclinical Research

  7. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    ParaView is a high-performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware that involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories, and it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves its speed and efficiency with state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. We describe connecting ParaView to the Fermilab art framework and discuss the capabilities it brings.

  8. GROTTO visualization for decision support

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Kuo, Eddy; Uhlmann, Jeffrey K.

    1998-08-01

    In this paper we describe the GROTTO visualization projects being carried out at the Naval Research Laboratory. GROTTO is a CAVE-like system, that is, a surround-screen, surround-sound, immersive virtual reality device. We have explored GROTTO visualization in a variety of scientific areas including oceanography, meteorology, chemistry, biochemistry, computational fluid dynamics, and space sciences. Research has emphasized the applications of GROTTO visualization for military, land- and sea-based command and control. Examples include the visualization of ocean current models for the simulation and study of mine drifting and, within our computational steering project, the effects of electromagnetic radiation on missile defense satellites. We discuss plans to apply this technology to decision support applications involving the deployment of autonomous vehicles into contaminated battlefield environments, fire fighter control, and hostage rescue operations.

  9. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. 
We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.

  10. Beyond the Renderer: Software Architecture for Parallel Graphics and Visualization

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1996-01-01

    As numerous implementations have demonstrated, software-based parallel rendering is an effective way to obtain the needed computational power for a variety of challenging applications in computer graphics and scientific visualization. To fully realize their potential, however, parallel renderers need to be integrated into a complete environment for generating, manipulating, and delivering visual data. We examine the structure and components of such an environment, including the programming and user interfaces, rendering engines, and image delivery systems. We consider some of the constraints imposed by real-world applications and discuss the problems and issues involved in bringing parallel rendering out of the lab and into production.

  11. Engineering and Scientific Applications: Using MatLab(Registered Trademark) for Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; Shaykhian, Gholam Ali

    2011-01-01

    MatLab(TradeMark) (MATrix LABoratory) is a numerical computation and simulation tool used by thousands of scientists and engineers in many countries. Although MatLab can serve as a glorified calculator or as an interpreted programming language for purely numerical calculations, its real strength is in matrix manipulation. Computer algebra functionality is available within the MatLab environment through the "symbolic" toolbox; this feature is similar to computer algebra programs, such as Maple or Mathematica, which manipulate mathematical equations using symbolic operations. As an interpreted programming language (command interface), MatLab is similar to well-known languages such as C/C++ and supports data structures, cell arrays, and class definitions for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a high-level programming language. MatLab is also packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods for incorporating the resulting solutions into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques for intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats, producing customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.

  12. VESL: The Virtual Earth System Laboratory for Ice Sheet Modeling and Visualization

    NASA Astrophysics Data System (ADS)

    Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.

    2016-12-01

    We introduce the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal for dissemination of data, simulation of physical processes, and promotion of climate literacy. The current prototype leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Laboratory and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundle pre/post-processing JavaScript scripts compatible with the existing ISSM Python/Matlab API. Researchers using VESL will be able to effectively present their work for public dissemination with little-to-no additional post-processing. This will allow for faster publication in peer-reviewed journals and adaptation of results for educational applications. Through future application of this concept to multiple aspects of the Earth system, VESL has the potential to broaden data applications in the geosciences and beyond. At this stage, we seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL, as we plan its expansion, and aim to achieve more rapid communication and presentation of scientific results.

  13. 78 FR 27186 - Application(s) for Duty-Free Entry of Scientific Instruments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-09

    ... cell regeneration in damaged tissue, and examine the regulatory mechanisms for metabolic activity in... populations of cells develop into a coherent circuit that capably detects directional movement in a visual... a single neuron or large numbers of cells in a neuronal population. The instrument's capabilities...

  14. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data ranges. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  15. Visualization of Multi-mission Astronomical Data with ESASky

    NASA Astrophysics Data System (ADS)

    Baines, Deborah; Giordano, Fabrizio; Racero, Elena; Salgado, Jesús; López Martí, Belén; Merín, Bruno; Sarmiento, María-Henar; Gutiérrez, Raúl; Ortiz de Landaluce, Iñaki; León, Ignacio; de Teodoro, Pilar; González, Juan; Nieto, Sara; Segovia, Juan Carlos; Pollock, Andy; Rosa, Michael; Arviset, Christophe; Lennon, Daniel; O'Mullane, William; de Marchi, Guido

    2017-02-01

    ESASky is a science-driven discovery portal to explore the multi-wavelength sky and visualize and access multiple astronomical archive holdings. The tool is a web application that requires no prior knowledge of any of the missions involved and gives users world-wide simplified access to the highest-level science data products from multiple space-based astronomy missions plus a number of ESA source catalogs. The first public release of ESASky features interfaces for the visualization of the sky in multiple wavelengths, the visualization of query results summaries, and the visualization of observations and catalog sources for single and multiple targets. This paper describes these features within ESASky, developed to address use cases from the scientific community. The decisions regarding the visualization of large amounts of data and the technologies used were made to maximize the responsiveness of the application and to keep the tool as useful and intuitive as possible.

  16. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. 
Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  17. Data-Proximate Analysis and Visualization in the Cloud using Cloudstream, an Open-Source Application Streaming Technology Stack

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.

    2017-12-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream, and outline how to use Cloudstream to run and access an existing desktop application in the cloud.
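
    The container-plus-text-file recipe described above can be pictured with a hypothetical Dockerfile. This is a sketch of the general Application Streaming pattern, not Cloudstream's actual configuration; the base image, package names, paths, and run script are all assumptions:

```dockerfile
# Hypothetical streaming container (illustrative only; not Cloudstream's
# actual files). The legacy application is copied in unmodified; its
# display is rendered into a virtual framebuffer and exported over VNC.
FROM ubuntu:16.04

# A virtual framebuffer and VNC server stand in for a physical display.
RUN apt-get update && \
    apt-get install -y xvfb x11vnc openjdk-8-jre && \
    rm -rf /var/lib/apt/lists/*

# The unmodified desktop application (assumed layout).
COPY legacy-app/ /opt/legacy-app/

# Render the app on virtual display :1 and stream it; a web gateway
# such as noVNC would typically sit in front for browser access.
CMD Xvfb :1 -screen 0 1280x800x24 & \
    sleep 2 && DISPLAY=:1 /opt/legacy-app/run.sh & \
    x11vnc -display :1 -forever
```

    A client then connects with any VNC-capable viewer (or a browser, via a gateway), which is why the same container can serve laptops, tablets, and phones alike.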

  18. Manifold compositions, music visualization, and scientific sonification in an immersive virtual-reality environment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaper, H. G.

    1998-01-05

    An interdisciplinary project encompassing sound synthesis, music composition, sonification, and visualization of music is facilitated by the high-performance computing capabilities and the virtual-reality environments available at Argonne National Laboratory. The paper describes the main features of the project's centerpiece, DIASS (Digital Instrument for Additive Sound Synthesis); "A.N.L.-folds", an equivalence class of compositions produced with DIASS; and the application of DIASS in two experiments in the sonification of complex scientific data. Some of the larger issues connected with this project, such as the changing ways in which both scientists and composers perform their tasks, are briefly discussed.

  19. [Recent developments on the scientific research in optometry and visual science in China].

    PubMed

    Qu, Jia

    2010-10-01

    This article reviews the state of scientific research in optometry and visual science in China over the past five to six years. It describes the advances and achievements in fundamental research on myopia and in applied visual-science research. In addition, it analyzes how research has guided the resolution of clinical visual problems and the significance of research for community eye-care services. Drawing on the current situation and concrete research results, the article shows that the dual biological and optical nature of the eye gives research in ophthalmology and optometry its distinctive character, and that multidisciplinary collaboration has promoted innovation in optometry and visual research. In the future, the fields of optometry and visual science in China will face growing expectations for original and systematic research, and the prevention and treatment of myopia will remain a long-term and challenging theme of exploration in these fields.

  20. Software for visualization, analysis, and manipulation of laser scan images

    NASA Astrophysics Data System (ADS)

    Burnsides, Dennis B.

    1997-03-01

    The recent introduction of laser surface scanning to scientific applications presents a challenge to computer scientists and engineers. Full utilization of this two-dimensional (2-D) and three-dimensional (3-D) data requires advances in techniques and methods for data processing and visualization. This paper explores the development of software to support the visualization, analysis and manipulation of laser scan images. Specific examples presented are from on-going efforts at the Air Force Computerized Anthropometric Research and Design (CARD) Laboratory.

  1. Material Interface Reconstruction in VisIt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meredith, J S

    In this paper, we first survey a variety of approaches to material interface reconstruction and their applicability to visualization, and we investigate the details of the current reconstruction algorithm in the VisIt scientific analysis and visualization tool. We then provide a novel implementation of the original VisIt algorithm that makes use of a wide range of the finite element zoo during reconstruction. This approach results in dramatic improvements in quality and performance without sacrificing the strengths of the VisIt algorithm as it relates to visualization.
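
    The node-recentering idea behind this family of algorithms is easy to state: average zone-centered volume fractions to the nodes, then locate where a material's fraction crosses 0.5 along each edge. The sketch below illustrates those two steps in plain Python; it is a simplified illustration of the general technique, not VisIt's implementation, and all names are hypothetical:

```python
# Simplified material interface reconstruction steps (illustrative;
# not VisIt's implementation). Zone-averaged material volume fractions
# are recentered to the nodes, then the 0.5 crossing is located on
# each edge by linear interpolation.

def node_fractions(zone_fractions, node_to_zones):
    """Average the volume fractions of all zones incident on each node."""
    return {
        node: sum(zone_fractions[z] for z in zones) / len(zones)
        for node, zones in node_to_zones.items()
    }

def interface_crossing(vf_a, vf_b, threshold=0.5):
    """Parametric position t in [0, 1] along an edge where the
    node-centered fraction crosses the threshold; None if no crossing."""
    if (vf_a - threshold) * (vf_b - threshold) > 0 or vf_a == vf_b:
        return None  # both nodes on the same side of the interface
    return (threshold - vf_a) / (vf_b - vf_a)
```

    A full reconstruction would then connect these per-edge crossings into polygons within each zone, which is where the wider range of finite element cases mentioned in the abstract comes into play.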

  2. Overview of machine vision methods in x-ray imaging and microtomography

    NASA Astrophysics Data System (ADS)

    Buzmakov, Alexey; Zolotov, Denis; Chukalina, Marina; Nikolaev, Dmitry; Gladkov, Andrey; Ingacheva, Anastasia; Yakimchuk, Ivan; Asadchikov, Victor

    2018-04-01

    Digital X-ray imaging has become widely used in science, medicine, and non-destructive testing. This allows modern digital image analysis to be used for automatic information extraction and interpretation. We give a short review of applications of machine vision in scientific X-ray imaging and microtomography, including image processing, feature detection and extraction, image compression to increase camera throughput, microtomography reconstruction, visualization, and setup adjustment.

  3. A main path domain map as digital library interface

    NASA Astrophysics Data System (ADS)

    Demaine, Jeffrey

    2009-01-01

    The shift to electronic publishing of scientific journals is an opportunity for the digital library to provide non-traditional ways of accessing the literature. One method is to use citation metadata drawn from a collection of electronic journals to generate maps of science. These maps visualize the communication patterns in the collection, giving the user an easy-to-grasp view of the semantic structure underlying the scientific literature. For this visualization to be understandable the complexity of the citation network must be reduced through an algorithm. This paper describes the Citation Pathfinder application and its integration into a prototype digital library. This application generates small-scale citation networks that expand upon the search results of the digital library. These domain maps are linked to the collection, creating an interface that is based on the communication patterns in science. The Main Path Analysis technique is employed to simplify these networks into linear, sequential structures. By identifying patterns that characterize the evolution of the research field, Citation Pathfinder uses citations to give users a deeper understanding of the scientific literature.
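
    Main Path Analysis reduces a citation network to a linear chain by weighting each arc by how many source-to-sink citation chains traverse it (the search path count, SPC) and then following the heaviest arcs. The sketch below illustrates the idea on a toy acyclic citation graph; it is a generic illustration of the technique, not Citation Pathfinder's code, and the graph data is invented:

```python
# Toy Main Path Analysis: weight each citation arc by its search path
# count (SPC), then greedily follow the heaviest arcs from a source.
# The graph is an invented example; arcs run from earlier papers to
# the later papers that build on them.
from functools import lru_cache

edges = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
    "E": ["F"],
    "F": [],
}

# Build the reverse adjacency (predecessors) once.
nodes = set(edges) | {v for vs in edges.values() for v in vs}
preds = {n: [] for n in nodes}
for u, vs in edges.items():
    for v in vs:
        preds[v].append(u)

@lru_cache(maxsize=None)
def paths_from_source(n):
    """Number of paths from any source (no predecessors) to n."""
    return sum(paths_from_source(p) for p in preds[n]) if preds[n] else 1

@lru_cache(maxsize=None)
def paths_to_sink(n):
    """Number of paths from n to any sink (no successors)."""
    return sum(paths_to_sink(s) for s in edges[n]) if edges[n] else 1

def spc(u, v):
    """Search path count: source-to-sink paths passing through arc (u, v)."""
    return paths_from_source(u) * paths_to_sink(v)

def main_path(start):
    """Follow the highest-SPC arc at each step until reaching a sink."""
    path = [start]
    while edges[path[-1]]:
        path.append(max(edges[path[-1]], key=lambda v: spc(path[-1], v)))
    return path
```

    On this example the main path is A, C, D, F: arc (A, C) carries two of the three source-to-sink chains, so it outweighs (A, B).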

  4. The steady-state visual evoked potential in vision research: A review

    PubMed Central

    Norcia, Anthony M.; Appelbaum, L. Gregory; Ales, Justin M.; Cottereau, Benoit R.; Rossion, Bruno

    2015-01-01

    Periodic visual stimulation and analysis of the resulting steady-state visual evoked potentials were first introduced over 80 years ago as a means to study visual sensation and perception. From the first single-channel recording of responses to modulated light to the present use of sophisticated digital displays composed of complex visual stimuli and high-density recording arrays, steady-state methods have been applied in a broad range of scientific and applied settings. The purpose of this article is to describe the fundamental stimulation paradigms for steady-state visual evoked potentials and to illustrate these principles through research findings across a range of applications in vision science. PMID:26024451

  5. A Responsive Client for Distributed Visualization

    NASA Astrophysics Data System (ADS)

    Bollig, E. F.; Jensen, P. A.; Erlebacher, G.; Yuen, D. A.; Momsen, A. R.

    2006-12-01

    As grids, web services and distributed computing continue to gain popularity in the scientific community, demand for virtual laboratories likewise increases. Today organizations such as the Virtual Laboratory for Earth and Planetary Sciences (VLab) are dedicated to developing web-based portals to perform various simulations remotely while abstracting away details of the underlying computation. Two of the biggest challenges in portal-based computing are fast visualization and smooth interrogation without overtaxing client resources. In response to this challenge, we have expanded on our previous data storage strategy and thick client visualization scheme [1] to develop a client-centric distributed application that utilizes remote visualization of large datasets and makes use of the local graphics processor for improved interactivity. Rather than waste precious client resources on visualization, a combination of 3D graphics and 2D server bitmaps is used to simulate the look and feel of local rendering. Java Web Start and Java Bindings for OpenGL enable install-on-demand functionality as well as low-level access to client graphics for all platforms. Powerful visualization services based on VTK and auto-generated by the WATT compiler [2] are accessible through a standard web API. Data is permanently stored on compute nodes while separate visualization nodes fetch data requested by clients, caching it locally to prevent unnecessary transfers. We will demonstrate application capabilities in the context of simulated charge density visualization within the VLab portal. In addition, we will address generalizations of our application to interact with a wider range of WATT services, as well as performance bottlenecks. 
[1] Ananthuni, R., Karki, B.B., Bollig, E.F., da Silva, C.R.S., Erlebacher, G., "A Web-Based Visualization and Reposition Scheme for Scientific Data," In Press, Proceedings of the 2006 International Conference on Modeling Simulation and Visualization Methods (MSV'06) (2006). [2] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005.

  6. Ambiguous science and the visual representation of the real

    NASA Astrophysics Data System (ADS)

    Newbold, Curtis Robert

    The emergence of visual media as prominent and even expected forms of communication in nearly all disciplines, including those scientific, has raised new questions about how the art and science of communication epistemologically affect the interpretation of scientific phenomena. In this dissertation I explore how the influence of aesthetics in visual representations of science inevitably creates ambiguous meanings. As a means to improve visual literacy in the sciences, I call awareness to the ubiquity of visual ambiguity and its importance and relevance in scientific discourse. To do this, I conduct a literature review that spans interdisciplinary research in communication, science, art, and rhetoric. Furthermore, I create a paradoxically ambiguous taxonomy, which functions to exploit the nuances of visual ambiguities and their role in scientific communication. I then extrapolate the taxonomy of visual ambiguity and from it develop an ambiguous, rhetorical heuristic, the Tetradic Model of Visual Ambiguity. The Tetradic Model is applied to a case example of a scientific image as a demonstration of how scientific communicators may increase their awareness of the epistemological effects of ambiguity in the visual representations of science. I conclude by demonstrating how scientific communicators may make productive use of visual ambiguity, even in communications of objective science, and I argue how doing so strengthens scientific communicators' visual literacy skills and their ability to communicate more ethically and effectively.

  7. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.

  8. Scientific Process Flowchart Assessment (SPFA): A Method for Evaluating Changes in Understanding and Visualization of the Scientific Process in a Multidisciplinary Student Population

    PubMed Central

    Wilson, Kristy J.; Rigakos, Bessie

    2016-01-01

    The scientific process is nonlinear, unpredictable, and ongoing. Assessing the nature of science is difficult with methods that rely on Likert-scale or multiple-choice questions. This study evaluated conceptions about the scientific process using student-created visual representations that we term “flowcharts.” The methodology, Scientific Process Flowchart Assessment (SPFA), consisted of a prompt and rubric that was designed to assess students’ understanding of the scientific process. Forty flowcharts representing a multidisciplinary group without intervention and 26 flowcharts representing pre- and postinstruction were evaluated over five dimensions: connections, experimental design, reasons for doing science, nature of science, and interconnectivity. Pre to post flowcharts showed a statistically significant improvement in the number of items and ratings for the dimensions. Comparison of the terms used and connections between terms on student flowcharts revealed an enhanced and more nuanced understanding of the scientific process, especially in the areas of application to society and communication within the scientific community. We propose that SPFA can be used in a variety of circumstances, including in the determination of what curricula or interventions would be useful in a course or program, in the assessment of curriculum, or in the evaluation of students performing research projects. PMID:27856551

  9. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets feature large volumes and a high degree of spatiotemporal complexity, and they evolve rapidly over time. Because visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications cannot handle the load. Recently, scientists have developed remote visualization techniques to address the computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built on Paraview, one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access methods are supported: accessing remote datasets provided by OpenDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.

  10. Accessing and Visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, Bruce G.; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids. These tools do one or more of the following tasks: visualize local data sets for local users, visualize local data sets for remote users, and access and visualize remote data sets. The tools are used for various types of data, including remotely sensed image data, digital elevation models, and astronomical surveys. The paper attempts to pull some common elements out of these tools that may be useful for others who have to work with similarly large data sets.

  11. Scientific visualization of volumetric radar cross section data

    NASA Astrophysics Data System (ADS)

    Wojszynski, Thomas G.

    1992-12-01

    For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency, with its associated distribution about a specified aspect, and its relation to a known threat. Historically, RCS data sets have been statically analyzed to evaluate a profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool for interpreting this data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing its statistical properties. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low-observable air vehicles.
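
    The sectorization described above amounts to binning RCS samples by aspect angle and summarizing each bin's central tendency and spread. A minimal sketch of that data reduction step (hypothetical data layout and sector width, not the paper's code):

```python
# Sectorized summary statistics for RCS samples (hypothetical layout:
# a list of (aspect_angle_degrees, rcs_dbsm) pairs). A generic
# illustration of the data reduction step, not the paper's code.
from statistics import mean, stdev

def sectorize(samples, sector_deg=30.0):
    """Bin samples by aspect angle and summarize each sector with its
    central tendency (mean) and spread (standard deviation)."""
    bins = {}
    for angle, rcs in samples:
        key = int((angle % 360.0) // sector_deg)
        bins.setdefault(key, []).append(rcs)
    summary = {}
    for key in sorted(bins):
        values = bins[key]
        lo = key * sector_deg
        summary[(lo, lo + sector_deg)] = {
            "n": len(values),
            "mean": mean(values),
            "stdev": stdev(values) if len(values) > 1 else 0.0,
        }
    return summary
```

    Each sector's mean and spread could then drive the color and lighting cues the abstract mentions.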

  12. Scientific work environments in the next decade

    NASA Technical Reports Server (NTRS)

    Gomez, Julian E.

    1989-01-01

    The application of contemporary computer graphics to scientific visualization is described, with emphasis on the nonintuitive problems. A radically different approach is proposed which centers on the idea of the scientist being in the simulation display space rather than observing it on a screen. Interaction is performed with nonstandard input devices to preserve the feeling of being immersed in the three-dimensional display space. Construction of such a system could begin now with currently available technology.

  13. Dam Removal Information Portal (DRIP)—A map-based resource linking scientific studies and associated geospatial information about dam removals

    USGS Publications Warehouse

    Duda, Jeffrey J.; Wieferich, Daniel J.; Bristol, R. Sky; Bellmore, J. Ryan; Hutchison, Vivian B.; Vittum, Katherine M.; Craig, Laura; Warrick, Jonathan A.

    2016-08-18

    The removal of dams has recently increased over historical levels due to aging infrastructure, changing societal needs, and modern safety standards rendering some dams obsolete. Where possibilities for river restoration, or improved safety, exceed the benefits of retaining a dam, removal is more often being considered as a viable option. Yet, as this is a relatively new development in the history of river management, science is just beginning to guide our understanding of the physical and ecological implications of dam removal. Ultimately, the “lessons learned” from previous scientific studies on the outcomes of dam removal could inform future scientific understanding of ecosystem outcomes, as well as aid in decision-making by stakeholders. We created a database visualization tool, the Dam Removal Information Portal (DRIP), to display map-based, interactive information about the scientific studies associated with dam removals. Serving both as a bibliographic source as well as a link to other existing databases like the National Hydrography Dataset, the derived National Dam Removal Science Database serves as the foundation for a Web-based application that synthesizes the existing scientific studies associated with dam removals. Thus, using the DRIP application, users can explore information about completed dam removal projects (for example, their location, height, and date removed), as well as discover sources and details of associated scientific studies. As such, DRIP is intended to be a dynamic collection of scientific information related to dams that have been removed in the United States and elsewhere. This report describes the architecture and concepts of this “metaknowledge” database and the DRIP visualization tool.

  14. Scientific Visualization, Seeing the Unseeable

    ScienceCinema

    LBNL

    2017-12-09

    June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  15. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Maxine D.; Leigh, Jason

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation’s Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy’s Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for “Development of the Next-Generation CAVE Virtual Environment (NG-CAVE),” enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications enabled by the CAVE2/Blaze visual computing system are advancing scientific research and education in the U.S. and globally, and helping to train the next-generation workforce.

  16. Fast I/O for Massively Parallel Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew T.

    1996-01-01

    The two primary goals for this report were the design, construction, and modeling of parallel disk arrays for scientific visualization and animation, and a study of the I/O requirements of highly parallel applications. In addition, further work was performed on the parallel display systems required to project and animate the very high-resolution frames resulting from our supercomputing simulations in ocean circulation and compressible gas dynamics.

  17. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922
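
The per-kernel energy accounting described above reduces, at its core, to integrating sampled power over time. A minimal numpy sketch follows; the sample times and values are hypothetical, and the paper's actual instrumentation and sampling rates are not reproduced here.

```python
import numpy as np

def energy_joules(t_s, power_w):
    """Trapezoidal integration of sampled power (W) over time (s) -> energy (J).

    A sketch of the arithmetic behind per-kernel energy measurements; the
    paper's actual power-monitoring hardware is not modeled here.
    """
    t = np.asarray(t_s, dtype=float)
    p = np.asarray(power_w, dtype=float)
    # Trapezoid rule: average adjacent power samples times the interval width.
    return float(np.sum((p[1:] + p[:-1]) / 2.0 * np.diff(t)))
```

Energy efficiency across platforms can then be compared as useful work done per joule consumed.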

  18. Hybrid 2-D and 3-D Immersive and Interactive User Interface for Scientific Data Visualization

    DTIC Science & Technology

    2017-08-01

    Keywords: visualization, 3-D interactive visualization, scientific visualization, virtual reality, real-time ray tracing. Only fragments of the abstract survive in this record; they note that, beyond a user-friendly software and hardware setup, scientists need to be able to perform their usual workflows, and that the virtual reality (VR) and scientific visualization communities mostly have different research priorities, with the VR community emphasizing support for real-time user interaction.

  19. The climate visualizer: Sense-making through scientific visualization

    NASA Astrophysics Data System (ADS)

    Gordin, Douglas N.; Polman, Joseph L.; Pea, Roy D.

    1994-12-01

    This paper describes the design of a learning environment, called the Climate Visualizer, intended to facilitate scientific sense-making in high school classrooms by providing students the ability to craft, inspect, and annotate scientific visualizations. The theoretical background for our design presents a view of learning as acquiring and critiquing cultural practices and stresses the need for students to appropriate the social and material aspects of practice when learning an area. This is followed by a description of the design of the Climate Visualizer, including detailed accounts of its provision of spatial and temporal context and the quantitative and visual representations it employs. A broader context is then explored by describing its integration into the high school science classroom. This discussion explores how visualizations can promote the creation of scientific theories, especially in conjunction with the Collaboratory Notebook, an embedded environment for creating and critiquing scientific theories and visualizations. Finally, we discuss the design trade-offs we have made in light of our theoretical orientation, and our hopes for further progress.

  20. Three-dimensional user interfaces for scientific visualization

    NASA Technical Reports Server (NTRS)

    Vandam, Andries

    1995-01-01

    The main goal of this project is to develop novel and productive user interface techniques for creating and managing visualizations of computational fluid dynamics (CFD) datasets. We have implemented an application framework in which we can prototype user interfaces for visualizing CFD datasets. This UI technology allows users to interactively place visualization probes in a dataset and modify some of their parameters. We have also implemented a time-critical scheduling system which strives to maintain a constant frame rate regardless of the number of visualization techniques in use. In the past year, we have published parts of this research at two conferences: the research annotation system at Visualization 1994, and the 3D user interface at UIST 1994. The real-time scheduling system has been submitted to the SIGGRAPH 1995 conference. Copies of these documents are included with this report.
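
A time-critical scheduler of the kind described can be sketched as a greedy selection against a per-frame time budget. The tuple layout, cost estimates, and budget below are hypothetical illustrations, not the project's actual scheduler.

```python
def select_probes(probes, frame_budget_ms):
    """Greedy time-critical selection: render the highest-priority
    visualization probes first, skipping the rest once the estimated
    frame budget is exhausted.

    `probes` is a list of (priority, est_cost_ms, name) tuples; this is
    a sketch of the general idea only.
    """
    chosen, used = [], 0.0
    for priority, cost, name in sorted(probes, reverse=True):
        if used + cost <= frame_budget_ms:   # fits within remaining budget
            chosen.append(name)
            used += cost
    return chosen
```

Re-running the selection every frame lets the frame rate stay roughly constant as the probe set or its costs change.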

  1. 6D Visualization of Multidimensional Data by Means of Cognitive Technology

    NASA Astrophysics Data System (ADS)

    Vitkovskiy, V.; Gorohov, V.; Komarinskiy, S.

    2010-12-01

    On the basis of the cognitive graphics concept, we developed a software system for visualization and analysis. It allows researchers to train and sharpen their intuition, raises their interest in and motivation for creative scientific cognition, and supports a process of dialogue with the problems themselves. The Space Hedgehog system is the next step in cognitive tools for multidimensional data analysis. The technique and technology of cognitive 6D visualization of multidimensional data were developed on the basis of cognitive visualization research and technology development. The Space Hedgehog system allows direct dynamic visualization of 6D objects. It was developed using experience from the creation of the Space Walker program and its applications.

  2. 3D Scientific Visualization with Blender

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2015-03-01

    This is the first book written on using Blender (an open source visualization suite widely used in the entertainment and gaming industries) for scientific visualization. It is a practical and interesting introduction to Blender for understanding key parts of 3D rendering and animation that pertain to the sciences via step-by-step guided tutorials. 3D Scientific Visualization with Blender takes you through an understanding of 3D graphics and modelling for different visualization scenarios in the physical sciences.

  3. NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska

    NASA Astrophysics Data System (ADS)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual explorations of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers, and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers, and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, and Linux/Python-based scripting). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps, and layers.
The system provides online access, querying, visualization, and analysis of the hydrological data from several sources at one place. The study indicates that internet GIS, developed using advanced technologies, provides valuable education potential to users in hydrology and irrigation engineering and suggests that such a system can support advanced hydrological data access and analysis tools to improve utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS server.

  4. 3D Scientific Visualization with Blender

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2015-03-01

    This is the first book written on using Blender for scientific visualization. It is a practical and interesting introduction to Blender for understanding key parts of 3D rendering and animation that pertain to the sciences via step-by-step guided tutorials. 3D Scientific Visualization with Blender takes you through an understanding of 3D graphics and modelling for different visualization scenarios in the physical sciences.

  5. Scientific Visualization: The Modern Oscilloscope for "Seeing the Unseeable" (LBNL Summer Lecture Series)

    ScienceCinema

    Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group

    2018-05-07

    Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  6. IDP camp evolvement analysis in Darfur using VHSR optical satellite image time series and scientific visualization on virtual globes

    NASA Astrophysics Data System (ADS)

    Tiede, Dirk; Lang, Stefan

    2010-11-01

    In this paper we focus on the application of transferable, object-based image analysis algorithms for dwelling extraction in a camp for internally displaced people (IDP) in Darfur, Sudan, along with innovative means for scientific visualisation of the results. Three very high spatial resolution satellite images (QuickBird: 2002, 2004, 2008) were used for (1) extracting different types of dwellings and (2) calculating and visualizing added-value products such as dwelling density and camp structure. The results were visualized on virtual globes (Google Earth and ArcGIS Explorer) as analytical 3D views in which the analysis results are transformed into the third dimension (z-value). Data formats depend on the virtual globe software and include KML/KMZ (Keyhole Markup Language) and ESRI 3D shapefiles streamed as an ArcGIS Server-based globe service. In addition, means for improving the overall performance of automated dwelling extraction using grid computing techniques are discussed using examples from a similar study.
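
Lifting an attribute such as dwelling density into the third dimension for a virtual globe amounts to emitting extruded geometry in KML. A minimal hand-rolled sketch follows; the choice of extruded points and the density-to-height scaling are assumptions for illustration, not the authors' actual export pipeline.

```python
def dwelling_density_kml(cells):
    """Emit a minimal KML document in which each grid cell's dwelling
    density is shown as an extruded point whose height (z-value) encodes
    the density. `cells` is a list of (lon, lat, density) tuples; the
    x100 height scaling is purely illustrative.
    """
    placemarks = []
    for lon, lat, density in cells:
        placemarks.append(
            "<Placemark><Point><extrude>1</extrude>"
            "<altitudeMode>relativeToGround</altitudeMode>"
            f"<coordinates>{lon},{lat},{density * 100}</coordinates>"
            "</Point></Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        + "".join(placemarks) + "</Document></kml>"
    )
```

The resulting file loads directly in Google Earth, where the extruded heights give the analytical 3D view described above.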

  7. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword- and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs, ranging from a feature-rich data management system to complex scientific tools and workflows.
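
To give a flavor of the "few lines of code" idea, here is a generic stand-in that generates a complete interactive map page using plain Leaflet; it deliberately does not reproduce the real GABBs API, and the function name and parameters are hypothetical.

```python
def make_map_html(lat, lon, zoom=6, title="Station map"):
    """Generate a self-contained HTML page showing an interactive Leaflet
    map centred on (lat, lon), with a marker at that point. A generic
    stand-in for a few-lines map widget; not the GABBs library itself.
    """
    return f"""<!DOCTYPE html>
<html><head><title>{title}</title>
<link rel="stylesheet" href="https://unpkg.com/leaflet/dist/leaflet.css"/>
<script src="https://unpkg.com/leaflet/dist/leaflet.js"></script>
</head><body><div id="map" style="height:100vh"></div>
<script>
  var map = L.map('map').setView([{lat}, {lon}], {zoom});
  L.tileLayer('https://tile.openstreetmap.org/{{z}}/{{x}}/{{y}}.png').addTo(map);
  L.marker([{lat}, {lon}]).addTo(map);
</script></body></html>"""
```

Writing the returned string to a file and opening it in a browser yields a pannable, zoomable map with no server-side code.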

  8. Imaging System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The 1100C Virtual Window is based on technology developed under NASA Small Business Innovation Research (SBIR) contracts to Ames Research Center. For example, under one contract Dimension Technologies, Inc. developed a large autostereoscopic display for scientific visualization applications. The Virtual Window employs an innovative illumination system to deliver the depth and color of true 3D imaging. Its applications include surgery and Magnetic Resonance Imaging scans, viewing for teleoperated robots, training, and aviation cockpit displays.

  9. Corridor One: An Integrated Distance Visualization Environment for SSI and ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Christopher R.; Hansen, Charles D.

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah, and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology, and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, ''Computational Grids'' and CAVE technology and to add these to the teams that have developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  10. National Laboratory for Advanced Scientific Visualization at UNAM - Mexico

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo

    2016-04-01

    In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing plays a key role in promoting and advancing missions in research, education, community outreach, and business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services that spans a variety of areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, physics, and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the 3D fully immersive display system Cave, the high-resolution parallel visualization system Powerwall, and the high-resolution spherical display Earth Simulator. The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large, 3.6 m wide room with images projected on the front, left, and right walls, as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide a strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution, ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for global-scale data visualization, such as geophysical, meteorological, climate, and ecology data. 
The HPCC ADA is a 1000+ computing-core system that offers parallel computing resources to applications requiring large quantities of memory as well as large, fast parallel storage. The entire system temperature is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.

  11. High Performance Real-Time Visualization of Voluminous Scientific Data Through the NOAA Earth Information System (NEIS).

    NASA Astrophysics Data System (ADS)

    Stewart, J.; Hackathorn, E. J.; Joyce, J.; Smith, J. S.

    2014-12-01

    Within our community data volume is rapidly expanding. These data have limited value if one cannot interact with or visualize the data in a timely manner. The scientific community needs the ability to dynamically visualize, analyze, and interact with these data along with other environmental data in real time, regardless of physical location or data format. Within the National Oceanic and Atmospheric Administration (NOAA), the Earth System Research Laboratory (ESRL) is actively developing the NOAA Earth Information System (NEIS). Previously, the NEIS team investigated methods of data discovery and interoperability. The recent focus shifted to high-performance real-time visualization, allowing NEIS to bring massive amounts of 4-D data, including output from weather forecast models as well as data from different observations (surface observations, upper air, etc.), together in one place. Our server-side architecture provides a real-time stream processing system that utilizes server-based NVIDIA Graphics Processing Units (GPUs) for data processing, wavelet-based compression, and other preparation techniques for visualization, allowing NEIS to minimize the bandwidth and latency of data delivery to end users. On the client side, users interact with NEIS services through the visualization application developed at ESRL called TerraViz. TerraViz is developed using the Unity game engine and takes advantage of GPUs, allowing a user to interact with large data sets in real time in ways that might not have been possible before. Through these technologies, the NEIS team has improved accessibility to 'Big Data' along with providing tools allowing novel visualization and seamless integration of data across time and space regardless of data size, physical location, or data format. These capabilities provide the ability to see global interactions and their importance for weather prediction. 
Additionally, they allow greater access than currently exists helping to foster scientific collaboration and new ideas. This presentation will provide an update of the recent enhancements of the NEIS architecture and visualization capabilities, challenges faced, as well as ongoing research activities related to this project.
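
The wavelet-based reduction mentioned above can be illustrated with a one-level Haar transform: transform, keep only the largest-magnitude coefficients, reconstruct. This numpy-only sketch is for clarity; the production NEIS pipeline runs on GPUs and its actual codec is not reproduced here.

```python
import numpy as np

def haar_compress(signal, keep_fraction=0.25):
    """One-level Haar wavelet compression sketch: transform the signal,
    zero all but the largest-magnitude coefficients, then reconstruct.
    Returns (reconstruction, number of nonzero coefficients kept).
    Illustrative only; not the NEIS production codec.
    """
    x = np.asarray(signal, dtype=float)
    assert x.size % 2 == 0, "even-length input expected for one Haar level"
    avg = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    dif = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    coeffs = np.concatenate([avg, dif])
    k = max(1, int(keep_fraction * coeffs.size))
    thresh = np.sort(np.abs(coeffs))[-k]     # k-th largest magnitude
    coeffs[np.abs(coeffs) < thresh] = 0.0    # discard small coefficients
    a, d = coeffs[: x.size // 2], coeffs[x.size // 2 :]
    recon = np.empty_like(x)
    recon[0::2] = (a + d) / np.sqrt(2)       # inverse Haar transform
    recon[1::2] = (a - d) / np.sqrt(2)
    return recon, int(np.count_nonzero(coeffs))
```

For smooth fields (typical of model output), most detail coefficients are near zero, so a small fraction of coefficients reconstructs the signal well, which is what makes the bandwidth savings possible.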

  12. Introduction to the LaRC central scientific computing complex

    NASA Technical Reports Server (NTRS)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex, and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation), are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  13. Stackable middleware services for advanced multimedia applications. Final report for period July 14, 1999 - July 14, 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Wu-chi; Crawfis, Roger; Weide, Bruce

    2002-02-01

    In this project, the authors propose the research, development, and distribution of a stackable, component-based multimedia streaming protocol middleware service. The goals of this stackable middleware interface include: (1) The middleware service will provide application writers and scientists easy-to-use interfaces that support their visualization needs. (2) The middleware service will support a variety of image compression modes. Currently, many of the network adaptation protocols for video have been developed with DCT-based compression algorithms like H.261, MPEG-1, or MPEG-2 in mind. It is expected that, for advanced scientific computing applications, lossy compression of the image data will be unacceptable in certain instances. The middleware service will support several in-line lossless compression modes for error-sensitive scientific visualization data. (3) The middleware service will support two different types of streaming video modes: one for interactive collaboration among scientists and a stored video streaming mode for viewing prerecorded animations. The use of two different streaming types will allow the quality of the video delivered to the user to be maximized. Most importantly, this service will operate transparently to the user (with some basic controls exported to the user for domain-specific tweaking). In the spirit of layered network protocols (like ISO and TCP/IP), application writers should not have to know a large amount about lower-level network details. Currently, many example video streaming players have their congestion management techniques tightly integrated into the video player itself and are, for the most part, ''one-off'' applications. As more networked multimedia and video applications are written in the future, a larger percentage of these programmers and scientists will most likely know little about the underlying networking layer. 
By providing a simple, powerful, and semi-transparent middleware layer, the successful completion of this project will help serve as a catalyst for future video-based applications, particularly those of advanced scientific computing.
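
The in-line lossless mode described in goal (2) boils down to a compressor whose round-trip is bit-exact. A minimal sketch using Python's standard zlib follows; the project's actual codecs are not specified in this record, so this only demonstrates the lossless property itself.

```python
import zlib

def lossless_frame_size(frame_bytes, level=6):
    """Compress a visualization frame losslessly, verify the round-trip
    is bit-exact, and return the compressed size in bytes.

    zlib stands in for whatever in-line lossless codec the middleware
    would actually use; the verification step is the point.
    """
    comp = zlib.compress(frame_bytes, level)
    # Bit-exact round-trip: no information loss, unlike DCT-based codecs.
    assert zlib.decompress(comp) == frame_bytes
    return len(comp)
```

For the highly regular frames typical of scientific visualization, even a general-purpose lossless codec often achieves large reductions without the artifacts lossy video codecs would introduce.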

  14. Invisible Light: a global infotainment community based on augmented reality technologies

    NASA Astrophysics Data System (ADS)

    Israel, Kai; Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan

    2015-10-01

    Theoretical details about optics and photonics are not common knowledge nowadays. Physicists are keen to scientifically explain `light,' which has a huge impact on our lives. It is necessary to examine it from multiple perspectives and to make the knowledge accessible to the public in an interdisciplinary, scientifically well-grounded, and appealing way through media. To allow an information exchange on a global scale, our project "Invisible Light" establishes a worldwide accessible platform. Its contents will not be created by a single instance, but will be user-generated with the help of the global community. The article describes the infotainment portal "Invisible Light," which stores scientific articles about light and photonics and makes them accessible worldwide. All articles are tagged with geo-coordinates, so they can be clearly identified and localized. A smartphone application is used for visualization, transmitting the information to users in real time by means of an augmented reality application. Scientific information is thus made accessible to a broad audience in an attractive manner.

  15. Query-Driven Visualization and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver; Bethel, E. Wes; Prabhat, Mr.

    2012-11-01

    This report focuses on an approach to high performance visualization and analysis, termed query-driven visualization and analysis (QDV). QDV aims to reduce the amount of data that needs to be processed by the visualization, analysis, and rendering pipelines. The goal of the data reduction process is to separate out data that is "scientifically interesting" and to focus visualization, analysis, and rendering on that interesting subset. The premise is that for any given visualization or analysis task, the data subset of interest is much smaller than the larger, complete data set. This strategy---extracting smaller data subsets of interest and focusing the visualization processing on these subsets---is complementary to the approach of increasing the capacity of the visualization, analysis, and rendering pipelines through parallelism. This report discusses the fundamental concepts in QDV, their relationship to different stages in the visualization and analysis pipelines, and presents QDV's application to problems in diverse areas, ranging from forensic cybersecurity to high energy physics.
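
    The data-reduction step at the heart of QDV can be sketched in a few lines: evaluate a boolean query first, and hand only the matching subset to the expensive downstream stages. The record fields and predicate below are invented for illustration:

```python
# Toy data set: each record is one cell of a simulation (fields invented).
records = [
    {"energy": e, "pressure": p}
    for e, p in [(0.1, 9.0), (5.2, 1.1), (7.8, 0.4), (0.3, 8.5), (6.1, 0.9)]
]

def query(data, predicate):
    """Data-reduction step: keep only the 'scientifically interesting' subset."""
    return [rec for rec in data if predicate(rec)]

# Compound boolean query, the 'needle in a haystack' case.
interesting = query(records, lambda r: r["energy"] > 5.0 and r["pressure"] < 1.0)

# Only the reduced subset flows into analysis and rendering.
assert len(interesting) < len(records)
```

    In a production QDV system the `query` step would be backed by an index (e.g. bitmap indexing) rather than a linear scan, but the pipeline shape is the same.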

  16. Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. 
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding of the importance of collaboration between astrophysicists and computer scientists.

  17. Metadata Mapper: a web service for mapping data between independent visual analysis components, guided by perceptual rules

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Matasci, Naim

    2011-03-01

    The explosion of online scientific data from experiments, simulations, and observations has given rise to an avalanche of algorithmic, visualization and imaging methods. There has also been enormous growth in the introduction of tools that provide interactive interfaces for exploring these data dynamically. Most systems, however, do not support the real-time exploration of patterns and relationships across tools and do not provide guidance on which colors, colormaps or visual metaphors will be most effective. In this paper, we introduce a general architecture for sharing metadata between applications and a "Metadata Mapper" component that allows the analyst to decide how metadata from one component should be represented in another, guided by perceptual rules. This system is designed to support "brushing" [1], in which highlighting a region of interest in one application automatically highlights corresponding values in another, allowing the scientist to develop insights from multiple sources. Our work builds on the component-based iPlant Cyberinfrastructure [2] and provides a general approach to supporting interactive exploration across independent visualization and visual analysis components.
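
    The brushing behavior described above can be sketched as a tiny publish/subscribe hub that relays selection metadata between otherwise independent views. The class and method names are hypothetical, not taken from the Metadata Mapper API:

```python
class MetadataBus:
    """Tiny publish/subscribe hub standing in for the shared-metadata service."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, selection):
        for cb in self.subscribers:
            cb(selection)

class ScatterPlot:
    """One view: brushing publishes the selected item ids as metadata."""
    def __init__(self, bus):
        self.bus = bus

    def brush(self, ids):
        self.bus.publish({"selected_ids": ids})

class TableView:
    """Another view: it highlights whatever the shared metadata says."""
    def __init__(self, bus):
        self.highlighted = set()
        bus.subscribe(self.on_selection)

    def on_selection(self, selection):
        # A perceptual rule could be applied here, e.g. mapping the selection
        # to a high-salience highlight color.
        self.highlighted = set(selection["selected_ids"])

bus = MetadataBus()
plot, table = ScatterPlot(bus), TableView(bus)
plot.brush([3, 7, 11])                    # brushing in one view...
assert table.highlighted == {3, 7, 11}    # ...highlights the same items in another
```

    The point of routing the selection through a shared bus, rather than coupling the views directly, is that any number of components can join without knowing about each other.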

  18. Scientific Visualization Made Easy for the Scientist

    NASA Astrophysics Data System (ADS)

    Westerhoff, M.; Henderson, B.

    2002-12-01

    amira is an application program for creating 3D visualizations and geometric models of 3D image data sets from various application areas, e.g. medicine, biology, biochemistry, chemistry, physics, and engineering. It has seen significant adoption in the marketplace since becoming commercially available in 2000. The rapid adoption has expanded the features requested by the user base and broadened the scope of the amira product offering. The amira product offering includes amira Standard; amiraDev, used by users to extend the product's capabilities; amiraMol, used for molecular visualization; amiraDeconv, used to improve the quality of image data; and amiraVR, used in immersive VR environments. amira allows the user to construct a visualization tailored to his or her needs without requiring any programming knowledge. It also allows 3D objects to be represented as grids suitable for numerical simulations, notably as triangular surfaces and volumetric tetrahedral grids. The amira application also provides methods to generate such grids from voxel data representing an image volume, and it includes a general-purpose interactive 3D viewer. amiraDev provides an application-programming interface (API) that allows the user to add new components through C++ programming. amira supports many import formats, including a 'raw' format allowing immediate access to native uniform data sets. amira uses the power and speed of the OpenGL and Open Inventor graphics libraries and 3D graphics accelerators to provide access to over 145 modules, enabling users to process, probe, analyze, and visualize their data. The amiraMol extension adds powerful tools for molecular visualization to the existing amira platform. amiraMol contains support for standard molecular file formats, along with tools for visualization and analysis of static molecules as well as molecular trajectories (time series). amiraDeconv adds tools for the deconvolution of 3D microscopic images. 
Deconvolution is the process of increasing image quality and resolution by computationally compensating for artifacts of the recording process. amiraDeconv supports 3D wide-field microscopy as well as 3D confocal microscopy. It offers both non-blind and blind image deconvolution algorithms. Non-blind deconvolution uses an individually measured point spread function, while blind algorithms work on the basis of only a few recording parameters (such as numerical aperture or zoom factor). amiraVR is a specialized and extended version of the amira visualization system dedicated to use in immersive installations, such as large-screen stereoscopic projections, CAVE, or Holobench systems. Among other features, it supports multi-threaded multi-pipe rendering, head-tracking, advanced 3D interaction concepts, and 3D menus allowing interaction with any amira object in the same way as on the desktop. With its unique set of features, amiraVR represents both a VR (Virtual Reality)-ready application for scientific and medical visualization in immersive environments and a development platform for building VR applications.
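
    amira's deconvolution code is proprietary, but the non-blind case described above, deconvolving with an individually measured point spread function, can be illustrated with the classic Richardson-Lucy iteration. This is a standard textbook scheme, shown here in 1-D pure Python for clarity; whether amiraDeconv uses this particular algorithm is not stated in the abstract, and the signal, PSF, and iteration count are invented:

```python
def convolve(signal, kernel):
    """'Same'-size 1-D convolution with zero padding at the edges."""
    n, k = len(signal), len(kernel)
    half = k // 2
    out = []
    for i in range(n):
        s = 0.0
        for j in range(k):
            idx = i + j - half
            if 0 <= idx < n:
                s += signal[idx] * kernel[j]
        out.append(s)
    return out

def richardson_lucy(observed, psf, iterations=50):
    """Non-blind deconvolution: the PSF is assumed known (measured)."""
    estimate = [1.0] * len(observed)       # flat initial guess
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

psf = [0.25, 0.5, 0.25]                    # toy measured point spread function
truth = [0, 0, 0, 10, 0, 0, 0]             # a single bright point source
observed = convolve(truth, psf)            # what the microscope records
restored = richardson_lucy(observed, psf, iterations=100)
```

    After a few iterations the restored signal re-concentrates the blurred energy at the true peak position; a blind algorithm would additionally have to estimate `psf` itself from recording parameters.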

  19. An Event Driven State Based Interface for Synthetic Environments

    DTIC Science & Technology

    1991-12-01

    "GROPE -- Haptic Displays for Scientific Visualization." In Proceedings of the ACM SIGGRAPH, pages 177-185, 6-10 Aug 1990. S. Brunderman, Capt... Mathematical Methods for Artificial Intelligence and Autonomous Systems. Prentice Hall, 1988. 13. Duckett, Capt Donald T. The Application of

  20. NED-2: A decision support system for integrated forest ecosystem management

    Treesearch

    Mark J. Twery; Peter D. Knopp; Scott A. Thomasma; H. Michael Rauscher; Donald E. Nute; Walter D. Potter; Frederick Maier; Jin Wang; Mayukh Dass; Hajime Uchiyama; Astrid Glende; Robin E. Hoffman

    2005-01-01

    NED-2 is a Windows-based system designed to improve project-level planning and decision making by providing useful and scientifically sound information to natural resource managers. Resources currently addressed include visual quality, ecology, forest health, timber, water, and wildlife. NED-2 expands on previous versions of NED applications by integrating treatment...

  2. Visualizing Forensic Publication Impacts and Collaborations: Presenting at a Scientific Venue Leads to Increased Collaborations between Researchers and Information Professionals

    PubMed Central

    Makar, Susan; Malanowski, Amanda; Rapp, Katie

    2016-01-01

    The Information Services Office (ISO) of the National Institute of Standards and Technology (NIST) proactively sought out an opportunity to present the findings of a study that showed the impact of NIST’s forensic research output to its internal customers and outside researchers. ISO analyzed the impact of NIST’s contributions to the peer-reviewed forensic journal literature through citation analysis and network visualizations. The findings of this study were compiled into a poster that was presented during the Forensics@NIST Symposium in December 2014. ISO’s study informed the forensic research community where NIST has had some of the greatest scholarly impact. This paper describes the methodology used to assess the impact of NIST’s forensic publications and shares the results, outcomes, and impacts of ISO’s study and poster presentation. This methodology is adaptable and applicable to other research fields and to other libraries. It has improved the recognition of ISO’s capabilities within NIST and resulted in application of the methodology to additional scientific disciplines. PMID:27956754

  3. Interactive Learning System "VisMis" for Scientific Visualization Course

    ERIC Educational Resources Information Center

    Zhu, Xiaoming; Sun, Bo; Luo, Yanlin

    2018-01-01

    Visualization courses are now taught at universities around the world. Keeping students motivated and actively engaged in this course can be a challenging task. In this paper we introduce our developed interactive learning system called VisMis (Visualization and Multi-modal Interaction System) for postgraduate scientific visualization course…

  4. Virtual Reality: Visualization in Three Dimensions.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Virtual reality is a newly emerging tool for scientific visualization that makes possible multisensory, three-dimensional modeling of scientific data. While the emphasis is on visualization, the other senses are added to enhance what the scientist can visualize. Researchers are working to extend the sensory range of what can be perceived in…

  5. Visual Discourse in Scientific Conference Papers: A Genre-based Study.

    ERIC Educational Resources Information Center

    Rowley-Jolivet, Elizabeth

    2002-01-01

    Investigates the role of visual communication in a spoken research genre: the scientific research paper. Analyzes 2,048 visuals projected during 90 papers given at five international conferences in three fields (geology, medicine, physics), in order to bring out the recurrent features of the visual dimension. (Author/VWL)

  6. VizioMetrics: Mining the Scientific Visual Literature

    ERIC Educational Resources Information Center

    Lee, Po-Shen

    2017-01-01

    Scientific results are communicated visually in the literature through diagrams, visualizations, and photographs. In this thesis, we developed a figure processing pipeline to classify more than 8 million figures from PubMed Central into different figure types and study the resulting patterns of visual information as they relate to scholarly…

  7. Living Liquid: Design and Evaluation of an Exploratory Visualization Tool for Museum Visitors.

    PubMed

    Ma, J; Liao, I; Ma, Kwan-Liu; Frazier, J

    2012-12-01

    Interactive visualizations can allow science museum visitors to explore new worlds by seeing and interacting with scientific data. However, designing interactive visualizations for informal learning environments, such as museums, presents several challenges. First, visualizations must engage visitors on a personal level. Second, visitors often lack the background to interpret visualizations of scientific data. Third, visitors have very limited time at individual exhibits in museums. This paper examines these design considerations through the iterative development and evaluation of an interactive exhibit as a visualization tool that gives museumgoers access to scientific data generated and used by researchers. The exhibit prototype, Living Liquid, encourages visitors to ask and answer their own questions while exploring the time-varying global distribution of simulated marine microbes using a touchscreen interface. Iterative development proceeded through three rounds of formative evaluations using think-aloud protocols and interviews, each round informing a key visualization design decision: (1) what to visualize to initiate inquiry, (2) how to link data at the microscopic scale to global patterns, and (3) how to include additional data that allows visitors to pursue their own questions. Data from visitor evaluations suggests that, when designing visualizations for public audiences, one should (1) avoid distracting visitors from data that they should explore, (2) incorporate background information into the visualization, (3) favor understandability over scientific accuracy, and (4) layer data accessibility to structure inquiry. Lessons learned from this case study add to our growing understanding of how to use visualizations to actively engage learners with scientific data.

  8. Discovering Communicable Scientific Knowledge from Spatio-Temporal Data

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark; Langley, Pat; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes how we used regression rules to improve upon a result previously published in the Earth science literature. In such a scientific application of machine learning, it is crucially important for the learned models to be understandable and communicable. We recount how we selected a learning algorithm to maximize communicability, and then describe two visualization techniques that we developed to aid in understanding the model by exploiting the spatial nature of the data. We also report how evaluating the learned models across time let us discover an error in the data.

  10. 19 CFR 10.121 - Visual or auditory materials of an educational, scientific, or cultural character.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... educational, scientific, or cultural character. (a) Where photographic film and other articles described in... the articles are visual or auditory materials of an educational, scientific, or cultural character..., scientific, or cultural character. 10.121 Section 10.121 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION...

  11. 19 CFR 10.121 - Visual or auditory materials of an educational, scientific, or cultural character.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... educational, scientific, or cultural character. (a) Where photographic film and other articles described in... the articles are visual or auditory materials of an educational, scientific, or cultural character..., scientific, or cultural character. 10.121 Section 10.121 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION...

  12. Scalable Adaptive Graphics Environment (SAGE) Software for the Visualization of Large Data Sets on a Video Wall

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary; Srikishen, Jayanthi; Edwards, Rita; Cross, David; Welch, Jon; Smith, Matt

    2013-01-01

    The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of "big data" available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describe a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin-bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD FirePro W600 video card with six Mini DisplayPort connections. Six Mini DisplayPort-to-dual-DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. SAGE provides a common environment, or framework, enabling its users to access, display, and share a variety of data-intensive information. 
This information can be digital-cinema animations, high-resolution images, high-definition video-teleconferences, presentation slides, documents, spreadsheets or laptop screens. SAGE is cross-platform, community-driven, open-source visualization and collaboration middleware that utilizes shared national and international cyberinfrastructure for the advancement of scientific research and education.
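
    A back-of-the-envelope sketch of the wall geometry and the per-monitor mapping a windowing framework like SAGE must perform. The layout assumes 4 columns by 3 rows (matching the wide 14' x 7' aspect) and row-major monitor numbering; this is an illustration, not SAGE's actual code:

```python
rows, cols = 3, 4                 # 3 x 4 array of monitors, per the abstract
panel_w, panel_h = 1920, 1080     # per-monitor resolution

# Total wall resolution, ignoring bezel gaps: 7680 x 3240 (~25 megapixels).
wall_w, wall_h = cols * panel_w, rows * panel_h

def locate(x, y):
    """Map a wall-global pixel to (monitor index, local x, local y)."""
    col, lx = divmod(x, panel_w)
    row, ly = divmod(y, panel_h)
    return row * cols + col, lx, ly
```

    A window spanning monitors is then just a rectangle whose corners map to different monitor indices, which is why the framework, not the application, must own the split.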

  13. Scalable Adaptive Graphics Environment (SAGE) Software for the Visualization of Large Data Sets on a Video Wall

    NASA Astrophysics Data System (ADS)

    Jedlovec, G.; Srikishen, J.; Edwards, R.; Cross, D.; Welch, J. D.; Smith, M. R.

    2013-12-01

    The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of 'big data' available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describe a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin-bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD FirePro W600 video card with six Mini DisplayPort connections. Six Mini DisplayPort-to-dual-DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. SAGE provides a common environment, or framework, enabling its users to access, display, and share a variety of data-intensive information. 
This information can be digital-cinema animations, high-resolution images, high-definition video-teleconferences, presentation slides, documents, spreadsheets or laptop screens. SAGE is cross-platform, community-driven, open-source visualization and collaboration middleware that utilizes shared national and international cyberinfrastructure for the advancement of scientific research and education.

  14. Creativity, visualization abilities, and visual cognitive style.

    PubMed

    Kozhevnikov, Maria; Kozhevnikov, Michael; Yu, Chen Jiao; Blazhenkova, Olesya

    2013-06-01

    Despite the recent evidence for a multi-component nature of both visual imagery and creativity, there have been no systematic studies on how the different dimensions of creativity and imagery might interrelate. The main goal of this study was to investigate the relationship between different dimensions of creativity (artistic and scientific) and dimensions of visualization abilities and styles (object and spatial). In addition, we compared the contributions of object and spatial visualization abilities versus corresponding styles to scientific and artistic dimensions of creativity. Twenty-four undergraduate students (12 females) were recruited for the first study, and 75 additional participants (36 females) were recruited for an additional experiment. Participants were administered a number of object and spatial visualization abilities and style assessments as well as a number of artistic and scientific creativity tests. The results show that object visualization relates to artistic creativity and spatial visualization relates to scientific creativity, while both are distinct from verbal creativity. Furthermore, our findings demonstrate that style predicts the corresponding dimension of creativity even after removing shared variance between style and visualization ability. The results suggest that styles might be a more ecologically valid construct in predicting real-life creative behaviour, such as performance in different professional domains. © 2013 The British Psychological Society.

  15. Provenance Storage, Querying, and Visualization in PBase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kianmajd, Parisa; Ludascher, Bertram; Missier, Paolo

    2015-01-01

    We present PBase, a repository for scientific workflows and their corresponding provenance information that facilitates the sharing of experiments among the scientific community. PBase is interoperable since it uses ProvONE, a standard provenance model for scientific workflows. Workflows and traces are stored in RDF, and with the support of SPARQL and the tree cover encoding, the repository provides a scalable infrastructure for querying the provenance data. Furthermore, through its user interface, it is possible to: visualize workflows and execution traces; visualize reachability relations within these traces; issue SPARQL queries; and visualize query results.
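
    The tree cover encoding mentioned above can be illustrated with the classic DFS interval-labeling scheme, under which reachability within a tree-shaped trace reduces to interval containment. The toy trace below is invented, and PBase's actual encoding (which handles general graphs via a covering tree) is more involved:

```python
def label(tree, root):
    """Assign DFS intervals: u reaches v iff interval(u) contains interval(v)."""
    intervals, counter = {}, [0]

    def dfs(u):
        start = counter[0]
        counter[0] += 1
        for child in tree.get(u, []):
            dfs(child)
        intervals[u] = (start, counter[0])
        counter[0] += 1

    dfs(root)
    return intervals

def reaches(intervals, u, v):
    """Constant-time ancestor test by interval containment."""
    (a, b), (c, d) = intervals[u], intervals[v]
    return a <= c and d <= b

# Invented provenance trace: a workflow with two steps, one producing output.
trace = {"workflow": ["step1", "step2"], "step2": ["output"]}
iv = label(trace, "workflow")
```

    The payoff is that a SPARQL engine (or any query layer) can answer "is this artifact derived from that workflow run?" with two integer comparisons instead of a graph traversal.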

  16. Earthscape, a Multi-Purpose Interactive 3d Globe Viewer for Hybrid Data Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.

    2015-08-01

    The hybrid visualization and interaction tool EarthScape is presented here. The software is able to display simultaneously LiDAR point clouds, draped videos with moving footprint, volume scientific data (using volume rendering, isosurface and slice plane), raster data such as still satellite images, vector data and 3D models such as buildings or vehicles. The application runs on touch screen devices such as tablets. The software is based on open source libraries, such as OpenSceneGraph, osgEarth and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multisource geo-referenced video fluxes. Once all these components are included, EarthScape will be a multi-purpose platform providing data analysis, hybrid visualization, and complex interactions at the same time. The software is available on demand for free at france@exelisvis.com.

  17. Characterization of the peer review network at the Center for Scientific Review, National Institutes of Health.

    PubMed

    Boyack, Kevin W; Chen, Mei-Ching; Chacko, George

    2014-01-01

    The National Institutes of Health (NIH) is the largest source of funding for biomedical research in the world. This funding is largely effected through a competitive grants process. Each year the Center for Scientific Review (CSR) at NIH manages the evaluation, by peer review, of more than 55,000 grant applications. A relevant management question is how this scientific evaluation system, supported by finite resources, could be continuously evaluated and improved for maximal benefit to the scientific community and the taxpaying public. Towards this purpose, we have created the first system-level description of peer review at CSR by applying text analysis, bibliometric, and graph visualization techniques to administrative records. We identify otherwise latent relationships across scientific clusters, which in turn suggest opportunities for structural reorganization of the system based on expert evaluation. Such studies support the creation of monitoring tools and provide transparency and knowledge to stakeholders.

  18. DIY visualizations: opportunities for story-telling with esri tools

    Treesearch

    Charles H. Perry; Barry T. Wilson

    2015-01-01

    The Forest Service and Esri recently entered into a partnership: (1) to distribute FIA and other Forest Service data with the public and stakeholders through ArcGIS Online, and (2) to facilitate the application of the ArcGIS platform within the Forest Service to develop forest management and landscape management plans, and support their scientific research activities....

  19. The Relevance of Feenberg's Critical Theory of Technology to Critical Visual Literacy: The Case of Scientific and Technical Illustrations

    ERIC Educational Resources Information Center

    Northcut, Kathryn M.

    2007-01-01

    Andrew Feenberg's critical theory of technology is an underutilized, relatively unknown resource in technical communication which could be exploited not only for its potential clarification of large social issues that involve our discipline, but also specifically toward the development of a critical theory of illustrations. Applications of…

  20. High Performance Visualization using Query-Driven Visualization and Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach to high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. These new capabilities, akin to finding needles in haystacks, result from combining technologies from the fields of scientific visualization and scientific data management. The approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network-traffic-analysis problem.
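    The abstract does not include code, but the core idea of query-driven visualization (evaluate a compound query over the raw records first, then hand only the matches to the rendering stage) can be sketched in a few lines. A minimal NumPy illustration, using hypothetical traffic fields (duration, bytes, port) in place of the article's actual data:

    ```python
    import numpy as np

    # Synthetic network-traffic records: one (duration, bytes, port) per connection.
    rng = np.random.default_rng(0)
    n = 1_000_000
    duration = rng.exponential(2.0, n)    # connection duration, seconds
    nbytes = rng.lognormal(8.0, 2.0, n)   # bytes transferred
    port = rng.integers(0, 65536, n)      # destination port

    # Query-driven selection: evaluate the compound predicate over all records,
    # then pass only the matching subset to the visualization stage.
    mask = (duration > 30.0) & (nbytes > 1e6) & (port == 22)
    hits = np.flatnonzero(mask)

    # Only `hits` (the "needles") would be rendered, not all n records.
    print(f"selected {hits.size} of {n} records for rendering")
    ```

    The payoff is that downstream visualization cost scales with the query result, not with the raw data size.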

  1. NASA's Global Imagery Browse Services - Technologies for Visualizing Earth Science Data

    NASA Astrophysics Data System (ADS)

    Cechini, M. F.; Boller, R. A.; Baynes, K.; Schmaltz, J. E.; Thompson, C. K.; Roberts, J. T.; Rodriguez, J.; Wong, M. M.; King, B. A.; King, J.; De Luca, A. P.; Pressley, N. N.

    2017-12-01

    For more than 20 years, the NASA Earth Observing System (EOS) has collected earth science data for thousands of scientific parameters, now totaling nearly 15 petabytes of data. In 2013, NASA's Global Imagery Browse Services (GIBS) formed its vision to "transform how end users interact and discover [EOS] data through visualizations." This vision included leveraging scientific and community best practices and standards to provide a scalable, compliant, and authoritative source for EOS earth science data visualizations. Since that time, GIBS has grown quickly and now services millions of daily requests for over 500 imagery layers representing hundreds of earth science parameters to a broad community of users. For many of these parameters, visualizations are available within hours of acquisition from the satellite. For others, visualizations are available for the entire mission of the satellite. The GIBS system is built upon the OnEarth and MRF open source software projects, which are provided by the GIBS team. This software facilitates standards-based access for compliance with existing GIS tools. The GIBS imagery layers are predominantly rasterized images represented in two-dimensional coordinate systems, though multiple projections are supported. The OnEarth software also supports the GIBS ingest pipeline to facilitate low-latency updates to new or updated visualizations. This presentation will focus on the following topics: an overview of GIBS visualizations and the user community; current benefits and limitations of the OnEarth and MRF software projects and related standards; GIBS access methods and their (in)compatibilities with existing GIS libraries and applications; considerations for visualization accuracy and understandability; and future plans for more advanced visualization concepts (including vertical profiles and vector-based representations) and for Amazon Web Services support and deployments.

  2. Novel Scientific Visualization Interfaces for Interactive Information Visualization and Sharing

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-12-01

    As geoscientists are confronted with increasingly massive datasets, from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize, and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data and modify parameters to create custom views, gaining insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge-discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component of comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools in the Iowa Flood Information System (IFIS), developed in light of these challenges. The IFIS is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, short-term and seasonal flood forecasts, and other flood-related data for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return-period values, and to flooding scenarios with contributions from multiple rivers. Real-time and historical data on water levels, gauge heights, and rainfall conditions are available in the IFIS. 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities in advance to help minimize flood damage.

  3. Biomedical ontologies: toward scientific debate.

    PubMed

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  4. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
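    The DICE environment itself is not publicly available, but the workload pattern the report describes, farming independent pieces of a scientific computation out to cooperating machines, can be sketched in miniature. A sketch using a thread pool on one machine as a stand-in for workstations cooperating over a LAN (the problem, matrix sizes, and worker count are illustrative choices, not from the report):

    ```python
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def solve_system(seed):
        """One unit of work: solve a dense, well-conditioned linear system
        Ax = b, mirroring the linear-equation experiments in the report."""
        rng = np.random.default_rng(seed)
        a = rng.standard_normal((200, 200)) + 200.0 * np.eye(200)  # diagonally dominant
        b = rng.standard_normal(200)
        x = np.linalg.solve(a, b)
        return float(np.linalg.norm(a @ x - b))  # residual, near zero if solved

    # Distribute independent solves across workers; on a real cluster the
    # pool would be replaced by remote workstations on a local area network.
    with ThreadPoolExecutor(max_workers=4) as pool:
        residuals = list(pool.map(solve_system, range(8)))
    ```

    The aggregate-power argument in the report is exactly this: each unit of work fits on one modest machine, and the cluster's value comes from running many of them concurrently.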

  5. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    NASA Astrophysics Data System (ADS)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that analyze the data as it loads and suggest the most appropriate visualizations based on its statistics. Because only a few visualizations suit any given dataset well, and because many users never go beyond default values, it is imperative to provide smart default color schemes tailored to the dataset rather than static defaults. The Smart APIs provide multiple functions for automating visualizations, along with UI elements that let users create more than one visualization for a dataset, since there is no single best way to visualize it. Because bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables, letting the user choose the visualization most appropriate for their presentation. The methods used in these APIs and the renderers they generate are not available elsewhere. The presentation will show how statistics can serve as the basis for automated default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components that instantaneously alter visualizations allows users to unearth previously unknown spatial patterns among one or more variables. These applications may focus on a single, frequently updated dataset, or be configurable for a variety of datasets from multiple sources.
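    The Smart APIs described here belong to the Esri platform, but the statistical idea behind them, deriving a continuous color ramp from the data's mean and standard deviation so that outliers are clamped to the ramp ends rather than stretching the whole ramp, can be sketched independently. A minimal illustration (the function names are hypothetical, not the actual API):

    ```python
    import numpy as np

    def smart_ramp_stops(values, num_std=1.0):
        """Derive continuous color-ramp stops from the data's statistics,
        spanning mean +/- num_std standard deviations so that outliers are
        clamped to the ramp ends instead of stretching the whole ramp."""
        v = np.asarray(values, dtype=float)
        lo = v.mean() - num_std * v.std()
        hi = v.mean() + num_std * v.std()
        return max(lo, float(v.min())), min(hi, float(v.max()))

    def normalize(values, stops):
        """Map values to 0..1 positions along the ramp, clipping outliers."""
        lo, hi = stops
        return np.clip((np.asarray(values, dtype=float) - lo) / (hi - lo), 0.0, 1.0)

    # One extreme outlier should not wash out the rest of the ramp.
    data = np.concatenate([np.random.default_rng(1).normal(50, 5, 1000), [500.0]])
    stops = smart_ramp_stops(data)
    t = normalize(data, stops)  # 0..1 positions along the color ramp
    ```

    Compared with a static min-max default, the bulk of the data still spans most of the ramp, and the outlier simply saturates at the top stop.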

  6. Using Scientific Visualizations to Enhance Scientific Thinking In K-12 Geoscience Education

    NASA Astrophysics Data System (ADS)

    Robeck, E.

    2016-12-01

    The same scientific visualizations, animations, and images that are powerful tools for geoscientists can serve an important role in K-12 geoscience education by encouraging students to communicate in ways that help them develop habits of thought that are similar to those used by scientists. Resources such as those created by NASA's Scientific Visualization Studio (SVS), which are intended to inform researchers and the public about NASA missions, can be used in classrooms to promote thoughtful, engaged learning. Instructional materials that make use of those visualizations have been developed and are being used in K-12 classrooms in ways that demonstrate the vitality of the geosciences. For example, the Center for Geoscience and Society at the American Geosciences Institute (AGI) helped to develop a publication that outlines an inquiry-based approach to introducing students to the interpretation of scientific visualizations, even when they have had little to no prior experience with such media. To facilitate these uses, the SVS team worked with Center staff and others to adapt the visualizations, primarily by removing most of the labels and annotations. Engaging with these visually compelling resources serves as an invitation for students to ask questions, interpret data, draw conclusions, and make use of other processes that are key components of scientific thought. This presentation will share specific resources for K-12 teaching (all of which are available online, from NASA, and/or from AGI), as well as the instructional principles that they incorporate.

  7. Scientific Process Flowchart Assessment (SPFA): A Method for Evaluating Changes in Understanding and Visualization of the Scientific Process in a Multidisciplinary Student Population

    ERIC Educational Resources Information Center

    Wilson, Kristy J.; Rigakos, Bessie

    2016-01-01

    The scientific process is nonlinear, unpredictable, and ongoing. Assessing the nature of science is difficult with methods that rely on Likert-scale or multiple-choice questions. This study evaluated conceptions about the scientific process using student-created visual representations that we term "flowcharts." The methodology,…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Song

    CFD (computational fluid dynamics) is a widely used technique in the engineering design field. It uses mathematical methods to simulate and predict flow characteristics in a given physical space. Because the numerical results of CFD computation are very hard to understand, VR (virtual reality) and data visualization techniques are introduced into CFD post-processing to improve the understandability and usefulness of the computation. In many cases CFD datasets are very large (multiple gigabytes), and more and more interaction between the user and the datasets is required. For traditional VR applications, limited computing power is a major barrier to visualizing large datasets effectively. This thesis presents a new system designed to speed up traditional VR applications by using parallel and distributed computing, along with the idea of using handheld devices to enhance the interaction between a user and a VR CFD application. Techniques from several research areas, including scientific visualization, parallel computing, distributed computing, and graphical user interface design, are used in the development of the final system. As a result, the new system can be flexibly built on a heterogeneous computing environment and dramatically shortens the computation time.

  9. NOAA's Science On a Sphere Education Program: Application of a Scientific Visualization System to Teach Earth System Science and Improve our Understanding About Creating Effective Visualizations

    NASA Astrophysics Data System (ADS)

    McDougall, C.; McLaughlin, J.

    2008-12-01

    NOAA has developed several programs aimed at facilitating the use of earth system science data and data visualizations by formal and informal educators. One of them, Science On a Sphere (SOS), is a visualization display system that uses networked LCD projectors to display animated global datasets onto the outside of a suspended, 1.7-meter-diameter opaque sphere, enabling science centers, museums, and universities to display real-time and current earth system science data. NOAA's Office of Education has provided grants to such education institutions to develop exhibits featuring SOS, create content for them, and evaluate audience impact. Currently, 20 public education institutions have permanent SOS exhibits, and 6 more exhibits will be installed soon. These institutions and others that are working to create and evaluate content for this system work collaboratively as a network to improve our collective knowledge about how to create educationally effective visualizations. Network members include other federal agencies, such as NASA and the Dept. of Energy, and major museums such as the Smithsonian and the American Museum of Natural History, as well as a variety of mid-sized and small museums and universities. Although the audiences in these institutions vary widely in their scientific awareness and understanding, we find there are misconceptions and a lack of familiarity with viewing visualizations that are common among the audiences. Through evaluations performed in these institutions we continue to evolve our understanding of how to create content that is understandable by those with minimal scientific literacy. The findings from our network will be presented, including the importance of providing context, real-world connections, and imagery to accompany the visualizations, and the need for audience orientation before the visualizations are viewed.
Additionally, we will review the publicly accessible virtual library housing over 200 datasets for SOS and any other real or virtual globe. These datasets represent contributions from NOAA, NASA, Dept. of Energy, and the public institutions that are displaying the spheres.

  10. Modern Scientific Visualization is more than Just Pretty Pictures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E Wes; Rubel, Oliver; Wu, Kesheng

    2008-12-05

    While the primary products of scientific visualization are images and movies, its primary objective is really scientific insight. Too often, the focus of visualization research is on the product, not the mission. This paper presents two case studies, both of which appear in previous publications, that focus on using visualization technology to produce insight. The first applies "Query-Driven Visualization" concepts to laser-wakefield simulation data to help identify and analyze the process of beam formation. The second uses topological analysis to provide a quantitative basis for (i) understanding the mixing process in hydrodynamic simulations and (ii) performing comparative analysis of data from two different types of simulations that model hydrodynamic instability.

  11. Making data matter: Voxel printing for the digital fabrication of data across scales and domains.

    PubMed

    Bader, Christoph; Kolb, Dominik; Weaver, James C; Sharma, Sunanda; Hosny, Ahmed; Costa, João; Oxman, Neri

    2018-05-01

    We present a multimaterial voxel-printing method that enables the physical visualization of data sets commonly associated with scientific imaging. Leveraging voxel-based control of multimaterial three-dimensional (3D) printing, our method enables additive manufacturing of discontinuous data types such as point cloud data, curve and graph data, image-based data, and volumetric data. By converting data sets into dithered material deposition descriptions, through modifications to rasterization processes, we demonstrate that data sets frequently visualized on screen can be converted into physical, materially heterogeneous objects. Our approach alleviates the need to postprocess data sets to boundary representations, preventing alteration of data and loss of information in the produced physicalizations. Therefore, it bridges the gap between digital information representation and physical material composition. We evaluate the visual characteristics and features of our method, assess its relevance and applicability in the production of physical visualizations, and detail the conversion of data sets for multimaterial 3D printing. We conclude with exemplary 3D-printed data sets produced by our method pointing toward potential applications across scales, disciplines, and problem domains.
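    The paper's pipeline converts continuous data into dithered material-deposition descriptions. The authors' actual toolchain is not reproduced here, but the underlying error-diffusion idea can be sketched for a single grayscale slice; Floyd-Steinberg dithering is used as an illustrative choice (the paper does not specify this particular algorithm):

    ```python
    import numpy as np

    def dither_slice(slice2d):
        """Floyd-Steinberg error-diffusion dither of one grayscale slice
        (values in [0, 1]) into a binary material-deposition mask, so the
        local density of 'on' voxels approximates the local data value."""
        img = np.asarray(slice2d, dtype=float).copy()
        h, w = img.shape
        out = np.zeros((h, w), dtype=bool)
        for y in range(h):
            for x in range(w):
                old = img[y, x]
                new = old >= 0.5             # quantize to deposit / no deposit
                out[y, x] = new
                err = old - float(new)       # quantization error
                if x + 1 < w:                # diffuse error to neighbors
                    img[y, x + 1] += err * 7 / 16
                if y + 1 < h:
                    if x > 0:
                        img[y + 1, x - 1] += err * 3 / 16
                    img[y + 1, x] += err * 5 / 16
                    if x + 1 < w:
                        img[y + 1, x + 1] += err * 1 / 16
        return out

    # A smooth gradient dithers to a mask whose mean density tracks the data,
    # rather than thresholding to a hard boundary representation.
    grad = np.tile(np.linspace(0, 1, 64), (64, 1))
    mask = dither_slice(grad)
    ```

    This is what lets the method skip boundary representations entirely: the continuous values survive as local deposition density rather than being cut at an isosurface.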

  12. EDITORIAL: Focus on Visualization in Physics FOCUS ON VISUALIZATION IN PHYSICS

    NASA Astrophysics Data System (ADS)

    Sanders, Barry C.; Senden, Tim; Springel, Volker

    2008-12-01

    Advances in physics are intimately connected with developments in new technology: the telescope, precision clocks, even the computer have all heralded a shift in thinking. These landmark developments open new opportunities, accelerating research and in turn setting new scientific directions. These technological drivers often correspond to new instruments, but might equally flag a new mathematical tool, an algorithm, or even a means to visualize physics in a new way. Early in this twenty-first century, scientific communities are just starting to explore the potential of digital visualization. Whether visualization is used to represent and communicate complex concepts, to understand and interpret experimental data, or to visualize solutions to complex dynamical equations, the basic tools of visualization are shared across these applications and implementations. High-performance computing exemplifies the integration of visualization with leading research. Visualization is an indispensable tool for analyzing and interpreting complex three-dimensional dynamics, as well as for diagnosing numerical problems in intricate parallel calculation algorithms. The effectiveness of visualization arises from exploiting the unmatched capability of the human eye and visual cortex to process the large information content of images. In a brief glance, we recognize patterns or identify subtle features even in noisy data, something that is difficult or impossible to achieve with more traditional forms of data analysis. Importantly, visualizations guide the intuition of researchers and help them comprehend physical phenomena that lie far outside direct experience. In fact, visualizations literally allow us to see what would otherwise remain completely invisible. 
For example, artificial imagery created to visualize the distribution of dark matter in the Universe has been instrumental in developing the notion of a cosmic web, and in helping to establish the current standard model of cosmology, wherein this (in principle invisible) dark matter dominates the cosmic matter content. The advantages of visualization found for simulated data hold for real-world data as well. With the spread of computerized acquisition, many scientific disciplines are witnessing exponential growth in the volume of accumulated raw data, which often makes it daunting to condense the information into a manageable form, a challenge that modern visualization techniques can address. Such visualizations are also often an enticing way to communicate scientific results to the general public, which matters especially in basic science, with its reliance on a benevolent and interested public whose support drives the demand for high-quality visualizations. Despite the widespread use of visualization, this technology has suffered from a lack of the unifying influence of shared common experiences: as with any emerging technology, practitioners have often independently found solutions to similar problems. It is the aim of this focus issue to celebrate the importance of visualization, report on its growing use by the broad community of physicists, including those in biophysics, chemical physics, geophysics, astrophysics, and medical physics, and provide an opportunity for the diverse community of scientists using visualization to share their work in one issue of a journal that is itself in the vanguard of supporting visualization and multimedia. A remarkable breadth and diversity of visualization in physics is to be found in this issue, spanning fundamental aspects of relativity theory to computational fluid dynamics, and length scales from quantum phenomena to the entire observable Universe. 
We have been impressed by the quality of the submissions and hope that this snapshot will introduce, inform, motivate and maybe even help to unify visualization in physics. Readers are also directed to the December issue of Physics World, which includes the following features highlighting work in this collection and other novel uses of visualization techniques:

    'A feast of visualization', Physics World December 2008, pp 20-23
    'Seeing the quantum world' by Barry Sanders, Physics World December 2008, pp 24-27
    'A picture of the cosmos' by Mark SubbaRao and Miguel Aragon-Calvo, Physics World December 2008, pp 29-32
    'Thinking outside the cube' by César A Hidalgo, Physics World December 2008, pp 34-37

    Focus on Visualization in Physics Contents:

    Visualization of spiral and scroll waves in simulated and experimental cardiac tissue (E M Cherry and F H Fenton)
    Visualization of large scale structure from the Sloan Digital Sky Survey (M U SubbaRao, M A Aragón-Calvo, H W Chen, J M Quashnock, A S Szalay and D G York)
    How computers can help us in creating an intuitive access to relativity (Hanns Ruder, Daniel Weiskopf, Hans-Peter Nollert and Thomas Müller)
    Lagrangian particle tracking in three dimensions via single-camera in-line digital holography (Jiang Lu, Jacob P Fugal, Hansen Nordsiek, Ewe Wei Saw, Raymond A Shaw and Weidong Yang)
    Quantifying spatial heterogeneity from images (Andrew E Pomerantz and Yi-Qiao Song)
    Disaggregation and scientific visualization of earthscapes considering trends and spatial dependence structures (S Grunwald)
    Strength through structure: visualization and local assessment of the trabecular bone structure (C Räth, R Monetti, J Bauer, I Sidorenko, D Müller, M Matsuura, E-M Lochmüller, P Zysset and F Eckstein)
    Thermonuclear supernovae: a multi-scale astrophysical problem challenging numerical simulations and visualization (F K Röpke and R Bruckschen)
    Visualization needs and techniques for astrophysical simulations (W Kapferer and T Riser)
    Flow visualization and field line advection in computational fluid dynamics: application to magnetic fields and turbulent flows (Pablo Mininni, Ed Lee, Alan Norton and John Clyne)
    Splotch: visualizing cosmological simulations (K Dolag, M Reinecke, C Gheller and S Imboden)
    Visualizing a silicon quantum computer (Barry C Sanders, Lloyd C L Hollenberg, Darran Edmundson and Andrew Edmundson)
    Colliding galaxies, rotating neutron stars and merging black holes—visualizing high dimensional datasets on arbitrary meshes (Werner Benger)
    A low complexity visualization tool that helps to perform complex systems analysis (M G Beiró, J I Alvarez-Hamelin and J R Busch)
    Visualizing astrophysical N-body systems (John Dubinski)

  13. The Rising Landscape: A Visual Exploration of Superstring Revolutions in Physics.

    ERIC Educational Resources Information Center

    Chen, Chaomei; Kuljis, Jasna

    2003-01-01

    Discussion of knowledge domain visualization focuses on practical issues concerning modeling and visualizing scientific revolutions. Studies growth patterns of specialties derived from citation and cocitation data on string theory in physics, using the general framework of Thomas Kuhn's structure of scientific revolutions. (Author/LRW)

  14. Application-Controlled Demand Paging for Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Cox, Michael; Ellsworth, David; Kutler, Paul (Technical Monitor)

    1997-01-01

    In the area of scientific visualization, input data sets are often very large. In visualization of computational fluid dynamics (CFD) in particular, input data sets today can surpass 100 Gbytes, and are expected to scale with the ability of supercomputers to generate them. Some visualization tools already partition large data sets into segments, and load appropriate segments as they are needed. However, this does not remove the problem, for two reasons: 1) there are data sets for which even the individual segments are too large for the largest graphics workstations, and 2) many practitioners do not have access to workstations with the memory capacity required to load even a segment, especially since state-of-the-art visualization tools tend to be developed by researchers with much more powerful machines. When the size of the data that must be accessed is larger than the size of memory, some form of virtual memory is simply required. This may be by segmentation, by paging, or by paged segments. In this paper we demonstrate that complete reliance on operating-system virtual memory for out-of-core visualization leads to poor performance. We then describe a paged-segment system that we have implemented, and explore the principles of memory management that can be employed by the application for out-of-core visualization. We show that application control over some of these can significantly improve performance. We show that sparse traversal can be exploited by loading only those data actually required. We also show that application control over data loading can be exploited by 1) loading data from alternative storage formats (in particular, 3-dimensional data stored in sub-cubes), and 2) controlling the page size. Both of these techniques effectively reduce the total memory required by visualization at run time. We also describe experiments we have done on remote out-of-core visualization (when pages are read on demand from remote disk), whose results are promising.
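    The paper's paged-segment system is not reproduced here, but its central mechanism, loading fixed-size sub-cubes only when a traversal touches them and bounding resident memory with an eviction policy, can be sketched. A minimal sketch with LRU eviction (an illustrative policy choice on my part; the loader below is a stand-in for reading a sub-cube from local or remote disk):

    ```python
    from collections import OrderedDict
    import numpy as np

    class SubCubeCache:
        """Application-controlled demand paging sketch: a large 3D volume is
        stored as fixed-size sub-cubes, and only the sub-cubes a traversal
        actually touches are loaded, with LRU eviction bounding memory use."""
        def __init__(self, loader, max_pages):
            self.loader = loader          # maps a sub-cube key -> ndarray
            self.max_pages = max_pages
            self.pages = OrderedDict()    # LRU order: oldest first
            self.loads = 0                # count of actual disk loads

        def get(self, key):
            if key in self.pages:
                self.pages.move_to_end(key)         # mark most recently used
            else:
                self.pages[key] = self.loader(key)  # demand load on first touch
                self.loads += 1
                if len(self.pages) > self.max_pages:
                    self.pages.popitem(last=False)  # evict least recently used
            return self.pages[key]

    # Hypothetical loader standing in for reading a 16^3 sub-cube from disk.
    def load_subcube(key):
        return np.full((16, 16, 16), fill_value=hash(key) % 97, dtype=np.float32)

    cache = SubCubeCache(load_subcube, max_pages=4)
    for key in [(0, 0, 0), (0, 0, 1), (0, 0, 0), (0, 1, 0)]:  # sparse traversal
        block = cache.get(key)
    ```

    A sparse traversal like the one above performs three loads for four accesses, and resident memory never exceeds `max_pages` sub-cubes regardless of total volume size.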

  15. Scientific Process Flowchart Assessment (SPFA): A Method for Evaluating Changes in Understanding and Visualization of the Scientific Process in a Multidisciplinary Student Population.

    PubMed

    Wilson, Kristy J; Rigakos, Bessie

    The scientific process is nonlinear, unpredictable, and ongoing. Assessing the nature of science is difficult with methods that rely on Likert-scale or multiple-choice questions. This study evaluated conceptions about the scientific process using student-created visual representations that we term "flowcharts." The methodology, Scientific Process Flowchart Assessment (SPFA), consisted of a prompt and rubric that was designed to assess students' understanding of the scientific process. Forty flowcharts representing a multidisciplinary group without intervention and 26 flowcharts representing pre- and postinstruction were evaluated over five dimensions: connections, experimental design, reasons for doing science, nature of science, and interconnectivity. Pre to post flowcharts showed a statistically significant improvement in the number of items and ratings for the dimensions. Comparison of the terms used and connections between terms on student flowcharts revealed an enhanced and more nuanced understanding of the scientific process, especially in the areas of application to society and communication within the scientific community. We propose that SPFA can be used in a variety of circumstances, including in the determination of what curricula or interventions would be useful in a course or program, in the assessment of curriculum, or in the evaluation of students performing research projects. © 2016 K. J. Wilson and B. Rigakos. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows

    NASA Technical Reports Server (NTRS)

    McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush

    2004-01-01

    With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what scientists need is an environment that will allow them to specify their application runs at a high organizational level, and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project, the visual editor and the runtime workflow engine.

  17. Narratives in Mind and Media: A Cognitive Semiotic Account of Novices Interpreting Visual Science Media

    NASA Astrophysics Data System (ADS)

    Matuk, Camillia Faye

    Visual representations are central to expert scientific thinking. Meanwhile, novices tend toward narrative conceptions of scientific phenomena. Until recently, however, relationships between visual design, narrative thinking, and their impacts on learning science have only been theoretically pursued. This dissertation first synthesizes different disciplinary perspectives, then offers a mixed-methods investigation into interpretations of scientific representations. Finally, it considers design issues associated with narrative and visual imagery, and explores the possibilities of a pedagogical notation to scaffold the understanding of a standard scientific notation. Throughout, I distinguish two categories of visual media by their relation to narrative: narrative visual media, which convey content via narrative structure, and conceptual visual media, which convey states of relationships among objects. Given the role of narrative in framing conceptions of scientific phenomena and perceptions of its representations, I suggest that novices are especially prone to construe both kinds of media in narrative terms. To illustrate, I first describe how novices make meaning of the science conveyed in narrative visual media. Vignettes of an undergraduate student's interpretation of a cartoon about natural selection and of four 13-year-olds' readings of a comic book about human papillomavirus infection together demonstrate conditions under which designed visual narrative elements facilitate or hinder understanding. I next consider the interpretation of conceptual visual media with an example of an expert notation from evolutionary biology, the cladogram. 
By combining clinical interview methods with experimental design, I show how undergraduate students' narrative theories of evolution frame perceptions of the diagram (Study 1); I demonstrate the flexibility of symbolic meaning, both with the content assumed (Study 2A), and with alternate manners of presenting the diagram (Study 2B); finally, I show the effects of content assumptions on the diagrams students invent of phylogenetic data (Study 3A), and how first inventing a diagram influences later interpretations of the standard notation (Study 3B). Lastly, I describe the prototype design and pilot test of an interactive diagram to scaffold biology students' understanding of this expert scientific notation. Insights from this dissertation inform the design of more pedagogically useful representations that might support students' developing fluency with expert scientific representations.

  18. WebViz: A Web-Based Collaborative Interactive Visualization System for Large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily in the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote, web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations ranging from astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web and JavaScript-enabled cell phones. 
Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control of aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface of the web application. These features are all in addition to a full range of essential visualization functions including 3-D camera and object orientation, position manipulation, time-stepping control, and custom color/alpha mapping.

  19. Scientific Notation Watercolor

    ERIC Educational Resources Information Center

    Linford, Kyle; Oltman, Kathleen; Daisey, Peggy

    2016-01-01

    (Purpose) The purpose of this paper is to describe visual literacy, an adapted version of Visual Thinking Strategy (VTS), and an art-integrated middle school mathematics lesson about scientific notation. The intent of this lesson was to provide students with a real life use of scientific notation and exponents, and to motivate them to apply their…
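The concept the lesson builds on, writing a number as a coefficient times a power of ten, can be sketched in a few lines of Python (an illustration of the notation itself, not material from the lesson; the function name is ours):

```python
import math

def to_scientific(x):
    """Return (coefficient, exponent) so that x == coefficient * 10**exponent,
    with 1 <= |coefficient| < 10 for nonzero x."""
    if x == 0:
        return 0.0, 0
    exponent = math.floor(math.log10(abs(x)))
    coefficient = x / 10 ** exponent
    return coefficient, exponent

# A real-life quantity the lesson might use: Earth-Sun distance in meters.
coeff, exp = to_scientific(149_600_000_000)
print(f"{coeff} x 10^{exp}")  # 1.496 x 10^11
```

The same idea runs in reverse for small numbers: a negative exponent shifts the decimal point left, so `to_scientific(-0.0042)` yields a coefficient of -4.2 and an exponent of -3.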

  20. Program Supports Scientific Visualization

    NASA Technical Reports Server (NTRS)

    Keith, Stephan

    1994-01-01

    Primary purpose of General Visualization System (GVS) computer program is to support scientific visualization of data generated by panel-method computer program PMARC_12 (inventory number ARC-13362) on a Silicon Graphics Iris workstation. It enables the user to view PMARC geometries and wakes as wire frames or as light-shaded objects. GVS is written in C language.

  1. Using Scientific Visualization to Represent Soil Hydrology Dynamics

    ERIC Educational Resources Information Center

    Dolliver, H. A. S.; Bell, J. C.

    2006-01-01

    Understanding the relationships between soil, landscape, and hydrology is important for making sustainable land management decisions. In this study, scientific visualization was explored as a means to visually represent the complex spatial and temporal variations in the hydrologic status of soils. Soil hydrology data was collected at seven…

  2. Visualizing the Heliosphere

    NASA Technical Reports Server (NTRS)

    Bridgman, William T.; Shirah, Greg W.; Mitchell, Horace G.

    2008-01-01

    Today, scientific data and models can combine with modern animation tools to produce compelling visualizations to inform and educate. The Scientific Visualization Studio at Goddard Space Flight Center merges these techniques from the very different worlds of entertainment and science to enable scientists and the general public to 'see the unseeable' in new ways.

  3. Scientific Visualization: A Synthesis of Historical Data.

    ERIC Educational Resources Information Center

    Polland, Mark

    Visualization is the process by which one is able to create and sustain mental images for observation, analysis, and experimentation. This study consists of a compilation of evidence from historical examples that were collected in order to document the importance and the uses of visualization within the realm of scientific investigation.…

  4. Laboratory x-ray micro-computed tomography: a user guideline for biological samples

    PubMed Central

    2017-01-01

    Abstract Laboratory x-ray micro–computed tomography (micro-CT) is a fast-growing method in scientific research applications that allows for non-destructive imaging of morphological structures. This paper provides an easily operated “how to” guide for new potential users and describes the various steps required for successful planning of research projects that involve micro-CT. Background information on micro-CT is provided, followed by relevant setup, scanning, reconstructing, and visualization methods and considerations. Throughout the guide, a Jackson's chameleon specimen, which was scanned at different settings, is used as an interactive example. The ultimate aim of this paper is to make new users familiar with the concepts and applications of micro-CT in an attempt to promote its use in future scientific studies. PMID:28419369

  5. The Cloud-Based Integrated Data Viewer (IDV)

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2015-04-01

    Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there are a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution to this problem, which was not previously practical on a large scale: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.

  6. Anthropological film: a scientific and humanistic resource.

    PubMed

    Soren, E R

    1974-12-20

    More than a scientific endeavor but not strictly one of the humanities either, anthropology stands between these basic kinds of intellectual pursuit, bridging and contributing to both. Not limited to natural history, anthropology touches art, historical process, and human values, drawing from the materials and approaches of both science and humanities. This professional interest in a broad understanding of the human condition has led anthropologists to adapt and use modern cameras and films to inquire further into the variety of ways of life of mankind and to develop method and theory to prepare anthropological film as a permanent scientific and humanistic resource. Until quite recently the evolution of human culture and organization has diverged in the hitherto isolated regions of the world. Now this divergence has virtually ceased; we are witnessing an unprecedented period in human history, one where cultural divergence has turned to cultural convergence and where the varieties of independently evolved expressions of basic human potential are giving way to a single system of modern communications, transport, commerce, and manufacturing technology. Before the varieties of ways of life of the world disappear, they can be preserved in facsimile in anthropological films. As primary, undifferentiated visual information, these films facilitate that early step in the creation of new knowledge which is sometimes called humanistic and without which scientific application lies dormant, lacking an idea to test. In keeping with the two scholarly faces of anthropology, humanistic and scientific, anthropological films may provide material permitting both humanistic insight and the more controlled formulations of science. 
The lightweight filming equipment recently developed has been adapted by anthropologists as a tool of scholarly visual inquiry; methods of retrieving visual data from changing and vanishing ways of life have been developed; and new ways to reveal human beings to one another by using such visual resources have been explored. As a result, not only can anthropological film records permit continued reexamination of the past human conditions from which the present was shaped, but they also facilitate an ongoing public and scientific review of the dynamics of the human behavioral and social repertoire in relation to the contemporary conditions which pattern human responses and adaptation. How man fits into and copes with the changing world is of vital interest and concern. Visual data provide otherwise unobtainable information on human potential, behavior, and social organization. Such information, fed into the public media, facilitates informed consideration of alternative possibilities. By contributing to a better informed society, such films will help make our future more human and more humane.

  7. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enter the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques are being adopted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. 
Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them to better understand the nature of complex, multiscale geoscience data.

  8. Garbage Patch Visualization Experiment

    NASA Image and Video Library

    2015-08-20

    Goddard visualizers show us how five garbage patches formed in the world's oceans using 35 years of data. Read more: 1.usa.gov/1Lnj7xV Credit: NASA's Scientific Visualization Studio. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  9. USGS Scientific Visualization Laboratory

    USGS Publications Warehouse


    1995-01-01

    The U.S. Geological Survey's (USGS) Scientific Visualization Laboratory at the National Center in Reston, Va., provides a central facility where USGS employees can use state-of-the-art equipment for projects ranging from presentation graphics preparation to complex visual representations of scientific data. Equipment including color printers, black-and-white and color scanners, film recorders, video equipment, and DOS, Apple Macintosh, and UNIX platforms with software is available for both technical and nontechnical users. The laboratory staff provides assistance and demonstrations in the use of the hardware and software products.

  10. An e-learning application on electrochemotherapy

    PubMed Central

    Corovic, Selma; Bester, Janez; Miklavcic, Damijan

    2009-01-01

    Background Electrochemotherapy is an effective approach in local tumour treatment employing locally applied high-voltage electric pulses in combination with chemotherapeutic drugs. In planning and performing electrochemotherapy a multidisciplinary expertise is required and collaboration, knowledge and experience exchange among the experts from different scientific fields such as medicine, biology and biomedical engineering is needed. The objective of this study was to develop an e-learning application in order to provide the educational content on electrochemotherapy and its underlying principles and to support collaboration, knowledge and experience exchange among the experts involved in the research and clinics. Methods The educational content on electrochemotherapy and cell and tissue electroporation was based on previously published studies from molecular dynamics, lipid bilayers, single cell level and simplified tissue models to complex biological tissues and research and clinical results of electrochemotherapy treatment. We used computer graphics such as model-based visualization (i.e. 3D numerical modelling using finite element method) and 3D computer animations and graphical illustrations to facilitate the representation of complex biological and physical aspects in electrochemotherapy. The e-learning application is integrated into an interactive e-learning environment developed at our institution, enabling collaboration and knowledge exchange among the users. We evaluated the designed e-learning application at the International Scientific workshop and postgraduate course (Electroporation Based Technologies and Treatments). The evaluation was carried out by testing the pedagogical efficiency of the presented educational content and by performing the usability study of the application. Results The e-learning content presents three different levels of knowledge on cell and tissue electroporation. 
In the first part of the e-learning application we explain basic principles of electroporation process. The second part provides educational content about importance of modelling and visualization of local electric field in electroporation-based treatments. In the third part we developed an interactive module for visualization of local electric field distribution in 3D tissue models of cutaneous tumors for different parameters such as voltage applied, distance between electrodes, electrode dimension and shape, tissue geometry and electric conductivity. The pedagogical efficiency assessment showed that the participants improved their level of knowledge. The results of usability evaluation revealed that participants found the application simple to learn, use and navigate. The participants also found the information provided by the application easy to understand. Conclusion The e-learning application we present in this article provides educational material on electrochemotherapy and its underlying principles such as cell and tissue electroporation. The e-learning application is developed to provide an interactive educational content in order to simulate the "hands-on" learning approach about the parameters being important for successful therapy. The e-learning application together with the interactive e-learning environment is available to the users to provide collaborative and flexible learning in order to facilitate knowledge exchange among the experts from different scientific fields that are involved in electrochemotherapy. The modular structure of the application allows for upgrade with new educational content collected from the clinics and research, and can be easily adapted to serve as a collaborative e-learning tool also in other electroporation-based treatments such as gene electrotransfer, gene vaccination, irreversible tissue ablation and transdermal gene and drug delivery. 
The presented e-learning application provides an easy and rapid approach for information, knowledge and experience exchange among the experts from different scientific fields, which can facilitate development and optimisation of electroporation-based treatments. PMID:19843322
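The interactive module described above visualizes local electric field distributions computed with a finite element model. As a rough illustration of the underlying physics only (a finite-difference sketch, not the authors' finite-element model; grid size and electrode placement are arbitrary assumptions), the potential between two electrodes in a uniform 2D conductor can be approximated by relaxing Laplace's equation:

```python
import numpy as np

def laplace_potential(n=50, v_applied=1000.0, iters=2000):
    """Jacobi relaxation of the electric potential on an n x n grid with
    two vertical electrode strips held at +V and 0 V (Dirichlet conditions)."""
    phi = np.zeros((n, n))
    strip = slice(n // 3, 2 * n // 3)   # vertical extent of both electrodes
    for _ in range(iters):
        # Re-impose the fixed electrode potentials each sweep.
        phi[strip, 5] = v_applied       # energized electrode
        phi[strip, n - 6] = 0.0         # grounded electrode
        # Each interior point becomes the average of its four neighbors.
        phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                                  phi[1:-1, :-2] + phi[1:-1, 2:])
    phi[strip, 5] = v_applied
    phi[strip, n - 6] = 0.0
    # Electric field magnitude |E| = |grad phi| is the quantity visualized.
    ey, ex = np.gradient(-phi)
    return phi, np.hypot(ex, ey)

phi, e_mag = laplace_potential()
```

Varying `v_applied`, the electrode spacing, or the strip length mimics the treatment parameters the module lets users explore (applied voltage, electrode distance, electrode geometry), though real tissue adds heterogeneous conductivity that this sketch ignores.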

  11. An e-learning application on electrochemotherapy.

    PubMed

    Corovic, Selma; Bester, Janez; Miklavcic, Damijan

    2009-10-20

    Electrochemotherapy is an effective approach in local tumour treatment employing locally applied high-voltage electric pulses in combination with chemotherapeutic drugs. In planning and performing electrochemotherapy a multidisciplinary expertise is required and collaboration, knowledge and experience exchange among the experts from different scientific fields such as medicine, biology and biomedical engineering is needed. The objective of this study was to develop an e-learning application in order to provide the educational content on electrochemotherapy and its underlying principles and to support collaboration, knowledge and experience exchange among the experts involved in the research and clinics. The educational content on electrochemotherapy and cell and tissue electroporation was based on previously published studies from molecular dynamics, lipid bilayers, single cell level and simplified tissue models to complex biological tissues and research and clinical results of electrochemotherapy treatment. We used computer graphics such as model-based visualization (i.e. 3D numerical modelling using finite element method) and 3D computer animations and graphical illustrations to facilitate the representation of complex biological and physical aspects in electrochemotherapy. The e-learning application is integrated into an interactive e-learning environment developed at our institution, enabling collaboration and knowledge exchange among the users. We evaluated the designed e-learning application at the International Scientific workshop and postgraduate course (Electroporation Based Technologies and Treatments). The evaluation was carried out by testing the pedagogical efficiency of the presented educational content and by performing the usability study of the application. The e-learning content presents three different levels of knowledge on cell and tissue electroporation. 
In the first part of the e-learning application we explain basic principles of electroporation process. The second part provides educational content about importance of modelling and visualization of local electric field in electroporation-based treatments. In the third part we developed an interactive module for visualization of local electric field distribution in 3D tissue models of cutaneous tumors for different parameters such as voltage applied, distance between electrodes, electrode dimension and shape, tissue geometry and electric conductivity. The pedagogical efficiency assessment showed that the participants improved their level of knowledge. The results of usability evaluation revealed that participants found the application simple to learn, use and navigate. The participants also found the information provided by the application easy to understand. The e-learning application we present in this article provides educational material on electrochemotherapy and its underlying principles such as cell and tissue electroporation. The e-learning application is developed to provide an interactive educational content in order to simulate the "hands-on" learning approach about the parameters being important for successful therapy. The e-learning application together with the interactive e-learning environment is available to the users to provide collaborative and flexible learning in order to facilitate knowledge exchange among the experts from different scientific fields that are involved in electrochemotherapy. The modular structure of the application allows for upgrade with new educational content collected from the clinics and research, and can be easily adapted to serve as a collaborative e-learning tool also in other electroporation-based treatments such as gene electrotransfer, gene vaccination, irreversible tissue ablation and transdermal gene and drug delivery. 
The presented e-learning application provides an easy and rapid approach for information, knowledge and experience exchange among the experts from different scientific fields, which can facilitate development and optimisation of electroporation-based treatments.

  12. Scientific Computing for Chemists: An Undergraduate Course in Simulations, Data Processing, and Visualization

    ERIC Educational Resources Information Center

    Weiss, Charles J.

    2017-01-01

    The Scientific Computing for Chemists course taught at Wabash College teaches chemistry students to use the Python programming language, Jupyter notebooks, and a number of common Python scientific libraries to process, analyze, and visualize data. Assuming no prior programming experience, the course introduces students to basic programming and…
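A representative exercise of the kind such a course assigns (a hypothetical example with invented data, not taken from the Wabash syllabus) is extracting a rate constant from kinetics measurements with NumPy:

```python
import numpy as np

# Hypothetical first-order decay data: concentration of A vs. time.
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])       # seconds
conc = np.array([1.00, 0.61, 0.37, 0.22, 0.14, 0.082])  # mol/L

# For first-order kinetics, ln[A] = ln[A]0 - k*t, so fitting a line to
# ln(conc) versus t gives the rate constant k from the slope.
slope, intercept = np.polyfit(t, np.log(conc), 1)
k = -slope
half_life = np.log(2) / k
print(f"k = {k:.4f} 1/s, half-life = {half_life:.1f} s")
```

In a Jupyter notebook the same arrays would typically also be plotted (e.g. with Matplotlib) to confirm the linearized data fall on a straight line before trusting the fit.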

  13. Visual representation of scientific information.

    PubMed

    Wong, Bang

    2011-02-15

    Great technological advances have enabled researchers to generate an enormous amount of data. Data analysis is replacing data generation as the rate-limiting step in scientific research. With this wealth of information, we have an opportunity to understand the molecular causes of human diseases. However, the unprecedented scale, resolution, and variety of data pose new analytical challenges. Visual representation of data offers insights that can lead to new understanding, whether the purpose is analysis or communication. This presentation shows how art, design, and traditional illustration can enable scientific discovery. Examples will be drawn from the Broad Institute's Data Visualization Initiative, aimed at establishing processes for creating informative visualization models.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springmeyer, R R; Brugger, E; Cook, R

    The Data group provides data analysis and visualization support to its customers. This consists primarily of the development and support of VisIt, a data analysis and visualization tool. Support ranges from answering questions about the tool, providing classes on how to use the tool, and performing data analysis and visualization for customers. The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include applying visualization software for large scale data exploration; running video production labs on two networks; supporting graphics libraries and tools for end users; maintaining PowerWalls and assorted other displays; and developing software for searching and managing scientific data. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects including the development of visualization techniques for large scale data exploration that are funded by the ASC program, among others. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry. The IMG group is located in the Terascale Simulation Facility, home to Dawn, Atlas, BGL, and others, which includes both classified and unclassified visualization theaters, a visualization computer floor and deployment workshop, and video production labs. We continued to provide the traditional graphics group consulting and video production support. We maintained five PowerWalls and many other displays. We deployed a 576-node Opteron/IB cluster with 72 TB of memory providing a visualization production server on our classified network. We continue to support a 128-node Opteron/IB cluster providing a visualization production server for our unclassified systems and an older 256-node Opteron/IB cluster for the classified systems, as well as several smaller clusters to drive the PowerWalls. 
The visualization production systems include NFS servers to provide dedicated storage for data analysis and visualization. The ASC projects have delivered new versions of visualization and scientific data management tools to end users and continue to refine them. VisIt had 4 releases during the past year, ending with VisIt 2.0. We released version 2.4 of Hopper, a Java application for managing and transferring files. This release included a graphical disk usage view which works on all types of connections and an aggregated copy feature for transferring massive datasets quickly and efficiently to HPSS. We continue to use and develop Blockbuster and Telepath. Both the VisIt and IMG teams were engaged in a variety of movie production efforts during the past year in addition to the development tasks.

  15. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. Data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies of data storing, computing and analyzing. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed based on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark in conjunction with the scientific computing environment, exploratory spatial data analysis tools, temporal data management and analysis systems make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains involving spatial properties. 
We tested the performance of the platform on a taxi-trajectory analysis workload. The results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.
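    The grid-density aggregation at the heart of such a taxi-trajectory analysis can be sketched on a single machine; the GPS fixes and cell size below are hypothetical, and a platform like GISpark would run the same counting step in parallel with Spark.

```python
from collections import Counter

def grid_cell(lon, lat, cell_deg=0.01):
    """Map a GPS point to a grid-cell key (~1 km cells near the equator)."""
    return (int(lon // cell_deg), int(lat // cell_deg))

def count_points(points, cell_deg=0.01):
    """Count trajectory points per cell -- the reduce step a distributed
    platform would shard across workers."""
    counts = Counter()
    for lon, lat in points:
        counts[grid_cell(lon, lat, cell_deg)] += 1
    return counts

# Hypothetical taxi GPS fixes as (lon, lat) pairs
pts = [(113.264, 23.129), (113.265, 23.129), (113.271, 23.135)]
density = count_points(pts)   # cell key -> point count
```

The per-cell counts can then feed a heat-map layer in any visualization front end.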

  16. Interoperable web applications for sharing data and products of the International DORIS Service

    NASA Astrophysics Data System (ADS)

    Soudarin, L.; Ferrage, P.

    2017-12-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with users. Metadata and interoperable web applications are provided to explore, visualize, and download key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables and to get information about the tracking ground stations and the tracked satellites. We discuss IDS's future plans to meet the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt a common metadata thesaurus to describe data and products, and interoperability standards to share them.

  17. Applied and implied semantics in crystallographic publishing

    PubMed Central

    2012-01-01

    Background Crystallography is a data-rich, software-intensive scientific discipline with a community that has undertaken direct responsibility for publishing its own scientific journals. That community has worked actively to develop information exchange standards allowing readers of structure reports to access directly, and interact with, the scientific content of the articles. Results Structure reports submitted to some journals of the International Union of Crystallography (IUCr) can be automatically validated and published through an efficient and cost-effective workflow. Readers can view and interact with the structures in three-dimensional visualization applications, and can access the experimental data should they wish to perform their own independent structure solution and refinement. The journals also layer on top of this facility a number of automated annotations and interpretations to add further scientific value. Conclusions The benefits of semantically rich information exchange standards have revolutionised the scholarly publishing process for crystallography, and establish a model relevant to many other physical science disciplines. PMID:22932420
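    The information-exchange standard behind such structure reports is the Crystallographic Information File (CIF), a plain-text tag-value format. A deliberately minimal reading sketch, which handles only one-line `_tag value` entries and ignores data-block headers, loops, and multi-line values (the sample tags and values are illustrative):

```python
def read_cif_tags(text):
    """Extract simple one-line tag-value pairs from CIF text.

    Minimal by design: skips data_ headers, comments, and loop_
    constructs, so it is a sketch rather than a full CIF parser.
    """
    tags = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("_"):
            parts = line.split(None, 1)   # tag, then the rest of the line
            if len(parts) == 2:
                tags[parts[0]] = parts[1]
    return tags

sample = """\
data_example
_cell_length_a   5.4310
_cell_length_b   5.4310
_symmetry_space_group_name_H-M  'F d -3 m'
"""
cell = read_cif_tags(sample)
```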

  18. Enhancing insight in scientific problem solving by highlighting the functional features of prototypes: an fMRI study.

    PubMed

    Hao, Xin; Cui, Shuai; Li, Wenfu; Yang, Wenjing; Qiu, Jiang; Zhang, Qinglin

    2013-10-09

    Insight can be the first step toward creating a groundbreaking product. As is evident in anecdotes and major inventions in history, heuristic events (heuristic prototypes) prompted inventors to acquire insight when solving problems. Bionic imitation in scientific innovation is an example of this kind of problem solving. In particular, heuristic prototypes (e.g., the lotus effect, the very high water repellence exhibited by lotus leaves) help solve insight problems (e.g., non-stick surfaces). We speculated that the biological functional feature of prototypes is a critical factor in inducing insightful scientific problem solving. In this functional magnetic resonance imaging (fMRI) study, we selected scientific innovation problems and utilized a "learning prototypes-solving problems" two-phase paradigm to test this supposition. We also explored its neural mechanisms. Functional MRI data showed that activation of the middle temporal gyrus (MTG, BA 37) and the middle occipital gyrus (MOG, BA 19) was associated with the highlighted-functional-feature condition. The fMRI data also indicated that the MTG (BA 37) could be responsible for the semantic processing of functional features and for the formation of novel associations based on related functions. In addition, the MOG (BA 19) could be involved in the visual imagery underlying the formation and application of functional associations between the heuristic prototype and the problem. Our findings suggest that both semantic processing and visual imagery could be crucial components underlying scientific problem solving. © 2013 Elsevier B.V. All rights reserved.

  19. Engineering and Scientific Applications: Using MatLab(Registered Trademark) for Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; Shaykhian, Gholam Ali

    2011-01-01

    MatLab(R) (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab does purely numerical calculations and can be used as a glorified calculator or interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionalities are achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. MatLab in its interpreted programming language form (command interface) is similar to well-known programming languages such as C/C++, and supports data structures and cell arrays to define classes in object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher programming language. MatLab is packaged with an editor and debugging functionality useful for performing analysis of large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods to ensure that the foregoing solutions are incorporated in the design and analysis of data processing and visualization can benefit engineers and scientists in gaining wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques to perform intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data to be used by other software programs such as Microsoft Excel, and data presentation and visualization, will be discussed.
The presentation will emphasize creating practical scripts (programs) that extend the basic features of MatLab. Topics include: (1) matrix and vector analysis and manipulations; (2) mathematical functions; (3) symbolic calculations and functions; (4) import/export of data files; (5) program logic and flow control; (6) writing functions and passing parameters; (7) test application programs.
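    The import-process-export workflow described in this record, reading a data file, organizing it in tabular form, and exporting a result that Excel can open, can be sketched outside MatLab as well. Here is a standard-library Python analogue; the column names and instrument readings are hypothetical.

```python
import csv
import io
import statistics

# Stands in for an instrument data file read from disk
raw = "time,temp\n0,20.1\n1,20.8\n2,21.4\n"

# Import: parse into tabular records
rows = list(csv.DictReader(io.StringIO(raw)))

# Process: extract a numeric column and summarize it
temps = [float(r["temp"]) for r in rows]
summary = {"mean": statistics.mean(temps), "max": max(temps)}

# Export: write an Excel-readable CSV of the summary
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["stat", "value"])
for key, value in summary.items():
    writer.writerow([key, value])
exported = out.getvalue()
```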

  20. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser, or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects whose shapes, colors, sizes, and XYZ positions encode various dimensions of the parameter space; the mapping between data dimensions and visual properties can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own, independent vantage points, or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added.
We expect to make this visualization tool freely available to the academic community within a few months, on an experimental (beta testing) basis.
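    The channel-encoding idea described above, XYZ position plus shape, color, and size carrying further dimensions, can be sketched independently of any game engine. The palette, shape list, and field names below are illustrative, not the tool's actual encoding.

```python
# Map a high-dimensional record onto visual channels: three fields go to
# position, and three more are encoded as color, size, and glyph shape.
PALETTE = ["red", "green", "blue"]
SHAPES = ["cube", "sphere", "cone"]

def encode(record):
    """record: dict of numeric fields -> a renderable glyph description."""
    return {
        "position": (record["x"], record["y"], record["z"]),    # dims 1-3
        "color": PALETTE[int(record["class"]) % len(PALETTE)],  # dim 4
        "size": 0.5 + record["magnitude"],                      # dim 5
        "shape": SHAPES[int(record["type"]) % len(SHAPES)],     # dim 6
    }

glyph = encode({"x": 1.0, "y": 2.0, "z": 0.5,
                "class": 4, "magnitude": 0.25, "type": 2})
```

A renderer would then instantiate one such glyph per data point.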

  1. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data in the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premises or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications.
We will also share how on-the-fly processing using raster function capabilities can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.
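    The notion of an on-the-fly raster function, a derived product computed per request rather than written to disk, can be sketched abstractly. The tiny tiles and the scale/threshold functions below are hypothetical stand-ins, not the ArcGIS API.

```python
# Each "raster function" wraps a source and computes lazily per tile
# request, so chains of functions never materialize intermediate rasters.
def scale(source, factor):
    def read(tile_id):
        return [[v * factor for v in row] for row in source(tile_id)]
    return read

def threshold(source, cutoff):
    def read(tile_id):
        return [[1 if v >= cutoff else 0 for v in row] for row in source(tile_id)]
    return read

def storage(tile_id):
    """Stands in for the archived source raster (one 2x2 tile here)."""
    return {"t0": [[2, 4], [6, 8]]}[tile_id]

# Chain the functions; nothing is computed until a tile is requested.
pipeline = threshold(scale(storage, 10), 50)
mask = pipeline("t0")
```

Persisting a product then amounts to evaluating the same chain over every tile and writing the results out, which is what a distributed raster-analytics step can parallelize.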

  2. Scientific Visualization in High Speed Network Environments

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi; Kutler, Paul (Technical Monitor)

    1997-01-01

    In several cases, new visualization techniques have vastly increased the researcher's ability to analyze and comprehend data. Similarly, the role of networks in providing an efficient supercomputing environment has become more critical and continues to grow at a faster rate than the increase in the processing capabilities of supercomputers. A close relationship between scientific visualization and high-speed networks is identified as an important link in supporting efficient supercomputing. The two technologies are driven by the increasing complexity and volume of supercomputer data. The interaction of scientific visualization and high-speed networks in a computational fluid dynamics simulation/visualization environment is described. Current capabilities supported by high-speed networks, supercomputers, and high-performance graphics workstations at the Numerical Aerodynamic Simulation Facility (NAS) at NASA Ames Research Center are described. Applied research in providing a supercomputer visualization environment to support future computational requirements is summarized.

  3. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David

    1987-01-01

    A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with these capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS), a data-independent environment for computer graphics data display, to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  4. Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspesi, G; Bai, J; Deese, R

    2015-05-12

    Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
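    The path-based, hierarchical description of data that Conduit's API is organized around can be mimicked in a few lines. This Python stand-in is illustrative only; it is not Conduit's actual C++ or Python interface.

```python
# Toy mimic of hierarchical, path-addressed data description: a value is
# stored or fetched by a slash-separated path through nested nodes.
class Node:
    def __init__(self):
        self._children = {}

    def __setitem__(self, path, value):
        head, _, rest = path.partition("/")
        if rest:
            child = self._children.setdefault(head, Node())
            child[rest] = value           # recurse into the subtree
        else:
            self._children[head] = value  # leaf value

    def __getitem__(self, path):
        head, _, rest = path.partition("/")
        node = self._children[head]
        return node[rest] if rest else node

mesh = Node()
mesh["fields/pressure/values"] = [1.0, 2.0, 3.0]
mesh["fields/pressure/units"] = "Pa"
```

The appeal of this shape for in-memory exchange is that two codes only need to agree on paths and leaf types, not on each other's internal structs.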

  5. JoVE: the Journal of Visualized Experiments.

    PubMed

    Vardell, Emily

    2015-01-01

    The Journal of Visualized Experiments (JoVE) is the world's first scientific video journal and is designed to communicate research and scientific methods in an innovative, intuitive way. JoVE includes a wide range of biomedical videos, from biology to immunology and from bioengineering to clinical and translational medicine. This column describes the browsing and searching capabilities of JoVE, as well as its additional features (including the JoVE Scientific Education Database designed for students in scientific fields).

  6. Korean consumers' perceptions of health/functional food claims according to the strength of scientific evidence

    PubMed Central

    Kim, Ji Yeon; Kang, Eun Jin; Kwon, Oran

    2010-01-01

    In this study, we investigated whether consumers could differentiate between levels of claims and clarified how a visual aid influences consumer understanding of the different claim levels. We interviewed 2,000 consumers in 13 shopping malls on their perception of and confidence in different levels of health claims using seven-point scales. The average confidence scores given by participants were 4.17 for the probable level and 4.07 for the possible level; the score for the probable level was significantly higher than that for the possible level (P < 0.05). Scores for confidence in claims after reading labels with and without a visual aid were 5.27 and 4.43, respectively; the score for labeling with a visual aid was significantly higher than for labeling without one (P < 0.01). Our results provide compelling evidence that providing health claims with qualifying language differentiating levels of scientific evidence can help consumers understand the strength of scientific evidence behind those claims. Moreover, when a visual aid was included, consumers perceived the scientific levels more clearly and had greater confidence in their meanings than when a visual aid was not included. Although this result suggests that consumers react differently to different claim levels, it is not yet clear whether consumers understand the variations in the degree of scientific support. PMID:21103090
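    The reported comparison of mean confidence scores is a standard two-sample test; here is a sketch using Welch's t statistic on hypothetical seven-point ratings. The sample values below are invented for illustration and are not the study's data.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

# Hypothetical 7-point confidence ratings for two claim levels
probable = [5, 4, 4, 5, 3, 4, 5, 4]
possible = [4, 3, 4, 4, 3, 4, 4, 3]
t = welch_t(probable, possible)
```

In practice the statistic would be compared against the t distribution (with Welch-Satterthwaite degrees of freedom) to obtain the P value the study reports.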

  7. Advancements to Visualization Control System (VCS, part of UV-CDAT), a Visualization Package Designed for Climate Scientists

    NASA Astrophysics Data System (ADS)

    Lipsa, D.; Chaudhary, A.; Williams, D. N.; Doutriaux, C.; Jhaveri, S.

    2017-12-01

    The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT, https://uvcdat.llnl.gov) is a data analysis and visualization software package developed at Lawrence Livermore National Laboratory and designed for climate scientists. Core components of UV-CDAT include: 1) the Community Data Management System (CDMS), which provides I/O support and a data model for climate data; 2) CDAT Utilities (GenUtil), which processes data using spatial and temporal averaging and statistics functions; and 3) the Visualization Control System (VCS) for interactive visualization of the data. VCS is a Python visualization package primarily built for climate scientists; however, because of its generality and breadth of functionality, it can be a useful tool for other scientific applications. VCS provides 1D, 2D and 3D visualization functions such as scatter plots and line graphs for 1D data; boxfill, meshfill, isofill and isoline for 2D scalar data; vector glyphs and streamlines for 2D vector data; and 3d_scalar and 3d_vector for 3D data. Specifically for climate data, the plotting routines include projections, Skew-T plots and Taylor diagrams. While VCS provides a user-friendly API, the previous implementation relied on a slow vector-graphics (Cairo) backend, which is suitable only for smaller datasets and non-interactive graphics. The LLNL and Kitware team has added a new backend to VCS that uses the Visualization Toolkit (VTK). VTK is one of the most popular open-source, multi-platform scientific visualization libraries, written in C++. Its use of OpenGL and a pipeline-processing architecture results in a high-performance VCS library. Its multitude of supported data formats and visualization algorithms allows easy adoption of new visualization methods and new data formats in VCS.
In this presentation, we describe recent contributions to VCS that includes new visualization plots, continuous integration testing using Conda and CircleCI, tutorials and examples using Jupyter notebooks as well as upgrades that we are planning in the near future which will improve its ease of use and reliability and extend its capabilities.

  8. The emergence of the silent witness: the legal and medical reception of X-rays in the USA.

    PubMed

    Golan, Tal

    2004-08-01

    The late 19th-century discovery of X-rays befuddled not only the scientific world but also the medical and legal worlds. The possibility of looking into the human body as if through an open window challenged the time-honored medical monopoly over the inner cavities of the human body. Likewise, the possibility of visualizing objects unavailable to the naked eye challenged the established legal theories and practices of illustration and proof. This paper describes the reactions to those challenges by the medical and legal professions in the USA. The two professions are treated as connected social institutions, engaged in ongoing negotiations through which legal doctrines affect medicine no less than scientific discoveries and medical applications affect the law. This joint analysis rewards us with a rich story about an early and overlooked chapter in X-ray history, touching on the professionalization of radiology, the origins of defensive medicine, and the evolution of the legal theory and practice of visual evidence.

  9. Exploring Verbal, Visual and Schematic Learners' Static and Dynamic Mental Images of Scientific Species and Processes in Relation to Their Spatial Ability

    ERIC Educational Resources Information Center

    Al-Balushi, Sulaiman M.; Coll, Richard Kevin

    2013-01-01

    The current study compared different learners' static and dynamic mental images of unseen scientific species and processes in relation to their spatial ability. Learners were classified into verbal, visual and schematic. Dynamic images were classified into: appearing/disappearing, linear-movement, and rotation. Two types of scientific entities and…

  10. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large, diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
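    What "self-describing" means in the NetCDF data model can be illustrated with a small structure in which a variable carries its named dimensions and attributes alongside its values, so a reader needs no out-of-band documentation. This is an illustrative sketch of the idea, not the netCDF library API.

```python
from dataclasses import dataclass, field

@dataclass
class Variable:
    """A NetCDF-style variable: values plus the metadata describing them."""
    dimensions: tuple                       # e.g. ("time",) or ("time", "lat", "lon")
    attributes: dict = field(default_factory=dict)
    values: list = field(default_factory=list)

temp = Variable(
    dimensions=("time",),
    attributes={"units": "K", "long_name": "surface air temperature"},
    values=[287.1, 287.4, 288.0],
)
```

Any tool receiving `temp` can label axes and convert units from the attributes alone, which is what makes such data "network transparent" to share.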

  11. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large, diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  12. Thirteen ways to say nothing with scientific visualization

    NASA Technical Reports Server (NTRS)

    Globus, AL; Raible, E.

    1992-01-01

    Scientific visualization can be used to produce very beautiful images. Frequently, users and others not properly initiated into the mysteries of visualization research fail to appreciate the artistic qualities of these images. Scientists will frequently use our work to needlessly understand the data from which it is derived. This paper describes a number of effective techniques to confound such pernicious activity.

  13. SCSODC: Integrating Ocean Data for Visualization Sharing and Application

    NASA Astrophysics Data System (ADS)

    Xu, C.; Li, S.; Wang, D.; Xie, Q.

    2014-02-01

    The South China Sea Ocean Data Center (SCSODC) was founded in 2010 to improve the collection and management of ocean data at the South China Sea Institute of Oceanology (SCSIO). The mission of SCSODC is to ensure the long-term scientific stewardship of ocean data, information and products - collected through research groups, monitoring stations and observation cruises - and to facilitate their efficient use and distribution to possible users. However, data sharing and applications were limited due to the distributed and heterogeneous character of the data, which made integration difficult. To surmount those difficulties, SCSODC has developed a Data Sharing System using the most appropriate information management and information technology. The Data Sharing System uses open standards and tools to promote the capability to integrate ocean data and to interact with other data portals or users, and includes a full range of processes such as data discovery, evaluation and access, combining C/S and B/S modes. It provides a visual management interface for data managers and a transparent and seamless data access and application environment for users. Users are allowed to access data using the client software and to access an interactive visualization application interface via a web browser. The architecture, key technologies and functionality of the system are discussed briefly in this paper. It is shown that the SCSODC system is able to implement web visualization sharing and seamless access to ocean data in a distributed and heterogeneous environment.

  14. Real time visualization of dynamic magnetic fields with a nanomagnetic ferrolens

    NASA Astrophysics Data System (ADS)

    Markoulakis, Emmanouil; Rigakis, Iraklis; Chatzakis, John; Konstantaras, Antonios; Antonidakis, Emmanuel

    2018-04-01

    Due to advancements in nanomagnetism and the latest nanomagnetic materials and devices, a new potential field has opened up for research and applications that was not possible before. We herein propose a new research field and application for nanomagnetism: the visualization of dynamic magnetic fields in real time; in short, nanomagnetic vision. A new methodology, technique, and apparatus were invented and prototyped to demonstrate and test this new application. As an application example, the visualization of the dynamic magnetic field of a transmitting antenna was chosen. Never-before-seen high-resolution photos and real-time color video revealing the actual dynamic magnetic field inside a transmitting radio antenna rod have been captured for the first time. The antenna rod is fed with six-hundred-volt orthogonal pulses. This unipolar signal is in the very-low-frequency (VLF) range. The signal, combined with the extremely short electrical length of the rod, ensures the generation of a relatively strong fluctuating magnetic field, analogous to the transmitted signal, along and inside the antenna. This field is induced into a ferrolens and becomes visible in real time within the spectrum visible to the human eye. The name we have given the new observation apparatus is the SPION Superparamagnetic Ferrolens Microscope (SSFM), a powerful passive scientific observation tool with many other potential applications in the near future.

  15. Visualizing Geographic Data in Google Earth for Education and Outreach

    NASA Astrophysics Data System (ADS)

    Martin, D. J.; Treves, R.

    2008-12-01

    Google Earth is an excellent tool to help students and the public visualize scientific data: with little technical skill, scientific content can be shown in three dimensions against a background of remotely sensed imagery. It therefore has a variety of uses in university education and as a tool for public outreach. However, in both situations it is of limited value if it is only used to attract attention with flashy three-dimensional animations. In this poster we illustrate several applications that represent what we believe is good educational practice. The first example shows how the combination of a floor map and a projection of Google Earth on a screen can be used to produce active learning: students are asked to imagine where they would build a house on Big Island, Hawaii, in order to avoid volcanic hazards. In the second example, Google Earth is used to illustrate evidence over a range of scales in a description of the Lake Agassiz flood events, which would be more difficult to comprehend in a traditional paper-based format. In the final example, a simple text-manipulation application, "TMapper", is used to change the color palette of a thematic map generated by the students in Google Earth, teaching them about the use of color in map design.
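    Content shown in Google Earth is typically delivered as KML, an XML dialect; a minimal placemark for a hypothetical sample site can be generated with the standard library alone.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def placemark_kml(name, lon, lat):
    """Build a minimal KML document containing a single point placemark."""
    ET.register_namespace("", KML_NS)  # serialize KML as the default namespace
    kml = ET.Element(f"{{{KML_NS}}}kml")
    pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML coordinates are "lon,lat" (optionally ",altitude")
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

doc = placemark_kml("Sample site", -155.28, 19.41)
```

Saving `doc` to a `.kml` file and opening it in Google Earth would display the named point against the imagery, which is how exercises like the Hawaii example can be distributed to students.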

  16. Rapid development of medical imaging tools with open-source libraries.

    PubMed

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  17. Interactive Visualization of Large-Scale Hydrological Data using Emerging Technologies in Web Systems and Parallel Programming

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2013-12-01

    As geoscientists are confronted with increasingly massive datasets, from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate that understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize, and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data and modify parameters to create custom views of the data, in order to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge-discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component in building comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization and demonstrates information visualization and communication tools developed in light of these challenges.

  18. Explore the virtual side of earth science

    USGS Publications Warehouse

    ,

    1998-01-01

    Scientists have always struggled to find an appropriate technology that could represent three-dimensional (3-D) data, facilitate dynamic analysis, and encourage on-the-fly interactivity. In the recent past, scientific visualization has increased the scientist's ability to visualize information, but it has not provided the interactive environment necessary for rapidly changing the model or for viewing the model in ways not predetermined by the visualization specialist. Virtual Reality Modeling Language (VRML 2.0) is a new environment for visualizing 3-D information spaces and is accessible through the Internet with current browser technologies. Researchers from the U.S. Geological Survey (USGS) are using VRML as a scientific visualization tool to help convey complex scientific concepts to various audiences. Kevin W. Laurent, computer scientist, and Maura J. Hogan, technical information specialist, have created a collection of VRML models available through the Internet at Virtual Earth Science (virtual.er.usgs.gov).

  19. Interactive 3D visualization for theoretical virtual observatories

    NASA Astrophysics Data System (ADS)

    Dykes, T.; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.

    2018-06-01

    Virtual observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for analysis and exploration of data sets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via e.g. mock imaging in 2D or volume rendering in 3D. We analyse the current state of 3D visualization for big theoretical astronomical data sets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3D visualization and how it can augment the workflow of users in a virtual observatory context. Finally we showcase a lightweight client-server visualization tool for particle-based data sets, allowing quantitative visualization via data filtering, highlighting two example use cases within the Theoretical Astrophysical Observatory.

  20. Using Visualization Science to Evaluate Effective Communication of Climate Indicators

    NASA Astrophysics Data System (ADS)

    Gerst, M.; Kenney, M. A.; Wolfinger, F.; Lloyd, A.

    2015-12-01

    Indicators are observations or calculations that are used to track social and environmental conditions over time. For a large coupled system such as the economy and environment, the choice of indicators requires a structured process that involves co-production among facilitators, subject-matter experts, decision-makers, and the general public. This co-production is needed in part because such indicators serve a dual role: scientifically tracking change, and communicating to non-scientists important changes and information that may be useful in decision contexts. Because the goal is to communicate and inform decisions, it is critical that indicators be understood by non-scientific audiences, which may require different visualization techniques than for scientific audiences. Here we describe a process of rigorously evaluating visual communication efficacy by using a simplified taxonomy of visualization design problems and trade-offs to assess existing and redesigned indicator images. The experimental design is three-part. It involves testing non-scientific audiences' understanding of scientific images found in the literature along with similar information shaped by a partial co-production process that informed the U.S. Global Change Research Program prototype indicators system, released in Spring 2015. These recommendations for physical, natural, and societal indicators of changes and impacts involved input from over 200 subject-matter experts, organized into 13 technical teams. Using results from the first two parts, we then explore visualization design improvements that may increase understandability for non-scientific audiences. We anticipate that this work will highlight important trade-offs in visualization design when moving between audiences, which will be of great use to scientists who wish to communicate their results to broader audiences.

  1. A standardized set of 3-D objects for virtual reality research and applications.

    PubMed

    Peeters, David

    2018-06-01

    The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.

  2. Y0: An innovative tool for spatial data analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Jeremy C.

    1993-08-01

    This paper describes an advanced analysis and visualization tool, called Y0 (pronounced ``Why not?!''), that has been developed to directly support the scientific process for earth and space science research. Y0 aids the scientific research process by enabling the user to formulate algorithms and models within an integrated environment, and then interactively explore the solution space with the aid of appropriate visualizations. Y0 has been designed to provide strong support for both quantitative analysis and rich visualization. The user's algorithm or model is defined in terms of algebraic formulas in cells on worksheets, in a similar fashion to spreadsheet programs. Y0 is specifically designed to provide the data types and rich function set necessary for effective analysis and manipulation of remote sensing data. This includes various types of arrays, geometric objects, and objects for representing geographic coordinate system mappings. Visualization of results is tailored to the needs of remote sensing, with straightforward methods of composing, comparing, and animating imagery and graphical information, with reference to geographical coordinate systems. Y0 is based on advanced object-oriented technology. It is implemented in C++ for use in Unix environments, with a user interface based on the X window system. Y0 has been delivered under contract to Unidata, a group which provides data and software support to atmospheric researchers at universities affiliated with UCAR. This paper will explore the key concepts in Y0, describe its utility for remote sensing analysis and visualization, and will give a specific example of its application to the problem of measuring glacier flow rates from Landsat imagery.
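The worksheet-of-formulas model the abstract describes (algebraic formulas in named cells, evaluated over other cells) can be sketched in a few lines. This is an illustration of the concept only, not Y0's C++ implementation; the cell names and the glacier-flow formula are invented to echo the abstract's Landsat example.

```python
# Illustrative sketch of a cells-on-a-worksheet evaluation model, in the
# spirit of Y0's spreadsheet-like formulas (names here are invented).
# A cell is either a constant or a formula (a callable that receives a
# resolver for other cells); evaluation memoizes results.

def evaluate(sheet: dict, name: str, _cache=None):
    """Evaluate cell `name` in `sheet`, resolving dependencies recursively."""
    if _cache is None:
        _cache = {}
    if name in _cache:
        return _cache[name]
    cell = sheet[name]
    value = cell(lambda n: evaluate(sheet, n, _cache)) if callable(cell) else cell
    _cache[name] = value
    return value

sheet = {
    "dx": 150.0,                                # displacement between scenes, m
    "dt": 16.0,                                 # days between Landsat scenes
    "flow": lambda get: get("dx") / get("dt"),  # glacier flow rate, m/day
}
print(evaluate(sheet, "flow"))  # 9.375
```

The memoization cache is what lets a real worksheet re-derive many dependent cells from one set of inputs without recomputing shared sub-formulas.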

  3. MCSDSS: A Multi-Criteria Decision Support System for Merging Geoscience Information with Natural User Interfaces, Preference Ranking, and Interactive Data Utilities

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Gentle, J.

    2015-12-01

    The multi-criteria decision support system (MCSDSS) is a newly completed application for touch-enabled group decision support that uses D3 data visualization tools, a geojson conversion utility that we developed, and Paralelex to create an interactive tool. The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single page application (SPA) running atop a web and cloud based architecture utilizing open source technologies. The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture enabled interactions. The technology stack for MCSDSS was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure while retaining support for traditional HTML5/CSS3 web standards. The software lifecycle for MCSDSS has followed best practices to develop, share, and document the codebase and application. Code is documented and shared via an online repository with the option for programmers to see, contribute to, or fork the codebase. Example data files and tutorial documentation have been shared with clear descriptions and data object identifiers. Finally, the metadata about the application has been incorporated into an OntoSoft entry to ensure that MCSDSS is searchable and clearly described. MCSDSS is a flexible platform that allows for data fusion and inclusion of large datasets in an interactive front-end application capable of connecting with other science-based applications and advanced computing resources.
In addition, MCSDSS offers functionality that enables communication with non-technical users for policy, education, or engagement with groups around scientific topics with societal relevance.
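The abstract mentions a GeoJSON conversion utility the authors developed but does not show it. As a hedged sketch of what such a utility does, the function below converts a table of longitude/latitude rows into a GeoJSON FeatureCollection using only the standard library; the field names and sample coordinates are invented.

```python
import json

# Minimal sketch of a GeoJSON conversion step like the one MCSDSS's
# utility performs (the actual MCSDSS code is not reproduced here).

def rows_to_geojson(rows):
    """Convert (lon, lat, properties) tuples into a GeoJSON
    FeatureCollection dict suitable for D3 or web-mapping clients."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": props,
        }
        for lon, lat, props in rows
    ]
    return {"type": "FeatureCollection", "features": features}

rows = [(-97.74, 30.27, {"site": "A", "score": 0.82})]
print(json.dumps(rows_to_geojson(rows)))
```

Emitting standard GeoJSON is the design choice that lets a decision-support back end hand spatial data to any browser-side visualization library without a custom format.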

  4. Capturing change: the duality of time-lapse imagery to acquire data and depict ecological dynamics

    USGS Publications Warehouse

    Brinley Buckley, Emma M.; Allen, Craig R.; Forsberg, Michael; Farrell, Michael; Caven, Andrew J.

    2017-01-01

    We investigate the scientific and communicative value of time-lapse imagery by exploring applications for data collection and visualization. Time-lapse imagery has a myriad of possible applications to study and depict ecosystems and can operate at unique temporal and spatial scales to bridge the gap between large-scale satellite imagery projects and observational field research. Time-lapse data sequences, linking time-lapse imagery with data visualization, have the ability to make data come alive for a wider audience by connecting abstract numbers to images that root data in time and place. Utilizing imagery from the Platte Basin Timelapse Project, water inundation and vegetation phenology metrics are quantified via image analysis and then paired with passive monitoring data, including streamflow and water chemistry. Dynamic and interactive time-lapse data sequences elucidate the visible and invisible ecological dynamics of a significantly altered yet internationally important river system in central Nebraska.
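The abstract says water-inundation metrics are quantified via image analysis but does not specify the method. One simple way such a metric could be computed from a time-lapse frame is the fraction of pixels darker than a water threshold; the sketch below illustrates that idea only, and the threshold value is an assumption, not the Platte Basin Timelapse project's actual pipeline.

```python
# Hedged sketch of an inundation metric from a grayscale time-lapse frame:
# the fraction of pixels at or below a brightness threshold, treated here
# (illustratively) as open water. Threshold and frame values are invented.

def inundation_fraction(gray_frame, water_threshold=60):
    """gray_frame: 2-D list of 0-255 grayscale values; returns the
    fraction of pixels classified as water."""
    pixels = [v for row in gray_frame for v in row]
    if not pixels:
        return 0.0
    wet = sum(1 for v in pixels if v <= water_threshold)
    return wet / len(pixels)

frame = [
    [30, 40, 200],
    [35, 55, 210],
]
print(inundation_fraction(frame))
```

Computing one number per frame is what turns an image sequence into a time series that can be paired with streamflow and water-chemistry records, as the abstract describes.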

  5. The FDA's role in medical device clinical studies of human subjects

    NASA Astrophysics Data System (ADS)

    Saviola, James

    2005-03-01

    This paper provides an overview of the United States Food and Drug Administration's (FDA) role as a regulatory agency in medical device clinical studies involving human subjects. The FDA's regulations and responsibilities are explained and the device application process discussed. The specific medical device regulatory authorities are described as they apply to the development and clinical study of retinal visual prosthetic devices. The FDA medical device regulations regarding clinical studies of human subjects are intended to safeguard the rights and safety of subjects. The data gathered in pre-approval clinical studies provide a basis of valid scientific evidence in order to demonstrate the safety and effectiveness of a medical device. The importance of a working understanding of applicable medical device regulations from the beginning of the device development project is emphasized particularly for novel, complex products such as implantable visual prosthetic devices.

  6. The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization

    NASA Astrophysics Data System (ADS)

    Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.

    2003-12-01

    The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomena. The other innovation is a visualization environment and data that are discoverable in digital libraries, and can be installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging these curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.

  7. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.
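The architecture described above (browser client, server-side dispatch, third-party applications invoked as services) follows a common request-routing pattern. The sketch below shows that pattern in Python rather than the paper's PHP; all service names, users, and return values are invented for illustration.

```python
# Language-neutral sketch (in Python) of the dispatch pattern described
# above: validate the user, route the request to a registered processing
# step (standing in for a third-party webservice call), return a result.
# Names here are illustrative, not the project's actual API.

REGISTRY = {}

def service(name):
    """Register a handler function under a service name."""
    def wrap(fn):
        REGISTRY[name] = fn
        return fn
    return wrap

@service("timeseries")
def timeseries(params):
    # Stand-in for a database query plus a server-side processing step.
    return {"site": params["site"], "values": [1.2, 1.4, 1.1]}

def handle(request, authorized_users=("alice",)):
    """Dispatch one client request dict to the matching service."""
    if request.get("user") not in authorized_users:
        return {"error": "unauthorized"}
    handler = REGISTRY.get(request.get("service"))
    if handler is None:
        return {"error": "unknown service"}
    return handler(request.get("params", {}))

print(handle({"user": "alice", "service": "timeseries", "params": {"site": "W-3"}}))
```

Centralizing user management and routing in one `handle` step is what makes result generation reproducible and auditable across a diverse user base, as the abstract emphasizes.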

  8. Customizable scientific web-portal for DIII-D nuclear fusion experiment

    NASA Astrophysics Data System (ADS)

    Abla, G.; Kim, E. N.; Schissel, D. P.

    2010-04-01

    Increasing utilization of the Internet and convenient web technologies has made the web-portal a major application interface for remote participation in and control of scientific instruments. While web-portals have provided a centralized gateway for multiple computational services, the amount of visual output is often overwhelming due to the high volume of data generated by complex scientific instruments and experiments. Since each scientist may have different priorities and areas of interest in the experiment, filtering and organizing information based on the individual user's needs can increase the usability and efficiency of a web-portal. DIII-D is the largest magnetic nuclear fusion device in the US. A web-portal has been designed to support the experimental activities of DIII-D researchers worldwide. It offers a customizable interface with personalized page layouts and lists of services from which users can select. Each individual user can create a unique working environment to fit his or her own needs and interests. Customizable services include real-time experiment status monitoring, diagnostic data access, and interactive data analysis and visualization. The web-portal also supports interactive collaboration by providing a collaborative logbook and online instant-announcement services. The DIII-D web-portal development utilizes a multi-tier software architecture and Web 2.0 technologies and tools, such as AJAX and Django, to develop a highly interactive and customizable user interface.
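The per-user customization the abstract describes reduces to a simple idea: each user keeps an ordered preference list, and the portal renders only those services. The sketch below illustrates that filtering step; the service catalog entries paraphrase the abstract, while the function and user preferences are invented.

```python
# Sketch of per-user portal customization: render only the services the
# user selected, in the user's chosen order. Catalog keys are illustrative.

CATALOG = {
    "status": "real-time experiment status monitoring",
    "data": "diagnostic data access",
    "analysis": "interactive data analysis and visualization",
    "logbook": "collaborative logbook",
}

def build_layout(preferences, catalog=CATALOG):
    """Return (key, description) pairs for the user's selected services,
    preserving the user's ordering and skipping unknown entries."""
    return [(key, catalog[key]) for key in preferences if key in catalog]

print(build_layout(["logbook", "status", "nonexistent"]))
```

Silently skipping unknown keys is a deliberate choice here: saved preferences survive even when the portal's service catalog changes between releases.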

  9. Perceptual issues in scientific visualization

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Proffitt, Dennis R.

    1989-01-01

    In order to develop effective tools for scientific visualization, consideration must be given to the perceptual competencies, limitations, and biases of the human operator. Perceptual psychology has amassed a rich body of research on these issues and can lend insight to the development of visualization techniques. Within a perceptual-psychological framework, the computer display screen can best be thought of as a special kind of impoverished visual environment. Guidelines can be gleaned from the psychological literature to help visualization tool designers avoid ambiguities and/or illusions in the resulting data displays.

  10. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technology have enabled scientists to build high-resolution climate models that produce petabytes of climate data. Interrogating and analyzing these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But integrating climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allowing easy integration of climate datasets with geospatial datasets and providing sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued building of an open-source 4D virtual-globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of extreme climate events. The critical aim is real-time interactive interrogation. At the data-centric level, the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time-series analysis methods, and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data.
It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and the THREDDS Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector-field visualization, and more. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia, and Geoscience Australia.
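Supporting multiple access mechanisms (HTTP, FTP, and catalog services) typically comes down to dispatching on the URL scheme. The sketch below shows that dispatch pattern in Python with stub readers; TrikeND-iGlobe itself builds on NASA World Wind (Java), so this is an illustration of the design, not its code.

```python
from urllib.parse import urlparse

# Sketch of scheme-based data-access dispatch, as a stand-in for the
# multi-protocol support described above. The readers are stubs; a real
# client would open network connections or catalog services here.

READERS = {
    "http": lambda url: "fetch over HTTP: " + url,
    "ftp": lambda url: "fetch over FTP: " + url,
    "file": lambda url: "read local file: " + url,
}

def open_dataset(url):
    """Pick a reader for `url` based on its scheme."""
    scheme = urlparse(url).scheme
    reader = READERS.get(scheme)
    if reader is None:
        raise ValueError("unsupported access mechanism: %r" % scheme)
    return reader(url)

print(open_dataset("http://example.org/climate.nc"))
```

Keeping the readers in a table means new mechanisms (say, a WCS endpoint) can be added without touching the calling visualization code.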

  11. Interactive Design and Visualization of Branched Covering Spaces.

    PubMed

    Roy, Lawrence; Kumar, Prashant; Golbabaei, Sanaz; Zhang, Yue; Zhang, Eugene

    2018-01-01

    Branched covering spaces are a mathematical concept which originates from complex analysis and topology and has applications in tensor field topology and geometry remeshing. Given a manifold surface and an N-way rotational symmetry field, a branched covering space is a manifold surface that has an N-to-1 map to the original surface except at the ramification points, which correspond to the singularities in the rotational symmetry field. Understanding the notion and mathematical properties of branched covering spaces is important to researchers in tensor field visualization and geometry processing, and to their application areas. In this paper, we provide a framework to interactively design and visualize the branched covering space (BCS) of an input mesh surface and a rotational symmetry field defined on it. In our framework, the user can visualize not only the BCSs but also their construction process. In addition, our system allows the user to design the geometric realization of the BCS using mesh deformation techniques as well as connecting tubes. This enables the user to verify important facts about BCSs, such as that they are manifold surfaces around singularities, as well as the Riemann-Hurwitz formula, which relates the Euler characteristic of the BCS to that of the original mesh. Our system was evaluated by student researchers in scientific visualization and geometry processing as well as by faculty members in mathematics at our university who teach topology. We include their evaluations and feedback in the paper.
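The Riemann-Hurwitz formula the abstract refers to has a standard statement (the notation below is generic, not taken from the paper): for an N-fold branched covering of a surface, the Euler characteristics of cover and base are related through the ramification points.

```latex
% Riemann-Hurwitz relation for an N-fold branched covering
% \tilde{S} \to S, with ramification points \tilde{p} of
% ramification index e_{\tilde{p}}:
\chi(\tilde{S}) \;=\; N \cdot \chi(S) \;-\; \sum_{\tilde{p}} \bigl( e_{\tilde{p}} - 1 \bigr)
```

Away from ramification points every point of the base has N preimages, which contributes the $N \cdot \chi(S)$ term; each ramification point merges $e_{\tilde{p}}$ sheets into one, accounting for the correction sum. This is the identity the system described above lets users verify visually.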

  12. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert B.

    1994-01-01

    Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and from model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules, for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK-related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer, including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

  13. ONR Far East Scientific Information Bulletin. Volume 15, Number 3, July- September 1990

    DTIC Science & Technology

    1990-09-01

  14. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  15. Visualization: A pathway to enhanced scientific productivity in the expanding missions of Space and Earth Sciences

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P.

    1995-01-01

    The movement toward the solution of problems involving large-scale system science, the ever-increasing capabilities of three-dimensional, time-dependent numerical models, and the enhanced capabilities of 'in situ' and remote sensing instruments bring a new era of scientific endeavor that requires an important change in our approach to mission planning and the task of data reduction and analysis. Visualization is at the heart of the requirements for a much-needed enhancement in scientific productivity as we face these new challenges. This article draws a perspective on the problem as it crosses discipline boundaries from solar physics to atmospheric and ocean sciences. It also attempts to introduce visualization as a new approach to scientific discovery and a tool which expedites and improves our insight into physically complex problems. A set of simple illustrations demonstrates a number of visualization techniques and the discussion emphasizes the trial-and-error and search-and-discover modes that are necessary for the techniques to reach their full potential. Further discussions also point to the importance of integrating data access, management, mathematical operations, and visualization into a single system. Some of the more recent developments in this area are reviewed.

  16. Web-Based Geographic Information System Tool for Accessing Hanford Site Environmental Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triplett, Mark B.; Seiple, Timothy E.; Watson, David J.

    Data volume, complexity, and access issues pose severe challenges for analysts, regulators and stakeholders attempting to efficiently use legacy data to support decision making at the U.S. Department of Energy’s (DOE) Hanford Site. DOE has partnered with the Pacific Northwest National Laboratory (PNNL) on the PHOENIX (PNNL-Hanford Online Environmental Information System) project, which seeks to address data access, transparency, and integration challenges at Hanford to provide effective decision support. PHOENIX is a family of spatially-enabled web applications providing quick access to decades of valuable scientific data and insight through intuitive query, visualization, and analysis tools. PHOENIX realizes broad, public accessibility by relying only on ubiquitous web-browsers, eliminating the need for specialized software. It accommodates a wide range of users with intuitive user interfaces that require little or no training to quickly obtain and visualize data. Currently, PHOENIX is actively hosting three applications focused on groundwater monitoring, groundwater clean-up performance reporting, and in-tank monitoring. PHOENIX-based applications are being used to streamline investigative and analytical processes at Hanford, saving time and money. But more importantly, by integrating previously isolated datasets and developing relevant visualization and analysis tools, PHOENIX applications are enabling DOE to discover new correlations hidden in legacy data, allowing them to more effectively address complex issues at Hanford.

  17. A Framework for the Design of Effective Graphics for Scientific Visualization

    NASA Technical Reports Server (NTRS)

    Miceli, Kristina D.

    1992-01-01

    This proposal presents a visualization framework, based on a data model, that supports the production of effective graphics for scientific visualization. Visual representations are effective only if they augment comprehension of the increasing amounts of data being generated by modern computer simulations. These representations are created by taking into account the goals and capabilities of the scientist, the type of data to be displayed, and software and hardware considerations. This framework is embodied in an assistant-based visualization system to guide the scientist in the visualization process. This will improve the quality of the visualizations and decrease the time the scientist is required to spend in generating the visualizations. I intend to prove that such a framework will create a more productive environment for the analysis and interpretation of large, complex data sets.

  18. Effects of VR system fidelity on analyzing isosurface visualization of volume datasets.

    PubMed

    Laha, Bireswar; Bowman, Doug A; Socha, John J

    2014-04-01

    Volume visualization is an important technique for analyzing datasets from a variety of different scientific domains. Volume data analysis is inherently difficult because volumes are three-dimensional, dense, and unfamiliar, requiring scientists to precisely control the viewpoint and to make precise spatial judgments. Researchers have proposed that more immersive (higher fidelity) VR systems might improve task performance with volume datasets, and significant results tied to different components of display fidelity have been reported. However, more information is needed to generalize these results to different task types, domains, and rendering styles. We visualized isosurfaces extracted from synchrotron microscopic computed tomography (SR-μCT) scans of beetles, in a CAVE-like display. We ran a controlled experiment evaluating the effects of three components of system fidelity (field of regard, stereoscopy, and head tracking) on a variety of abstract task categories that are applicable to various scientific domains, and also compared our results with those from our prior experiment using 3D texture-based rendering. We report many significant findings. For example, for search and spatial judgment tasks with isosurface visualization, a stereoscopic display provides better performance, but for tasks with 3D texture-based rendering, displays with higher field of regard were more effective, independent of the levels of the other display components. We also found that systems with high field of regard and head tracking improve performance in spatial judgment tasks. Our results extend existing knowledge and produce new guidelines for designing VR systems to improve the effectiveness of volume data analysis.

  19. Preparing for in situ processing on upcoming leading-edge supercomputers

    DOE PAGES

    Kress, James; Churchill, Randy Michael; Klasky, Scott; ...

    2016-10-01

    High performance computing applications are producing increasingly large amounts of data and placing enormous stress on current capabilities for traditional post hoc visualization techniques. Because of the growing compute and I/O imbalance, data reductions, including in situ visualization, are required. These reduced data are used for analysis and visualization in a variety of different ways. Many of the visualization and analysis requirements are known a priori, but when they are not, scientists are dependent on the reduced data to accurately represent the simulation in post hoc analysis. The contribution of this paper is a description of the directions we are pursuing to assist a large scale fusion simulation code succeed on the next generation of supercomputers. These directions include the role of in situ processing for performing data reductions, as well as the tradeoffs between data size and data integrity within the context of complex operations in a typical scientific workflow.
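    As an illustration of the kind of in situ data reduction the paper discusses, the sketch below (hypothetical, not the authors' pipeline) collapses a full-resolution field into a small per-timestep statistical summary, trading data integrity for size:

```python
import numpy as np

def reduce_timestep(field, bins=8):
    """Reduce a full-resolution field to a small statistical summary.

    Post hoc analysis can still see ranges, means, and the value
    distribution, but not individual cells: integrity traded for size.
    """
    counts, edges = np.histogram(field, bins=bins)
    return {
        "min": float(field.min()),
        "max": float(field.max()),
        "mean": float(field.mean()),
        "hist_counts": counts,
        "hist_edges": edges,
    }

# A full-resolution "simulation" field and its reduced form.
field = np.linspace(0.0, 1.0, 1_000_000).reshape(1000, 1000)
summary = reduce_timestep(field)
# The summary is a few dozen numbers instead of a million.
```

    Whether such a summary "accurately represents the simulation" depends entirely on what the post hoc analysis needs, which is exactly the tradeoff the abstract raises.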

  20. A Space and Atmospheric Visualization Science System

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, E. P.; Blanchard, P.; Mankofsky, A.; Goodrich, C.; Kamins, D.; Kulkarni, R.; Mcnabb, D.; Moroh, M.

    1994-01-01

    SAVS (a Space and Atmospheric Visualization Science system) is an integrated system with user-friendly functionality that employs a 'push-button' software environment that mimics the logical scientific processes in data acquisition, reduction, analysis, and visualization. All of this is accomplished without requiring a detailed understanding of the methods, networks, and modules that link the tools and effectively execute the functions. This report describes SAVS and its components, followed by several applications based on generic research interests in interplanetary and magnetospheric physics (IMP/ISTP), active experiments in space (CRRES), and mission planning focused on the earth's thermospheric, ionospheric, and mesospheric domains (TIMED). The final chapters provide a user-oriented description of interface functionalities, hands-on operations, and customized modules, with details of the primary modules presented in the appendices. The overall intent of the report is to reflect the accomplishments of the three-year development effort and to introduce potential users to the power and utility of the integrated data acquisition, analysis, and visualization system.

  1. Ambiguous Science and the Visual Representation of the Real

    ERIC Educational Resources Information Center

    Newbold, Curtis Robert

    2012-01-01

    The emergence of visual media as prominent and even expected forms of communication in nearly all disciplines, including those scientific, has raised new questions about how the art and science of communication epistemologically affect the interpretation of scientific phenomena. In this dissertation I explore how the influence of aesthetics in…

  2. Visual Invention and the Composition of Scientific Research Graphics: A Topological Approach

    ERIC Educational Resources Information Center

    Walsh, Lynda

    2018-01-01

    This report details the second phase of an ongoing research project investigating the visual invention and composition processes of scientific researchers. In this phase, four academic researchers completed think-aloud protocols as they composed graphics for research presentations; they also answered follow-up questions about their visual…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E Wes; Brugger, Eric

    Supercomputing centers are unique resources that aim to enable scientific knowledge discovery by employing large computational resources - the 'Big Iron.' Design, acquisition, installation, and management of the Big Iron are carefully planned and monitored. Because these Big Iron systems produce a tsunami of data, it's natural to colocate the visualization and analysis infrastructure. This infrastructure consists of hardware (Little Iron) and staff (Skinny Guys). Our collective experience suggests that design, acquisition, installation, and management of the Little Iron and Skinny Guys doesn't receive the same level of treatment as that of the Big Iron. This article explores the following questions about the Little Iron: How should we size the Little Iron to adequately support visualization and analysis of data coming off the Big Iron? What sort of capabilities must it have? Related questions concern the size of visualization support staff: How big should a visualization program be - that is, how many Skinny Guys should it have? What should the staff do? How much of the visualization should be provided as a support service, and how much should applications scientists be expected to do on their own?

  4. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool.

    PubMed

    Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M; Nuckley, David J; Keefe, Daniel F

    2012-10-01

    In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community then these may become one of the most effective future strategies for both formative and summative evaluations.

  5. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool

    PubMed Central

    Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M.; Nuckley, David J.

    2017-01-01

    In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community then these may become one of the most effective future strategies for both formative and summative evaluations. PMID:28944349

  6. Bringing the Unidata IDV to the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.; Oxelson Ganter, J.

    2015-12-01

    Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who are dependent upon it, and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.

  7. VTK-m: Accelerating the Visualization Toolkit for Massively Threaded Architectures

    DOE PAGES

    Moreland, Kenneth; Sewell, Christopher; Usher, William; ...

    2016-05-09

    Here, one of the most critical challenges for high-performance computing (HPC) scientific visualization is execution on massively threaded processors. Of the many fundamental changes we are seeing in HPC systems, one of the most profound is a reliance on new processor types optimized for execution bandwidth over latency hiding. Our current production scientific visualization software is not designed for these new types of architectures. To address this issue, the VTK-m framework serves as a container for algorithms, provides flexible data representation, and simplifies the design of visualization algorithms on new and future computer architecture.

  8. VTK-m: Accelerating the Visualization Toolkit for Massively Threaded Architectures

    DOE PAGES

    Moreland, Kenneth; Sewell, Christopher; Usher, William; ...

    2016-05-09

    Execution on massively threaded processors is one of the most critical challenges for high-performance computing (HPC) scientific visualization. Of the many fundamental changes we are seeing in HPC systems, one of the most profound is a reliance on new processor types optimized for execution bandwidth over latency hiding. Moreover, our current production scientific visualization software is not designed for these new types of architectures. In order to address this issue, the VTK-m framework serves as a container for algorithms, provides flexible data representation, and simplifies the design of visualization algorithms on new and future computer architecture.

  9. Visual communication of engineering and scientific data in the courtroom

    NASA Astrophysics Data System (ADS)

    Jackson, Gerald W.; Henry, Andrew C.

    1993-01-01

    Presenting engineering and scientific information in the courtroom is challenging. Quite often the data is voluminous and, therefore, difficult to digest by engineering experts, let alone a lay judge, lawyer, or jury. This paper discusses computer visualization techniques designed to provide the court methods of communicating data in visual formats thus allowing a more accurate understanding of complicated concepts and results. Examples are presented that include accident reconstructions, technical concept illustration, and engineering data visualization. Also presented is the design of an electronic courtroom which facilitates the display and communication of information to the courtroom.

  10. Image pattern recognition supporting interactive analysis and graphical visualization

    NASA Technical Reports Server (NTRS)

    Coggins, James M.

    1992-01-01

    Image Pattern Recognition attempts to infer properties of the world from image data. Such capabilities are crucial for making measurements from satellite or telescope images related to Earth and space science problems. Such measurements can be the required product itself, or the measurements can be used as input to a computer graphics system for visualization purposes. At present, the field of image pattern recognition lacks a unified scientific structure for developing and evaluating image pattern recognition applications. The overall goal of this project is to begin developing such a structure. This report summarizes results of a 3-year research effort in image pattern recognition addressing the following three principal aims: (1) to create a software foundation for the research and identify image pattern recognition problems in Earth and space science; (2) to develop image measurement operations based on Artificial Visual Systems; and (3) to develop multiscale image descriptions for use in interactive image analysis.

  11. Visualization of system dynamics using phasegrams

    PubMed Central

    Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh

    2013-01-01

    A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715
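    The core idea can be sketched numerically: sweep a control parameter over time, analyze the signal window by window, and classify each window's vibratory regime by its number of distinct attractor values. The sketch below uses the logistic map from the abstract; a real phasegram is built from Poincaré sections of an embedded phase space, so this is only a simplified illustration:

```python
import numpy as np

def logistic_orbit(r, x0=0.3, transient=500, n=200):
    """Iterate x -> r*x*(1-x), discard the transient, return attractor samples."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return out

def regime_size(r, decimals=6):
    """Number of distinct values visited on the attractor:
    1 = fixed point, 2 = period-2, 4 = period-4, large = chaos."""
    return len(np.unique(np.round(logistic_orbit(r), decimals)))

# Sweeping the control parameter window by window traces the
# period-doubling route to chaos that a phasegram shows along its x-axis.
for r in (2.8, 3.2, 3.5, 3.9):
    print(r, regime_size(r))
```

    As in the phasegram itself, this classification needs only the time series, not knowledge of the underlying system parameters.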

  12. Visualization at supercomputing centers: the tale of little big iron and the three skinny guys.

    PubMed

    Bethel, E W; van Rosendale, J; Southard, D; Gaither, K; Childs, H; Brugger, E; Ahern, S

    2011-01-01

    Supercomputing centers are unique resources that aim to enable scientific knowledge discovery by employing large computational resources-the "Big Iron." Design, acquisition, installation, and management of the Big Iron are carefully planned and monitored. Because these Big Iron systems produce a tsunami of data, it's natural to colocate the visualization and analysis infrastructure. This infrastructure consists of hardware (Little Iron) and staff (Skinny Guys). Our collective experience suggests that design, acquisition, installation, and management of the Little Iron and Skinny Guys doesn't receive the same level of treatment as that of the Big Iron. This article explores the following questions about the Little Iron: How should we size the Little Iron to adequately support visualization and analysis of data coming off the Big Iron? What sort of capabilities must it have? Related questions concern the size of visualization support staff: How big should a visualization program be-that is, how many Skinny Guys should it have? What should the staff do? How much of the visualization should be provided as a support service, and how much should applications scientists be expected to do on their own?

  13. Visualising uncertainty: interpreting quantified geoscientific inversion outputs for a diverse user community.

    NASA Astrophysics Data System (ADS)

    Reading, A. M.; Morse, P. E.; Staal, T.

    2017-12-01

    Geoscientific inversion outputs, such as seismic tomography contour images, are finding increasing use amongst scientific user communities that have limited knowledge of the impact of output parameter uncertainty on subsequent interpretations made from such images. We make use of a newly written computer application which enables seismic tomography images to be displayed in a performant 3D graphics environment. This facilitates the mapping of colour scales to the human visual sensorium for the interactive interpretation of contoured inversion results incorporating parameter uncertainty. Two case examples of seismic tomography inversions or contoured compilations are compared from the southern hemisphere continents of Australia and Antarctica. The Australian example is based on the AuSREM contoured seismic wavespeed model while the Antarctic example is a valuable but less well constrained result. Through adjusting the multiple colour gradients, layer separations, opacity, illumination, shadowing and background effects, we can optimise the insights obtained from the 3D structure in the inversion compilation or result. Importantly, we can also limit the display to show information in a way that is mapped to the uncertainty in the 3D result. Through this practical application, we demonstrate that the uncertainty in the result can be handled through a well-posed mapping of the parameter values to displayed colours in the knowledge of what is perceived visually by a typical human. We found that this approach maximises the chance of a useful tectonic interpretation by a diverse scientific user community. In general, we develop the idea that quantified inversion uncertainty can be used to tailor the way that the output is presented to the analyst for scientific interpretation.

  14. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D.; Chu, C.; Mlynczak, P.

    2014-12-01

    The CERES project continues to provide the scientific community a wide variety of satellite-derived data products. The flagship products include observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, and cloud, aerosol, and other atmospheric parameters. These datasets encompass a wide range of temporal and spatial resolutions suited to specific applications: time resolutions range from instantaneous to monthly means, and spatial resolutions from the 20-km footprint to global scales. The 14-year record is mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. CERES products are also used by the remote sensing community for climatological studies. In recent years, however, CERES products have been used by an even broader audience, including the green energy, health, and environmental research communities. Because of that, the CERES project has implemented a now well-established web-oriented Ordering and Visualization Tool (OVT), which is well into its fifth year of development. To help facilitate comprehensive quality control of CERES products, the OVT team has introduced a series of specialized functions, including 1- and 2-D histograms, anomaly, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and other specialized scientific application capabilities. Over time, increasingly high temporal- and spatial-resolution products are being made available to the public through the CERES OVT. These high-resolution products require accessing the existing long-term archive, and thus reading many very large netCDF or HDF files, which poses a real challenge to near-instantaneous visualization. An overview of the CERES OVT basic functions and QC capabilities, as well as future steps in expanding its capabilities, will be presented at the meeting.
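    The anomaly and deseasonalization functions mentioned amount to removing a monthly climatology from a time series; a minimal sketch of that operation (function names are illustrative, not the OVT's actual code):

```python
import numpy as np

def deseasonalize(monthly, period=12):
    """Subtract the mean annual cycle (climatology) from a monthly series,
    leaving the anomaly: trends and irregular variability."""
    monthly = np.asarray(monthly, dtype=float)
    anomaly = np.empty_like(monthly)
    for m in range(period):
        clim = monthly[m::period].mean()   # climatological mean for month m
        anomaly[m::period] = monthly[m::period] - clim
    return anomaly

# Synthetic 14-year series: seasonal cycle plus a small linear trend.
t = np.arange(14 * 12)
series = 10 * np.sin(2 * np.pi * t / 12) + 0.01 * t
anom = deseasonalize(series)
# The seasonal cycle is removed; the anomaly retains the trend signal.
```

    For climate-trend work this matters because the seasonal cycle is usually orders of magnitude larger than the trend being sought.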

  15. Exploring Scientific Information for Policy Making under Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Forni, L.; Galaitsi, S.; Mehta, V. K.; Escobar, M.; Purkey, D. R.; Depsky, N. J.; Lima, N. A.

    2016-12-01

    Each actor evaluating potential management strategies brings her/his own distinct set of objectives to a complex decision space of system uncertainties. The diversity of these objectives requires detailed and rigorous analyses that respond to multifaceted challenges. However, the utility of this information depends on the accessibility of scientific information to decision makers. This paper demonstrates data visualization tools for presenting scientific results to decision makers in two case studies: La Paz/El Alto, Bolivia, and Yuba County, California. Visualization output from the case studies combines spatiotemporal, multivariate and multirun/multiscenario information to produce information corresponding to the objectives defined by key actors and stakeholders. These tools can manage complex data and distill scientific information into accessible formats. Using the visualizations, scientists and decision makers can navigate the decision space and potential objective trade-offs to facilitate discussion and consensus building. These efforts can support the identification of stable negotiated agreements between different stakeholders.

  16. Scientific Visualization & Modeling for Earth Systems Science Education

    NASA Technical Reports Server (NTRS)

    Chaudhury, S. Raj; Rodriguez, Waldo J.

    2003-01-01

    Providing research experiences for undergraduate students in Earth Systems Science (ESS) poses several challenges at smaller academic institutions that might lack dedicated resources for this area of study. This paper describes the development of an innovative model that involves students with majors in diverse scientific disciplines in authentic ESS research. In studying global climate change, experts typically use scientific visualization techniques applied to remote sensing data collected by satellites. In particular, many problems related to environmental phenomena can be quantitatively addressed by investigations based on datasets related to scientific endeavours such as the Earth Radiation Budget Experiment (ERBE). Working with data products stored at NASA's Distributed Active Archive Centers, visualization software specifically designed for students, and an advanced, immersive Virtual Reality (VR) environment, students engage in guided research projects during a structured 6-week summer program. Over its 5-year span, this program has afforded the opportunity for students majoring in biology, chemistry, mathematics, computer science, physics, engineering and science education to work collaboratively in teams on research projects that emphasize the use of scientific visualization in studying the environment. Recently, a hands-on component has been added through science student partnerships with schoolteachers in data collection and reporting for the GLOBE Program (Global Learning and Observations to Benefit the Environment).

  17. Images of Earth and Space: The Role of Visualization in NASA Science

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Fly through the ocean at breakneck speed. Tour the moon. Even swim safely in the boiling sun. You can do these things and more in a 17 minute virtual journey through Earth and space. The trek is by way of colorful scientific visualizations developed by the NASA/Goddard Space Flight Center's Scientific Visualization Studio and the NASA HPCC Earth and Space Science Project investigators. Various styles of electronic music and lay-level narration provide the accompaniment.

  18. A framework for visual communication at Nature.

    PubMed

    Krause, Kelly

    2016-04-25

    The scientific journal Nature, published weekly since 1869, serves as an excellent case study in visual communication. While journals are becoming increasingly specialist, Nature remains firmly multidisciplinary; and unlike many scientific journals, it contains original journalism, opinion pieces, and expert analysis in addition to peer-reviewed research papers. This variety of content types, covering an extensive range of scientific disciplines, translates into a wide and varied audience, and the need to employ an equally wide variety of communication styles. For example, a research paper may employ technical language to communicate to a highly specialized audience in that field, whereas a news story on the same subject will explain the science to an educated lay audience, often adding a wider context and stripping out acronyms. Each type of piece will use a communication approach tailored for its intended audience. This is true for visual content as well: the intended audience of a scientific figure, illustration or data visualization will determine the design approach to that visual. At Nature, given the high volume of content plus high quality standards, this process is applied in a fairly systematic way, using a framework to guide creative decision-making. That framework is described here, along with a discussion of best practices for the design of research figures and graphics by context. © The Author(s) 2016.

  19. Bonsai: an event-based framework for processing and controlling data streams

    PubMed Central

    Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P.; Atallah, Bassam V.; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M.; Correia, Patrícia A.; Medina, Roberto E.; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J.; Kampff, Adam R.

    2015-01-01

    The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861

  20. Active learning in camera calibration through vision measurement application

    NASA Astrophysics Data System (ADS)

    Li, Xiaoqin; Guo, Jierong; Wang, Xianchun; Liu, Changqing; Cao, Binfang

    2017-08-01

    Since cameras are increasingly used in scientific applications, as well as in applications requiring precise visual information, effective calibration of such cameras is becoming more important. There are many reasons why measurements of objects are inaccurate; the largest is lens distortion. Another detrimental influence on evaluation accuracy is caused by perspective distortion in the image, which occurs whenever the camera cannot be mounted perpendicularly to the objects being measured. Overall, it is very important for students to understand how to correct lens distortions, that is, camera calibration. Once the camera is calibrated and the images rectified, it is possible to obtain undistorted measurements in world coordinates. This paper presents how students can develop a sense of active learning for the mathematical camera model alongside the theoretical scientific basics. The authors present theoretical and practical lectures with the goal of deepening students' understanding of the mathematical models of area-scan cameras and of building practical vision measurement processes themselves.
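    Lens distortion of the kind discussed here is commonly modeled with radial polynomial terms (the Brown-Conrady model used by standard camera-calibration toolkits); a minimal sketch of the forward model, with made-up coefficients for illustration:

```python
import numpy as np

def apply_radial_distortion(x, y, k1, k2):
    """Map ideal (undistorted) normalized image coordinates to distorted
    ones using the radial terms of the Brown-Conrady model:

        x_d = x * (1 + k1*r^2 + k2*r^4),  likewise for y,  r^2 = x^2 + y^2

    Camera calibration estimates k1, k2 (plus intrinsics) so that this
    mapping can be inverted to rectify images before measurement."""
    r2 = x**2 + y**2
    factor = 1 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

# Points near the optical axis barely move; points near the image edge
# move the most (the classic barrel/pincushion effect).
xd, yd = apply_radial_distortion(np.array([0.0, 0.1, 0.5]),
                                 np.array([0.0, 0.0, 0.5]),
                                 k1=-0.2, k2=0.05)
```

    The coefficients `k1`, `k2` above are arbitrary; in practice they come from a calibration procedure fitting observations of a known target such as a checkerboard.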

  1. Giovanni - The Bridge Between Data and Science

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Acker, James

    2017-01-01

    This article describes new features in the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni), a user-friendly online tool that enables visualization, analysis, and assessment of NASA Earth science data sets without downloading data and software. Since the satellite era began, data collected from Earth-observing satellites have been widely used in research and applications; however, using satellite-based data sets can still be a challenge to many. To facilitate data access and evaluation, as well as scientific exploration and discovery, the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) has developed Giovanni for a wide range of users around the world. This article describes the latest capabilities of Giovanni with examples, and discusses future plans for this innovative system.

  2. Monotonicity preserving splines using rational cubic Timmer interpolation

    NASA Astrophysics Data System (ADS)

    Zakaria, Wan Zafira Ezza Wan; Alimin, Nur Safiyah; Ali, Jamaludin Md

    2017-08-01

    In scientific applications and Computer Aided Design (CAD), users often need to generate a spline passing through a given set of data that preserves certain shape properties of the data, such as positivity, monotonicity or convexity. The required curve has to be a smooth shape-preserving interpolant. In this paper a rational cubic spline in Timmer representation is developed to generate an interpolant that preserves monotonicity with a visually pleasing curve. Three parameters are introduced to control the shape of the interpolant. The shape parameters in the description of the rational cubic interpolant are subject to monotonicity constraints. The necessary and sufficient conditions for monotonicity of the rational cubic interpolant are derived, and visually the proposed rational cubic Timmer interpolant gives very pleasing results.
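    The Timmer rational cubic itself is not reproduced here, but the same shape-preservation goal can be sketched with a standard monotone cubic Hermite scheme (harmonic-mean derivative choice, as in Fritsch-Butland/PCHIP-style methods); this is a stand-in for illustration, not the paper's interpolant:

```python
import numpy as np

def monotone_cubic(xs, ys, x):
    """Piecewise cubic Hermite interpolant with node derivatives chosen so
    the curve is monotone wherever the data are: a simple shape-preserving
    interpolant in the same spirit as the rational cubic Timmer spline."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    h = np.diff(xs)
    delta = np.diff(ys) / h                      # secant slopes
    d = np.zeros_like(ys)                        # node derivatives
    for i in range(1, len(xs) - 1):
        if delta[i - 1] * delta[i] > 0:          # harmonic mean keeps |d| small
            d[i] = 2.0 / (1.0 / delta[i - 1] + 1.0 / delta[i])
    d[0], d[-1] = delta[0], delta[-1]
    i = np.clip(np.searchsorted(xs, x) - 1, 0, len(h) - 1)
    t = (x - xs[i]) / h[i]
    h00 = (1 + 2 * t) * (1 - t) ** 2             # cubic Hermite basis
    h10 = t * (1 - t) ** 2
    h01 = t**2 * (3 - 2 * t)
    h11 = t**2 * (t - 1)
    return h00 * ys[i] + h10 * h[i] * d[i] + h01 * ys[i + 1] + h11 * h[i] * d[i + 1]

xs = [0, 1, 2, 3]
ys = [0, 0.1, 0.9, 1]        # monotone data with a steep middle segment
xq = np.linspace(0, 3, 301)
yq = monotone_cubic(xs, ys, xq)
# An unconstrained cubic spline would overshoot below 0 and above 1 here;
# the monotone interpolant stays within the data range.
```

    The design choice mirrors the paper's: instead of constraining shape parameters of a rational form, the derivatives themselves are limited so the sufficient monotonicity conditions hold on every segment.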

  3. Lighting design in the neonatal intensive care unit: practical applications of scientific principles.

    PubMed

    White, Robert D

    2004-06-01

    Meeting the varied lighting needs of infants, caregivers, and families has become more complex as our understanding of visual development and perception and the effect of light on circadian rhythms advances. Optimal lighting strategies are discussed for new unit construction, as well as modifications to consider for existing units. In either case, the key concept is that lighting should be provided for the individual needs of each person, rather than the full-room lighting schemes previously used. Ideas gleaned from nonhospital settings, re-introduction of natural light into the neonatal intensive care unit, and new devices such as light-emitting diodes will dramatically change the lighting and visual environment of future neonatal intensive care units.

  4. Featured Article: Genotation: Actionable knowledge for the scientific reader

    PubMed Central

    Willis, Ethan; Sakauye, Mark; Jose, Rony; Chen, Hao; Davis, Robert L

    2016-01-01

    We present an article viewer application that allows a scientific reader to easily discover and share knowledge by linking genomics-related concepts to knowledge of disparate biomedical databases. High-throughput data streams generated by technical advancements have contributed to scientific knowledge discovery at an unprecedented rate. Biomedical informaticists have created a diverse set of databases to store and retrieve the discovered knowledge. The diversity and abundance of such resources present biomedical researchers with a challenge in knowledge discovery. These challenges highlight a need for a better informatics solution. We use a text mining algorithm, Genomine, to identify gene symbols from the text of a journal article. The identified symbols are supplemented with information from the GenoDB knowledgebase. Self-updating GenoDB contains information from NCBI Gene, Clinvar, Medgen, dbSNP, KEGG, PharmGKB, Uniprot, and Hugo Gene databases. The journal viewer is a web application accessible via a web browser. The features described herein are accessible on www.genotation.org. The Genomine algorithm identifies gene symbols with an F-score of 0.65. GenoDB currently contains information regarding 59,905 gene symbols, 5633 drug–gene relationships, 5981 gene–disease relationships, and 713 pathways. This application provides scientific readers with actionable knowledge related to concepts of a manuscript. The reader will be able to save and share supplements to be visualized in a graphical manner. This provides convenient access to details of complex biological phenomena, enabling biomedical researchers to generate novel hypotheses to further our knowledge of human health. This manuscript presents a novel application that integrates genomic, proteomic, and pharmacogenomic information to supplement content of a biomedical manuscript and enable readers to automatically discover actionable knowledge. PMID:26900164

  5. Featured Article: Genotation: Actionable knowledge for the scientific reader.

    PubMed

    Nagahawatte, Panduka; Willis, Ethan; Sakauye, Mark; Jose, Rony; Chen, Hao; Davis, Robert L

    2016-06-01

    We present an article viewer application that allows a scientific reader to easily discover and share knowledge by linking genomics-related concepts to knowledge in disparate biomedical databases. High-throughput data streams generated by technical advancements have contributed to scientific knowledge discovery at an unprecedented rate. Biomedical informaticists have created a diverse set of databases to store and retrieve the discovered knowledge. The diversity and abundance of such resources present biomedical researchers with a challenge in knowledge discovery. These challenges highlight a need for a better informatics solution. We use a text mining algorithm, Genomine, to identify gene symbols in the text of a journal article. The identified symbols are supplemented with information from the GenoDB knowledgebase. The self-updating GenoDB contains information from the NCBI Gene, ClinVar, MedGen, dbSNP, KEGG, PharmGKB, UniProt, and HUGO Gene databases. The journal viewer is a web application accessible via a web browser. The features described herein are accessible at www.genotation.org. The Genomine algorithm identifies gene symbols with an F-score of 0.65. GenoDB currently contains information on 59,905 gene symbols, 5,633 drug–gene relationships, 5,981 gene–disease relationships, and 713 pathways. This application provides scientific readers with actionable knowledge related to the concepts of a manuscript. The reader is able to save and share supplements to be visualized in a graphical manner. This provides convenient access to details of complex biological phenomena, enabling biomedical researchers to generate novel hypotheses to further our knowledge of human health. This manuscript presents a novel application that integrates genomic, proteomic, and pharmacogenomic information to supplement the content of a biomedical manuscript and enable readers to automatically discover actionable knowledge.
© 2016 by the Society for Experimental Biology and Medicine.
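
    The F-score cited for the Genomine extractor is the harmonic mean of precision and recall over identified gene symbols. As a hedged, minimal sketch of how such a score is computed for a symbol-extraction task (the gene lists below are invented examples, not Genomine output, and this is not the paper's evaluation code):

```python
# Illustrative only: scoring a hypothetical gene-symbol extractor.
# The predicted/actual symbol lists are made up for this example.
def f_score(predicted, actual):
    predicted, actual = set(predicted), set(actual)
    tp = len(predicted & actual)            # correctly identified symbols
    if not predicted or not actual or tp == 0:
        return 0.0
    precision = tp / len(predicted)         # fraction of predictions that are right
    recall = tp / len(actual)               # fraction of true symbols found
    return 2 * precision * recall / (precision + recall)

predicted = ["BRCA1", "TP53", "EGFR", "NOTAGENE"]   # extractor output (toy)
actual = ["BRCA1", "TP53", "KRAS"]                  # gold-standard symbols (toy)
print(round(f_score(predicted, actual), 2))         # → 0.57
```

    In this toy run the extractor finds two of three true symbols while emitting spurious ones, giving an F-score of about 0.57, in the same ballpark as the reported 0.65.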

  6. Exploring Students' Visual Conception of Matter: Towards Developing a Teaching Framework Using Models

    ERIC Educational Resources Information Center

    Espinosa, Allen A.; Marasigan, Arlyne C.; Datukan, Janir T.

    2016-01-01

    This study explored how students visualise the states and classifications of matter with the use of scientific models. Misconceptions of students in using scientific models were also identified to formulate a teaching framework. To elicit data in the study, a Visual Conception Questionnaire was administered to thirty-four (34) first-year, general…

  7. Scientific Visualization and Computational Science: Natural Partners

    NASA Technical Reports Server (NTRS)

    Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance alongside theory and experiment as a third mode of scientific inquiry. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment; initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved, from debugging the simulations, to exploring the data, to communicating the results. Most of the presentations today have their roots in image processing, where the fundamental task is: given an image, extract information about the scene. Visualization has developed from computer graphics, which addresses the inverse task: given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this odd input.
Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena. So making good visualizations requires consideration of characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.
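
    The transformation the abstract describes, turning raw data into something displayable, can be illustrated at its simplest by a transfer function mapping scalar samples to colors. A minimal sketch, assuming an invented blue-to-red ramp and hypothetical sample values (nothing here comes from the paper):

```python
# Minimal sketch of the "transform data into a displayable scene" step:
# map 1-D scalar samples onto RGB colors with a simple blue-to-red ramp.
# The data and the colormap are hypothetical, for illustration only.
def blue_to_red(value, vmin, vmax):
    """Linearly map a scalar in [vmin, vmax] to an (r, g, b) triple in 0-255."""
    t = (value - vmin) / (vmax - vmin)   # normalize to [0, 1]
    t = min(max(t, 0.0), 1.0)            # clamp out-of-range samples
    return (int(255 * t), 0, int(255 * (1 - t)))

samples = [0.0, 2.5, 5.0, 7.5, 10.0]     # e.g. simulated temperature values
pixels = [blue_to_red(v, 0.0, 10.0) for v in samples]
print(pixels[0], pixels[-1])             # → (0, 0, 255) (255, 0, 0)
```

    Real visualization systems extend exactly this idea to 2-D and 3-D fields, isosurfaces, and volume rendering, but the core step is the same: a mapping from data values to visual attributes.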

  8. Visualizing without Vision at the Microscale: Students with Visual Impairments Explore Cells with Touch

    ERIC Educational Resources Information Center

    Jones, M. Gail; Minogue, James; Oppewal, Tom; Cook, Michelle P.; Broadwell, Bethany

    2006-01-01

    Science instruction is typically highly dependent on visual representations of scientific concepts that are communicated through textbooks, teacher presentations, and computer-based multimedia materials. Little is known about how students with visual impairments access and interpret these types of visually-dependent instructional materials. This…

  9. Stepping Into Science Data: Data Visualization in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Skolnik, S.

    2017-12-01

    Have you ever seen people get really excited about science data? Navteca, along with the Earth Science Technology Office (ESTO) within the Earth Science Division of NASA's Science Mission Directorate, has been exploring virtual reality (VR) technology for the next generation of Earth science technology information systems. One of their first joint experiments was visualizing climate data from the Goddard Earth Observing System Model (GEOS) in VR, and the resulting visualizations greatly excited the scientific community. This presentation will share the value of VR for science, such as the capability of permitting the observer to interact with data rendered in real time, make selections, and view volumetric data in an innovative way. Using interactive VR hardware (headset and controllers), the viewer steps into the data visualizations, physically moving through three-dimensional structures that are traditionally displayed as layers or slices, such as cloud and storm systems from NASA's Global Precipitation Measurement (GPM) mission. Results from displaying this precipitation and cloud data show that there is interesting potential for scientific visualization, 3D/4D visualizations, and interdisciplinary studies using VR. Additionally, VR visualizations can be leveraged as 360° content for scientific communication and outreach, and VR can be used as a tool to engage policy and decision makers, as well as the public.

  10. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    NASA Astrophysics Data System (ADS)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  11. My recollections of Hubel and Wiesel and a brief review of functional circuitry in the visual pathway

    PubMed Central

    Alonso, Jose-Manuel

    2009-01-01

    The first paper of Hubel and Wiesel in The Journal of Physiology in 1959 marked the beginning of an exciting chapter in the history of visual neuroscience. Through a collaboration that lasted 25 years, Hubel and Wiesel described the main response properties of visual cortical neurons, the functional architecture of visual cortex and the role of visual experience in shaping cortical architecture. The work of Hubel and Wiesel transformed the field not only through scientific discovery but also by touching the life and scientific careers of many students. Here, I describe my personal experience as a postdoctoral student with Torsten Wiesel and how this experience influenced my own work. PMID:19525563

  12. magHD: a new approach to multi-dimensional data storage, analysis, display and exploitation

    NASA Astrophysics Data System (ADS)

    Angleraud, Christophe

    2014-06-01

    The ever-increasing amount of data and processing capability - following the well-known Moore's law - is challenging the way scientists and engineers currently exploit large datasets. Scientific visualization tools, although quite powerful, are often too generic and provide abstract views of phenomena, thus preventing cross-disciplinary fertilization. On the other hand, geographic information systems allow nice and visually appealing maps to be built, but these often become confusing as more layers are added. Moreover, the introduction of time as a fourth analysis dimension, to allow analysis of time-dependent phenomena such as meteorological or climate models, is encouraging real-time data exploration techniques that allow spatio-temporal points of interest to be detected through the human brain's integration of moving images. Magellium has been involved in high-performance image processing chains for satellite imagery, as well as scientific signal analysis and geographic information management, since its creation (2003). We believe that recent work on big data, GPU computing, and peer-to-peer collaborative processing can enable a new breakthrough in data analysis and display that will serve many new applications in collaborative scientific computing, environment mapping, and understanding. The magHD (Magellium Hyper-Dimension) project aims at developing software solutions that bring highly interactive tools for complex dataset analysis and exploration to commodity hardware, targeting small to medium-scale clusters with expansion capabilities to large cloud-based clusters.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Patrick O'Leary

    Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist, with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolution sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientist's visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold while raising the measurable benefits of adoption. Scientists' time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software, and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well-established technologies, including large walk-in systems such as the CAVE™ and head-based systems such as the Wide-5™.
    The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems, and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities, so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?

  14. Scientific Assistant Virtual Laboratory (SAVL)

    NASA Astrophysics Data System (ADS)

    Alaghband, Gita; Fardi, Hamid; Gnabasik, David

    2007-03-01

    The Scientific Assistant Virtual Laboratory (SAVL) is a scientific discovery environment, an interactive simulated virtual laboratory, for learning physics and mathematics. The purpose of this computer-assisted intervention is to improve middle and high school student interest, insight, and scores in physics and mathematics. SAVL develops scientific and mathematical imagination in a visual, symbolic, and experimental simulation environment. It directly addresses the issues of scientific and technological competency by providing critical thinking training through integrated modules. This on-going research provides a virtual laboratory environment in which the student directs the building of the experiment rather than observing a packaged simulation. SAVL:
    * Engages the persistent interest of young minds in physics and math by visually linking simulation objects and events with mathematical relations.
    * Teaches integrated concepts by the hands-on exploration and focused visualization of classic physics experiments within software.
    * Systematically and uniformly assesses and scores students by their ability to answer their own questions within the context of a Master Question Network.
    We will demonstrate how the Master Question Network uses polymorphic interfaces and C# lambda expressions to manage simulation objects.

  15. High brightness x ray source for directed energy and holographic imaging applications, phase 2

    NASA Astrophysics Data System (ADS)

    McPherson, Armon; Rhodes, Charles K.

    1992-03-01

    Advances in x-ray imaging technology and x-ray sources are such that a new technology can be brought to commercialization, enabling the three-dimensional (3-D) microvisualization of hydrated biological specimens. The Company is engaged in a program whose main goal is the development of a new technology for direct three-dimensional (3-D) x-ray holographic imaging. It is believed that this technology will have a wide range of important applications in the defense, medical, and scientific sectors. For example, in the medical area, it is expected that biomedical science will constitute a very active and substantial market, because the application of physical technologies for the direct visualization of biological entities has had a long and extremely fruitful history.

  16. Recent developments in stereoscopic and holographic 3D display technologies

    NASA Astrophysics Data System (ADS)

    Sarma, Kalluri

    2014-06-01

    Currently, there is increasing interest in the development of high performance 3D display technologies to support a variety of applications including medical imaging, scientific visualization, gaming, education, entertainment, air traffic control and remote operations in 3D environments. In this paper we will review the attributes of the various 3D display technologies including stereoscopic and holographic 3D, human factors issues of stereoscopic 3D, the challenges in realizing Holographic 3D displays and the recent progress in these technologies.

  17. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security

    DOE PAGES

    Christensen, A. J.; Srinivasan, V.; Hart, J. C.; ...

    2018-03-17

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. Lastly, this survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.

  18. Seminar in Flow Visualization at Lafayette College: Variations on the Hertzberg Effect

    NASA Astrophysics Data System (ADS)

    Rossmann, Jenn Stroud

    2013-11-01

    Flow visualization reveals an invisible world of fluid dynamics, blending scientific investigation and artistic exploration. The resulting images have inspired, and in some cases themselves become appreciated as, art. At Lafayette College, a sophomore-level seminar in The Art and Science of Flow Visualization exposes students to these techniques and the science of fluid mechanics, and to the photographic methods needed to create effective images that are successful both scientifically and artistically. Unlike other courses in flow visualization, this course assumes no a priori familiarity with fluid flow or with photography. The fundamentals of both are taught and practiced in a studio setting. Students are engaged in an interdisciplinary discourse about fluids and physics, photography, scientific ethics, and historical societal responses to science and art. Relevant texts from several disciplines are read, discussed, and responded to in student writing. This seminar approach makes flow visualization and fluid dynamics a natural part of a liberal education. The development, implementation, and assessment of this team-taught course at Lafayette College will be discussed. Support provided by National Science Foundation.

  19. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, A. J.; Srinivasan, V.; Hart, J. C.

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. Lastly, this survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.

  20. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security.

    PubMed

    Christensen, A J; Srinivasan, Venkatraman; Hart, John C; Marshall-Colon, Amy

    2018-05-01

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in "big data" analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. This survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields.

  1. Use of computational modeling combined with advanced visualization to develop strategies for the design of crop ideotypes to address food security

    PubMed Central

    Christensen, A J; Srinivasan, Venkatraman; Hart, John C; Marshall-Colon, Amy

    2018-01-01

    Sustainable crop production is a contributing factor to current and future food security. Innovative technologies are needed to design strategies that will achieve higher crop yields on less land and with fewer resources. Computational modeling coupled with advanced scientific visualization enables researchers to explore and interact with complex agriculture, nutrition, and climate data to predict how crops will respond to untested environments. These virtual observations and predictions can direct the development of crop ideotypes designed to meet future yield and nutritional demands. This review surveys modeling strategies for the development of crop ideotypes and scientific visualization technologies that have led to discoveries in “big data” analysis. Combined modeling and visualization approaches have been used to realistically simulate crops and to guide selection that immediately enhances crop quantity and quality under challenging environmental conditions. This survey of current and developing technologies indicates that integrative modeling and advanced scientific visualization may help overcome challenges in agriculture and nutrition data as large-scale and multidimensional data become available in these fields. PMID:29562368

  2. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
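
    hctsa itself is a MATLAB toolbox computing over 7,700 features per time series; as a hedged, toy illustration of what "extracting a feature vector from a time series" means, the sketch below computes three classic features (none of this is hctsa's real API or feature set, and the series is an invented phenotype recording):

```python
# Toy sketch in the spirit of massive time-series feature extraction.
# Not hctsa code: hctsa is a MATLAB toolbox; this only illustrates the idea.
import math

def extract_features(x):
    """Return a small, named feature vector for one time series."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    # lag-1 autocorrelation, a standard measure of short-range memory
    ac1 = (sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
           / (n * var)) if var > 0 else 0.0
    return {"mean": mean, "std": math.sqrt(var), "ac1": ac1}

series = [0, 1, 2, 3, 4, 3, 2, 1]        # hypothetical phenotype recording
feats = extract_features(series)
print(sorted(feats))                     # → ['ac1', 'mean', 'std']
```

    Scaled up to thousands of such features and many series, feature vectors like this one become the rows of the feature matrix on which hctsa-style selection and visualization operate.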

  3. Oceans of Data: In what ways can learning research inform the development of electronic interfaces and tools for use by students accessing large scientific databases?

    NASA Astrophysics Data System (ADS)

    Krumhansl, R. A.; Foster, J.; Peach, C. L.; Busey, A.; Baker, I.

    2012-12-01

    The practice of science and engineering is being revolutionized by the development of cyberinfrastructure for accessing near real-time and archived observatory data. Large cyberinfrastructure projects have the potential to transform the way science is taught in high school classrooms, making enormous quantities of scientific data available, giving students opportunities to analyze and draw conclusions from many kinds of complex data, and providing students with experiences using state-of-the-art resources and techniques for scientific investigations. However, online interfaces to scientific data are built by scientists for scientists, and their design can significantly impede broad use by novices. Knowledge relevant to the design of student interfaces to complex scientific databases is broadly dispersed among disciplines ranging from cognitive science to computer science and cartography and is not easily accessible to designers of educational interfaces. To inform efforts at bridging scientific cyberinfrastructure to the high school classroom, Education Development Center, Inc. and the Scripps Institution of Oceanography conducted an NSF-funded 2-year interdisciplinary review of literature and expert opinion pertinent to making interfaces to large scientific databases accessible to and usable by precollege learners and their teachers. Project findings are grounded in the fundamentals of Cognitive Load Theory, visual perception, schema formation, and Universal Design for Learning. The Knowledge Status Report (KSR) presents cross-cutting and visualization-specific guidelines that highlight how interface design features can address and ameliorate challenges novice high school students face as they navigate complex databases to find data, and construct and look for patterns in maps, graphs, animations, and other data visualizations.
The guidelines present ways to make scientific databases more broadly accessible by: 1) adjusting the cognitive load imposed by the user interface and visualizations so that it doesn't exceed the amount of information the learner can actively process; 2) drawing attention to important features and patterns; and 3) enabling customization of visualizations and tools to meet the needs of diverse learners.

  4. Stereoscopy and the Human Visual System

    PubMed Central

    Banks, Martin S.; Read, Jenny C. A.; Allison, Robert S.; Watt, Simon J.

    2012-01-01

    Stereoscopic displays have become important for many applications, including operation of remote devices, medical imaging, surgery, scientific visualization, and computer-assisted design. But the most significant and exciting development is the incorporation of stereo technology into entertainment: specifically, cinema, television, and video games. In these applications for stereo, three-dimensional (3D) imagery should create a faithful impression of the 3D structure of the scene being portrayed. In addition, the viewer should be comfortable and not leave the experience with eye fatigue or a headache. Finally, the presentation of the stereo images should not create temporal artifacts like flicker or motion judder. This paper reviews current research on stereo human vision and how it informs us about how best to create and present stereo 3D imagery. The paper is divided into four parts: (1) getting the geometry right, (2) depth cue interactions in stereo 3D media, (3) focusing and fixating on stereo images, and (4) how temporal presentation protocols affect flicker, motion artifacts, and depth distortion. PMID:23144596
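
    "Getting the geometry right" involves quantities such as binocular disparity. As a hedged sketch (the 63 mm interocular distance and the viewing depths are illustrative assumptions, not values from the paper), the relative disparity between two points at different depths can be estimated from their vergence angles:

```python
# Hedged illustration of one stereo-geometry quantity: the relative
# binocular disparity between a near and a far point. Values are
# illustrative, not drawn from the paper.
import math

def vergence_angle(depth_m, interocular_m=0.063):
    """Angle (radians) between the two eyes' lines of sight to a point."""
    return 2 * math.atan(interocular_m / (2 * depth_m))

def disparity_deg(near_m, far_m):
    """Relative disparity (degrees) between a near and a far point."""
    return math.degrees(vergence_angle(near_m) - vergence_angle(far_m))

# A point at 0.5 m against one at 1.0 m produces a few degrees of disparity.
print(round(disparity_deg(0.5, 1.0), 2))  # → 3.6
```

    Stereo content creation has to keep such disparities within a comfortable range; disparities that are too large are one source of the viewer discomfort the review discusses.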

  5. Towards an Analysis of Visual Images in School Science Textbooks and Press Articles about Science and Technology

    NASA Astrophysics Data System (ADS)

    Dimopoulos, Kostas; Koulaidis, Vasilis; Sklaveniti, Spyridoula

    2003-04-01

    This paper aims at presenting the application of a grid for the analysis of the pedagogic functions of visual images included in school science textbooks and daily press articles about science and technology. The analysis is made using the dimensions of content specialisation (classification) and social-pedagogic relationships (framing) promoted by the images as well as the elaboration and abstraction of the corresponding visual code (formality), thus combining pedagogical and socio-semiotic perspectives. The grid is applied to the analysis of 2819 visual images collected from school science textbooks and another 1630 visual images additionally collected from the press. The results show that the science textbooks in comparison to the press material: a) use ten times more images, b) use more images so as to familiarise their readers with the specialised techno-scientific content and codes, and c) tend to create a sense of higher empowerment for their readers by using the visual mode. Furthermore, as the educational level of the school science textbooks (i.e., from primary to lower secondary level) rises, the content specialisation projected by the visual images and the elaboration and abstraction of the corresponding visual code also increases. The above results have implications for the terms and conditions for the effective exploitation of visual material as the educational level rises as well as for the effective incorporation of visual images from press material into science classes.

  6. Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization

    PubMed Central

    Marai, G. Elisabeta

    2018-01-01

    Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process and the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements into activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550

  7. Instruments of scientific visual representation in atomic databases

    NASA Astrophysics Data System (ADS)

    Kazakov, V. V.; Kazakov, V. G.; Meshkov, O. I.

    2017-10-01

    Graphic tools for spectral data representation provided by operating information systems on atomic spectroscopy—ASD NIST, VAMDC, SPECTR-W3, and Electronic Structure of Atoms—in support of scientific research and human-resource development are presented. Tools for the visual representation of scientific data, such as spectrogram and Grotrian diagram plotting, are considered. The possibility of comparative analysis between experimentally obtained spectra and reference spectra of atomic systems formed from a resource's database is described. Techniques for accessing these graphic tools are also presented.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth; Sewell, Christopher; Usher, William

    Here, one of the most critical challenges for high-performance computing (HPC) scientific visualization is execution on massively threaded processors. Of the many fundamental changes we are seeing in HPC systems, one of the most profound is a reliance on new processor types optimized for execution bandwidth over latency hiding. Our current production scientific visualization software is not designed for these new types of architectures. To address this issue, the VTK-m framework serves as a container for algorithms, provides flexible data representation, and simplifies the design of visualization algorithms on new and future computer architecture.
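    The "worklet" style of fine-grained data parallelism that frameworks such as VTK-m rely on can be sketched as a per-element kernel mapped over whole arrays, which is what lets the same algorithm saturate massively threaded hardware. The sketch below is purely illustrative (Python/NumPy standing in for the C++ framework); `map_field` and `magnitude` are hypothetical names, not VTK-m API.

```python
import numpy as np

def map_field(worklet, *fields):
    """Apply a per-element 'worklet' across input field arrays.

    In a real framework the elements would be dispatched across
    thousands of hardware threads; NumPy vectorization stands in
    for that here.
    """
    return worklet(*fields)

# Hypothetical worklet: vector magnitude at every point of a data set.
def magnitude(vx, vy, vz):
    return np.sqrt(vx * vx + vy * vy + vz * vz)

velocity = np.array([[3.0, 0.0, 4.0], [1.0, 2.0, 2.0]])
speeds = map_field(magnitude, velocity[:, 0], velocity[:, 1], velocity[:, 2])
# speeds -> [5.0, 3.0]
```

The key design point is that the worklet sees only one element at a time and carries no shared mutable state, so the runtime is free to schedule it at any granularity the hardware favors.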

  10. Visualizers, Visualizations, and Visualizees: Differences in Meaning-Making by Scientific Experts and Novices from Global Visualizations of Ocean Data

    ERIC Educational Resources Information Center

    Stofer, Kathryn A.

    2013-01-01

    Data visualizations designed for academic scientists are not immediately meaningful to everyday scientists. Communicating between a specialized, expert audience and a general, novice public is non-trivial; it requires careful translation. However, more widely available visualization technologies and platforms, including new three-dimensional…

  11. Determination of efficacy of fingermark enhancement reagents; the use of propyl chloroformate for the derivatization of fingerprint amino acids extracted from paper.

    PubMed

    Mink, Tineke; Voorhaar, Annelies; Stoel, Reinoud; de Puit, Marcel

    2013-09-01

    The analysis of the constituents of fingerprints has been described numerous times, mainly with the purpose of determining the aging effect on fingerprints or showing the differences between donors or groups of donors. In this paper we describe the use of derivatized amino acids to determine the efficacy of the visualization reagents 1,8-diazafluoren-9-one (DFO) and ninhydrin. At present, certain conditions are used for the application of these reagents, as determined by trial-and-error investigations of their effect on fingerprints. The recovery of amino acids from a porous surface can be used as a measure of the efficacy of a visualization reagent. In this paper we describe a method for determining the amount of amino acid left after reaction with well-known fingerprint visualization reagents. This will allow a more scientific approach to method development for fingermark enhancement techniques. Furthermore, investigations on the influence of the concentration of fingermark amino acids, the order of application of and exposure time to reagents, and the influence of the age of the amino acids were carried out. These studies have resulted in a broader understanding of the mechanism involved in the visualization of fingermarks using DFO and ninhydrin. Copyright © 2013 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Ranked centroid projection: a data visualization approach with self-organizing maps.

    PubMed

    Yen, G G; Wu, Z

    2008-02-01

    The self-organizing map (SOM) is an efficient tool for visualizing high-dimensional data. In this paper, the clustering and visualization capabilities of the SOM, especially in the analysis of textual data, i.e., document collections, are reviewed and further developed. A novel clustering and visualization approach based on the SOM is proposed for the task of text mining. The proposed approach first transforms the document space into a multidimensional vector space by means of document encoding. Afterwards, a growing hierarchical SOM (GHSOM) is trained and used as a baseline structure to automatically produce maps with various levels of detail. Following the GHSOM training, the new projection method, namely the ranked centroid projection (RCP), is applied to project the input vectors to a hierarchy of 2-D output maps. The RCP is used as a data analysis tool as well as a direct interface to the data. In a set of simulations, the proposed approach is applied to an illustrative data set and two real-world scientific document collections to demonstrate its applicability.
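    The two steps the abstract describes—training a SOM and projecting input vectors onto a 2-D output map—can be sketched in a few dozen lines. This is an illustrative toy, not the authors' GHSOM/RCP implementation; all function names are assumptions.

```python
import numpy as np

def train_som(data, grid_w=4, grid_h=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map; returns (units, weights, coords)."""
    rng = np.random.default_rng(seed)
    n, dim = data.shape
    weights = rng.random((grid_w * grid_h, dim))
    # 2-D grid coordinate of each map unit.
    coords = np.array([(i % grid_w, i // grid_w)
                       for i in range(grid_w * grid_h)], float)
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)               # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5   # shrinking neighborhood radius
        x = data[rng.integers(n)]             # pick one training vector
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        # Gaussian neighborhood around the BMU, measured on the 2-D grid.
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights, coords

def project(data, weights, coords):
    """Project each input vector to the 2-D grid position of its BMU."""
    bmus = np.argmin(((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2),
                     axis=1)
    return coords[bmus]
```

The GHSOM of the paper additionally grows the grid and recurses into dense regions, and the RCP refines the discrete BMU position into a continuous one; the sketch keeps only the fixed-grid core.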

  13. Data, Analysis, and Visualization | Computational Science | NREL

    Science.gov Websites

    At NREL, our data management, data analysis, and scientific visualization capabilities help move research forward, spanning approaches to image analysis and computer vision as well as data management and big-data systems, software, and tools.

  14. Beyond Ball-and-Stick: Students' Processing of Novel STEM Visualizations

    ERIC Educational Resources Information Center

    Hinze, Scott R.; Rapp, David N.; Williamson, Vickie M.; Shultz, Mary Jane; Deslongchamps, Ghislain; Williamson, Kenneth C.

    2013-01-01

    Students are frequently presented with novel visualizations introducing scientific concepts and processes normally unobservable to the naked eye. Despite being unfamiliar, students are expected to understand and employ the visualizations to solve problems. Domain experts exhibit more competency than novices when using complex visualizations, but…

  15. How Scientists Develop Competence in Visual Communication

    ERIC Educational Resources Information Center

    Ostergren, Marilyn

    2013-01-01

    Visuals (maps, charts, diagrams and illustrations) are an important tool for communication in most scientific disciplines, which means that scientists benefit from having strong visual communication skills. This dissertation examines the nature of competence in visual communication and the means by which scientists acquire this competence. This…

  16. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  17. Visualizando el desarrollo de la nanomedicina en México.

    PubMed

    Robles-Belmont, Eduardo; Gortari-Rabiela, Rebeca de; Galarza-Barrios, Pilar; Siqueiros-García, Jesús Mario; Ruiz-León, Alejandro Arnulfo

    2017-01-01

    In this article we present a set of different visualizations of Mexico's nanomedicine scientific production data. The visualizations were developed using different methodologies for data analysis and visualization, such as social network analysis, geography-of-science maps, and complex-network community analysis. The result is a multi-dimensional overview of the evolution of nanomedicine in Mexico. Moreover, the visualizations allowed us to identify trends and patterns of collaboration at the national and international levels. Trends are also found in the knowledge structure of themes and disciplines. Finally, we identified the scientific communities in Mexico that are responsible for new knowledge production in this emergent field of science. Copyright: © 2017 Secretaría de Salud

  18. GIS Application System Design Applied to Information Monitoring

    NASA Astrophysics Data System (ADS)

    Qun, Zhou; Yujin, Yuan; Yuena, Kang

    Natural environment information management systems involve on-line instrument monitoring, data communications, database establishment, information management software development, and so on. Their core lies in collecting effective and reliable environmental information, increasing the utilization and sharing of environmental information through advanced information technology, and providing as timely and scientific a foundation as possible for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as of the plug-in GIS application framework platform itself. Taking advantage of dynamic plug-in loading and configuration, the thesis rapidly establishes GIS applications through visual component collaborative modeling and realizes GIS application integration. The developed platform is applicable to any integration effort involving GIS applications (on the ESRI platform) and can serve as a base platform for GIS application development.

  19. Visual Field Defects and Retinal Ganglion Cell Losses in Human Glaucoma Patients

    PubMed Central

    Harwerth, Ronald S.; Quigley, Harry A.

    2007-01-01

    Objective The depth of visual field defects is correlated with retinal ganglion cell densities in experimental glaucoma. This study was designed to determine whether a similar structure-function relationship holds for human glaucoma. Methods The study was based on retinal ganglion cell densities and visual thresholds of patients with documented glaucoma (Kerrigan-Baumrind, et al.). The data were analyzed by a model that predicted ganglion cell densities from standard clinical perimetry, which were then compared to histologic cell counts. Results The model, without free parameters, produced accurate and relatively precise quantification of ganglion cell densities associated with visual field defects. For 437 sets of data, the unity correlation for predicted vs. measured cell densities had a coefficient of determination of 0.39. The mean absolute deviation of the predicted vs. measured values was 2.59 dB, and the mean and SD of the distribution of residual errors of prediction were -0.26 ± 3.22 dB. Conclusions Visual field defects by standard clinical perimetry are proportional to neural losses caused by glaucoma. Clinical Relevance The evidence for quantitative structure-function relationships provides a scientific basis for interpreting glaucomatous neuropathy from visual thresholds and supports the application of standard perimetry to establish the stage of the disease. PMID:16769839
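    The summary statistics quoted in the abstract—mean absolute deviation, mean ± SD of the residuals, and the coefficient of determination of the unity (y = x) correlation—can be reproduced from any set of predicted and measured values. The sketch below is illustrative, not the authors' model; the function name is an assumption.

```python
import numpy as np

def structure_function_stats(predicted_db, measured_db):
    """Summary statistics of predicted vs. measured values (in dB)."""
    predicted = np.asarray(predicted_db, float)
    measured = np.asarray(measured_db, float)
    residuals = predicted - measured
    mad = np.mean(np.abs(residuals))          # mean absolute deviation
    # R^2 of the unity line y = x, i.e. with no fitted parameters.
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    return {"mad": mad,
            "resid_mean": residuals.mean(),
            "resid_sd": residuals.std(ddof=1),
            "r2": 1.0 - ss_res / ss_tot}
```

Note that a "unity-correlation" R² can be much lower than that of a fitted regression line, since the model is given no freedom to absorb bias or scale errors.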

  20. [Constructing images and territories: thinking on the visuality and materiality of remote sensing].

    PubMed

    Monteiro, Marko

    2015-01-01

    This article offers a reflection on the question of the image in science, thinking about how visual practices contribute towards the construction of knowledge and territories. The growing centrality of the visual in current scientific practices shows the need for reflection that goes beyond the image. The object of discussion will be the scientific images used in the monitoring and visualization of territory. The article looks into the relations between visuality and a number of other factors: the researchers that construct it; the infrastructure involved in the construction; and the institutions and policies that monitor the territory. It is argued that such image-relations do not just visualize but help to construct the territory based on specific forms. Exploring this process makes it possible to develop a more complex understanding of the forms through which sciences and technology help to construct realities.

  1. An Examination of the Effects of Collaborative Scientific Visualization via Model-Based Reasoning on Science, Technology, Engineering, and Mathematics (STEM) Learning within an Immersive 3D World

    ERIC Educational Resources Information Center

    Soleimani, Ali

    2013-01-01

    Immersive 3D worlds can be designed to effectively engage students in peer-to-peer collaborative learning activities, supported by scientific visualization, to help with understanding complex concepts associated with learning science, technology, engineering, and mathematics (STEM). Previous research studies have shown STEM learning benefits…

  2. From Visual Exploration to Storytelling and Back Again.

    PubMed

    Gratzl, S; Lex, A; Gehlenborg, N; Cosgrove, N; Streit, M

    2016-06-01

    The primary goal of visual data exploration tools is to enable the discovery of new insights. To justify and reproduce insights, the discovery process needs to be documented and communicated. A common approach to documenting and presenting findings is to capture visualizations as images or videos. Images, however, are insufficient for telling the story of a visual discovery, as they lack full provenance information and context. Videos are difficult to produce and edit, particularly due to the non-linear nature of the exploratory process. Most importantly, however, neither approach provides the opportunity to return to any point in the exploration in order to review the state of the visualization in detail or to conduct additional analyses. In this paper we present CLUE (Capture, Label, Understand, Explain), a model that tightly integrates data exploration and presentation of discoveries. Based on provenance data captured during the exploration process, users can extract key steps, add annotations, and author "Vistories", visual stories based on the history of the exploration. These Vistories can be shared for others to view, but also to retrace and extend the original analysis. We discuss how the CLUE approach can be integrated into visualization tools and provide a prototype implementation. Finally, we demonstrate the general applicability of the model in two usage scenarios: a Gapminder-inspired visualization to explore public health data and an example from molecular biology that illustrates how Vistories could be used in scientific journals. (see Figure 1 for visual abstract).
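    The capture-annotate-author cycle behind CLUE can be sketched as a small provenance store: every exploration action is recorded as a state, and a "Vistory" is an annotated, ordered subset of those states that still carries enough context to retrace the analysis. This is a hypothetical toy API, not the CLUE implementation.

```python
class ProvenanceGraph:
    """Minimal sketch of CLUE-style provenance capture (illustrative only)."""

    def __init__(self):
        self.states = []       # full exploration history, in order
        self.annotations = {}  # state id -> author's note

    def capture(self, description, view_state):
        """Record one exploration step; the returned id lets users
        jump back to this exact point later."""
        self.states.append({"desc": description, "view": view_state})
        return len(self.states) - 1

    def annotate(self, state_id, note):
        self.annotations[state_id] = note

    def vistory(self, key_state_ids):
        """Author a story from selected key steps, preserving each
        step's captured view so viewers can retrace and extend it."""
        return [{"desc": self.states[i]["desc"],
                 "view": self.states[i]["view"],
                 "note": self.annotations.get(i, "")}
                for i in key_state_ids]
```

The essential property, as the abstract argues, is that the story is a view over the history rather than an export from it: nothing is flattened into a static image or video.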

  4. Thermal Imaging in the Science Classroom

    ERIC Educational Resources Information Center

    Short, Daniel B.

    2012-01-01

    Thermal cameras are useful tools for use in scientific investigation and for teaching scientific concepts to students in the classroom. Demonstrations of scientific phenomena can be greatly enhanced visually by the use of this cutting-edge technology. (Contains 7 figures.)

  5. Bridging Theory with Practice: An Exploratory Study of Visualization Use and Design for Climate Model Comparison

    DOE PAGES

    Dasgupta, Aritra; Poco, Jorge; Wei, Yaxing; ...

    2015-03-16

    Evaluation methodologies in visualization have mostly focused on how well the tools and techniques cater to the analytical needs of the user. While this is important in determining the effectiveness of the tools and advancing the state-of-the-art in visualization research, a key area that has mostly been overlooked is how well established visualization theories and principles are instantiated in practice. This is especially relevant when domain experts, and not visualization researchers, design visualizations for analysis of their data or for broader dissemination of scientific knowledge. There is very little research on exploring the synergistic capabilities of cross-domain collaboration between domain experts and visualization researchers. To fill this gap, in this paper we describe the results of an exploratory study of climate data visualizations conducted in tight collaboration with a pool of climate scientists. The study analyzes a large set of static climate data visualizations for identifying their shortcomings in terms of visualization design. The outcome of the study is a classification scheme that categorizes the design problems in the form of a descriptive taxonomy. The taxonomy is a first attempt for systematically categorizing the types, causes, and consequences of design problems in visualizations created by domain experts. We demonstrate the use of the taxonomy for a number of purposes, such as, improving the existing climate data visualizations, reflecting on the impact of the problems for enabling domain experts in designing better visualizations, and also learning about the gaps and opportunities for future visualization research. We demonstrate the applicability of our taxonomy through a number of examples and discuss the lessons learnt and implications of our findings.

  7. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  8. Creating a Prototype Web Application for Spacecraft Real-Time Data Visualization on Mobile Devices

    NASA Technical Reports Server (NTRS)

    Lang, Jeremy S.; Irving, James R.

    2014-01-01

    Mobile devices (smart phones, tablets) have become commonplace among almost all sectors of the workforce, especially in the technical and scientific communities. These devices provide individuals the ability to be constantly connected to any area of interest they may have, whenever and wherever they are located. The Huntsville Operations Support Center (HOSC) is attempting to take advantage of this constant connectivity to extend the data visualization component of the Payload Operations and Integration Center (POIC) to a person's mobile device. POIC users currently have a rather unique capability to create custom user interfaces in order to view International Space Station (ISS) payload health and status telemetry. These displays are used at various console positions within the POIC. The Software Engineering team has created a Mobile Display capability that will allow authenticated users to view the same displays created for the console positions on the mobile device of their choice. Utilizing modern technologies including ASP.net, JavaScript, and HTML5, we have created a web application that renders the user's displays in any modern desktop or mobile web browser, regardless of the operating system on the device. Additionally, the application is device aware, which enables it to render its configuration and selection menus with themes that correspond to the particular device. The Mobile Display application uses a communication mechanism known as SignalR to push updates to the web client. This communication mechanism automatically detects the best communication protocol between the client and server and also manages disconnections and reconnections of the client to the server. One benefit of this application is that the user can monitor important telemetry even while away from their console position. If expanded to the scientific community, this application would allow a scientist to view a snapshot of the state of their particular experiment at any time or place. Because the web application renders the displays that can currently be created with the POIC ground system, the user can tailor their displays for a particular device using tools that they are already trained to use.

  9. Visualization at Supercomputing Centers: The Tale of Little Big Iron and the Three Skinny Guys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; van Rosendale, John; Southard, Dale

    2010-12-01

    Supercomputing Centers (SC's) are unique resources that aim to enable scientific knowledge discovery through the use of large computational resources, the Big Iron. Design, acquisition, installation, and management of the Big Iron are activities that are carefully planned and monitored. Since these Big Iron systems produce a tsunami of data, it is natural to co-locate visualization and analysis infrastructure as part of the same facility. This infrastructure consists of hardware (Little Iron) and staff (Skinny Guys). Our collective experience suggests that design, acquisition, installation, and management of the Little Iron and Skinny Guys does not receive the same level of treatment as that of the Big Iron. The main focus of this article is to explore different aspects of planning, designing, fielding, and maintaining the visualization and analysis infrastructure at supercomputing centers. Some of the questions we explore in this article include: "How should the Little Iron be sized to adequately support visualization and analysis of data coming off the Big Iron? What sort of capabilities does it need to have?" Related questions concern the size of the visualization support staff: "How big should a visualization program be (number of persons), and what should the staff do?" and "How much of the visualization should be provided as a support service, and how much should applications scientists be expected to do on their own?"

  10. Machine Detection of Enhanced Electromechanical Energy Conversion in PbZr 0.2Ti 0.8O 3 Thin Films

    DOE PAGES

    Agar, Joshua C.; Cao, Ye; Naul, Brett; ...

    2018-05-28

    Many energy conversion, sensing, and microelectronic applications based on ferroic materials are determined by the domain structure evolution under applied stimuli. New hyperspectral, multidimensional spectroscopic techniques now probe dynamic responses at relevant length and time scales to provide an understanding of how these nanoscale domain structures impact macroscopic properties. Such approaches, however, remain limited in use because of the difficulties that exist in extracting and visualizing scientific insights from these complex datasets. Using multidimensional band-excitation scanning probe spectroscopy and adapting tools from both computer vision and machine learning, an automated workflow is developed to featurize, detect, and classify signatures of ferroelectric/ferroelastic switching processes in complex ferroelectric domain structures. This approach enables the identification and nanoscale visualization of varied modes of response and a pathway to statistically meaningful quantification of the differences between those modes. Finally, the importance of domain geometry for enhancing nanoscale electromechanical energy conversion is spatially visualized.
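    The featurize-then-classify workflow the abstract describes can be caricatured in a few lines: reduce each measured response curve to a handful of descriptive features, then cluster the feature vectors to separate modes of response. Both the features and the tiny 2-means below are illustrative stand-ins, not the paper's computer-vision pipeline.

```python
import numpy as np

def featurize(curves):
    """Reduce each response curve (one row per curve) to three features:
    peak-to-peak range, mean level, and intensity-weighted centroid."""
    curves = np.asarray(curves, float)
    rng_ = curves.max(axis=1) - curves.min(axis=1)
    mean = curves.mean(axis=1)
    centroid = (curves * np.arange(curves.shape[1])).sum(axis=1) / curves.sum(axis=1)
    return np.column_stack([rng_, mean, centroid])

def classify(features, iters=20):
    """Deterministic 2-means over feature vectors: seed the two centers
    at the extremes of the first feature, then iterate assign/update."""
    f = np.asarray(features, float)
    centers = f[[np.argmin(f[:, 0]), np.argmax(f[:, 0])]].copy()
    for _ in range(iters):
        labels = np.argmin(((f[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(2):
            if np.any(labels == j):
                centers[j] = f[labels == j].mean(axis=0)
    return labels
```

The real workflow operates on hyperspectral band-excitation data and uses far richer featurization, but the shape of the pipeline—featurize, cluster, then map cluster labels back onto the sample spatially—is the same.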

  11. Toyz: A framework for scientific analysis of large datasets and astronomical images

    NASA Astrophysics Data System (ADS)

    Moolekamp, F.; Mamajek, E.

    2015-11-01

    As the size of images and data products derived from astronomical data continues to increase, new tools are needed to visualize and interact with that data in a meaningful way. Motivated by our own astronomical images taken with the Dark Energy Camera (DECam), we present Toyz, an open source Python package for viewing and analyzing images and data stored on a remote server or cluster. Users connect to the Toyz web application via a web browser, making it a convenient tool for students to visualize and interact with astronomical data without having to install any software on their local machines. In addition, it provides researchers with an easy-to-use tool for browsing files on a server, quickly viewing very large images (>2 GB) taken with DECam and other cameras with a large FOV, and creating their own visualization tools that can be added as extensions to the default Toyz framework.

  12. The multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) high performance computing infrastructure: applications in neuroscience and neuroinformatics research

    PubMed Central

    Goscinski, Wojtek J.; McIntosh, Paul; Felzmann, Ulrich; Maksimenko, Anton; Hall, Christopher J.; Gureyev, Timur; Thompson, Darren; Janke, Andrew; Galloway, Graham; Killeen, Neil E. B.; Raniga, Parnesh; Kaluza, Owen; Ng, Amanda; Poudel, Govinda; Barnes, David G.; Nguyen, Toan; Bonnington, Paul; Egan, Gary F.

    2014-01-01

    The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific and Industrial Research Organisation (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computed tomography (CT), electron microscopy, and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinformatics and brain imaging research. PMID:24734019

  13. Machine Detection of Enhanced Electromechanical Energy Conversion in PbZr0.2Ti0.8O3 Thin Films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agar, Joshua C.; Cao, Ye; Naul, Brett

    Many energy conversion, sensing, and microelectronic applications based on ferroic materials are determined by the domain structure evolution under applied stimuli. New hyperspectral, multidimensional spectroscopic techniques now probe dynamic responses at relevant length and time scales to provide an understanding of how these nanoscale domain structures impact macroscopic properties. Such approaches, however, remain limited in use because of the difficulties that exist in extracting and visualizing scientific insights from these complex datasets. Using multidimensional band-excitation scanning probe spectroscopy and adapting tools from both computer vision and machine learning, an automated workflow is developed to featurize, detect, and classify signatures of ferroelectric/ferroelastic switching processes in complex ferroelectric domain structures. This approach enables the identification and nanoscale visualization of varied modes of response and a pathway to statistically meaningful quantification of the differences between those modes. Lastly, among other things, the importance of domain geometry is spatially visualized for enhancing nanoscale electromechanical energy conversion.
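    The "featurize" step in a workflow of this kind can be pictured as reducing each measured response loop to a few scalars that a classifier can later cluster; the sketch below is a generic, invented illustration, not the paper's actual pipeline.

```python
# Sketch of featurizing spectroscopic response loops for classification.
# Generic illustration with invented data, not the published workflow.

def featurize(loop):
    """Map a response loop (list of samples) to simple scalar features."""
    return {
        "amplitude": max(loop) - min(loop),
        "mean": sum(loop) / len(loop),
    }

loops = [
    [0.0, 1.0, 2.0, 1.0, 0.0],   # one response mode
    [0.0, 4.0, 8.0, 4.0, 0.0],   # a stronger response mode
]
features = [featurize(l) for l in loops]
# features[0]["amplitude"] == 2.0, features[1]["amplitude"] == 8.0
```

    A real pipeline would extract physically meaningful features (e.g., loop area, coercive field) and feed them to a clustering or classification model.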

  14. The Earth Science Research Network as Seen Through Network Analysis of the AGU

    NASA Astrophysics Data System (ADS)

    Narock, T.; Hasnain, S.; Stephan, R.

    2017-12-01

    Scientometrics is the science of science. Scientometric research includes measurements of impact, mapping of scientific fields, and the production of indicators for use in policy and management. We have leveraged network analysis in a scientometric study of the American Geophysical Union (AGU). Data from the AGU's Linked Data Abstract Browser were used to create visualization and analytics tools for exploring the Earth science research network. Our application applies network theory to examine network structure within the various AGU sections, identify key individuals and communities related to Earth science topics, and examine multi-disciplinary collaboration across sections. Opportunities to optimize Earth science output, as well as policy and outreach applications, are discussed.
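    The kind of analysis described, building a co-authorship graph from abstract author lists and identifying key individuals, can be sketched with plain Python; the names and data below are invented for illustration.

```python
# Sketch of co-authorship network analysis: build an undirected graph
# from author lists and rank authors by degree centrality.
# Author lists here are invented for illustration.
from collections import defaultdict
from itertools import combinations

abstracts = [
    ["Narock", "Hasnain", "Stephan"],
    ["Narock", "Hasnain"],
    ["Stephan", "Doe"],
]

graph = defaultdict(set)
for authors in abstracts:
    for a, b in combinations(authors, 2):
        graph[a].add(b)
        graph[b].add(a)

# Degree centrality: number of distinct co-authors.
degree = {author: len(coauthors) for author, coauthors in graph.items()}
central = max(degree, key=degree.get)
# central == "Stephan" (three distinct co-authors)
```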

  15. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old and was originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards while maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of the old and new rendering code on the same systems and cards. Significant improvements in rendering speed and memory footprint mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will yield significant benefits.

  16. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.
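    A minimal example of the core aggregation a tool like this performs, the per-cell time mean of a gridded, time-varying data set, can be sketched in plain Python; the values below are invented.

```python
# Sketch of aggregating a gridded, time-varying data set into a
# per-cell time mean. Invented values, plain Python for illustration.

# data[t][y][x]: three time steps of a 2x2 grid (e.g., carbon flux).
data = [
    [[1.0, 2.0], [3.0, 4.0]],
    [[2.0, 3.0], [4.0, 5.0]],
    [[3.0, 4.0], [5.0, 6.0]],
]

steps = len(data)
rows, cols = len(data[0]), len(data[0][0])

# Average each grid cell across all time steps.
time_mean = [[sum(data[t][y][x] for t in range(steps)) / steps
              for x in range(cols)]
             for y in range(rows)]
# time_mean == [[2.0, 3.0], [4.0, 5.0]]
```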

  17. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvements in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme-scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.

  18. Visual Literacy in Bloom: Using Bloom's Taxonomy to Support Visual Learning Skills

    ERIC Educational Resources Information Center

    Arneson, Jessie B.; Offerdahl, Erika G.

    2018-01-01

    "Vision and Change" identifies science communication as one of the core competencies in undergraduate biology. Visual representations are an integral part of science communication, allowing ideas to be shared among and between scientists and the public. As such, development of scientific visual literacy should be a desired outcome of…

  19. Visual Organizers as Scaffolds in Teaching English as a Foreign Language

    ERIC Educational Resources Information Center

    Chang, Yu-Liang

    2006-01-01

    This thesis deals with using visual organizers as scaffolds in teaching English as a foreign language (EFL). Based on the findings of scientific researches, the review of literature explicates the effectiveness and fruitfulness in employing visuals organizers in EFL instructions. It includes the five following components. First, visual organizers…

  20. How Investment in #GovTech Tools Helped with USGS Disaster Response During Hurricane Harvey

    NASA Astrophysics Data System (ADS)

    Shah, S.; Pearson, D. K.

    2017-12-01

    Hurricane Harvey was an unprecedented storm event that challenged not only decision-makers but also the scientific community to provide clear and rapid dissemination of changing streamflow conditions and potential flooding concerns. Of primary importance to the U.S. Geological Survey (USGS) Texas Water Science Center was the availability of accessible data and scientific communication of rapidly changing water conditions across Texas, covering heavy rainfall rates and rising river, stream, and lake elevations where USGS has monitoring stations. Infrastructure modernization leading to advanced GovTech practices and data visualization was key to the USGS role in providing data during Hurricane Harvey. In the last two years, USGS has released two web applications, "Texas Water Dashboard" and "Water-On-The-Go", which were heavily utilized by partners, local media, and municipal government officials. These tools provided the backbone for data distribution through both desktop and mobile applications as decision support during flood events. The combination of Texas Water Science Center web tools and the USGS National Water Information System handled more than 5 million data requests over the course of the storm. On-the-ground local information near Buffalo Bayou and the Addicks/Barker Dams, as well as statewide support of USGS real-time scientific data, was delivered to the National Weather Service, U.S. Army Corps of Engineers, FEMA, the Harris County Flood Control District, the general public, and others. This presentation will provide an overview of GovTech solutions used during Hurricane Harvey, including the history of USGS tool development, discussion of the public response, and future applications for helping provide scientific communications to the public.

  1. Supporting the scientific lifecycle through cloud services

    NASA Astrophysics Data System (ADS)

    Gensch, S.; Klump, J. F.; Bertelmann, R.; Braune, C.

    2014-12-01

    Cloud computing has made resources and applications available for numerous use cases, ranging from business processes in the private sector to scientific applications. Developers have created tools for data management, collaborative writing, social networking, data access and visualization, project management, and many more, either for free or as paid premium services with additional or extended features. Scientists have begun to incorporate tools that fit their needs into their daily work. To satisfy specialized needs, some cloud applications specifically address the needs of scientists for sharing research data, literature search, laboratory documentation, or data visualization. Cloud services may vary in extent, user coverage, and inter-service integration, and are also at risk of being abandoned or changed by service providers altering their business model or leaving the field entirely. Within the project Academic Enterprise Cloud we examine cloud-based services that support the research lifecycle, using feature models to describe key properties in the areas of infrastructure and service provision, compliance with legal regulations, and data curation. Emphasis is put on the term Enterprise, as to establish an academic cloud service provider infrastructure that satisfies the demands of the research community through continuous provision across the whole cloud stack. This could enable the research community to be independent of service providers regarding changes to terms of service and to retain full control of the services' extent and usage. This shift towards a self-empowered scientific cloud provider infrastructure and its community raises implications about the feasibility of provision and overall costs. Legal aspects and licensing issues have to be considered when moving data into cloud services, especially when personal data are involved. Educating researchers about cloud-based tools is important to help in the transition towards effective and safe use. Scientists can benefit from the provision of standard services, like weblog and website creation, virtual machine deployment, and groupware provision, using cloud-based app-store-like portals. And, unlike in an industrial environment, researchers will want to keep their existing user profile when moving from one institution to another.

  2. Bringing "Scientific Expeditions" Into the Schools

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as simulations or measurements of fluid dynamics). The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics (CFD) and wind tunnel testing. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video formats (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over video formats are: 1. The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video formats transmitted over the network. 2. The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). 3. A rich variety of guided expeditions through the data can be included easily. 4. A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. 5. The scenes can be viewed in 3D using stereo vision. 6. The network bandwidth used for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec, whereas the measured bandwidth for a small video picture was 500 Kbits/sec.)
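    The bandwidth argument in the abstract can be made concrete with back-of-the-envelope arithmetic comparing a continuous pixel stream against a one-time raw-data transfer plus small control messages; the figures below are illustrative assumptions, not the paper's measurements.

```python
# Illustrative comparison: streaming pixels versus sending raw data once
# and rendering locally. All figures are invented assumptions.

# Video approach: stream frames continuously (uncompressed, for simplicity).
width, height, bytes_per_pixel, fps = 320, 240, 3, 15
video_bits_per_sec = width * height * bytes_per_pixel * 8 * fps

# Raw-data approach: one up-front transfer, then tiny control messages.
grid_points = 100_000
bytes_per_point = 4 * 6          # e.g., coordinates + a vector + a scalar
one_time_bits = grid_points * bytes_per_point * 8
control_bits_per_sec = 1_000     # ~1 Kbit/s of viewing-script commands

# Steady-state bandwidth is orders of magnitude lower for the raw-data
# approach, which is the effect the abstract reports.
ratio = video_bits_per_sec // control_bits_per_sec
```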

  3. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three-dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video formats (transmitting pixel information) such as video conferencing or MPEG (Moving Picture Experts Group) movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over video formats are: (1) The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video formats transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. (5) The scenes can be viewed in 3D using stereo vision. (6) The network bandwidth for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec, whereas the measured bandwidth for a small video picture was 500 Kbits/sec.) This talk will illustrate the use of these new technologies and present a proposal for using them to improve science education.

  4. Syntonic phototherapy

    NASA Astrophysics Data System (ADS)

    Gottlieb, Raymond L.

    2010-02-01

    Syntonic phototherapy is an application of clinical phototherapy that is not well known to most LLLT photobiomodulation researchers and clinicians in spite of its long history, for three main reasons: the approach was beyond the limits of the "reasonable" scientific paradigm, it has not been well researched, and it is used mainly by optometrists. Clinical and basic research in recent decades on light's impact on cells, tissues, blood, circadian rhythms, and mood disorders has broadened the paradigm and increased the acceptance of light as a healing agent. Perhaps now is an appropriate time to describe syntonic optometric phototherapy with the purpose of encouraging research to validate and expand its use. Syntonics uses non-coherent, non-polarized, broad-band light delivered into the eyes to treat brain injury, headache, strabismus, eye pathology, learning disability, and mood and developmental syndromes. The eyes permit direct, non-invasive application of light to the retinal blood supply and to non-visual retinal photoreceptor systems that signal circadian and other brain centers. Patients look at prescribed colors for 20 minutes/day for twenty treatments. Visual field, pupil, and binocular testing, medical history, and current symptoms determine the syntonic filter prescription. The presentation describes syntonic theory, the phototherapy device, visual field and pupil tests, and case reports with pre- and post-treatment data and case resolutions.

  5. Scientific Data Management Center for Enabling Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vouk, Mladen A.

    Managing scientific data has been identified by the scientific community as one of the most important emerging needs because of the sheer volume and increasing complexity of data being collected. Effectively generating, managing, and analyzing this information requires a comprehensive, end-to-end approach to data management that encompasses all of the stages from the initial data acquisition to the final analysis of the data. Fortunately, the data management problems encountered by most scientific domains are common enough to be addressed through shared technology solutions. Based on community input, we have identified three significant requirements. First, more efficient access to storage systems is needed. In particular, parallel file system and I/O system improvements are needed to write and read large volumes of data without slowing a simulation, analysis, or visualization engine. These processes are complicated by the fact that scientific data are structured differently for specific application domains, and are stored in specialized file formats. Second, scientists require technologies to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis and searches over extremely large data sets. Specialized feature discovery and statistical analysis techniques are needed before the data can be understood or visualized. Furthermore, interactive analysis requires techniques for efficiently selecting subsets of the data. Finally, generating the data, collecting and storing the results, keeping track of data provenance, data post-processing, and analysis of results is a tedious, fragmented process. Tools for automation of this process in a robust, tractable, and recoverable fashion are required to enhance scientific exploration. The SDM center was established under the SciDAC program to address these issues. 
    The SciDAC-1 Scientific Data Management (SDM) Center succeeded in bringing an initial set of advanced data management technologies to DOE application scientists in astrophysics, climate, fusion, and biology. Equally important, it established collaborations with these scientists to better understand their science as well as their forthcoming data management and data analytics challenges. Building on our early successes, we have greatly enhanced, robustified, and deployed our technology to these communities. In some cases, we identified new needs that have been addressed in order to simplify the use of our technology by scientists. This report summarizes our work so far in SciDAC-2. Our approach is to employ an evolutionary development and deployment process: from research through prototypes to deployment and infrastructure. Accordingly, we have organized our activities in three layers that abstract the end-to-end data flow described above. We labeled the layers (from bottom to top): a) Storage Efficient Access (SEA), b) Data Mining and Analysis (DMA), c) Scientific Process Automation (SPA). The SEA layer is immediately on top of hardware, operating systems, file systems, and mass storage systems, and provides parallel data access technology, and transparent access to archival storage. The DMA layer, which builds on the functionality of the SEA layer, consists of indexing, feature identification, and parallel statistical analysis technology. The SPA layer, which is on top of the DMA layer, provides the ability to compose scientific workflows from the components in the DMA layer as well as application specific modules. NCSU work performed under this contract was primarily at the SPA layer.
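    The subset-selection role of the DMA layer's indexing can be pictured as a coarse bitmap-style index that answers range queries without scanning the raw data, a much-simplified stand-in for technologies such as FastBit; the bin names and values below are invented.

```python
# Sketch of coarse bin-based indexing for subset selection.
# Much-simplified stand-in for bitmap-index technology; invented data.

values = [10.5, 99.2, 42.0, 87.1, 15.3, 91.8]

# Build one index list per coarse bin, once, ahead of queries.
bins = {"low": [], "high": []}
for i, v in enumerate(values):
    (bins["high"] if v > 50.0 else bins["low"]).append(i)

# Query: indices of records with value > 50.0, answered from the index
# alone, without re-scanning the full data set.
hot = bins["high"]
# hot == [1, 3, 5]
```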

  6. Methods for structuring scientific knowledge from many areas related to aging research.

    PubMed

    Zhavoronkov, Alex; Cantor, Charles R

    2011-01-01

    Aging and age-related disease represent a substantial share of current natural, social, and behavioral science research efforts. Presently, no centralized system exists for tracking aging research projects across numerous research disciplines. The multidisciplinary nature of this research complicates the understanding of underlying project categories, the establishment of project relations, and the development of a unified project classification scheme. We have developed a highly visual database, the International Aging Research Portfolio (IARP), available at AgingPortfolio.org, to address this issue. The database integrates information on research grants, peer-reviewed publications, and issued patent applications from multiple sources. Additionally, the database uses flexible project classification mechanisms and tools for analyzing project associations and trends. This system enables scientists to search the centralized project database, to classify and categorize aging projects, and to analyze funding across multiple research disciplines. The IARP is designed to improve the allocation and prioritization of scarce research funding, to reduce project overlap, and to improve scientific collaboration, thereby accelerating scientific and medical progress in a rapidly growing area of research. Grant applications often precede publications, and some grants do not result in publications; the system therefore provides an earlier and broader view of research activity in many research disciplines. This project is a first attempt to provide a centralized database system for research grants and to categorize aging research projects into multiple subcategories using both advanced machine algorithms and a hierarchical environment for scientific collaboration.

  7. Bring NASA Scientific Data into GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.

    2016-12-01

    NASA's Earth Observing System (EOS) and many other missions produce data of huge volume and in near real time, driving the research and understanding of climate change. Geographic Information System (GIS) is a technology used for the management, visualization, and analysis of spatial data. Since its inception in the 1960s, GIS has been applied to many fields at the city, state, national, and world scales. People continue to use it today to analyze and visualize trends, patterns, and relationships in massive scientific datasets. There is great interest in both the scientific and GIS communities in improving technologies that can bring scientific data into a GIS environment, where scientific research and analysis can be shared through the GIS platform with the public. Most NASA scientific data are delivered in the Hierarchical Data Format (HDF), a format that is both flexible and powerful. However, this flexibility creates challenges for GIS software support: data stored in HDF formats lack a unified standard and convention across products. The presentation introduces an information model that enables ArcGIS software to ingest NASA scientific data and create a multidimensional raster - univariate and multivariate hypercubes - for scientific visualization and analysis. We present the framework by which ArcGIS leverages the open source GDAL (Geospatial Data Abstraction Library) to support raster data access, discuss how we overcame the limitations of GDAL drivers in handling scientific products stored in HDF4 and HDF5 formats, and describe how we improved the modeling of multidimensionality with GDAL. In addition, we will discuss the direction of ArcGIS support for NASA products and demonstrate how the multidimensional information model can help scientists work with data products such as MODIS, MOPITT, and SMAP in a GIS environment.
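    The multidimensional-raster ("hypercube") idea can be pictured as gridded values indexed by variable, time, and cell; the sketch below is a generic illustration with invented variable names and values, not ArcGIS's internal model.

```python
# Sketch of a multidimensional raster ("hypercube") keyed by
# (variable, time, row, column). Generic illustration, invented data.

cube = {}

def put(var, t, y, x, value):
    cube[(var, t, y, x)] = value

def slice_at(var, t, rows, cols):
    """Extract one 2D grid (a raster slice) from the hypercube."""
    return [[cube[(var, t, y, x)] for x in range(cols)] for y in range(rows)]

put("soil_moisture", 0, 0, 0, 0.21)
put("soil_moisture", 0, 0, 1, 0.24)
put("soil_moisture", 0, 1, 0, 0.19)
put("soil_moisture", 0, 1, 1, 0.26)

grid = slice_at("soil_moisture", 0, 2, 2)
# grid == [[0.21, 0.24], [0.19, 0.26]]
```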

  8. 22 CFR 61.3 - Certification and authentication criteria.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... AUDIO-VISUAL MATERIALS § 61.3 Certification and authentication criteria. (a) The Department shall certify or authenticate audio-visual materials submitted for review as educational, scientific and... of the material. (b) The Department will not certify or authenticate any audio-visual material...

  9. 22 CFR 61.3 - Certification and authentication criteria.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... AUDIO-VISUAL MATERIALS § 61.3 Certification and authentication criteria. (a) The Department shall certify or authenticate audio-visual materials submitted for review as educational, scientific and... of the material. (b) The Department will not certify or authenticate any audio-visual material...

  10. 22 CFR 61.3 - Certification and authentication criteria.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... AUDIO-VISUAL MATERIALS § 61.3 Certification and authentication criteria. (a) The Department shall certify or authenticate audio-visual materials submitted for review as educational, scientific and... of the material. (b) The Department will not certify or authenticate any audio-visual material...

  11. User Interface Technology Transfer to NASA's Virtual Wind Tunnel System

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1998-01-01

    Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.

  12. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    NASA Technical Reports Server (NTRS)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

    The goal of this task was to create a design and prototype implementation of a database environment particularly suited to handling the image, vision, and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities designed to execute efficiently on parallel computers. A key feature of the environment is an interface that allows a scientist to specify high-level directives about how query execution should occur.

  13. Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Earth and space science participants were able to see where current research could be applied in their disciplines, and computer science participants could see potential areas for future application of computer and information systems research. The Earth and space science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation; this effort was therefore not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.

  14. Discovering Communicable Models from Earth Science Data

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark; Langley, Pat; Potter, Christopher; Klooster, Steven; Torregrosa, Alicia

    2002-01-01

    This chapter describes how we used regression rules to improve upon results previously published in the Earth science literature. In such a scientific application of machine learning, it is crucially important for the learned models to be understandable and communicable. We recount how we selected a learning algorithm to maximize communicability, and then describe two visualization techniques that we developed to aid in understanding the model by exploiting the spatial nature of the data. We also report how evaluating the learned models across time let us discover an error in the data.
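    A regression-rule model of the general kind the chapter describes pairs a condition on the inputs with a simple linear prediction; the rules and coefficients below are invented for illustration, not the published model.

```python
# Sketch of a regression-rule model: each rule pairs a condition on the
# inputs with a linear prediction. Invented rules, not the learned model.

rules = [
    # (condition, slope, intercept) applied to a single predictor x
    (lambda x: x < 10.0, 2.0, 1.0),
    (lambda x: x >= 10.0, 0.5, 16.0),
]

def predict(x):
    """Return the prediction of the first rule whose condition matches."""
    for cond, slope, intercept in rules:
        if cond(x):
            return slope * x + intercept
    raise ValueError("no rule matched")

# predict(4.0) == 9.0 ; predict(20.0) == 26.0
```

    Such models stay communicable because each rule can be read and checked against domain knowledge, which is the property the chapter emphasizes.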

  15. PyContact: Rapid, Customizable, and Visual Analysis of Noncovalent Interactions in MD Simulations.

    PubMed

    Scheurer, Maximilian; Rodenkirch, Peter; Siggel, Marc; Bernardi, Rafael C; Schulten, Klaus; Tajkhorshid, Emad; Rudack, Till

    2018-02-06

    Molecular dynamics (MD) simulations have become ubiquitous in all areas of life sciences. The size and model complexity of MD simulations are rapidly growing along with increasing computing power and improved algorithms. This growth has led to the production of a large amount of simulation data that need to be filtered for relevant information to address specific biomedical and biochemical questions. One of the most relevant molecular properties that can be investigated by all-atom MD simulations is the time-dependent evolution of the complex noncovalent interaction networks governing such fundamental aspects as molecular recognition, binding strength, and mechanical and structural stability. Extracting, evaluating, and visualizing noncovalent interactions is a key task in the daily work of structural biologists. We have developed PyContact, an easy-to-use, highly flexible, and intuitive graphical user interface-based application, designed to provide a toolkit to investigate biomolecular interactions in MD trajectories. PyContact is designed to facilitate this task by enabling identification of relevant noncovalent interactions in a comprehensible manner. The implementation of PyContact as a standalone application enables rapid analysis and data visualization without any additional programming requirements, and also preserves full in-program customization and extension capabilities for advanced users. The statistical analysis representation is interactively combined with full mapping of the results on the molecular system through the synergistic connection between PyContact and VMD. We showcase the capabilities and scientific significance of PyContact by analyzing and visualizing in great detail the noncovalent interactions underlying the ion permeation pathway of the human P2X3 receptor. As a second application, we examine the protein-protein interaction network of the mechanically ultrastable cohesin-dockerin complex. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
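    At its core, contact analysis of this kind is a distance-cutoff search over atom pairs in each trajectory frame. A minimal numpy sketch of that step is shown below; it is illustrative only, as PyContact's actual criteria also cover hydrogen bonds, salt bridges, and other interaction types:

```python
import numpy as np

def find_contacts(coords_a, coords_b, cutoff=4.5):
    """Return index pairs (i, j) of atoms from two groups that lie within
    `cutoff` angstroms of each other in a single MD trajectory frame.

    coords_a, coords_b: (N, 3) and (M, 3) arrays of atomic positions.
    """
    # Pairwise distance matrix via broadcasting: (N, 1, 3) - (1, M, 3)
    diff = coords_a[:, None, :] - coords_b[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    i, j = np.nonzero(dist < cutoff)
    return list(zip(i.tolist(), j.tolist()))

# Toy frame: two atoms in group A, two in group B
a = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
b = np.array([[3.0, 0.0, 0.0], [20.0, 0.0, 0.0]])
print(find_contacts(a, b))  # [(0, 0)]
```

Run over every frame, the resulting contact lists give the time-dependent interaction network the abstract describes.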

  16. Orchestrating Distributed Resource Ensembles for Petascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baldin, Ilya; Mandal, Anirban; Ruth, Paul

    2014-04-24

    Distributed, data-intensive computational science applications of interest to DOE scientific communities move large amounts of data for experiment data management, distributed analysis steps, remote visualization, and accessing scientific instruments. These applications need to orchestrate ensembles of resources from multiple resource pools and interconnect them with high-capacity multi-layered networks across multiple domains. It is highly desirable to design mechanisms that provide this type of resource provisioning capability to a broad class of applications. It is also important to have coherent monitoring capabilities for such complex distributed environments. In this project, we addressed these problems by designing an abstract API, enabled by novel semantic resource descriptions, for provisioning complex and heterogeneous resources from multiple providers using their native provisioning mechanisms and control planes: computational, storage, and multi-layered high-speed network domains. We used an extensible resource representation based on semantic web technologies to afford maximum flexibility to applications in specifying their needs. We evaluated the effectiveness of provisioning using representative data-intensive applications. We also developed mechanisms for providing feedback about resource performance to the application, to enable closed-loop feedback control and dynamic adjustments to resource allocations (elasticity). This was enabled through development of a novel persistent query framework that consumes disparate sources of monitoring data, including perfSONAR, and provides scalable distribution of asynchronous notifications.

  17. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. The goal of this project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments.
SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications. We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration.
Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
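    The workflow-mapping objective of minimizing end-to-end delay can be illustrated on a task DAG: with unlimited parallel resources, the minimum delay equals the longest cost-weighted path through the graph. The following is a small hypothetical sketch of that computation, not SWAMP code:

```python
from collections import defaultdict

def end_to_end_delay(tasks, edges):
    """Minimum end-to-end delay of a workflow DAG: the longest path through
    per-task execution times, assuming unlimited parallel resources.

    tasks: {name: execution_time}; edges: list of (upstream, downstream).
    """
    succ = defaultdict(list)
    indeg = {t: 0 for t in tasks}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    # Kahn's topological traversal, accumulating earliest finish times
    finish = {t: tasks[t] for t in tasks}
    ready = [t for t in tasks if indeg[t] == 0]
    while ready:
        u = ready.pop()
        for v in succ[u]:
            finish[v] = max(finish[v], finish[u] + tasks[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(finish.values())

# A linear pipeline and a branching DAG, the two workflow shapes named above
print(end_to_end_delay({"a": 2, "b": 3, "c": 1},
                       [("a", "b"), ("b", "c")]))  # 6
print(end_to_end_delay({"a": 2, "b": 3, "c": 1, "d": 1},
                       [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]))  # 6
```

In the branching case the delay is set by the critical path a → b → d, even though c runs in parallel.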

  18. Facilitating Scientific Collaboration and Education with Easy Access Web Maps Using the AGAP Antarctic Geophysical Data

    NASA Astrophysics Data System (ADS)

    Abdi, A.

    2012-12-01

    Science and science education benefit from easy access to data, yet geophysical data sets are often large, complex, and difficult to share. The difficulty in sharing data and imagery inhibits both collaboration and the use of real data in educational applications. The dissemination of data products through web maps serves as an efficient and user-friendly method for students, the public, and the science community to gain insights and understanding from data. Few research groups provide direct access to their data, let alone map-based visualizations. By building upon current GIS infrastructure with web mapping technologies, like ArcGIS Server, scientific groups, institutions and agencies can enhance the value of their GIS investments. The advantages of web maps to serve data products are many; existing web-mapping technology allows complex GIS analysis to be shared across the Internet, and can be easily scaled from a few users to millions. This poster highlights the features of an interactive web map developed at the Polar Geophysics Group at the Lamont-Doherty Earth Observatory of Columbia University that provides a visual representation of, and access to, data products that resulted from the group's recently concluded AGAP project (http://pgg.ldeo.columbia.edu). The AGAP project collected more than 120,000 line km of new aerogeophysical data using two Twin Otter aircraft. Data included ice penetrating radar, magnetometer, gravimeter and laser altimeter measurements. The web map is based upon ArcGIS Viewer for Flex, which is a configurable client application built on the ArcGIS API for Flex that works seamlessly with ArcGIS Server 10. The application can serve a variety of raster and vector file formats through the Data Interoperability for Server, which eliminates data sharing barriers across numerous file formats. The ability of the application to serve large datasets is only hindered by the availability of appropriate hardware.
ArcGIS is a proprietary product, but there are a few data portals in the earth sciences that have a map interface using open access products such as MapServer and OpenLayers, the most notable being the NASA IceBridge Data Portal. Indeed, with the widespread availability of web mapping technology, the scientific community should move in this direction when disseminating their data.

  19. Teaching foundational topics and scientific skills in biochemistry within the conceptual framework of HIV protease.

    PubMed

    Johnson, R Jeremy

    2014-01-01

    HIV protease has served as a model protein for understanding protein structure, enzyme kinetics, structure-based drug design, and protein evolution. Inhibitors of HIV protease are also an essential part of effective HIV/AIDS treatment and have provided great societal benefits. The broad applications for HIV protease and its inhibitors make it a perfect framework for integrating foundational topics in biochemistry around a big-picture scientific and societal issue. Herein, I describe a series of classroom exercises that integrate foundational topics in biochemistry around the structure, biology, and therapeutic inhibition of HIV protease. These exercises center on foundational topics in biochemistry including thermodynamics, acid/base properties, protein structure, ligand binding, and enzymatic catalysis. The exercises also incorporate regular student practice of scientific skills including analysis of primary literature, evaluation of scientific data, and presentation of technical scientific arguments. Through the exercises, students also gain experience accessing computational biochemical resources such as the Protein Data Bank, Proteopedia, and protein visualization software. As these HIV-centered exercises cover foundational topics common to all first semester biochemistry courses, these exercises should appeal to a broad audience of undergraduate students and should be readily integrated into a variety of teaching styles and classroom sizes. © 2014 The International Union of Biochemistry and Molecular Biology.

  20. CONTROLLING STUDENT RESPONSES DURING VISUAL PRESENTATIONS--STUDIES IN TELEVISED INSTRUCTION, THE ROLE OF VISUALS IN VERBAL LEARNING, REPORT 2.

    ERIC Educational Resources Information Center

    GROPPER, GEORGE L.

    THIS IS A REPORT OF TWO STUDIES IN WHICH PRINCIPLES OF PROGRAMED INSTRUCTION WERE ADAPTED FOR VISUAL PRESENTATIONS. SCIENTIFIC DEMONSTRATIONS WERE PREPARED WITH A VISUAL PROGRAM AND A VERBAL PROGRAM ON--(1) ARCHIMEDES' LAW AND (2) FORCE AND PRESSURE. RESULTS SUGGESTED THAT RESPONSES ARE MORE READILY BROUGHT UNDER THE CONTROL OF VISUAL PRESENTATION…

  1. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization.

    PubMed

    Bernal-Rusiel, Jorge L; Rannou, Nicolas; Gollub, Randy L; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient real-time communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution.
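    The shared data model described above is attractive because renderer state is tiny compared with the image volumes themselves: only a small JSON object needs to be broadcast. A generic sketch of that pattern (hypothetical field names; the actual XTK/MedView state schema may differ):

```python
import json

# Hypothetical renderer state: camera pose plus volume display settings.
state = {
    "camera": {"position": [0.0, 0.0, 120.0], "up": [0.0, 1.0, 0.0]},
    "volume": {"opacity": 0.8, "lowerThreshold": 10, "windowHigh": 900},
}

def encode_state(state):
    """Serialize renderer state compactly for the collaborative data model."""
    return json.dumps(state, separators=(",", ":"), sort_keys=True)

def apply_remote_state(local, payload):
    """Merge a remote peer's state update into the local renderer state."""
    local.update(json.loads(payload))
    return local

payload = encode_state(state)
print(len(payload) < 200)  # a camera/display update is ~100 bytes, not gigabytes
```

Because every client already holds the full dataset, synchronizing this small object is all that is needed to keep remote views consistent.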

  2. A Critique of the Theoretical and Empirical Literature of the Use of Diagrams, Graphs, and Other Visual Aids in the Learning of Scientific-Technical Content from Expository Texts and Instruction

    ERIC Educational Resources Information Center

    Carifio, James; Perla, Rocco J.

    2009-01-01

    This article presents a critical review and analysis of key studies that have been done in science education and other areas on the effects and effectiveness of using diagrams, graphs, photographs, illustrations, and concept maps as "adjunct visual aids" in the learning of scientific-technical content. It also summarizes and reviews those studies…

  3. [Visual representation of biological structures in teaching material].

    PubMed

    Morato, M A; Struchiner, M; Bordoni, E; Ricciardi, R M

    1998-01-01

    Parameters must be defined for presenting and handling scientific information presented in the form of teaching materials. Through library research and consultations with specialists in the health sciences and in graphic arts and design, this study undertook a comparative description of the first examples of scientific illustrations of anatomy and the evolution of visual representations of knowledge on the cell. The study includes significant examples of illustrations which served as elements of analysis.

  4. MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, C. M.; Boyle, K. L.; Reagan, M.

    2013-09-30

    Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and these have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
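    The essence of a Voronoi tessellation is that each cell contains the region of space closer to its seed point than to any other seed. A minimal numpy sketch of that partition on a sampled domain (illustrative only; MeshVoro builds true polyhedral cells via the Voro library rather than labeling grid samples):

```python
import numpy as np

def voronoi_labels(grid_points, seeds):
    """Assign each sample point to its nearest seed: a discrete Voronoi partition."""
    # (G, 1, D) - (1, S, D) -> squared distance from every point to every seed
    d2 = ((grid_points[:, None, :] - seeds[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

# 2D domain sampled on a coarse grid, partitioned among three seeds
xs, ys = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
grid = np.column_stack([xs.ravel(), ys.ravel()])
seeds = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.9]])
labels = voronoi_labels(grid, seeds)
print(labels.shape, sorted(set(labels.tolist())))  # (25,) [0, 1, 2]
```

Placing seeds where resolution is needed yields locally refined unstructured cells, which is what lets Voronoi meshes describe complex geometries with relatively few grid blocks.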

  5. BASTet: Shareable and Reproducible Analysis and Visualization of Mass Spectrometry Imaging Data via OpenMSI.

    PubMed

    Rubel, Oliver; Bowen, Benjamin P

    2018-01-01

    Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.
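    The provenance idea at the heart of such a framework can be sketched generically: every analysis step returns both its result and a record of how the result was produced, so the analysis can later be reproduced or audited. The field names below are hypothetical and do not reflect BASTet's actual interface:

```python
import hashlib
import json
import time

def run_step(name, params, func, data):
    """Run one analysis step and return (result, provenance record)."""
    result = func(data, **params)
    record = {
        "analysis": name,
        "parameters": params,
        "input_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
        "timestamp": time.time(),
    }
    return result, record

# Toy step: threshold a list of spectral intensities
peaks, prov = run_step(
    "peak_filter",
    {"threshold": 5},
    lambda d, threshold: [x for x in d if x > threshold],
    [1, 7, 3, 9],
)
print(peaks)  # [7, 9]
```

Storing such records alongside derived data products is what makes an analysis shareable and reproducible rather than a one-off computation.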

  6. New NASA 3D Animation Shows Seven Days of Simulated Earth Weather

    NASA Image and Video Library

    2014-08-11

    This visualization shows early test renderings of a global computational model of Earth's atmosphere based on data from NASA's Goddard Earth Observing System Model, Version 5 (GEOS-5). This particular run, called Nature Run 2, was run on a supercomputer, spanned 2 years of simulation time at 30 minute intervals, and produced petabytes of output. The visualization spans a little more than 7 days of simulation time, which is 354 time steps. The time period was chosen because a simulated category-4 typhoon developed off the coast of China. The 7 day period is repeated several times during the course of the visualization. Credit: NASA's Scientific Visualization Studio. Read more or download here: svs.gsfc.nasa.gov/goto?4180

  7. Observation and visualization: reflections on the relationship between science, visual arts, and the evolution of the scientific image.

    PubMed

    Kolijn, Eveline

    2013-10-01

    The connections between biological sciences, art and printed images are of great interest to the author. She reflects on the historical relevance of visual representations for science. She argues that the connection between art and science seems to have diminished during the twentieth century. However, this connection is currently growing stronger again through digital media and new imaging methods. Scientific illustrations have fuelled art, while visual modeling tools have assisted scientific research. As a print media artist, she explores the relationship between art and science in her studio practice and will present this historical connection with examples related to evolution, microbiology and her own work. Art and science share a common source, which leads to scrutiny and enquiry. Science sets out to reveal and explain our reality, whereas art comments and makes connections that don't need to be tested by rigorous protocols. Art and science should each be evaluated on their own merit. Allowing room for both in the quest to understand our world will lead to an enriched experience.

  8. Spatial Data Infrastructures (SDIs): Improving the Scientific Environmental Data Management and Visualization with ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Hogeweg, M.; Rose, B.; Turner, A.

    2017-12-01

    The requirement for quality, authoritatively sourced data can often be challenging when working with scientific data. In addition, the lack of a standard mechanism to discover, access, and use such data can be cumbersome. This results in slow research, poor dissemination, and missed opportunities for research to positively impact policy and knowledge. There is widespread recognition that authoritative datasets are maintained by multiple organizations following various standards, and addressing these challenges will involve multiple stakeholders as well. The bottom line is that organizations need a mechanism to efficiently create, share, catalog, and discover data, and the ability to apply these to create authoritative information products and applications is powerful and provides value. In real-world applications, individual organizations develop, modify, finalize, and support foundational data for distributed users across the system and thus require an efficient method of data management. For this, the SDI (Spatial Data Infrastructure) framework can be applied so that GIS users can make efficient, powerful decisions based on strong visualization and analytics. Working with research institutions, governments, and organizations across the world, we have developed a Hub framework for data and analysis sharing that is driven by outcome-centric goals which apply common methodologies and standards. SDIs are an operational capability that should be equitably accessible to policy-makers, analysts, departments and public communities. These SDIs need to align with operational workflows and support social communications and collaboration. The Hub framework integrates data across agencies, projects and organizations to support interoperability and drive coordination. We will present and share how Esri has been supporting the development of local, state, and national SDIs for many years and show some use cases for applications of planetary SDI.
We will also share what makes an SDI successful, how organizations have used the ArcGIS platform to quickly stand up key SDI products and applications, and describe some typical SDI scenarios.

  9. LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team

    2011-01-01

    The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, including results from LSST ISSC team members on the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
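    Outlier detection in multivariate catalogs of the kind mentioned above is often done by flagging objects far from the bulk of the distribution. One standard approach is the Mahalanobis distance; the sketch below uses synthetic data and is illustrative only, not the ISSC's actual pipeline:

```python
import numpy as np

def mahalanobis_outliers(X, threshold=3.0):
    """Flag rows of a multivariate catalog whose Mahalanobis distance
    from the sample mean exceeds `threshold`."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    inv = np.linalg.inv(cov)
    diff = X - mu
    # d_i^2 = diff_i . inv . diff_i for every row i
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, inv, diff))
    return np.nonzero(d > threshold)[0]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))        # synthetic 3-feature catalog
X[0] = [10.0, 10.0, 10.0]            # planted anomaly
print(0 in mahalanobis_outliers(X))  # True
```

Unlike per-feature cuts, this accounts for correlations between catalog features, which matters in the highly dimensional data LSST will produce.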

  10. Complementary and alternative medicine for post-traumatic stress disorder symptoms: A systematic review

    PubMed Central

    Wahbeh, Helané; Senders, Angela; Neuendorf, Rachel; Cayton, Julien

    2014-01-01

    Objectives To 1) characterize complementary and alternative medicine (CAM) studies for posttraumatic stress disorder symptoms (PTSD), 2) evaluate the quality of these studies, and 3) systematically grade the scientific evidence for individual CAM modalities for PTSD. Design Systematic Review. Eight data sources were searched. Selection criteria included any study design assessing PTSD outcomes and any CAM intervention. The body of evidence for each modality was assessed with the Natural Standard evidence-based, validated grading rationale™. Results and Conclusions Thirty-three studies (n=1329) were reviewed. Scientific evidence of benefit for PTSD was Strong for repetitive transcranial magnetic stimulation and Good for acupuncture, hypnotherapy, meditation, and visualization. Evidence was Unclear or Conflicting for biofeedback, relaxation, Emotional Freedom and Thought Field therapies, yoga, and natural products. Considerations for clinical applications and future research recommendations are discussed. PMID:24676593

  11. Scaffolding Learning from Molecular Visualizations

    ERIC Educational Resources Information Center

    Chang, Hsin-Yi; Linn, Marcia C.

    2013-01-01

    Powerful online visualizations can make unobservable scientific phenomena visible and improve student understanding. Instead, they often confuse or mislead students. To clarify the impact of molecular visualizations for middle school students we explored three design variations implemented in a Web-based Inquiry Science Environment (WISE) unit on…

  12. 22 CFR 61.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... DEPARTMENT OF STATE PUBLIC DIPLOMACY AND EXCHANGES WORLD-WIDE FREE FLOW OF AUDIO-VISUAL MATERIALS § 61.1... educational, scientific and cultural audio-visual materials between nations by providing favorable import... issuance or authentication of a certificate that the audio-visual material for which favorable treatment is...

  13. 22 CFR 61.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... DEPARTMENT OF STATE PUBLIC DIPLOMACY AND EXCHANGES WORLD-WIDE FREE FLOW OF AUDIO-VISUAL MATERIALS § 61.1... educational, scientific and cultural audio-visual materials between nations by providing favorable import... issuance or authentication of a certificate that the audio-visual material for which favorable treatment is...

  14. 22 CFR 61.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... DEPARTMENT OF STATE PUBLIC DIPLOMACY AND EXCHANGES WORLD-WIDE FREE FLOW OF AUDIO-VISUAL MATERIALS § 61.1... educational, scientific and cultural audio-visual materials between nations by providing favorable import... issuance or authentication of a certificate that the audio-visual material for which favorable treatment is...

  15. Supporting Students' Knowledge Integration with Technology-Enhanced Inquiry Curricula

    ERIC Educational Resources Information Center

    Chiu, Jennifer Lopseen

    2010-01-01

    Dynamic visualizations of scientific phenomena have the potential to transform how students learn and understand science. Dynamic visualizations enable interaction and experimentation with unobservable atomic-level phenomena. A series of studies clarify the conditions under which embedding dynamic visualizations in technology-enhanced inquiry…

  16. Communicating Science Concepts to Individuals with Visual Impairments Using Short Learning Modules

    ERIC Educational Resources Information Center

    Stender, Anthony S.; Newell, Ryan; Villarreal, Eduardo; Swearer, Dayne F.; Bianco, Elisabeth; Ringe, Emilie

    2016-01-01

    Of the 6.7 million individuals in the United States who are visually impaired, 63% are unemployed, and 59% have not attained an education beyond a high school diploma. Providing a basic science education to children and adults with visual disabilities can be challenging because most scientific learning relies on visual demonstrations. Creating…

  17. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Analytical Judgment Using Visualizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    Scientists working in a particular domain often adhere to conventional data analysis and presentation methods, and this leads to familiarity with these methods over time. But does high familiarity always lead to better analytical judgment? This question is especially relevant when visualizations are used in scientific tasks, as there can be discrepancies between visualization best practices and domain conventions. However, there is little empirical evidence of the relationships between scientists' subjective impressions about familiar and unfamiliar visualizations and objective measures of their effect on scientific judgment. To address this gap and to study these factors, we focus on the climate science domain, specifically on visualizations used for comparison of model performance. We present a comprehensive user study with 47 climate scientists where we explored the following factors: i) relationships between scientists' familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

  18. Learning Science Through Visualization

    NASA Technical Reports Server (NTRS)

    Chaudhury, S. Raj

    2005-01-01

    In the context of an introductory physical science course for non-science majors, I have been trying to understand how scientific visualizations of natural phenomena can constructively impact student learning. I have also necessarily been concerned with the instructional and assessment approaches that need to be considered when focusing on learning science through visually rich information sources. The overall project can be broken down into three distinct segments: (i) comparing students' abilities to demonstrate proportional reasoning competency on visual and verbal tasks, (ii) decoding and deconstructing visualizations of an object falling under gravity, and (iii) the role of directed instruction in eliciting alternate, valid scientific visualizations of the structure of the solar system. Evidence of student learning was collected in multiple forms for this project: quantitative analysis of student performance on written, graded assessments (tests and quizzes) and qualitative analysis of videos of student 'think aloud' sessions. The results indicate that there are significant barriers for non-science majors to succeed in mastering the content of science courses, but with informed approaches to instruction and assessment, these barriers can be overcome.

  19. Future Directions in Computer Graphics and Visualization: From CG&A's Editorial Board

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Encarnacao, L. M.; Chuang, Yung-Yu; Stork, Andre

    2015-01-01

    With many new members joining the CG&A editorial board over the past year, and with a renewed commitment to not only document the state of the art in computer graphics research and applications but also to anticipate and, where possible, foster future areas of scientific discourse and industrial practice, we asked editorial and advisory council members where they see their fields of expertise going. The answers compiled here aren't meant to be all-encompassing or deterministic when it comes to the opportunities computer graphics and interactive visualization hold for the future. Instead, we aim to accomplish two things: give a more in-depth introduction of the members of the editorial board to the CG&A readership, and encourage cross-disciplinary discourse toward approaching, complementing, or disputing the visions laid out in this compilation.

  20. Art, science, and immersion: data-driven experiences

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Monroe, Laura; Ford Morie, Jacquelyn; Aguilera, Julieta

    2013-03-01

    This panel and dialog-paper explores the potential at the intersection of art, science, immersion and highly dimensional "big" data to create new forms of engagement, insight and cultural forms. We will address questions such as: "What kinds of research questions can be identified at the intersection of art + science + immersive environments that can't be expressed otherwise?" "How is art+science+immersion distinct from state-of-the-art visualization?" "What does working with immersive environments and visualization offer that other approaches don't or can't?" "Where does immersion fall short?" We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.

  1. Reducing the Analytical Bottleneck for Domain Scientists: Lessons from a Climate Data Visualization Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Poco, Jorge; Bertini, Enrico

    2016-01-01

    The gap between the rate of large-scale data production and the rate of generation of data-driven scientific insights has led to an analytical bottleneck in scientific domains like climate and biology. This is primarily due to the lack of innovative analytical tools that can help scientists efficiently analyze and explore alternative hypotheses about the data and communicate their findings effectively to a broad audience. In this paper, by reflecting on a set of successful collaborative research efforts between a group of climate scientists and visualization researchers, we examine how interactive visualization can help reduce the analytical bottleneck for domain scientists.

  2. Planetary SUrface Portal (PSUP): a tool for easy visualization and analysis of Martian surface

    NASA Astrophysics Data System (ADS)

    Poulet, Francois; Quantin-Nataf, Cathy; Ballans, Hervé; Lozac'h, Loic; Audouard, Joachim; Carter, John; Dassas, Karin; Malapert, Jean-Christophe; Marmo, Chiara; Poulleau, Gilles; Riu, Lucie; Séjourné, Antoine

    2016-10-01

    PSUP consists of two software application platforms for working with raster, vector, DTM, and hyperspectral data acquired by various space instruments analyzing the surface of Mars from orbit. The first platform of PSUP is MarsSI (Martian surface data processing Information System, http://emars.univ-lyon1.fr). It provides data analysis functionalities to select and download ready-to-use products or to process data through specific and validated pipelines. To date, MarsSI handles CTX, HiRISE and CRISM data of the NASA/MRO mission, HRSC and OMEGA data of the ESA/MEx mission and THEMIS data of the NASA/ODY mission (Lozac'h et al., EPSC 2015). The second part of PSUP is also open to the scientific community and can be visited at http://psup.ias.u-psud.fr/. This web-based user interface provides access to many data products for Mars: image footprints and rasters from the MarsSI tool; compositional maps from OMEGA and TES; albedo and thermal inertia from OMEGA and TES; mosaics from THEMIS, Viking, and CTX; and high-level specific products (defined as catalogues) such as hydrated mineral sites derived from CRISM and OMEGA data, central peaks mineralogy,… In addition, OMEGA C channel data cubes corrected for atmospheric and aerosol contributions can be downloaded. The architecture of PSUP data management and visualization is based on SITools2 and MIZAR, two generic tools developed by a joint effort between CNES and scientific laboratories. SITools2 provides a self-manageable data access layer deployed on the PSUP data, while MIZAR is a 3D browser application for discovering and visualizing geospatial data. Further developments, including the addition of high-level products of Mars (regional geological maps, new global compositional maps,…), are foreseen. Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which the French research institutes are involved.

  3. Comparison of User Performance with Interactive and Static 3D Visualization - Pilot Study

    NASA Astrophysics Data System (ADS)

    Herman, L.; Stachoň, Z.

    2016-06-01

    Interactive 3D visualizations of spatial data are currently available and popular through various applications such as Google Earth, ArcScene, etc. Several scientific studies have focused on user performance with 3D visualization, but static perspective views are used as stimuli in most of these studies. The main objective of this paper is to identify potential differences in user performance between static perspective views and interactive visualizations. This research is an exploratory study. An experiment was designed as a between-subject study, and a customized testing tool based on open web technologies was used for the experiment. The testing set consists of an initial questionnaire, a training task and four experimental tasks. Selection of the highest point and determination of visibility from the top of a mountain were used as the experimental tasks. The speed and accuracy of each participant's task performance were recorded. Movement and actions in the virtual environment were also recorded in the interactive variant. The results show that participants dealt with the tasks faster when using static visualization; however, the average error rate was also higher in the static variant. The findings from this pilot study will be used for further testing, especially for formulating hypotheses and designing subsequent experiments.

  4. Modern data science for analytical chemical data - A comprehensive review.

    PubMed

    Szymańska, Ewa

    2018-10-22

    Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Beyond the common goal of big data analysis, different perspectives and terms on big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects such as assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided, and their suitability for big analytical chemical datasets is briefly discussed.

  5. Ocean Drilling Program: Publication Services

    Science.gov Websites

    before each cruise. Preliminary Report: a summary of the shipboard scientific results. A detailed summary of the scientific and engineering results from each leg, including visual core descriptions, is also published.

  6. Comparing the quality of accessing medical literature using content-based visual and textual information retrieval

    NASA Astrophysics Data System (ADS)

    Müller, Henning; Kalpathy-Cramer, Jayashree; Kahn, Charles E., Jr.; Hersh, William

    2009-02-01

    Content-based visual information (or image) retrieval (CBIR) has been an extremely active research domain within medical imaging over the past ten years, with the goal of improving the management of visual medical information. Many technical solutions have been proposed, and application scenarios for image retrieval as well as image classification have been set up. However, in contrast to medical information retrieval using textual methods, visual retrieval has only rarely been applied in clinical practice. This is despite the large amount and variety of visual information produced in hospitals every day. This information overload imposes a significant burden upon clinicians, and CBIR technologies have the potential to help the situation. However, in order for CBIR to become an accepted clinical tool, it must demonstrate a higher level of technical maturity than it has to date. Since 2004, the ImageCLEF benchmark has included a task for the comparison of visual information retrieval algorithms for medical applications. In 2005, a task for medical image classification was introduced and both tasks have been run successfully for the past four years. These benchmarks allow an annual comparison of visual retrieval techniques based on the same data sets and the same query tasks, enabling the meaningful comparison of various retrieval techniques. The datasets used from 2004-2007 contained images and annotations from medical teaching files. In 2008, however, the dataset used was made up of 67,000 images (along with their associated figure captions and the full text of their corresponding articles) from two Radiological Society of North America (RSNA) scientific journals. This article describes the results of the medical image retrieval task of the ImageCLEF 2008 evaluation campaign. We compare the retrieval results of both visual and textual information retrieval systems from 15 research groups on the aforementioned data set. 
The results show clearly that, currently, visual retrieval alone does not achieve the performance necessary for real-world clinical applications. Most of the common visual retrieval techniques have a MAP (Mean Average Precision) of around 2-3%, which is much lower than that achieved using textual retrieval (MAP=29%). Advanced machine learning techniques, together with good training data, have been shown to improve the performance of visual retrieval systems in the past. Multimodal retrieval (basing retrieval on both visual and textual information) can achieve better results than purely visual, but only when carefully applied. In many cases, multimodal retrieval systems performed even worse than purely textual retrieval systems. On the other hand, some multimodal retrieval systems demonstrated significantly increased early precision, which has been shown to be a desirable behavior in real-world systems.
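
    The MAP figures quoted above can be made concrete with a small sketch of how Mean Average Precision is computed over a set of queries (illustrative only; the document IDs, query data, and function names below are invented, not taken from ImageCLEF):

    ```python
    def average_precision(retrieved, relevant):
        """Average precision for one query: mean of the precision values
        at each rank where a relevant document is retrieved."""
        relevant = set(relevant)
        hits, precisions = 0, []
        for rank, doc in enumerate(retrieved, start=1):
            if doc in relevant:
                hits += 1
                precisions.append(hits / rank)
        return sum(precisions) / len(relevant) if relevant else 0.0

    def mean_average_precision(runs):
        """MAP: average of per-query average precision."""
        return sum(average_precision(r, rel) for r, rel in runs) / len(runs)

    # toy example: two queries, each a (ranked results, relevant set) pair
    runs = [
        (["d1", "d2", "d3", "d4"], ["d1", "d4"]),   # AP = (1/1 + 2/4) / 2 = 0.75
        (["d5", "d6", "d7"], ["d7"]),               # AP = 1/3
    ]
    print(round(mean_average_precision(runs), 4))
    ```

    Under this metric, the reported MAP of 29% for textual retrieval versus 2-3% for visual retrieval reflects how rarely relevant images appear near the top of purely visual rankings.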

  7. A 500 megabyte/second disk array

    NASA Technical Reports Server (NTRS)

    Ruwart, Thomas M.; Okeefe, Matthew T.

    1994-01-01

    Applications at the Army High Performance Computing Research Center's (AHPCRC) Graphic and Visualization Laboratory (GVL) at the University of Minnesota require a tremendous amount of I/O bandwidth, and this appetite for data is growing. Silicon Graphics workstations are used to perform the post-processing, visualization, and animation of multi-terabyte datasets produced by scientific simulations performed on AHPCRC supercomputers. The M.A.X. (Maximum Achievable Xfer) was designed to find the maximum achievable I/O performance of the Silicon Graphics CHALLENGE/Onyx-class machines that run these applications. Running a fully configured Onyx machine with twelve 150 MHz R4400 processors, 512 MB of 8-way interleaved memory, and 31 fast/wide SCSI-2 channels, each with a Ciprico disk array controller, we were able to achieve a maximum sustained transfer rate of 509.8 megabytes per second. However, after analyzing the results it became clear that the true maximum transfer rate is somewhat beyond this figure, and we will need to do further testing with more disk array controllers in order to find the true maximum.
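
    As a rough plausibility check on the reported aggregate rate, the per-channel throughput can be estimated by simple division (this assumes roughly even load across the 31 channels; the variable names are ours):

    ```python
    # Estimate per-channel throughput from the reported aggregate rate,
    # assuming the I/O load spreads roughly evenly across the SCSI-2 channels.
    channels = 31
    aggregate_mb_s = 509.8              # reported sustained transfer rate (MB/s)
    per_channel = aggregate_mb_s / channels
    print(f"~{per_channel:.1f} MB/s per fast/wide SCSI-2 channel")
    ```

    At roughly 16.4 MB/s per channel, each fast/wide SCSI-2 bus (nominally 20 MB/s) is running below saturation, consistent with the authors' observation that the true maximum lies somewhat beyond 509.8 MB/s.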

  8. Fitting the Jigsaw of Citation: Information Visualization in Domain Analysis.

    ERIC Educational Resources Information Center

    Chen, Chaomei; Paul, Ray J.; O'Keefe, Bob

    2001-01-01

    Discusses the role of information visualization in modeling and representing intellectual structures associated with scientific disciplines and visualizes the domain of computer graphics based on bibliographic data from author cocitation patterns. Highlights include author cocitation maps, citation time lines, animation of a high-dimensional…

  9. Improving Visual Communication.

    PubMed

    Singh, Gary

    2018-01-01

    A tool that creates real-time interactive color maps for scientific visualization helped enhance the dynamics of a major research project for the Climate, Ocean, and Sea Ice Modeling team at Los Alamos National Laboratory.

  10. GWVis: A Tool for Comparative Ground-Water Data Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Lewis, Robert R.

    2010-11-01

    The Ground-Water Visualization application (GWVis) presents ground-water data visually in order to educate the public on ground-water issues. It is also intended for presentations to government and other funding agencies. Current three-dimensional models of ground water are overly complex, while two-dimensional representations (i.e., on paper) are neither comprehensive nor engaging. At present, GWVis operates on water head elevation data over a given time span, together with a matching (fixed) underlying geography. Two elevation scenarios are compared with each other, typically a control data set (actual field data) and a simulation. Scenario comparison can be animated for the time span provided. We developed GWVis using the Python programming language, associated libraries, and pyOpenGL extension packages to improve performance and control of attributes of the model (such as color, positioning, scale, and interpolation). GWVis bridges the gap between two-dimensional and dynamic three-dimensional research visualizations by providing an intuitive, interactive design that allows participants to view the model from different perspectives and to infer information about scenarios. By incorporating scientific data in an environment that can be easily understood, GWVis allows the information to be presented to a large audience base.
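
    The core comparison GWVis performs, a control data set against a simulation, amounts to differencing two head-elevation grids. A minimal pure-Python sketch (the grids and variable names here are hypothetical, not GWVis code):

    ```python
    # Hypothetical head-elevation grids (same shape): field data vs. simulation.
    control   = [[10.0, 10.5], [11.0, 11.2]]
    simulated = [[10.2, 10.4], [11.3, 11.0]]

    # Per-cell signed difference; in a GWVis-style view this would drive a
    # diverging color map (simulation above/below the observed head).
    diff = [[s - c for s, c in zip(srow, crow)]
            for srow, crow in zip(simulated, control)]

    cells = [d for row in diff for d in row]
    max_dev = max(abs(d) for d in cells)                    # worst mismatch
    rmse = (sum(d * d for d in cells) / len(cells)) ** 0.5  # overall mismatch
    print(f"max abs deviation: {max_dev:.2f}, RMSE: {rmse:.3f}")
    ```

    Repeating the same reduction for one grid per time step yields the animated scenario comparison the abstract describes.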

  11. The use of haptic interfaces and web services in crystallography: an application for a `screen to beam' interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruno, Andrew E.; Soares, Alexei S.; Owen, Robin L.

    Haptic interfaces have become common in consumer electronics. They enable easy interaction and information entry without the use of a mouse or keyboard. Our work illustrates the application of a haptic interface to crystallization screening in order to provide a natural means for visualizing and selecting results. By linking this to a cloud-based database and web-based application program interface, the same application shifts the approach from `point and click' to `touch and share', where results can be selected, annotated and discussed collaboratively. Furthermore, in the crystallographic application, given a suitable crystallization plate, beamline and robotic end effector, the resulting information can be used to close the loop between screening and X-ray analysis, allowing a direct and efficient `screen to beam' approach. The application is not limited to the area of crystallization screening; `touch and share' can be used by any information-rich scientific analysis and geographically distributed collaboration.

  12. The use of haptic interfaces and web services in crystallography: an application for a `screen to beam' interface

    DOE PAGES

    Bruno, Andrew E.; Soares, Alexei S.; Owen, Robin L.; ...

    2016-11-11

    Haptic interfaces have become common in consumer electronics. They enable easy interaction and information entry without the use of a mouse or keyboard. Our work illustrates the application of a haptic interface to crystallization screening in order to provide a natural means for visualizing and selecting results. By linking this to a cloud-based database and web-based application program interface, the same application shifts the approach from `point and click' to `touch and share', where results can be selected, annotated and discussed collaboratively. Furthermore, in the crystallographic application, given a suitable crystallization plate, beamline and robotic end effector, the resulting information can be used to close the loop between screening and X-ray analysis, allowing a direct and efficient `screen to beam' approach. The application is not limited to the area of crystallization screening; `touch and share' can be used by any information-rich scientific analysis and geographically distributed collaboration.

  13. Cognitive Foundations for Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.; Noonan, Christine F.; Franklin, Lyndsey

    In this report, we provide an overview of the scientific and technical literature on information visualization and visual analytics (VA). Topics discussed include an update and overview of the extensive literature search conducted for this study, the nature and purpose of the field, major research thrusts, and scientific foundations. We review methodologies for evaluating and measuring the impact of VA technologies, as well as taxonomies that have been proposed for various purposes to support the VA community. A cognitive science perspective underlies each of these discussions.

  14. The CAVE (TM) automatic virtual environment: Characteristics and applications

    NASA Technical Reports Server (NTRS)

    Kenyon, Robert V.

    1995-01-01

    Virtual reality may best be defined as the wide-field presentation of computer-generated, multi-sensory information that tracks a user in real time. In addition to the more well-known modes of virtual reality -- head-mounted displays and boom-mounted displays -- the Electronic Visualization Laboratory at the University of Illinois at Chicago recently introduced a third mode: a room constructed from large screens on which the graphics are projected onto three walls and the floor. The CAVE is a multi-person, room-sized, high-resolution, 3D video and audio environment. Graphics are rear-projected in stereo onto three walls and the floor, and viewed with stereo glasses. As a viewer wearing a location sensor moves within its display boundaries, the correct perspective and stereo projections of the environment are updated, and the image moves with and surrounds the viewer. The other viewers in the CAVE are like passengers in a bus, along for the ride. 'CAVE,' the name selected for the virtual reality theater, is both a recursive acronym (Cave Automatic Virtual Environment) and a reference to 'The Simile of the Cave' found in Plato's 'Republic,' in which the philosopher explores the ideas of perception, reality, and illusion. Plato used the analogy of a person facing the back of a cave alive with shadows that are his/her only basis for ideas of what real objects are. Rather than having evolved from video games or flight simulation, the CAVE has its motivation rooted in scientific visualization and the SIGGRAPH 92 Showcase effort. The CAVE was designed to be a useful tool for scientific visualization. The Showcase event was an experiment; the Showcase chair and committee advocated an environment for computational scientists to interactively present their research at a major professional conference in a one-to-many format on high-end workstations attached to large projection screens.
The CAVE was developed as a 'virtual reality theater' with scientific content and projection that met the criteria of Showcase.

  15. High-throughput neuroimaging-genetics computational infrastructure

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D.; Franco, Joseph; Toga, Arthur W.

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation of findings and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer, and visualization. 
    Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web services. These pipeline workflows are represented as portable XML objects which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer's and Parkinson's data, we provide several examples of translational applications using this infrastructure. PMID:24795619

  16. The growth and evolution of cardiovascular magnetic resonance: a 20-year history of the Society for Cardiovascular Magnetic Resonance (SCMR) annual scientific sessions.

    PubMed

    Lee, Daniel C; Markl, Michael; Dall'Armellina, Erica; Han, Yuchi; Kozerke, Sebastian; Kuehne, Titus; Nielles-Vallespin, Sonia; Messroghli, Daniel; Patel, Amit; Schaeffter, Tobias; Simonetti, Orlando; Valente, Anne Marie; Weinsaft, Jonathan W; Wright, Graham; Zimmerman, Stefan; Schulz-Menger, Jeanette

    2018-01-31

    The purpose of this work is to summarize cardiovascular magnetic resonance (CMR) research trends and highlights presented at the annual Society for Cardiovascular Magnetic Resonance (SCMR) scientific sessions over the past 20 years. Scientific programs from all SCMR Annual Scientific Sessions from 1998 to 2017 were obtained. SCMR Headquarters also provided data for the number and country of origin of attendees and the number of accepted abstracts according to type. Data analysis included text analysis (key word extraction) and visualization by 'word clouds' representing the most frequently used words in session titles for 5-year intervals. In addition, session titles were sorted into 17 major subject categories to further evaluate research and clinical CMR trends over time. Analysis of SCMR annual scientific session locations, attendance, and number of accepted abstracts demonstrated substantial growth of CMR research and clinical applications. As an international field of study, significant growth of CMR was documented by a strong increase in SCMR scientific session attendance (> 500%, 270 to 1406 from 1998 to 2017), number of accepted abstracts (> 700%, 98 to 701 from 1998 to 2018), and number of international participants (42-415% increase for participants from Asia, Central and South America, the Middle East and Africa in 2004-2017). 'Word cloud' based evaluation of research trends illustrated a shift from an early focus on 'MRI technique feasibility' to established techniques (e.g. late gadolinium enhancement) and their clinical applications and translation (key words 'patient', 'disease'), and more recently to novel techniques and quantitative CMR imaging (key words 'mapping', 'T1', 'flow', 'function'). Nearly every topic category demonstrated an increase in the number of sessions over the 20-year period, with 'Clinical Practice' leading all categories.
Our analysis identified three growth areas 'Congenital', 'Clinical Practice', and 'Structure/function/flow'. The analysis of the SCMR historical archives demonstrates a healthy and internationally active field of study which continues to undergo substantial growth and expansion into new and emerging CMR topics and clinical application areas.
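
    The 'word cloud' evaluation described above reduces to counting keyword frequencies in session titles. A minimal sketch (the titles and stop-word list are invented for illustration, not SCMR program data):

    ```python
    import re
    from collections import Counter

    # Illustrative stand-ins for session titles from a scientific program.
    titles = [
        "T1 mapping in clinical practice",
        "4D flow quantification",
        "Myocardial T1 mapping: technique and application",
    ]
    STOPWORDS = {"in", "and", "the", "of", "a"}

    # Tokenize, lowercase, drop stop words, and count term frequencies.
    words = Counter(
        w for title in titles
        for w in re.findall(r"[a-z0-9]+", title.lower())
        if w not in STOPWORDS
    )
    print(words.most_common(3))
    ```

    Rendering the counts as a cloud then just maps frequency to font size; tracking counts per 5-year interval yields the topic shift the authors report.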

  17. Phytoremediation in education: textile dye teaching experiments.

    PubMed

    Ibbini, Jwan H; Davis, Lawrence C; Erickson, Larry E

    2009-07-01

    Phytoremediation, the use of plants to clean up contaminated soil and water, has a wide range of applications and advantages, and can be extended to scientific education. Phytoremediation of textile dyes can be used as a scientific experiment or demonstration in teaching laboratories for middle school, high school and college students. In the experiments that we developed, students were involved in a hands-on activity where they were able to learn about phytoremediation concepts. Experiments were set up with 20-40 mg L⁻¹ dye solutions of different colors. Students can be involved in the set-up process and may be involved in the experimental design. In its simplest form, they use two-week-old sunflower seedlings and place them into a test tube containing a known volume of dye solution. Color change and/or dye disappearance can be monitored by visual comparison or with a spectrophotometer. The intensity and extent of the lab work depend on the students' educational level and time constraints. Among the many dyes tested, Evans Blue proved to be the most readily decolorized azo dye. Results could be observed within 1-2 hours. From our experience, dye phytoremediation experiments are suitable for and easy to understand by both college and middle school students. These experiments help visual learners, as students compare the color of the dye solution before and after the plant application. In general, simple phytoremediation experiments of this kind can be introduced in many classes, including biology, biochemistry and ecological engineering. This paper presents success stories of teaching phytoremediation to middle school and college students.
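
    When a spectrophotometer is used, the color change can be quantified as percent decolorization from absorbance readings, on the assumption that absorbance is proportional to dye concentration (Beer-Lambert). A small sketch with made-up readings:

    ```python
    def percent_decolorization(a_initial, a_final):
        """Percent dye removal, assuming absorbance tracks concentration."""
        return 100.0 * (a_initial - a_final) / a_initial

    # Hypothetical absorbance readings for a dye solution before and after
    # exposure to sunflower seedlings (values are illustrative only).
    print(round(percent_decolorization(0.80, 0.12), 1))
    ```

    Students can take readings at intervals to plot decolorization over the 1-2 hour window within which the authors report visible results.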

  18. Advances in Multi-Sensor Scanning and Visualization of Complex Plants: the Utmost Case of a Reactor Building

    NASA Astrophysics Data System (ADS)

    Hullo, J.-F.; Thibault, G.; Boucheny, C.

    2015-02-01

    In a context of increased maintenance operations and workers' generational renewal, a nuclear owner and operator like Electricité de France (EDF) is interested in scaling up tools and methods of "as-built virtual reality" for larger buildings and wider audiences. However, the acquisition and sharing of as-built data on a large scale (large and complex multi-floored buildings) challenge current scientific and technical capacities. In this paper, we first present a state of the art of scanning tools and methods for industrial plants with very complex architecture. We then introduce the inner characteristics of the multi-sensor scanning and visualization of the interior of the most complex building of a power plant: a nuclear reactor building. We introduce several developments that made possible a first complete survey of such a large building, from acquisition to processing and fusion of multiple data sources (3D laser scans, total-station survey, RGB panoramics, 2D floor plans, 3D CAD as-built models). In addition, we present the concepts of a smart application developed for the painless exploration of the whole dataset. The goal of this application is to help professionals unfamiliar with the manipulation of such datasets to take into account the spatial constraints induced by the building's complexity while preparing maintenance operations. Finally, we discuss the main feedback from this large experiment, the remaining issues for the generalization of such large-scale surveys, and the future technical and scientific challenges in the field of industrial "virtual reality".

  19. TIES that BIND: an introduction to domain mapping as a visualization tool for virtual rehabilitation.

    PubMed

    Weiss, Patrice L Tamar; Kedar, Rochelle; Shahar, Meir

    2006-04-01

    The application of virtual reality (VR) to rehabilitation is a young, interdisciplinary field where clinical implementation very rapidly follows scientific discovery and technological advancement. Implementation is often so rapid that demonstration of intervention efficacy by investigators, and establishment of research and development priorities by funding bodies tend to be more reactive than proactive. An examination of the dynamic unfolding of the history of our young discipline may help us recognize the facilitators of current practice and identify the barriers that limit greater progress. This paper presents a first step towards the examination of the past and future growth of VR-based rehabilitation by presenting the use of concept maps to explore the publication history of application of VR to rehabilitation.

  20. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
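
    The single-scale retinex at the heart of MSRCR compares each pixel to a smoothed "surround" of its neighborhood. A minimal 1-D sketch of that core idea follows; the box blur (standing in for Land's Gaussian surround), the toy intensities, and the function names are illustrative, not the published algorithm, which additionally sums several surround scales and applies a color-restoration factor.

```python
import math

def box_blur(signal, radius=2):
    """Crude 'surround' estimate: a sliding-window mean, standing in
    for the Gaussian surround used in the real retinex."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def single_scale_retinex(signal, radius=2):
    """R(x) = log I(x) - log (I * surround)(x): pixels darker than their
    neighborhood come out negative, brighter ones positive, compressing
    dynamic range. Intensities must be positive."""
    surround = box_blur(signal, radius)
    return [math.log(i) - math.log(s) for i, s in zip(signal, surround)]

# A toy row of intensities with a dim region and a bright region.
row = [10, 12, 11, 200, 210, 205, 9, 11, 10]
print(single_scale_retinex(row))
```

    The full MSRCR would compute a weighted sum of such outputs at small, medium, and large surround scales, then apply a per-channel color-restoration term to preserve hue.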

  1. Driving with Bioptic Telescopes: Organizing a Research Agenda

    PubMed Central

    Owsley, Cynthia

    2012-01-01

    Being a licensed driver in the U.S. and many other countries facilitates health and well-being. Based on the vision standards in most states, individuals with worse than 20/40 visual acuity who desire licensure are denied through the usual licensure application process. However, over 40 states have bioptic telescope licensing programs where applicants can gain licensure contingent on meeting specific requirements. In spite of the existence of the bioptic telescope and these licensing programs since the 1970s, there has been little rigorous scientific study of this topic. Here I offer an organizing perspective for a research agenda on driving with bioptic telescopes, with the long-term practical goal being to provide an evidence basis for licensure policies and training programs. PMID:22863791

  2. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Papka, M.; Messina, P.; Coffey, R.

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains.
    The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to implement those algorithms. The Data Analytics and Visualization Team lends expertise in tools and methods for high-performance post-processing of large datasets, interactive data exploration, batch visualization, and production visualization. The Operations Team ensures that system hardware and software work reliably and optimally; system tools are matched to the unique system architectures and scale of ALCF resources; the entire system software stack works smoothly together; and I/O performance issues, bug fixes, and requests for system software are addressed. The User Services and Outreach Team offers frontline services and support to existing and potential ALCF users. The team also provides marketing and outreach to users, DOE, and the broader community.

  3. Can Dynamic Visualizations Improve Middle School Students' Understanding of Energy in Photosynthesis?

    ERIC Educational Resources Information Center

    Ryoo, Kihyun; Linn, Marcia C.

    2012-01-01

    Dynamic visualizations have the potential to make abstract scientific phenomena more accessible and visible to students, but they can also be confusing and difficult to comprehend. This research investigates how dynamic visualizations, compared to static illustrations, can support middle school students in developing an integrated understanding of…

  4. FUn: a framework for interactive visualizations of large, high-dimensional datasets on the web.

    PubMed

    Probst, Daniel; Reymond, Jean-Louis

    2018-04-15

    During the past decade, big data have become a major tool in scientific endeavors. Although statistical methods and algorithms are well-suited for analyzing and summarizing enormous amounts of data, the results do not allow for a visual inspection of the entire dataset. Current scientific software, including R packages and Python libraries such as ggplot2, matplotlib and plot.ly, does not support interactive visualizations of datasets exceeding 100 000 data points on the web. Other solutions enable the web-based visualization of big data only through data reduction or statistical representations. However, recent hardware developments, especially advancements in graphical processing units, allow for the rendering of millions of data points on a wide range of consumer hardware such as laptops, tablets and mobile phones. Similar to the challenges and opportunities brought to virtually every scientific field by big data, both the visualization of and interaction with copious amounts of data are demanding and hold great promise. Here we present FUn, a framework consisting of a client (Faerun) and server (Underdark) module, facilitating the creation of web-based, interactive 3D visualizations of large datasets, enabling record-level visual inspection. We also introduce a reference implementation providing access to SureChEMBL, a database containing patent information on more than 17 million chemical compounds. The source code and the most recent builds of Faerun and Underdark, Lore.js and the data preprocessing toolchain used in the reference implementation, are available on the project website (http://doc.gdb.tools/fun/). daniel.probst@dcb.unibe.ch or jean-louis.reymond@dcb.unibe.ch.

  5. Analysis, Mining and Visualization Service at NCSA

    NASA Astrophysics Data System (ADS)

    Wilhelmson, R.; Cox, D.; Welge, M.

    2004-12-01

    NCSA's goal is to create a balanced system that fully supports high-end computing as well as: 1) high-end data management and analysis; 2) visualization of massive, highly complex data collections; 3) large databases; 4) geographically distributed Grid computing; and 5) collaboratories, all based on a secure computational environment and driven with workflow-based services. To this end NCSA has defined a new technology path that includes the integration and provision of cyberservices in support of data analysis, mining, and visualization. NCSA has begun to develop and apply a data mining system, NCSA Data-to-Knowledge (D2K), in conjunction with both the application and research communities. NCSA D2K will enable the formation of model-based application workflows and visual programming interfaces for rapid data analysis. The Java-based D2K framework, which integrates analytical data mining methods with data management, data transformation, and information visualization tools, will be configurable from the cyberservices (web and grid services, tools, etc.) viewpoint to solve a wide range of important data mining problems. This effort will use modules, such as new classification methods for the detection of high-risk geoscience events, and existing D2K data management, machine learning, and information visualization modules. A D2K cyberservices interface will be developed to seamlessly connect client applications with remote back-end D2K servers, providing computational resources for data mining and integration with local or remote data stores. This work is being coordinated with SDSC's data and services efforts. The new NCSA Visualization embedded workflow environment (NVIEW) will be integrated with D2K functionality to tightly couple informatics and scientific visualization with the data analysis and management services.
    Visualization services will access and filter disparate data sources, simplifying tasks such as fusing related data from distinct sources into a coherent visual representation. This approach enables collaboration among geographically dispersed researchers via portals and front-end clients, and the coupling with data management services enables recording associations among datasets and building annotation systems into visualization tools and portals, giving scientists a persistent, shareable, virtual lab notebook. To facilitate provision of these cyberservices to the national community, NCSA will be providing a computational environment for large-scale data assimilation, analysis, mining, and visualization. This will be initially implemented on the new 512-processor shared-memory SGI systems recently purchased by NCSA. In addition to standard batch capabilities, NCSA will provide on-demand capabilities for those projects requiring rapid response (e.g., development of severe weather, earthquake events) for decision makers. It will also be used for non-sequential interactive analysis of data sets where it is important to have access to large data volumes over space and time.

  6. Fostering Outreach, Education and Exploration of the Moon Using the Lunar Mapping & Modeling Portal

    NASA Astrophysics Data System (ADS)

    Dodge, K.; Law, E.; Malhotra, S.; Chang, G.; Kim, R. M.; Bui, B.; Sadaqathullah, S.; Day, B. H.

    2014-12-01

    The Lunar Mapping and Modeling Portal (LMMP) [1] is a web-based portal and a suite of interactive visualization and analysis tools for users to access mapped lunar data products (including image mosaics, digital elevation models, etc.) from past and current lunar missions (e.g., Lunar Reconnaissance Orbiter, Apollo, etc.). Originally designed as a mission planning tool for the Constellation Program, LMMP has grown into a generalized suite of tools facilitating a wide range of activities in support of lunar exploration, including public outreach, education, lunar mission planning and scientific research. LMMP fosters outreach, education, and exploration of the Moon by educators, students, amateur astronomers, and the general public. These efforts are enhanced by Moon Tours, LMMP's mobile application, which makes LMMP's information accessible to people of all ages, putting opportunities for real lunar exploration in the palms of their hands. Our talk will include an overview of LMMP and a demonstration of its technologies (web portals, mobile apps), to show how it serves NASA data as commodities for use by advanced visualization facilities (e.g., planetariums) and how it contributes to improving teaching and learning, increasing scientific literacy of the general public, and enriching STEM efforts. References: [1] http://www.lmmp.nasa.gov

  7. The VIMS Data Explorer: A tool for locating and visualizing hyperspectral data

    NASA Astrophysics Data System (ADS)

    Pasek, V. D.; Lytle, D. M.; Brown, R. H.

    2016-12-01

    Since the Cassini spacecraft successfully entered Saturn's orbit in the summer of 2004, the visible and infrared mapping spectrometer (VIMS) instrument onboard has returned over 300,000 hyperspectral data cubes. The VIMS Science Investigation is a multidisciplinary effort that uses these hyperspectral data to study a variety of scientific problems, including surface characterizations of the icy satellites and atmospheric analyses of Titan and Saturn. Such investigations may need to identify thousands of exemplary data cubes for analysis and can span many years in scope. Here we describe the VIMS data explorer (VDE) application, currently employed by the VIMS Investigation to search for and visualize data. The VDE application facilitates real-time inspection of the entire VIMS hyperspectral dataset, the construction of in situ maps, and markers to save and recall work. The application relies on two databases to provide comprehensive search capabilities. The first database contains metadata for every cube. These metadata searches are used to identify records based on parameters such as target, observation name, or date taken, but they fall short for some investigations because the cube metadata contains no target geometry information. Through the introduction of a post-calibration pixel database, the VDE tool enables users to greatly expand their searching capabilities. Users can select favorable cubes for further processing into 2-D and 3-D interactive maps, aiding in the data interpretation and selection process. The VDE application enables efficient search, visualization, and access to VIMS hyperspectral data. It is simple to use, requiring nothing more than a browser for access. Hyperspectral bands can be individually selected or combined to create real-time color images, a technique commonly employed by hyperspectral researchers to highlight compositional differences.

  8. Workshop on Molecular Animation

    PubMed Central

    Bromberg, Sarina; Chiu, Wah; Ferrin, Thomas E.

    2011-01-01

    Summary February 25–26, 2010, in San Francisco, the Resource for Biocomputing, Visualization and Informatics (RBVI) and the National Center for Macromolecular Imaging (NCMI) hosted a molecular animation workshop for 21 structural biologists, molecular animators, and creators of molecular visualization software. Molecular animation aims to visualize scientific understanding of biomolecular processes and structures. The primary goal of the workshop was to identify the necessary tools for: producing high quality molecular animations, understanding complex molecular and cellular structures, creating publication supplementary materials and conference presentations, and teaching science to students and the public. Another use of molecular animation emerged in the workshop: helping to focus scientific inquiry about the motions of molecules and enhancing informal communication within and between laboratories. PMID:20947014

  9. Visualizing time: how linguistic metaphors are incorporated into displaying instruments in the process of interpreting time-varying signals

    NASA Astrophysics Data System (ADS)

    Garcia-Belmonte, Germà

    2017-06-01

    Spatial visualization is a well-established topic of education research that has helped improve science and engineering students' skills in spatial relations. Connections have been established between visualization as a comprehension tool and instruction in several scientific fields. Learning about dynamic processes mainly relies upon static spatial representations or images. Visualization of time is inherently problematic because time can be conceptualized in terms of two opposite conceptual metaphors based on spatial relations, as inferred from conventional linguistic patterns. The situation is particularly demanding when time-varying signals are recorded using displaying electronic instruments and the resulting image must be properly interpreted. This work deals with the interplay between linguistic metaphors, visual thinking and scientific instrument mediation in the process of interpreting time-varying signals displayed by electronic instruments. The analysis draws on a simplified version of a communication system as an example of practical signal recording and image visualization in a physics and engineering laboratory experience. Instrumentation delivers meaningful signal representations because it is designed to incorporate a specific and culturally favored view of time. It is suggested that difficulties in interpreting time-varying signals are linked with the existing dual perception of conflicting time metaphors. The activation of a specific space-time conceptual mapping might allow for a proper signal interpretation. Instruments then play a central role as visualization mediators by yielding an image that matches specific perception abilities and practical purposes. Here I identify two ways of understanding time, as encountered along the different trajectories through which students pass. Interestingly, specific displaying instruments belonging to different cultural traditions incorporate contrasting time views.
One of them sees time in terms of a dynamic metaphor consisting of a static observer looking at passing events. This is a general and widespread practice common in contemporary mass culture, which lies behind the process of making sense of moving images usually visualized by means of movie shots. In contrast, scientific culture favored another way of conceptualizing time (the static time metaphor) that historically fostered the construction of graphs and the incorporation of time-dependent functions, as represented on the Cartesian plane, into displaying instruments. Both types of culture, scientific and mass, are considered highly technological in the sense that complex instruments, apparatus or machines participate in their visual practices.

  10. Prioritizing Scientific Data for Transmission

    NASA Technical Reports Server (NTRS)

    Castano, Rebecca; Anderson, Robert; Estlin, Tara; DeCoste, Dennis; Gaines, Daniel; Mazzoni, Dominic; Fisher, Forest; Judd, Michele

    2004-01-01

    A software system has been developed for prioritizing newly acquired geological data onboard a planetary rover. The system has been designed to enable efficient use of limited communication resources by transmitting the data likely to have the most scientific value. This software operates onboard a rover by analyzing collected data, identifying potential scientific targets, and then using that information to prioritize data for transmission to Earth. Currently, the system is focused on the analysis of acquired images, although the general techniques are applicable to a wide range of data modalities. Image prioritization is performed using two main steps. In the first step, the software detects features of interest from each image. In its current application, the system is focused on visual properties of rocks. Thus, rocks are located in each image and rock properties, such as shape, texture, and albedo, are extracted from the identified rocks. In the second step, the features extracted from a group of images are used to prioritize the images using three different methods: (1) identification of key target signature (finding specific rock features the scientist has identified as important), (2) novelty detection (finding rocks we haven't seen before), and (3) representative rock sampling (finding the most average sample of each rock type). These methods use techniques such as K-means unsupervised clustering and a discrimination-based kernel classifier to rank images based on their interest level.
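
    The third method, representative rock sampling via K-means, can be sketched in a few lines: cluster the per-rock feature vectors, then report the sample nearest each centroid as the "most average" rock of its type. The feature values and function names below are hypothetical illustrations, not the flight software.

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=50, seed=0):
    """Plain unsupervised k-means; returns (centroids, cluster index per point)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            assign[i] = min(range(k), key=lambda c: dist2(p, centroids[c]))
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centroids[c] = tuple(sum(xs) / len(members) for xs in zip(*members))
    return centroids, assign

def representatives(points, k):
    """Index of the sample nearest each centroid: the 'most average' of each type."""
    centroids, assign = kmeans(points, k)
    reps = []
    for c in range(k):
        members = [i for i in range(len(points)) if assign[i] == c]
        reps.append(min(members, key=lambda i: dist2(points[i], centroids[c])))
    return reps

# Hypothetical (shape, texture, albedo) features for six detected rocks:
# two visually distinct rock types, three samples each.
rocks = [(0.10, 0.20, 0.10), (0.15, 0.22, 0.12), (0.90, 0.80, 0.85),
         (0.88, 0.82, 0.90), (0.12, 0.18, 0.11), (0.92, 0.79, 0.88)]
print(representatives(rocks, k=2))  # one representative index per rock type
```

    In the onboard system the same idea would run over the extracted rock features, with key-target-signature matching and novelty detection layered alongside it to produce the final transmission ranking.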

  11. Smartphone schlieren and shadowgraph imaging

    NASA Astrophysics Data System (ADS)

    Settles, Gary S.

    2018-05-01

    Schlieren and shadowgraph techniques are used throughout the realm of scientific experimentation to reveal transparent refractive phenomena, but the requirement of large precise optics has kept them mostly out of reach of the public. New developments, including the ubiquity of smartphones with high-resolution digital cameras and the Background-Oriented Schlieren technique (BOS), which replaces the precise optics with digital image processing, have changed these circumstances. This paper demonstrates a number of different schlieren and shadowgraph setups and image examples based only on a smartphone, its software applications, and some inexpensive accessories. After beginning with a simple traditional schlieren system the emphasis is placed on what can be visualized and measured using BOS and digital slit-scan imaging on the smartphone. Thermal plumes, liquid mixing and glass are used as subjects of investigation. Not only recreational and experimental photography, but also serious scientific imaging can be done.
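
    The core of BOS is to compare a reference photograph of a textured background with one taken through the refracting medium: the apparent pixel displacements of the background encode the refractive-index gradient along each line of sight. A toy 1-D sketch of the displacement estimate is below; the data, window size, and function name are illustrative, and real BOS software does this per-window in 2-D with subpixel correlation.

```python
def best_shift(reference, distorted, max_shift=3):
    """Integer shift s minimizing the mean squared error between
    distorted[i] and reference[i + s] over their overlap."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(reference[i + s], distorted[i])
                 for i in range(len(distorted)) if 0 <= i + s < len(reference)]
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best, best_err = s, err
    return best

# Toy 1-D background texture, and the same texture apparently displaced by
# two pixels, as a refractive disturbance between camera and background would do.
background = [0, 0, 5, 9, 5, 0, 0, 3, 8, 3, 0, 0]
displaced = background[2:] + [0, 0]
print(best_shift(background, displaced))  # estimated displacement: 2
```

    Mapping such displacement estimates over the whole frame yields the schlieren-like image of the refractive field, which is why BOS needs only a camera and a textured background instead of large precision mirrors.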

  12. Complementary and Alternative Medicine for Posttraumatic Stress Disorder Symptoms: A Systematic Review.

    PubMed

    Wahbeh, Helané; Senders, Angela; Neuendorf, Rachel; Cayton, Julien

    2014-07-01

    To (1) characterize complementary and alternative medicine (CAM) studies for posttraumatic stress disorder symptoms, (2) evaluate the quality of these studies, and (3) systematically grade the scientific evidence for individual CAM modalities for posttraumatic stress disorder. Systematic review. Eight data sources were searched. Selection criteria included any study design assessing posttraumatic stress disorder outcomes and any CAM intervention. The body of evidence for each modality was assessed with the Natural Standard evidence-based, validated grading rationale. Thirty-three studies (n = 1329) were reviewed. Scientific evidence of benefit for posttraumatic stress disorder was strong for repetitive transcranial magnetic stimulation and good for acupuncture, hypnotherapy, meditation, and visualization. Evidence was unclear or conflicting for biofeedback, relaxation, Emotional Freedom and Thought Field therapies, yoga, and natural products. Considerations for clinical applications and future research recommendations are discussed. © The Author(s) 2014.

  13. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-02-01

    The System for Automated Geoscientific Analyses (SAGA) is an open-source Geographic Information System (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object-oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modularly organized software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, an easily approachable graphical user interface with many visualization options, a command line interpreter, and interfaces to scripting languages like R and Python. The current version 2.1.4 offers more than 700 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Further, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies with special emphasis on the core application areas: digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  14. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-07-01

    The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  15. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout the entire clinical workflow, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets that must be processed in practical clinical applications. With the rapidly enhancing performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  16. ePMV embeds molecular modeling into professional animation software environments.

    PubMed

    Johnson, Graham T; Autin, Ludovic; Goodsell, David S; Sanner, Michel F; Olson, Arthur J

    2011-03-09

    Increasingly complex research has made it more difficult to prepare data for publication, education, and outreach. Many scientists must also wade through black-box code to interface computational algorithms from diverse sources to supplement their bench work. To reduce these barriers we have developed an open-source plug-in, embedded Python Molecular Viewer (ePMV), that runs molecular modeling software directly inside of professional 3D animation applications (hosts) to provide simultaneous access to the capabilities of these newly connected systems. Uniting host and scientific algorithms into a single interface allows users from varied backgrounds to assemble professional quality visuals and to perform computational experiments with relative ease. By enabling easy exchange of algorithms, ePMV can facilitate interdisciplinary research, smooth communication between broadly diverse specialties, and provide a common platform to frame and visualize the increasingly detailed intersection(s) of cellular and molecular biology. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. ePMV Embeds Molecular Modeling into Professional Animation Software Environments

    PubMed Central

    Johnson, Graham T.; Autin, Ludovic; Goodsell, David S.; Sanner, Michel F.; Olson, Arthur J.

    2011-01-01

    SUMMARY Increasingly complex research has made it more difficult to prepare data for publication, education, and outreach. Many scientists must also wade through black-box code to interface computational algorithms from diverse sources to supplement their bench work. To reduce these barriers, we have developed an open-source plug-in, embedded Python Molecular Viewer (ePMV), that runs molecular modeling software directly inside of professional 3D animation applications (hosts) to provide simultaneous access to the capabilities of these newly connected systems. Uniting host and scientific algorithms into a single interface allows users from varied backgrounds to assemble professional quality visuals and to perform computational experiments with relative ease. By enabling easy exchange of algorithms, ePMV can facilitate interdisciplinary research, smooth communication between broadly diverse specialties and provide a common platform to frame and visualize the increasingly detailed intersection(s) of cellular and molecular biology. PMID:21397181

  18. WebGL-enabled 3D visualization of a Solar Flare Simulation

    NASA Astrophysics Data System (ADS)

    Chen, A.; Cheung, C. M. M.; Chintzoglou, G.

    2016-12-01

    The visualization of magnetohydrodynamic (MHD) simulations of astrophysical systems such as solar flares often requires specialized software packages (e.g. Paraview and VAPOR). A shortcoming of using such software packages is the inability to share our findings with the public and scientific community in an interactive and engaging manner. By using the JavaScript-based WebGL application programming interface (API) and the three.js JavaScript package, we create an in-browser experience for rendering solar flare simulations that is interactive and accessible to the general public. The WebGL renderer displays objects such as vector flow fields, streamlines and textured isosurfaces. This allows the user to explore the spatial relation between the solar coronal magnetic field and the thermodynamic structure of the plasma in which the magnetic field is embedded. Plans for extending the features of the renderer will also be presented.

  19. SOCR "Motion Charts": An Efficient, Open-Source, Interactive and Dynamic Applet for Visualizing Longitudinal Multivariate Data

    ERIC Educational Resources Information Center

    Al-Aziz, Jameel; Christou, Nicolas; Dinov, Ivo D.

    2010-01-01

    The amount, complexity and provenance of data have dramatically increased in the past five years. Visualization of observed and simulated data is a critical component of any social, environmental, biomedical or scientific quest. Dynamic, exploratory and interactive visualization of multivariate data, without preprocessing by dimensionality…

  20. Visualization as an Aid to Problem-Solving: Examples from History.

    ERIC Educational Resources Information Center

    Rieber, Lloyd P.

    This paper presents a historical overview of visualization as a human problem-solving tool. Visualization strategies, such as mental imagery, pervade historical accounts of scientific discovery and invention. A selected number of historical examples are presented and discussed on a wide range of topics such as physics, aviation, and the science of…

  1. 76 FR 51914 - Duty-Free Treatment of Certain Visual and Auditory Materials

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ...(b) that the proposed amendments, if adopted, will not have a significant economic impact on a... CFR Parts 10 and 163 [USCBP-2011-0030] RIN 1515-AD75 Duty-Free Treatment of Certain Visual and... required for duty-free treatment of certain visual and auditory materials of an educational, scientific, or...

  2. Accommodating Scientific Illiteracy: Award-Winning Visualizations on the Covers of "Science"

    ERIC Educational Resources Information Center

    Gigante, Maria E.

    2012-01-01

    The International Science and Engineering Visualization Challenge, recently established by the National Science Foundation (NSF), is an alleged attempt at public outreach. The NSF encourages scientists to submit visualizations that would appeal to non-expert audiences by displaying their work in an annual "special feature" in "Science" magazine,…

  3. Art-Science-Technology collaboration through immersive, interactive 3D visualization

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2014-12-01

    At the W. M. Keck Center for Active Visualization in Earth Sciences (KeckCAVES), a group of geoscientists and computer scientists collaborate to develop and use interactive, immersive 3D visualization technology to view, manipulate, and interpret data for scientific research. The visual impact of immersion in a CAVE environment can be extremely compelling, and from the outset KeckCAVES scientists have collaborated with artists to bring this technology to creative works, including theater and dance performance, installations, and gamification. The first full-fledged collaboration designed and produced a performance called "Collapse: Suddenly falling down", choreographed by Della Davidson, which investigated the human and cultural response to natural and man-made disasters. Scientific data (lidar scans of disaster sites, such as landslides and mine collapses) were fully integrated into the performance by the Sideshow Physical Theatre. This presentation discusses both the technological and creative characteristics of, and lessons learned from, the collaboration. Many parallels between the artistic and scientific processes emerged. We observed that both artists and scientists set out to investigate a topic, solve a problem, or answer a question, and that refining that question or problem is an essential part of both the creative and the scientific workflow. Both artists and scientists seek understanding (in this case, understanding of natural disasters). Differences also emerged: the group noted that the scientists sought clarity (including but not limited to quantitative measurements) as a means to understanding, while the artists embraced ambiguity, also as a means to understanding. Subsequent art-science-technology collaborations have responded to evolving visualization technology and include gamification as a means to explore data and the use of augmented reality for informal learning in museum settings.

  4. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  5. Image Analysis via Fuzzy-Reasoning Approach: Prototype Applications at NASA

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steven J.

    2004-01-01

    A set of imaging techniques based on a Fuzzy Reasoning (FR) approach was built for NASA at Kennedy Space Center (KSC) to perform complex real-time, vision-related safety prototype tasks, such as detection and tracking of moving Foreign Object Debris (FOD) during NASA Space Shuttle liftoff and visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad. The system has also shown promise in enhancing X-ray images used to screen hard-covered items, yielding better visualization. The system's capabilities were also used during the imaging analysis of the Space Shuttle Columbia accident. These FR-based imaging techniques include novel proprietary adaptive image segmentation, image edge extraction, and image enhancement. A Probabilistic Neural Network (PNN) scheme, available from the NeuroShell(TM) Classifier and optimized via a Genetic Algorithm (GA), was also used along with this set of novel imaging techniques to add powerful learning and image classification capabilities. Prototype applications built using these techniques have received NASA Space Awards, including a Board Action Award, and are currently being filed for patents by NASA; they are being offered for commercialization through the Research Triangle Institute (RTI), an internationally recognized corporation in scientific research and technology development. Companies from different fields, including security, medical, text digitization, and aerospace, are currently in the process of licensing these technologies from NASA.

  6. Data visualization in interactive maps and time series

    NASA Astrophysics Data System (ADS)

    Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe

    2014-05-01

    State-of-the-art data visualization bears little resemblance to the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and to implement accessible, interactive, and flexible web applications. Here we present a web site, opened in November 2013, for creating custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) via the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers JavaScript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data-Driven Documents JavaScript library (D3.js). This time series application provides dynamic functionality such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations, arising from both human activities and natural processes; the work is led by the Global Carbon Project.
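    The map side of this stack speaks the standard OGC WMS protocol to the THREDDS/ncWMS server. As a sketch of what such a request looks like, here is a GetMap URL builder in plain JavaScript (the server address and layer name are hypothetical placeholders, not the Global Carbon Atlas endpoints):

```javascript
// Build an OGC WMS 1.3.0 GetMap request URL, the kind of request an
// OpenLayers tile layer issues against a THREDDS/ncWMS server.
// The base URL and layer name below are illustrative placeholders.
function wmsGetMapUrl(base, layer, bbox, width, height) {
  const params = new URLSearchParams({
    SERVICE: 'WMS',
    VERSION: '1.3.0',
    REQUEST: 'GetMap',
    LAYERS: layer,
    CRS: 'CRS:84',
    BBOX: bbox.join(','), // [minLon, minLat, maxLon, maxLat] for CRS:84
    WIDTH: String(width),
    HEIGHT: String(height),
    FORMAT: 'image/png',
  });
  return `${base}?${params.toString()}`;
}

const url = wmsGetMapUrl(
  'https://example.org/thredds/wms/carbon', // hypothetical TDS endpoint
  'co2_flux',                               // hypothetical layer name
  [-180, -90, 180, 90], 1024, 512
);
```

    A mapping library normally assembles these URLs itself; writing one by hand is mainly useful for debugging what the browser actually sends to the server.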

  7. The Heliosphere in Space

    NASA Astrophysics Data System (ADS)

    Frisch, P. C.; Hanson, A. J.; Fu, P. C.

    2008-12-01

    A scientifically accurate visualization of the Journey of the Sun through deep space has been created in order to share the excitement of heliospheric physics and scientific discovery with the non-expert. The MHD heliosphere model of Linde (1998) displays the interaction of the solar wind with the interstellar medium for a supersonic heliosphere traveling through a low-density magnetized interstellar medium. The camera viewpoint follows the solar motion through a virtual space of the Milky Way Galaxy. This space is constructed from real data placed in the three-dimensional solar neighborhood and populated with Hipparcos stars in front of a precisely aligned image of the Milky Way itself. The celestial audio track of this three-minute movie includes the music of the heliosphere, heard by the two Voyager spacecraft as 3 kHz emissions from the edge of the heliosphere. This short heliosphere visualization can be downloaded from http://www.cs.indiana.edu/~soljourn/pub/AstroBioScene7Sound.mov, and the full scientific data visualization of the Solar Journey is available commercially.

  8. Scientific Visualization Tools for Enhancement of Undergraduate Research

    NASA Astrophysics Data System (ADS)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially require mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequent sorting and reformatting into files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable, multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Researchers are generally required to fully understand the intricacies of the dataset and to write computer programs or rely on commercially available software, which may not be trivial to use. In the time undergraduate researchers have available for their projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate scientific visualization tools are imperative in allowing students to have a meaningful and pleasant research experience while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable, multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze, and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions; the researcher can interactively change the scales along each dimension, the color tables, and the degree of smoothing to focus on particular phenomena. SAGE4D provides a navigable four-dimensional interactive environment. These tools allow students to make higher-order decisions based on large multidimensional data sets while diminishing the frustration that results from dealing with the details of processing large data sets.

  9. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization

    PubMed Central

    Bernal-Rusiel, Jorge L.; Rannou, Nicolas; Gollub, Randy L.; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E.; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution. PMID:28507515
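    The core idea here, synchronizing a small JSON snapshot of renderer state rather than streaming rendered pixels, can be sketched in a few lines of plain JavaScript (the property names below are illustrative, not the actual XTK or MedView schema, and the Google Drive Realtime transport layer is omitted):

```javascript
// Minimal sketch of state-based collaboration: each client holds the full
// image volume locally and exchanges only a tiny JSON state object.
// Property names are illustrative, not the real XTK/MedView schema.
function serializeState(renderer) {
  return JSON.stringify({
    cameraPosition: renderer.cameraPosition,
    sliceIndex: renderer.sliceIndex,
    windowLevel: renderer.windowLevel,
  });
}

function applyState(renderer, json) {
  const state = JSON.parse(json);
  renderer.cameraPosition = state.cameraPosition;
  renderer.sliceIndex = state.sliceIndex;
  renderer.windowLevel = state.windowLevel;
  return renderer; // the receiving client re-renders from this state
}

// Two clients with the same dataset converge by exchanging state JSON.
const clientA = { cameraPosition: [0, 0, 200], sliceIndex: 42, windowLevel: 0.7 };
const clientB = { cameraPosition: [0, 0, 100], sliceIndex: 0, windowLevel: 0.5 };
applyState(clientB, serializeState(clientA));
```

    Because the synchronized payload is a few hundred bytes rather than a rendered image, this design keeps interaction latency low even across slow links, which is what makes the rich-client approach attractive over server-side rendering.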

  10. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts, including modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as to supporting the translation of evidence into actionable insight. Each of these examples also highlights the need for scalable analysis: in each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead of dividing the task among a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  11. Ecoacoustic Music for Geoscience: Sonic Physiographies and Sound Casting

    NASA Astrophysics Data System (ADS)

    Burtner, M.

    2017-12-01

    The author describes specific ecoacoustic applications in his original compositions Sonic Physiography of a Time-Stretched Glacier (2015), Catalog of Roughness (2017), Sound Cast of Matanuska Glacier (2016), and Ecoacoustic Concerto (Eagle Rock) (2014). Ecoacoustic music uses technology to map systems from nature into music through techniques such as sonification, material amplification, and field recording. The author intends this music both to be descriptive of the data (as one would expect from a visualization) and to function as engaging, expressive music/sound art in its own right. In this way, ecoacoustic music can provide a fitting accompaniment to a scientific presentation (such as music for a science video) while also offering an exemplary concert hall presentation for a dedicated listening public. The music can at once support the communication of scientific research and help science make inroads into culture. The author discusses how music created using the data, sounds, and methods derived from earth science can recast this research into a sonic art modality. Such music can amplify the communication and dissemination of scientific knowledge by broadening the diversity of methods and formats used to bring excellent scientific research to the public. Music can also open the public's imagination to science, inspiring curiosity and emotional resonance. Hearing geoscience as music may help non-scientists access scientific knowledge in new ways, and it can greatly expand the types of venues in which this work can appear. Anywhere music is played (concert halls, festivals, galleries, radio, etc.) becomes a venue for scientific discovery.

  12. The Astronomy Workshop: Scientific Notation and Solar System Visualizer

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2008-09-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools developed under the direction of Doug Hamilton for use in undergraduate classes and by the general public. The philosophy of the site is to foster student interest in astronomy by exploiting their fascination with computers and the internet. We have expanded the "Scientific Notation" tool from simply converting decimal numbers into and out of scientific notation to also adding, subtracting, multiplying, and dividing numbers expressed in scientific notation. Students practice these skills and, when confident, may complete a quiz. In addition, there are suggestions on how instructors may use the site to encourage students to practice these basic skills. The Solar System Visualizer animates orbits of planets, moons, and rings to scale. Extrasolar planetary systems are also featured. This research was sponsored by NASA EPO grant NNG06GGF99G.
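    The arithmetic the expanded tool exercises is straightforward to state in code. A minimal sketch in JavaScript (the {mantissa, exponent} representation and normalization convention are assumptions for illustration, not the applet's implementation):

```javascript
// Numbers in scientific notation as { m: mantissa, e: exponent }, with
// 1 <= |m| < 10 after normalization (zero handled as m = 0, e = 0).
function normalize({ m, e }) {
  if (m === 0) return { m: 0, e: 0 };
  while (Math.abs(m) >= 10) { m /= 10; e += 1; }
  while (Math.abs(m) < 1) { m *= 10; e -= 1; }
  return { m, e };
}

// Multiplication: multiply mantissas, add exponents, renormalize.
function multiply(a, b) {
  return normalize({ m: a.m * b.m, e: a.e + b.e });
}

// Addition: rescale to the larger exponent, add mantissas, renormalize.
function add(a, b) {
  const e = Math.max(a.e, b.e);
  const m = a.m * 10 ** (a.e - e) + b.m * 10 ** (b.e - e);
  return normalize({ m, e });
}

// (3 x 10^4) * (2 x 10^3) = 6 x 10^7
const prod = multiply({ m: 3, e: 4 }, { m: 2, e: 3 });
// (3 x 10^4) + (2 x 10^3) = 3.2 x 10^4
const sum = add({ m: 3, e: 4 }, { m: 2, e: 3 });
```

    The rescale-then-add step is exactly the hand procedure students practice: exponents must match before mantissas can be summed.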

  13. Interactive access and management for four-dimensional environmental data sets using McIDAS

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Tripoli, Gregory J.

    1995-01-01

    This grant has fundamentally changed the way that meteorologists look at the output of their atmospheric models, through the development and wide distribution of the Vis5D system. The Vis5D system is also gaining acceptance among oceanographers and atmospheric chemists. Vis5D gives these scientists an interactive three-dimensional movie of their very large data sets that they can use to understand physical mechanisms and to trace problems to their sources. This grant has also helped to define the future direction of scientific visualization through the development of the VisAD system and its lattice data model. The VisAD system can be used to interactively steer and visualize scientific computations. A key element of this capability is the flexibility of the system's data model to adapt to a wide variety of scientific data, including the integration of several forms of scientific metadata.

  14. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  15. Mobile Application Removes Societal Barriers to P4 Medicine.

    PubMed

    Michel, J-P

    2017-01-01

    The overlap between one innovative paradigm (P4 medicine: predictive, personalized, participatory and preventive) and another (a new definition of "healthy ageing") is fertile ground for new technologies, such as a new mobile application (app) that could broaden our scientific knowledge of the ageing process and help us better analyse the impact of possible interventions in slowing the ageing decline. A novel mobile application is presented here as a game whose questions and tests allow, in 10 minutes, assessment of the following domains: robustness, flexibility (lower muscle strength), balance, mental and memory complaints, semantic memory, and visual retention. The game is complemented by specific measurements that could establish precise information on functional and cognitive abilities. A global evaluation precedes advice and different types of exercises. Repetition of the tests and measures allows long-term follow-up of individual performance, which can be shared (on specific request) with family members and general practitioners.

  16. When a Picture Isn't Worth 1000 Words: Learners Struggle to Find Meaning in Data Visualizations

    ERIC Educational Resources Information Center

    Stofer, Kathryn A.

    2016-01-01

    The oft-repeated phrase "a picture is worth a thousand words" supposes that an image can replace a profusion of words to more easily express complex ideas. For scientific visualizations that represent profusions of numerical data, however, an untranslated academic visualization suffers the same pitfalls untranslated jargon does. Previous…

  17. Visualizing Time: How Linguistic Metaphors Are Incorporated into Displaying Instruments in the Process of Interpreting Time-Varying Signals

    ERIC Educational Resources Information Center

    Garcia-Belmonte, Germà

    2017-01-01

    Spatial visualization is a well-established topic of education research that has allowed improving science and engineering students' skills on spatial relations. Connections have been established between visualization as a comprehension tool and instruction in several scientific fields. Learning about dynamic processes mainly relies upon static…

  18. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    PubMed Central

    Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R

    2007-01-01

    Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. 
In addition, the models provide an accurate visual description of the control flow of a biomolecular analysis experiment using NMR spectroscopy. PMID:17263870

  19. Conceptual-level workflow modeling of scientific experiments using NMR as a case study.

    PubMed

    Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R

    2007-01-30

    Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow of a biomolecular analysis experiment using NMR spectroscopy.

  20. 76 FR 28441 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ...: Vision, Cognition and Pain. Date: June 29-30, 2011. Time: 8 a.m. to 6 p.m. Agenda: To review and evaluate...: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cognition and Central Visual...

  1. Visual Analytics of integrated Data Systems for Space Weather Purposes

    NASA Astrophysics Data System (ADS)

    Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo

    Analysis of information from multiple data sources obtained through high-resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and large measurement extent, is key to studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time series generalization introducing the concept of a generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this novel representation approach, each generalized numerical lattice carries post-analytical data information. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size, and a post-analytical measure (e.g., the autocorrelation, the Hurst exponent, etc.) [1]. From this generalization, any multi-source database can be reduced to a closed set of classified time series in spatiotemporal generalized dimensions. As a case study, we show a preliminary application to space science data, highlighting the possibility of a real-time expert analysis system. In this particular application, we selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. The DFA method is performed in the framework of a radio-burst automatic monitoring system. Our results may characterize the evolution of the variability pattern by computing the DFA scaling exponent while scanning the time series with a short window preceding the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented.
The prototype for visual analytics is implemented in CUDA (Compute Unified Device Architecture), using Nvidia K20 graphics processing units (GPUs) to reduce the integrated-analysis runtime. [1] Veronese et al., doi:10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al., doi:10.1016/j.jastp.2010.09.030, 2011.
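    The DFA computation referenced above follows a standard recipe: integrate the mean-subtracted series into a profile, linearly detrend it in windows of varying size, and fit the log-log slope of the residual fluctuations. A first-order DFA sketch in JavaScript (a generic textbook implementation, not the authors' CUDA code; the window sizes and test signal are illustrative):

```javascript
// First-order detrended fluctuation analysis (DFA-1): returns the scaling
// exponent alpha, the slope of log F(n) versus log n.
function dfa(series, windowSizes) {
  const mean = series.reduce((s, x) => s + x, 0) / series.length;
  // Profile: cumulative sum of deviations from the mean.
  const profile = [];
  let acc = 0;
  for (const x of series) { acc += x - mean; profile.push(acc); }

  const logN = [], logF = [];
  for (const n of windowSizes) {
    const nWin = Math.floor(profile.length / n);
    let sumSq = 0;
    for (let w = 0; w < nWin; w++) {
      // Least-squares linear detrend of one non-overlapping window.
      const seg = profile.slice(w * n, (w + 1) * n);
      let sx = 0, sy = 0, sxx = 0, sxy = 0;
      for (let i = 0; i < n; i++) { sx += i; sy += seg[i]; sxx += i * i; sxy += i * seg[i]; }
      const slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
      const intercept = (sy - slope * sx) / n;
      for (let i = 0; i < n; i++) {
        const r = seg[i] - (slope * i + intercept);
        sumSq += r * r;
      }
    }
    logN.push(Math.log(n));
    logF.push(Math.log(Math.sqrt(sumSq / (nWin * n))));
  }
  // Alpha: least-squares slope of log F against log n.
  const k = logN.length;
  const mx = logN.reduce((s, x) => s + x, 0) / k;
  const my = logF.reduce((s, x) => s + x, 0) / k;
  let num = 0, den = 0;
  for (let i = 0; i < k; i++) { num += (logN[i] - mx) * (logF[i] - my); den += (logN[i] - mx) ** 2; }
  return num / den;
}

// Deterministic pseudo-random white noise (LCG): alpha should be near 0.5
// (a random walk would give alpha near 1.5).
let lcg = 1;
const rand = () => (lcg = (lcg * 48271) % 2147483647) / 2147483647;
const noise = Array.from({ length: 4096 }, () => rand() - 0.5);
const alpha = dfa(noise, [8, 16, 32, 64, 128]);
```

    Monitoring how alpha drifts as the analysis window slides toward an extreme event is the essence of the precursor-scanning approach described in the abstract.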

  2. Design and Evaluation of Dedicated Smartphone Applications for Collaborative Science Education

    NASA Astrophysics Data System (ADS)

    Fertitta, John A., Jr.

    2011-12-01

    Over the past several years, the use of scientific probes has become more common in science classrooms. The goal of teaching with these probes is to engage students in inquiry-based learning. However, they are often complicated and stationary, forcing experiments to remain in the classroom and limiting their use. The Internet System for Networked Sensor Experimentation (iSENSE) was created to address these limitations. iSENSE is a web system for storing and visualizing sensor data. The project also includes a hardware package, the PINPoint, that interfaces to existing probes and acts as a probe itself. As the mobile phone industry continues to advance, we are beginning to see smartphones that are just as powerful, if not more powerful, than many desktop computers. These devices are often equipped with advanced sensors, making them as capable as some science probes at a lower cost. Against this background, this thesis explores the use of smartphones in secondary school science classrooms. In collaboration with one teacher, three custom applications were developed for four separate curriculum-based learning activities. The smartphones replaced existing traditional tools and science probes. Some data collected with the smartphones were uploaded to the iSENSE web system for analysis. Student use of the smartphones and the subsequent scientific visualizations using the iSENSE web system were observed, and a teacher interview was conducted afterward. It was found that a collaborative design process involving the teacher resulted in the successful integration of smartphone applications into learning activities. In one case, the smartphones and the use of iSENSE did not improve the students' understanding of the learning objectives. In several others, however, the smartphones outperformed traditional probeware as data collectors, and with the classroom teacher's guidance, the iSENSE web system facilitated more in-depth discussions of the data.

  3. Immersive visualization of rail simulation data.

    DOT National Transportation Integrated Search

    2016-01-01

    The prime objective of this project was to create scientific, immersive visualizations of a rail simulation. This project is part of a larger initiative that consists of three distinct parts. The first step consists of performing a finite element a...

  4. Immersive Virtual Reality Technologies as a New Platform for Science, Scholarship, and Education

    NASA Astrophysics Data System (ADS)

    Djorgovski, Stanislav G.; Hut, P.; McMillan, S.; Knop, R.; Vesperini, E.; Graham, M.; Portegies Zwart, S.; Farr, W.; Mahabal, A.; Donalek, C.; Longo, G.

    2010-01-01

    Immersive virtual reality (VR) and virtual worlds (VWs) are an emerging set of technologies which likely represent the next evolutionary step in the ways we use information technology to interact with the world of information and with other people, the roles now generally fulfilled by the Web and other common Internet applications. Currently, these technologies are mainly accessed through various VWs, e.g., the Second Life (SL), which are general platforms for a broad range of user activities. As an experiment in the utilization of these technologies for science, scholarship, education, and public outreach, we have formed the Meta-Institute for Computational Astrophysics (MICA; http://mica-vw.org), the first professional scientific organization based exclusively in VWs. The goals of MICA are: (1) Exploration, development and promotion of VWs and VR technologies for professional research in astronomy and related fields. (2) Providing and developing novel social networking venues and mechanisms for scientific collaboration and communications, including professional meetings, effective telepresence, etc. (3) Use of VWs and VR technologies for education and public outreach. (4) Exchange of ideas and joint efforts with other scientific disciplines in promoting these goals for science and scholarship in general. To this effect, we have a regular schedule of professional and public outreach events in SL, including technical seminars, workshops, journal club, collaboration meetings, public lectures, etc. We find that these technologies are already remarkably effective as a telepresence platform for scientific and scholarly discussions, meetings, etc. They can offer substantial savings of time and resources, and eliminate a lot of unnecessary travel. They are equally effective as a public outreach platform, reaching a world-wide audience. 
On the pure research front, we are currently exploring the use of these technologies as a venue for numerical simulations and their visualization, as well as the immersive and interactive visualization of highly-dimensional data sets.

  5. GLOBE Program's Data and Information System

    NASA Astrophysics Data System (ADS)

    Memarsadeghi, N.; Overoye, D.; Lewis, C.; Butler, D. M.; Ramapriyan, H.

    2016-12-01

    "The Global Learning and Observations to Benefit the Environment (GLOBE) Program is an international science and education program that provides students and the public worldwide with the opportunity to participate in data collection and the scientific process, and contribute meaningfully to our understanding of the Earth system and global environment" (www.globe.gov). The GLOBE Program has a rich community of students, teachers, scientists, trainers, country coordinators, and alumni across the world, technologically spanning both high- and low-end users. There are 117 GLOBE participating countries around the world. GLOBE's science data protocols and educational material span the atmosphere, biosphere, hydrosphere, soil (pedosphere), and Earth as a System scientific areas (http://www.globe.gov/do-globe/globe-teachers-guide). GLOBE's Data and Information System (DIS), when first introduced in 1995, was a cutting-edge system that was well received and innovative for its time. However, internet-based technologies have changed dramatically since then. Projects to modernize and evolve the GLOBE DIS started in 2010, resulting in today's GLOBE DIS. The current GLOBE DIS is built upon the latest information technologies and is engaging and supporting the user community with advanced tools and services to further the goals of the GLOBE Program. The GLOBE DIS consists of over 20 years of observation and training data, a rich set of software systems and applications for data entry, visualization, and analysis, as well as tools for training users in various science data protocols and enabling collaborations among members of the international user community. We present the existing GLOBE DIS, its application technologies, and lessons learned from their operations, development, sustaining engineering, and data management practices. Examples of GLOBE DIS technologies include the Liferay system for integrated user and content management, a PostgreSQL/PostGIS database, Ruby on Rails for the data entry system, and OpenGeo for the visualization system.

  6. Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel

    NASA Astrophysics Data System (ADS)

    Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.

    2008-12-01

    The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and its scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market-driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services, including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), a digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and in-house user applications for workflow-specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, and stratigraphy. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and visualization of context data collected with the shipboard core loggers and other instruments.

  7. YODA++: A proposal for a semi-automatic space mission control

    NASA Astrophysics Data System (ADS)

    Casolino, M.; de Pascale, M. P.; Nagni, M.; Picozza, P.

    YODA++ is a proposal for a semi-automated data handling and analysis system for the PAMELA space experiment. The core routines have been developed to process a stream of raw data downlinked from the Resurs DK1 satellite (housing PAMELA) to the ground station in Moscow. Raw data consist of scientific data complemented by housekeeping information. Housekeeping information will be analyzed within a short time of download (1 h) in order to monitor the status of the experiment and to plan mission acquisition. A prototype for data visualization will run on an Apache Tomcat web application server, providing an off-line analysis tool accessed through a browser and part of the code for system maintenance. Data retrieval development is in the production phase, while a GUI for user-friendly monitoring is in a preliminary phase, as is a JavaServer Pages/JavaServer Faces (JSP/JSF) web application facility. On a longer timescale (1-3 h from download), scientific data are analyzed. The data storage core will be a mix of CERN's ROOT file structure and MySQL as a relational database. YODA++ is currently being used in the on-ground integration and testing of PAMELA data.

  8. Adapting line integral convolution for fabricating artistic virtual environment

    NASA Astrophysics Data System (ADS)

    Lee, Jiunn-Shyan; Wang, Chung-Ming

    2003-04-01

    Vector fields occur extensively not only in scientific applications but also in treasured art such as sculptures and paintings; artists depict the natural environment by stressing valued directional features in addition to color and shape information. Line integral convolution (LIC), developed for imaging vector fields in scientific visualization, has the potential to produce directional imagery. In this paper we present several techniques that exploit LIC to generate impressionistic images forming an artistic virtual environment. We take advantage of the directional information given by a photograph and incorporate several refinements, including a non-photorealistic shading technique and statistical detail control. In particular, the non-photorealistic shading technique blends cool and warm colors into the photograph to imitate artists' painting conventions. We also adopt a statistical technique that controls the integration length according to image variance in order to preserve detail. Furthermore, we propose a method for generating a series of mip-maps that reveal consistent strokes under multi-resolution viewing and achieve frame coherence in an interactive walkthrough system. The experimental results show that the approach emulates artistic styles convincingly and computes efficiently; as a consequence, the proposed technique successfully supports a wide range of non-photorealistic rendering (NPR) applications, such as interactive virtual environments with artistic perception.
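
    The core LIC operation referenced in this abstract is compact: smear a noise texture along streamlines of the vector field so that intensity becomes correlated along the flow direction. The sketch below is a minimal, unoptimized Python/NumPy version (first-order Euler tracing, box kernel), not the authors' code; the circular test field and all names are illustrative.

```python
import numpy as np

def lic(vx, vy, noise, L=10, h=0.5):
    """Line integral convolution: average a noise texture along streamlines.
    vx, vy : vector-field components on an H x W grid
    noise  : input texture (H x W), e.g. white noise in [0, 1)
    L      : integration steps traced in each direction
    h      : step size in pixels"""
    H, W = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for i in range(H):
        for j in range(W):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):              # trace forward and backward
                x, y = float(j), float(i)
                for _ in range(L):
                    ix, iy = int(round(x)), int(round(y))
                    if not (0 <= ix < W and 0 <= iy < H):
                        break
                    total += noise[iy, ix]        # box-kernel accumulation
                    count += 1
                    u, v = vx[iy, ix], vy[iy, ix]
                    norm = np.hypot(u, v)
                    if norm < 1e-12:              # stagnation point
                        break
                    x += sign * h * u / norm      # Euler step along the field
                    y += sign * h * v / norm
            out[i, j] = total / max(count, 1)
    return out

# Circular vector field around the image centre, as a toy input
H = W = 32
yy, xx = np.mgrid[0:H, 0:W]
vx, vy = -(yy - H / 2.0), (xx - W / 2.0)
noise_tex = np.random.default_rng(0).random((H, W))
tex = lic(vx, vy, noise_tex)
```

    Averaging along streamlines reduces variance relative to the input noise, which is what produces the visible directional strokes.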

  9. Wavelet-Based Interpolation and Representation of Non-Uniformly Sampled Spacecraft Mission Data

    NASA Technical Reports Server (NTRS)

    Bose, Tamal

    2000-01-01

    A well-documented problem in the analysis of data collected by spacecraft instruments is the need for an accurate, efficient representation of the data set. The data may suffer from several problems, including additive noise, data dropouts, an irregularly-spaced sampling grid, and time-delayed sampling. These data irregularities render most traditional signal processing techniques unusable, and thus the data must be interpolated onto an even grid before scientific analysis techniques can be applied. In addition, the extremely large volume of data collected by scientific instrumentation presents many challenging problems in the area of compression, visualization, and analysis. Therefore, a representation of the data is needed which provides a structure which is conducive to these applications. Wavelet representations of data have already been shown to possess excellent characteristics for compression, data analysis, and imaging. The main goal of this project is to develop a new adaptive filtering algorithm for image restoration and compression. The algorithm should have low computational complexity and a fast convergence rate. This will make the algorithm suitable for real-time applications. The algorithm should be able to remove additive noise and reconstruct lost data samples from images.
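
    As a concrete illustration of why wavelet representations suit such data, even a single-level Haar transform separates a smooth signal from additive noise, and hard-thresholding the detail coefficients denoises it. The sketch below is a generic textbook construction in Python/NumPy under invented signal and threshold choices, not the adaptive algorithm this project developed.

```python
import numpy as np

def haar_forward(x):
    """One level of the orthonormal Haar wavelet transform (even-length x)."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def haar_inverse(s, d):
    """Exact inverse of haar_forward."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def denoise(x, thresh):
    """Suppress small detail coefficients (hard thresholding), then invert."""
    s, d = haar_forward(x)
    d = np.where(np.abs(d) > thresh, d, 0.0)
    return haar_inverse(s, d)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 256)
clean = np.sin(2 * np.pi * 4 * t)                  # smooth underlying signal
noisy = clean + 0.3 * rng.standard_normal(t.size)  # additive noise
cleaned = denoise(noisy, thresh=0.5)
```

    Because a smooth signal concentrates its energy in the approximation band while white noise spreads evenly across both bands, zeroing small detail coefficients removes mostly noise.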

  10. The Multi-Sector Sustainability Browser (MSSB): A Tool for ...

    EPA Pesticide Factsheets

    The MSSB is the first and only decision support tool containing information from scientific literature and technical reports that can be used to develop and implement sustainability initiatives. The MSSB is designed to assist individuals and communities in understanding the impacts that the four key dimensions of sustainability - Land Use, Buildings and Infrastructure, Transportation, and Materials Management - can have on human health, the economy, and the built and natural environments. The MSSB has the following capabilities: a. Displays and describes linkages between the four major sustainability concepts (Land Use, Buildings and Infrastructure, Transportation, and Materials Management) and their subordinate concepts. b. Displays and lists literature sources and references (including weblinks where applicable) providing information about each major sustainability concept and its associated subordinate concepts. c. Displays and lists quantitative data related to each major sustainability concept and its associated subordinate concepts, with weblinks where applicable. The MSSB serves as a 'visual database', allowing users to: investigate one or more of the four key sustainability dimensions; explore available scientific literature references; and assess potential impacts of sustainability activities. The MSSB reduces the amount of time and effort required to assess the state of sustainability science and engineering research pertaining

  11. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of data analysis algorithms exist (e.g., specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g., with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g., Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g., computational steering on large-scale high-performance computing platforms) to put human judgement into the analysis loop, or with new database approaches designed to support unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g., relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the pieces of information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights into big data analytics methods in the context of science within various communities, and offers different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
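
    The "(iterative) map-reduce methods" mentioned above reduce to a small pattern that frameworks like Hadoop industrialize: a map step emitting key-value pairs, a shuffle grouping values by key, and a reduce step aggregating each group. A minimal in-memory sketch in Python, using word count as the canonical example (no Hadoop or Twister APIs are involved; all names are illustrative):

```python
from collections import defaultdict
from functools import reduce

def map_phase(records):
    """Map: emit (key, 1) pairs -- here, one pair per word in each record."""
    for record in records:
        for word in record.lower().split():
            yield word, 1

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values -- summation for word count."""
    return {key: reduce(lambda a, b: a + b, values)
            for key, values in groups.items()}

docs = ["big data analytics", "big data tools", "data analysis"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts["data"] == 3, counts["big"] == 2
```

    The same three-stage structure scales out because map and reduce apply independently per record and per key, so each stage can be partitioned across machines.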

  12. Visualization of planetary subsurface radar sounder data in three dimensions using stereoscopy

    NASA Astrophysics Data System (ADS)

    Frigeri, A.; Federico, C.; Pauselli, C.; Ercoli, M.; Coradini, A.; Orosei, R.

    2010-12-01

    Planetary subsurface sounding radar data extend the knowledge of planetary surfaces to a third dimension: depth. The interpretation of radar echo delays converted into depth often requires comparative analysis with other data, mainly topography, and radar data from different orbits can be used to investigate the spatial continuity of signals from subsurface geologic features. This scenario requires taking spatially referenced information into account in three dimensions. Three-dimensional objects are generally easier to understand when represented in a three-dimensional space, and this representation can be improved by stereoscopic vision. Since its invention in the first half of the 19th century, stereoscopy has been used in a broad range of applications, including scientific visualization. The rapid improvement of computer graphics and the spread of graphics rendering hardware make it possible to apply the basic principles of stereoscopy in the digital domain, allowing the stereoscopic projection of complex models. Specialized systems for stereoscopic viewing of scientific data have long been available in industry, but proprietary solutions were affordable only to large research institutions. In the last decade, thanks to the GeoWall Consortium, the basics of stereoscopy have been applied to set up stereoscopic viewers based on off-the-shelf hardware. GeoWalls have spread and are now used by several geoscience research institutes and universities. We are exploring techniques for visualizing planetary subsurface sounding radar data in three dimensions, and we are developing a hardware system for rendering them in a stereoscopic vision system. Several Free and Open Source Software tools and libraries are being used, as their level of interoperability is typically high and their licensing makes it possible to quickly implement new functionality to address specific needs as the project progresses. Visualization of planetary radar data in three dimensions is a challenging task, and the exploration of different strategies will lead to the selection of the most appropriate ones for meaningful extraction of information from the products of these innovative instruments.

  13. Students' Communicative Resources in Relation to Their Conceptual Understanding—The Role of Non-Conventionalized Expressions in Making Sense of Visualizations of Protein Function

    NASA Astrophysics Data System (ADS)

    Rundgren, Carl-Johan; Hirsch, Richard; Chang Rundgren, Shu-Nu; Tibell, Lena A. E.

    2012-10-01

    This study examines how students explain their conceptual understanding of protein function using visualizations. Thirteen upper secondary students, four tertiary students (studying chemical biology), and two experts were interviewed in semi-structured interviews. The interviews were structured around 2D illustrations of proteins and an animated representation of water transport through a channel in the cell membrane. In the analysis of the transcripts, a score based on the SOLO taxonomy was given to each student to indicate the conceptual depth achieved in their explanations. The use of scientific terms and non-conventionalized expressions in the students' explanations was investigated based upon a semiotic approach. The results indicated a positive relationship between the use of scientific terms and level of education. However, there was no correlation between students' use of scientific terms and conceptual depth. In the interviews, we found that non-conventionalized expressions were used by several participants to express conceptual understanding and played a role in making sense of the visualizations of protein function. Interestingly, the experts also made use of non-conventionalized expressions. The results of our study imply that more attention should be paid to students' use of scientific and non-conventionalized terms in relation to their conceptual understanding.

  14. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    NASA Technical Reports Server (NTRS)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

    Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources, using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open data-sharing services, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE dramatically extends the awareness and visibility of scientific results. Using the embedded geographic information in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.

  15. Make it fun for everyone: visualization techniques in geoscience

    NASA Astrophysics Data System (ADS)

    Portnov, A.; Sojtaric, M.

    2017-12-01

    We live on a planet that mostly consists of oceans, yet most people cannot picture what the surface and subsurface of the ocean floor look like. Marine geophysics has traditionally been difficult to explain to the general public, as most of what we do happens beyond the visual realm of an average audience. However, recent advances in 3D visualization of scientific data are among the tools we can employ to better explain complex systems through gripping visual content. Coupled with a narrative approach, this type of visualization can open up a whole new and relatively little-known world of science to the general public. Up-to-date remote-sensing methods provide unique data on the seabed surface and subsurface all over the planet. Modern software can present these data in a spectacular way and with great scientific accuracy, making them attractive both to specialists and non-specialists in geoscience. As an example, we present several visualizations which, in a simple way, tell the stories of various research efforts in remote parts of the world, such as the Arctic regions and the deep ocean in the Gulf of Mexico. Diverse datasets - multibeam echosounding, hydrographic surveys, and seismic and borehole data - are put together to build a precisely geo-referenced environment showing the complexity of geological processes on our planet. Some of the data were collected 10-15 years ago but gained new life with the help of new data visualization techniques. Every digital object with assigned coordinates, including 2D pictures and 3D models, may become part of this virtual geologic environment, limiting the potential of geo-visualization only by the imagination of the scientist. The presented videos have a clear scientific focus on marine geology and geophysics, since the data were collected by several research and petroleum organizations specializing in this field. The stories we tell in this way may, for example, provide the public with further insight into the complexities surrounding natural subsea gas storage and release.

  16. A GIS-based decision support system for regional eco-security assessment and its application on the Tibetan Plateau.

    PubMed

    Xiaodan, Wang; Xianghao, Zhong; Pan, Gao

    2010-10-01

    Regional eco-security assessment is an intricate, challenging task. In previous studies, the integration of eco-environmental models and geographical information systems (GIS) usually takes two approaches: loose coupling and tight coupling. However, the present study used a full coupling approach to develop a GIS-based regional eco-security assessment decision support system (ESDSS). This was achieved by merging the pressure-state-response (PSR) model and the analytic hierarchy process (AHP) into ArcGIS 9 as a dynamic link library (DLL) using ArcObjects in ArcGIS and Visual Basic for Applications. Such an approach makes it easy to capitalize on the GIS visualization and spatial analysis functions, thereby significantly supporting the dynamic estimation of regional eco-security. A case study is presented for the Tibetan Plateau, known as the world's "third pole" after the Arctic and Antarctic. Results verified the usefulness and feasibility of the developed method. As a useful tool, the ESDSS can also help local managers to make scientifically-based and effective decisions about Tibetan eco-environmental protection and land use. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
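
    The AHP component merged into the ESDSS reduces to a standard computation: derive priority weights from a pairwise-comparison matrix via its principal eigenvector, and check Saaty's consistency ratio before trusting the judgments. A minimal NumPy sketch with a hypothetical comparison matrix (the paper's actual indicators and judgments are not reproduced here):

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise-comparison matrix via the principal
    eigenvector, plus Saaty's consistency ratio (CR < 0.1 is acceptable)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalize weights to 1
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                   # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
    return w, ci / ri

# Hypothetical comparison of three eco-security pressure indicators:
# indicator 1 is moderately more important than 2, strongly more than 3.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w, cr = ahp_weights(A)
```

    In a full PSR assessment, the resulting weights would then multiply the normalized pressure, state, and response indicator scores per map unit.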

  17. Spectral imaging: principles and applications.

    PubMed

    Garini, Yuval; Young, Ian T; McNamara, George

    2006-08-01

    Spectral imaging extends the capabilities of biological and clinical studies to simultaneously study multiple features, such as organelles and proteins, qualitatively and quantitatively. Spectral imaging combines two well-known scientific methodologies, namely spectroscopy and imaging, to provide a new, advantageous tool. The need to measure the spectrum at each point of the image requires combining dispersive optics with the more common imaging equipment, and introduces constraints as well. The principles of spectral imaging and a few representative applications are described. Spectral imaging analysis is necessary because the complex data structure cannot be analyzed visually. A few of the algorithms are discussed, with emphasis on their usage for different experimental modes (fluorescence and bright field). Finally, spectral imaging, like any method, should be evaluated in light of its advantages for specific applications, a selection of which is described. Spectral imaging is a relatively new technique and its full potential is yet to be exploited. Nevertheless, several applications have already shown its potential. (c) 2006 International Society for Analytical Cytology.
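
    One of the analysis algorithms alluded to here, separating the contributions of known components at each pixel, is commonly formulated as linear spectral unmixing: a per-pixel least-squares solve against known endmember spectra. A minimal NumPy sketch with invented endmember spectra (not data or code from this paper):

```python
import numpy as np

# Hypothetical endmember spectra as columns: two fluorophores sampled
# at five wavelength bins (rows).
S = np.array([[0.9, 0.1],
              [0.7, 0.2],
              [0.4, 0.5],
              [0.2, 0.8],
              [0.1, 0.9]])

def unmix(pixel_spectrum, endmembers):
    """Linear unmixing: solve the least-squares problem S @ a ~= x for the
    abundance vector a of each endmember at this pixel."""
    a, *_ = np.linalg.lstsq(endmembers, pixel_spectrum, rcond=None)
    return a

true_a = np.array([0.3, 0.7])   # ground-truth abundances
x = S @ true_a                  # noiseless mixed spectrum at one pixel
a = unmix(x, S)
```

    Applied independently to every pixel of a spectral image, this recovers abundance maps for each labeled component; real data would add noise and non-negativity constraints.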

  18. NCI Visuals Online

    Cancer.gov

    NCI Visuals Online contains images from the collections of the National Cancer Institute's Office of Communications and Public Liaison, including general biomedical and science-related images, cancer-specific scientific and patient care-related images, and portraits of directors and staff of the National Cancer Institute.

  19. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook), co-developed by Kitware and NASA-Ames, is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets, and supports features such as Point, Line, and Polygon, as well as advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas, wrapped in a unified API provided by Gaia (https://github.com/OpenDataAnalytics/gaia). In this presentation, we will discuss the core features of each of these tools and will present lessons learned on handling large data in the context of data management, analyses, and visualization.
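
    The specific stacks named above (Spark, Celery, KTile, GeoPySpark) are not reproduced here, but the pattern that lets such backends scale is worth sketching: per-partition summaries that merge associatively, so a statistic over terabytes reduces to a map over chunks plus a combine step. A pure-Python illustration of that pattern for mean and standard deviation (all names are illustrative, not any library's API):

```python
from functools import reduce
import math

def chunk_stats(values):
    """Per-partition summary: (count, sum, sum of squares). Each is cheap to
    compute locally, and the tuples merge associatively across partitions."""
    return len(values), sum(values), sum(v * v for v in values)

def combine(a, b):
    """Associative merge of two partition summaries (the 'reduce' side)."""
    return a[0] + b[0], a[1] + b[1], a[2] + b[2]

def finalize(summary):
    """Turn a merged summary into mean and population standard deviation."""
    n, s, ss = summary
    mean = s / n
    var = ss / n - mean * mean
    return mean, math.sqrt(max(var, 0.0))

data = list(range(1, 101))                       # stand-in for a large dataset
partitions = [data[i:i + 25] for i in range(0, len(data), 25)]
mean, std = finalize(reduce(combine, map(chunk_stats, partitions)))
```

    Because `combine` is associative, the same code works whether partitions are processed serially, in parallel workers, or across a cluster.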

  20. Python-Based Applications for Hydrogeological Modeling

    NASA Astrophysics Data System (ADS)

    Khambhammettu, P.

    2013-12-01

    Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), scientific /mathematical Functions (scipy), have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program TYPECURVEGRID-Py was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. 
The Python wrapper invokes the underlying FORTRAN layer to compute transient groundwater elevations and processes this information to create time-series and 2D plots.
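The convolution-based screening described above is straightforward to sketch with numpy. The URF values and the release schedule below are purely illustrative, not output of the ATRANS code:

```python
import numpy as np

def receptor_concentration(source_history, urf):
    """Convolve a source release history with a Unit Response Function
    (discrete superposition in time) and truncate to the simulation
    length, giving the concentration time series at the receptor."""
    full = np.convolve(source_history, urf)
    return full[:len(source_history)]

# Illustrative URF: delayed, attenuated breakthrough at the receptor.
urf = np.array([0.0, 0.1, 0.3, 0.25, 0.15, 0.1, 0.05])
# Hypothetical source: releases one unit per year in years 0-2, then stops.
source = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0])

conc = receptor_concentration(source, urf)
```

Because convolution is linear, the response to any release schedule is the sum of shifted, scaled copies of the URF, which is what makes testing alternative source hypotheses so cheap once the URFs are in hand.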

  1. Interactive Exploration of Cosmological Dark-Matter Simulation Data.

    PubMed

    Scherzinger, Aaron; Brix, Tobias; Drees, Dominik; Volker, Andreas; Radkov, Kiril; Santalidis, Niko; Fieguth, Alexander; Hinrichs, Klaus H

    2017-01-01

    This article describes the winning entry of the 2015 IEEE Scientific Visualization Contest: a visualization tool for cosmological data resulting from dark-matter simulations. The proposed system helps users explore all aspects of the data at once and obtain more detailed information about structures of interest at any time. Moreover, novel methods for visualizing and interactively exploring dark-matter halo substructures are proposed.

  2. The visual theology of Victorian popularizers of science. From reverent eye to chemical retina.

    PubMed

    Lightman, B

    2000-12-01

    This essay examines the use of visual images during the latter half of the nineteenth century in the work of three important popularizers of science. J. G. Wood, Richard Proctor, and Agnes Clerke skillfully used illustrations and photographs to establish their credibility as trustworthy guides to scientific, moral, and religious truths. All three worked within the natural theology tradition, despite the powerful critique of William Paley's argument from design set forth in Charles Darwin's Origin of Species (1859). Wood, Proctor, and Clerke recognized that in order to reach a popular audience with their message of divine wonder in nature, they would have to take advantage of the developing mass visual culture embodied in the new pictorial magazines, spectacles, and entertaining toys based on scientific gadgets emblematic of the reorganization of vision. But in drawing on different facets of the emerging visual culture and in looking to the images produced by the new visual technologies to find the hand of God in nature, these popularizers subtly transformed the natural theology tradition.

  3. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  4. Undergraduate Labs for Biological Physics: Brownian Motion and Optical Trapping

    NASA Astrophysics Data System (ADS)

    Chu, Kelvin; Laughney, A.; Williams, J.

    2006-12-01

    We describe a set of case-study driven labs for an upper-division biological physics course. These labs are motivated by case studies and consist of inquiry-driven investigations of Brownian motion and optical-trapping experiments. Each lab incorporates two innovative educational techniques to drive the process and application aspects of scientific learning. Case studies are used to encourage students to think independently and apply the scientific method to a novel lab situation. Student input from this case study is then used to decide how best to do the measurement, guide the project, and ultimately evaluate the success of the program. Where appropriate, visualization and simulation using VPython are used. Direct visualization of Brownian motion allows students to calculate Avogadro's number or the Boltzmann constant. Following case-study driven discussion, students use video microscopy to measure the motion of latex spheres in fluids of different viscosity to arrive at a good approximation of NA or kB. Optical trapping (laser tweezer) experiments allow students to investigate the consequences of 100-pN forces on small particles. The case study consists of a discussion of the Boltzmann distribution and equipartition theorem followed by a consideration of the shape of the potential. Students can then use video capture to measure the distribution of bead positions to determine the shape and depth of the trap. This work was supported by NSF DUE-0536773.
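The kB measurement rests on the Stokes-Einstein relation D = kB*T/(6*pi*eta*r) together with the mean-squared displacement of a diffusing bead. A minimal sketch (simulated tracks standing in for the students' video-microscopy data; all parameter values are illustrative) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
T, eta, r = 295.0, 1.0e-3, 0.5e-6        # K, Pa*s (water), bead radius in m
kB_true = 1.380649e-23                    # J/K, used only to generate tracks
D = kB_true * T / (6 * np.pi * eta * r)   # diffusion coefficient, m^2/s

# Simulate 2D bead tracks: each displacement component is Gaussian with
# variance 2*D*dt (the discrete analogue of Brownian motion).
dt, n_steps, n_beads = 0.1, 2000, 50
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_beads, n_steps, 2))

# Estimate D from the single-step mean-squared displacement:
# <dx^2 + dy^2> = 4*D*dt in two dimensions.
msd1 = np.mean(np.sum(steps**2, axis=2))
D_est = msd1 / (4 * dt)

# Invert Stokes-Einstein to recover the Boltzmann constant.
kB_est = 6 * np.pi * eta * r * D_est / T
```

With real video data the displacements would come from tracked bead centroids, but the estimator is the same.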

  5. Stereoscopy in Static Scientific Imagery in an Informal Education Setting: Does It Matter?

    NASA Astrophysics Data System (ADS)

    Price, C. Aaron; Lee, H.-S.; Malatesta, K.

    2014-12-01

    Stereoscopic technology (3D) is rapidly becoming ubiquitous across research, entertainment, and informal educational settings. Children of today may grow up never knowing a time when movies, television, and video games were not available stereoscopically. Despite this rapid expansion, the field's understanding of the impact of stereoscopic visualizations on learning is rather limited. Much of the excitement about stereoscopic technology could be due to a novelty effect, which will wear off over time. This study controlled for the novelty factor using a variety of techniques. On the floor of an urban science center, 261 children were shown 12 photographs and visualizations of highly spatial scientific objects and scenes. The images were randomly shown in either traditional (2D) format or in stereoscopic format. The children were asked two questions about each image: one about a spatial property of the image and one about a real-world application of that property. At the end of the test, each child was asked to draw from memory the last image they saw. Results showed no overall significant difference in response to the questions associated with 2D or 3D images. However, children who saw the final slide only in 3D drew more complex representations of the slide than those who did not. Results are discussed through the lenses of cognitive load theory and the effect of novelty on engagement.

  6. WeBIAS: a web server for publishing bioinformatics applications.

    PubMed

    Daniluk, Paweł; Wilczyński, Bartek; Lesyng, Bogdan

    2015-11-02

    One of the requirements for a successful scientific tool is its availability. Developing a functional web service, however, is usually considered a mundane and ungratifying task, and is quite often neglected. When bioinformatics applications are published, this attitude puts an additional burden on reviewers, who have to cope with poorly designed interfaces in order to assess the quality of the presented methods, and it impairs the actual usefulness of the tools to the scientific community at large. In this note we present WeBIAS, a simple, self-contained solution for making command-line programs accessible through web forms. It comprises a web portal capable of serving several applications and backend schedulers which carry out computations. The server handles user registration and authentication, stores queries and results, and provides a convenient administrator interface. WeBIAS is implemented in Python and available under the GNU Affero General Public License. It has been developed and tested on GNU/Linux-compatible platforms, which cover the vast majority of operational WWW servers. Since it is written in pure Python, it should also be easy to deploy on all other platforms supporting Python (e.g., Windows, Mac OS X). Documentation and source code, as well as a demonstration site, are available at http://bioinfo.imdik.pan.pl/webias . WeBIAS has been designed specifically with ease of installation and deployment of services in mind. Setting up a simple application requires minimal effort, yet it is possible to create visually appealing, feature-rich interfaces for query submission and presentation of results.

  7. EarthChem: International Collaboration for Solid Earth Geochemistry in Geoinformatics

    NASA Astrophysics Data System (ADS)

    Walker, J. D.; Lehnert, K. A.; Hofmann, A. W.; Sarbas, B.; Carlson, R. W.

    2005-12-01

    The current on-line information systems for igneous rock geochemistry - PetDB, GEOROC, and NAVDAT - convincingly demonstrate the value of rigorous scientific data management of geochemical data for research and education. The next generation of hypothesis formulation and testing can be vastly facilitated by enhancing these electronic resources through integration of available datasets, expansion of data coverage in location, time, and tectonic setting, timely updates with new data, and through intuitive and efficient access and data analysis tools for the broader geosciences community. PetDB, GEOROC, and NAVDAT have therefore formed the EarthChem consortium (www.earthchem.org) as an international collaborative effort to address these needs and serve the larger earth science community by facilitating the compilation, communication, serving, and visualization of geochemical data, and their integration with other geological, geochronological, geophysical, and geodetic information to maximize their scientific application. We report on the status of and future plans for EarthChem activities. EarthChem's development plan includes: (1) expanding the functionality of the web portal to become a "one-stop shop for geochemical data" with search capability across databases, standardized and integrated data output, generally applicable tools for data quality assessment, and data analysis/visualization including plotting methods and an information-rich map interface; and (2) expanding data holdings by generating new datasets as identified and prioritized through community outreach, and facilitating data contributions from the community by offering web-based data submission capability and technical assistance for design, implementation, and population of new databases and their integration with all EarthChem data holdings. Such federated databases and datasets will retain their identity within the EarthChem system.
We also plan on working with publishers to ease the assimilation of geochemical data into the EarthChem database. As a community resource, EarthChem will address user concerns and respond to broad scientific and educational needs. EarthChem will hold yearly workshops, town hall meetings, and/or exhibits at major meetings. The group has established a two-tier committee structure to help ease the communication and coordination of database and IT issues between existing data management projects, and to receive feedback and support from individuals and groups from the larger geosciences community.

  8. Development of an Overview Display to Allow Advanced Outage Control Center Management to Quickly Evaluate Outage Status

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    St Germain, Shawn Walter; Hugo, Jacques Victor

    This report describes recent advances made in developing a framework for the design of visual outage information presentation, as well as an overview of the scientific principles that informed the development of the visualizations.

  9. Chang'E-3 data pre-processing system based on scientific workflow

    NASA Astrophysics Data System (ADS)

    tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, the Chang'E-3 data pre-processing system (CEDPS), based on scientific workflow, is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct, and control of the data processing procedure with the following capabilities: • describing a data processing task, including: 1) defining the input and output data, 2) defining the data relationships, 3) defining the sequence of tasks, 4) defining the communication between tasks, 5) defining mathematical formulas, and 6) defining the relationships between tasks and data; • automatic processing of tasks. Accordingly, describing a task is the key point determining whether the system is flexible. We designed a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationships are established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge, and Fork, are compositional with one another; 3) to reduce the modeling complexity of the mathematical formulas using a DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data were processed with CEDPS.
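The DAG-based process model amounts to expressing each task's dependencies and letting a scheduler run tasks in a dependency-respecting order. A minimal sketch using Python's standard-library topological sorter (task names are hypothetical, not taken from CEDPS):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on; together they form a
# directed acyclic graph of the processing pipeline.
workflow = {
    "radiometric_correction": {"ingest"},
    "geometric_correction":  {"ingest"},
    "mosaic":                {"radiometric_correction", "geometric_correction"},
    "archive":               {"mosaic"},
}

# A topological order is a valid sequential execution plan; a scheduler
# could also run independent tasks (the two corrections) in parallel.
order = list(TopologicalSorter(workflow).static_order())
```

`TopologicalSorter` also raises `CycleError` on cyclic dependencies, which is exactly the validity check a workflow designer needs before execution.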

  10. Perform light and optic experiments in Augmented Reality

    NASA Astrophysics Data System (ADS)

    Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan; Javahiraly, Nicolas; Israel, Kai

    2015-10-01

    In many scientific studies lens experiments are part of the curriculum. The conducted experiments are meant to give the students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures, and different lens types. Therefore it is not possible for the students to conduct any of the experiments outside of the university's laboratory. Simple optical software simulators enabling the students to virtually perform lens experiments already exist, but are mostly desktop or web browser based. Augmented Reality (AR) is a special case of mediated and mixed reality concepts, where computers are used to add, subtract, or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can readily be used to visualize a simulated optical bench. The students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index, and the positions of the instruments in space. Light rays can be visualized and promote an additional understanding of the laws of optics. An AR application like this is ideally suited to prepare for the actual laboratory sessions and/or to recap the teaching content. The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
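At its core, the kind of simulation such an AR optics app performs reduces to the thin-lens and lensmaker's equations. A minimal sketch (function names are illustrative, not from the authors' application):

```python
def image_distance(f, d_o):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i: returns the image distance
    d_i for focal length f and object distance d_o (same units);
    a negative result indicates a virtual image."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

def focal_length(n, r1, r2):
    """Lensmaker's equation for a thin lens in air, with refractive index n
    and surface radii r1, r2 (sign convention: r2 < 0 for biconvex)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

f = focal_length(1.5, 0.10, -0.10)   # symmetric biconvex lens in metres
d_i = image_distance(f, 0.30)        # object placed 30 cm from the lens
```

Changing lens curvature or refractive index in the AR interface would simply re-evaluate these formulas and redraw the traced rays.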

  11. Space Weather Research in Armenia

    NASA Astrophysics Data System (ADS)

    Chilingarian, A. A.

    DVIN for ASEC (Data Visualization Interactive Network for the Aragats Space Environmental Center) is a product for accessing and analyzing on-line data from the solar monitors located at the high-altitude research station on Mt. Aragats in Armenia. Data from the ASEC monitors are used worldwide for scientific purposes and for monitoring severe solar storms in progress. An alert service, based on automatic analysis of variations of the different species of cosmic ray particles, is available for subscribers. DVIN's advantages: DVIN is strategically important as a scientific application to help develop space science and to foster global collaboration in forecasting potential hazards of solar storms. It fits precisely with the goals of the evolving information society: to provide long-term monitoring and collection of high-quality scientific data, and to enable adequate dialogue between scientists, decision makers, and civil society. The system is highly interactive, and exceptional information is easily accessible online. Data can be monitored and analyzed for desired time spans in a fast and reliable manner. The ASEC activity is an example of a balance between the scientific independence of fundamental research and the needs of civil society. DVIN is also an example of how scientific institutions can apply the newest powerful methods of information technology, such as multivariate data analysis, to their data, and of how information technologies can provide convenient and reliable access to these data and to new knowledge for the worldwide scientific community. DVIN provides very wide possibilities for sharing data and sending warnings and alerts to scientists and other entities worldwide that have a fundamental and practical interest in knowing the space weather conditions.

  12. Scientific Visualization and Simulation for Multi-dimensional Marine Environment Data

    NASA Astrophysics Data System (ADS)

    Su, T.; Liu, H.; Wang, W.; Song, Z.; Jia, Z.

    2017-12-01

    With growing attention on the ocean and the rapid development of marine detection, there are increasing demands for realistic simulation and interactive visualization of the marine environment in real time. Based on advanced technologies such as GPU rendering, CUDA parallel computing, and a rapid grid-oriented strategy, a series of efficient and high-quality visualization methods, which can deal with large-scale and multi-dimensional marine data in different environmental circumstances, is proposed in this paper. Firstly, a high-quality seawater simulation is realized with an FFT algorithm, bump mapping, and texture animation technology. Secondly, large-scale multi-dimensional marine hydrological environmental data are visualized with 3D interactive technologies and volume rendering techniques. Thirdly, seabed terrain data are simulated with an improved Delaunay algorithm, a surface reconstruction algorithm, a dynamic LOD algorithm, and GPU programming techniques. Fourthly, seamless real-time modelling of both ocean and land on a digital globe is achieved with the WebGL technique to meet the requirements of web-based applications. The experiments suggest that these methods not only produce a satisfying marine environment simulation effect, but also meet the rendering requirements of global multi-dimensional marine data. Additionally, a simulation system for underwater oil spills is established with the OSG 3D rendering engine. It is integrated with the marine visualization methods mentioned above and shows movement processes, physical parameters, and current velocity and direction for different types of deep-water oil spill particles (oil particles, hydrate particles, gas particles, etc.) dynamically and simultaneously in multiple dimensions. With such an application, valuable reference and decision-making information can be provided for understanding the progress of an oil spill in deep water, which is helpful for ocean disaster forecasting, warning, and emergency response.
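The FFT-based seawater synthesis mentioned above can be sketched in its simplest form: fill a spectrum with random phases and a wavenumber-dependent amplitude falloff, then a single inverse FFT yields a tileable height field. The spectrum below is purely illustrative, not the paper's ocean model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64                                  # grid resolution of the height field

# Wavenumber magnitude on the FFT grid.
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.hypot(kx, ky)
k[0, 0] = np.inf                        # suppress the DC component

# Toy spectrum: energy falls off with wavenumber; random phases decorrelate
# the waves. Real ocean renderers use a physically based spectrum instead.
amplitude = 1.0 / k**2
phase = np.exp(2j * np.pi * rng.random((n, n)))

height = np.fft.ifft2(amplitude * phase).real
```

Animating the phases over time, rather than drawing them once, is what turns this still field into moving water on the GPU.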

  13. Design and construction of a modular low-cost epifluorescence upright microscope for neuron visualized recording and fluorescence detection.

    PubMed

    Beltran-Parrazal, Luis; Morgado-Valle, Consuelo; Serrano, Raul E; Manzo, Jorge; Vergara, Julio L

    2014-03-30

    One of the limitations when establishing an electrophysiology setup, particularly in low-resource settings, is the high cost of microscopes. The average cost of a microscope equipped with the optics for infrared (IR) contrast or microfluorometry is $40,000. We hypothesized that many optical elements and features included in commercial microscopes are not necessary for IR video visualization of neurons or for microfluorometry. We present instructions for building a low-cost epifluorescence upright microscope suitable for visualized patch-clamp recording and fluorescence detection using mostly catalog-available parts. This microscope supports applications such as visualized whole-cell recording using IR oblique illumination (IR-OI), as well as more complex applications such as microfluorometry using a photodiode. In both IR-OI and fluorescence, the actual resolution measured with 2-μm latex beads is close to the theoretical resolution. The lack of movable parts to switch configurations ensures stability when doing intracellular recording. The low cost is a significant advantage of this microscope compared to existing custom-built microscopes: the cost of the simplest configuration with IR-OI is ∼$2000, whereas the cost of the configuration with epifluorescence is ∼$5000. Since this design does not use pieces discarded from commercial microscopes, it is completely reproducible. We suggest that this microscope is a viable alternative for doing in vitro electrophysiology and microfluorometry in low-resource settings. Characteristics such as the open-box design, easy assembly, and low cost make this microscope a useful instrument for science education and teaching in topics such as optics, biology, and neuroscience, and for scientific "hands-on" workshops.

  4. Exploring Metacognitive Visual Literacy Tasks for Teaching Astronomy

    NASA Astrophysics Data System (ADS)

    Slater, Timothy F.; Slater, S.; Dwyer, W.

    2010-01-01

    Undoubtedly, astronomy is a scientific enterprise which often results in colorful and inspirational images of the cosmos that naturally capture our attention. Students encountering astronomy in the college classroom are often bombarded with images, movies, simulations, conceptual cartoons, graphs, and charts intended to convey the substance and technological advancement inherent in astronomy. For students who self-identify as visual learners, this aspect can make the science of astronomy come alive. For students who naturally attend to visual aesthetics, this aspect can make astronomy seem relevant. In other words, the visual nature that accompanies much of the scientific realm of astronomy has the ability to connect a wide range of students to science, not just the few who have great abilities and inclinations toward the mathematical-analysis world. Indeed, this is fortunate for teachers of astronomy, who actively try to find ways to connect and build astronomical understanding with a broad range of student interests, motivations, and abilities. In the context of learning science, metacognition describes students' self-monitoring, -regulation, and -awareness when thinking about learning. As such, metacognition is one of the foundational pillars supporting what we know about how people learn. Yet the astronomy teaching and learning community knows very little about how to operationalize and support students' metacognition in the classroom. In response, the Conceptual Astronomy, Physics and Earth sciences Research (CAPER) Team is developing and pilot-testing metacognitive tasks in the context of astronomy that focus on visual literacy of astronomical phenomena. In the initial versions, students are presented with a scientifically inaccurate narrative supposedly describing visual information, including images and graphical information, and are asked to assess and correct the narrative in the form of a peer evaluation.
To guide student thinking, students are provided with a scaffolded series of multiple-choice questions highlighting conceptual aspects of the prompt.

  15. Decision support system for emergency management of oil spill accidents in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Liubartseva, Svitlana; Coppini, Giovanni; Pinardi, Nadia; De Dominicis, Michela; Lecci, Rita; Turrisi, Giuseppe; Cretì, Sergio; Martinelli, Sara; Agostini, Paola; Marra, Palmalisa; Palermo, Francesco

    2016-08-01

    This paper presents an innovative web-based decision support system to facilitate emergency management in the case of oil spill accidents, called WITOIL (Where Is The Oil). The system can be applied to create a forecast of oil spill events, evaluate uncertainty of the predictions, and calculate hazards based on historical meteo-oceanographic datasets. To compute the oil transport and transformation, WITOIL uses the MEDSLIK-II oil spill model forced by operational meteo-oceanographic services. Results of the modeling are visualized through Google Maps. A special application for Android is designed to provide mobile access for competent authorities, technical and scientific institutions, and citizens.
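A minimal sketch (not MEDSLIK-II itself, whose physics is far richer) of the Lagrangian idea behind such oil spill forecasts: particles representing oil parcels are advected by the current field and spread by a random-walk term standing in for turbulent diffusion.

```python
import numpy as np

rng = np.random.default_rng(2)

def advect(positions, current, dt, diffusivity):
    """One explicit time step: displacement = current * dt plus a random
    walk whose per-component variance is 2 * K * dt (K = diffusivity)."""
    noise = rng.normal(0.0, np.sqrt(2 * diffusivity * dt), positions.shape)
    return positions + current * dt + noise

particles = np.zeros((1000, 2))        # spill released at the origin (m)
current = np.array([0.2, 0.05])        # uniform current, m/s, illustrative
for _ in range(100):                   # 100 steps of one minute each
    particles = advect(particles, current, dt=60.0, diffusivity=1.0)
```

In an operational system the uniform `current` would be replaced by fields interpolated from the meteo-oceanographic forecast, and the particle cloud would be rendered on the map interface.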

  16. ART AND SCIENCE OF IMAGE MAPS.

    USGS Publications Warehouse

    Kidwell, Richard D.; McSweeney, Joseph A.

    1985-01-01

    The visual image of reflected light is influenced by the complex interplay of human color discrimination, spatial relationships, surface texture, and the spectral purity of light, dyes, and pigments. Scientific theories of image processing may not always achieve acceptable results, as the variety of factors, some psychological, are in part unpredictable. The tonal relationships that affect digital image processing, and the transfer functions used to transform the continuous-tone source image into a lithographic image, may be interpreted for insight into where art and science fuse in the production process. The application of art and science in image map production at the U.S. Geological Survey is illustrated and discussed.

  17. Adjustable lossless image compression based on a natural splitting of an image into drawing, shading, and fine-grained components

    NASA Technical Reports Server (NTRS)

    Novik, Dmitry A.; Tilton, James C.

    1993-01-01

    The compression, or efficient coding, of single band or multispectral still images is becoming an increasingly important topic. While lossy compression approaches can produce reconstructions that are visually close to the original, many scientific and engineering applications require exact (lossless) reconstructions. However, the most popular and efficient lossless compression techniques do not fully exploit the two-dimensional structural links existing in the image data. We describe here a general approach to lossless data compression that effectively exploits two-dimensional structural links of any length. After describing in detail two main variants on this scheme, we discuss experimental results.
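An illustrative sketch (not the authors' scheme) of why exploiting two-dimensional structure helps lossless coding: predict each pixel from its left and upper neighbours and store only the residuals. The transform is exactly invertible, and for smooth imagery the residuals are far more compressible than the raw values.

```python
import numpy as np

def to_residuals(img):
    """Replace each pixel of an integer image with its prediction residual,
    predicting from the upper and left neighbours (averaged where both
    exist). Exactly invertible by from_residuals()."""
    pred = np.zeros_like(img)
    pred[1:, :] += img[:-1, :]          # upper neighbour
    pred[:, 1:] += img[:, :-1]          # left neighbour
    pred[1:, 1:] //= 2                  # interior pixels: average of the two
    return img - pred

def from_residuals(res):
    """Reconstruct the image in raster order, re-deriving each prediction
    from already-decoded pixels."""
    img = np.zeros_like(res)
    for i in range(res.shape[0]):
        for j in range(res.shape[1]):
            if i and j:
                p = (img[i - 1, j] + img[i, j - 1]) // 2
            elif i:
                p = img[i - 1, j]
            elif j:
                p = img[i, j - 1]
            else:
                p = 0
            img[i, j] = res[i, j] + p
    return img

img = np.arange(16, dtype=np.int64).reshape(4, 4) * 3
assert np.array_equal(from_residuals(to_residuals(img)), img)  # lossless
```

An entropy coder applied to the residuals would then complete a simple lossless pipeline; longer-range 2D structural links, as the abstract describes, generalize the neighbourhood used by the predictor.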

  18. Grid computing technology for hydrological applications

    NASA Astrophysics Data System (ADS)

    Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.

    2011-06-01

    Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, either oriented to tackle scientific challenges or complex problem solving in engineering, are expected to converge into so-called knowledge infrastructures, leading to more effective research, education, and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues, that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large relating to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management, and Black Sea hydrological survey, by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.

  19. Vergence-accommodation conflicts hinder visual performance and cause visual fatigue.

    PubMed

    Hoffman, David M; Girshick, Ahna R; Akeley, Kurt; Banks, Martin S

    2008-03-28

    Three-dimensional (3D) displays have become important for many applications including vision research, operation of remote devices, medical imaging, surgical training, scientific visualization, virtual prototyping, and more. In many of these applications, it is important for the graphic image to create a faithful impression of the 3D structure of the portrayed object or scene. Unfortunately, 3D displays often yield distortions in perceived 3D structure compared with the percepts of the real scenes the displays depict. A likely cause of such distortions is the fact that computer displays present images on one surface. Thus, focus cues (accommodation and blur in the retinal image) specify the depth of the display rather than the depths in the depicted scene. Additionally, the uncoupling of vergence and accommodation required by 3D displays frequently reduces one's ability to fuse the binocular stimulus and causes discomfort and fatigue for the viewer. We have developed a novel 3D display that presents focus cues that are correct or nearly correct for the depicted scene. We used this display to evaluate the influence of focus cues on perceptual distortions, fusion failures, and fatigue. We show that when focus cues are correct or nearly correct, (1) the time required to identify a stereoscopic stimulus is reduced, (2) stereoacuity in a time-limited task is increased, (3) distortions in perceived depth are reduced, and (4) viewer fatigue and discomfort are reduced. We discuss the implications of this work for vision research and the design and use of displays.
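The conflict the paper studies can be quantified with a back-of-envelope calculation: on a conventional 3D display the eyes accommodate to the screen while converging to the depicted depth, and the mismatch is naturally expressed in diopters (1/distance in metres). The values below are illustrative, not the paper's stimuli:

```python
def conflict_diopters(screen_dist_m, depicted_dist_m):
    """Accommodation demand is fixed at the screen surface while vergence
    demand follows the simulated depth; return the absolute mismatch in
    diopters (1/m)."""
    return abs(1.0 / screen_dist_m - 1.0 / depicted_dist_m)

# An object rendered 1 m behind a screen viewed at 0.5 m:
mismatch = conflict_diopters(0.5, 1.5)
```

A multi-plane display of the kind the authors built reduces this mismatch by presenting image content near its depicted focal distance, driving the conflict toward zero.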

  20. Sciologer: Visualizing and Exploring Scientific Communities

    ERIC Educational Resources Information Center

    Bales, Michael Eliot

    2009-01-01

    Despite the recognized need to increase interdisciplinary collaboration, there are few information resources available to provide researchers with an overview of scientific communities--topics under investigation by various groups, and patterns of collaboration among groups. The tools that are available are designed for expert social network…

  1. The VIS-AD data model: Integrating metadata and polymorphic display with a scientific programming language

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Dyer, Charles R.; Paul, Brian E.

    1994-01-01

    The VIS-AD data model integrates metadata about the precision of values, including missing data indicators and the way that arrays sample continuous functions, with the data objects of a scientific programming language. The data objects of this data model form a lattice, ordered by the precision with which they approximate mathematical objects. We define a similar lattice of displays and study visualization processes as functions from data lattices to display lattices. Such functions can be applied to visualize data objects of all data types and are thus polymorphic.

  2. The Methods of Cognitive Visualization for the Astronomical Databases Analyzing Tools Development

    NASA Astrophysics Data System (ADS)

    Vitkovskiy, V.; Gorohov, V.

    2008-08-01

    There are two kinds of computer graphics: illustrative and cognitive. Appropriate cognitive pictures not only make the sense of complex and difficult scientific concepts evident and clear, but also promote, and not so very rarely, the birth of new knowledge. On the basis of the cognitive graphics concept, we developed a software system for visualization and analysis. It helps train and sharpen the intuition of the researcher, raise his interest in and motivation for creative scientific cognition, and realize a process of dialogue with the problems themselves.

  3. The Role of Visual Representations in Scientific Practices: From Conceptual Understanding and Knowledge Generation to 'Seeing' How Science Works

    ERIC Educational Resources Information Center

    Evagorou, Maria; Erduran, Sibel; Mäntylä, Terhi

    2015-01-01

    Background: The use of visual representations (i.e., photographs, diagrams, models) has been part of science, and their use makes it possible for scientists to interact with and represent complex phenomena, not observable in other ways. Despite a wealth of research in science education on visual representations, the emphasis of such research has…

  4. ESA Atmospheric Toolbox

    NASA Astrophysics Data System (ADS)

    Niemeijer, Sander

    2017-04-01

The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission, it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, the Copernicus Atmosphere Monitoring Service (CAMS), and ground-based data. The toolbox consists of three main components: CODA, HARP, and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing, and inter-comparing satellite remote sensing data, model data, in-situ data, and ground-based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to the HARP command line tools, one can pre-process datasets so that two datasets that need to be compared end up having the same temporal/spatial grid, the same data format/structure, and the same physical units. The toolkit comes with its own data format conventions, the HARP format, which is based on netCDF/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data, such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, and collocation of two datasets.
VISAN is a cross-platform visualization and analysis application for atmospheric data and can be used to visualize and analyze the data that you retrieve using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN will allow you to directly visualize the ingested data. All components from the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/
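The kind of pre-processing that chained HARP tools perform can be sketched in a few lines of plain Python (a toy illustration, not HARP's actual API): two time series are resampled onto a common temporal grid and converted to a common unit before comparison.

```python
# Toy sketch of HARP-style pre-processing (illustrative only, not the
# HARP API): bring two datasets onto the same grid and the same unit so
# that they can be compared point by point.

def regrid_nearest(times, values, target_grid):
    """Nearest-neighbour resampling of a time series onto a target grid."""
    out = []
    for t in target_grid:
        i = min(range(len(times)), key=lambda j: abs(times[j] - t))
        out.append(values[i])
    return out

def convert_unit(values, factor):
    """Scale values by a unit-conversion factor (e.g. hPa -> Pa is 100)."""
    return [v * factor for v in values]

# Example: dataset A in hPa on one time grid, dataset B in Pa on another.
grid = [0.0, 1.0, 2.0]
a = convert_unit(regrid_nearest([0.1, 0.9, 2.1], [1000.0, 990.0, 980.0], grid), 100)
b = regrid_nearest([0.0, 1.0, 2.0], [100000.0, 99000.0, 98000.0], grid)
diff = [x - y for x, y in zip(a, b)]  # comparable point by point
```

HARP's real operations (averaging-kernel smoothing, collocation, quantity conversions) are far richer, but they follow this same harmonize-then-compare pattern.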

  5. REMOTE SENSING, VISUALIZATION AND DECISION SUPPORT FOR WATERSHED MANAGEMENT AND SUSTAINABLE AGRICULTURE

    EPA Science Inventory

    The integration of satellite and airborne remote sensing, scientific visualization and decision support tools is discussed within the context of management techniques for minimizing the non-point source pollution load of inland waterways and the sustainability of food crop produc...

  6. SensorWeb Hub infrastructure for open access to scientific research data

    NASA Astrophysics Data System (ADS)

    de Filippis, Tiziana; Rocchi, Leandro; Rapisardi, Elena

    2015-04-01

The sharing of research data is a new challenge for a scientific community that stands to benefit from a large amount of information for addressing environmental issues and sustainability in agriculture and urban contexts. A prerequisite for meeting this challenge is an infrastructure that ensures access to, management of, and preservation of data, together with technical support for coordinated and harmonious data management that, in the framework of Open Data policies, encourages reuse and collaboration. The neogeography and citizens-as-sensors approaches highlight that new data sources require a new set of tools and practices for collecting, validating, categorizing, and accessing "crowdsourced" data, which complement the datasets produced in the scientific field and thus feed the overall pool of data available for analysis and research. If the scientific community is to embrace collaboration, sharing, access, and re-use, and thereby adopt an open-innovation approach, it must redesign and reshape its data-management processes: the technological and cultural innovation enabled by Web 2.0 technologies leads to a scenario in which sharing structured, interoperable data becomes the essential building block of a new paradigm of scientific research. In this perspective the Institute of Biometeorology, CNR, whose aim is to contribute to the sharing and development of research data, has developed the "SensorWebHub" (SWH) infrastructure to support the scientific activities carried out in several national and international research projects. It is designed to manage both mobile and fixed open-source meteorological and environmental sensors, in order to integrate the existing agro-meteorological and urban monitoring networks.
The proposed architecture uses open source tools to ensure sustainability in the development and deployment of web applications with geographic features and custom analyses, as requested by the different research projects. The SWH components are organized in a typical client-server architecture and interact from the sensing process through to the presentation of results to end users. The web application enables users to view and analyse the data stored in the GeoDB. The interface follows Internet browser specifications and allows visualization of the collected data in different formats (tabular, chart, and geographic map). The services for disseminating geo-referenced information adopt the OGC specifications. SWH is a bottom-up collaborative initiative to share real-time research data and pave the way for an open-innovation approach in scientific research. To date this framework has been used for several WebGIS applications and web apps for environmental monitoring at different temporal and spatial scales.

  7. A Computational framework for telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; von Laszewski, G.; Thiruvathukal, G. K.

    1998-07-01

Emerging telemedicine applications require the ability to exploit diverse and geographically distributed resources. High-speed networks are used to integrate advanced visualization devices, sophisticated instruments, large databases, archival storage devices, PCs, workstations, and supercomputers. This form of telemedical environment is similar to networked virtual supercomputers, also known as metacomputers. Metacomputers are already being used in many scientific application areas. In this article, we analyze requirements necessary for a telemedical computing infrastructure and compare them with requirements found in a typical metacomputing environment. We show that metacomputing environments can be used to enable a more powerful and unified computational infrastructure for telemedicine. The Globus metacomputing toolkit can provide the necessary low-level mechanisms to enable a large-scale telemedical infrastructure. The Globus toolkit components are designed in a modular fashion and can be extended to support the specific requirements of telemedicine.

  8. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  9. New trend in electron holography

    NASA Astrophysics Data System (ADS)

    Tanigaki, Toshiaki; Harada, Ken; Murakami, Yasukazu; Niitsu, Kodai; Akashi, Tetsuya; Takahashi, Yoshio; Sugawara, Akira; Shindo, Daisuke

    2016-06-01

Electron holography using a coherent electron wave is a promising technique for high-resolution visualization of electromagnetic fields in and around objects. The capability of electron holography has been enhanced by the development of new technologies and has thus become an even more powerful tool for exploring scientific frontiers. This review introduces these technologies, including split-illumination electron holography and vector-field electron tomography. Split-illumination electron holography, which uses separated coherent waves, overcomes the limits imposed by the lateral coherence requirement for electron waves in electron holography. Areas that are difficult to observe using conventional electron holography are now observable. Exemplified applications include observing a singular magnetic domain wall in electrical steel sheets, local magnetizations at anti-phase boundaries, and electrostatic potentials in metal-oxide-semiconductor field-effect transistors. Vector-field electron tomography can be used to visualize magnetic vectors in three dimensions. Two components of the vectors are reconstructed using dual-axis tomography, and the remaining one is calculated using div B = 0. A high-voltage electron microscope can be used to achieve precise magnetic reconstruction. For example, magnetic vortices have been visualized using a 1 MV holography electron microscope.
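Recovering the third field component from div B = 0, as described above, amounts to integrating dBz/dz = -(dBx/dx + dBy/dy) along z. A minimal finite-difference sketch follows (illustrative only; the boundary condition Bz = 0 at z = 0 is an assumption, not stated in the abstract):

```python
# Sketch: recover Bz from the solenoidal condition div B = 0, i.e.
# dBz/dz = -(dBx/dx + dBy/dy), by integrating along z from an assumed
# boundary value Bz(z=0) = 0. The in-plane divergence is supplied as a
# callable of z; in a real reconstruction it would come from the two
# tomographically measured components.

def reconstruct_bz(inplane_div, dz, nz, bz0=0.0):
    """Integrate dBz/dz = -inplane_div(z) with step dz; return Bz per level."""
    bz = [bz0]
    for k in range(nz):
        bz.append(bz[-1] - inplane_div(k * dz) * dz)
    return bz

# Example: Bx = x, By = y gives dBx/dx + dBy/dy = 2, so Bz = -2z exactly.
levels = reconstruct_bz(lambda z: 2.0, dz=0.1, nz=10)
```

For this linear test field the rectangle-rule integration is exact up to floating-point rounding, giving Bz = -2.0 at z = 1.0.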

  10. Web based tools for data manipulation, visualisation and validation with interactive georeferenced graphs

    NASA Astrophysics Data System (ADS)

    Ivankovic, D.; Dadic, V.

    2009-04-01

Some oceanographic parameters have to be inserted into the database manually; others (for example, data from a CTD probe) are loaded from various files. All these parameters require visualization, validation, and manipulation from the research vessel or the scientific institution, as well as public presentation. For these purposes a web-based system was developed, containing dynamic SQL procedures and Java applets. The technological background is an Oracle 10g relational database and Oracle Application Server. Web interfaces are developed using PL/SQL stored database procedures (mod_plsql). Additional components for data visualization include Java applets and JavaScript. The mapping tool is the Google Maps API (JavaScript), with a Java applet as an alternative. Each graph is realized as a dynamically generated web page containing a Java applet. Both the mapping tool and the graphs are georeferenced: a click on part of a graph automatically triggers a zoom or places a marker at the location where the parameter was measured. This feature is very useful for data validation. The code for data manipulation and visualization is partially realized with dynamic SQL, which allows us to separate data definitions from the code that manipulates the data. Adding a new parameter to the system requires only a data definition and description, without programming an interface for that kind of data.
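The point that adding a new parameter needs only a data definition, not new code, can be illustrated with a small self-contained sketch (using SQLite instead of Oracle, and hypothetical table names): a generic measurement table joined to a parameter-definition table lets one query serve every registered parameter.

```python
# Sketch of a data-definition-driven design (hypothetical schema, SQLite
# stand-in for the Oracle backend described in the abstract).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE param_def (name TEXT PRIMARY KEY, unit TEXT, descr TEXT);
CREATE TABLE measurement (param TEXT, lat REAL, lon REAL, value REAL);
""")

# Registering a new parameter is pure data definition -- no new code path.
conn.execute("INSERT INTO param_def VALUES ('sea_temp', 'degC', 'Sea temperature')")
conn.executemany(
    "INSERT INTO measurement VALUES (?, ?, ?, ?)",
    [("sea_temp", 43.5, 16.4, 18.2), ("sea_temp", 43.0, 16.0, 17.9)],
)

def fetch(param):
    """One generic query serves every registered parameter."""
    return conn.execute(
        "SELECT m.lat, m.lon, m.value, d.unit FROM measurement m "
        "JOIN param_def d ON d.name = m.param WHERE m.param = ?",
        (param,),
    ).fetchall()
```

A second parameter (say, salinity) would need only one more `param_def` row; `fetch` and the visualization layer on top of it stay unchanged.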

  11. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  12. A systematic review of phacoemulsification cataract surgery in virtual reality simulators.

    PubMed

    Lam, Chee Kiang; Sundaraj, Kenneth; Sulaiman, Mohd Nazri

    2013-01-01

    The aim of this study was to review the capability of virtual reality simulators in the application of phacoemulsification cataract surgery training. Our review included the scientific publications on cataract surgery simulators that had been developed by different groups of researchers along with commercialized surgical training products, such as EYESI® and PhacoVision®. The review covers the simulation of the main cataract surgery procedures, i.e., corneal incision, capsulorrhexis, phacosculpting, and intraocular lens implantation in various virtual reality surgery simulators. Haptics realism and visual realism of the procedures are the main elements in imitating the actual surgical environment. The involvement of ophthalmology in research on virtual reality since the early 1990s has made a great impact on the development of surgical simulators. Most of the latest cataract surgery training systems are able to offer high fidelity in visual feedback and haptics feedback, but visual realism, such as the rotational movements of an eyeball with response to the force applied by surgical instruments, is still lacking in some of them. The assessment of the surgical tasks carried out on the simulators showed a significant difference in the performance before and after the training.

  13. PubMed Central

    Hackethal, Andreas; Hirschburger, Markus; Eicker, Sven Oliver; Mücke, Thomas; Lindner, Christoph; Buchweitz, Olaf

    2018-01-01

Modern surgical strategies aim to reduce trauma by using functional imaging to improve surgical outcomes. This review considers and evaluates the importance of the fluorescent dye indocyanine green (ICG) for visualizing lymph nodes, lymphatic pathways, vessels, and tissue borders in an interdisciplinary setting. The work is based on a selective search of the literature in PubMed, Scopus, and Google Scholar and the authors' own clinical experience. Because of its simple, radiation-free and uncomplicated application, ICG has become an important clinical indicator in recent years. In oncologic surgery ICG is used extensively to identify sentinel lymph nodes with promising results. In some studies, the detection rates with ICG have been better than the rates obtained with established procedures. When ICG is used for visualization and the quantification of tissue perfusion, it can lead to fewer cases of anastomotic insufficiency or transplant necrosis. The use of ICG for the imaging of organ borders, flap plasty borders and postoperative vascularization has also been scientifically evaluated. Combining the easily applied ICG dye with technical options for intraoperative and interventional visualization has the potential to create new functional imaging procedures which, in future, could expand or even replace existing established surgical techniques, particularly the techniques used for sentinel lymph node and anastomosis imaging. PMID:29375146

  14. Analysis and Visualization of ChIP-Seq and RNA-Seq Sequence Alignments Using ngs.plot.

    PubMed

    Loh, Yong-Hwee Eddie; Shen, Li

    2016-01-01

    The continual maturation and increasing applications of next-generation sequencing technology in scientific research have yielded ever-increasing amounts of data that need to be effectively and efficiently analyzed and innovatively mined for new biological insights. We have developed ngs.plot-a quick and easy-to-use bioinformatics tool that performs visualizations of the spatial relationships between sequencing alignment enrichment and specific genomic features or regions. More importantly, ngs.plot is customizable beyond the use of standard genomic feature databases to allow the analysis and visualization of user-specified regions of interest generated by the user's own hypotheses. In this protocol, we demonstrate and explain the use of ngs.plot using command line executions, as well as a web-based workflow on the Galaxy framework. We replicate the underlying commands used in the analysis of a true biological dataset that we had reported and published earlier and demonstrate how ngs.plot can easily generate publication-ready figures. With ngs.plot, users would be able to efficiently and innovatively mine their own datasets without having to be involved in the technical aspects of sequence coverage calculations and genomic databases.
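The core computation behind such enrichment plots can be sketched in plain Python (a toy model, not ngs.plot's implementation): average per-base coverage in fixed windows centred on a set of genomic features, such as transcription start sites.

```python
# Toy sketch of a feature-centred coverage profile, the kind of
# calculation ngs.plot automates (function name and interface are
# illustrative assumptions, not ngs.plot's API).

def average_profile(coverage, centers, flank):
    """Average per-base coverage in windows of +/-flank around feature centers."""
    width = 2 * flank + 1
    profile = [0.0] * width
    n = 0
    for c in centers:
        if c - flank < 0 or c + flank >= len(coverage):
            continue  # skip features too close to the contig edge
        for k in range(width):
            profile[k] += coverage[c - flank + k]
        n += 1
    return [p / n for p in profile]

# Example: a coverage peak centred on a single feature at position 50.
coverage = [0.0] * 100
for p in range(48, 53):
    coverage[p] = 10.0
profile = average_profile(coverage, centers=[50], flank=5)
```

Averaging over thousands of features in this way is what turns noisy per-site coverage into the smooth, publication-ready enrichment curves the protocol describes.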

  15. Situated cognition in clinical visualization: the role of transparency in GammaKnife neurosurgery planning.

    PubMed

    Dinka, David; Nyce, James M; Timpka, Toomas

    2009-06-01

    The aim of this study was to investigate how the clinical use of visualization technology can be advanced by the application of a situated cognition perspective. The data were collected in the GammaKnife radiosurgery setting and analyzed using qualitative methods. Observations and in-depth interviews with neurosurgeons and physicists were performed at three clinics using the Leksell GammaKnife. The users' ability to perform cognitive tasks was found to be reduced each time visualizations incongruent with the particular user's perception of clinical reality were used. The main issue here was a lack of transparency, i.e. a black box problem where machine representations "stood between" users and the cognitive tasks they wanted to perform. For neurosurgeons, transparency meant their previous experience from traditional surgery could be applied, i.e. that they were not forced to perform additional cognitive work. From the view of the physicists, on the other hand, the concept of transparency was associated with mathematical precision and avoiding creating a cognitive distance between basic patient data and what is experienced as clinical reality. The physicists approached clinical visualization technology as though it was a laboratory apparatus--one that required continual adjustment and assessment in order to "capture" a quantitative clinical reality. Designers of visualization technology need to compare the cognitive interpretations generated by the new visualization systems to conceptions generated during "traditional" clinical work. This means that the viewpoint of different clinical user groups involved in a given clinical task would have to be taken into account as well. A way forward would be to acknowledge that visualization is a socio-cognitive function that has practice-based antecedents and consequences, and to reconsider what analytical and scientific challenges this presents us with.

  16. The disappearing third dimension.

    PubMed

    Rowe, Timothy; Frank, Lawrence R

    2011-02-11

    Three-dimensional computing is driving what many would call a revolution in scientific visualization. However, its power and advancement are held back by the absence of sustainable archives for raw data and derivative visualizations. Funding agencies, professional societies, and publishers each have unfulfilled roles in archive design and data management policy.

  17. A New System To Support Knowledge Discovery: Telemakus.

    ERIC Educational Resources Information Center

    Revere, Debra; Fuller, Sherrilynne S.; Bugni, Paul F.; Martin, George M.

    2003-01-01

    The Telemakus System builds on the areas of concept representation, schema theory, and information visualization to enhance knowledge discovery from scientific literature. This article describes the underlying theories and an overview of a working implementation designed to enhance the knowledge discovery process through retrieval, visual and…

  18. Using a free software tool for the visualization of complicated electromagnetic fields

    NASA Astrophysics Data System (ADS)

    Murello, A.; Milotti, E.

    2014-01-01

    Here, we show how a readily available and free scientific visualization program—ParaView—can be used to display electric fields in interesting situations. We give a few examples and specify the individual steps that lead to highly educational representations of the fields.

  19. A Technology-Enhanced Unit of Modeling Static Electricity: Integrating scientific explanations and everyday observations

    NASA Astrophysics Data System (ADS)

    Shen, Ji; Linn, Marcia C.

    2011-08-01

    What trajectories do students follow as they connect their observations of electrostatic phenomena to atomic-level visualizations? We designed an electrostatics unit, using the knowledge integration framework to help students link observations and scientific ideas. We analyze how learners integrate ideas about charges, charged particles, energy, and observable events. We compare learning enactments in a typical school and a magnet school in the USA. We use pre-tests, post-tests, embedded notes, and delayed post-tests to capture the trajectories of students' knowledge integration. We analyze how visualizations help students grapple with abstract electrostatics concepts such as induction. We find that overall students gain more sophisticated ideas. They can interpret dynamic, interactive visualizations, and connect charge- and particle-based explanations to interpret observable events. Students continue to have difficulty in applying the energy-based explanation.

  20. A Collaborative Education Network for Advancing Climate Literacy using Data Visualization Technology

    NASA Astrophysics Data System (ADS)

    McDougall, C.; Russell, E. L.; Murray, M.; Bendel, W. B.

    2013-12-01

    One of the more difficult issues in engaging broad audiences with scientific research is to present it in a way that is intuitive, captivating and up-to-date. Over the past ten years, the National Oceanic and Atmospheric Administration (NOAA) has made significant progress in this area through Science On a Sphere(R) (SOS). SOS is a room-sized, global display system that uses computers and video projectors to display Earth systems data onto a six-foot diameter sphere, analogous to a giant animated globe. This well-crafted data visualization system serves as a way to integrate and display global change phenomena; including polar ice melt, projected sea level rise, ocean acidification and global climate models. Beyond a display for individual data sets, SOS provides a holistic global perspective that highlights the interconnectedness of Earth systems, nations and communities. SOS is now a featured exhibit at more than 100 science centers, museums, universities, aquariums and other institutions around the world reaching more than 33 million visitors every year. To facilitate the development of how this data visualization technology and these visualizations could be used with public audiences, we recognized the need for the exchange of information among the users. To accomplish this, we established the SOS Users Collaborative Network. This network consists of the institutions that have an SOS system or partners who are creating content and educational programming for SOS. When we began the Network in 2005, many museums had limited capacity to both incorporate real-time, authentic scientific data about the Earth system and interpret global change visualizations. They needed not only the visualization platform and the scientific content, but also assistance with methods of approach. We needed feedback from these users on how to craft understandable visualizations and how to further develop the SOS platform to support learning. 
Through this Network and the collaboration among members, we have, collectively, been able to advance all of our efforts. The member institutions, through regular face-to-face workshops and an online community, share practices in creation and cataloging of datasets, new methods for delivering content via SOS, and updates on the SOS system and software. One hallmark of the SOS Users Collaborative Network is that it exemplifies an ideal partnership between federal science agencies and informal science education institutions. The science agencies (including NOAA, NASA, and the Department of Energy) provide continuously updated global datasets, scientific expertise, funding, and support. In turn, museums act as trusted public providers of scientific information, provide audience-appropriate presentations, localized relevance to global phenomena and a forum for discussing the complex science and repercussions of global change. We will discuss the characteristics of this Network that maximize collaboration and what we're learning as a community to improve climate literacy.

  1. Operating principles and detection characteristics of the Visible and Near-Infrared Imaging Spectrometer in the Chang'e-3

    NASA Astrophysics Data System (ADS)

    He, Zhi-Ping; Wang, Bin-Yong; Lü, Gang; Li, Chun-Lai; Yuan, Li-Yin; Xu, Rui; Liu, Bin; Chen, Kai; Wang, Jian-Yu

    2014-12-01

The Visible and Near-Infrared Imaging Spectrometer (VNIS), using two acousto-optic tunable filters as dispersive components, consists of a VIS/NIR imaging spectrometer (0.45-0.95 μm), a shortwave IR spectrometer (0.9-2.4 μm) and a calibration unit with dust-proofing functionality. The VNIS was used to detect the spectrum of the lunar surface and to achieve in-orbit calibration, satisfying the requirements for scientific detection. Mounted at the front of the Yutu rover, the VNIS detects lunar objects at a 45° visual angle, obtaining spectral and geometrical data for analyzing the mineral composition of the lunar surface. After landing successfully on the Moon, the VNIS performed several explorations and calibrations, and obtained several spectral images and spectral reflectance curves of the lunar soil in the region of Mare Imbrium. This paper describes the working principle and detection characteristics of the VNIS and provides a reference for data processing and scientific applications.

  2. ObsPy: A Python Toolbox for Seismology - Recent Developments and Applications

    NASA Astrophysics Data System (ADS)

    Megies, T.; Krischer, L.; Barsch, R.; Sales de Andrade, E.; Beyreuther, M.

    2014-12-01

ObsPy (http://www.obspy.org) is a community-driven, open-source project dedicated to building a bridge for seismology into the scientific Python ecosystem. It offers (a) read and write support for essentially all commonly used waveform, station, and event metadata file formats with a unified interface, (b) a comprehensive signal processing toolbox tuned to the needs of seismologists, (c) integrated access to all large data centers, web services, and databases, and (d) convenient wrappers to legacy codes such as libtau and evalresp. Python, currently the most popular language for teaching introductory computer science courses at top-ranked U.S. departments, is a full-blown programming language with the flexibility of an interactive scripting language. Its extensive standard library and large variety of freely available high-quality scientific modules cover most needs in developing scientific processing workflows. Together with packages like NumPy, SciPy, Matplotlib, IPython, Pandas, lxml, and PyQt, ObsPy enables the construction of complete workflows in Python. These vary from reading locally stored data or requesting data from one or more data centers, through signal analysis and data processing, to visualizations in GUI and web applications, output of modified or derived data, and the creation of publication-quality figures. ObsPy enjoys a large world-wide rate of adoption in the community. Applications successfully using it include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. All functionality is extensively documented, and the ObsPy tutorial and gallery give a good impression of the wide range of possible use cases. We will present the basic features of ObsPy, new developments and applications, and a roadmap for the near future, and discuss the sustainability of our open-source development model.
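The "unified interface" to many file formats mentioned in point (a) follows a dispatch pattern that can be sketched generically (the reader functions and format detection below are hypothetical illustrations, not ObsPy's internal code):

```python
# Toy sketch of a unified reader facade in the spirit of ObsPy's read():
# one entry point dispatches to per-format parsers, so client code never
# depends on the storage format. Reader names and the extension-based
# detection are assumptions for illustration only.

def _read_sac(path):
    return {"format": "SAC", "path": path}

def _read_mseed(path):
    return {"format": "MiniSEED", "path": path}

_READERS = {".sac": _read_sac, ".mseed": _read_mseed}

def read(path):
    """Single entry point: detect the format and delegate to its parser."""
    for ext, reader in _READERS.items():
        if path.endswith(ext):
            return reader(path)
    raise ValueError("unsupported format: " + path)
```

Adding support for a new format means registering one more parser in the table; all downstream processing code keeps calling the same `read()`.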

  3. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. 
The underlying LandCarbon data is available through an open application programming interface (API), which will allow other organizations to build their own custom applications and tools. New features such as finer scale aggregation and an online carbon calculator are being added to the LandCarbon web application to continue to make the site interactive, visually compelling, and useful for a wide range of users.
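
The abstract describes an open API serving assessment results to interactive D3 charts. As a minimal sketch (the field names and figures below are hypothetical, not LandCarbon's actual schema), per-ecoregion, per-scenario carbon estimates might be shaped into a JSON payload such a chart could consume:

```python
import json
from collections import defaultdict

# Hypothetical assessment records: (ecoregion, IPCC scenario, carbon storage in Tg C).
RECORDS = [
    ("Western Cordillera", "A1B", 410.0),
    ("Western Cordillera", "A2", 395.5),
    ("Western Cordillera", "B1", 432.1),
    ("Mississippi Alluvial Plain", "A1B", 210.3),
    ("Mississippi Alluvial Plain", "A2", 205.0),
    ("Mississippi Alluvial Plain", "B1", 220.7),
]

def summarize(records):
    """Group storage estimates by ecoregion, one entry per scenario."""
    by_region = defaultdict(dict)
    for region, scenario, storage in records:
        by_region[region][scenario] = storage
    # Shape the payload the way a D3 bar/line chart typically consumes it:
    # an array of objects, one per ecoregion.
    return [
        {"ecoregion": region, "scenarios": scenarios}
        for region, scenarios in sorted(by_region.items())
    ]

payload = json.dumps(summarize(RECORDS), indent=2)
```

An API endpoint would serialize and return `payload`; the client-side chart then binds the array to its visual marks.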

  4. A model of "integrated scientific method" and its application for the analysis of instruction

    NASA Astrophysics Data System (ADS)

    Rusbult, Craig Francis

    A model of 'integrated scientific method' (ISM) was constructed as a framework for describing the process of science in terms of activities (formulating a research problem, and inventing and evaluating actions--such as selecting and inventing theories, evaluating theories, designing experiments, and doing experiments--intended to solve the problem) and evaluation criteria (empirical, conceptual, and cultural-personal). Instead of trying to define the scientific method, ISM is intended to serve as a flexible framework that--by varying the characteristics of its components, their integrated relationships, and their relative importance--can be used to describe a variety of scientific methods, and a variety of perspectives about what constitutes an accurate portrayal of scientific methods. This framework is outlined visually and verbally, followed by an elaboration of the framework and my own views about science, and an evaluation of whether ISM can serve as a relatively neutral framework for describing a wide range of science practices and science interpretations. ISM was used to analyze an innovative, guided-inquiry classroom (taught by Susan Johnson, using Genetics Construction Kit software) in which students do simulated scientific research by solving classical genetics problems that require effect-to-cause reasoning and theory revision. The immediate goal of analysis was to examine the 'science experiences' of students, to determine how the 'structure of instruction' provides opportunities for these experiences. Another goal was to test and improve the descriptive and analytical utility of ISM. In developing ISM, a major objective was to make ISM educationally useful. A concluding discussion includes controversies about "the nature of science" and how to teach it, how instruction can expand opportunities for student experience, and how goal-oriented intentional learning (using ISM) might improve the learning, retention, and transfer of thinking skills.
Potential educational applications of ISM could involve its use for instructional analysis or design, or for teaching students in the classroom; or ISM and IDM (a closely related, generalized 'integrated design method') could play valuable roles in a 'wide spiral' curriculum designed for the coordinated teaching of thinking skills, including creativity and critical thinking, across a wide range of subjects.

  5. An open source workflow for 3D printouts of scientific data volumes

    NASA Astrophysics Data System (ADS)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

    As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data visualisations are helpful, yet fail to provide the tactile feedback and the sensory feedback on spatial orientation that tangible objects provide. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real-world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: The process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set. This data set is turned into a volume representation, which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial, step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the free and open source geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ, using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for tsunami model quality assessment.
While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SaaS), thematic generalisation of information content, and long-term curation. [1] http://www.arcscience.com/systemDetails/omniTechnology.html [2] http://video.esri.com/watch/53/landscape-design-with-tangible-gis [3] Löwe et al. (2013), Geophysical Research Abstracts, Vol. 15, EGU2013-1544-1.
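
The grid-to-printout step of such a workflow can be illustrated with a minimal sketch (not the GFZ implementation): triangulating an elevation grid into an ASCII STL surface, a common interchange format for 3D printers. A truly printable solid would additionally need side walls and a base, which tools such as GRASS GIS and Paraview handle in the actual pipeline.

```python
def heightmap_to_stl(grid, name="terrain", scale=1.0):
    """Triangulate a rectangular elevation grid into an ASCII STL solid.

    grid: 2D list of heights; each grid cell is split into two triangles.
    Normals are left as (0, 0, 0): most slicers recompute them anyway.
    """
    rows, cols = len(grid), len(grid[0])
    facets = []
    for i in range(rows - 1):
        for j in range(cols - 1):
            # Corner vertices of the cell, with heights as z.
            a = (j, i, grid[i][j] * scale)
            b = (j + 1, i, grid[i][j + 1] * scale)
            c = (j, i + 1, grid[i + 1][j] * scale)
            d = (j + 1, i + 1, grid[i + 1][j + 1] * scale)
            facets.append((a, b, c))
            facets.append((b, d, c))
    lines = ["solid %s" % name]
    for tri in facets:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append("      vertex %g %g %g" % (x, y, z))
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid %s" % name)
    return "\n".join(lines)

# A 3x3 toy elevation model yields 2 * (3-1) * (3-1) = 8 triangles.
dem = [[0, 1, 0], [1, 2, 1], [0, 1, 0]]
stl = heightmap_to_stl(dem)
```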

  6. Using JournalMap to improve discovery and visualization of rangeland scientific knowledge

    USDA-ARS?s Scientific Manuscript database

    Most of the ecological research conducted around the world is tied to specific places; however, that location information is locked up in the text and figures of scientific articles in myriad forms that are not easily searchable. While access to ecological literature has improved dramatically in the...

  7. Science Learning with Information Technologies as a Tool for "Scientific Thinking" in Engineering Education

    ERIC Educational Resources Information Center

    Smirnov, Eugeny; Bogun, Vitali

    2011-01-01

    New methodologies for the science (or mathematics) learning process and for scientific thinking in the classroom activity of engineering students with ICT (information and communication technology), including the graphic calculator, are presented: visual modelling with ICT, action research with the graphic calculator, insight in the classroom, and communications and…

  8. SBOL Visual: A Graphical Language for Genetic Designs.

    PubMed

    Quinn, Jacqueline Y; Cox, Robert Sidney; Adler, Aaron; Beal, Jacob; Bhatia, Swapnil; Cai, Yizhi; Chen, Joanna; Clancy, Kevin; Galdzicki, Michal; Hillson, Nathan J; Le Novère, Nicolas; Maheshwari, Akshay J; McLaughlin, James Alastair; Myers, Chris J; P, Umesh; Pocock, Matthew; Rodriguez, Cesar; Soldatova, Larisa; Stan, Guy-Bart V; Swainston, Neil; Wipat, Anil; Sauro, Herbert M

    2015-12-01

    Synthetic Biology Open Language (SBOL) Visual is a graphical standard for genetic engineering. It consists of symbols representing DNA subsequences, including regulatory elements and DNA assembly features. These symbols can be used to draw illustrations for communication and instruction, and as image assets for computer-aided design. SBOL Visual is a community standard, freely available for personal, academic, and commercial use (Creative Commons CC0 license). We provide prototypical symbol images that have been used in scientific publications and software tools. We encourage users to use and modify them freely, and to join the SBOL Visual community: http://www.sbolstandard.org/visual.

  9. Gendermetrics.NET: a novel software for analyzing the gender representation in scientific authoring.

    PubMed

    Bendels, Michael H K; Brüggmann, Dörthe; Schöffel, Norman; Groneberg, David A

    2016-01-01

    Imbalances in female career promotion are believed to be strong in the field of academic science. A primary parameter for analyzing gender inequalities is the gender of authors in scientific publications. Since the presently available data on gender distribution are largely limited to underpowered studies, we here develop a new approach to analyze authors' genders in large bibliometric databases. A SQL-Server-based multiuser software suite was developed that serves as an integrative tool for analyzing bibliometric data, with a special emphasis on gender and topographical analysis. The presented system allows seamless integration, inspection, modification, evaluation and visualization of bibliometric data. By providing an adaptive and almost fully automatic integration and analysis process, the inter-individual variability of analysis is kept at a low level. Depending on the scientific question, the system enables the user to perform a scientometric analysis, including its visualization, within a short period of time. In summary, a new software suite for analyzing gender representation in scientific articles was established. The system is suitable for the comparative analysis of scientific structures at the level of continents, countries, cities, city regions, institutions, research fields and journals.
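
The core per-article statistic such a system computes can be sketched in a few lines (a toy sketch, not the Gendermetrics.NET implementation; the name table and records below are illustrative):

```python
# Illustrative first-name -> gender lookup; a real system would use a large,
# curated name database and handle ambiguous or unknown names explicitly.
NAME_GENDER = {"anna": "f", "maria": "f", "laura": "f",
               "john": "m", "peter": "m", "david": "m"}

# Toy bibliometric records: one author list per article.
articles = [
    ["Anna Smith", "John Doe"],
    ["Peter Miller", "Maria Rossi"],
    ["David Kim"],
]

def first_author_share(articles, gender):
    """Share of articles whose first author resolves to the given gender,
    among articles whose first author could be resolved at all."""
    known = resolved = 0
    for authors in articles:
        first_name = authors[0].split()[0].lower()
        g = NAME_GENDER.get(first_name)
        if g is not None:
            known += 1
            if g == gender:
                resolved += 1
    return resolved / known if known else None

share_f = first_author_share(articles, "f")
```

The same tally, grouped by country, city, or journal fields of each record, yields the comparative statistics the abstract describes.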

  10. Integrating natural language processing and web GIS for interactive knowledge domain visualization

    NASA Astrophysics Data System (ADS)

    Du, Fangming

    Recent years have seen a powerful shift towards data-rich environments throughout society. This has extended to a change in how the artifacts and products of scientific knowledge production can be analyzed and understood. Bottom-up approaches are on the rise that combine access to huge amounts of academic publications with advanced computer graphics and data processing tools, including natural language processing. Knowledge domain visualization is one of those multi-technology approaches, with its aim of turning domain-specific human knowledge into highly visual representations in order to better understand the structure and evolution of domain knowledge. For example, network visualizations built from co-author relations contained in academic publications can provide insight on how scholars collaborate with each other in one or multiple domains, and visualizations built from the text content of articles can help us understand the topical structure of knowledge domains. These knowledge domain visualizations need to support interactive viewing and exploration by users. Such spatialization efforts are increasingly looking to geography and GIS as a source of metaphors and practical technology solutions, even when non-georeferenced information is managed, analyzed, and visualized. When it comes to deploying spatialized representations online, web mapping and web GIS can provide practical technology solutions for interactive viewing of knowledge domain visualizations, from panning and zooming to the overlay of additional information. This thesis presents a novel combination of advanced natural language processing - in the form of topic modeling - with dimensionality reduction through self-organizing maps and the deployment of web mapping/GIS technology towards intuitive, GIS-like, exploration of a knowledge domain visualization. 
A complete workflow is proposed and implemented that processes any corpus of input text documents into a map form and leverages a web application framework to let users explore knowledge domain maps interactively. This workflow is implemented and demonstrated for a data set of more than 66,000 conference abstracts.
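
The combination of term vectors and self-organizing maps described above can be sketched minimally (a toy SOM on a 2x2 grid with raw term frequencies, not the thesis implementation, which uses topic modeling and much larger maps):

```python
import math
import random

# Toy corpus with two loose topics (mapping vs. biology).
docs = [
    "map spatial gis layer",
    "map projection spatial web",
    "gene cell protein biology",
    "cell protein sequence gene",
]

vocab = sorted({w for d in docs for w in d.split()})

def tf_vector(doc):
    """Normalized term-frequency vector over the shared vocabulary."""
    words = doc.split()
    return [words.count(w) / len(words) for w in vocab]

vectors = [tf_vector(d) for d in docs]

# A minimal 2x2 self-organizing map over the term vectors.
random.seed(42)
GRID = [(r, c) for r in range(2) for c in range(2)]
weights = {cell: [random.random() for _ in vocab] for cell in GRID}

def bmu(vec):
    """Best-matching unit: the grid cell with the closest weight vector."""
    return min(GRID, key=lambda cell: sum((w - v) ** 2
                                          for w, v in zip(weights[cell], vec)))

for epoch in range(60):
    lr = 0.5 * (1 - epoch / 60)  # decaying learning rate
    for vec in vectors:
        br, bc = bmu(vec)
        for (r, c) in GRID:
            # Neighborhood function: the winner moves fully toward the
            # document vector, its grid neighbors move less.
            dist = abs(r - br) + abs(c - bc)
            h = lr * math.exp(-dist)
            weights[(r, c)] = [w + h * (v - w)
                               for w, v in zip(weights[(r, c)], vec)]

# Each document's map position: the coordinates of its best-matching unit.
positions = [bmu(v) for v in vectors]
```

The resulting grid coordinates are what a web-mapping front end would render as a zoomable, pannable "knowledge map".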

  11. Building Stories about Sea Level Rise through Interactive Visualizations

    NASA Astrophysics Data System (ADS)

    Stephens, S. H.; DeLorme, D. E.; Hagen, S. C.

    2013-12-01

    Digital media provide storytellers with dynamic new tools for communicating about scientific issues via interactive narrative visualizations. While traditional storytelling uses plot, characterization, and point of view to engage audiences with underlying themes and messages, interactive visualizations can be described as 'narrative builders' that promote insight through the process of discovery (Dove, G. & Jones, S. 2012, Proc. IHCI 2012). Narrative visualizations are used in online journalism to tell complex stories that allow readers to select aspects of datasets to explore and construct alternative interpretations of information (Segel, E. & Heer, J. 2010, IEEE Trans. Vis. Comp. Graph.16, 1139), thus enabling them to participate in the story-building process. Nevertheless, narrative visualizations also incorporate author-selected narrative elements that help guide and constrain the overall themes and messaging of the visualization (Hullman, J. & Diakopoulos, N. 2011, IEEE Trans. Vis. Comp. Graph. 17, 2231). One specific type of interactive narrative visualization that is used for science communication is the sea level rise (SLR) viewer. SLR viewers generally consist of a base map, upon which projections of sea level rise scenarios can be layered, and various controls for changing the viewpoint and scenario parameters. They are used to communicate the results of scientific modeling and help readers visualize the potential impacts of SLR on the coastal zone. Readers can use SLR viewers to construct personal narratives of the effects of SLR under different scenarios in locations that are important to them, thus extending the potential reach and impact of scientific research. With careful selection of narrative elements that guide reader interpretation, the communicative aspects of these visualizations may be made more effective. 
This presentation reports the results of a content analysis of a subset of existing SLR viewers selected in order to comprehensively identify and characterize the narrative elements that contribute to this storytelling medium. The results describe four layers of narrative elements in these viewers: data, visual representations, annotations, and interactivity; and explain the ways in which these elements are used to communicate about SLR. Most existing SLR viewers have been designed with attention to technical usability; however, careful design of narrative elements could increase their overall effectiveness as story-building tools. The analysis concludes with recommendations for narrative elements that should be considered when designing new SLR viewers, and offers suggestions for integrating these components to balance author-driven and reader-driven design features for more effective messaging.

  12. Data Visualization for ESM and ELINT: Visualizing 3D and Hyper Dimensional Data

    DTIC Science & Technology

    2011-06-01

    A technique to present multiple 2D views was devised by D. Asimov. He assembled multiple two-dimensional scatter plot views of the hyper dimensional...Viewing Multidimensional Data”, D. Asimov, SIAM Journal on Scientific and Statistical Computing, vol. 6, pp. 128-143, 1985. [2] “High-Dimensional

  13. In situ visualization for large-scale combustion simulations.

    PubMed

    Yu, Hongfeng; Wang, Chaoli; Grout, Ray W; Chen, Jacqueline H; Ma, Kwan-Liu

    2010-01-01

    As scientific supercomputing moves toward petascale and exascale levels, in situ visualization stands out as a scalable way for scientists to view the data their simulations generate. This full picture is particularly crucial for capturing and understanding highly intermittent transient phenomena, such as ignition and extinction events in turbulent combustion.

  14. A Simple Model of Hox Genes: Bone Morphology Demonstration

    ERIC Educational Resources Information Center

    Shmaefsky, Brian

    2008-01-01

    Visual demonstrations of abstract scientific concepts are effective strategies for enhancing content retention (Shmaefsky 2004). The concepts associated with gene regulation of growth and development are particularly complex and are well suited for teaching with visual models. This demonstration provides a simple and accurate model of Hox gene…

  15. The Language of Visualisation

    NASA Astrophysics Data System (ADS)

    Wyatt, R.

    2014-01-01

    There is a visual language present in all images and this article explores the meaning of these languages, their importance, and what it means for the visualisation of science. Do we, as science communicators, confuse and confound our audiences by assuming the visual vernacular of the scientist or isolate our scientific audience by ignoring it?

  16. Graphic Abilities in Relation to Mathematical and Scientific Ability in Adolescents

    ERIC Educational Resources Information Center

    Stavridou, Fotini; Kakana, Domna

    2008-01-01

    Background: The study investigated a small range of cognitive abilities, related to visual-spatial intelligence, in adolescents. This specific range of cognitive abilities was termed "graphic abilities" and defined as a range of abilities to visualise and think in three dimensions, originating in the domain of visual-spatial…

  17. Six Myths about Spatial Thinking

    ERIC Educational Resources Information Center

    Newcombe, Nora S.; Stieff, Mike

    2012-01-01

    Visualizations are an increasingly important part of scientific education and discovery. However, users often do not gain knowledge from them in a complete or efficient way. This article aims to direct research on visualizations in science education in productive directions by reviewing the evidence for widespread assumptions that learning styles,…

  18. Understanding Pictorial Information in Biology: Students' Cognitive Activities and Visual Reading Strategies

    ERIC Educational Resources Information Center

    Brandstetter, Miriam; Sandmann, Angela; Florian, Christine

    2017-01-01

    In classroom, scientific contents are increasingly communicated through visual forms of representations. Students' learning outcomes rely on their ability to read and understand pictorial information. Understanding pictorial information in biology requires cognitive effort and can be challenging to students. Yet evidence-based knowledge about…

  19. Histochemical Seeing: Scientific Visualization and Art Education

    ERIC Educational Resources Information Center

    Knochel, Aaron

    2013-01-01

    What are the capacities of visual arts curricula to engage learning within narrow frameworks of overly "scientistic" standards (Lather, 2007)? With growing emphasis in schools under STEM initiatives and evidence-based standards, the possible cross-pollination of effects that art education may have on a science-centric education may be a…

  20. A "Thinking Journey" to the Planets Using Scientific Visualization Technologies: Implications to Astronomy Education.

    ERIC Educational Resources Information Center

    Yair, Yoav; Schur, Yaron; Mintz, Rachel

    2003-01-01

    Presents a novel approach to teaching astronomy and planetary sciences centered on visual images and simulations of planetary objects. Focuses on the study of the moon and the planet Mars by means of observations, interpretation, and comparison to planet Earth. (Contains 22 references.) (Author/YDS)

  1. The Conceptual Understanding of Sound by Students with Visual Impairments

    ERIC Educational Resources Information Center

    Wild, Tiffany A.; Hilson, Margilee P.; Hobson, Sally M.

    2013-01-01

    Introduction: The purpose of the study presented here was to understand and describe the misconceptions of students with visual impairments about sound and instructional techniques that may help them to develop a scientific understanding. Methods: Semistructured interview-centered pre-and posttests were used to identify the students' conceptual…

  2. Visualization of Earth and Space Science Data at JPL's Science Data Processing Systems Section

    NASA Technical Reports Server (NTRS)

    Green, William B.

    1996-01-01

    This presentation will provide an overview of systems in use at NASA's Jet Propulsion Laboratory for processing data returned by space exploration and earth observations spacecraft. Graphical and visualization techniques used to query and retrieve data from large scientific data bases will be described.

  3. Visual and Spatial Modes in Science Learning

    ERIC Educational Resources Information Center

    Ramadas, Jayashree

    2009-01-01

    This paper surveys some major trends from research on visual and spatial thinking coming from cognitive science, developmental psychology, science literacy, and science studies. It explores the role of visualisation in creativity, in building mental models, and in the communication of scientific ideas, in order to place these findings in the…

  4. Mobile collaborative medical display system.

    PubMed

    Park, Sanghun; Kim, Wontae; Ihm, Insung

    2008-03-01

    Because of recent advances in wireless communication technologies, the world of mobile computing is flourishing with a variety of applications. In this study, we present an integrated architecture for a personal digital assistant (PDA)-based mobile medical display system that supports collaborative work between remote users. We aim to develop a system that enables users in different regions to share a working environment for collaborative visualization, with the potential for exploring huge medical datasets. Our system consists of three major components: mobile client, gateway, and parallel rendering server. The mobile client serves as a front end and enables users to choose the visualization and control parameters interactively and cooperatively. The gateway handles requests and responses between mobile clients and the rendering server for efficient communication. Through the gateway, it is possible to share working environments between users, allowing them to work together in computer-supported cooperative work (CSCW) mode. Finally, the parallel rendering server is responsible for performing heavy visualization tasks. Our experience indicates that some features currently available to our mobile clients for collaborative scientific visualization are limited due to the poor performance of mobile devices and the low bandwidth of wireless connections. However, as the capabilities of mobile devices and wireless networks continue to improve, we believe that our methodology will be effective for building responsive and useful mobile collaborative medical systems in the near future.
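
The gateway's role in sharing a working environment can be sketched as follows (class and method names are hypothetical, not the paper's API): clients in one session share visualization parameters, so a change made by one collaborator is what the rendering server sees for all of them.

```python
class Gateway:
    """Minimal in-memory sketch of a CSCW gateway for shared visualization."""

    def __init__(self):
        self.sessions = {}  # session id -> shared parameter dict
        self.members = {}   # session id -> set of client ids

    def join(self, session, client):
        # First client to join creates the shared state with defaults.
        self.sessions.setdefault(session, {"zoom": 1.0, "slice": 0})
        self.members.setdefault(session, set()).add(client)

    def update(self, session, client, **params):
        """A client changes shared state; collaborators read the same dict."""
        if client not in self.members.get(session, set()):
            raise KeyError("client not in session")
        self.sessions[session].update(params)

    def render_request(self, session):
        # In the real system this request is forwarded to the parallel
        # rendering server; here we just return the shared parameters.
        return dict(self.sessions[session])

gw = Gateway()
gw.join("ct-study", "alice")
gw.join("ct-study", "bob")
gw.update("ct-study", "alice", zoom=2.5, slice=42)
view = gw.render_request("ct-study")
```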

  5. Visualisation methods for large provenance collections in data-intensive collaborative platforms

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Filgueira, Rosa; Atkinson, Malcolm; Gemuend, Andre

    2016-04-01

    This work investigates improving the methods of visually representing provenance information in the context of modern data-driven scientific research. It explores scenarios where data-intensive workflow systems are serving communities of researchers within collaborative environments, supporting the sharing of data and methods, and offering a variety of computation facilities, including HPC, HTC and Cloud. It focuses on the exploration of big-data visualization techniques aimed at producing comprehensive and interactive views on top of large and heterogeneous provenance data. The same approach is applicable to control-flow and data-flow workflows or to combinations of the two. This flexibility is achieved using the W3C-PROV recommendation as a reference model, especially its workflow-oriented profiles such as D-PROV (Missier et al. 2013). Our implementation is based on the provenance records produced by the dispel4py data-intensive processing library (Filgueira et al. 2015). dispel4py is an open-source Python framework for describing abstract stream-based workflows for distributed data-intensive applications, developed during the VERCE project. dispel4py enables scientists to develop their scientific methods and applications on their laptop and then run them at scale on a wide range of e-Infrastructures (Cloud, Cluster, etc.) without making changes. Users can therefore focus on designing their workflows at an abstract level, describing actions, input and output streams, and how they are connected. The dispel4py system then maps these descriptions to enactment platforms such as MPI, Storm, and multiprocessing. It provides a mechanism which allows users to determine the provenance information to be collected and to analyze it at runtime. For this work we consider alternative visualisation methods for provenance data, from infinite lists and localised interactive graphs to radial views.
The latter technique has been explored positively in many fields, from text data visualisation to genomics and social networking analysis. Its adoption for provenance has been presented in the literature (Borkin et al. 2013) in the context of parent-child relationships across processes, constructed from control-flow information. Computer graphics research has focused on the advantages of this radial distribution of interlinked information and on ways to improve the visual efficiency and tunability of such representations, like the Hierarchical Edge Bundles visualisation method (Holten et al. 2006), which aims at reducing the visual clutter of highly connected structures via the generation of bundles. Our approach explores the potential of a combination of these methods. It serves environments where the size of the provenance collection, coupled with the diversity of the infrastructures and the domain metadata, makes the extrapolation of usage trends extremely challenging. Applications of such visualisation systems can engage groups of scientists, data providers and computational engineers by serving visual snapshots that highlight relationships between an item and its connected processes. We will present examples of comprehensive views of the distribution of processing and data transfers during a workflow's execution in HPC, as well as cross-workflow interactions and internal dynamics, the latter in the context of faceted searches over ranges of domain metadata values. These views are obtained from the analysis of real provenance data generated by the processing of seismic traces performed through the VERCE platform.
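
The radial-view idea, placing provenance items on concentric rings by derivation depth with angles spread across the leaves, can be sketched for a small PROV-style derivation tree (node names are hypothetical):

```python
import math

# Hypothetical workflow provenance: each node lists the items it generated,
# as in a PROV-style derivation tree rooted at the workflow run.
children = {
    "workflow": ["read", "filter", "plot"],
    "read": ["trace_a", "trace_b"],
    "filter": ["trace_filtered"],
    "plot": [],
    "trace_a": [], "trace_b": [], "trace_filtered": [],
}

def radial_layout(root, children):
    """Place a tree on concentric rings: angle spreads leaves, radius = depth."""
    leaves = []
    def collect(n):
        if not children[n]:
            leaves.append(n)
        for c in children[n]:
            collect(c)
    collect(root)
    # Leaves are spaced evenly around the circle, in traversal order.
    angle = {leaf: 2 * math.pi * i / len(leaves) for i, leaf in enumerate(leaves)}
    pos = {}
    def place(n, depth):
        for c in children[n]:
            place(c, depth + 1)
        if children[n]:
            # Internal nodes sit at the mean angle of their children.
            angle[n] = sum(angle[c] for c in children[n]) / len(children[n])
        pos[n] = (depth, angle[n])  # (radius, angle) polar coordinates
    place(root, 0)
    return pos

layout = radial_layout("workflow", children)
```

Edge bundling, as in Holten's method, would then route the polar edges along the tree hierarchy instead of drawing them as straight chords.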

  6. Identifying secondary-school students' difficulties when reading visual representations displayed in physics simulations

    NASA Astrophysics Data System (ADS)

    López, Víctor; Pintó, Roser

    2017-07-01

    Computer simulations are often considered effective educational tools, since their visual and communicative power enables students to better understand physical systems and phenomena. However, previous studies have found that reading difficulties can arise when students read visual representations, especially when the representations are complex or dynamic. We have analyzed how secondary-school students read the visual representations displayed in two PhET simulations (one addressing friction heating at the microscopic level, and the other addressing electromagnetic induction), and different typologies of reading difficulties have been identified: when reading the compositional structure of the representation, when giving appropriate relevance and semantic meaning to each visual element, and also when dealing with multiple representations and dynamic information. All students experienced at least one of these difficulties, and very similar difficulties appeared in the two groups of students, despite the different scientific content of the simulations. In conclusion, visualisation does not in itself imply a full comprehension of the content of scientific simulations, and an effective reading process requires a set of reading skills, previous knowledge, attention, and external supports. Science teachers should bear these issues in mind in order to help students read images and benefit from their educational potential.

  7. Diderot: a Domain-Specific Language for Portable Parallel Scientific Visualization and Image Analysis.

    PubMed

    Kindlmann, Gordon; Chiw, Charisee; Seltzer, Nicholas; Samuels, Lamont; Reppy, John

    2016-01-01

    Many algorithms for scientific visualization and image analysis are rooted in the world of continuous scalar, vector, and tensor fields, but are programmed in low-level languages and libraries that obscure their mathematical foundations. Diderot is a parallel domain-specific language that is designed to bridge this semantic gap by providing the programmer with a high-level, mathematical programming notation that allows direct expression of mathematical concepts in code. Furthermore, Diderot provides parallel performance that takes advantage of modern multicore processors and GPUs. The high-level notation allows a concise and natural expression of the algorithms and the parallelism allows efficient execution on real-world datasets.

  8. Visualization analysis of author collaborations in schizophrenia research.

    PubMed

    Wu, Ying; Duan, Zhiguang

    2015-02-19

    Schizophrenia is a serious mental illness that levies a heavy medical toll and cost burden throughout the world. Scientific collaborations are necessary for progress in psychiatric research. However, there have been few publications on scientific collaborations in schizophrenia. The aim of this study was to investigate the extent of author collaborations in schizophrenia research. This study used 58,107 records on schizophrenia from 2003 to 2012, downloaded from Science Citation Index Expanded (SCI Expanded) via Web of Science. CiteSpace III, an information visualization and analysis software package, was used to perform a visual analysis. Collaborative author networks within the field of schizophrenia were determined using published documents. We found that external author collaboration networks were more scattered, while potential author collaboration networks were more compact. Results from hierarchical clustering analysis showed that the main collaborative field was genetic research in schizophrenia. Based on the results, authors belonging to different institutions and in different countries should be encouraged to collaborate in schizophrenia research. This will help researchers focus their studies on key issues, and allow them to offer one another reasonable suggestions for making policies and providing scientific evidence to effectively diagnose, prevent, and cure schizophrenia.
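
The co-author network underlying such an analysis is straightforward to construct: every unordered pair of authors on a paper contributes an edge, and repeated pairs accumulate edge weight. A minimal sketch with toy records (not the study's data):

```python
from collections import Counter
from itertools import combinations

# Toy records standing in for SCI-Expanded entries: one author list per paper.
records = [
    ["Wu Y", "Duan Z", "Li H"],
    ["Wu Y", "Duan Z"],
    ["Li H", "Chen J"],
]

# Each unordered author pair on a paper adds one unit of edge weight,
# as in a CiteSpace-style co-author network.
edges = Counter()
for authors in records:
    for a, b in combinations(sorted(authors), 2):
        edges[(a, b)] += 1

strongest = edges.most_common(1)[0]  # heaviest collaboration edge
```

A visualization layer would then size node labels by publication count and edge thickness by the accumulated weights.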

  9. Relevance of tissue Doppler in the quantification of stress echocardiography for the detection of myocardial ischemia in clinical practice

    PubMed Central

    Sicari, Rosa

    2005-01-01

    In the present article we review the main published data on the application of Tissue Doppler Imaging (TDI) to stress echocardiography for the detection of myocardial ischemia. TDI has been applied to stress echocardiography in order to overcome the limitations of visual analysis for myocardial ischemia. The introduction of a new technology for routine clinical use should pass through the different phases of scientific assessment, from feasibility studies to large multicenter studies, from efficacy to effectiveness studies. Nonetheless, the pro-technology bias plays a major role in medicine, and expensive and sophisticated techniques are accepted before their real usefulness and incremental value over available ones is assessed. TDI is apparently not exempt from this pattern: its applications are not substantiated by strong and sound results. Nonetheless, conventional stress echocardiography for myocardial ischemia detection is heavily criticized on the basis of its subjectivity. Stress echocardiography has a long-standing history, and the evidence collected over 20 years has positioned it as an established tool for the detection and prognostication of coronary artery disease. The quantitative assessment of myocardial ischemia remains a scientific challenge and a clinical goal, but the time has not yet come for these newer ultrasonographic techniques, which should be restricted to research laboratories. PMID:15679889

  10. Wyrm: A Brain-Computer Interface Toolbox in Python.

    PubMed

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In recent years Python has gained increasing traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python, and machine learning packages like scikit-learn and data analysis packages like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, such as an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We explain the key aspects of Wyrm's software architecture and the design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.
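    The abstract highlights two design points, an axis-annotated data structure and pervasive unit testing. A minimal sketch of that combination, with the caveat that the `Data` class and `select_channels` helper below are illustrative inventions, not Wyrm's actual API:

```python
# Hypothetical sketch of an axis-annotated container for neurophysiological
# data, in the spirit of (but NOT identical to) Wyrm's data structure.

class Data:
    """Signal samples plus metadata describing each dimension."""
    def __init__(self, data, axes, names, units):
        self.data = data      # e.g. rows = time samples, cols = channels
        self.axes = axes      # per-dimension coordinate labels
        self.names = names    # e.g. ["time", "channel"]
        self.units = units    # e.g. ["ms", "uV"]

def select_channels(dat, wanted):
    """Return a new Data object containing only the requested channels."""
    idx = [dat.axes[1].index(ch) for ch in wanted]
    data = [[row[i] for i in idx] for row in dat.data]
    return Data(data, [dat.axes[0], wanted], dat.names, dat.units)

# Unit tests in the style the abstract advocates: small deterministic checks
# that guard each operation against regressions.
dat = Data([[1, 2, 3], [4, 5, 6]],
           [[0, 10], ["Cz", "Pz", "Oz"]],
           ["time", "channel"], ["ms", "uV"])
sub = select_channels(dat, ["Cz", "Oz"])
assert sub.data == [[1, 3], [4, 6]]
assert sub.axes[1] == ["Cz", "Oz"]
```

    Keeping the axis metadata inside the object is what lets every processing step verify, in a test, that it transformed the bookkeeping consistently with the samples.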

  11. Interactive web visualization tools to the results interpretation of a seismic risk study aimed at the emergency levels definition

    NASA Astrophysics Data System (ADS)

    Rivas-Medina, A.; Gutierrez, V.; Gaspar-Escribano, J. M.; Benito, B.

    2009-04-01

    Results of a seismic risk assessment study are often applied and interpreted by users unspecialised in the topic or lacking a scientific background. In this context, the availability of tools that help translate essentially scientific contents to broader audiences (such as decision makers or civil defence officials), as well as represent and manage results in a user-friendly fashion, is of indubitable value. One such tool is the visualization tool VISOR-RISNA, a web tool developed within the RISNA project (financed by the Emergency Agency of Navarre, Spain) for regional seismic risk assessment of Navarre and the subsequent development of emergency plans. The RISNA study included seismic hazard evaluation, geotechnical characterization of soils, incorporation of site effects into expected ground motions, vulnerability distribution assessment and estimation of expected damage distributions for a 10% probability of exceedance in 50 years. The main goal of RISNA was the identification of higher-risk areas on which to focus future detailed, local-scale risk studies and the corresponding urban emergency plans. A geographic information system was used to combine different information layers, generate tables of results and represent maps with partial and final results. The visualization tool VISOR-RISNA is intended to facilitate the interpretation and representation of the collection of results, with the ultimate purpose of defining actuation plans. A number of criteria for defining actuation priorities are proposed in this work. They are based on combinations of risk parameters resulting from the risk study (such as expected ground motion, damage and exposed population), as determined by risk assessment specialists. Although the values that these parameters take are a result of the risk study, their distribution into several classes depends on the intervals defined by decision takers or civil defence officials.
These criteria provide a ranking of municipalities according to the expected actuation level and, eventually, alert levels. In this regard, the visualization tool constitutes an intuitive and useful tool that the end user of the risk study may use to optimize and guide its application in emergency planning. This type of tool can be adapted to other scenarios with different boundary conditions (seismicity level, vulnerability distribution) and user profiles (policy makers, stakeholders, students, general public) while maintaining the same final goal: to improve the adaptation of the results of a scientific-technical work to the needs of other users with different backgrounds.
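    The ranking scheme described above, risk parameters binned into classes by decision-maker-chosen intervals and then combined into an actuation level per municipality, can be sketched as follows. The parameter names and thresholds are purely illustrative assumptions, not values from the RISNA study:

```python
# Hedged sketch: bin each risk parameter into a class using intervals set by
# decision makers, combine classes into an actuation level, rank towns.
import bisect

def classify(value, thresholds):
    """Map a parameter value to a class 0..len(thresholds) via intervals."""
    return bisect.bisect_right(thresholds, value)

def actuation_level(params, intervals):
    """Combine per-parameter classes; here simply the worst (maximum) class."""
    return max(classify(params[k], intervals[k]) for k in intervals)

intervals = {                              # illustrative class boundaries
    "pga_g": [0.04, 0.08, 0.16],           # expected ground motion (g)
    "damage_ratio": [0.05, 0.15, 0.30],    # expected damage fraction
    "population": [1_000, 10_000, 50_000], # exposed population
}
municipalities = {
    "A": {"pga_g": 0.05, "damage_ratio": 0.02, "population": 500},
    "B": {"pga_g": 0.20, "damage_ratio": 0.35, "population": 60_000},
}
ranking = sorted(municipalities,
                 key=lambda m: actuation_level(municipalities[m], intervals),
                 reverse=True)
# "B" exceeds the highest interval on every parameter, so it tops the ranking.
```

    Because the intervals are data, civil defence officials can redraw the class boundaries without touching the risk results themselves, which is exactly the separation the abstract describes.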

  12. Augmented Reality in Scientific Publications-Taking the Visualization of 3D Structures to the Next Level.

    PubMed

    Wolle, Patrik; Müller, Matthias P; Rauh, Daniel

    2018-03-16

    The examination of three-dimensional structural models in scientific publications allows the reader to validate or invalidate conclusions drawn by the authors. However, whether due to a (temporary) lack of access to proper visualization software or a lack of proficiency, this information is not necessarily available to every reader. As the digital revolution progresses, technologies have become widely available that overcome these limitations and offer everyone the opportunity to appreciate models not only in 2D, but also in 3D. Additionally, mobile devices such as smartphones and tablets allow access to this information almost anywhere, at any time. Since access to such information has only recently become standard practice, we outline straightforward ways to incorporate 3D models in augmented reality into scientific publications, books, posters, and presentations, and suggest that this should become general practice.

  13. Integrating a geographic information system, a scientific visualization system and an orographic precipitation model

    USGS Publications Warehouse

    Hay, L.; Knapp, L.

    1996-01-01

    Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.

  14. Field: a new meta-authoring platform for data-intensive scientific visualization

    NASA Astrophysics Data System (ADS)

    Downie, M.; Ameres, E.; Fox, P. A.; Goebel, J.; Graves, A.; Hendler, J.

    2012-12-01

    This presentation will demonstrate a new platform for data-intensive scientific visualization, called Field, that rethinks the problem of visual data exploration. Several new opportunities for scientific visualization present themselves at this moment in time. We believe that, taken together, they may catalyze a transformation of the practice of science and begin to seed a technical culture within science that fuses data analysis, programming and myriad visual strategies. It is at the integrative levels that the principal challenges exist, for many fundamental technical components of our field are now well understood and widely available. File formats from CSV through HDF all have broad library support; low-level high-performance graphics APIs (OpenGL) are in a period of stable growth; and a dizzying ecosystem of analysis and machine learning libraries abounds. The hardware of computer graphics offers unprecedented computing power within commodity components; programming languages and platforms are coalescing around a core set of umbrella runtimes. Each of these trends is set to continue: computer graphics hardware is developing at a super-Moore's-law rate, and trends in publication and dissemination point only towards increasing access to code and data. The critical opportunity here for scientific visualization is, we maintain, not in developing a new statistical library, nor a new tool centered on a particular technique, but rather a new visual, "live" programming environment that is promiscuous in its scope. We can identify the necessary methodological practices and traditions required here not in science or engineering but in the "live-coding" practices prevalent in the fields of digital art and design. We can define this practice as an approach to programming that is live, iterative, integrative, speculative and exploratory.
"Live" because it is exclusively practiced in real time (often during performance); "iterative" because intermediate programs and their visual results are constantly being made and remade en route; "speculative" because these programs and images result from a mode of inquiry into image-making not unlike that of hypothesis formation and testing; "integrative" because this style draws deeply upon the libraries of algorithms and materials available online today; and "exploratory" because the results of these speculations are inherently open to the data and unforeseen at the outset. To this end our development environment, Field, comprises a minimal core and a powerful plug-in system that can be extended from within the environment itself. By providing a hybrid text editor that can combine text-based programming with graphical user-interface elements, its flexible and extensible interface provides space as necessary for notation, visualization, interface construction, and introspection. In addition, it provides an advanced GPU-accelerated graphics system ideal for large-scale data visualization. Since Field was created in the context of widely divergent interdisciplinary projects, its aim is to give its users not only the ability to work rapidly, but also to shape their Field environment extensively and flexibly to their own demands.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, Christopher Meyer

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, and "big data"; and an analysis example.

  16. Analysis of Visual Illusions Using Multiresolution Wavelet Decomposition Based Models

    DTIC Science & Technology

    1991-12-01

    1962). 22. Hubel, David H. "The Visual Cortex of the Brain," Scientific American, 209(5):54-62 (November 1963). 23. Hubel, David H. and Torsten N...model the visual system. In 1990, Oberndorf, a masters student at the Air Force Institute of Technology, tested the Gabor theory on visual illusion...represented by x² + y² = r² in Cartesian space is now more easily expressed by ρ = r in polar space. The coordinates x and y or ρ and θ provide alternate

  17. [To explain is to narrate. How to visualize scientific data].

    PubMed

    Hawtin, Nigel

    2014-01-01

    When you try to appeal to a wide-ranging audience, as at the New Scientist, which addresses scientists as well as the general public, your scientific visual explainer must be succinct, clear, accurate and easily understandable. In order to reach this goal, your message should present only the main data, the ones that allow you to balance information and clarity: information should be put into context and all extra details should be cut. It is very important, then, to know well both your audience and the subject you are going to describe, as graphic masters of the past, like William Playfair and Charles Minard, have taught us. Moreover, you should try to engage your reader by connecting the storytelling power of words with the driving force of graphics: colours, visual elements, typography. To be effective, in fact, an infographic should not only be truthful and functional, but also elegant, having style and legibility.

  18. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  19. Final Technical Progress Report; Closeout Certifications; CSSV Newsletter Volume I; CSSV Newsletter Volume II; CSSV Activity Journal; CSSV Final Financial Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houston, Johnny L; Geter, Kerry

    This Project's third year of implementation, 2007-2008, was the final year, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM) Inc., in an effort to promote research and research training programs in computational science - scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science - scientific visualization and the magnitude of resources available are enormous, permitting a variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these enormous resources.

  20. Application of Mathematical and Three-Dimensional Computer Modeling Tools in the Planning of Processes of Fuel and Energy Complexes

    NASA Astrophysics Data System (ADS)

    Aksenova, Olesya; Nikolaeva, Evgenia; Cehlár, Michal

    2017-11-01

    This work aims to investigate the effectiveness of mathematical and three-dimensional computer modeling tools in the planning of fuel and energy complex processes at the planning and design phase of a thermal power plant (TPP). A solution for the purification of gas emissions at the design phase of waste treatment systems is proposed, employing mathematical and three-dimensional computer modeling: the E-nets apparatus and the development of a 3D model of the future gas emission purification system. This allows the designers to visualize the designed result, to select and scientifically justify an economically feasible technology, and to ensure the high environmental and social effect of the developed waste treatment system. The authors present results of treating the planned technological processes and the gas emission purification system in terms of E-nets, using mathematical modeling in the Simulink application, which allowed them to create a model of a device from the library of standard blocks and to perform calculations. A three-dimensional model of the gas emission purification system has been constructed. It allows the designers to visualize technological processes, compare them with theoretical calculations at the design phase of a TPP and, if necessary, make adjustments.
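    E-nets are an extension of Petri nets, so the core firing semantics can be illustrated with a plain place/transition net. The sketch below is a heavily simplified stand-in (no E-net control positions or timing), and the two-stage purification line it models is an invented example, not the paper's actual process:

```python
# Minimal Petri-net firing rule, sketched for a hypothetical two-stage
# gas-purification line: raw gas -> scrubber -> filter -> clean gas.

def fire(marking, transition):
    """Fire a transition if every input place holds a token; else no-op."""
    inputs, outputs = transition
    if all(marking.get(p, 0) >= 1 for p in inputs):
        m = dict(marking)
        for p in inputs:
            m[p] -= 1                  # consume a token from each input
        for p in outputs:
            m[p] = m.get(p, 0) + 1     # produce a token in each output
        return m
    return marking                     # transition not enabled

# Transitions as (input places, output places); the *_free places model
# the equipment returning to availability after each batch.
scrub = (("raw", "scrubber_free"), ("scrubbed", "scrubber_free"))
filt  = (("scrubbed", "filter_free"), ("clean", "filter_free"))

m = {"raw": 1, "scrubber_free": 1, "filter_free": 1}
m = fire(m, scrub)
m = fire(m, filt)
assert m["clean"] == 1 and m["raw"] == 0
```

    Simulating the marking evolution in this way is what lets the planned process be checked against theoretical throughput before any 3D model or physical plant is built.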

  1. Sculpting in cyberspace: Parallel processing the development of new software

    NASA Technical Reports Server (NTRS)

    Fisher, Rob

    1993-01-01

    Stimulating creativity in problem solving, particularly where software development is involved, is applicable to many disciplines. Metaphorical thinking keeps the problem in focus but in a different light, jarring people out of their mental ruts and sparking fresh insights. It forces the mind to stretch to find patterns between dissimilar concepts, in the hope of discovering unusual ideas in odd associations (Technology Review January 1993, p. 37). With a background in Engineering and Visual Design from MIT, I have for the past 30 years pursued a career as a sculptor of interdisciplinary monumental artworks that bridge the fields of science, engineering and art. Since 1979, I have pioneered the application of computer simulation to solve the complex problems associated with these projects. A recent project for the roof of the Carnegie Science Center in Pittsburgh made particular use of the metaphoric creativity technique described above. The problem-solving process led to the creation of hybrid software combining scientific, architectural and engineering visualization techniques. David Steich, a Doctoral Candidate in Electrical Engineering at Penn State, was commissioned to develop special software that enabled me to create innovative free-form sculpture. This paper explores the process of inventing the software through a detailed analysis of the interaction between an artist and a computer programmer.

  2. Overview of NASARTI (NASA Radiation Track Image) Program: Highlights of the Model Improvement and the New Results

    NASA Technical Reports Server (NTRS)

    Ponomarev, Artem L.; Plante, I.; George, Kerry; Cornforth, M. N.; Loucas, B. D.; Wu, Honglu

    2014-01-01

    This presentation summarizes several years of research by the co-authors developing the NASARTI (NASA Radiation Track Image) program and supporting it with scientific data. The goal of the program is to support NASA's mission to achieve safe space travel for humans despite the perils of space radiation. The program focuses on selected topics in radiation biology deemed important throughout this period, both for the NASA human space flight program and for academic radiation research. Besides scientific support for developing strategies to protect humans against exposure to deep space radiation during space missions, and for understanding the health effects of space radiation on astronauts, other important ramifications of ionizing radiation were studied for their applicability to greater human needs: understanding the origins of cancer, the impact on the human genome, and the application of computer technology to biological research addressing the health of the general population. The models under the NASARTI project include: the general properties of ionizing radiation, such as particle track structure; the effects of radiation on human DNA; visualization and the statistical properties of DSBs (DNA double-strand breaks); DNA damage and repair pathway models and cell phenotypes; chromosomal aberrations; microscopy data analysis; and the application to human tissue damage and cancer models. The development of the GUI and the interactive website, as deliverables to NASA operations teams and tools for a broader research community, is discussed. The most recent findings in the area of chromosomal aberrations and the application of the stochastic track structure are also presented.

  3. Developing cloud applications using the e-Science Central platform.

    PubMed

    Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek

    2013-01-28

    This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction.

  4. Developing cloud applications using the e-Science Central platform

    PubMed Central

    Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek

    2013-01-01

    This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction. PMID:23230161

  5. Award for Distinguished Scientific Early Career Contributions to Psychology: Christian N. L. Olivers

    ERIC Educational Resources Information Center

    American Psychologist, 2009

    2009-01-01

    Christian N. L. Olivers, winner of the Award for Distinguished Scientific Early Career Contributions to Psychology, is cited for outstanding research on visual attention and working memory. Olivers uses classic experimental designs in an innovative and sophisticated way to determine underlying mechanisms. He has formulated important theoretical…

  6. An Interdisciplinary Guided Inquiry on Estuarine Transport Using a Computer Model in High School Classrooms

    ERIC Educational Resources Information Center

    Chan, Kit Yu Karen; Yang, Sylvia; Maliska, Max E.; Grunbaum, Daniel

    2012-01-01

    The National Science Education Standards have highlighted the importance of active learning and reflection for contemporary scientific methods in K-12 classrooms, including the use of models. Computer modeling and visualization are tools that researchers employ in their scientific inquiry process, and often computer models are used in…

  7. 75 FR 57965 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-23

    ... Emphasis Panel; RFA Panel: Drug Discovery for the Nervous System. Date: October 14-15, 2010. Time: 8 a.m...: Center for Scientific Review Special Emphasis Panel; RFA Panel: Drug Discovery for the Nervous System... Review Special Emphasis Panel; Small Business: Visual Systems. Date: October 28, 2010. Time: 8 a.m. to 6...

  8. Undergraduate Non-Science Majors' Descriptions and Interpretations of Scientific Data Visualizations

    ERIC Educational Resources Information Center

    Swenson, Sandra Signe

    2010-01-01

    Professionally developed and freely accessible through the Internet, scientific data maps have great potential for teaching and learning with data in the science classroom. Solving problems or developing ideas while using data maps of Earth phenomena in the science classroom may help students to understand the nature and process of science. Little…

  9. ScienceDesk Project Overview

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Norvig, Peter (Technical Monitor)

    2000-01-01

    NASA's ScienceDesk Project at the Ames Research Center is responsible for scientific knowledge management, which includes ensuring the capture, preservation, and traceability of scientific knowledge. Other responsibilities include: 1) maintaining uniform information access, achieved through intelligent indexing and visualization; 2) supporting both asynchronous and synchronous science teamwork; and 3) monitoring and controlling semi-autonomous remote experimentation.

  10. Visual Language for the Expression of Scientific Concepts

    ERIC Educational Resources Information Center

    Zender, Mike; Crutcher, Keith A.

    2007-01-01

    The accelerating rate of data generation and resulting publications are taxing the ability of scientific investigators to stay current with the emerging literature. This problem, acute in science, is not uncommon in other areas. New approaches to managing this explosion of information are needed. While it is only possible to read one paper or…

  11. Cognitive Affordances of the Cyberinfrastructure for Science and Math Learning

    ERIC Educational Resources Information Center

    Martinez, Michael E.; Peters Burton, Erin E.

    2011-01-01

    The "cyberinfrastucture" is a broad informational network that entails connections to real-time data sensors as well as tools that permit visualization and other forms of analysis, and that facilitates access to vast scientific databases. This multifaceted network, already a major boon to scientific discovery, now shows exceptional promise in…

  12. Assessing Student Scientific Expression Using Media: The Media-Enhanced Science Presentation Rubric (MESPR)

    ERIC Educational Resources Information Center

    Mott, Michael S.; Chessin, Debby A.; Sumrall, William J.; Rutherford, Angela S.; Moore, Virginia J.

    2011-01-01

    The current study evaluated an assessment designed to dually promote student understanding of the experimental method and student ability to include digital and visual qualities in their presentations of scientific experiment results. The rubric, the Media-Enhanced Science Presentation Rubric (MESPR) focuses teacher-student dialogue along the…

  13. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources that effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merge topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads ranging from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates its use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.
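    The core linking step, scoring how strongly a policy passage overlaps the vocabulary of a model's solution set, can be sketched with a simple term-overlap measure. Jaccard similarity on lowercased terms is a deliberately crude stand-in for the richer topical analysis the abstract describes, and all example text and keywords are invented:

```python
# Hedged sketch: connect a policy passage to a model's keyword vocabulary
# by set overlap. A real pipeline would use stemming, stopword removal,
# and a proper topic model; this shows only the linking idea.

def terms(text):
    """Crude tokenizer: lowercased words longer than three characters."""
    return {w.strip(".,;").lower() for w in text.split() if len(w) > 3}

def topical_link(policy_text, model_keywords):
    """Jaccard similarity between policy terms and model keywords."""
    p, m = terms(policy_text), {k.lower() for k in model_keywords}
    return len(p & m) / len(p | m) if p | m else 0.0

score = topical_link(
    "Groundwater pumping permits shall consider aquifer drawdown limits.",
    ["groundwater", "drawdown", "recharge", "aquifer"])
# three shared terms (groundwater, aquifer, drawdown) out of nine in total
```

    Scores like this, computed over a whole policy corpus against each model scenario, give the reproducible policy-to-science links the methodology aims for.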

  14. 77 FR 61739 - Application(s) for Duty-Free Entry of Scientific Instruments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-11

    ... DEPARTMENT OF COMMERCE International Trade Administration Application(s) for Duty-Free Entry of Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... combustion, such as hydroxyl (OH) radicals. The [[Page 61740

  15. Database of Novel and Emerging Adsorbent Materials

    National Institute of Standards and Technology Data Gateway

    SRD 205 NIST/ARPA-E Database of Novel and Emerging Adsorbent Materials (Web, free access)   The NIST/ARPA-E Database of Novel and Emerging Adsorbent Materials is a free, web-based catalog of adsorbent materials and measured adsorption properties of numerous materials obtained from article entries from the scientific literature. Search fields for the database include adsorbent material, adsorbate gas, experimental conditions (pressure, temperature), and bibliographic information (author, title, journal), and results from queries are provided as a list of articles matching the search parameters. The database also contains adsorption isotherms digitized from the cataloged articles, which can be compared visually online in the web application or exported for offline analysis.

  16. Datamonkey 2.0: a modern web application for characterizing selective and other evolutionary processes.

    PubMed

    Weaver, Steven; Shank, Stephen D; Spielman, Stephanie J; Li, Michael; Muse, Spencer V; Kosakovsky Pond, Sergei L

    2018-01-02

    Inference of how evolutionary forces have shaped extant genetic diversity is a cornerstone of modern comparative sequence analysis. Advances in sequence generation and increased statistical sophistication of relevant methods now allow researchers to extract ever more evolutionary signal from the data, albeit at an increased computational cost. Here, we announce the release of Datamonkey 2.0, a completely re-engineered version of the Datamonkey web-server for analyzing evolutionary signatures in sequence data. For this endeavor, we leveraged recent developments in open-source libraries that facilitate interactive, robust, and scalable web application development. Datamonkey 2.0 provides a carefully curated collection of methods for interrogating coding-sequence alignments for imprints of natural selection, packaged as a responsive (i.e. can be viewed on tablet and mobile devices), fully interactive, and API-enabled web application. To complement Datamonkey 2.0, we additionally release HyPhy Vision, an accompanying JavaScript application for visualizing analysis results. HyPhy Vision can also be used separately from Datamonkey 2.0 to visualize locally-executed HyPhy analyses. Together, Datamonkey 2.0 and HyPhy Vision showcase how scientific software development can benefit from general-purpose open-source frameworks. Datamonkey 2.0 is freely and publicly available at http://www.datamonkey.org, and the underlying codebase is available from https://github.com/veg/datamonkey-js. © The Author 2018. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. A morphologically preserved multi-resolution TIN surface modeling and visualization method for virtual globes

    NASA Astrophysics Data System (ADS)

    Zheng, Xianwei; Xiong, Hanjiang; Gong, Jianya; Yue, Linwei

    2017-07-01

    Virtual globes play an important role in representing three-dimensional models of the Earth. To extend the functioning of a virtual globe beyond that of a "geobrowser", special attention must be paid to the accuracy of the geospatial data as it is processed and represented for scientific analysis and evaluation. In this study, we propose a method for processing large-scale terrain data for virtual globe visualization and analysis. The proposed method constructs a morphologically preserved multi-resolution triangulated irregular network (TIN) pyramid for virtual globes, to accurately represent the landscape surface while satisfying the demands of applications at different scales. By introducing cartographic principles, the TIN model in each layer is controlled with a data quality standard to formalize its level-of-detail generation. A point-additive algorithm is used to iteratively construct the multi-resolution TIN pyramid. The extracted landscape features are also incorporated to constrain the TIN structure, thus preserving the basic morphological shapes of the terrain surface at different levels. During the iterative construction process, the TIN in each layer is seamlessly partitioned based on a virtual node structure, and tiled with a global quadtree structure. Finally, an adaptive tessellation approach is adopted to eliminate terrain cracks in the real-time out-of-core spherical terrain rendering. The experiments undertaken in this study confirmed that the proposed method performs well in multi-resolution terrain representation, and produces high-quality underlying data that satisfy the demands of scientific analysis and evaluation.
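The point-additive idea above can be illustrated on a one-dimensional terrain profile: start from the coarsest approximation and repeatedly insert the sample with the largest vertical error until a per-level error threshold is met. This is a simplified analogue for intuition only; the paper's actual method operates on 2D triangulations with cartographic quality standards and feature constraints.

```python
def point_additive_refine(samples, max_error):
    """Greedy point-additive selection over (x, y) terrain samples.
    Begin with the two endpoints and iteratively add the sample whose
    vertical distance from the current piecewise-linear approximation
    is largest, until every sample is within max_error.  Returns the
    indices of the kept samples, i.e. one level of the pyramid."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    kept = {0, len(samples) - 1}

    def interp(x, i0, i1):
        # Linear interpolation between two kept samples.
        t = (x - xs[i0]) / (xs[i1] - xs[i0])
        return ys[i0] * (1 - t) + ys[i1] * t

    while True:
        order = sorted(kept)
        worst_err, worst_idx = 0.0, None
        for a, b in zip(order, order[1:]):
            for i in range(a + 1, b):
                err = abs(ys[i] - interp(xs[i], a, b))
                if err > worst_err:
                    worst_err, worst_idx = err, i
        if worst_idx is None or worst_err <= max_error:
            return sorted(kept)
        kept.add(worst_idx)  # add the most violating point, then re-check
```

Running the function with a sequence of decreasing thresholds yields a nested multi-resolution pyramid, since every point kept at a coarse level is also kept at finer ones.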

  18. Enhancing UCSF Chimera through web services

    PubMed Central

    Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.

    2014-01-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
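The submit-monitor-retrieve cycle that Chimera streamlines can be sketched as a generic control loop. The three callables below stand in for the actual web-service calls (the real signatures live in the Opal toolkit and are not reproduced here); injecting them as parameters keeps the control flow testable without a network.

```python
def run_job(submit, poll, retrieve, max_polls=100):
    """Generic job lifecycle of the kind a desktop application wraps
    around a remote service: submit the job, poll its status until it
    completes, then fetch the result.  Status strings are assumptions
    for this sketch, not Opal's actual status vocabulary."""
    job_id = submit()
    for _ in range(max_polls):
        status = poll(job_id)
        if status == "DONE":
            return retrieve(job_id)
        if status == "FAILED":
            raise RuntimeError(f"job {job_id} failed")
    raise TimeoutError(f"job {job_id} still running after {max_polls} polls")

# Offline demonstration with stub callables in place of service calls.
statuses = iter(["PENDING", "RUNNING", "DONE"])
result = run_job(
    submit=lambda: "job-42",
    poll=lambda jid: next(statuses),
    retrieve=lambda jid: "alignment-scores",
)
```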

  19. Research and development of web oriented remote sensing image publication system based on Servlet technique

    NASA Astrophysics Data System (ADS)

    Juanle, Wang; Shuang, Li; Yunqiang, Zhu

    2005-10-01

    According to the requirements of the China National Scientific Data Sharing Program (NSDSP), a web-oriented RS Image Publication System (RSIPS) was researched and developed based on the Java Servlet technique. The RSIPS framework is composed of three tiers: a Presentation Tier, an Application Service Tier, and a Data Resource Tier. The Presentation Tier provides the user interface for data query, review, and download; for the convenience of users, a visual spatial query interface is included. Serving as the middle tier, the Application Service Tier controls all actions between users and databases. The Data Resource Tier stores RS images in file and relational databases. RSIPS is developed as a cross-platform application using Java Servlets, one of the core techniques of the J2EE architecture. A prototype of RSIPS has been developed and applied in geosciences clearinghouse practice at one of the NSDSP pilot units in China.
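The three-tier separation described above can be sketched in miniature: a data tier holding the records, a service tier mediating every request, and a presentation tier rendering results for the browser. The original system uses Java Servlets; this Python sketch with invented scene records and function names only illustrates how the responsibilities divide.

```python
# Data Resource Tier: in-memory stand-in for the image/file databases.
SCENES = [
    {"id": "LS-001", "sensor": "Landsat", "region": "North China"},
    {"id": "MO-014", "sensor": "MODIS", "region": "Tibet"},
]

# Application Service Tier: mediates between presentation and data,
# the role the abstract assigns to the Servlet layer.
def query_scenes(sensor=None, region=None):
    hits = SCENES
    if sensor:
        hits = [s for s in hits if s["sensor"] == sensor]
    if region:
        hits = [s for s in hits if region in s["region"]]
    return hits

# Presentation Tier: renders query results for the user's browser.
def render_results(hits):
    rows = "".join(f"<li>{s['id']} ({s['sensor']})</li>" for s in hits)
    return f"<ul>{rows}</ul>"
```

The point of the split is that the presentation tier never touches `SCENES` directly: swapping the in-memory list for a real file or relational store changes only the data tier.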

  20. Applications of hypermedia systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lennon, J.; Maurer, H.

    1995-05-01

    In this paper, we consider several new aspects of modern hypermedia systems. The applications discussed include: (1) General Information and Communication Systems: Distributed information systems for businesses, schools and universities, museums, libraries, health systems, etc. (2) Electronic orientation and information displays: Electronic guided tours, public information kiosks, and publicity dissemination with archive facilities. (3) Lecturing: A system going beyond the traditional to empower both teachers and learners. (4) Libraries: A further step towards fully electronic library systems. (5) Directories of all kinds: Staff, telephone, and all sorts of generic directories. (6) Administration: A fully integrated system such as the one proposed will mean efficient data processing and valuable statistical data. (7) Research: Material can now be accessed from databases all around the world. The effects of networking and computer-supported collaborative work are discussed, and examples of new scientific visualization programs are quoted. The paper concludes with a section entitled "Future Directions".
