Sample records for spatial analytical tools

  1. Linking climate change and fish conservation efforts using spatially explicit decision support tools

    Treesearch

    Douglas P. Peterson; Seth J. Wenger; Bruce E. Rieman; Daniel J. Isaak

    2013-01-01

    Fisheries professionals are increasingly tasked with incorporating climate change projections into their decisions. Here we demonstrate how a structured decision framework, coupled with analytical tools and spatial data sets, can help integrate climate and biological information to evaluate management alternatives. We present examples that link downscaled climate...

  2. Model for Atmospheric Propagation of Spatially Combined Laser Beams

    DTIC Science & Technology

    2016-09-01

    thesis modeling tools is discussed. In Chapter 6, the thesis validated the model with analytical computations and simulation results from...using the propagation model. Based on both the analytical computation and WaveTrain results, the diffraction effects simulated in the propagation model are...Naval Postgraduate School, Monterey, California. Thesis: Model for Atmospheric Propagation of Spatially Combined Laser Beams, by Kum Leong Lee

  3. Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.

    PubMed

    Endert, A; Fiaux, P; North, C

    2012-12-01

    Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.
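
    A minimal sketch of the keyword-reweighting idea described above, assuming a bag-of-words document model; this illustrates semantic interaction in general, not the ForceSPIRE implementation, whose internals the abstract does not specify. Dragging two documents together upweights their shared terms, which changes the weighted similarity that drives the spatial layout.

    ```python
    # Toy semantic-interaction sketch (illustrative only, not ForceSPIRE):
    # co-locating two documents boosts the weight of their shared keywords.
    from collections import Counter

    def weighted_cosine(a: Counter, b: Counter, w: dict) -> float:
        """Cosine similarity of two term-frequency vectors under term weights w."""
        dot = sum(w.get(t, 1.0) ** 2 * a[t] * b[t] for t in a.keys() & b.keys())
        na = sum((w.get(t, 1.0) * v) ** 2 for t, v in a.items()) ** 0.5
        nb = sum((w.get(t, 1.0) * v) ** 2 for t, v in b.items()) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    def on_user_colocates(doc_a: Counter, doc_b: Counter, w: dict, boost=1.2):
        """Infer intent from a drag: terms shared by co-located documents gain weight."""
        for term in doc_a.keys() & doc_b.keys():
            w[term] = w.get(term, 1.0) * boost
        return w

    doc_river = Counter("bank river water sediment".split())
    doc_fish = Counter("river water fish habitat".split())
    weights: dict = {}
    before = weighted_cosine(doc_river, doc_fish, weights)
    weights = on_user_colocates(doc_river, doc_fish, weights)
    after = weighted_cosine(doc_river, doc_fish, weights)
    print(f"similarity {before:.3f} -> {after:.3f}")  # shared terms now count more
    ```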

  4. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as dynamic entities whose spatial properties may vary from one part of a space to another; the representation of space through standard architectural drawings is therefore sometimes not sufficient. Representing space as a series of slices, each with certain properties, becomes important, so that the different characteristics in each part of a space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool is useful for assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout a space and the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and in avoiding the unnecessary costs often caused by failure to identify problems during design development stages.

  5. A consumer guide: tools to manage vegetation and fuels.

    Treesearch

    David L. Peterson; Louisa Evers; Rebecca A. Gravenmier; Ellen Eberhardt

    2007-01-01

    Current efforts to improve the scientific basis for fire management on public lands will benefit from more efficient transfer of technical information and tools that support planning, implementation, and effectiveness of vegetation and hazardous fuel treatments. The technical scope, complexity, and relevant spatial scale of analytical and decision support tools differ...

  6. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Lei; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis) and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS will be a superstar in the elemental analysis field, just like LIBS (laser-induced breakdown spectroscopy).

  7. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    PubMed Central

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A.D.; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J.; Saltz, Joel H.

    2013-01-01

    Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) develop a set of queries to support data sampling and result comparisons; (4) achieve high-performance computation capacity via a parallel data management infrastructure with parallel data loading and spatial indexing optimizations. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we have implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we have employed a shared-nothing parallel database architecture, which distributes data homogeneously across multiple database partitions to take advantage of parallel computation power and implements spatial indexing to achieve high I/O throughput. Results: Our work proposes a high-performance, parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole slide images. The tools we developed are open source and available for download. Conclusions: Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is the support for queries involving spatial predicates and comparisons. In our work, we developed an efficient data model and parallel database approach to model, normalize, manage, and query large volumes of analytical image result data. Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provides a full pipeline to normalize, load, manage, and query analytical results for algorithm evaluation. PMID:23599905
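
    For intuition, the result-comparison step the abstract describes (overlap queries between algorithm boundaries and human annotations) reduces to a polygon overlap metric. The sketch below uses the shapely package in place of the paper's SQL spatial extensions; the geometries are made up.

    ```python
    # In-memory analogue of the platform's spatial result comparison
    # (the platform itself uses SQL with spatial extensions on a parallel
    # database); shapely is assumed to be installed.
    from shapely.geometry import Polygon

    algo_nucleus = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])    # algorithm boundary
    human_nucleus = Polygon([(2, 2), (12, 2), (12, 12), (2, 12)])   # human annotation

    inter = algo_nucleus.intersection(human_nucleus).area
    union = algo_nucleus.union(human_nucleus).area
    jaccard = inter / union  # overlap metric commonly used to validate segmentations
    print(f"Jaccard overlap: {jaccard:.3f}")
    ```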

  8. Environmental Assessment and Monitoring with ICAMS (Image Characterization and Modeling System) Using Multiscale Remote-Sensing Data

    NASA Technical Reports Server (NTRS)

    Lam, N.; Qiu, H.-L.; Quattrochi, Dale A.; Zhao, Wei

    1997-01-01

    With the rapid increase in spatial data, especially in the NASA-EOS (Earth Observing System) era, it is necessary to develop efficient and innovative tools to handle and analyze these data so that environmental conditions can be assessed and monitored. A main difficulty facing geographers and environmental scientists in environmental assessment and measurement is that spatial analytical tools are not easily accessible. We have recently developed a remote sensing/GIS software module called Image Characterization and Modeling System (ICAMS) to provide specialized spatial analytical tools for the measurement and characterization of satellite and other forms of spatial data. ICAMS runs on both the Intergraph-MGE and Arc/info UNIX and Windows-NT platforms. The main techniques in ICAMS include fractal measurement methods, variogram analysis, spatial autocorrelation statistics, textural measures, aggregation techniques, normalized difference vegetation index (NDVI), and delineation of land/water and vegetated/non-vegetated boundaries. In this paper, we demonstrate the main applications of ICAMS on the Intergraph-MGE platform using Landsat Thematic Mapper images from the city of Lake Charles, Louisiana. While the utilities of ICAMS' spatial measurement methods (e.g., fractal indices) in assessing environmental conditions remain to be researched, making the software available to a wider scientific community can permit the techniques in ICAMS to be evaluated and used for a diversity of applications. The findings from these various studies should lead to improved algorithms and more reliable models for environmental assessment and monitoring.

  9. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extraction and visualization of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
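
    The extraction step described above can be approximated with the tika-python bindings for Apache Tika plus a regular-expression pass. The sketch below is a guess at the flavor of such a pipeline, not the project's actual code; the file name and patterns are illustrative.

    ```python
    # Hedged sketch: extract text with Apache Tika (tika-python starts a local
    # Tika server on first use, so Java must be available), then pull out
    # resolution/accuracy mentions with regexes. Filename is hypothetical.
    import re
    from tika import parser

    parsed = parser.from_file("hyspiri_abstract.pdf")   # hypothetical input document
    text = parsed.get("content") or ""

    # Find "<number> m spatial resolution" mentions and nearby accuracy claims.
    resolution = re.findall(r"(\d+(?:\.\d+)?)\s*m\s+spatial resolution", text)
    accuracy = re.findall(r"accuracy of\s+(\d+)\s*(?:to|-)\s*(\d+)\s*%", text)
    print(resolution, accuracy)
    ```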

  10. Linking linear programming and spatial simulation models to predict landscape effects of forest management alternatives

    Treesearch

    Eric J. Gustafson; L. Jay Roberts; Larry A. Leefers

    2006-01-01

    Forest management planners require analytical tools to assess the effects of alternative strategies on the sometimes disparate benefits from forests such as timber production and wildlife habitat. We assessed the spatial patterns of alternative management strategies by linking two models that were developed for different purposes. We used a linear programming model (...

  11. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  12. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on the development of collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels, aimed at measuring, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and collaborating on gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  13. GIS Tools For Improving Pedestrian & Bicycle Safety

    DOT National Transportation Integrated Search

    2000-07-01

    Geographic Information System (GIS) software turns statistical data, such as accidents, and geographic data, such as roads and crash locations, into meaningful information for spatial analysis and mapping. In this project, GIS-based analytical techni...

  14. Laura Jackson, Ph.D.

    EPA Pesticide Factsheets

    Research Biologist with the EPA. Her current work involves linking natural and built infrastructure to human health and well-being at multiple spatial scales, in order to develop interpretive maps and analytical tools for an interactive, web-based Atlas.

  15. Making sense of human ecology mapping: an overview of approaches to integrating socio-spatial data into environmental planning

    Treesearch

    Rebecca McLain; Melissa R. Poe; Kelly Biedenweg; Lee K. Cerveny; Diane Besser; Dale J. Blahna

    2013-01-01

    Ecosystem-based planning and management have stimulated the need to gather sociocultural values and human uses of land in formats accessible to diverse planners and researchers. Human Ecology Mapping (HEM) approaches offer promising spatial data gathering and analytical tools, while also addressing important questions about human-landscape connections. This article...

  16. A Prototype Digital Library for 3D Collections: Tools To Capture, Model, Analyze, and Query Complex 3D Data.

    ERIC Educational Resources Information Center

    Rowe, Jeremy; Razdan, Anshuman

    The Partnership for Research in Spatial Modeling (PRISM) project at Arizona State University (ASU) developed modeling and analytic tools to respond to the limitations of two-dimensional (2D) data representations perceived by affiliated discipline scientists, and to take advantage of the enhanced capabilities of three-dimensional (3D) data that…

  17. Connecting mathematics learning through spatial reasoning

    NASA Astrophysics Data System (ADS)

    Mulligan, Joanne; Woolcott, Geoffrey; Mitchelmore, Michael; Davis, Brent

    2018-03-01

    Spatial reasoning, an emerging transdisciplinary area of interest to mathematics education research, is proving integral to all human learning. It is particularly critical to science, technology, engineering and mathematics (STEM) fields. This project will create an innovative knowledge framework based on spatial reasoning that identifies new pathways for mathematics learning, pedagogy and curriculum. Novel analytical tools will map the unknown complex systems linking spatial and mathematical concepts. It will involve the design, implementation and evaluation of a Spatial Reasoning Mathematics Program (SRMP) in Grades 3 to 5. Benefits will be seen through development of critical spatial skills for students, increased teacher capability and informed policy and curriculum across STEM education.

  18. Spatially explicit spectral analysis of point clouds and geospatial data

    USGS Publications Warehouse

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements is rarely considered, despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, and therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described, and its functionality illustrated with an example of high-resolution bathymetric point cloud data collected with a multibeam echosounder.
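
    PySESA's own API is documented with the package; the sketch below is a generic stand-in showing the core idea of spatially explicit spectral analysis: a radially averaged power spectrum of a gridded, detrended surface patch, from which a characteristic roughness wavelength can be read off.

    ```python
    # Generic illustration (not PySESA code): radially averaged power
    # spectrum of a 2D height grid, recovering the dominant wavelength.
    import numpy as np

    def radial_power_spectrum(z: np.ndarray, dx: float):
        """Radially averaged power spectrum of a demeaned 2D height grid."""
        z = z - z.mean()
        power = np.abs(np.fft.fftshift(np.fft.fft2(z))) ** 2
        ny, nx = z.shape
        ky, kx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(ny, dx)),
                             np.fft.fftshift(np.fft.fftfreq(nx, dx)), indexing="ij")
        k = np.hypot(kx, ky)
        bins = np.linspace(0, k.max(), 50)
        idx = np.digitize(k.ravel(), bins)
        spec = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                         for i in range(1, len(bins))])
        return 0.5 * (bins[:-1] + bins[1:]), spec

    # Synthetic rippled seafloor: 5 m wavelength, sampled every 0.25 m, plus noise.
    x = np.arange(0, 64, 0.25)
    X, Y = np.meshgrid(x, x)
    z = np.sin(2 * np.pi * X / 5.0) + 0.1 * np.random.randn(*X.shape)
    freq, spec = radial_power_spectrum(z, dx=0.25)
    print("dominant wavelength ~", round(1.0 / freq[np.argmax(spec)], 2), "m")
    ```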

  19. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors necessitates efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods (isarithm, variogram, and triangular prism), along with the spatial autocorrelation measurement methods Moran's I and Geary's C, that have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of surfaces with higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Like the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
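
    For reference, the two spatial autocorrelation statistics named above have compact closed forms; the following plain-NumPy versions (not ICAMS code) compute them for a toy one-dimensional image with rook adjacency.

    ```python
    # Moran's I and Geary's C for a value vector x and a symmetric spatial
    # weights matrix w with zero diagonal (here: rook adjacency on a line).
    import numpy as np

    def morans_i(x: np.ndarray, w: np.ndarray) -> float:
        z = x - x.mean()
        n, s0 = len(x), w.sum()
        return (n / s0) * (z @ w @ z) / (z @ z)

    def gearys_c(x: np.ndarray, w: np.ndarray) -> float:
        z = x - x.mean()
        n, s0 = len(x), w.sum()
        diff2 = (x[:, None] - x[None, :]) ** 2
        return ((n - 1) / (2 * s0)) * (w * diff2).sum() / (z @ z)

    # Four pixels on a line: a smooth gradient gives positive I and C below 1,
    # both signalling positive spatial autocorrelation.
    w = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
    x = np.array([1.0, 2.0, 3.0, 4.0])
    print(morans_i(x, w), gearys_c(x, w))
    ```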

  20. Individual human cell responses to low doses of chemicals studied by synchrotron infrared spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.

    2000-05-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of all of the molecular responses and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in the cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high-spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, biocompatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  21. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    NASA Astrophysics Data System (ADS)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.

  22. Analytic expressions for Atomic Layer Deposition: coverage, throughput, and materials utilization in cross-flow, particle coating, and spatial ALD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yanguas-Gil, Angel; Elam, Jeffrey W.

    2014-05-01

    In this work, the authors present analytic models for atomic layer deposition (ALD) in three common experimental configurations: cross-flow, particle coating, and spatial ALD. These models, based on the plug-flow and well-mixed approximations, allow us to determine the minimum dose times and materials utilization for all three configurations. A comparison between the three models shows that throughput and precursor utilization can each be expressed by universal equations, in which the particularity of the experimental system is contained in a single parameter related to the residence time of the precursor in the reactor. For the case of cross-flow reactors, the authors show how simple analytic expressions for the reactor saturation profiles agree well with experimental results. Consequently, the analytic model can be used to extract information about the ALD surface chemistry (e.g., the reaction probability) by comparing the analytic and experimental saturation profiles, providing a useful tool for characterizing new and existing ALD processes. (C) 2014 American Vacuum Society
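
    As a back-of-envelope companion to this abstract: in the simplest kinetic-theory/Langmuir limit (a cruder approximation than the authors' plug-flow and well-mixed models), the precursor impingement flux sets a lower bound on the dose time needed to saturate a surface. All numbers below are illustrative, not taken from the paper.

    ```python
    # Minimum-dose-time estimate under first-order Langmuir adsorption.
    # Assumptions (all illustrative): Hertz-Knudsen impingement flux,
    # fixed site density, constant reaction probability s0.
    import math

    kB = 1.380649e-23  # J/K

    def dose_time_to_saturation(p_pa, T_K, m_kg, site_area_m2, s0, theta=0.99):
        """Time for coverage theta(t) = 1 - exp(-s0*flux*site_area*t) to reach theta."""
        flux = p_pa / math.sqrt(2 * math.pi * m_kg * kB * T_K)  # molecules / m^2 s
        k = s0 * flux * site_area_m2                            # 1/s per open site
        return -math.log(1 - theta) / k

    # TMA-like precursor (~72 amu), 0.1 Pa partial pressure, 200 C,
    # one reactive site per 0.2 nm^2, reaction probability 1e-3.
    t = dose_time_to_saturation(p_pa=0.1, T_K=473, m_kg=72 * 1.66e-27,
                                site_area_m2=0.2e-18, s0=1e-3)
    print(f"minimum dose time ~ {t:.1f} s")
    ```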

  23. Verification of spatial and temporal pressure distributions in segmented solid rocket motors

    NASA Technical Reports Server (NTRS)

    Salita, Mark

    1989-01-01

    A wide variety of analytical tools are in use today to predict the history and spatial distributions of pressure in the combustion chambers of solid rocket motors (SRMs). Experimental and analytical methods are presented here that allow the verification of many of these predictions. These methods are applied to the redesigned space shuttle booster (RSRM). Girth strain-gage data is compared to the predictions of various one-dimensional quasisteady analyses in order to verify the axial drop in motor static pressure during ignition transients as well as quasisteady motor operation. The results of previous modeling of radial flows in the bore, slots, and around grain overhangs are supported by approximate analytical and empirical techniques presented here. The predictions of circumferential flows induced by inhibitor asymmetries, nozzle vectoring, and propellant slump are compared to each other and to subscale cold air and water tunnel measurements to ascertain their validity.

  24. Analytical solutions for solute transport in groundwater and riverine flow using Green's Function Method and pertinent coordinate transformation method

    NASA Astrophysics Data System (ADS)

    Sanskrityayn, Abhishek; Suk, Heejun; Kumar, Naveen

    2017-04-01

    In this study, analytical solutions of one-dimensional pollutant transport originating from instantaneous and continuous point sources were developed for groundwater and riverine flow using both the Green's Function Method (GFM) and a pertinent coordinate transformation method. The dispersion coefficient and flow velocity are considered spatially and temporally dependent. The spatial dependence of the velocity is linear and non-homogeneous, and that of the dispersion coefficient is the square of that of the velocity, while the temporal dependence may be linear, or exponentially or asymptotically decelerating or accelerating. The proposed analytical solutions are derived for three different situations, depending on the variations of the dispersion coefficient and velocity, which represent real physical processes occurring in groundwater and riverine systems. The first case refers to steady solute transport in steady flow, in which the dispersion coefficient and velocity are only spatially dependent. The second case represents transient solute transport in steady flow, in which the dispersion coefficient is spatially and temporally dependent while the velocity is spatially dependent. Finally, the third case indicates transient solute transport in unsteady flow, in which both the dispersion coefficient and velocity are spatially and temporally dependent. The paper demonstrates the concentration distribution behavior from a point source in realistic flow domains of hydrological systems, including groundwater and riverine water, in which the dispersivity of the pollutant's mass is affected by the heterogeneity of the medium as well as by other factors such as velocity fluctuations, while the velocity is influenced by the water table slope and recharge rate. These capabilities give the proposed method an advantage over previously existing analytical solutions in its applicability to various hydrological problems. In particular, to the authors' knowledge, no other solution exists for dispersion coefficients and velocities that vary in both space and time. In this study, existing analytical solutions from widely known previous studies are used as validation tools to verify the proposed analytical solutions, as well as the numerical Two-Dimensional Subsurface Flow, Fate and Transport of Microbes and Chemicals (2DFATMIC) code and a 1D finite difference (FDM) code developed here. All such solutions show a perfect match with the respective proposed solutions.
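
    The constant-coefficient special case of the point-source problem discussed above has the familiar Gaussian Green's function; the paper's contribution is precisely the generalization to space- and time-dependent D and v, which this sketch does not attempt.

    ```python
    # 1D advection-dispersion Green's function for an instantaneous injection
    # of mass M (per unit area) at x = 0, t = 0, with constant v and D:
    #   C(x,t) = M / sqrt(4*pi*D*t) * exp(-(x - v*t)^2 / (4*D*t))
    import numpy as np

    def ade_instantaneous(x, t, M=1.0, v=1.0, D=0.1):
        return M / np.sqrt(4 * np.pi * D * t) * np.exp(-((x - v * t) ** 2) / (4 * D * t))

    x = np.linspace(-2, 10, 2001)
    c = ade_instantaneous(x, t=3.0)
    dx = x[1] - x[0]
    print("peak near x =", x[np.argmax(c)])   # plume centre advected to ~ v*t = 3.0
    print("mass recovered =", c.sum() * dx)   # ~ M = 1.0 (mass is conserved)
    ```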

  25. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  26. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster with 14 GB RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
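
    The "reconstruct time-enabled points into tracks" step has a compact single-machine analogue in pandas (the suite described above runs distributed on ArcGIS Server); the vessel positions below are made up.

    ```python
    # Toy track reconstruction: group time-enabled points by entity and
    # order each group by timestamp to form one track per vessel.
    import pandas as pd

    pts = pd.DataFrame({
        "vessel": ["A", "B", "A", "B", "A"],
        "t": pd.to_datetime(["2016-01-01 00:00", "2016-01-01 00:05",
                             "2016-01-01 00:10", "2016-01-01 00:15",
                             "2016-01-01 00:20"]),
        "lon": [-122.0, -121.5, -121.9, -121.4, -121.8],
        "lat": [37.0, 37.2, 37.1, 37.3, 37.2],
    })

    tracks = (pts.sort_values("t")
                 .groupby("vessel")[["t", "lon", "lat"]]
                 .agg(list))                    # one ordered track per vessel
    print(tracks)
    ```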

  27. RADSS: an integration of GIS, spatial statistics, and network service for regional data mining

    NASA Astrophysics Data System (ADS)

    Hu, Haitang; Bao, Shuming; Lin, Hui; Zhu, Qing

    2005-10-01

    Regional data mining, which aims at the discovery of knowledge about spatial patterns, clusters or associations between regions, now has wide application in the social sciences, such as sociology, economics, epidemiology and criminology. Many applications in the regional and other social sciences are more concerned with spatial relationships than with precise geographical location. Based on the spatial continuity rule derived from Tobler's first law of geography (observations at two sites tend to be more similar to each other if the sites are close together than if far apart), spatial statistics, as an important means of spatial data mining, allow users to extract interesting and useful information, such as spatial patterns, spatial structure, spatial associations, spatial outliers and spatial interactions, from vast amounts of spatial and non-spatial data. Therefore, by integrating spatial statistical methods, geographical information systems become more powerful at gaining insight into the spatial structure of regional systems, and help researchers to be more careful when selecting appropriate models. However, the lack of such tools holds back the application of spatial data analysis techniques and the development of new methods and models (e.g., spatio-temporal models). Herein, we develop such integrated software and apply it to complex system analysis for the Poyang Lake Basin. This paper presents a framework for integrating GIS, spatial statistics and network services in regional data mining, as well as its implementation. After discussing the spatial statistical methods involved in regional complex system analysis, we introduce RADSS (Regional Analysis and Decision Support System), our new regional data mining tool, which integrates GIS, spatial statistics and network services. RADSS includes functions for spatial data visualization, exploratory spatial data analysis, and spatial statistics. It also includes fundamental spatial and non-spatial databases on regional population and environment, which can be updated from external databases via CD or network. Using this data mining and exploratory analytical tool, users can easily and quickly analyse large amounts of interrelated regional data and better understand the spatial patterns and trends of regional development, so as to make credible, scientific decisions. Moreover, it can be used as an educational tool for spatial data analysis and environmental studies. We also present a case study of the Poyang Lake Basin as an application of the tool and of spatial data mining in complex environmental studies. Finally, several concluding remarks are discussed.

  28. Geo-epidemiologic mapping in the new public health surveillance. The malaria case in Chiapas, Mexico, 2002.

    PubMed

    Castillo-Salgado, Carlos

    2017-01-01

    The new public health surveillance requires, at the global, national and local levels, the use of new authoritative analytical approaches and tools for better recognition of the epidemiologic characteristics of the priority health events and risk factors affecting population health. The identification of events in time and space is of fundamental importance, as the geo-spatial description of the disease situation facilitates the identification of social, environmental and health-care-related risks. This assessment examines the application and use of geo-spatial tools for identifying relevant spatial and epidemiological clusters of malaria in Chiapas, Mexico. The study design was ecological, and the level of aggregation of the collected epidemiological and spatial variables was the municipality. The data were collected in all municipalities of the state of Chiapas, Mexico during the years 2000-2002. The main outcome variable was cases and types of malaria diagnosed by blood smears in weekly reports. Independent variables were age, sex, ethnicity and literacy of the malaria cases, and environmental factors such as altitude and the road type and network in the municipalities and cities of Chiapas. The production of thematic maps and the application of geo-spatial analytical tools, such as Moran's I and local indicator of spatial autocorrelation (LISA) metrics for malaria clustering, allowed the visualization and recognition that the important population risk factors associated with high malaria incidence in Chiapas were a low literacy rate and a high percentage of indigenous population, reflecting the social inequality gaps in health and the great burden of disease affecting this important vulnerable group in Chiapas. The presence of road networks allowed greater spatial diffusion of malaria. An important epidemiological and spatial cluster of malaria was identified in the areas and populations near the southern border. The use of geo-spatial metrics in local areas will assist in the epidemiological stratification of malaria, for better targeting of more effective and equitable prevention and control interventions. Copyright: © 2017 Secretaría de Salud.
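
    The local indicator of spatial autocorrelation mentioned above (local Moran's I) is straightforward to compute; this plain-NumPy sketch, with invented incidence values and a line adjacency, is for intuition only and is not the study's GIS tooling.

    ```python
    # Local Moran's I per area: I_i = z_i * sum_j w_ij z_j / m2, with m2 the
    # variance of x. Positive I_i flags spatial clusters (high-high or
    # low-low); negative I_i flags spatial outliers.
    import numpy as np

    def local_morans_i(x: np.ndarray, w: np.ndarray) -> np.ndarray:
        z = x - x.mean()
        m2 = (z ** 2).mean()
        return z * (w @ z) / m2

    # Five areas on a line; areas 3 and 4 are an adjacent high-incidence pair.
    w = np.eye(5, k=1) + np.eye(5, k=-1)          # rook adjacency on a line
    incidence = np.array([1.0, 1.2, 8.0, 9.0, 1.1])
    print(local_morans_i(incidence, w).round(2))  # positive at the clusters
    ```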

  29. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy.

    PubMed

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-01-18

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of sensor as well as their relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo a relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in cellular media due to strong cross-talk between energetically separated detection channels.

  30. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-03-01

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of sensor as well as their relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo a relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in cellular media due to strong cross-talk between energetically separated detection channels. Dedicated to Professor Kankan Bhattacharyya.

  31. A GIS-based approach for the screening assessment of noise and vibration impacts from transit projects.

    PubMed

    Hamed, Maged; Effat, Waleed

    2007-08-01

    Urban transportation projects are essential for increasing the efficiency of moving people and goods within a city and between cities. Environmental impacts from such projects must be evaluated and mitigated, as applicable. Spatial modeling is a valuable tool for quantifying the potential level of environmental consequences within the context of an environmental impact assessment (EIA) study. This paper presents a GIS-based tool for the assessment of airborne noise and ground-borne vibration from public transit systems, and its application to an actual project. The tool is based on the US Federal Transit Administration's (FTA) approach, and incorporates spatial information, satellite imaging, geostatistical modeling, and software programming. The tool is applied in a case study of an initial environmental evaluation of a light rail transit project in an urban city in the Middle East, to evaluate alternative layouts. The tool readily allowed the evaluation of alternatives, and the results were used as input to a multi-criteria analytic framework.
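
    As a rough illustration only (the tool itself follows the FTA's published assessment procedure, with its own reference levels and adjustments): a line source such as a transit corridor attenuates by geometric divergence at roughly 3 dB per doubling of distance, which already yields a crude screening contour. The reference level and criterion below are hypothetical.

    ```python
    # Crude line-source screening by geometric divergence alone; this is a
    # generic acoustics sketch, not the FTA method or the paper's tool.
    import math

    def line_source_level(L_ref_dBA: float, d_ref_m: float, d_m: float) -> float:
        """Sound level at distance d from a line source (10*log10 divergence)."""
        return L_ref_dBA - 10.0 * math.log10(d_m / d_ref_m)

    def screening_distance(L_ref_dBA: float, d_ref_m: float, limit_dBA: float) -> float:
        """Distance at which the level drops to the impact criterion."""
        return d_ref_m * 10 ** ((L_ref_dBA - limit_dBA) / 10.0)

    # Hypothetical light-rail reference level of 75 dBA at 15 m, 60 dBA criterion.
    print(round(line_source_level(75, 15, 60), 1), "dBA at 60 m")
    print(round(screening_distance(75, 15, 60), 1), "m screening distance")
    ```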

  32. Weathering the Storm: Developing a Spatial Data Infrastructure and Online Research Platform for Oil Spill Preparedness

    NASA Astrophysics Data System (ADS)

    Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.

    2016-12-01

    Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, which threaten our oceans and coasts require an understanding of the dynamics and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continues to push into harsher, more extreme environments where risks and uncertainty increase. However, working with these large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to research. In order to address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines a spatial data infrastructure and an online research platform to manage, process, analyze, and share these large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks to the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability and access to data, as well as allow for the rapid analysis and effective communication of analytical results to aid a range of decision-making needs.

  33. Fractals and Spatial Methods for Mining Remote Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina; Emerson, Charles; Quattrochi, Dale

    2003-01-01

    The rapid increase in digital remote sensing and GIS data raises a critical problem -- how can such an enormous amount of data be handled and analyzed so that useful information can be derived quickly? Efficient handling and analysis of large spatial data sets is central to environmental research, particularly in global change studies that employ time series. Advances in large-scale environmental monitoring and modeling require not only high-quality data, but also reliable tools to analyze the various types of data. A major difficulty facing geographers and environmental scientists in environmental assessment and monitoring is that spatial analytical tools are not easily accessible. Although many spatial techniques have been described recently in the literature, they are typically presented in an analytical form and are difficult to transform to a numerical algorithm. Moreover, these spatial techniques are not necessarily designed for remote sensing and GIS applications, and research must be conducted to examine their applicability and effectiveness in different types of environmental applications. This poses a chicken-and-egg problem: on one hand we need more research to examine the usability of the newer techniques and tools, yet on the other hand, this type of research is difficult to conduct if the tools to be explored are not accessible. Another problem that is fundamental to environmental research are issues related to spatial scale. The scale issue is especially acute in the context of global change studies because of the need to integrate remote-sensing and other spatial data that are collected at different scales and resolutions. Extrapolation of results across broad spatial scales remains the most difficult problem in global environmental research. There is a need for basic characterization of the effects of scale on image data, and the techniques used to measure these effects must be developed and implemented to allow for a multiple scale assessment of the data before any useful process-oriented modeling involving scale-dependent data can be conducted. Through the support of research grants from NASA, we have developed a software module called ICAMS (Image Characterization And Modeling System) to address the need to develop innovative spatial techniques and make them available to the broader scientific communities. ICAMS provides new spatial techniques, such as fractal analysis, geostatistical functions, and multiscale analysis that are not easily available in commercial GIS/image processing software. By bundling newer spatial methods in a user-friendly software module, researchers can begin to test and experiment with the new spatial analysis methods and they can gauge scale effects using a variety of remote sensing imagery. In the following, we describe briefly the development of ICAMS and present application examples.

  34. Large High Resolution Displays for Co-Located Collaborative Sensemaking: Display Usage and Territoriality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradel, Lauren; Endert, Alexander; Koch, Kristen

    2013-08-01

    Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. From an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.

  35. Air quality climate in the Columbia River Basin.

    Treesearch

    Sue A. Ferguson

    1998-01-01

    Aspects of climate that influence air quality in the Columbia River basin of the Northwestern United States are described. A few, relatively simple, analytical tools were developed to show the spatial and temporal patterns of mean-monthly mixing heights, precipitation scavenging, upper level and surface trajectory winds, and drought that inhibit pollution uptake. Also...

  36. Machinic Assemblages: Women, Art Education and Space

    ERIC Educational Resources Information Center

    Tamboukou, Maria

    2008-01-01

    In this paper I explore connections between women, art education and spatial relations drawing on the Deleuzo-Guattarian concept of "machinic assemblage" as a useful analytical tool for making sense of the heterogeneity and meshwork of life narratives and their social milieus. In focusing on Mary Bradish Titcomb, a fin-de-siecle Bostonian woman…

  37. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena, and results are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).
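
    The density-driven half of the variable grid idea can be illustrated with a small quadtree: cells subdivide until they contain few enough samples, so cell size itself communicates data support. This toy sketch omits the VGM's uncertainty attributes (variance, interpolation error, multi-simulation spread) and its Hadoop implementation.

    ```python
    # Toy density-adaptive grid: recursively split a square cell until it
    # holds at most max_pts samples or reaches min_size. Not NETL's VGM code.
    import numpy as np

    def variable_grid(pts: np.ndarray, xmin, ymin, size, max_pts=8, min_size=1.0):
        """Return (x, y, size, n_points) quadtree cells covering the points."""
        inside = pts[(pts[:, 0] >= xmin) & (pts[:, 0] < xmin + size) &
                     (pts[:, 1] >= ymin) & (pts[:, 1] < ymin + size)]
        if len(inside) <= max_pts or size <= min_size:
            return [(xmin, ymin, size, len(inside))]
        half = size / 2.0
        cells = []
        for dx in (0.0, half):
            for dy in (0.0, half):
                cells += variable_grid(inside, xmin + dx, ymin + dy, half,
                                       max_pts, min_size)
        return cells

    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal(25, 3, (400, 2)),      # densely sampled cluster
                     rng.uniform(0, 100, (50, 2))])    # sparse background
    cells = variable_grid(pts, 0.0, 0.0, 100.0)
    print(len(cells), "cells; smallest =", min(c[2] for c in cells))
    ```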

  18. Spatial modeling in ecology: the flexibility of eigenfunction spatial analyses.

    PubMed

    Griffith, Daniel A; Peres-Neto, Pedro R

    2006-10-01

    Recently, analytical approaches based on the eigenfunctions of spatial configuration matrices have been proposed in order to explicitly consider spatial predictors. The present study demonstrates the usefulness of eigenfunctions in spatial modeling applied to ecological problems and shows equivalencies of and differences between the two current implementations of this methodology. The two approaches in this category are the distance-based (DB) eigenvector maps proposed by P. Legendre and his colleagues, and spatial filtering based upon geographic connectivity matrices (i.e., topology-based; CB) developed by D. A. Griffith and his colleagues. In both cases, the goal is to create spatial predictors that can be easily incorporated into conventional regression models. One important advantage of these two approaches over any other spatial approach is that they provide a flexible tool that allows the full range of general and generalized linear modeling theory to be applied to ecological and geographical problems in the presence of nonzero spatial autocorrelation.
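
    A minimal sketch of the topology-based variant, assuming a rook-contiguity lattice: eigenvectors of the doubly centred connectivity matrix (the quantity underlying Moran's I) are extracted and used as extra predictors in an ordinary least-squares regression. The grid, data, and number of retained eigenvectors are illustrative assumptions.

      import numpy as np

      n_side = 10
      n = n_side * n_side
      # Rook-contiguity connectivity matrix C for a 10 x 10 lattice.
      C = np.zeros((n, n))
      for i in range(n_side):
          for j in range(n_side):
              k = i * n_side + j
              if i + 1 < n_side:
                  C[k, k + n_side] = C[k + n_side, k] = 1
              if j + 1 < n_side:
                  C[k, k + 1] = C[k + 1, k] = 1

      # Double-centre C (as in Moran's I) and take its eigenvectors.
      M = np.eye(n) - np.ones((n, n)) / n
      eigvals, eigvecs = np.linalg.eigh(M @ C @ M)
      filters = eigvecs[:, np.argsort(eigvals)[::-1][:5]]   # strongest positive patterns

      # Use the eigenvector filters as extra predictors in least squares.
      rng = np.random.default_rng(1)
      x = rng.normal(size=n)
      y = 2.0 * x + 3.0 * filters[:, 0] + rng.normal(0, 0.5, n)
      X = np.column_stack([np.ones(n), x, filters])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print(beta[:3])   # intercept, coefficient on x, weight on first spatial filter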

  19. Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains

    DOE PAGES

    Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John

    2015-08-18

    This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.

  20. Human factors issues and approaches in the spatial layout of a space station control room, including the use of virtual reality as a design analysis tool

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1994-01-01

    Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.

  1. Individual Human Cell Responses to Low Doses of Chemicals and Radiation Studied by Synchrotron Infrared Spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.

    2000-03-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes combined with other analytical tools may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low-doses of radiation and chemicals. In this study we used high spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stresses: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  2. Size, weight and position: ion mobility spectrometry and imaging MS combined.

    PubMed

    Kiss, András; Heeren, Ron M A

    2011-03-01

    Size, weight and position are three of the most important parameters that describe a molecule in a biological system. Ion mobility spectrometry is capable of separating molecules on the basis of their size or shape, whereas imaging mass spectrometry is an effective tool to measure the molecular weight and spatial distribution of molecules. Recent developments in both fields enabled the combination of the two technologies. As a result, ion-mobility-based imaging mass spectrometry is gaining more and more popularity as a (bio-)analytical tool enabling the determination of the size, weight and position of several molecules simultaneously on biological surfaces. This paper reviews the evolution of ion-mobility-based imaging mass spectrometry and provides examples of its application in analytical studies of biological surfaces.

  3. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.

  4. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, equipment, and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  5. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  6. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies and the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, where dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The results of the geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We prove that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, with space-time visualisation capabilities, animations, and easy communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
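
    A minimal sketch of the export step named above, writing analysis results into a Keyhole Markup Language file that Google Earth can open. The place names, coordinates, and case counts are invented for illustration.

      cases = [
          ("Olomouc", 17.2509, 49.5938, 12),   # name, lon, lat, weekly cases
          ("Brno", 16.6068, 49.1951, 31),
      ]

      placemarks = "\n".join(
          "<Placemark><name>{}: {} cases</name>"
          "<Point><coordinates>{},{},0</coordinates></Point></Placemark>".format(
              name, count, lon, lat)
          for name, lon, lat, count in cases
      )

      kml = ("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
             "<kml xmlns=\"http://www.opengis.net/kml/2.2\"><Document>\n"
             + placemarks +
             "\n</Document></kml>")

      with open("incidence.kml", "w", encoding="utf-8") as f:
          f.write(kml)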

  7. The potential contributions of geographic information science to the study of social determinants of health in Iran.

    PubMed

    Rabiei-Dastjerdi, Hamidreza; Matthews, Stephen A

    2018-01-01

    Recent interest in the social determinants of health (SDOH) and the effects of neighborhood contexts on individual health and well-being has grown exponentially. In this brief communication, we describe recent developments in both analytical perspectives and methods that have opened up new opportunities for researchers interested in exploring neighborhoods and health research within a SDOH framework. We focus specifically on recent advances in geographic information science, statistical methods, and spatial analytical tools. We close with a discussion of how these recent developments have the potential to enhance SDOH research in Iran.

  8. Correlated Raman micro-spectroscopy and scanning electron microscopy analyses of flame retardants in environmental samples: a micro-analytical tool for probing chemical composition, origin and spatial distribution.

    PubMed

    Ghosal, Sutapa; Wagner, Jeff

    2013-07-07

    We present correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated, morphological, molecular, spatial distribution and semi-quantitative elemental concentration information at the individual particle level with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR specific elements as potential indicators of FR presence in a sample followed by correlated RMS analyses of the same particles for characterization of the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that is typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust is strongly indicative of its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Therefore, correlated SEM/RMS analysis offers a novel investigative tool for addressing an area of important environmental concern.

  9. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

    Two problems must be solved if the finite element method is to become a reliable and affordable black-box engineering tool. Finite element meshes must be generated automatically from computer-aided design databases, and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  10. ALTERNATIVES FOR REDUCING INSECTICIDES ON COTTON AND CORN: ECONOMIC AND ENVIRONMENTAL IMPACT - SUPPLEMENT 2: PROCEDURES USED IN SETTING UP THE AGRICULTURAL PRODUCTION MODEL

    EPA Science Inventory

    The procedures used in setting up the agricultural production model used in a study of alternatives for reducing insecticides on cotton and corn are described. The major analytical tool used is a spatial equilibrium model of U.S. agriculture. This is a linear programming model th...
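
    A minimal sketch of a spatial equilibrium model cast as a linear program, in the spirit of the tool named above: two producing regions serve two markets, and production plus transport cost is minimized subject to capacity and demand. All coefficients are invented for illustration, not taken from the study.

      import numpy as np
      from scipy.optimize import linprog

      # Shipments x[i][j] from producing region i to market j,
      # flattened as [x00, x01, x10, x11].
      prod_cost = np.array([2.0, 2.5])            # per unit, by region
      trans_cost = np.array([[0.3, 0.9],          # per unit, region -> market
                             [0.8, 0.4]])
      c = (prod_cost[:, None] + trans_cost).ravel()

      supply = np.array([120.0, 100.0])           # regional capacity
      demand = np.array([90.0, 110.0])            # market requirements

      A_ub = np.array([[1, 1, 0, 0],              # region 0 capacity
                       [0, 0, 1, 1]])             # region 1 capacity
      A_eq = np.array([[1, 0, 1, 0],              # market 0 demand met
                       [0, 1, 0, 1]])             # market 1 demand met

      res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
                    bounds=[(0, None)] * 4)
      print(res.x.reshape(2, 2), res.fun)         # optimal shipments and total cost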

  11. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    PubMed Central

Al-Kindi, Khalifa M; Kwan, Paul; Andrew, Nigel R; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops. PMID:28875085

  12. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    PubMed

Al-Kindi, Khalifa M; Kwan, Paul; Andrew, Nigel R; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  13. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotron) particle accelerators have been constructed worldwide to provide the scientific community with unprecedented analytical performance. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 µm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations on biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performances. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  14. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool functions based on the combination of existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision makers' preferences. The tool's user-friendly interface guides the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision-making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features, which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to selecting the best locations for MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
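
    A minimal sketch of the AHP-plus-WLC core of such a workflow: a pairwise comparison matrix yields criterion weights via its principal eigenvector, criterion rasters are standardized to [0, 1], and suitability is their weighted sum masked by a constraint layer. The matrices and rasters are illustrative assumptions, not the tool's defaults.

      import numpy as np

      # AHP pairwise comparison matrix for three criteria (Saaty scale).
      A = np.array([[1.0, 3.0, 5.0],
                    [1 / 3, 1.0, 2.0],
                    [1 / 5, 1 / 2, 1.0]])
      vals, vecs = np.linalg.eig(A)
      w = np.real(vecs[:, np.argmax(np.real(vals))])
      weights = w / w.sum()                       # AHP criterion weights

      # Toy 4 x 4 criterion rasters (e.g. slope, permeability, depth to water).
      rng = np.random.default_rng(2)
      criteria = rng.random((3, 4, 4))
      mins = criteria.min(axis=(1, 2), keepdims=True)
      maxs = criteria.max(axis=(1, 2), keepdims=True)
      standardized = (criteria - mins) / (maxs - mins)   # linear rescale to [0, 1]

      constraint = rng.random((4, 4)) > 0.2       # False = excluded (non-compensatory)
      suitability = np.tensordot(weights, standardized, axes=1) * constraint
      print(np.round(suitability, 2))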

  15. Utility assessment of a map-based online geo-collaboration tool.

    PubMed

    Sidlar, Christopher L; Rinner, Claus

    2009-05-01

    Spatial group decision-making processes often include both informal and analytical components. Discussions among stakeholders or planning experts are an example of an informal component. When participants discuss spatial planning projects they typically express concerns and comments by pointing to places on a map. The Argumentation Map model provides a conceptual basis for collaborative tools that enable explicit linkages of arguments to the places to which they refer. These tools allow for the input of explicitly geo-referenced arguments as well as the visual access to arguments through a map interface. In this paper, we will review previous utility studies in geo-collaboration and evaluate a case study of a Web-based Argumentation Map application. The case study was conducted in the summer of 2005 when student participants discussed planning issues on the University of Toronto St. George campus. During a one-week unmoderated discussion phase, 11 participants wrote 60 comments on issues such as safety, facilities, parking, and building aesthetics. By measuring the participants' use of geographic references, we draw conclusions on how well the software tool supported the potential of the underlying concept. This research aims to contribute to a scientific approach to geo-collaboration in which the engineering of novel spatial decision support methods is complemented by a critical assessment of their utility in controlled, realistic experiments.

  16. Spiral Form of the Human Cochlea Results from Spatial Constraints.

    PubMed

    Pietsch, M; Aguirre Dávila, L; Erfurt, P; Avci, E; Lenarz, T; Kral, A

    2017-08-08

    The human inner ear has an intricate spiral shape often compared to shells of mollusks, particularly to the nautilus shell. It has inspired many functional hearing theories. The reasons for this complex geometry remain unresolved. We digitized 138 human cochleae at microscopic resolution and observed an astonishing interindividual variability in the shape. A 3D analytical cochlear model was developed that fits the analyzed data with high precision. The cochlear geometry neither matched a proposed function, namely sound focusing similar to a whispering gallery, nor did it have the form of a nautilus. Instead, the innate cochlear blueprint and its actual ontogenetic variants were determined by spatial constraints and resulted from an efficient packing of the cochlear duct within the petrous bone. The analytical model predicts well the individual 3D cochlear geometry from few clinical measures and represents a clinical tool for an individualized approach to neurosensory restoration with cochlear implants.

  17. Hyperspectral imaging for non-contact analysis of forensic traces.

    PubMed

    Edelman, G J; Gaston, E; van Leeuwen, T G; Cullen, P J; Aalders, M C G

    2012-11-30

    Hyperspectral imaging (HSI) integrates conventional imaging and spectroscopy, to obtain both spatial and spectral information from a specimen. This technique enables investigators to analyze the chemical composition of traces and simultaneously visualize their spatial distribution. HSI offers significant potential for the detection, visualization, identification and age estimation of forensic traces. The rapid, non-destructive and non-contact features of HSI mark its suitability as an analytical tool for forensic science. This paper provides an overview of the principles, instrumentation and analytical techniques involved in hyperspectral imaging. We describe recent advances in HSI technology motivating forensic science applications, e.g. the development of portable and fast image acquisition systems. Reported forensic science applications are reviewed. Challenges are addressed, such as the analysis of traces on backgrounds encountered in casework, concluded by a summary of possible future applications. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  18. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    PubMed

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
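
    A minimal sketch of the per-pixel quantification step described above, assuming synthetic calibration spectra: a partial least-squares model is trained on reference spectra with known water content and then applied to every pixel spectrum of a chemical image. The shapes and data are invented; real work would use measured NIR spectra.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(3)
      n_bands = 50
      # Calibration set: 30 reference spectra with known water content (%).
      water = rng.uniform(0.5, 5.0, 30)
      basis = rng.normal(size=n_bands)            # stand-in water absorption signature
      X_cal = water[:, None] * basis + rng.normal(0, 0.05, (30, n_bands))

      pls = PLSRegression(n_components=3)
      pls.fit(X_cal, water)

      # A 20 x 20 pixel chemical image: flatten, predict, reshape into a map.
      truth = rng.uniform(0.5, 5.0, (20, 20))
      cube = truth[..., None] * basis + rng.normal(0, 0.05, (20, 20, n_bands))
      water_map = pls.predict(cube.reshape(-1, n_bands)).reshape(20, 20)
      print(abs(water_map - truth).max())         # worst per-pixel prediction error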

  19. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background: Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S.

    Results: We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results.

    Conclusion: The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales.

    Method: We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163

  20. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit.
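
    A minimal sketch of the reliability measure described in these two records: the scan statistic is run repeatedly with different scaling parameters, and each county is scored by the fraction of runs in which it falls inside a significant cluster. The run_satscan function is a hypothetical stand-in for invoking SaTScan, and the data are invented.

      import numpy as np

      counties = [f"county_{i}" for i in range(100)]
      scaling_params = np.linspace(0.02, 0.50, 50)   # max cluster size fractions

      def run_satscan(max_pop_fraction):
          """Hypothetical stand-in: return indices of counties inside
          statistically significant clusters for one parameter choice."""
          rng = np.random.default_rng(int(max_pop_fraction * 1000))
          return set(rng.choice(100, size=10, replace=False))

      hits = np.zeros(len(counties))
      for frac in scaling_params:
          for idx in run_satscan(frac):
              hits[idx] += 1

      reliability = hits / len(scaling_params)       # 1.0 = in a cluster in every run
      stable = [c for c, r in zip(counties, reliability) if r > 0.8]
      print(stable)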

  1. SU-C-204-01: A Fast Analytical Approach for Prompt Gamma and PET Predictions in a TPS for Proton Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, K; Herzog, M; Landry, G

    2015-06-15

    Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied on the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used, and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need of full MC simulation.
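
    A minimal sketch of the filter-function idea, assuming toy curves: the prompt-gamma depth profile is obtained by convolving the depth dose profile with one kernel per chemical element, weighted by the target's elemental composition. The dose curve, kernels, and composition are illustrative assumptions, not fitted filters from the work above.

      import numpy as np

      z = np.linspace(0, 200, 401)                   # depth in mm, 0.5 mm bins
      # Toy depth dose with a Bragg-peak-like maximum near 150 mm.
      dose = np.exp(-((z - 150.0) / 8.0) ** 2) + 0.3 * (z < 150)

      kernel_z = np.linspace(-50, 50, 201)

      def filter_function(scale, shift):
          """One per-element filter kernel (illustrative Gaussian shape)."""
          return scale * np.exp(-((kernel_z - shift) / 10.0) ** 2)

      composition = {"O": 0.65, "C": 0.18, "H": 0.10}   # toy mass fractions
      kernels = {"O": filter_function(1.0, -5.0),
                 "C": filter_function(0.7, -8.0),
                 "H": filter_function(0.0, 0.0)}        # H yields no prompt gammas

      prompt_gamma = sum(frac * np.convolve(dose, kernels[el], mode="same")
                         for el, frac in composition.items())
      print(z[np.argmax(prompt_gamma)])              # depth of peak emission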

  2. A smarter way to search, share and utilize open-spatial online data for energy R&D - Custom machine learning and GIS tools in U.S. DOE's virtual data library & laboratory, EDX

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J.; Baker, D.; Barkhurst, A.; Bean, A.; DiGiulio, J.; Jones, K.; Jones, T.; Justman, D.; Miller, R., III; Romeo, L.; Sabbatino, M.; Tong, A.

    2017-12-01

    As spatial datasets are increasingly accessible through open, online systems, the opportunity to use these resources to address a range of Earth system questions grows. Simultaneously, there is a need for better infrastructure and tools to find and utilize these resources. We will present examples of advanced online computing capabilities, hosted in the U.S. DOE's Energy Data eXchange (EDX), that address these needs for earth-energy research and development. In one study the computing team developed a custom, machine learning, big data computing tool designed to parse the web and return priority datasets to appropriate servers to develop an open-source global oil and gas infrastructure database. The results of this spatial smart search approach were validated against expert-driven, manual search results, which had required a team of seven spatial scientists three months to produce. The custom machine learning tool parsed online, open systems, including zip files, ftp sites, and other web-hosted resources, in a matter of days. The resulting resources were integrated into a geodatabase now hosted for open access via EDX. Beyond identifying and accessing authoritative, open spatial data resources, there is also a need for more efficient tools to ingest, analyze, and visualize multi-variate spatial data. Within the EDX framework, there is a growing suite of processing, analytical, and visualization capabilities that allow multi-user teams to work more efficiently in private, virtual workspaces. An example of these capabilities is a set of five custom spatio-temporal models and data tools that form NETL's Offshore Risk Modeling suite, which can be used to quantify oil spill risks and impacts. Coupling the data and advanced functions from EDX with these advanced spatio-temporal models has culminated in an integrated web-based decision-support tool. This platform has capabilities to identify and combine data across scales and disciplines, evaluate potential environmental, social, and economic impacts, highlight knowledge or technology gaps, and reduce uncertainty for a range of 'what if' scenarios relevant to oil spill prevention efforts. These examples illustrate EDX's growing capabilities for advanced spatial data search and analysis to support geo-data science needs.

  3. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogeneous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to major spatio-temporal data sources from major vendors such as USGS, NOAA, the World Bank, and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16,000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.

  4. Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography.

    PubMed

    Yu, Young-Sang; Farmand, Maryam; Kim, Chunjoong; Liu, Yijin; Grey, Clare P; Strobridge, Fiona C; Tyliszczak, Tolek; Celestre, Rich; Denes, Peter; Joseph, John; Krishnan, Harinarayan; Maia, Filipe R N C; Kilcoyne, A L David; Marchesini, Stefano; Leite, Talita Perciano Costa; Warwick, Tony; Padmore, Howard; Cabana, Jordi; Shapiro, David A

    2018-03-02

    Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes. The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. Here, we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight to the design of the next generation of high-performance devices.

  5. Classification of High Spatial Resolution, Hyperspectral ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report,

  6. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that encompass the science domain and data preparation, reduction, and analysis techniques from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a science-advancement point of view: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the scientific eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  7. Modeling spatial accessibility of immigrants to culturally diverse family physicians.

    PubMed

Wang, Lu; Roisman, Deborah

    2011-01-01

    This article uses accessibility as an analytical tool to examine health care access among immigrants in a multicultural urban setting. It applies and improves on two widely used accessibility models, the gravity model and the two-step floating catchment area model, in measuring the spatial accessibility of Mainland Chinese immigrants in the Toronto Census Metropolitan Area. Empirical data on physician-seeking behaviors are collected through two rounds of questionnaire surveys. Attention is focused on the journey to the physician's location and utilization of linguistically matched family physicians. Based on the survey data, a two-zone accessibility model is developed by relaxing the travel threshold and distance impedance parameters that are traditionally treated as constants in accessibility models. General linear models are used to identify relationships among spatial accessibility, geography, and socioeconomic characteristics of Mainland Chinese immigrants. The results suggest a spatial mismatch in the supply of and demand for culturally sensitive care, and residential location is the primary factor that determines spatial accessibility to family physicians. The article yields important policy implications.
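
    A minimal sketch of the two-step floating catchment area model mentioned above: step 1 computes each physician's supply-to-demand ratio within a travel threshold, and step 2 sums those ratios over the physicians reachable from each population point. Coordinates, populations, and the threshold are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(4)
      pop_xy = rng.random((200, 2)) * 20             # population points (km)
      pop_size = rng.integers(100, 1000, 200)
      doc_xy = rng.random((15, 2)) * 20              # physician locations
      threshold = 5.0                                # catchment radius (km)

      d = np.linalg.norm(doc_xy[:, None, :] - pop_xy[None, :, :], axis=2)
      within = d <= threshold                        # (physician, population) pairs

      # Step 1: supply-to-demand ratio R_j for each physician j.
      demand = (within * pop_size[None, :]).sum(axis=1)
      R = np.where(demand > 0, 1.0 / np.maximum(demand, 1), 0.0)

      # Step 2: accessibility A_i = sum of R_j over physicians within reach.
      A = (within * R[:, None]).sum(axis=0)
      print(A.mean(), A.max())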

  8. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).
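
    A minimal sketch of estimating a spatially varying death risk from two point patterns, assuming Gaussian kernels: the kernel intensity of dead trees divided by that of all trees gives a relative-risk surface. The data and bandwidth are invented, and this kernel ratio is only one simple way to estimate such a surface.

      import numpy as np

      rng = np.random.default_rng(5)
      trees = rng.random((2000, 2)) * 1000           # all tree locations (m)
      # Death probability rises toward the east, purely for illustration.
      p_death = 0.02 + 0.08 * trees[:, 0] / 1000
      dead = trees[rng.random(2000) < p_death]

      def kernel_intensity(points, grid, bandwidth=100.0):
          """Gaussian kernel intensity of a point pattern evaluated on a grid."""
          d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
          return np.exp(-d2 / (2 * bandwidth ** 2)).sum(axis=1)

      gx, gy = np.meshgrid(np.linspace(0, 1000, 20), np.linspace(0, 1000, 20))
      grid = np.column_stack([gx.ravel(), gy.ravel()])

      risk = kernel_intensity(dead, grid) / kernel_intensity(trees, grid)
      print(risk.reshape(20, 20)[0].round(3))        # west-to-east risk gradient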

  9. Alcohol beverage control, privatization and the geographic distribution of alcohol outlets

    PubMed Central

    2012-01-01

    Background: With Pennsylvania currently considering a move away from an Alcohol Beverage Control state to a privatized alcohol distribution system, this study uses a spatial analytical approach to examine potential impacts of privatization on the number and spatial distribution of alcohol outlets in the city of Philadelphia over a long time horizon.

    Methods: A suite of geospatial data were acquired for Philadelphia, including 1,964 alcohol outlet locations, 569,928 land parcels, and school, church, hospital, park and playground locations. These data were used as inputs for exploratory spatial analysis to estimate the expected number of outlets that would eventually operate in Philadelphia. Constraints included proximity restrictions (based on current ordinances regulating outlet distribution) of at least 200 feet between alcohol outlets and at least 300 feet between outlets and schools, churches, hospitals, parks and playgrounds.

    Results: Findings suggest that current state policies on alcohol outlet distributions in Philadelphia are loosely enforced, with many areas exhibiting extremely high spatial densities of outlets that violate existing proximity restrictions. The spatial model indicates that an additional 1,115 outlets could open in Philadelphia if privatization were to occur and current proximity ordinances were maintained.

    Conclusions: The study reveals that spatial analytical approaches can function as an excellent tool for contingency-based "what-if" analysis, providing an objective snapshot of potential policy outcomes prior to implementation. In this case, the likely outcome is a tremendous increase in alcohol outlets in Philadelphia, with concomitant negative health, crime and quality of life outcomes that accompany such an increase. PMID:23170899
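
    A minimal sketch of the proximity screening behind such a what-if estimate: candidate parcels are kept only if they lie at least 200 feet from existing outlets and 300 feet from sensitive sites. The coordinates are invented, and a full analysis would also enforce the 200-foot spacing among newly added outlets.

      import numpy as np

      rng = np.random.default_rng(6)
      parcels = rng.random((5000, 2)) * 50_000       # candidate parcels (ft)
      outlets = rng.random((300, 2)) * 50_000        # existing outlet locations
      sensitive = rng.random((400, 2)) * 50_000      # schools, churches, parks...

      def nearest(points, sites):
          """Distance from each point to its nearest site."""
          d = np.linalg.norm(points[:, None, :] - sites[None, :, :], axis=2)
          return d.min(axis=1)

      ok = (nearest(parcels, outlets) >= 200) & (nearest(parcels, sensitive) >= 300)
      print(f"{ok.sum()} of {len(parcels)} parcels pass both proximity rules")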

  10. A spline-based approach for computing spatial impulse responses.

    PubMed

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.

  11. Fluorescence tomography using synchrotron radiation at the NSLS

    NASA Astrophysics Data System (ADS)

    Boisseau, P.; Grodzins, L.

    1987-03-01

    Fluorescence tomography utilizing focused, tunable, monoenergetic X-rays from synchrotron light sources holds the promise of a non-invasive analytical tool for studying trace elements in specimens, particularly biological ones, at spatial resolutions on the order of micrometers. This note reports an early test at the National Synchrotron Light Source at Brookhaven National Laboratory in which fluorescence tomographic scans were successfully made of the trace elements iron and titanium in NBS standard glass and in a bee.

  12. Application of Raman microscopy to biodegradable double-walled microspheres.

    PubMed

    Widjaja, Effendi; Lee, Wei Li; Loo, Say Chye Joachim

    2010-02-15

    Raman mapping measurements were performed on the cross section of a ternary-phase biodegradable double-walled microsphere (DWMS) of poly(D,L-lactide-co-glycolide) (50:50) (PLGA), poly(L-lactide) (PLLA), and poly(ε-caprolactone) (PCL), fabricated by a one-step solvent evaporation method. The collected Raman spectra were subjected to a band-target entropy minimization (BTEM) algorithm in order to reconstruct the pure component spectra of the species observed in this sample. Seven pure component spectral estimates were recovered, and their spatial distributions within the DWMS were determined. The first three spectral estimates were identified as PLLA, PLGA 50:50, and PCL, the main components of the DWMS. The remaining four spectral estimates were identified as semicrystalline polyglycolic acid (PGA), dichloromethane (DCM), copper phthalocyanine blue, and calcite, minor components of the DWMS: PGA is a decomposition product of PLGA, DCM is the solvent used in DWMS fabrication, and copper phthalocyanine blue and calcite were unexpected contaminants. The results show that combined Raman microscopy and BTEM analysis can provide a sensitive characterization tool for DWMS, as it gives specific information on the chemical species present as well as their spatial distributions. This novel analytical method for microsphere characterization can serve as a complement to other, more established analytical techniques, such as scanning electron microscopy and optical microscopy.

  13. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    USGS Publications Warehouse

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September, 2008 or 2009) in which the project ends, and include overviews, project images, and Internet links to additional project information and related publications or articles.
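
    The overlay analysis described above is easy to make concrete in code. The hedged sketch below combines a thematic layer with classified land-cover polygons using GeoPandas; the file names and attribute fields ("huc_id", "class") are hypothetical stand-ins, not USGS datasets.

        import geopandas as gpd

        # Hypothetical inputs: watershed boundaries and classified land-cover polygons.
        watersheds = gpd.read_file("watersheds.shp")
        landcover = gpd.read_file("landcover.shp")

        # Ensure both layers share one coordinate reference system before overlay.
        landcover = landcover.to_crs(watersheds.crs)

        # Intersection overlay: pieces carrying attributes from both layers.
        pieces = gpd.overlay(watersheds, landcover, how="intersection")

        # Area of each land-cover class per watershed (projected CRS assumed, m^2).
        pieces["area_m2"] = pieces.geometry.area
        summary = pieces.groupby(["huc_id", "class"])["area_m2"].sum()
        print(summary.head())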

  14. Analytical electron microscopy in the study of biological systems.

    PubMed

    Johnson, D E

    1986-01-01

    The AEM is a powerful tool in biological research, capable of providing information simply not available by other means. The use of a field-emission STEM for this application can lead to a significant improvement in spatial resolution, in most cases now bounded by the quality of the specimen preparation but perhaps ultimately limited by the effects of radiation damage. Increased elemental sensitivity is at least possible in selected cases with electron energy-loss spectrometry, but fundamental aspects of ELS will probably confine its role to that of a limited complement to EDS. The considerable margin for improvement in the sensitivity of the basic analytical technique means that the search for technological improvement will continue. Fortunately, however, current technology can also continue to answer important biological questions.

  15. Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography

    DOE PAGES

    Yu, Young-Sang; Farmand, Maryam; Kim, Chunjoong; ...

    2018-03-02

    Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes. The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. Here we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight into the design of the next generation of high-performance devices.
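
    Quantifying the electrochemical state from chemically resolved tomograms amounts to classifying voxels by phase and tallying fractions per particle. The sketch below is a simplified, hypothetical illustration in NumPy/SciPy (the synthetic data, support mask, and 0.5 threshold are invented, not the authors' pipeline).

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)

        # Synthetic stand-in for a chemically resolved tomogram: each voxel carries a
        # score in [0, 1] interpolating between FePO4 (0) and LiFePO4 (1).
        tomogram = rng.random((64, 64, 64))
        particle_mask = tomogram > 0.05          # crude support mask for illustration

        # Classify voxels by phase with an assumed 0.5 threshold.
        lithiated = (tomogram > 0.5) & particle_mask

        state_of_charge = lithiated.sum() / particle_mask.sum()
        print(f"volume-averaged state of charge: {state_of_charge:.2f}")

        # Phase-boundary voxels: lithiated voxels adjacent to delithiated ones.
        boundary = lithiated ^ ndimage.binary_erosion(lithiated)
        print("boundary voxel count:", int(boundary.sum()))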

  16. Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Young-Sang; Farmand, Maryam; Kim, Chunjoong

    Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes. The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. Here we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight into the design of the next generation of high-performance devices.

  17. Sagebrush ecosystem conservation and management: Ecoregional assessment tools and models for the Wyoming Basins

    USGS Publications Warehouse

    Hanser, S.E.; Leu, M.; Knick, S.T.; Aldridge, Cameron L.

    2011-01-01

    The Wyoming Basins are one of the remaining strongholds of the sagebrush ecosystem. However, like most sagebrush habitats, threats to this region are numerous. This book adds to current knowledge about the regional status of the sagebrush ecosystem, the distribution of habitats, the threats to the ecosystem, and the influence of threats and habitat conditions on the occurrence and abundance of sagebrush-associated fauna and flora in the Wyoming Basins. Comprehensive methods are outlined for use in data collection and monitoring of wildlife and plant populations. Field and spatial data are integrated into a spatially explicit analytical framework to develop models of species occurrence and abundance for the region. This book provides significant new information on distributions, abundances, and habitat relationships for a number of species of conservation concern that depend on sagebrush in the region. The tools and models presented in this book increase our understanding of impacts from land uses and can contribute to the development of comprehensive management and conservation strategies.

  18. Spatial Correlation Of Streamflows: An Analytical Approach

    NASA Astrophysics Data System (ADS)

    Betterle, A.; Schirmer, M.; Botter, G.

    2016-12-01

    The interwoven space and time variability of climate and landscape properties results in the complex and non-linear hydrological response of streamflow dynamics. Understanding how the meteorological and morphological characteristics of catchments affect the similarity/dissimilarity of streamflow timeseries at their outlets represents a scientific challenge with applications in water resources management, ecological studies and regionalization approaches aimed at predicting streamflows in ungauged areas. In this study, we establish an analytical approach to estimate the spatial correlation of daily streamflows at two arbitrary locations within a given hydrologic district or river basin at seasonal and annual time scales. The method is based on a stochastic description of the coupled streamflow dynamics at the outlets of two catchments. The framework aims to express the correlation of daily streamflows at two locations along a river network as a function of a limited number of physical parameters characterizing the main underlying hydrological drivers, which include climate conditions, precipitation regime and catchment drainage rates. The proposed method portrays how the heterogeneity of climate and landscape features affects the spatial variability of flow regimes along river systems. In particular, we show that the frequency and intensity of synchronous effective rainfall events in the relevant contributing catchments are the main drivers of the spatial correlation of daily discharge, whereas only pronounced differences in the drainage rates of the two basins have a significant effect on the streamflow correlation. The topological arrangement of the two outlets also influences the underlying streamflow correlation, as we show that nested catchments tend to maximize the spatial correlation of flow regimes. The application of the method to a set of catchments in the south-eastern US suggests the potential of the proposed tool for characterizing spatial connections of flow regimes in the absence of discharge measurements.
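
    The empirical quantity that an analytical model like this one targets is simply the correlation of paired daily discharge series. The sketch below is generic and hypothetical (the file and column names are invented, not the authors' code) and computes annual and seasonal correlations with pandas.

        import pandas as pd

        # Hypothetical input: daily discharge at two gauges, indexed by date,
        # with columns "gauge_a" and "gauge_b".
        q = pd.read_csv("discharge.csv", parse_dates=["date"], index_col="date")

        # Annual-scale spatial correlation of daily streamflows.
        annual_r = q["gauge_a"].corr(q["gauge_b"])

        # Seasonal correlations: group days by meteorological season.
        season = q.index.month.map({12: "DJF", 1: "DJF", 2: "DJF",
                                    3: "MAM", 4: "MAM", 5: "MAM",
                                    6: "JJA", 7: "JJA", 8: "JJA",
                                    9: "SON", 10: "SON", 11: "SON"})
        seasonal_r = q.groupby(season.values).apply(
            lambda g: g["gauge_a"].corr(g["gauge_b"]))
        print(annual_r, seasonal_r, sep="\n")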

  19. Phase transitions in coupled map lattices and in associated probabilistic cellular automata.

    PubMed

    Just, Wolfram

    2006-10-01

    Analytical tools are applied to investigate piecewise-linear coupled map lattices in terms of probabilistic cellular automata. The so-called disorder condition of probabilistic cellular automata is closely related to attracting sets in coupled map lattices. The importance of this condition for the suppression of phase transitions is illustrated by spatially one-dimensional systems. Invariant densities and temporal correlations are calculated explicitly. Ising-type phase transitions are found for one-dimensional coupled map lattices acting on repelling sets and for a spatially two-dimensional Miller-Huse-like system with stable long-time dynamics. Critical exponents are calculated within a finite-size scaling approach. The relevance of detailed balance of the resulting probabilistic cellular automaton for the critical behavior is pointed out.
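
    To make the object of study concrete, here is a minimal, hypothetical sketch of a spatially one-dimensional coupled map lattice with diffusive coupling and a piecewise-linear local (tent) map; the parameter values are illustrative and not taken from the paper.

        import numpy as np

        def local_map(x):
            # Piecewise-linear (tent) map on [0, 1].
            return np.where(x < 0.5, 2.0 * x, 2.0 * (1.0 - x))

        def step(x, eps=0.1):
            # Diffusive coupling with periodic boundaries:
            # x_i' = (1 - eps) f(x_i) + (eps / 2) (f(x_{i-1}) + f(x_{i+1}))
            fx = local_map(x)
            return (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

        rng = np.random.default_rng(1)
        x = rng.random(256)
        for _ in range(1000):
            x = step(x)
        print("spatial mean/std after transient:", x.mean(), x.std())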

  20. Changes in Visual/Spatial and Analytic Strategy Use in Organic Chemistry with the Development of Expertise

    ERIC Educational Resources Information Center

    Vlacholia, Maria; Vosniadou, Stella; Roussos, Petros; Salta, Katerina; Kazi, Smaragda; Sigalas, Michael; Tzougraki, Chryssa

    2017-01-01

    We present two studies that investigated the adoption of visual/spatial and analytic strategies by individuals at different levels of expertise in the area of organic chemistry, using the Visual Analytic Chemistry Task (VACT). The VACT allows the direct detection of analytic strategy use without drawing inferences about underlying mental…

  1. Translating statistical species-habitat models to interactive decision support tools

    USGS Publications Warehouse

    Wszola, Lyndsie S.; Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.
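
    The core translation step, turning a fitted regression into an interactive predictor, can be sketched in a few lines. The example below is a hypothetical stand-in: the coefficients, landcover classes, and log-link form are invented for illustration, and the actual Pheasant Habitat Simulator is an R/Shiny application rather than Python.

        import numpy as np

        # Hypothetical fitted coefficients relating landcover proportions to
        # log relative pheasant abundance.
        coef = {"intercept": -0.8, "grassland": 2.1, "cropland": 0.9, "woodland": -1.5}

        def relative_suitability(landcover):
            """landcover: dict of proportions summing to 1 over the simulated space."""
            eta = coef["intercept"] + sum(coef[k] * v for k, v in landcover.items()
                                          if k in coef and k != "intercept")
            return float(np.exp(eta))  # relative abundance scale

        baseline = {"grassland": 0.2, "cropland": 0.7, "woodland": 0.1}
        scenario = {"grassland": 0.4, "cropland": 0.5, "woodland": 0.1}
        print(relative_suitability(scenario) / relative_suitability(baseline))
        # > 1 means the manipulated landscape is predicted to improve habitat.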

  2. Translating statistical species-habitat models to interactive decision support tools.

    PubMed

    Wszola, Lyndsie S; Simonsen, Victoria L; Stuber, Erica F; Gillespie, Caitlyn R; Messinger, Lindsey N; Decker, Karie L; Lusk, Jeffrey J; Jorgensen, Christopher F; Bishop, Andrew A; Fontaine, Joseph J

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  3. Translating statistical species-habitat models to interactive decision support tools

    PubMed Central

    Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences. PMID:29236707

  4. A physically based analytical spatial air temperature and humidity model

    Treesearch

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2013-01-01

    Spatial variation of urban surface air temperature and humidity influences human thermal comfort, the settling rate of atmospheric pollutants, and plant physiology and growth. Given the lack of observations, we developed a Physically based Analytical Spatial Air Temperature and Humidity (PASATH) model. The PASATH model calculates spatial solar radiation and heat...

  5. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used below the three components mentioned above. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from requiring rasters to fit in memory. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research that is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and the Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools can be defined, used, and studied as extensions with specific features. Linking with external libraries is possible using the Perl foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.
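
    GDAL's role as the data-access layer of such a stack is easy to illustrate. The short sketch below uses GDAL's Python bindings rather than Perl (Geoinformatica itself is Perl-based; the file name is hypothetical) to open a raster and pull basic terrain statistics.

        from osgeo import gdal
        import numpy as np

        gdal.UseExceptions()

        # Hypothetical digital elevation model.
        ds = gdal.Open("dem.tif")
        band = ds.GetRasterBand(1)
        elev = band.ReadAsArray().astype(float)

        # Mask nodata before computing statistics.
        nodata = band.GetNoDataValue()
        if nodata is not None:
            elev[elev == nodata] = np.nan

        print("size:", ds.RasterXSize, "x", ds.RasterYSize)
        print("geotransform:", ds.GetGeoTransform())
        print("elevation min/mean/max:",
              np.nanmin(elev), np.nanmean(elev), np.nanmax(elev))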

  6. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  7. Analytic and numeric Green's functions for a two-dimensional electron gas in an orthogonal magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cresti, Alessandro; Grosso, Giuseppe; Parravicini, Giuseppe Pastori

    2006-05-15

    We have derived closed analytic expressions for the Green's function of an electron in a two-dimensional electron gas threaded by a uniform perpendicular magnetic field, also in the presence of a uniform electric field and of a parabolic spatial confinement. A workable and powerful numerical procedure for the calculation of the Green's functions for a large infinitely extended quantum wire is considered exploiting a lattice model for the wire, the tight-binding representation for the corresponding matrix Green's function, and the Peierls phase factor in the Hamiltonian hopping matrix element to account for the magnetic field. The numerical evaluation of the Green's function has been performed by means of the decimation-renormalization method, and quite satisfactorily compared with the analytic results worked out in this paper. As an example of the versatility of the numerical and analytic tools here presented, the peculiar semilocal character of the magnetic Green's function is studied in detail because of its basic importance in determining magneto-transport properties in mesoscopic systems.
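
    The tight-binding ingredients named above, a Peierls phase on the hopping matrix elements and a retarded Green's function, can be sketched compactly. The toy below is hypothetical (an invented square-lattice strip with made-up parameters, inverting the Hamiltonian directly rather than using decimation-renormalization) and only shows the construction.

        import numpy as np

        def hamiltonian(nx, ny, t=1.0, flux_per_plaquette=0.05):
            """Square-lattice tight-binding H with a Peierls phase (Landau gauge)."""
            n = nx * ny
            H = np.zeros((n, n), dtype=complex)
            idx = lambda x, y: x * ny + y
            for x in range(nx):
                for y in range(ny):
                    if y + 1 < ny:                      # vertical hopping, no phase
                        H[idx(x, y), idx(x, y + 1)] = -t
                    if x + 1 < nx:                      # horizontal hopping carries phase
                        phase = np.exp(2j * np.pi * flux_per_plaquette * y)
                        H[idx(x, y), idx(x + 1, y)] = -t * phase
            return H + H.conj().T

        H = hamiltonian(12, 8)
        E, eta = 0.5, 1e-3
        # Retarded Green's function by direct inversion (fine at this toy size).
        G = np.linalg.inv((E + 1j * eta) * np.eye(H.shape[0]) - H)
        print("local density of states at site 0:", -G[0, 0].imag / np.pi)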

  8. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping follows multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east, at Monika and Sablette beaches. This technical approach relies on the efficiency of the GIS tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
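
    The weighting step at the heart of (F)AHP can be illustrated with the crisp variant: criteria weights come from the principal eigenvector of a pairwise comparison matrix. The sketch below is a generic, hypothetical example (the comparison values are invented, and the fuzzy extension used in the paper is omitted for brevity).

        import numpy as np

        # Hypothetical pairwise comparison matrix for three criteria, e.g.
        # sea level rise vs. wave height vs. coastal erosion (Saaty 1-9 scale).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                      # normalized criteria weights

        # Consistency ratio (random index 0.58 for a 3x3 matrix).
        ci = (eigvals[k].real - len(A)) / (len(A) - 1)
        print("weights:", np.round(w, 3), "CR:", round(ci / 0.58, 3))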

  9. Spatial problem-solving strategies of middle school students: Wayfinding with geographic information systems

    NASA Astrophysics Data System (ADS)

    Wigglesworth, John C.

    2000-06-01

    Geographic Information Systems (GIS) is a powerful computer software package that emphasizes the use of maps and the management of spatially referenced environmental data archived in a system's database. Professional applications of GIS have been in place since the 1980s, but only recently has GIS gained significant attention in the K-12 classroom. Students using GIS are able to manipulate and query data in order to solve all manner of spatial problems. Very few studies have examined how this technological innovation can support classroom learning. In particular, there has been little research on how experience in using the software correlates with a child's spatial cognition and his/her ability to understand spatial relationships. This study investigates the strategies used by middle school students to solve a wayfinding (route-finding) problem using the ArcView GIS software. The research design combined an individual background questionnaire, results from the Group Assessment of Logical Thinking (GALT) test, and analysis of reflective think-aloud sessions to define the characteristics of the strategies students used to solve this particular class of spatial problem. Three distinct spatial problem-solving strategies were identified. Visual/Concrete Wayfinders used a highly visual strategy; Logical/Abstract Wayfinders used GIS software tools to apply a more analytical and systematic approach; Transitional Wayfinders used an approach that showed evidence of shifting from a visual strategy to a more analytical one. The triangulation of data sources indicates that this progression of wayfinding strategy can be correlated both to Piagetian stages of logical thought and to experience with the use of maps. These findings suggest that GIS teachers must be aware that their students' performance will lie on a continuum based on cognitive development, spatial ability, and prior experience with maps. To be most effective, GIS teaching strategies and curriculum development should also represent a progression that correlates to the learners' current skills and experience.

  10. Review of spectral imaging technology in biomedical engineering: achievements and challenges.

    PubMed

    Li, Qingli; He, Xiaofu; Wang, Yiting; Liu, Hongying; Xu, Dongrong; Guo, Fangmin

    2013-10-01

    Spectral imaging is a technology that integrates conventional imaging and spectroscopy to get both spatial and spectral information from an object. Although this technology was originally developed for remote sensing, it has been extended to the biomedical engineering field as a powerful analytical tool for biological and biomedical research. This review introduces the basics of spectral imaging, imaging methods, current equipment, and recent advances in biomedical applications. The performance and analytical capabilities of spectral imaging systems for biological and biomedical imaging are discussed. In particular, the current achievements and limitations of this technology in biomedical engineering are presented. The benefits and development trends of biomedical spectral imaging are highlighted to provide the reader with an insight into the current technological advances and its potential for biomedical research.

  11. MRI of human hair.

    PubMed

    Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael

    2009-06-01

    Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing, numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant-time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
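
    A T2* value like the one quoted above is typically obtained from a mono-exponential fit of signal decay across echo times. The sketch below is a generic, hedged illustration on synthetic data (not the study's measurements), using a log-linear least-squares fit.

        import numpy as np

        # Synthetic multi-echo signal S(TE) = S0 * exp(-TE / T2*) with noise.
        rng = np.random.default_rng(2)
        t2star_true, s0 = 2.6, 1.0                 # ms, arbitrary units
        te = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # echo times in ms
        signal = s0 * np.exp(-te / t2star_true) * (1 + 0.01 * rng.standard_normal(te.size))

        # Log-linear least squares: ln S = ln S0 - TE / T2*.
        slope, intercept = np.polyfit(te, np.log(signal), 1)
        print("estimated T2* (ms):", -1.0 / slope)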

  12. CIFOG: Cosmological Ionization Fields frOm Galaxies

    NASA Astrophysics Data System (ADS)

    Hutter, Anne

    2018-03-01

    CIFOG is a versatile MPI-parallelised semi-numerical tool to perform simulations of the Epoch of Reionization. From a set of evolving cosmological gas density and ionizing emissivity fields, it computes the time- and spatially-dependent ionization of neutral hydrogen (HI), neutral helium (HeI), and singly ionized helium (HeII) in the intergalactic medium (IGM). The code accounts for HII, HeII, and HeIII recombinations, and provides different descriptions for the photoionization rate that are used to calculate the residual HI fraction in ionized regions. This tool has been designed to be coupled to semi-analytic galaxy formation models or hydrodynamical simulations. The modular design of the code allows the user to easily introduce new descriptions for recombinations and the photoionization rate.

  13. Quantitative characterization of edge enhancement in phase contrast x-ray imaging.

    PubMed

    Monnin, P; Bulling, S; Hoszowska, J; Valley, J F; Meuli, R; Verdun, F R

    2004-06-01

    The aim of this study was to model the edge enhancement effect in in-line holography phase contrast imaging. A simple analytical approach was used to quantify refraction and interference contrasts in terms of beam energy and imaging geometry. The model was applied to predict the peak intensity and frequency of the edge enhancement for images of cylindrical fibers. The calculations were compared with measurements, and the relationship between the spatial resolution of the detector and the amplitude of the phase contrast signal was investigated. Calculations using the analytical model were in good agreement with experimental results for nylon, aluminum and copper wires of 50 to 240 µm diameter, and with numerical simulations based on Fresnel-Kirchhoff theory. A relationship between the defocusing distance and the pixel size of the image detector was established. This analytical model is a useful tool for optimizing imaging parameters in phase contrast in-line holography, including defocusing distance, detector resolution and beam energy.

  14. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective for the user/reader but also offers exhaustive information on the evaluated procedures.

  15. Facilitating participatory multilevel decision-making by using interactive mental maps.

    PubMed

    Pfeiffer, Constanze; Glaser, Stephanie; Vencatesan, Jayshree; Schliermann-Kraus, Elke; Drescher, Axel; Glaser, Rüdiger

    2008-11-01

    Participation of citizens in political, economic or social decisions is increasingly recognized as a precondition for fostering sustainable development processes. Since spatial information is often important during planning and decision making, participatory mapping is gaining popularity. However, little attention has been paid to the fact that information must be presented in a useful way to reach city planners and policy makers. Above all, the importance of visualisation tools to support collaboration, analytical reasoning, problem solving and decision-making in analysis and planning processes has been underestimated. In this paper, we describe how an interactive mental map tool was developed in a highly interdisciplinary disaster management project in Chennai, India. We moved from a hand-drawn mental maps approach to an interactive mental map tool. This was achieved by merging socio-economic and geospatial data on infrastructure, local perceptions, and coping and adaptation strategies with remote sensing data and modern map-making technology. This newly developed interactive mapping tool allowed for insights into different locally constructed realities and facilitated the communication of results to the wider public and the respective policy makers. It proved to be useful in visualising information and promoting participatory decision-making processes. We argue that the tool also holds potential for health research projects. The interactive mental map can be used to spatially and temporally assess key health themes such as the availability of, and accessibility to, existing health care services, breeding sites of disease vectors, collection and storage of water, waste disposal, and the location of public toilets or defecation sites.

  16. Quantum entanglement in time

    NASA Astrophysics Data System (ADS)

    Nowakowski, Marcin

    2017-05-01

    In this paper we present a concept of quantum entanglement in time in the context of entangled consistent histories. These considerations are supported by a presentation of the necessary tools, closely related to those acting on a space of spatial multipartite quantum states. We show that, in analogy to the monogamy of quantum entanglement in space, quantum entanglement in time is also endowed with this property for a particular history. Based on these observations, we discuss further bounding of temporal correlations and derive analytically the Tsirelson bound implied by entangled histories for the Leggett-Garg inequalities.

  17. Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS

    NASA Astrophysics Data System (ADS)

    Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas

    2014-05-01

    Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can best be described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('the total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields such as ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes, and well-established systems can be analyzed with improved techniques, especially using multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in the case of S, isotope ratio measurements at high mass resolution can be achieved at much lower S concentrations with ICP-MS than with IRMS, while still keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS so far, with Ca, Mg, Cd, Li, Hg, Si, Ge and B (besides Sr, Pb and U) being the most prominent, considerably pushing the limits of plasma-based mass spectrometry, also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility of achieving isotopic information at high spatial (µm-range) and temporal resolution (in the case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected examples of combining isotopic systems for the study of ecosystem processes on different spatial scales will underpin the great opportunities substantiated by the field of analytical ecogeochemistry. Moreover, recent developments in plasma mass spectrometry and the application of new isotopic systems require sound metrological approaches in order to prevent scientific conclusions being drawn from analytical artifacts.
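
    A core data-reduction step in MC-ICP-MS isotope-ratio work is instrumental mass-bias correction, commonly via the exponential law with internal normalization. The sketch below is a generic, hypothetical example for Sr (the measured ratios are invented; 8.375209 is the conventional 88Sr/86Sr normalization value).

        import numpy as np

        # Exponential-law mass bias correction with internal normalization.
        m86, m87, m88 = 85.9092607, 86.9088775, 87.9056122   # atomic masses
        r88_86_true = 8.375209                                # conventional value

        # Hypothetical measured ratios for one sample.
        r88_86_meas = 8.4512
        r87_86_meas = 0.71102

        # Fractionation factor from the invariant 88/86 pair.
        beta = np.log(r88_86_true / r88_86_meas) / np.log(m88 / m86)

        # Apply the same beta to correct the ratio of interest.
        r87_86_corr = r87_86_meas * (m87 / m86) ** beta
        print("corrected 87Sr/86Sr:", round(r87_86_corr, 6))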

  18. Active Optics: stress polishing of toric mirrors for the VLT SPHERE adaptive optics system.

    PubMed

    Hugot, Emmanuel; Ferrari, Marc; El Hadi, Kacem; Vola, Pascal; Gimenez, Jean Luc; Lemaitre, Gérard R; Rabou, Patrick; Dohlen, Kjetil; Puget, Pascal; Beuzit, Jean Luc; Hubin, Norbert

    2009-05-20

    The manufacturing of toric mirrors for the Very Large Telescope Spectro-Polarimetric High-Contrast Exoplanet Research instrument (SPHERE) is based on Active Optics and stress polishing. This figuring technique minimizes mid- and high-spatial-frequency errors on an aspherical surface by using spherical polishing with full-size tools. In order to reach the tight precision required, the manufacturing error budget is described so as to optimize each parameter. Analytical calculations based on elasticity theory and finite element analysis lead to the mechanical design of the Zerodur blank to be warped during the stress polishing phase. Results for the larger (366 mm diameter) toric mirror are evaluated by interferometry. We obtain, as expected, a toric surface within specification in the low, middle, and high spatial frequency ranges.

  19. Evaluating efficiency-equality tradeoffs for mobile source control strategies in an urban area

    PubMed Central

    Levy, Jonathan I.; Greco, Susan L.; Melly, Steven J.; Mukhi, Neha

    2013-01-01

    In environmental risk management, there is often interest both in maximizing public health benefits (efficiency) and in addressing inequality in the distribution of health outcomes. However, the two dimensions are not generally considered within a single analytical framework. In this study, we estimate both total population health benefits and changes in quantitative indicators of health inequality for a number of alternative spatial distributions of diesel particulate filter retrofits across half of an urban bus fleet in Boston, Massachusetts. We focus on the impact of emissions controls on primary fine particulate matter (PM2.5) emissions, modeling the effect on PM2.5 concentrations and premature mortality. Given spatial heterogeneity in baseline mortality rates, we apply the Atkinson index and other inequality indicators to quantify changes in the distribution of mortality risk. Across the different spatial distributions of control strategies, the public health benefits varied by more than a factor of two, related to factors such as mileage driven per day, population density near roadways, and baseline mortality rates in exposed populations. Changes in health inequality indicators varied across control strategies, with the subset of strategies that were optimal for both efficiency and equality generally robust across different parametric assumptions and inequality indicators. Our analysis demonstrates the viability of formal analytical approaches to jointly address both efficiency and equality in risk assessment, providing a tool for decision-makers who wish to consider both issues. PMID:18793281
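
    The Atkinson index used above has a closed form: for inequality-aversion parameter ε ≠ 1, A_ε = 1 − [ (1/N) Σ (y_i/ȳ)^{1−ε} ]^{1/(1−ε)}, with the geometric-mean form at ε = 1. A minimal, generic sketch on synthetic risk values (not the study's data):

        import numpy as np

        def atkinson(y, eps=0.75):
            """Atkinson inequality index of a nonnegative distribution y."""
            y = np.asarray(y, dtype=float)
            mu = y.mean()
            if eps == 1.0:
                return 1.0 - np.exp(np.mean(np.log(y))) / mu
            return 1.0 - (np.mean((y / mu) ** (1.0 - eps))) ** (1.0 / (1.0 - eps))

        # Synthetic mortality-risk distributions before/after a control strategy.
        baseline = np.array([1.0, 1.2, 3.5, 0.8, 2.9])   # risk per 10,000, invented
        controlled = baseline * np.array([1.0, 1.0, 0.6, 1.0, 0.7])
        print("Atkinson before:", round(atkinson(baseline), 4))
        print("Atkinson after:", round(atkinson(controlled), 4))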

  20. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  1. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  2. A web-based multicriteria evaluation of spatial trade-offs between environmental and economic implications from hydraulic fracturing in a shale gas region in Ohio.

    PubMed

    Liu, X; Gorsevski, P V; Yacobucci, M M; Onasch, C M

    2016-06-01

    Planning of shale gas infrastructure and drilling sites for hydraulic fracturing has important spatial implications. The evaluation of conflicting and competing objectives requires an explicit consideration of multiple criteria as they have important environmental and economic implications. This study presents a web-based multicriteria spatial decision support system (SDSS) prototype with a flexible and user-friendly interface that could provide educational or decision-making capabilities with respect to hydraulic fracturing site selection in eastern Ohio. One of the main features of this SDSS is to emphasize potential trade-offs between important factors of environmental and economic ramifications from hydraulic fracturing activities using a weighted linear combination (WLC) method. In the prototype, the GIS-enabled analytical components allow spontaneous visualization of available alternatives on maps which provide value-added features for decision support processes and derivation of final decision maps. The SDSS prototype also facilitates nonexpert participation capabilities using a mapping module, decision-making tool, group decision module, and social media sharing tools. The logical flow of successively presented forms and standardized criteria maps is used to generate visualization of trade-off scenarios and alternative solutions tailored to individual user's preferences that are graphed for subsequent decision-making.
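
    The weighted linear combination (WLC) step at the core of such an SDSS is straightforward to sketch: standardize each criterion layer to a common scale, then take a weighted sum. The example below is a hypothetical NumPy illustration (the criteria names, weights, and random rasters are invented).

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical standardized criterion rasters on [0, 1]; 1 = more suitable.
        distance_to_streams = rng.random((100, 100))   # larger = farther = better
        pad_density = rng.random((100, 100))           # proximity to existing pads
        slope_penalty = rng.random((100, 100))

        criteria = np.stack([distance_to_streams, pad_density, slope_penalty])
        weights = np.array([0.5, 0.3, 0.2])            # must sum to 1 for WLC

        # Weighted linear combination: suitability = sum_i w_i * x_i.
        suitability = np.tensordot(weights, criteria, axes=1)

        # Rank cells and report the top candidate location.
        best = np.unravel_index(np.argmax(suitability), suitability.shape)
        print("best cell:", best, "score:", round(float(suitability[best]), 3))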

  3. The Analytical Limits of Modeling Short Diffusion Timescales

    NASA Astrophysics Data System (ADS)

    Bradshaw, R. W.; Kent, A. J.

    2016-12-01

    Chemical and isotopic zoning in minerals is widely used to constrain the timescales of magmatic processes, such as magma mixing and crystal residence, via diffusion modeling. Forward modeling of diffusion relies on fitting diffusion profiles to measured compositional gradients. However, an individual measurement is essentially an average composition for a segment of the gradient defined by the spatial resolution of the analysis. Thus there is the potential for the analytical spatial resolution to limit the timescales that can be determined for an element of given diffusivity, particularly where the scale of the gradient approaches that of the measurement. Here we use a probabilistic modeling approach to investigate the effect of analytical spatial resolution on estimated timescales from diffusion modeling. Our method investigates how accurately the age of a synthetic diffusion profile can be obtained by modeling an "unknown" profile derived from discrete sampling of the synthetic compositional gradient at a given spatial resolution. We also include the effects of analytical uncertainty and the position of measurements relative to the diffusion gradient. We apply this method to the spatial resolutions of common microanalytical techniques (LA-ICP-MS, SIMS, EMP, NanoSIMS). Our results confirm that for a given diffusivity, higher spatial resolution gives access to shorter timescales, and that each analytical spacing has a minimum timescale, below which it overestimates the timescale. For example, for Ba diffusion in plagioclase at 750 °C, timescales are accurate (within 20%) above 10, 100, 2,600, and 71,000 years at 0.3, 1, 5, and 25 µm spatial resolution, respectively. For Sr diffusion in plagioclase at 750 °C, timescales are accurate above 0.02, 0.2, 4, and 120 years at the same spatial resolutions. Our results highlight the importance of selecting appropriate analytical techniques to estimate accurate diffusion-based timescales.
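
    The resolution effect described here can be reproduced with a small numerical experiment: box-average a synthetic error-function profile over the analytical spot size, then naively fit an ideal profile to the degraded data. The sketch below is generic and hypothetical (invented diffusivity and spot size; not the authors' probabilistic model) and shows the fitted timescale inflating well above the true value.

        import numpy as np
        from scipy.special import erf

        D = 1.0                              # diffusivity, um^2/yr (invented value)
        x = np.linspace(-50.0, 50.0, 2001)   # distance along traverse, um (dx = 0.05)

        def profile(t):
            # Step-contact diffusion profile between two end-member compositions.
            return 0.5 * (1.0 + erf(x / (2.0 * np.sqrt(D * t))))

        def degrade(c, spot_um=25.0):
            # Box-average the profile over analytical spots of the given width.
            n = int(spot_um / (x[1] - x[0]))
            return np.convolve(c, np.ones(n) / n, mode="same")

        t_true = 5.0
        measured = degrade(profile(t_true))

        # Naive fit: match an ideal (un-degraded) profile to the degraded data.
        core = slice(500, 1501)              # avoid convolution edge effects
        times = np.logspace(-1, 4, 400)
        misfit = [np.sum((profile(t)[core] - measured[core]) ** 2) for t in times]
        print("true t (yr):", t_true, "| best-fit t (yr):",
              round(float(times[int(np.argmin(misfit))]), 1))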

  4. Effects of Using Dynamic Mathematics Software on Preservice Mathematics Teachers' Spatial Visualization Skills: The Case of Spatial Analytic Geometry

    ERIC Educational Resources Information Center

    Kösa, Temel

    2016-01-01

    The purpose of this study was to investigate the effects of using dynamic geometry software on preservice mathematics teachers' spatial visualization skills and to determine whether spatial visualization skills can be a predictor of success in learning analytic geometry of space. The study used a quasi-experimental design with a control group.…

  5. qSR: a quantitative super-resolution analysis tool reveals the cell-cycle dependent organization of RNA Polymerase I in live human cells.

    PubMed

    Andrews, J O; Conway, W; Cho, W -K; Narayanan, A; Spille, J -H; Jayanth, N; Inoue, T; Mullen, S; Thaler, J; Cissé, I I

    2018-05-09

    We present qSR, an analytical tool for the quantitative analysis of single-molecule-based super-resolution data. The software is created as an open-source platform integrating multiple algorithms for rigorous spatial and temporal characterization of protein clusters in super-resolution data from living cells. First, we illustrate qSR using live-cell data of RNA Polymerase II (Pol II) as an example of highly dynamic sub-diffractive clusters. Then we utilize qSR to investigate the organization and dynamics of endogenous RNA Polymerase I (Pol I) in live human cells throughout the cell cycle. Our analysis reveals a previously uncharacterized transient clustering of Pol I. Both stable and transient populations of Pol I clusters co-exist in individual living cells, and their relative fractions vary during the cell cycle in a manner correlating with global gene expression. Thus, qSR serves to facilitate the study of protein organization and dynamics with very high spatial and temporal resolution directly in live cells.
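
    Cluster identification in pointillist super-resolution data is commonly density-based. As a hedged stand-in for qSR's algorithms (the software integrates several methods; this is not its actual code), the sketch below clusters synthetic 2D localizations with DBSCAN from scikit-learn, with all parameters invented.

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(4)

        # Synthetic localizations: two tight clusters plus uniform background (nm).
        cluster_a = rng.normal([500, 500], 30, size=(200, 2))
        cluster_b = rng.normal([1500, 900], 30, size=(150, 2))
        background = rng.uniform(0, 2000, size=(100, 2))
        points = np.vstack([cluster_a, cluster_b, background])

        # eps ~ localization precision scale; min_samples sets cluster density.
        labels = DBSCAN(eps=60.0, min_samples=10).fit_predict(points)

        n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
        print("clusters found:", n_clusters)
        for k in range(n_clusters):
            members = points[labels == k]
            print(f"cluster {k}: {len(members)} localizations, "
                  f"centroid {members.mean(axis=0).round(1)}")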

  6. Relationships between species feeding traits and environmental conditions in fish communities: a three-matrix approach.

    PubMed

    Brind'Amour, Anik; Boisclair, Daniel; Dray, Stéphane; Legendre, Pierre

    2011-03-01

    Understanding the relationships between species' biological traits and the environment is crucial to predicting the effect of habitat perturbations on fish communities. It is also an essential step in the assessment of functional diversity. Using two complementary three-matrix approaches (fourth-corner and RLQ analyses), we tested the hypothesis that feeding-oriented traits determine the spatial distributions of littoral fish species by assessing the relationship between fish spatial distributions, fish species traits, and habitat characteristics in two Laurentian Shield lakes. Significant associations between the feeding-oriented traits and the environmental characteristics suggested that fish communities in small lakes (displaying low species richness) can be spatially structured. Three groups of traits, mainly categorized by the species' spatial and temporal feeding activity, were identified. The water column may be divided into two sections, each corresponding to a group of traits related to the vertical distribution of the prey coupled with the position of the mouth. Lake areas of low structural complexity were inhabited by functional assemblages dominated by surface feeders, while structurally more complex areas were occupied by mid-water and benthic feeders. A third group, referring to the time of feeding activity, was also observed. Our work could serve as a guideline study for evaluating species-trait × environment associations at multiple spatial scales. Our results indicate that three-matrix statistical approaches are powerful tools for studying such relationships. These recent statistical approaches open up new research directions, such as the study of spatially based biological functions in lakes. They also provide new analytical tools for determining, for example, the potential size of freshwater protected areas.

  7. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

    Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to it the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; and iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.

  8. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
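
    The payoff of analytic derivatives over finite differencing, which the entry reports as a significant speed advantage, is easy to demonstrate in miniature. The sketch below is a generic, hypothetical example (not the CEA/OpenMDAO code): it compares an analytic gradient with forward finite differences for a smooth scalar function, where each finite-difference component costs one extra function evaluation.

        import numpy as np

        def f(x):
            # Smooth test function standing in for a thermodynamic residual.
            return np.sum(np.sin(x) * np.exp(0.1 * x))

        def grad_analytic(x):
            # d/dx_i [sin(x_i) exp(0.1 x_i)] = (cos(x_i) + 0.1 sin(x_i)) exp(0.1 x_i)
            return (np.cos(x) + 0.1 * np.sin(x)) * np.exp(0.1 * x)

        def grad_fd(x, h=1e-6):
            g = np.zeros_like(x)
            for i in range(x.size):          # one extra evaluation per input: costly
                xp = x.copy()
                xp[i] += h
                g[i] = (f(xp) - f(x)) / h
            return g

        x = np.linspace(0.0, 2.0, 5)
        err = np.abs(grad_fd(x) - grad_analytic(x)).max()
        print("max |FD - analytic| =", err)   # ~1e-7: truncation plus round-off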

  9. Developing a Web-Based PPGIS as an Environmental Reporting Service

    NASA Astrophysics Data System (ADS)

    Ranjbar Nooshery, N.; Taleai, M.; Kazemi, R.; Ebadi, K.

    2017-09-01

    Today municipalities are searching for new tools to empower residents to change the future of their own areas by increasing their participation at different levels of urban planning. These tools should involve the community in the planning process through participatory approaches instead of traditional top-down planning models, and should help municipalities obtain proper insight into the major problems of urban neighborhoods from the residents' point of view. To this end, public participation GIS (PPGIS), which enables citizens to record and follow up on their perceptions and spatial knowledge of the city's problems in the form of maps, has been introduced. In this research, a tool entitled CAER (Collecting & Analyzing of Environmental Reports) is developed. In the first step, a software framework based on a Web-GIS tool, called EPGIS (Environmental Participatory GIS), has been designed to support public participation in reporting urban environmental problems and to facilitate data flow between citizens and the municipality. A web-based cartography tool was employed for geo-visualization and dissemination of map-based reports. In the second step of CAER, a subsystem is developed based on SOLAP (Spatial On-Line Analytical Processing), as a data mining tool to elicit the local knowledge facilitating bottom-up urban planning practices and to help urban managers find hidden relations among the recorded reports. The system was implemented in a case study area in Boston, Massachusetts, and its usability was evaluated. CAER should be considered a bottom-up planning tool that collects people's problems and views about their neighborhood and transmits them to city officials. It also helps urban planners find solutions for better management from the citizens' viewpoint and gives them the chance to develop plans that satisfy the citizens of the neighborhoods.
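
    A citizen report in such a system is naturally represented as a point feature with thematic attributes. The snippet below sketches one plausible GeoJSON encoding; the field names are invented for illustration, not taken from CAER.

```python
import json

# Hypothetical map-based environmental report as a GeoJSON Feature
report = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-71.0589, 42.3601]},  # lon, lat
    "properties": {
        "category": "illegal_dumping",   # thematic class used by SOLAP roll-ups
        "severity": 3,
        "reported_at": "2017-06-01T14:30:00Z",
        "description": "Construction debris left on the sidewalk.",
    },
}
print(json.dumps(report, indent=2))
```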

  10. Generalized reproduction numbers and the prediction of patterns in waterborne disease

    PubMed Central

    Gatto, Marino; Mari, Lorenzo; Bertuzzo, Enrico; Casagrandi, Renato; Righetto, Lorenzo; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2012-01-01

    Understanding, predicting, and controlling outbreaks of waterborne diseases are crucial goals of public health policies, but pose challenging problems because infection patterns are influenced by spatial structure and temporal asynchrony. Although explicit spatial modeling is made possible by widespread data mapping of hydrology, transportation infrastructure, population distribution, and sanitation, the precise condition under which a waterborne disease epidemic can start in a spatially explicit setting is still lacking. Here we show that the requirement that all the local reproduction numbers R0 be larger than unity is neither necessary nor sufficient for outbreaks to occur when local settlements are connected by networks of primary and secondary infection mechanisms. To determine onset conditions, we derive general analytical expressions for a reproduction matrix G0, explicitly accounting for spatial distributions of human settlements and pathogen transmission via hydrological and human mobility networks. At disease onset, a generalized reproduction number Λ0 (the dominant eigenvalue of G0) must be larger than unity. We also show that geographical outbreak patterns in complex environments are linked to the dominant eigenvector and to spectral properties of G0. Tests against data and computations for the 2010 Haiti and 2000 KwaZulu-Natal cholera outbreaks, as well as against computations for metapopulation networks, demonstrate that eigenvectors of G0 provide a synthetic and effective tool for predicting the disease course in space and time. Networked connectivity models, describing the interplay between hydrology, epidemiology, and social behavior sustaining human mobility, thus prove to be key tools for emergency management of waterborne infections. PMID:23150538
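
    The onset criterion reduces to a Perron eigenvalue computation once a reproduction matrix is assembled. The toy example below, with an invented three-settlement coupling matrix (a stand-in, not the paper's full derivation), shows that the generalized reproduction number Λ0 can exceed unity even when every local R0 is below it, because network-mediated secondary transmission adds pathways.

```python
import numpy as np

# Local reproduction numbers for three hypothetical settlements, all below 1
R0_local = np.array([0.8, 0.9, 0.95])

# Invented nonnegative coupling (hydrological + mobility) between settlements
C = np.array([[1.0, 0.3, 0.1],
              [0.2, 1.0, 0.3],
              [0.1, 0.2, 1.0]])

G0 = C * R0_local[:, None]   # toy reproduction matrix

eigvals, eigvecs = np.linalg.eig(G0)
k = np.argmax(eigvals.real)            # Perron root of a nonnegative matrix is real
Lambda0 = eigvals[k].real
v = np.abs(eigvecs[:, k].real)         # dominant eigenvector: spatial outbreak pattern
print(f"Lambda0 = {Lambda0:.3f} ->", "outbreak" if Lambda0 > 1 else "no outbreak")
print("relative burden by settlement:", (v / v.sum()).round(3))
```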

  11. Hyperspectral Mineral Mapping in Support of Geothermal Exploration: Examples from Long Valley Caldera, CA and Dixie Valley, NV, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martini, B; Silver, E; Pickles, W

    2004-03-25

    Growing interest and exploration dollars within the geothermal sector have paved the way for increasingly sophisticated suites of geophysical and geochemical tools and methodologies. The efforts to characterize and assess known geothermal fields and find new, previously unknown resources have been aided by the advent of higher-spatial-resolution airborne geophysics (e.g., aeromagnetics), the development of new seismic processing techniques, and the genesis of modern multi-dimensional fluid flow and structural modeling algorithms, to name a few. One of the newest techniques on the scene is hyperspectral imaging. Really an optical analytical geochemical tool, hyperspectral imagers (or imaging spectrometers, as they are also called) are generally flown at medium to high altitudes aboard mid-sized aircraft, much in the same way more familiar geophysical surveys are flown. The hyperspectral instrument records a continuous spatial record of the earth's surface while measuring a continuous spectral record of reflected sunlight or emitted thermal radiation. This high-fidelity, uninterrupted spatial and spectral record allows for accurate material distribution mapping and quantitative identification at the pixel to sub-pixel level. In volcanic/geothermal regions, this capability translates to synoptic, high-spatial-resolution, large-area mineral maps generated at time scales conducive both to the faster pace of exploration and drilling managers and to the slower pace of geologists and other researchers trying to understand the geothermal system over the long run.

  12. Hyperspectral Mineral Mapping in Support of Geothermal Exploration: Examples from Long Valley Caldera, CA and Dixie Valley, NV, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pickles, W L; Martini, B A; Silver, E A

    2004-03-03

    Growing interest and exploration dollars within the geothermal sector have paved the way for increasingly sophisticated suites of geophysical and geochemical tools and methodologies. The efforts to characterize and assess known geothermal fields and find new, previously unknown resources have been aided by the advent of higher-spatial-resolution airborne geophysics (e.g., aeromagnetics), the development of new seismic processing techniques, and the genesis of modern multi-dimensional fluid flow and structural modeling algorithms, to name a few. One of the newest techniques on the scene is hyperspectral imaging. Really an optical analytical geochemical tool, hyperspectral imagers (or imaging spectrometers, as they are also called) are generally flown at medium to high altitudes aboard mid-sized aircraft, much in the same way more familiar geophysical surveys are flown. The hyperspectral instrument records a continuous spatial record of the earth's surface while measuring a continuous spectral record of reflected sunlight or emitted thermal radiation. This high-fidelity, uninterrupted spatial and spectral record allows for accurate material distribution mapping and quantitative identification at the pixel to sub-pixel level. In volcanic/geothermal regions, this capability translates to synoptic, high-spatial-resolution, large-area mineral maps generated at time scales conducive both to the faster pace of exploration and drilling managers and to the slower pace of geologists and other researchers trying to understand the geothermal system over the long run.

  13. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  14. KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery

    NASA Astrophysics Data System (ADS)

    Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan

    2013-05-01

    KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multiscale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput wide-format video, also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results, and apply geospatial visual analytic tools to the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
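
    The dual-cache idea, a spatial tile cache nested inside a temporal frame cache, can be sketched with ordinary LRU dictionaries. The keying scheme and eviction sizes below are hypothetical, not KOLAM's actual internals.

```python
from collections import OrderedDict

class LRUCache(OrderedDict):
    """Minimal LRU cache: evicts the least-recently-used entry past capacity."""
    def __init__(self, capacity):
        super().__init__()
        self.capacity = capacity

    def get(self, key, loader):
        if key in self:
            self.move_to_end(key)
            return self[key]
        value = loader(key)            # e.g. decode a tile from disk
        self[key] = value
        if len(self) > self.capacity:
            self.popitem(last=False)   # drop the oldest entry
        return value

def load_tile(key):
    frame, level, col, row = key
    return f"pixels({frame},{level},{col},{row})"   # stand-in for real tile I/O

# Temporal cache of per-frame spatial tile caches (the "dual cache")
frame_caches = LRUCache(capacity=8)   # keep a window of recent video frames

def get_tile(frame, level, col, row):
    tiles = frame_caches.get(frame, lambda f: LRUCache(capacity=256))
    return tiles.get((frame, level, col, row), load_tile)

print(get_tile(frame=42, level=3, col=10, row=7))
```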

  15. Near-Infrared Spatially Resolved Spectroscopy for Tablet Quality Determination.

    PubMed

    Igne, Benoît; Talwar, Sameer; Feng, Hanzhou; Drennen, James K; Anderson, Carl A

    2015-12-01

    Near-infrared (NIR) spectroscopy has become a well-established tool for the characterization of solid oral dosage form manufacturing processes and finished products. In this work, the utility of a traditional single-point NIR measurement was compared with that of a spatially resolved spectroscopic (SRS) measurement for the determination of tablet assay. Experimental designs were used to create samples that allowed calibration models to be developed and tested on both instruments. Samples possessing a poor distribution of ingredients (highly heterogeneous) were prepared by under-blending constituents prior to compaction to compare the analytical capabilities of the two NIR methods. The results indicate that SRS can provide spatial information that is usually obtainable only through imaging experiments for the determination of local heterogeneity and detection of abnormal tablets that would not be detected with single-point spectroscopy, thus complementing traditional NIR measurement systems for in-line, real-time tablet analysis. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
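
    Calibration models for NIR tablet assay are commonly built with partial least squares regression. The sketch below shows that generic chemometric workflow with scikit-learn on synthetic spectra; it is not the instrument-specific modeling used in the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_tablets, n_wavelengths = 80, 200

# Synthetic spectra: API concentration drives a broad spectral feature plus noise
concentration = rng.uniform(45, 55, n_tablets)   # % of label claim
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 90) / 15) ** 2)
X = concentration[:, None] * peak[None, :] \
    + rng.normal(0, 0.5, (n_tablets, n_wavelengths))

pls = PLSRegression(n_components=3)
r2 = cross_val_score(pls, X, concentration, cv=5, scoring="r2")
print("cross-validated R^2:", r2.round(3))
```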

  16. Girls' Spatial Skills and Arithmetic Strategies in First Grade as Predictors of Fifth-Grade Analytical Math Reasoning

    ERIC Educational Resources Information Center

    Casey, Beth M.; Lombardi, Caitlin McPherran; Pollock, Amanda; Fineman, Bonnie; Pezaris, Elizabeth

    2017-01-01

    This study investigated longitudinal pathways leading from early spatial skills in first-grade girls to their fifth-grade analytical math reasoning abilities (N = 138). First-grade assessments included spatial skills, verbal skills, addition/subtraction skills, and frequency of choice of a decomposition or retrieval strategy on the…

  17. Integrative Spatial Data Analytics for Public Health Studies of New York State

    PubMed Central

    Chen, Xin; Wang, Fusheng

    2016-01-01

    Increased accessibility of health data made available by the government provides a unique opportunity for spatial analytics at much higher resolution to discover patterns of diseases and their correlation with spatial impact indicators. This paper demonstrates our vision of integrative spatial analytics for public health by linking the New York Cancer Mapping Dataset with datasets containing potential spatial impact indicators. We performed spatially based discovery of disease patterns and variations across New York State, and identified potential correlations between diseases and demographic, socio-economic and environmental indicators. Our methods were validated by three correlation studies: the correlation between stomach cancer and Asian race, the correlation between breast cancer and a highly educated population, and the correlation between lung cancer and air toxics. Our work will allow public health researchers, government officials and other practitioners to adequately identify, analyze, and monitor health problems at the community or neighborhood level for New York State. PMID:28269834

  18. What Do They Have in Common? Physical Drivers of Streamflow Spatial Correlation and Prediction of Flow Regimes at Ungauged Locations in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Betterle, A.; Schirmer, M.; Botter, G.

    2017-12-01

    Streamflow dynamics strongly influence anthropogenic activities and the ecological functions of riverine and riparian habitats. However, the widespread lack of direct discharge measurements often challenges the set-up of conscious and effective decision-making processes, including drought and flood protection, water resources management and river restoration practices. By characterizing the spatial correlation of daily streamflow time series at two arbitrary locations, this study provides a method to evaluate how spatially variable catchment-scale hydrological processes affect the resulting streamflow dynamics along and across river systems. In particular, streamflow spatial correlation is described analytically as a function of morphological, climatic and vegetation properties in the contributing catchments, building on a joint probabilistic description of flow dynamics at pairs of outlets. The approach enables an explicit linkage between similarities of flow dynamics and spatial patterns of hydrologically relevant features of climate and landscape. Therefore, the method is suited to exploring spatial patterns of streamflow dynamics across geomorphoclimatic gradients. In particular, we show how the streamflow correlation can be used at the continental scale to identify catchment pairs with similar hydrological dynamics, thereby providing a useful tool for the estimation of flow duration curves in poorly gauged areas.

  19. Selective sweeps in growing microbial colonies

    NASA Astrophysics Data System (ADS)

    Korolev, Kirill S.; Müller, Melanie J. I.; Karahan, Nilay; Murray, Andrew W.; Hallatschek, Oskar; Nelson, David R.

    2012-04-01

    Evolutionary experiments with microbes are a powerful tool to study mutations and natural selection. These experiments, however, are often limited to the well-mixed environments of a test tube or a chemostat. Since spatial organization can significantly affect evolutionary dynamics, the need is growing for evolutionary experiments in spatially structured environments. The surface of a Petri dish provides such an environment, but a more detailed understanding of microbial growth on Petri dishes is necessary to interpret such experiments. We formulate a simple deterministic reaction-diffusion model, which successfully predicts the spatial patterns created by two competing species during colony expansion. We also derive the shape of these patterns analytically without relying on microscopic details of the model. In particular, we find that the relative fitness of two microbial strains can be estimated from the logarithmic spirals created by selective sweeps. The theory is tested with strains of the budding yeast Saccharomyces cerevisiae for spatial competitions with different initial conditions and for a range of relative fitnesses. The reaction-diffusion model also connects the microscopic parameters like growth rates and diffusion constants with macroscopic spatial patterns and predicts the relationship between fitness in liquid cultures and on Petri dishes, which we confirmed experimentally. Spatial sector patterns therefore provide an alternative fitness assay to the commonly used liquid culture fitness assays.
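
    A deterministic two-strain competition model of the kind the abstract describes can be integrated in one spatial dimension in a few lines. The sketch below, with invented parameter values, shows a strain with selective advantage s enriching at an expanding front; it is a toy version, not the paper's calibrated model.

```python
import numpy as np

# Two competing strains u, v with logistic competition for shared resources:
# du/dt = D u'' + u (1 - u - v),  dv/dt = D v'' + (1 + s) v (1 - u - v)
D, s = 0.1, 0.1                    # diffusion constant, selective advantage of v
dx, dt, nx, steps = 0.5, 0.1, 400, 2500

u = np.zeros(nx); v = np.zeros(nx)
u[:10] = 0.5; v[:10] = 0.5         # mixed inoculum at the left edge

def laplacian(f):
    lap = np.zeros_like(f)
    lap[1:-1] = (f[2:] - 2 * f[1:-1] + f[:-2]) / dx**2
    return lap

for _ in range(steps):
    free = 1.0 - u - v             # unoccupied carrying capacity
    u += dt * (D * laplacian(u) + u * free)
    v += dt * (D * laplacian(v) + (1 + s) * v * free)

front = np.argmax(u + v < 0.01)    # position of the expansion front
print("v fraction just behind the front:", v[front - 20] / (u + v)[front - 20])
```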

  20. Imaging and Analytics: The changing face of Medical Imaging

    NASA Astrophysics Data System (ADS)

    Foo, Thomas

    There have been significant technological advances in imaging capability over the past 40 years. Medical imaging capabilities have developed rapidly, along with technology development in computational processing speed and miniaturization. With the move to all-digital systems, the number of images acquired in a routine clinical examination has increased dramatically, from under 50 images in the early days of CT and MRI to more than 500-1000 images today. The staggering number of images that are routinely acquired poses significant challenges for clinicians to interpret the data and to correctly identify the clinical problem. Although the time provided to render a clinical finding has not substantially changed, the amount of data available for interpretation has grown exponentially. In addition, image quality (spatial resolution) and information content (physiologically dependent image contrast) have also increased significantly with advances in medical imaging technology. On its current trajectory, medical imaging in the traditional sense is unsustainable. To assist in filtering and extracting the most relevant data elements from medical imaging, image analytics will have a much larger role. Automated image segmentation, generation of parametric image maps, and clinical decision support tools will be needed and developed apace to allow the clinician to manage, extract and utilize only the information that will help improve diagnostic accuracy and sensitivity. As medical imaging devices continue to improve in spatial resolution and functional and anatomical information content, image/data analytics will become more ubiquitous and integral to medical imaging capability.

  1. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  2. First GIS Analysis of Modern Stone Tools Used by Wild Chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    PubMed Central

    Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested against GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing active and passive pounders in lithic assemblages to be discriminated. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642

  3. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
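
    The notebook-to-app pattern the authors describe typically rests on interactive widgets. As a minimal sketch (not the authors' actual toolchain), ipywidgets.interact can wrap an existing pipeline step so collaborators adjust parameters without touching code:

```python
# Run inside a Jupyter Notebook
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact

rng = np.random.default_rng(0)
lab_values = rng.lognormal(mean=1.0, sigma=0.4, size=500)   # stand-in patient data

def pipeline_step(threshold=3.0):
    """Existing analysis step, now exposed as an interactive control."""
    flagged = lab_values[lab_values > threshold]
    plt.hist(flagged, bins=20)
    plt.title(f"{flagged.size} values above {threshold:.1f}")
    plt.show()

interact(pipeline_step, threshold=(1.0, 8.0, 0.25))   # renders a slider-driven app
```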

  4. BIOCHEMISTRY OF MOBILE ZINC AND NITRIC OXIDE REVEALED BY FLUORESCENT SENSORS

    PubMed Central

    Pluth, Michael D.; Tomat, Elisa; Lippard, Stephen J.

    2010-01-01

    Biologically mobile zinc and nitric oxide (NO) are two prominent examples of inorganic compounds involved in numerous signaling pathways in living systems. In the past decade, a synergy of regulation, signaling, and translocation of these two species has emerged in several areas of human physiology, providing additional incentive for developing adequate detection systems for Zn(II) ions and NO in biological specimens. Fluorescent probes for both of these bioinorganic analytes provide excellent tools for their detection, with high spatial and temporal resolution. We review the most widely used fluorescent sensors for biological zinc and nitric oxide, together with promising new developments and unmet needs of contemporary Zn(II) and NO biological imaging. The interplay between zinc and nitric oxide in the nervous, cardiovascular, and immune systems is highlighted to illustrate the contributions of selective fluorescent probes to the study of these two important bioinorganic analytes. PMID:21675918

  5. Analytical treatment of the deformation behavior of EUVL masks during electrostatic chucking

    NASA Astrophysics Data System (ADS)

    Brandstetter, Gerd; Govindjee, Sanjay

    2012-03-01

    A new analytical approach is presented to predict mask deformation during electrostatic chucking in next-generation extreme-ultraviolet lithography (EUVL). Given an arbitrary profile measurement of the mask and chuck non-flatness, this method has been developed as an alternative to time-consuming finite element simulations for overlay error correction algorithms. We consider the feature transfer of each harmonic component in the profile shapes via linear elasticity theory and demonstrate analytically how high spatial frequencies are filtered. The method is compared to presumably more accurate finite element simulations and has been tested successfully in an overlay error compensation experiment, where the residual error y-component could be reduced by a factor of 2. As a side outcome, the formulation provides a tool to estimate the critical pin size and pitch such that the distortion on the mask front side remains within given tolerances. We find for a numerical example that pin pitches of less than 5 mm will result in a mask pattern distortion of less than 1 nm if the chucking pressure is below 30 kPa.
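
    The harmonic-by-harmonic viewpoint is easy to prototype: decompose the measured non-flatness with an FFT, attenuate each spatial frequency by a transfer factor, and invert. The exponential transfer function below is a generic low-pass stand-in; the paper derives the actual factor from linear elasticity.

```python
import numpy as np

L = 0.152                 # mask width in meters (6-inch reticle, for scale)
n = 512
x = np.linspace(0, L, n, endpoint=False)

# Synthetic chuck/mask non-flatness: one long wave plus short-wave roughness
profile = 40e-9 * np.sin(2 * np.pi * x / L) + 5e-9 * np.sin(2 * np.pi * 40 * x / L)

k = 2 * np.pi * np.fft.rfftfreq(n, d=L / n)   # spatial angular frequencies
h_eff = 5e-3                                   # hypothetical decay length scale
H = np.exp(-k * h_eff)                         # stand-in transfer function

transferred = np.fft.irfft(H * np.fft.rfft(profile), n)
print("peak-to-valley in:  %.1f nm" % (1e9 * profile.ptp()))
print("peak-to-valley out: %.1f nm" % (1e9 * transferred.ptp()))
```

    With these numbers the 40-cycle component is suppressed by roughly four orders of magnitude while the single long wave passes almost unchanged, which is the qualitative filtering behavior the abstract describes.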

  6. Analytical treatment of the deformation behavior of extreme-ultraviolet-lithography masks during electrostatic chucking

    NASA Astrophysics Data System (ADS)

    Brandstetter, Gerd; Govindjee, Sanjay

    2012-10-01

    A new analytical approach is presented to predict mask deformation during electrostatic chucking in next-generation extreme-ultraviolet-lithography. Given an arbitrary profile measurement of the mask and chuck nonflatness, this method has been developed as an alternative to time-consuming finite element simulations for overlay error correction algorithms. We consider the feature transfer of each harmonic component in the profile shapes via linear elasticity theory and demonstrate analytically how high spatial frequencies are filtered. The method is compared to presumably more accurate finite element simulations and has been tested successfully in an overlay error compensation experiment, where the residual error y-component could be reduced by a factor of 2. As a side outcome, the formulation provides a tool to estimate the critical pin-size and -pitch such that the distortion on the mask front-side remains within given tolerances. We find for a numerical example that pin-pitches of less than 5 mm will result in a mask pattern distortion of less than 1 nm if the chucking pressure is below 30 kPa.

  7. Stochastic population dynamics in spatially extended predator-prey systems

    NASA Astrophysics Data System (ADS)

    Dobramysl, Ulrich; Mobilia, Mauro; Pleimling, Michel; Täuber, Uwe C.

    2018-02-01

    Spatially extended population dynamics models that incorporate demographic noise serve as case studies for the crucial role of fluctuations and correlations in biological systems. Numerical and analytic tools from non-equilibrium statistical physics capture the stochastic kinetics of these complex interacting many-particle systems beyond rate equation approximations. Including spatial structure and stochastic noise in models for predator-prey competition invalidates the neutral Lotka-Volterra population cycles. Stochastic models yield long-lived erratic oscillations stemming from a resonant amplification mechanism. Spatially extended predator-prey systems display noise-stabilized activity fronts that generate persistent correlations. Fluctuation-induced renormalizations of the oscillation parameters can be analyzed perturbatively via a Doi-Peliti field theory mapping of the master equation; related tools allow detailed characterization of extinction pathways. The critical steady-state and non-equilibrium relaxation dynamics at the predator extinction threshold are governed by the directed percolation universality class. Spatial predation rate variability results in more localized clusters, enhancing both competing species’ population densities. Affixing variable interaction rates to individual particles and allowing for trait inheritance subject to mutations induces fast evolutionary dynamics for the rate distributions. Stochastic spatial variants of three-species competition with ‘rock-paper-scissors’ interactions metaphorically describe cyclic dominance. These models illustrate intimate connections between population dynamics and evolutionary game theory, underscore the role of fluctuations to drive populations toward extinction, and demonstrate how space can support species diversity. Two-dimensional cyclic three-species May-Leonard models are characterized by the emergence of spiraling patterns whose properties are elucidated by a mapping onto a complex Ginzburg-Landau equation. Multiple-species extensions to general ‘food networks’ can be classified on the mean-field level, providing both fundamental understanding of ensuing cooperativity and profound insight into the rich spatio-temporal features and coarsening kinetics in the corresponding spatially extended systems. Novel space-time patterns emerge as a result of the formation of competing alliances; e.g. coarsening domains that each incorporate rock-paper-scissors competition games.
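
    Many of the cited results start from individual-based lattice simulations. A minimal sketch of such a stochastic spatial Lotka-Volterra model, with rates and lattice size invented for illustration, looks like this:

```python
import numpy as np

# Minimal stochastic lattice Lotka-Volterra: 0 = empty, 1 = prey, 2 = predator
rng = np.random.default_rng(2)
L, sweeps = 32, 100                   # lattice size and Monte Carlo sweeps
sigma, lam, mu = 0.5, 0.5, 0.2        # prey birth, predation, predator death
grid = rng.choice([0, 1, 2], size=(L, L), p=[0.5, 0.3, 0.2])
moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]

for _ in range(sweeps):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        di, dj = moves[rng.integers(4)]
        ni, nj = (i + di) % L, (j + dj) % L   # periodic boundaries
        if grid[i, j] == 1 and grid[ni, nj] == 0 and rng.random() < sigma:
            grid[ni, nj] = 1                  # prey reproduces into an empty neighbor
        elif grid[i, j] == 2:
            if grid[ni, nj] == 1 and rng.random() < lam:
                grid[ni, nj] = 2              # predation with predator offspring
            elif rng.random() < mu:
                grid[i, j] = 0                # predator death
print("prey:", (grid == 1).mean(), "predators:", (grid == 2).mean())
```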

  8. A spatial model to assess the effects of hydropower operations on Columbia River fall Chinook Salmon spawning habitat

    USGS Publications Warehouse

    Hatten, James R.; Tiffan, Kenneth F.; Anglin, Donald R.; Haeseker, Steven L.; Skalicky, Joseph J.; Schaller, Howard

    2009-01-01

    Priest Rapids Dam on the Columbia River produces large daily and hourly streamflow fluctuations throughout the Hanford Reach during the period when fall Chinook salmon Oncorhynchus tshawytscha are selecting spawning habitat, constructing redds, and actively engaged in spawning. Concern over the detrimental effects of these fluctuations prompted us to quantify the effects of variable flows on the amount and persistence of fall Chinook salmon spawning habitat in the Hanford Reach. Specifically, our goal was to develop a management tool capable of quantifying the effects of current and alternative hydrographs on predicted spawning habitat in a spatially explicit manner. Toward this goal, we modeled the water velocities and depths that fall Chinook salmon experienced during the 2004 spawning season, plus what they would probably have experienced under several alternative (i.e., synthetic) hydrographs, using both one- and two-dimensional hydrodynamic models. To estimate spawning habitat under existing or alternative hydrographs, we used cell-based modeling and logistic regression to construct and compare numerous spatial habitat models. We found that fall Chinook salmon were more likely to spawn at locations where velocities were persistently greater than 1 m/s and in areas where fluctuating water velocities were reduced. Simulations of alternative dam operations indicate that the quantity of spawning habitat is expected to increase as streamflow fluctuations are reduced during the spawning season. The spatial habitat models that we developed provide management agencies with a quantitative tool for predicting, in a spatially explicit manner, the effects of different flow regimes on fall Chinook salmon spawning habitat in the Hanford Reach. In addition to characterizing temporally varying habitat conditions, our research describes an analytical approach that could be applied in other highly variable aquatic systems.
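
    The cell-based habitat modeling step can be sketched with a standard logistic regression: each river cell carries hydraulic predictors and an observed spawning indicator, and the fitted model maps predicted suitability over the grid. The predictor names and data below are synthetic stand-ins for the study's calibrated models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_cells = 2000

velocity = rng.uniform(0.2, 2.5, n_cells)        # m/s
depth = rng.uniform(0.5, 6.0, n_cells)           # m
velocity_flux = rng.uniform(0.0, 1.5, n_cells)   # fluctuation magnitude, m/s

# Synthetic truth echoing the paper's finding: spawning is favored where
# velocity is persistently above 1 m/s and fluctuations are small
logit = 3.0 * (velocity - 1.0) - 2.5 * velocity_flux - 0.3 * (depth - 2.0)
spawn = rng.random(n_cells) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([velocity, depth, velocity_flux])
model = LogisticRegression().fit(X, spawn)

# Predicted spawning probability for one cell under an alternative hydrograph
print(model.predict_proba([[1.4, 3.0, 0.2]])[0, 1])
```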

  9. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  10. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
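
    At its core the HDT builds a partial order: procedure a is ranked above b only if a is at least as good on every variable and strictly better on at least one. A small sketch of that dominance test and the resulting maximal (non-dominated) set, using invented scores where lower is better:

```python
procedures = {                 # invented scores on three criteria, lower is better
    "GC-MS":       (2, 3, 1),
    "HPLC-FLD":    (1, 4, 2),
    "QuEChERS-GC": (1, 2, 1),
    "Soxhlet-GC":  (3, 5, 3),
}

def dominates(a, b):
    """True if a is at least as good everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

order = [(p, q) for p in procedures for q in procedures
         if dominates(procedures[p], procedures[q])]
maximal = [p for p in procedures
           if not any(dominates(procedures[q], procedures[p]) for q in procedures)]

print("dominance pairs:", order)
print("non-dominated procedures:", maximal)
```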

  11. Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.

    2016-12-01

    Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing the associated uncertainties. Spatial analyses, big data and otherwise, of subsurface natural and engineered systems are based on variable-resolution, discontinuous, and often point-driven data used to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom "Big Data" geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for the implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.
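
    The data-reduction idea, collapsing dense point data into attributed grid cells, maps naturally onto the MapReduce pattern. The pure-Python sketch below emulates the two phases locally (map points to cell keys, reduce to per-cell statistics); the cell size and attributes are invented, and a real deployment would express the same logic as Hadoop jobs.

```python
import random
from collections import defaultdict
from statistics import mean

random.seed(4)
points = [(random.uniform(0, 10), random.uniform(0, 10), random.gauss(50, 10))
          for _ in range(10_000)]          # (x, y, measured value)

CELL = 1.0                                 # hypothetical grid resolution

# Map phase: emit (cell key, value) pairs
mapped = ((int(x // CELL), int(y // CELL), v) for x, y, v in points)

# Shuffle phase: group values by cell key
groups = defaultdict(list)
for cx, cy, v in mapped:
    groups[(cx, cy)].append(v)

# Reduce phase: attributed topology, one record per non-empty cell
cells = {key: {"n": len(vals), "mean": mean(vals),
               "spread": max(vals) - min(vals)} for key, vals in groups.items()}
print(len(cells), "cells;", next(iter(cells.items())))
```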

  12. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of the text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  13. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  14. VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration

    NASA Technical Reports Server (NTRS)

    Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David

    2017-01-01

    The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.

  15. Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique

    NASA Technical Reports Server (NTRS)

    Maise, G.; Rossi, M. J.

    1974-01-01

    A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models in the data reduction of the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one melt time is also possible on leading edges. A novel iterative numerical scheme was used, with discretized spatial coordinates but analytic integration in time, to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided that tells the test engineer when the various corrections are large enough that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.
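
    For context, the classical reduction that CAPE corrects treats the wall as a one-dimensional semi-infinite slab: the observed melt time t_m ties the known paint melt temperature to the heat-transfer coefficient h through theta = (T_pc - T_i)/(T_aw - T_i) = 1 - exp(beta^2)*erfc(beta), with beta = h*sqrt(alpha*t_m)/k. A sketch of inverting that relation numerically, with invented material properties:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import erfcx   # erfcx(b) = exp(b**2) * erfc(b), numerically stable

def h_from_melt_time(t_melt, T_pc, T_i, T_aw, k, alpha):
    """Classical 1-D semi-infinite-slab reduction of phase-change paint data."""
    theta = (T_pc - T_i) / (T_aw - T_i)          # dimensionless melt temperature
    beta = brentq(lambda b: 1.0 - erfcx(b) - theta, 1e-9, 50.0)
    return beta * k / np.sqrt(alpha * t_melt)    # heat-transfer coefficient, W/(m^2 K)

# Invented example values: insulating model material, 12 s to reach melt
print(h_from_melt_time(t_melt=12.0, T_pc=390.0, T_i=300.0, T_aw=600.0,
                       k=0.2, alpha=1.0e-7))
```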

  16. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  17. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  18. Spectral Properties and Dynamics of Gold Nanorods Revealed by EMCCD Based Spectral-Phasor Method

    PubMed Central

    Chen, Hongtao; Digman, Michelle A.

    2015-01-01

    Gold nanorods (NRs) with tunable plasmon-resonant absorption in the near-infrared region have considerable advantages over organic fluorophores as imaging agents. However, the luminescence spectral properties of NRs have not been fully explored at the single-particle level in bulk due to the lack of proper analytic tools. Here we present a global spectral phasor analysis method which allows investigation of NR spectra at the single-particle level, with their statistical behavior and spatial information, during imaging. The wide phasor distribution obtained by the spectral phasor analysis indicates that the spectra of NRs differ from particle to particle. NRs with different spectra can be identified graphically in corresponding spatial images with high spectral resolution. Furthermore, the spectral behavior of NRs under different imaging conditions, e.g., different excitation powers and wavelengths, was carefully examined by our laser-scanning multiphoton microscope with spectral imaging capability. Our results prove that the spectral phasor method is an easy and efficient tool in hyperspectral imaging analysis for unraveling subtle changes of the emission spectrum. Moreover, we applied this method to study the spectral dynamics of NRs during direct optical trapping and by optothermal trapping. Interestingly, spectral shifts were observed in both trapping phenomena. PMID:25684346
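
    The spectral phasor transform itself is compact: each pixel's emission spectrum is reduced to its first Fourier coefficients, giving coordinates (G, S) on the phasor plot where spectral shifts appear as displacements. A sketch on a synthetic image cube, with shapes and spectra invented:

```python
import numpy as np

def spectral_phasor(cube, harmonic=1):
    """Map a (rows, cols, channels) spectral image cube to phasor coordinates."""
    n = cube.shape[-1]
    phase = 2.0 * np.pi * harmonic * np.arange(n) / n
    total = cube.sum(axis=-1)
    G = (cube * np.cos(phase)).sum(axis=-1) / total
    S = (cube * np.sin(phase)).sum(axis=-1) / total
    return G, S

# Synthetic cube: Gaussian emission whose center wavelength varies across the image
rng = np.random.default_rng(5)
channels = np.arange(32)
centers = rng.uniform(10, 20, size=(64, 64))        # per-pixel peak position
cube = np.exp(-0.5 * ((channels - centers[..., None]) / 3.0) ** 2)

G, S = spectral_phasor(cube)
print("phasor spread (std of G, S):", G.std().round(3), S.std().round(3))
```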

  19. Generalized reproduction numbers and the prediction of patterns in waterborne disease.

    PubMed

    Gatto, Marino; Mari, Lorenzo; Bertuzzo, Enrico; Casagrandi, Renato; Righetto, Lorenzo; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2012-11-27

    Understanding, predicting, and controlling outbreaks of waterborne diseases are crucial goals of public health policies, but pose challenging problems because infection patterns are influenced by spatial structure and temporal asynchrony. Although explicit spatial modeling is made possible by widespread data mapping of hydrology, transportation infrastructure, population distribution, and sanitation, the precise condition under which a waterborne disease epidemic can start in a spatially explicit setting is still lacking. Here we show that the requirement that all the local reproduction numbers R0 be larger than unity is neither necessary nor sufficient for outbreaks to occur when local settlements are connected by networks of primary and secondary infection mechanisms. To determine onset conditions, we derive general analytical expressions for a reproduction matrix G0, explicitly accounting for spatial distributions of human settlements and pathogen transmission via hydrological and human mobility networks. At disease onset, a generalized reproduction number Λ0 (the dominant eigenvalue of G0) must be larger than unity. We also show that geographical outbreak patterns in complex environments are linked to the dominant eigenvector and to spectral properties of G0. Tests against data and computations for the 2010 Haiti and 2000 KwaZulu-Natal cholera outbreaks, as well as against computations for metapopulation networks, demonstrate that eigenvectors of G0 provide a synthetic and effective tool for predicting the disease course in space and time. Networked connectivity models, describing the interplay between hydrology, epidemiology, and social behavior sustaining human mobility, thus prove to be key tools for emergency management of waterborne infections.

  20. A two-dimensional analytical model of vapor intrusion involving vertical heterogeneity.

    PubMed

    Yao, Yijun; Verginelli, Iason; Suuberg, Eric M

    2017-05-01

    In this work, we present an analytical chlorinated vapor intrusion (CVI) model that can estimate source-to-indoor air concentration attenuation by simulating two-dimensional (2-D) vapor concentration profile in vertically heterogeneous soils overlying a homogenous vapor source. The analytical solution describing the 2-D soil gas transport was obtained by applying a modified Schwarz-Christoffel mapping method. A partial field validation showed that the developed model provides results (especially in terms of indoor emission rates) in line with the measured data from a case involving a building overlying a layered soil. In further testing, it was found that the new analytical model can very closely replicate the results of three-dimensional (3-D) numerical models at steady state in scenarios involving layered soils overlying homogenous groundwater sources. By contrast, by adopting a two-layer approach (capillary fringe and vadose zone) as employed in the EPA implementation of the Johnson and Ettinger model, the spatially and temporally averaged indoor concentrations in the case of groundwater sources can be higher than the ones estimated by the numerical model up to two orders of magnitude. In short, the model proposed in this work can represent an easy-to-use tool that can simulate the subsurface soil gas concentration in layered soils overlying a homogenous vapor source while keeping the simplicity of an analytical approach that requires much less computational effort.

  1. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  2. Mirion: a software package for automatic processing of mass spectrometric images.

    PubMed

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the life sciences. In recent years, the development of new instruments employing ion sources that are tailored for spatial scanning has allowed the acquisition of large data sets. Subsequent data processing, however, is still a bottleneck in the analytical process, as manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds has turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and more easily than plain numbers. Image generation is thus a time-consuming and complex, yet very effective, task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
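
    Because Mirion consumes the open imzML format, the same data can be read with the community pyimzml parser; the sketch below extracts a single ion image from an imzML file (the file name and m/z value are placeholders). This illustrates the format, not Mirion's own code.

```python
import numpy as np
from pyimzml.ImzMLParser import ImzMLParser, getionimage

parser = ImzMLParser("example.imzML")   # placeholder file name

# Spatial distribution image of one analyte: intensity within +/- tol of the m/z
image = getionimage(parser, mz_value=798.54, tol=0.25)   # placeholder m/z
print(image.shape, np.nanmax(image))

# Or iterate spectra pixel by pixel
for idx, (x, y, z) in enumerate(parser.coordinates[:3]):
    mzs, intensities = parser.getspectrum(idx)
    print(x, y, mzs.size)
```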

  3. Prediction of down-gradient impacts of DNAPL source depletion using tracer techniques: Laboratory and modeling validation

    NASA Astrophysics Data System (ADS)

    Jawitz, J. W.; Basu, N.; Chen, X.

    2007-05-01

    Interwell application of coupled nonreactive and reactive tracers through aquifer contaminant source zones enables quantitative characterization of aquifer heterogeneity and contaminant architecture. Parameters obtained from tracer tests are presented here in a Lagrangian framework that can be used to predict the dissolution of nonaqueous phase liquid (NAPL) contaminants. Nonreactive tracers are commonly used to provide information about travel time distributions in hydrologic systems. Reactive tracers have more recently been introduced as a tool to quantify the amount of NAPL contaminant present within the tracer swept volume. Our group has extended reactive tracer techniques to also characterize NAPL spatial distribution heterogeneity. By conceptualizing the flow field through an aquifer as a collection of streamtubes, the aquifer hydrodynamic heterogeneities may be characterized by a nonreactive tracer travel time distribution, and NAPL spatial distribution heterogeneity may be similarly described using reactive travel time distributions. The combined statistics of these distributions are used to derive a simple analytical solution for contaminant dissolution. This analytical solution, and the tracer techniques used for its parameterization, were validated both numerically and experimentally. Illustrative applications are presented from numerical simulations using the multiphase flow and transport simulator UTCHEM, and laboratory experiments of surfactant-enhanced NAPL remediation in two-dimensional flow chambers.

  4. Map LineUps: Effects of spatial structure on graphical inference.

    PubMed

    Beecham, Roger; Dykes, Jason; Meulemans, Wouter; Slingsby, Aidan; Turkay, Cagatay; Wood, Jo

    2017-01-01

    Fundamental to the effective use of visualization as an analytic and descriptive tool is the assurance that presenting data visually provides the capability of making inferences from what we see. This paper explores two related approaches to quantifying the confidence we may have in making visual inferences from mapped geospatial data. We adapt Wickham et al.'s 'Visual Line-up' method as a direct analogy with Null Hypothesis Significance Testing (NHST) and propose a new approach for generating more credible spatial null hypotheses. Rather than using as a spatial null hypothesis the unrealistic assumption of complete spatial randomness, we propose spatially autocorrelated simulations as alternative nulls. We conduct a set of crowdsourced experiments (n=361) to determine the just noticeable difference (JND) between pairs of choropleth maps of geographic units controlling for spatial autocorrelation (Moran's I statistic) and geometric configuration (variance in spatial unit area). Results indicate that people's abilities to perceive differences in spatial autocorrelation vary with baseline autocorrelation structure and the geometric configuration of geographic units. These results allow us, for the first time, to construct a visual equivalent of statistical power for geospatial data. Our JND results add to those provided in recent years by Klippel et al. (2011), Harrison et al. (2014) and Kay & Heer (2015) for correlation visualization. Importantly, they provide an empirical basis for an improved construction of visual line-ups for maps and the development of theory to inform geospatial tests of graphical inference.
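
    The autocorrelation control in these experiments is the global Moran's I statistic, which is simple to compute directly. A sketch for values on spatial units with a binary rook-contiguity weight matrix (the grid and weights are invented):

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I: n/S0 * (z' W z) / (z' z) for weight matrix W."""
    z = values - values.mean()
    return len(values) / W.sum() * (z @ W @ z) / (z @ z)

# Rook-contiguity weights for a small grid of square units
side = 10
n = side * side
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        for di, dj in ((0, 1), (1, 0)):
            ni, nj = i + di, j + dj
            if ni < side and nj < side:
                a, b = i * side + j, ni * side + nj
                W[a, b] = W[b, a] = 1

rng = np.random.default_rng(6)
smooth = rng.normal(size=(side, side))
smooth = (smooth + np.roll(smooth, 1, 0) + np.roll(smooth, 1, 1)) / 3  # crude smoothing
print("Moran's I, smoothed field:", round(morans_i(smooth.ravel(), W), 3))
print("Moran's I, random field:  ", round(morans_i(rng.normal(size=n), W), 3))
```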

  5. Raman Imaging in Cell Membranes, Lipid-Rich Organelles, and Lipid Bilayers.

    PubMed

    Syed, Aleem; Smith, Emily A

    2017-06-12

    Raman-based optical imaging is a promising analytical tool for noninvasive, label-free chemical imaging of lipid bilayers and cellular membranes. Imaging using spontaneous Raman scattering suffers from a low intensity that hinders its use in some cellular applications. However, developments in coherent Raman imaging, surface-enhanced Raman imaging, and tip-enhanced Raman imaging have enabled video-rate imaging, excellent detection limits, and nanometer spatial resolution, respectively. After a brief introduction to these commonly used Raman imaging techniques for cell membrane studies, this review discusses selected applications of these modalities for chemical imaging of membrane proteins and lipids. Finally, recent developments in chemical tags for Raman imaging and their applications in the analysis of selected cell membrane components are summarized. Ongoing developments toward improving the temporal and spatial resolution of Raman imaging and small-molecule tags with strong Raman scattering cross sections continue to expand the utility of Raman imaging for diverse cell membrane studies.

  6. Population dynamics in an intermittent refuge

    NASA Astrophysics Data System (ADS)

    Colombo, E. H.; Anteneodo, C.

    2016-10-01

    Population dynamics is constrained by the environment, which needs to obey certain conditions to support population growth. We consider a standard model for the evolution of a single-species population density, which includes reproduction, competition for resources, and spatial spreading, while subject to an external harmful effect. The habitat is spatially heterogeneous, with a refuge where the population can be protected. Temporal variability is introduced by the intermittent character of the refuge. This scenario can apply to a wide range of situations, from a laboratory setting where bacteria can be protected by a blinking mask from ultraviolet radiation, to large-scale ecosystems, like a marine reserve where there can be seasonal fishing prohibitions. Using analytical and numerical tools, we investigate the asymptotic behavior of the total population as a function of the size and characteristic time scales of the refuge. We obtain expressions for the minimal size required for population survival, in the slow and fast time scale limits.
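
    A numerical version of this setup is a logistic reaction-diffusion equation with a mortality term switched off inside the refuge and a refuge that blinks in time. The explicit-Euler sketch below uses invented parameter values to probe whether the population persists:

```python
import numpy as np

# dU/dt = D U'' + r U (1 - U) - gamma(x, t) U, with gamma = 0 inside the refuge
D, r, gamma = 0.1, 1.0, 1.5          # diffusion, growth, harm when unprotected
dx, dt = 0.25, 0.05                  # dt < dx^2 / (2 D) for stability
nx, steps = 200, 40000
period, duty = 400, 0.5              # refuge blinks: on for duty * period steps

x = np.arange(nx) * dx
refuge = np.abs(x - x.mean()) < 5.0  # refuge of half-width 5 at the domain center
U = np.full(nx, 0.1)

for step in range(steps):
    protected = (step % period) < duty * period
    harm = np.where(refuge & protected, 0.0, gamma)
    lap = np.zeros(nx)
    lap[1:-1] = (U[2:] - 2 * U[1:-1] + U[:-2]) / dx**2
    U += dt * (D * lap + r * U * (1 - U) - harm * U)
    U = np.clip(U, 0.0, None)

print("total population:", U.sum() * dx)   # > 0 means persistence at these settings
```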

  7. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can greatly enhance the capabilities of a GIS, particularly in handling the very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, and new data representations. The system, which will be developed further and applied, is a prototype of the next generation of GISs, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.

  8. Analytical method of waste allocation in waste management systems: Concept, method and case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergeron, Francis C., E-mail: francis.b.c@videotron.ca

    Waste is no longer simply a rejected item to be disposed of but increasingly a secondary resource to exploit, which influences how waste is allocated among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process,” defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP and from the conceptual framework developed for it. AMWAP is applied in an illustrative case study on the household WM system of Geneva (Switzerland), which demonstrates that the method provides in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.

  9. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.

    2015-07-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  10. Nanoscale simultaneous chemical and mechanical imaging via peak force infrared microscopy

    PubMed Central

    Wang, Le; Wang, Haomin; Wagner, Martin; Yan, Yong; Jakob, Devon S.; Xu, Xiaoji G.

    2017-01-01

    Nondestructive chemical and mechanical measurements of materials with ~10-nm spatial resolution together with topography provide rich information on the compositions and organizations of heterogeneous materials and nanoscale objects. However, multimodal nanoscale correlations are difficult to achieve because of the limitation on spatial resolution of optical microscopy and constraints from instrumental complexities. We report a novel noninvasive spectroscopic scanning probe microscopy method—peak force infrared (PFIR) microscopy—that allows chemical imaging, collection of broadband infrared spectra, and mechanical mapping at a spatial resolution of 10 nm. In our technique, chemical absorption information is directly encoded in the withdraw curve of the peak force tapping cycle after illumination with synchronized infrared laser pulses in a simple apparatus. Nanoscale phase separation in block copolymers and inhomogeneity in CH3NH3PbBr3 perovskite crystals are studied with correlative infrared/mechanical nanoimaging. Furthermore, we show that the PFIR method is sensitive to the presence of surface phonon polaritons in boron nitride nanotubes. PFIR microscopy will provide a powerful analytical tool for explorations at the nanoscale across wide disciplines. PMID:28691096

  11. Detection of particle flow patterns in tumor by directional spatial frequency analysis

    NASA Astrophysics Data System (ADS)

    Russell, Stewart; Camara, Hawa; Shi, Lingyan; Hoopes, P. Jack; Kaufman, Peter; Pogue, Brian; Alfano, Robert

    2016-04-01

    Drug delivery to tumors is well known to be chaotic and limited, partly because of dysfunctional vasculature, but also because of microscopic regional variations in composition. Modeling the transport of nanoparticle therapeutics must therefore include not only a description of vascular permeability, but also of the movement of the drug as suspended in tumor interstitial fluid (TIF) once it leaves the blood vessel. Understanding of this area is limited because we currently lack the tools and analytical methods to characterize it. We have previously shown that directional anisotropy of drug delivery can be detected using Directional Fourier Spatial Frequency (DFSF) analysis. Here we extend this approach to generate flow-line maps of nanoparticle transport in TIF relative to tumor ultrastructure, and show that features of tumor spatial heterogeneity can be identified that are directly related to local flow isometries. The identification of these regions of limited flow may be used as a metric for determining response to therapy, or for the optimization of adjuvant therapies such as radiation pre-treatment or enzymatic degradation.

  12. Physics Mining of Multi-Source Data Sets

    NASA Technical Reports Server (NTRS)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures of environmental parameters than ever before by fusing synoptic imagery and time-series measurements. The techniques are general, relevant to observational data including raster, vector, and scalar, and can be applied in all Earth- and environmental-science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms, packaged into MineTool, to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics mining of data. The capabilities of MineTool have been extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
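
    The contrast with black-box models can be illustrated with a toy example of a fit that returns an explicit parametric equation. The basis terms and data below are invented for illustration; MineTool's actual model-building procedure is more elaborate.

        import numpy as np

        rng = np.random.default_rng(4)
        x1, x2 = rng.uniform(-1, 1, (2, 500))
        y = 2.0 + 0.5 * x1 - 1.5 * x2 + 3.0 * x1 * x2 + rng.normal(0, 0.05, 500)

        # least-squares fit over an explicit basis; the result is a readable equation
        basis = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
        coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
        terms = ["1", "x1", "x2", "x1*x2"]
        print("y =", " + ".join(f"{c:.2f}*{t}" for c, t in zip(coef, terms)))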

  13. Making Space for Place: Mapping Tools and Practices to Teach for Spatial Justice

    ERIC Educational Resources Information Center

    Rubel, Laurie H.; Hall-Wieckert, Maren; Lim, Vivian Y.

    2017-01-01

    This article presents a set of spatial tools for classroom learning about spatial justice. As part of a larger team, we designed a curriculum that engaged 10 learners with 3 spatial tools: (a) an oversized floor map, (b) interactive geographic information systems (GIS) maps, and (c) participatory mapping. We analyze how these tools supported…

  14. Evaluating the compatibility of multi-functional and intensive urban land uses

    NASA Astrophysics Data System (ADS)

    Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.

    2007-12-01

    This research aims to develop a model for assessing land use compatibility in densely built-up urban areas. A new model was developed by combining a suite of existing methods and tools: a geographical information system, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytical hierarchy process, and the ordered weighted average method. The developed model can calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area located in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).
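
    As a small illustration of one ingredient, the ordered weighted average aggregates compatibility scores by rank rather than by criterion, which lets the analyst tune the result between worst-case and best-case attitudes. The scores and weights below are hypothetical, not from the Tehran case study.

        import numpy as np

        def owa(scores, weights):
            """Apply weights to the scores sorted in descending order; weights sum to 1."""
            return float(np.sort(scores)[::-1] @ np.asarray(weights))

        compat = np.array([0.9, 0.7, 0.4, 0.8])          # compatibility with 4 neighbours
        print(owa(compat, [0.1, 0.2, 0.3, 0.4]))         # pessimistic: low scores dominate
        print(owa(compat, [0.25, 0.25, 0.25, 0.25]))     # neutral: plain average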

  15. A spectral Poisson solver for kinetic plasma simulation

    NASA Astrophysics Data System (ADS)

    Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf

    2011-10-01

    Plasma resonance spectroscopy is a well-established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized - geometrically simplified - version, it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development that is capable of simulating the dynamics of the plasma surrounding the MRP in the electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without introducing a spatial discretization.
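
    The spectral idea can be sketched for the exterior region, where a truncated spherical-harmonic expansion gives the potential of a point-charge ensemble in closed form. This is a simplified stand-in (Gaussian units, no dielectric shield), and the truncation order LMAX is an assumption; note that SciPy's sph_harm convention uses theta for the azimuth and phi for the polar angle.

        import numpy as np
        from scipy.special import sph_harm

        rng = np.random.default_rng(0)
        N, LMAX = 500, 8
        q = rng.choice([-1.0, 1.0], size=N)          # point charges
        r = rng.uniform(0.1, 1.0, N)                 # radii inside the unit sphere
        polar = np.arccos(rng.uniform(-1, 1, N))
        azim = rng.uniform(0, 2 * np.pi, N)

        def potential(r_obs, polar_obs, azim_obs):
            """Exterior multipole expansion, valid for r_obs > max particle radius."""
            phi = 0.0 + 0.0j
            for l in range(LMAX + 1):
                for m in range(-l, l + 1):
                    # spectral coefficient q_lm of the charge density
                    qlm = np.sum(q * r**l * np.conj(sph_harm(m, l, azim, polar)))
                    phi += (4 * np.pi / (2 * l + 1)) * qlm \
                           * sph_harm(m, l, azim_obs, polar_obs) / r_obs**(l + 1)
            return phi.real

        print(potential(2.0, 0.3, 1.0))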

  16. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools, and the public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based 501(c)(3) global hub for water data analytics and technology innovation, whose vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Roadmap: a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; leveraging the Western States Water Council Water Data Exchange database; and development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. Tools: education (information on water issues and risks at the local, state, national, and global scales); visualizations (data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan); and predictive analytics (accessing publicly available water databases and using machine learning to develop water availability forecasting tools and time-lapse images to support city and urban planning).

  17. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  18. Remote Sensing in Geography in the New Millennium: Prospects, Challenges, and Opportunities

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Jensen, John R.; Morain, Stanley A.; Walsh, Stephen J.; Ridd, Merrill K.

    1999-01-01

    Remote sensing science contributes greatly to our understanding of the Earth's ecosystems and cultural landscapes. Almost all the natural and social sciences, including geography, rely heavily on remote sensing to provide quantitative and indispensable spatial information. Many geographers have made significant contributions to remote sensing science since the 1970s, including the specification of advanced remote sensing systems, improvements in analog and digital image analysis, biophysical modeling, and terrain analysis. In fact, the Remote Sensing Specialty Group (RSSG) is one of the largest specialty groups within the AAG, with over 500 members. Remote sensing, in concert with geographic information systems, offers much value to geography as both an incisive spatial-analytical tool and a scholarly pursuit that adds to the body of geographic knowledge as a whole. The "power" of remote sensing as a research endeavor in geography lies in its capabilities for obtaining synoptic, near-real-time data at many spatial and temporal scales, and in many regions of the electromagnetic spectrum - from microwave, to RADAR, to visible, and reflective and thermal infrared. In turn, these data present a vast compendium of information for assessing Earth attributes and characteristics that are at the very core of geography. Here we revisit how remote sensing has become a fundamental and important tool for geographical research, and how, with the advent of new and improved sensing systems to be launched in the near future, remote sensing will further advance geographical analysis in the approaching New Millennium.

  19. The Benefits and Complexities of Operating Geographic Information Systems (GIS) in a High Performance Computing (HPC) Environment

    NASA Astrophysics Data System (ADS)

    Shute, J.; Carriere, L.; Duffy, D.; Hoy, E.; Peters, J.; Shen, Y.; Kirschbaum, D.

    2017-12-01

    The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center is building and maintaining an Enterprise GIS capability for its stakeholders, to include NASA scientists, industry partners, and the public. This platform is powered by three GIS subsystems operating in a highly-available, virtualized environment: 1) the Spatial Analytics Platform is the primary NCCS GIS and provides users discoverability of the vast DigitalGlobe/NGA raster assets within the NCCS environment; 2) the Disaster Mapping Platform provides mapping and analytics services to NASA's Disaster Response Group; and 3) the internal (Advanced Data Analytics Platform/ADAPT) enterprise GIS provides users with the full suite of Esri and open source GIS software applications and services. All systems benefit from NCCS's cutting edge infrastructure, to include an InfiniBand network for high speed data transfers; a mixed/heterogeneous environment featuring seamless sharing of information between Linux and Windows subsystems; and in-depth system monitoring and warning systems. Due to its co-location with the NCCS Discover High Performance Computing (HPC) environment and the Advanced Data Analytics Platform (ADAPT), the GIS platform has direct access to several large NCCS datasets including DigitalGlobe/NGA, Landsat, MERRA, and MERRA2. Additionally, the NCCS ArcGIS Desktop Windows virtual machines utilize existing NetCDF and OPeNDAP assets for visualization, modelling, and analysis - thus eliminating the need for data duplication. With the advent of this platform, Earth scientists have full access to vast data repositories and the industry-leading tools required for successful management and analysis of these multi-petabyte, global datasets. The full system architecture and integration with scientific datasets will be presented. Additionally, key applications and scientific analyses will be explained, to include the NASA Global Landslide Catalog (GLC) Reporter crowdsourcing application, the NASA GLC Viewer discovery and analysis tool, the DigitalGlobe/NGA Data Discovery Tool, the NASA Disaster Response Group Mapping Platform (https://maps.disasters.nasa.gov), and support for NASA's Arctic - Boreal Vulnerability Experiment (ABoVE).

  20. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron, with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photoelectron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger microscopy (SAM) combines surface sensitivity, lateral spatial resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near- and below-surface regions down to the chemical state of an atom.

  1. A three-dimensional analytical model to simulate groundwater flow during operation of recirculating wells

    NASA Astrophysics Data System (ADS)

    Huang, Junqi; Goltz, Mark N.

    2005-11-01

    The potential for using pairs of so-called horizontal flow treatment wells (HFTWs) to effect in situ capture and treatment of contaminated groundwater has recently been demonstrated. To apply this new technology, design engineers need to be able to simulate the relatively complex groundwater flow patterns that result from HFTW operation. In this work, a three-dimensional analytical solution for steady flow in a homogeneous, anisotropic, contaminated aquifer is developed to efficiently calculate the interflow of water circulating between a pair of HFTWs and map the spatial extent of contaminated groundwater flowing from upgradient that is captured. The solution is constructed by superposing the solutions for the flow fields resulting from operation of partially penetrating wells. The solution is used to investigate the flow resulting from operation of an HFTW well pair and to quantify how aquifer anisotropy, well placement, and pumping rate impact capture zone width and interflow. The analytical modeling method presented here provides a fast and accurate technique for representing the flow field resulting from operation of HFTW systems, and represents a tool that can be useful in designing in situ groundwater contamination treatment systems.
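
    The superposition principle behind the solution can be illustrated in two dimensions with a depth-averaged sketch: a uniform regional flow plus one logarithmic source/sink term per well of an injection/extraction pair. The 3D, partially penetrating solution in the paper is far richer; all names and values here are illustrative.

        import numpy as np

        U = 1e-5                                   # regional flow in +x (m/s)
        Q = 1e-3                                   # well strength per unit depth (m^2/s)
        wells = [(-5.0, 0.0, +Q), (5.0, 0.0, -Q)]  # (x, y, +injection / -extraction)

        def complex_potential(z):
            """Superpose uniform flow and one logarithmic term per well."""
            omega = U * z
            for xw, yw, Qw in wells:
                omega += (Qw / (2 * np.pi)) * np.log(z - (xw + 1j * yw))
            return omega

        x, y = np.meshgrid(np.linspace(-20, 20, 200), np.linspace(-15, 15, 150))
        omega = complex_potential(x + 1j * y)
        phi, psi = omega.real, omega.imag   # velocity potential and stream function
        # contours of psi delineate the interflow and capture zones of the well pair
        print(psi.min(), psi.max())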

  2. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past two decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied epidemiologic research. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Supratransmission in a metastable modular metastructure for tunable non-reciprocal wave transmission

    NASA Astrophysics Data System (ADS)

    Wu, Zhen; Wang, K. W.

    2018-03-01

    In this research, we numerically and analytically investigate the nonlinear energy transmission phenomenon in a metastable modular metastructure. Numerical studies on a 1D metastable chain provide clear evidence that when the driving frequency is within the stopband of the periodic structure, there exists a threshold for the driving amplitude above which a sudden increase in the energy transmission can be observed. This onset of transmission is due to nonlinear instability and is known as supratransmission. We discover that, due to the spatial asymmetry of strategically configured constituents, such transmission thresholds differ considerably when the structure is excited from different ends, and this discrepancy creates a region of non-reciprocal energy transmission. We demonstrate that when the loss of stability is due to a saddle-node bifurcation, the transmission threshold can be predicted analytically using a localized nonlinear-linear system model and analyzed by combining harmonic balancing and transfer matrix methods. These investigations elucidate the rich and complex dynamics achievable by nonlinearity and metastabilities, and provide synthesis tools for tunable bandgaps and non-reciprocal wave transmission.

  4. Augmenting Austrian flood management practices through geospatial predictive analytics: a study in Carinthia

    NASA Astrophysics Data System (ADS)

    Ward, S. M.; Paulus, G.

    2013-06-01

    The Danube River basin has long been the location of significant flooding problems across central Europe. The last decade has seen a sharp increase in the frequency, duration, and intensity of these flood events, unveiling a dire need for enhanced flood management policy and tools in the region. Located in the southern portion of Austria, the state of Carinthia has experienced a significant volume of intense flood impacts over the last decade. Although the Austrian government has acknowledged these issues, its remedial actions have been primarily structural to date. Continued focus on controlling the natural environment through infrastructure, while disregarding the need to consider alternative forms of assessing flood exposure, will only act as a provisional solution to this inescapable risk. In an attempt to remedy this flaw, this paper highlights the application of geospatial predictive analytics and a spatial recovery index as proxies for community resilience, as well as the cultural challenges associated with the application of foreign models within an Austrian environment.

  5. Atmospheric pressure MALDI for the noninvasive characterization of carbonaceous ink from Renaissance documents.

    PubMed

    Grasso, Giuseppe; Calcagno, Marzia; Rapisarda, Alessandro; D'Agata, Roberta; Spoto, Giuseppe

    2017-06-01

    The analytical methods that are usually applied to determine the compositions of inks from ancient manuscripts typically focus on inorganic components, as in the case of iron gall ink. In this work, we describe for the first time the use of atmospheric pressure/matrix-assisted laser desorption ionization-mass spectrometry (AP/MALDI-MS) as a spatially resolved analytical technique for studying the organic carbonaceous components of inks used in handwritten parts of ancient books. Large polycyclic aromatic hydrocarbons (L-PAH) were identified in situ in the ink of XVII century handwritten documents. We show that MALDI-MS can be applied as a suitable microdestructive diagnostic tool for analyzing samples in air at atmospheric pressure, thus simplifying investigations of the organic components of artistic and archaeological objects. The interpretation of the experimental MS results was supported by independent Raman spectroscopic investigations. Graphical abstract: Atmospheric pressure/MALDI mass spectrometry detects in situ polycyclic aromatic hydrocarbons in the carbonaceous ink of XVII century manuscripts.

  6. Fluorescence Lifetime Imaging and Spectroscopy as Tools for Nondestructive Analysis of Works of Art

    NASA Astrophysics Data System (ADS)

    Comelli, Daniela; D'Andrea, Cosimo; Valentini, Gianluca; Cubeddu, Rinaldo; Colombo, Chiara; Toniolo, Lucia

    2004-04-01

    A system for advanced fluorescence investigation of works of art has been assembled and integrated in a characterization procedure that allows one to localize and identify organic compounds that are present in artworks. At the beginning of the investigation, fluorescence lifetime imaging and spectroscopy address a selective microsampling of the artwork. Then analytical measurements of microsamples identify the chemical composition of the materials under investigation. Finally, on the basis of fluorescence lifetime and amplitude maps, analytical data are extended to the whole artwork. In such a way, information on the spatial distribution of organic materials can be inferred. These concepts have been successfully applied in an extensive campaign for analysis of Renaissance fresco paintings in Castiglione Olona, Italy. Residue of various types of glue and stucco left from a restoration carried out in the early 1970s was localized and classified. Insight into the technique used by the painter to make gilded reliefs was also obtained.

  7. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools, developed with Javascript APIs, that enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploring earthquake and volcano data sets, a subduction and elevation profile tool that facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform independent and can be implemented on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  8. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities and provided subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end-user satisfaction.
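
    As a pointer to what a disproportionality signal score looks like, the sketch below computes the proportional reporting ratio (PRR) from a 2x2 table of citation counts. The counts are invented, and PRR is only one of several scores such a tool might implement.

        def prr(a, b, c, d):
            """
            a: citations with drug AND event      b: drug, other events
            c: other drugs, event                 d: other drugs, other events
            """
            return (a / (a + b)) / (c / (c + d))

        a, b, c, d = 40, 960, 200, 98800
        print(f"PRR = {prr(a, b, c, d):.2f}")   # values well above ~2 are often flagged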

  9. Retooling CalEnviroScreen: Cumulative Pollution Burden and Race-Based Environmental Health Vulnerabilities in California.

    PubMed

    Liévanos, Raoul S

    2018-04-16

    The California Community Environmental Health Screening Tool (CalEnviroScreen) advances research and policy pertaining to environmental health vulnerability. However, CalEnviroScreen departs from its historical foundations and comparable screening tools by no longer considering racial status as an indicator of environmental health vulnerability and predictor of cumulative pollution burden. This study used conceptual frameworks and analytical techniques from environmental health and inequality literature to address the limitations of CalEnviroScreen, especially its inattention to race-based environmental health vulnerabilities. It developed an adjusted measure of cumulative pollution burden from the CalEnviroScreen 2.0 data that facilitates multivariate analyses of the effect of neighborhood racial composition on cumulative pollution burden, net of other indicators of population vulnerability, traffic density, industrial zoning, and local and regional clustering of pollution burden. Principal component analyses produced three new measures of population vulnerability, including Latina/o cumulative disadvantage that represents the spatial concentration of Latinas/os, economic disadvantage, limited English-speaking ability, and health vulnerability. Spatial error regression analyses demonstrated that concentrations of Latinas/os, followed by Latina/o cumulative disadvantage, are the strongest demographic determinants of adjusted cumulative pollution burden. Findings have implications for research and policy pertaining to cumulative impacts and race-based environmental health vulnerabilities within and beyond California.
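
    The principal component step can be sketched as follows: standardize tract-level indicators and read off component scores as composite vulnerability measures. The indicator matrix here is random stand-in data, not CalEnviroScreen 2.0.

        import numpy as np

        rng = np.random.default_rng(3)
        X = rng.normal(size=(800, 5))       # tracts x indicators (invented)
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)

        # PCA via SVD of the standardized indicator matrix
        U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
        explained = s**2 / np.sum(s**2)
        scores = Xs @ Vt[:3].T              # three component scores per tract
        print("variance explained:", np.round(explained[:3], 2))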

  11. Interactive Management and Updating of Spatial Data Bases

    NASA Technical Reports Server (NTRS)

    French, P.; Taylor, M.

    1982-01-01

    The decision-making process, whether for power plant siting, load forecasting, or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by implementing techniques that permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer-aided planning (CAP) programs and the selection of a predominant data structure can improve the decision-making process is discussed.

  12. Brownian systems with spatially inhomogeneous activity

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Brader, J. M.

    2017-09-01

    We generalize the Green-Kubo approach, previously applied to bulk systems of spherically symmetric active particles [J. Chem. Phys. 145, 161101 (2016), 10.1063/1.4966153], to include spatially inhomogeneous activity. The method is applied to predict the spatial dependence of the average orientation per particle and the density. The average orientation is given by an integral over the self part of the Van Hove function and a simple Gaussian approximation to this quantity yields an accurate analytical expression. Taking this analytical result as input to a dynamic density functional theory approximates the spatial dependence of the density in good agreement with simulation data. All theoretical predictions are validated using Brownian dynamics simulations.
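
    A Brownian dynamics sketch with a spatially varying propulsion speed v0(x) reproduces the qualitative predictions: particles accumulate where activity is low and develop a nonzero mean orientation across the activity gradient. All parameters and the activity profile below are illustrative, not from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        N, steps, dt = 5000, 5000, 1e-3
        Dt, Dr, Lbox = 0.1, 1.0, 10.0

        x = rng.uniform(0, Lbox, N)             # positions in a periodic box
        theta = rng.uniform(0, 2 * np.pi, N)    # 2D orientations

        def v0(x):
            return 1.0 + 0.8 * np.sin(2 * np.pi * x / Lbox)   # smooth activity profile

        for _ in range(steps):                  # Euler-Maruyama integration
            x += v0(x) * np.cos(theta) * dt + np.sqrt(2 * Dt * dt) * rng.standard_normal(N)
            theta += np.sqrt(2 * Dr * dt) * rng.standard_normal(N)
            x %= Lbox

        # spatial profiles of density and mean orientation <cos theta>(x)
        bins = np.linspace(0, Lbox, 41)
        idx = np.digitize(x, bins) - 1
        density = np.bincount(idx, minlength=40)
        polar = np.bincount(idx, weights=np.cos(theta), minlength=40) / np.maximum(density, 1)
        print(density[:5], np.round(polar[:5], 3))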

  13. Refraction-enhanced backlit imaging of axially symmetric inertial confinement fusion plasmas.

    PubMed

    Koch, Jeffrey A; Landen, Otto L; Suter, Laurence J; Masse, Laurent P; Clark, Daniel S; Ross, James S; Mackinnon, Andrew J; Meezan, Nathan B; Thomas, Cliff A; Ping, Yuan

    2013-05-20

    X-ray backlit radiographs of dense plasma shells can be significantly altered by refraction of x rays that would otherwise travel straight-ray paths, and this effect can be a powerful tool for diagnosing the spatial structure of the plasma being radiographed. We explore the conditions under which refraction effects may be observed, and we use analytical and numerical approaches to quantify these effects for one-dimensional radial opacity and density profiles characteristic of inertial-confinement fusion (ICF) implosions. We also show how analytical and numerical approaches allow approximate radial plasma opacity and density profiles to be inferred from point-projection refraction-enhanced radiography data. This imaging technique can provide unique data on electron density profiles in ICF plasmas that cannot be obtained using other techniques, and the uniform illumination provided by point-like x-ray backlighters eliminates a significant source of uncertainty in inferences of plasma opacity profiles from area-backlit pinhole imaging data when the backlight spatial profile cannot be independently characterized. The technique is particularly suited to in-flight radiography of imploding low-opacity shells surrounding hydrogen ice, because refraction is sensitive to the electron density of the hydrogen plasma even when it is invisible to absorption radiography. It may also provide an alternative approach to timing shockwaves created by the implosion drive that are currently invisible to absorption radiography.
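
    The underlying estimate can be sketched numerically: for x rays, n = 1 - delta with delta proportional to electron density, so the deflection of a ray at transverse offset b is the path integral of the transverse gradient of delta. The shell-like density profile and photon energy below are invented, and the sign convention is arbitrary; only the magnitude matters for the sketch.

        import numpy as np

        re_cl = 2.818e-13          # classical electron radius (cm)
        lam = 6.2e-8               # ~2 keV backlighter wavelength (cm)

        def delta(x, z):
            """delta = re * lambda^2 * ne / (2 pi) for a Gaussian shell (ne in cm^-3)."""
            r = np.sqrt(x**2 + z**2)
            ne = 1e24 * np.exp(-((r - 0.02) / 0.005) ** 2)   # dense shell at 200 um
            return re_cl * lam**2 * ne / (2 * np.pi)

        def deflection(b, dz=1e-4, half=0.1, h=1e-5):
            z = np.arange(-half, half, dz)
            grad = (delta(b + h, z) - delta(b - h, z)) / (2 * h)   # d(delta)/dx
            return np.sum(grad) * dz                               # radians

        for b in (0.015, 0.020, 0.025):
            print(f"b = {b * 1e4:.0f} um -> deflection ~ {deflection(b):.2e} rad")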

  14. Location of Road Emergency Stations in Fars Province, Using Spatial Multi-Criteria Decision Making.

    PubMed

    Goli, Ali; Ansarizade, Najmeh; Barati, Omid; Kavosi, Zahra

    2015-01-01

    To locate road emergency stations in Fars province using spatial multi-criteria decision making (the Delphi method). In this study, the criteria affecting the location of road emergency stations were identified through the Delphi method and their importance was determined using the Analytical Hierarchical Process (AHP). With regard to the importance of the criteria and by using a Geographical Information System (GIS), the conformity of the existing stations to the criteria and the pattern of their distribution were explored, and appropriate areas for creating new emergency stations were determined. Moran's Index was used to investigate the spatial distribution pattern of the stations. Accidents (0.318), placement position (0.235), time (0.198), roads (0.160), and population (0.079) were introduced as the main criteria in locating road emergency stations. The findings showed that the distribution of the existing stations was clustered (Moran's I = 0.3). Three priorities were introduced for establishing new stations; areas including Abade, the north of Eghlid and Khoram bid, and small parts of Shiraz, Farashband, Bavanat, and Kazeroon were suggested as the first priority. GIS is a useful and applicable tool for investigating spatial distribution and geographical accessibility to settings that provide health care, including emergency stations.
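
    The AHP step can be sketched by extracting weights from a pairwise comparison matrix via its principal eigenvector and checking consistency. The matrix below is a hypothetical set of pairwise judgements over the study's five criteria; the actual judgements are not given in the abstract.

        import numpy as np

        criteria = ["accidents", "placement", "time", "roads", "population"]
        A = np.array([
            [1,   2,   2,   2,   4],
            [1/2, 1,   1,   2,   3],
            [1/2, 1,   1,   1,   3],
            [1/2, 1/2, 1,   1,   2],
            [1/4, 1/3, 1/3, 1/2, 1],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                               # normalized criterion weights

        n = len(A)
        CI = (eigvals.real[k] - n) / (n - 1)       # consistency index
        CR = CI / 1.12                             # random index for n = 5
        print(dict(zip(criteria, np.round(w, 3))), "CR =", round(CR, 3))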

  15. Spatial genetic analyses reveal cryptic population structure and migration patterns in a continuously harvested grey wolf (Canis lupus) population in north-eastern Europe.

    PubMed

    Hindrikson, Maris; Remm, Jaanus; Männil, Peep; Ozolins, Janis; Tammeleht, Egle; Saarma, Urmas

    2013-01-01

    Spatial genetics is a relatively new field in wildlife and conservation biology that is becoming an essential tool for unravelling the complexities of animal population processes and for designing effective strategies for conservation and management. Conceptual and methodological developments in this field are therefore critical. Here we present two novel methodological approaches that further the analytical possibilities of STRUCTURE and DResD. Using these approaches we analyse structure and migrations in a grey wolf (Canis lupus) population in north-eastern Europe. We genotyped 16 microsatellite loci in 166 individuals sampled from the wolf population in Estonia and Latvia, which has been under strong and continuous hunting pressure for decades. Our analysis demonstrated that this relatively small wolf population comprises four genetic groups. We also used linear interpolation to statistically test the spatial separation of the genetic groups. The new method, which can use program STRUCTURE output, can be applied widely in population genetics to reveal both core areas and areas of low significance for genetic groups. We also applied the recently developed, spatially explicit individual-based method DResD for the first time to microsatellite data, revealing a migration corridor, barriers, and several contact zones.

  16. Visual and Analytic Strategies in Geometry

    ERIC Educational Resources Information Center

    Kospentaris, George; Vosniadou, Stella; Kazic, Smaragda; Thanou, Emilian

    2016-01-01

    We argue that there is an increasing reliance on analytic strategies compared to visuospatial strategies, which is related to geometry expertise and not to individual differences in cognitive style. A Visual/Analytic Strategy Test (VAST) was developed to investigate the use of visuo-spatial and analytic strategies in geometry in 30 mathematics…

  17. Technical Note: A direct ray-tracing method to compute integral depth dose in pencil beam proton radiography with a multilayer ionization chamber.

    PubMed

    Farace, Paolo; Righetto, Roberto; Deffet, Sylvain; Meijers, Arturs; Vander Stappen, Francois

    2016-12-01

    To introduce a fast ray-tracing algorithm in pencil-beam proton radiography (PR) with a multilayer ionization chamber (MLIC) for in vivo range error mapping. Pencil-beam PR was obtained by delivering spots uniformly positioned in a square (45 × 45 mm² field of view) of 9 × 9 spots capable of crossing the phantoms (210 MeV). The exit beam was collected by a MLIC to sample the integral depth dose (IDD_MLIC). PRs of an electron-density phantom and of a head phantom were acquired by moving the couch to obtain multiple 45 × 45 mm² frames. To map the corresponding range errors, the two-dimensional set of IDD_MLIC was compared with (i) the integral depth dose computed by the treatment planning system (TPS) by both analytic (IDD_TPS) and Monte Carlo (IDD_MC) algorithms in a volume of water simulating the MLIC at the CT, and (ii) the integral depth dose directly computed by a simple ray-tracing algorithm (IDD_direct) through the same CT data. The exact spatial position of the spot pattern was numerically adjusted by testing different in-plane positions and selecting the one that minimized the range differences between IDD_direct and IDD_MLIC. Range error mapping was feasible by both the TPS and the ray-tracing methods, but very sensitive to even small misalignments. In homogeneous regions, the range errors computed by the direct ray-tracing algorithm matched the results obtained by both the analytic and the Monte Carlo algorithms. In both phantoms, lateral heterogeneities were better modeled by the ray-tracing and Monte Carlo algorithms than by the analytic TPS computation. Accordingly, when the pencil beam crossed lateral heterogeneities, the range errors mapped by the direct algorithm matched the Monte Carlo maps better than those obtained by the analytic algorithm. Finally, the simplicity of the ray-tracing algorithm allowed a prototype procedure for automated spatial alignment to be implemented. The ray-tracing algorithm can reliably replace the TPS method in MLIC PR for in vivo range verification, and it can be a key component in developing software tools for spatial alignment and correction of CT calibration.
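
    The core of such a direct algorithm - accumulating water-equivalent path length along a straight ray through a CT-derived relative stopping power (RSP) volume - can be sketched in a few lines. The voxel grid, RSP values, and nearest-neighbour sampling below are simplifying assumptions for illustration.

        import numpy as np

        rsp = np.ones((64, 64, 64))          # water-equivalent background
        rsp[20:40, 20:40, :] = 1.1           # embedded denser block
        voxel_mm = 2.0

        def wepl(entry, direction, step_mm=0.5, n_steps=400):
            """Sum RSP * step along a straight ray (nearest-neighbour sampling)."""
            d = np.asarray(direction, float)
            d /= np.linalg.norm(d)
            p = np.asarray(entry, float)
            total = 0.0
            for _ in range(n_steps):
                ijk = np.floor(p / voxel_mm).astype(int)
                if np.all((0 <= ijk) & (ijk < 64)):
                    total += rsp[tuple(ijk)] * step_mm
                p += d * step_mm
            return total

        print("WEPL (mm):", wepl(entry=(0.0, 64.0, 64.0), direction=(1.0, 0.0, 0.0)))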

  18. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  19. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
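
    The procedure lends itself to a very small script: compute monthly medians per analyte and flag months whose deviation from the long-term median exceeds the allowable-bias specification. The synthetic data and the 3% limit below are assumptions for illustration.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)
        dates = pd.date_range("2024-01-01", periods=365, freq="D")
        results = pd.Series(140 + rng.normal(0, 5, len(dates)), index=dates)

        monthly = results.resample("M").median()       # monthly patient medians
        target = results.median()                      # long-term median
        allowable_bias = 0.03                          # from biological variation
        deviation = (monthly - target) / target
        flagged = deviation[deviation.abs() > allowable_bias]
        print("months exceeding the bias limit:", list(flagged.index.strftime("%Y-%m")))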

  20. Propagation of flat-topped multi-Gaussian beams through a double-lens system with apertures.

    PubMed

    Gao, Yanqi; Zhu, Baoqiang; Liu, Daizhong; Lin, Zunqi

    2009-07-20

    A general model for different apertures and flat-topped laser beams based on the multi-Gaussian function is developed. A general analytical expression for the propagation of a flat-topped beam through a double-lens system with apertures is derived using this model. The propagation characteristics of the flat-topped beam through a spatial filter are then investigated by using a simplified analytical expression. Based on the fluence beam contrast and the fill factor, the influences of pinhole size on the propagation of the flat-topped multi-Gaussian beam (FMGB) through the spatial filter are illustrated. An analytical expression for the propagation of the FMGB through the spatial filter with a misaligned pinhole is presented, and the influences of the pinhole offset are evaluated.
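
    The multi-Gaussian construction itself is easy to visualize: a near-uniform profile emerges from a sum of identical Gaussians with equally spaced centres. The spacing, width, and fill-factor definition below are illustrative assumptions; the paper's analysis additionally propagates such beams through apertured lens systems.

        import numpy as np

        x = np.linspace(-6, 6, 1001)
        N, spacing, w = 4, 0.8, 1.0                 # 2N+1 Gaussians across the top

        def flat_top(x):
            centres = spacing * np.arange(-N, N + 1)
            return sum(np.exp(-((x - c) / w) ** 2) for c in centres)

        E = flat_top(x)
        E /= E.max()
        aperture = np.abs(x) <= N * spacing
        # fill factor: mean intensity inside the nominal aperture / peak intensity
        print("fill factor:", round(float(np.mean(E[aperture] ** 2)), 3))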

  1. Spatial and temporal epidemiological analysis in the Big Data era.

    PubMed

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and emergence of new infectious pathogens, have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and should take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. This includes relational databases, geographical information systems and most recently, cloud-based data storage such as Hadoop distributed file systems. While the development in analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis where the spectrum of analytical methods ranges from visualisation and exploratory analysis, to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle large datasets faster than classical regression approaches, are now also used to analyse spatial and spatio-temporal data. Multi-criteria decision analysis methods have gained greater acceptance, due in part, to the need to increasingly combine data from diverse sources including published scientific information and expert opinion in an attempt to fill important knowledge gaps. The opportunities for more effective prevention, detection and control of animal health threats arising from these developments are immense, but not without risks given the different types, and much higher frequency, of biases associated with these data. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Scalable Earth-observation Analytics for Geoscientists: Spacetime Extensions to the Array Database SciDB

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon

    2016-04-01

    Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as those obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address the scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits naturally to earth-observation datasets, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series based analyses, which are usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB and file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth-observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database, and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows remote sensing imagery to be ingested from, and exported to, a large number of file formats. Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery into existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been developed with a minimum of external dependencies (i.e., CURL). Source code for both tools is available from GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays, significantly reducing the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal

  3. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  4. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  5. Hydrologic controls on basin-scale distribution of benthic macroinvertebrates

    NASA Astrophysics Data System (ADS)

    Bertuzzo, E.; Ceola, S.; Singer, G. A.; Battin, T. J.; Montanari, A.; Rinaldo, A.

    2013-12-01

    The presentation deals with the role of streamflow variability in basin-scale distributions of benthic macroinvertebrates. Specifically, we present a probabilistic analysis of how the variability of relevant hydraulic variables along the river network affects the density of benthic macroinvertebrate species. The relevance of this work lies in the implications that predictable macroinvertebrate patterns within a catchment have for fluvial ecosystem health, macroinvertebrates being commonly used as sensitive indicators, and for assessing the effects of anthropogenic activity. The analytical tools presented here outline a novel procedure of general nature aiming at a spatially explicit quantitative assessment of how near-bed flow variability affects benthic macroinvertebrate abundance. Moving from the analytical characterization of the at-a-site probability distribution functions (pdfs) of streamflow and bottom shear stress, a spatial extension to a whole river network is performed, yielding spatial maps of streamflow and bottom shear stress. The bottom shear stress pdf, coupled with habitat suitability curves (e.g., empirical relations between species density and bottom shear stress) derived from field studies, is then used to produce maps of macroinvertebrate suitability to shear stress conditions. Thus, starting from measured hydrologic conditions, the possible effects of river streamflow alterations on macroinvertebrate densities may be fairly assessed. We apply this framework to an Austrian river network, used as a benchmark for the analysis, for which rainfall and streamflow time series, river network hydraulic properties, and macroinvertebrate density data are available. A comparison between observed and "modeled" species densities at three locations along the examined river network is also presented. Although the proposed approach focuses on a single controlling factor, it has important implications for water resources management and fluvial ecosystem protection.
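
    The coupling step can be sketched directly: integrate an at-a-site pdf of bottom shear stress against an empirical habitat suitability curve to obtain the expected species density at that site. Both functions below are illustrative stand-ins for the fitted distributions and field-derived curves.

        import numpy as np

        tau = np.linspace(0.01, 20, 2000)            # bottom shear stress (Pa)
        dtau = tau[1] - tau[0]

        # lognormal shear-stress pdf (stand-in for the streamflow-derived pdf)
        mu, sigma = 1.0, 0.6
        pdf = np.exp(-(np.log(tau) - mu) ** 2 / (2 * sigma ** 2)) \
              / (tau * sigma * np.sqrt(2 * np.pi))

        def suitability(tau, tau_opt=2.5, width=1.5, dmax=120.0):
            """Bell-shaped density response peaking at a preferred stress."""
            return dmax * np.exp(-((tau - tau_opt) / width) ** 2)

        expected_density = np.sum(pdf * suitability(tau) * dtau)
        print(f"expected density: {expected_density:.1f} individuals/m^2")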

  6. Merging spatially variant physical process models under an optimized systems dynamics framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, William O.; Lowry, Thomas Stephen; Pierce, Suzanne A.

    The complexity of water resource issues, their interconnectedness with other systems, and the involvement of competing stakeholders often overwhelm decision-makers and inhibit the creation of clear management strategies. While a range of modeling tools and procedures exist to address these problems, they tend to be case specific and generally emphasize either a quantitative, overly analytic approach or a qualitative, dialogue-based approach that lacks the ability to fully explore the consequences of different policy decisions. The integration of these two approaches is needed to drive toward final decisions and engender effective outcomes. Given these limitations, the Computer Assisted Dispute Resolution system (CADRe) was developed to aid in stakeholder-inclusive resource planning. This modeling and negotiation system uniquely addresses resource concerns by developing a spatially varying system dynamics model as well as innovative global optimization search techniques to maximize outcomes from participatory dialogues. Ultimately, the core system architecture of CADRe also serves as the cornerstone upon which key scientific innovations and challenges can be addressed.

  7. Typology and indicators of ecosystem services for marine spatial planning and management.

    PubMed

    Böhnke-Henrichs, Anne; Baulcomb, Corinne; Koss, Rebecca; Hussain, S Salman; de Groot, Rudolf S

    2013-11-30

    The ecosystem services concept provides both an analytical and communicative tool to identify and quantify the link between human welfare and the environment, and thus to evaluate the ramifications of management interventions. Marine spatial planning (MSP) and Ecosystem-based Management (EBM) are a form of management intervention that has become increasingly popular and important globally. The ecosystem service concept is rarely applied in marine planning and management to date which we argue is due to the lack of a well-structured, systematic classification and assessment of marine ecosystem services. In this paper we not only develop such a typology but also provide guidance to select appropriate indicators for all relevant ecosystem services. We apply this marine-specific ecosystem service typology to MSP and EBM. We thus provide not only a novel theoretical construct but also show how the ecosystem services concept can be used in marine planning and management. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Development of a 3D GIS and its application to karst areas

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Xu, Hua; Zhou, Wanfang

    2008-05-01

    There is a growing interest in modeling and analyzing karst phenomena in three dimensions. This paper integrates geology, groundwater hydrology, geographic information system (GIS), database management system (DBMS), visualization and data mining to study karst features in Huaibei, China. The 3D geo-objects retrieved from the karst area are analyzed and mapped into different abstract levels. The spatial relationships among the objects are constructed by a dual-linker. The shapes of the 3D objects and the topological models with attributes are stored and maintained in the DBMS. Spatial analysis was then used to integrate the data in the DBMS and the 3D model to form a virtual reality (VR) to provide analytical functions such as distribution analysis, correlation query, and probability assessment. The research successfully implements 3D modeling and analyses in the karst area, and meanwhile provides an efficient tool for government policy-makers to set out restrictions on water resource development in the area.

  9. LWIR hyperspectral micro-imager for detection of trace explosive particles

    NASA Astrophysics Data System (ADS)

    Bingham, Adam L.; Lucey, Paul G.; Akagi, Jason T.; Hinrichs, John L.; Knobbe, Edward T.

    2014-05-01

    Chemical micro-imaging is a powerful tool for the detection and identification of analytes of interest against a cluttered background (e.g., trace explosive particles left behind in a fingerprint). While a variety of groups have demonstrated the efficacy of Raman instruments for these applications, point-by-point or line-by-line acquisition of a targeted field of view (FOV) is a time-consuming process if it is to be accomplished with useful spatial resolution. Spectrum Photonics has developed and demonstrated a prototype system utilizing long-wave infrared (LWIR) hyperspectral microscopy, which enables the simultaneous collection of LWIR reflectance spectra from 8-14 μm in a 30 x 7 mm FOV with 30 μm spatial resolution in 30 s. An overview of the uncooled Sagnac-based LWIR HSM system is given, emphasizing the benefits of this approach. Laboratory hyperspectral data collected from custom mixtures and fingerprint residues are shown, focusing on the ability of the LWIR chemical micro-imager to detect chemicals of interest against a cluttered background.

  10. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  11. Visualisation and Analytic Strategies for Anticipating the Folding of Nets

    ERIC Educational Resources Information Center

    Wright, Vince

    2016-01-01

    Visual and analytic strategies are features of students' schemes for spatial tasks. The strategies used by six students to anticipate the folding of nets were investigated. Evidence suggested that visual and analytic strategies were strongly connected in competent performance.

  12. EFFECTS OF LASER RADIATION ON MATTER. LASER PLASMA: Spatial-temporal distribution of a mechanical load resulting from interaction of laser radiation with a barrier (analytic model)

    NASA Astrophysics Data System (ADS)

    Fedyushin, B. T.

    1992-01-01

    The concepts developed earlier are used to propose a simple analytic model describing the spatial-temporal distribution of a mechanical load (pressure, impulse) resulting from interaction of laser radiation with a planar barrier surrounded by air. The correctness of the model is supported by a comparison with experimental results.

  13. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy, and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  14. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy, and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  15. The analysis of a cardiological network in a regulated setting: a spatial interaction approach.

    PubMed

    Lippi Bruni, Matteo; Nobilio, Lucia; Ugolini, Cristina

    2008-02-01

    We analyse referral patterns for patients undergoing percutaneous transluminal coronary angioplasty (PTCA) in the Emilia-Romagna region of Italy, a procedure whose territorial concentration is justified by the assumption of a negative association between volume and adverse outcomes. Nevertheless, recent clinical evidence shows the superiority of PTCA for immediate treatment of acute myocardial infarction, which argues for an increase in the number of points of delivery. Our paper aims to develop analytical tools designed to support policy makers when they are asked to evaluate the spatial distribution of catheterisation laboratories that perform PTCA. Information is drawn from the regional administrative hospital discharge data (SDO) for the year 2002. We first use entropy indexes to investigate the spatial accessibility of the cardiological network. Secondly, by means of a gravity model estimated using Bayesian techniques, we identify the determinants of patient flows in terms of demand and supply factors. Our results suggest that information on destinations is processed hierarchically and that agglomeration-like forces are dominant. Furthermore, although self-sufficiency of provision at the provincial level has been achieved to a large extent, there is still scope to improve the organisational efficiency of the network.

  16. Mining Mathematics in Textbook Lessons

    ERIC Educational Resources Information Center

    Ronda, Erlina; Adler, Jill

    2017-01-01

    In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…

  17. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  18. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  19. A powerful tool for assessing distribution and fate of potentially toxic metals (PTMs) in soils: integration of laser ablation spectrometry (LA-ICP-MS) on thin sections with soil micromorphology and geochemistry.

    PubMed

    Scarciglia, Fabio; Barca, Donatella

    2017-04-01

    The dynamic behavior and inherent spatial heterogeneity, at different hierarchic levels, of the soil system often make the spatial distribution of potentially toxic metals (PTMs) quite complex and difficult to assess correctly. This work demonstrates that the application of laser ablation spectrometry (LA-ICP-MS) to soil thin sections constitutes a powerful ancillary tool to well-established analytical methods for tracing the behavior and fate of potential soil contaminants at the microsite level. It allowed the contribution of PTMs to be discriminated in distinct soil sub-components, such as parent rock fragments, neoformed, clay-enriched or humified matrix, and specific pedogenetic features of illuvial origin (unstained or iron-stained clay coatings), even at very low contents. PTMs were analyzed in three soil profiles located in the Muravera area (Sardinia, Italy), where several, now abandoned, mines were exploited. Recurrent trends of increase of many PTMs from rock to pedogenic matrix to illuvial clay coatings, traced by LA-ICP-MS compositional data, revealed a pedogenetic control on metal fractionation and distribution, based on the adsorption properties of clay minerals, iron oxyhydroxides, or organic matter, and on down-profile illuviation processes. The main PTM patterns, coupled with SEM-EDS analyses, suggest that heavy-metal-bearing mineral grains were sourced from the mine plants, in addition to the natural sedimentary input. The interplay between soil-forming processes and geomorphic dynamics significantly contributed to the PTM spatial distribution detected in the different pedogenetic horizons and soil features.

  20. The development of the rhizosphere: simulation of root exudation for two contrasting exudates: citrate and mucilage

    NASA Astrophysics Data System (ADS)

    Sheng, Cheng; Bol, Roland; Vetterlein, Doris; Vanderborght, Jan; Schnepf, Andrea

    2017-04-01

    Different types of root exudates and their effects on soil/rhizosphere properties have received a lot of attention. Since their influence on rhizosphere properties and processes depends on their concentration in the soil, assessing the spatial-temporal exudate concentration distribution around roots is of key importance for understanding the functioning of the rhizosphere. Different root systems have different root architectures, and different types of root exudates diffuse in the rhizosphere with different diffusion coefficients; both factors shape the dynamics of the exudate concentration distribution in the rhizosphere. Hence, simulations of root exudation involving four plant root systems (Vicia faba, Lupinus albus, Triticum aestivum and Zea mays) and two root exudates (citrate and mucilage) were conducted. We consider a simplified root architecture where each root is represented by a straight line. Assuming that root tips move at a constant velocity and that mucilage transport is linear, concentration distributions can be obtained from a convolution of the analytical solution of the transport equation in a stationary flow field for an instantaneous point-source injection with the spatial-temporal distribution of the source strength (see the schematic form below). By coupling this analytical equation with a root growth model that delivers the spatial-temporal source term, we simulated exudate concentration distributions for citrate and mucilage in MATLAB. From the simulation results, we inferred the following about the rhizosphere: (a) the dynamics of root architecture development is the main driver of the exudate distribution in the root zone; (b) a steady rhizosphere of constant width is more likely to develop around individual roots when the diffusion coefficient is small. The simulations suggest that rhizosphere development depends on root and exudate properties as follows: the dynamics of the root architecture result in various development patterns of the rhizosphere. The results also improve our understanding of the impact of the spatial and temporal heterogeneity of exudate input on rhizosphere development for different root system types and substances. In future work, we will use the simulation tool to infer, from experimental data, critical parameters that determine the spatial-temporal extent of the rhizosphere.
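    Under the stated linearity assumption, the concentration field takes the schematic convolution form (our notation, not the authors'):

\[
  c(\mathbf{x},t) \;=\; \int_{0}^{t}\!\!\int_{\Omega} G\!\left(\mathbf{x}-\mathbf{x}',\, t-t'\right)\, S\!\left(\mathbf{x}',t'\right)\, \mathrm{d}\mathbf{x}'\,\mathrm{d}t' ,
\]

    where \(G\) is the analytical solution of the transport equation in the stationary flow field for an instantaneous point-source injection, and \(S\) is the space-time source strength delivered by the root growth model (root tips moving at constant velocity along straight lines).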

  1. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  2. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on analytical expressions of Zernike polynomials and a power spectral density (PSD) model, such re-sampling does not introduce the aliasing and interpolation errors produced by conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. The new method also automatically eliminates measurement noise and other measurement errors such as artificial discontinuities. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match the gridded data format of a model validation tool, and (2) eliminating surface measurement noise or measurement errors so that the resulting surface height map is continuous or smoothly varying. So far, the preferred method for re-sampling a surface map has been two-dimensional interpolation. Its main problem is that the same pixel can take different values as the interpolation method is changed among, for example, the "nearest," "linear," "cubic," and "spline" fitting options in Matlab. The conventional FFT-based spatial filtering used to eliminate surface measurement noise or measurement errors can likewise suffer from aliasing effects. During re-sampling of a surface map, this software preserves the low-spatial-frequency characteristics of a given surface map through the Zernike-polynomial fit coefficients, and maintains its mid- and high-spatial-frequency characteristics through a PSD model derived from the two-dimensional PSD data of the mid- and high-spatial-frequency components of the original surface map. Because this new method creates the new surface map in the desired sampling format from analytical expressions only, it does not encounter aliasing effects and does not cause any discontinuity in the resulting surface map.
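    A minimal sketch of the low-spatial-frequency half of this idea follows: fit a few low-order polynomial terms (a small hand-rolled subset of the Zernike set, for illustration only) on the unit disk, then evaluate the fit analytically on a new grid. The PSD-based synthesis of mid- and high-frequency content described above is omitted here:

```python
# Sketch only: low-order polynomial fit on the unit disk, evaluated on a new
# grid. Because the new map comes from analytical expressions, the re-sampling
# introduces no interpolation or aliasing error (at these low orders).
import numpy as np

def low_order_basis(x, y):
    r2 = x**2 + y**2
    # piston, tip, tilt, defocus, and two astigmatism-like terms
    return np.stack([np.ones_like(x), x, y, 2*r2 - 1, x**2 - y**2, 2*x*y], axis=-1)

def resample_low_order(surface: np.ndarray, n_out: int) -> np.ndarray:
    n = surface.shape[0]
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    mask = x**2 + y**2 <= 1.0                        # fit on the unit disk
    A = low_order_basis(x[mask], y[mask])
    coeffs, *_ = np.linalg.lstsq(A, surface[mask], rcond=None)
    yo, xo = np.mgrid[-1:1:n_out*1j, -1:1:n_out*1j]
    out = low_order_basis(xo.ravel(), yo.ravel()) @ coeffs
    return out.reshape(n_out, n_out)                 # smooth, noise-free map

# Example: down-sample a noisy 256x256 map to 64x64
noisy = 2 * np.random.default_rng(0).random((256, 256)) - 1
smooth64 = resample_low_order(noisy, 64)
```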

  3. New analytical solutions to the two-phase water faucet problem

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-06-17

    Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, Ransom's solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when compared to the existing Ransom's analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the effect of the gas-phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom's transient solutions for the gas-phase velocity and pressure are derived, with the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom's solutions are also presented.
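    For reference, the classical transient Ransom solution that this work extends can be stated as follows (inlet velocity \(v_0\), inlet void fraction \(\alpha_0\), gravitational acceleration \(g\)):

\[
  v_\ell(x,t) =
  \begin{cases}
    \sqrt{v_0^2 + 2 g x}, & x \le v_0 t + \tfrac{1}{2} g t^2,\\[4pt]
    v_0 + g t, & \text{otherwise},
  \end{cases}
  \qquad
  \alpha_g(x,t) =
  \begin{cases}
    1 - \dfrac{(1-\alpha_0)\, v_0}{\sqrt{v_0^2 + 2 g x}}, & x \le v_0 t + \tfrac{1}{2} g t^2,\\[8pt]
    \alpha_0, & \text{otherwise}.
  \end{cases}
\]

    The paper's new steady-state solutions additionally account for the gas-phase density's effect on the pressure distribution, which this classical form neglects.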

  4. The NASA Reanalysis Ensemble Service - Advanced Capabilities for Integrated Reanalysis Access and Intercomparison

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2017-12-01

    NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations; the operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables; this near-real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following:
    - A new API that supports full temporal, spatial, and grid-based resolution services with sample queries
    - A Docker-ready RES application to deploy across platforms
    - Extended capabilities that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, standard deviations, and ensemble averages
    - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly)
    - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR…
    - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management
    - Supporting analytic services for NASA GMAO Forward Processing datasets
    - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization)
    - The ability to compute and visualize multiple reanalyses for ease of intercomparison
    - Automated tools to retrieve and prepare data collections for analytic processing
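    As an illustration of how client-side access through such a library might look, here is a deliberately hypothetical sketch; the import path, class name, endpoint URL, and parameters below are invented for illustration and are not the documented CDSlib API:

```python
# Hypothetical illustration only: every name below (import path, class,
# endpoint, method, and keyword arguments) is invented for this sketch and is
# NOT the documented CDSlib API.
from cdslib import ReanalysisEnsemble  # assumed import

res = ReanalysisEnsemble(endpoint="https://example.nasa.gov/res")  # placeholder URL
anomaly = res.compute(
    operation="anomaly",                   # e.g. avg, sum, max, min, var, count, anomaly
    datasets=["MERRA-2", "ERA-Interim"],   # two of the supported reanalyses
    variable="T2M",                        # near-surface air temperature
    bbox=(-10.0, 35.0, 40.0, 70.0),        # lon/lat sub-setting (W, S, E, N)
    period=("1980-01", "2016-12"),         # temporal sub-setting
)
```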

  5. Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation

    NASA Astrophysics Data System (ADS)

    Guillaume Courty, Laurent; Pedrozo-Acuña, Adrián; Bates, Paul David

    2017-05-01

    Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables 2-D numerical modelling of rainfall-runoff processes and surface flows, integrated with the open-source geographic information system (GIS) software known as GRASS. It therefore takes advantage of the ability of GIS environments to handle datasets with variations in both temporal and spatial resolution. Furthermore, the presented numerical tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data. This ability reduces the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green-Ampt model for infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved able to reproduce the analytic and synthetic test cases. Moreover, simulation results for the real flood event showed its suitability for identifying areas affected by flooding, which were verified against those recorded after the event by local authorities.

  6. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  7. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  8. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  9. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  10. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
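    To make the two atomic operators concrete, here is a toy sketch in the spirit of such an algebra; it is our illustration, not the authors' implementation:

```python
# Toy sketch of two atomic graph-algebra operators: selection (filter nodes by
# a predicate) and aggregation (merge nodes sharing an attribute value into
# super-nodes, accumulating edge multiplicities as weights).
import networkx as nx

def select(G: nx.Graph, pred) -> nx.Graph:
    """Selection: keep only nodes whose attribute dict satisfies `pred`."""
    return G.subgraph([n for n, d in G.nodes(data=True) if pred(d)]).copy()

def aggregate(G: nx.Graph, key: str) -> nx.Graph:
    """Aggregation: contract nodes with equal `key` values into super-nodes."""
    H = nx.Graph()
    for u, v in G.edges():
        gu, gv = G.nodes[u][key], G.nodes[v][key]
        if gu == gv:
            continue
        w = H[gu][gv]["weight"] + 1 if H.has_edge(gu, gv) else 1
        H.add_edge(gu, gv, weight=w)
    return H

G = nx.Graph()
G.add_nodes_from([(1, {"kind": "doc"}), (2, {"kind": "doc"}),
                  (3, {"kind": "author"})])
G.add_edges_from([(1, 3), (2, 3)])
print(select(G, lambda d: d["kind"] == "doc").nodes())
print(aggregate(G, "kind").edges(data=True))
```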

  11. Chemical Functionalization of Plasmonic Surface Biosensors: A Tutorial Review on Issues, Strategies, and Costs

    PubMed Central

    2017-01-01

    In an ideal plasmonic surface sensor, the bioactive area, where analytes are recognized by specific biomolecules, is surrounded by an area that is generally composed of a different material. The latter, often the surface of the supporting chip, is generally hard to functionalize selectively with respect to the active area. As a result, cross-talk between the active area and the surrounding one may occur. In designing a plasmonic sensor, various issues must be addressed: the specificity of analyte recognition, the orientation of the immobilized biomolecule that acts as the analyte receptor, and the selectivity of surface coverage. The objective of this tutorial review is to introduce the main rational tools required for a correct and complete approach to chemically functionalizing plasmonic surface biosensors. After a short introduction, the review discusses, in detail, the most common strategies for achieving effective surface functionalization. The most important issues, such as the orientation of active molecules and spatial and chemical selectivity, are considered. A list of well-defined protocols is suggested for the most common practical situations. Importantly, for the reported protocols, we also present direct comparisons in terms of costs, labor demand, and risk-benefit balance. In addition, a survey of the most common characterization techniques necessary to validate the chemical protocols is reported. PMID:28796479

  12. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen that is sensitive to delays in centrifugation and/or analysis. The results of analyses of samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable, optimized pre-analytical phase. The aims of the comparison were (a) to identify those medical practices whose mean/median sample values deviate significantly from those of the control situation in the hospital laboratory, owing to possible problems in the pre-analytical phase, and (b) to aid these laboratories in rectifying those problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed that addresses the above-mentioned problems. It has been tested on serum potassium, which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has educational value and can also be adopted for use in other decentralized laboratories.
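    The comparison logic behind such a survey can be sketched in a few lines; the file name, column names, and the 0.5 mmol/l cutoff below are illustrative assumptions, not the PAS tool's actual parameters:

```python
# Sketch of the survey idea in pandas: flag sending practices whose median
# serum potassium deviates from the on-site laboratory reference (delayed
# centrifugation tends to raise measured potassium). Names are illustrative.
import pandas as pd

results = pd.read_csv("potassium_results.csv")   # columns: source, K_mmol_l
ref = results.loc[results["source"] == "hospital", "K_mmol_l"].median()
practice_medians = (results[results["source"] != "hospital"]
                    .groupby("source")["K_mmol_l"].median())
flagged = practice_medians[(practice_medians - ref).abs() > 0.5]  # toy cutoff
print(flagged)   # practices to investigate for pre-analytical problems
```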

  13. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  14. Biogeochemistry and Spatial Distribution of the Microbial-Mineral Interface Using I2LD-FTMS

    NASA Astrophysics Data System (ADS)

    Scott, J. R.; Kauffman, M. E.; Kauffman, M. E.; Tremblay, P. L.

    2001-12-01

    Previous studies indicate that biogeochemistry can vary within individual mineral specimens in contact with microorganisms. These same studies have shown that microcosms containing a mixture of minerals simulating a heterogeneous geologic matrix do not yield the same results as the naturally occurring rock. Therefore, it is of utmost importance to develop analytical tools that can provide spatially correlative biogeochemical data of the microbial-mineral interface within naturally occurring geologic matrices. Imaging internal laser desorption Fourier transform mass spectrometry (I2LD-FTMS) can provide elemental and molecular information of the microbial-mineral interface at a spatial resolution limited only by the optical diffraction limit of the final focusing lens (down to 2 μm). Additionally, the I2LD-FTMS used in this study has exceptional reproducibility, which can provide successive mapping sequences for depth-profiling studies. Basalt core samples, taken from the Snake River Plain Aquifer in southeastern Idaho, were mapped prior to, and after, exposure to a bacterial culture. The bacteria-basalt interface spectra were collected using the I2LD-FTMS at the INEEL. Mass spectra were recorded over a mass-to-charge range of 30-2500 Da with an average peak resolution of 15,000 using 10 μm spots. Two-dimensional maps were constructed depicting the spatial distribution of the minerals within the basalt as well as the spatial distribution of the bacteria on the basalt surface. This represents the first reported application of I2LD-FTMS in the field of biogeochemistry.

  15. Understanding spatio-temporal mobility patterns for seniors, child/student and adult using smart card data

    NASA Astrophysics Data System (ADS)

    Huang, X.; Tan, J.

    2014-11-01

    Commutes in urban areas create interesting travel patterns that are often stored in regional transportation databases. These patterns can vary with the day of the week, the time of day, and the commuter type. This study proposes methods to detect underlying spatio-temporal variability among three groups of commuters (senior citizens, children/students, and adults) using data mining and spatial analytics. Data from over 36 million individual trip records, collected over one week (March 2012) on the Singapore bus and Mass Rapid Transit (MRT) system by the fare collection system, were used. Analyses of such data are important for transportation and land-use designers and contribute to a better understanding of urban dynamics. Specifically, descriptive statistics, network analysis, and spatial analysis methods are presented. Descriptive variables such as density and duration were proposed to capture temporal features of travel. A directed weighted graph G ≡ (N, L, W) was defined to analyze the global network properties of every pair of transportation links in the city during an average workday for all three categories (a toy construction is sketched below). Besides, spatial interpolation and spatial statistics tools were used to transform the discrete network nodes into a structured human-movement landscape, to understand the role of transportation systems in urban areas. The travel behaviour of the three categories follows a certain degree of temporal and spatial universality, but each category also displays unique patterns, being characterized by its own peak hours, commute distances, and specific travel locations on weekdays.
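    A toy construction of such a trip graph from fare records might look as follows; the field names and records are invented for illustration:

```python
# Toy construction of the directed weighted trip graph G = (N, L, W):
# nodes N = stops/stations, links L = origin-destination pairs,
# weights W = number of observed trips on each link. Data are illustrative.
import pandas as pd
import networkx as nx

trips = pd.DataFrame({
    "origin":      ["A", "A", "B", "C", "A"],
    "destination": ["B", "B", "C", "A", "C"],
    "category":    ["adult", "senior", "adult", "child", "adult"],
})

def trip_graph(df: pd.DataFrame) -> nx.DiGraph:
    G = nx.DiGraph()
    for (o, d), w in df.groupby(["origin", "destination"]).size().items():
        G.add_edge(o, d, weight=int(w))
    return G

# One graph per commuter category allows category-wise network comparisons
G_adult = trip_graph(trips[trips["category"] == "adult"])
print(G_adult.edges(data=True))
```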

  16. TXM-Wizard: a program for advanced data collection and evaluation in full-field transmission X-ray microscopy

    PubMed Central

    Liu, Yijin; Meirer, Florian; Williams, Phillip A.; Wang, Junyue; Andrews, Joy C.; Pianetta, Piero

    2012-01-01

    Transmission X-ray microscopy (TXM) has been well recognized as a powerful tool for non-destructive investigation of the three-dimensional inner structure of a sample with spatial resolution down to a few tens of nanometers, especially when combined with synchrotron radiation sources. Recent developments of this technique have presented a need for new tools for both system control and data analysis. Here a software package developed in MATLAB for script command generation and analysis of TXM data is presented. The first toolkit, the script generator, allows automating complex experimental tasks which involve up to several thousand motor movements. The second package was designed to accomplish computationally intense tasks such as data processing of mosaic and mosaic tomography datasets; dual-energy contrast imaging, where data are recorded above and below a specific X-ray absorption edge; and TXM X-ray absorption near-edge structure imaging datasets. Furthermore, analytical and iterative tomography reconstruction algorithms were implemented. The compiled software package is freely available. PMID:22338691

  17. Evaluating the utility of companion animal tick surveillance practices for monitoring spread and occurrence of human Lyme disease in West Virginia, 2014-2016.

    PubMed

    Hendricks, Brian; Mark-Carew, Miguella; Conley, Jamison

    2017-11-13

    Domestic dogs and cats are potentially effective sentinel populations for monitoring the occurrence and spread of Lyme disease. Few studies have evaluated the public health utility of sentinel programmes using geo-analytic approaches. Confirmed Lyme disease cases diagnosed by physicians and ticks submitted by veterinarians to the West Virginia State Health Department were obtained for 2014-2016. Ticks were identified to species, and only Ixodes scapularis were included in the analysis. Separate ordinary least squares (OLS) and spatial lag regression models were fitted to estimate the association between the average number of Ix. scapularis collected on pets and human Lyme disease incidence. Regression residuals were visualised using Local Moran's I as a diagnostic tool to identify spatial dependence. Statistically significant associations were identified between the average number of Ix. scapularis collected from dogs and human Lyme disease in the OLS (β=20.7, P<0.001) and spatial lag (β=12.0, P=0.002) regressions. No significant associations were identified for cats in either regression model. Statistically significant (P≤0.05) spatial dependence was identified in all regression models. Local Moran's I maps produced for the spatial lag regression residuals indicated a decrease in model over- and under-estimation, but identified a larger number of statistically significant outliers than the OLS regression. The results support previous conclusions that dogs are effective sentinel populations for monitoring the risk of human exposure to Lyme disease. The findings reinforce the utility of spatial analysis of surveillance data, and highlight West Virginia's unique position within the eastern United States with regard to Lyme disease occurrence.
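    A sketch of this paired OLS / spatial-lag workflow, using PySAL's spreg package rather than the authors' own code, might look as follows; the input file and column names are hypothetical stand-ins for the study's data:

```python
# Sketch of an OLS vs. spatial-lag comparison with PySAL's spreg.
# "wv_counties.shp" and the column names are hypothetical placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from spreg import OLS, ML_Lag

counties = gpd.read_file("wv_counties.shp")          # hypothetical input
y = counties[["lyme_incidence"]].to_numpy()          # outcome (n x 1)
X = counties[["avg_dog_ticks"]].to_numpy()           # predictor (n x k)

w = Queen.from_dataframe(counties)                   # contiguity weights
w.transform = "r"                                    # row-standardise

# OLS with spatial diagnostics flags residual dependence (Moran's I, LM tests)
ols = OLS(y, X, w=w, spat_diag=True, moran=True,
          name_y="lyme_incidence", name_x=["avg_dog_ticks"])
# The spatial-lag model absorbs that dependence via a spatially lagged outcome
lag = ML_Lag(y, X, w=w, name_y="lyme_incidence", name_x=["avg_dog_ticks"])

print(ols.summary)
print(lag.summary)
```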

  18. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality of knowledge.

  19. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools are needed specifically for elastic-plastic fracture analysis, a regime that is currently treated empirically for the Space Shuttle External Tank (ET) and handled by simulated service testing of pre-cracked panels.

  20. Fluorescence correlation spectroscopy of diffusion probed with a Gaussian Lorentzian spatial distribution

    NASA Astrophysics Data System (ADS)

    Marrocco, Michele

    2007-11-01

    Fluorescence correlation spectroscopy is fundamental in many physical, chemical and biological studies of molecular diffusion. However, the concept of fluorescence correlation is founded on the assumption that the analytical description of the correlation decay of diffusion can be achieved if the spatial profile of the detected volume obeys a three-dimensional Gaussian distribution. In the present Letter, the analytical result is instead proven for the fundamental Gaussian-Lorentzian profile.
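    For context, the standard result the Letter refers to, the diffusion autocorrelation for a three-dimensional Gaussian detection volume with lateral waist \(w_0\) and axial extent \(z_0\), has the familiar closed form

\[
  G(\tau) \;=\; \frac{1}{\langle N \rangle}
  \left(1 + \frac{\tau}{\tau_D}\right)^{-1}
  \left(1 + \frac{\tau}{\kappa^2 \tau_D}\right)^{-1/2},
  \qquad
  \tau_D = \frac{w_0^2}{4D}, \quad \kappa = \frac{z_0}{w_0},
\]

    where \(\langle N \rangle\) is the mean number of molecules in the detection volume and \(D\) the diffusion coefficient. The Letter's contribution is to derive the corresponding analytical decay when the physically realistic Gaussian-Lorentzian focal profile replaces this Gaussian idealization.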

  1. Next-generation technologies for spatial proteomics: Integrating ultra-high speed MALDI-TOF and high mass resolution MALDI FTICR imaging mass spectrometry for protein analysis.

    PubMed

    Spraggins, Jeffrey M; Rizzo, David G; Moore, Jessica L; Noto, Michael J; Skaar, Eric P; Caprioli, Richard M

    2016-06-01

    MALDI imaging mass spectrometry is a powerful analytical tool enabling the visualization of biomolecules in tissue. However, there are unique challenges associated with protein imaging experiments including the need for higher spatial resolution capabilities, improved image acquisition rates, and better molecular specificity. Here we demonstrate the capabilities of ultra-high speed MALDI-TOF and high mass resolution MALDI FTICR IMS platforms as they relate to these challenges. High spatial resolution MALDI-TOF protein images of rat brain tissue and cystic fibrosis lung tissue were acquired at image acquisition rates >25 pixels/s. Structures as small as 50 μm were spatially resolved and proteins associated with host immune response were observed in cystic fibrosis lung tissue. Ultra-high speed MALDI-TOF enables unique applications including megapixel molecular imaging as demonstrated for lipid analysis of cystic fibrosis lung tissue. Additionally, imaging experiments using MALDI FTICR IMS were shown to produce data with high mass accuracy (<5 ppm) and resolving power (∼75 000 at m/z 5000) for proteins up to ∼20 kDa. Analysis of clear cell renal cell carcinoma using MALDI FTICR IMS identified specific proteins localized to healthy tissue regions, within the tumor, and also in areas of increased vascularization around the tumor. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Diverting the tourists: a spatial decision-support system for tourism planning on a developing island

    NASA Astrophysics Data System (ADS)

    Beedasy, Jaishree; Whyatt, Duncan

    Mauritius is a small island (1865 km²) in the Indian Ocean. Tourism is the third largest economic sector of the country, after manufacturing and agriculture. The limited space and the island's vulnerable ecosystem warrant a rational approach to tourism development. The main problems so far have been to manipulate and integrate all the factors affecting tourism planning and to match spatial data with their relevant attributes. A Spatial Decision Support System (SDSS) for sustainable tourism planning is therefore proposed, with a GIS as its core component. A first GIS model has already been constructed with the available data. Supporting decision-making in a spatial context is implicit in the use of GIS; however, the analytical capability of the GIS has to be enhanced to solve semi-structured problems, where subjective judgements come into play. The second part of the paper deals with the choice, implementation and customisation of a relevant model to develop a specialised SDSS. Different types of models and techniques are discussed, in particular a comparison of compensatory and non-compensatory approaches to multicriteria evaluation (MCE). It is concluded that compensatory multicriteria evaluation techniques increase the scope of the present GIS model as a decision-support tool; this approach gives the user or decision-maker the flexibility to change the importance of each criterion depending on the relevant objectives.
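    In its simplest weighted-linear-combination form, compensatory MCE reduces to a weighted sum of standardized criterion scores; the toy sketch below (sites, criteria, and weights invented for illustration) shows why strength on one criterion can offset weakness on another:

```python
# Compensatory MCE as a weighted linear combination: suitability is the
# weighted sum of standardized criterion scores (higher = better).
# Sites, criteria, and weights are illustrative only.
import numpy as np

criteria = np.array([
    [0.8, 0.3, 0.6],   # site 1: scenic value, infrastructure, accessibility
    [0.4, 0.9, 0.2],   # site 2
    [0.5, 0.5, 0.9],   # site 3
])
weights = np.array([0.5, 0.3, 0.2])   # decision-maker priorities, sum to 1

suitability = criteria @ weights      # a high score on one criterion can
print(suitability.round(2))           # compensate for a low score on another
```

    Changing the weight vector re-ranks the sites, which is exactly the flexibility the decision-maker needs when objectives shift.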

  3. Algorithms and software for U-Pb geochronology by LA-ICPMS

    NASA Astrophysics Data System (ADS)

    McLean, Noah M.; Bowring, James F.; Gehrels, George

    2016-07-01

    The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with 10s of micrometer-scale spatial resolution. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.

  4. Magnetic Measurements as a Useful Tool for the Evaluation of Spatial Variability of the Arable Horizon Thickness

    NASA Astrophysics Data System (ADS)

    Fattakhova, Leysan; Shinkarev, Alexandr; Ryzhikh, Lyudmila; Kosareva, Lina

    2017-04-01

    In normal practice, the thickness of the arable horizon is determined on the basis of field morphological descriptions, which allow for subjectivity of perception and judgment and give a crucial role to the experience of the researcher. Of special interest, therefore, are independent analytical approaches, technically simple in design, for diagnosing the lower boundary of the blended, plowed part of the profile. The theoretical premise for using spectrophotometry and magnetometry to diagnose the depth of the arable horizon is based on the concept of regular vertical differentiation of color and magnetic properties in the profile of virgin soils. This work is devoted to a comparative assessment of the possibility of objectively and reliably diagnosing the lower boundary of the arable horizon in gray forest soils by determining the color characteristics and the magnetic susceptibility of layer-wise samples. Using an arable gray forest soil (Cutanic Luvisols (Anthric)) as an example, it was shown that the profile distribution curves of magnetic susceptibility can provide a more reliable and objective assessment of the spatial variability of the arable horizon thickness than the profile curves of the color characteristics in CIELAB coordinates. Therefore, magnetic measurements can be a useful tool for estimating tillage erosion when monitoring soil characteristics in connection with the development of precision agriculture technologies and the organization of agricultural field plot experiments.

  5. The effectiveness of physical models in teaching anatomy: a meta-analysis of comparative studies.

    PubMed

    Yammine, Kaissar; Violato, Claudio

    2016-10-01

    There are various educational methods used in anatomy teaching. While three-dimensional (3D) visualization technologies are gaining ground due to their ever-increasing realism, reports investigating physical models as a low-cost traditional 3D method are still the subject of considerable interest. The aim of this meta-analysis is to quantitatively assess the effectiveness of such models based on comparative studies. Eight studies (7 randomized trials; 1 quasi-experimental) including 16 comparison arms and 820 learners met the inclusion criteria. Primary outcomes were defined as factual, spatial and overall percentage scores. The meta-analysis found that educational methods using physical models yielded significantly better results than all other educational methods for the overall knowledge outcome (p < 0.001) and for spatial knowledge acquisition (p < 0.001). Significantly better results were also found with regard to the long-retention knowledge outcome (p < 0.01). No significance was found for the factual knowledge acquisition outcome. The evidence in the present systematic review was found to have high internal validity and at least acceptable strength. In conclusion, physical anatomical models offer a promising tool for teaching gross anatomy in 3D representation due to their easy accessibility and educational effectiveness. Such models could be a practical tool for raising learners' level of gross anatomy knowledge at low cost.

  6. Trace element study in scallop shells by laser ablation ICP-MS: the example of Ba/Ca ratios

    NASA Astrophysics Data System (ADS)

    Lorrain, A.; Pécheyran, C.; Paulet, Y.-M.; Chauvaud, L.; Amouroux, D.; Krupp, E.; Donard, O.

    2003-04-01

    As scallop shells grow incrementally, at a rate of one line per day, environmental changes can be traced on a daily basis. As an example for trace element incorporation studies, barium is a geochemical tracer that can be directly related to oceanic primary productivity. Hence, monitoring Ba/Ca variations in a scallop shell should give information about the phytoplanktonic events encountered day by day during its life. The very high spatial resolution (typically 40-200 µm) and the high elemental sensitivity required can only be achieved by laser ablation coupled to inductively coupled plasma mass spectrometry. This study demonstrates that laser ablation coupled to ICP-MS is a relevant tool for high-resolution measurement of trace element distributions in a calcite matrix. Single-line rastering and calcium normalisation were found to provide the best analytical conditions in terms of reproducibility and sensitivity. The known daily periodicity of P. maximus growth rings, combined with LA-ICP-MS microanalysis, allows the acquisition of time-dated profiles with high spatial, and thus temporal, resolution. This resolution makes P. maximus a potential tool for environmental reconstruction and especially for accurate calibration of proxies. However, the relations between Ba/Ca peaks and phytoplanktonic events differed between animals, and some inter-annual discrepancies complicate the interpretation.

  7. Mathematical Design Optimization of Wide-Field X-ray Telescopes: Mirror Nodal Positions and Detector Tilts

    NASA Technical Reports Server (NTRS)

    Elsner, R. F.; O'Dell, S. L.; Ramsey, B. D.; Weisskopf, M. C.

    2011-01-01

    We describe a mathematical formalism for determining the mirror shell nodal positions and detector tilts that optimize the spatial resolution averaged over a field-of-view for a nested x-ray telescope, assuming known mirror segment surface prescriptions and known detector focal surface. The results are expressed in terms of ensemble averages over variable combinations of the ray positions and wave vectors in the flat focal plane intersecting the optical axis at the nominal on-axis focus, which can be determined by Monte-Carlo ray traces of the individual mirror shells. This work is part of our continuing efforts to provide analytical tools to aid in the design process for wide-field survey x-ray astronomy missions.
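    To illustrate the kind of ensemble-average expression involved (a generic example of focal-surface optimization, not the authors' full formalism): for rays reaching the nominal focal plane at transverse positions \(\mathbf{r}\) with direction slopes \(\mathbf{t} = (k_x/k_z,\, k_y/k_z)\), an axial detector displacement \(\delta z\) moves each ray to \(\mathbf{r} + \delta z\,\mathbf{t}\), and the rms blur about the image centroid is minimized at

\[
  \delta z^{*} \;=\; -\,\frac{\langle \mathbf{r}\cdot\mathbf{t} \rangle - \langle \mathbf{r} \rangle \cdot \langle \mathbf{t} \rangle}{\langle |\mathbf{t}|^{2} \rangle - |\langle \mathbf{t} \rangle|^{2}} ,
\]

    with the angle brackets denoting ensemble averages over the traced rays. Expressions of this type, evaluated from Monte-Carlo ray traces of the individual mirror shells and averaged over the field of view, drive the choice of nodal positions and detector tilts.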

  8. Mathematical Design Optimization of Wide-Field X-ray Telescopes: Mirror Nodal Positions and Detector Tilts

    NASA Technical Reports Server (NTRS)

    Elsner, Ronald; O'Dell, Stephen; Ramsey, Brian; Weisskopf, Martin

    2011-01-01

    We describe a mathematical formalism for determining the mirror shell nodal positions and detector tilts that optimize the spatial resolution averaged over a field-of-view for a nested x-ray telescope, assuming known mirror segment surface prescriptions and known detector focal surface. The results are expressed in terms of ensemble averages over variable combinations of the ray positions and wavevectors in the flat focal plane intersecting the optical axis at the nominal on-axis focus, which can be determined by Monte-Carlo ray traces of the individual mirror shells. This work is part of our continuing efforts to provide analytical tools to aid in the design process for wide-field survey x-ray astronomy missions.
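
    A minimal numerical illustration of this kind of optimization (not the authors' formalism): given Monte Carlo rays crossing the nominal flat focal plane with known positions and direction slopes, the detector offset that minimizes the ensemble-averaged RMS blur follows in closed form from the same kind of ensemble averages. All ray data below are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 20000
        x, y = rng.normal(0, 0.5, n), rng.normal(0, 0.5, n)        # mm, focal-plane positions
        # sx, sy are transverse direction slopes kx/kz, ky/kz of each ray;
        # the -4e-3*x term mimics a field-dependent defocus.
        sx = rng.normal(0, 2e-3, n) - 4e-3 * x
        sy = rng.normal(0, 2e-3, n) - 4e-3 * y

        def rms_blur(dz):
            """RMS spot radius when the detector is moved dz along the optical axis."""
            xd, yd = x + dz * sx, y + dz * sy
            return np.sqrt(np.mean((xd - xd.mean())**2 + (yd - yd.mean())**2))

        # The blur is quadratic in dz, so the optimum is a ratio of ensemble averages:
        num = np.cov(x, sx)[0, 1] + np.cov(y, sy)[0, 1]
        den = sx.var() + sy.var()
        dz_opt = -num / den
        print(f"optimal detector offset: {dz_opt:.2f} mm, blur {rms_blur(dz_opt):.4f} mm")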

  9. Opinion Formation Models on a Gradient

    PubMed Central

    Gastner, Michael T.; Markou, Nikolitsa; Pruessner, Gunnar; Draief, Moez

    2014-01-01

    Statistical physicists have become interested in models of collective social behavior such as opinion formation, where individuals change their inherently preferred opinion if their friends disagree. Real preferences often depend on regional cultural differences, which we model here as a spatial gradient g in the initial opinion. The gradient not only adds realism to the model: it can also reveal that opinion clusters in two dimensions are typically in the standard (i.e., independent) percolation universality class, thus settling a recent controversy about a non-consensus model. However, using analytical and numerical tools, we also present a model where the width of the transition between opinions follows a different scaling law than in independent percolation, and the cluster size distribution is consistent with first-order percolation. PMID:25474528

  10. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  11. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, where known patterns can be mined for; with a human in the loop, analysts can also bring domain knowledge and subject matter expertise to bear. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  12. Semianalytical solutions for transport in aquifer and fractured clay matrix system

    NASA Astrophysics Data System (ADS)

    Huang, Junqi; Goltz, Mark N.

    2015-09-01

    A three-dimensional mathematical model that describes contaminant transport in a horizontal aquifer with simultaneous diffusion into a fractured clay formation is proposed. A group of semianalytical solutions is derived for specific initial and boundary conditions as well as various source functions. The model solutions are evaluated by numerical inverse Laplace transformation and analytical inverse Fourier transformation. They can be used to study fate and transport in a three-dimensional spatial domain in which a nonaqueous phase liquid exists as a pool atop a fractured low-permeability clay layer. The nonaqueous phase liquid gradually dissolves into the groundwater flowing past the pool, while simultaneously diffusing into the fractured clay formation below the aquifer. Mass transfer of the contaminant into the clay formation is shown to be significantly enhanced by the presence of the fractures, even though the volume of the fractures is small compared to the volume of the clay matrix. The model solution is a useful tool for assessing contaminant attenuation processes in a confined aquifer underlain by a fractured clay formation.
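
    Laplace-domain solutions of this kind are typically brought back to the time domain numerically. The sketch below shows one common choice, the Gaver-Stehfest algorithm; this is a generic illustration, not the paper's specific inversion scheme, and the transform used (1/(s+k), first-order decay) is a stand-in with a known inverse so the result can be checked.

        import math

        def stehfest_weights(N):
            """Gaver-Stehfest coefficients V_1..V_N (N must be even)."""
            V = []
            for i in range(1, N + 1):
                s = 0.0
                for k in range((i + 1) // 2, min(i, N // 2) + 1):
                    s += (k ** (N // 2) * math.factorial(2 * k)) / (
                        math.factorial(N // 2 - k) * math.factorial(k)
                        * math.factorial(k - 1) * math.factorial(i - k)
                        * math.factorial(2 * k - i))
                V.append((-1) ** (i + N // 2) * s)
            return V

        def invert(F, t, N=12):
            """Approximate f(t) from its Laplace transform F(s)."""
            ln2_t = math.log(2.0) / t
            V = stehfest_weights(N)
            return ln2_t * sum(V[i - 1] * F(i * ln2_t) for i in range(1, N + 1))

        k = 0.3
        F = lambda s: 1.0 / (s + k)                 # Laplace transform of exp(-k t)
        print(invert(F, 2.0), math.exp(-k * 2.0))   # the two values should agree closely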

  13. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

    Background: Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study illustrates this approach with three health risks mapped at the street scale for a coastal community in Haiti. Methods: Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The utility of the method is shown by the concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices. In addition, schools offer potential locations for cholera education interventions. Results: Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”. Conclusions: Spatial video is a tool that can be used in any environment to improve local-area health analysis and intervention. The process is rapid and can be repeated in study sites through time to track the spatio-temporal dynamics of the communities. Its simplicity should also encourage local participatory collaborations. PMID:23587358

  14. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti.

    PubMed

    Curtis, Andrew; Blackburn, Jason K; Widmer, Jocelyn M; Morris, J Glenn

    2013-04-15

    Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study illustrates this approach with three health risks mapped at the street scale for a coastal community in Haiti. Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The utility of the method is shown by the concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices. In addition, schools offer potential locations for cholera education interventions. Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these "hotspots". Spatial video is a tool that can be used in any environment to improve local-area health analysis and intervention. The process is rapid and can be repeated in study sites through time to track the spatio-temporal dynamics of the communities. Its simplicity should also encourage local participatory collaborations.
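
    For readers unfamiliar with the kernel density step mentioned above, the sketch below shows how digitized point risks become a smooth density surface that can be screened at locations of interest. The coordinates and the school location are synthetic placeholders, not data from the study.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        # Hypothetical point risks digitized from spatial video (lon, lat in degrees).
        risks = rng.normal([-72.534, 18.234], [0.002, 0.002], size=(300, 2))

        kde = gaussian_kde(risks.T)                    # smooth the point pattern
        school = np.array([[-72.534], [18.234]])       # a school location to screen
        print("relative risk density at school:", kde(school)[0])

        # Evaluating kde on a lon/lat grid yields the raster used for hotspot mapping.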

  15. Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.

    DOT National Transportation Integrated Search

    2015-01-01

    The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time reliability...

  16. Performance of Orbital Neutron Instruments for Spatially Resolved Hydrogen Measurements of Airless Planetary Bodies

    PubMed Central

    Elphic, Richard C.; Feldman, William C.; Funsten, Herbert O.; Prettyman, Thomas H.

    2010-01-01

    Orbital neutron spectroscopy has become a standard technique for measuring planetary surface compositions from orbit. While this technique has led to important discoveries, such as the deposits of hydrogen at the Moon and Mars, a limitation is its poor spatial resolution. For omni-directional neutron sensors, spatial resolutions are 1–1.5 times the spacecraft's altitude above the planetary surface (or 40–600 km for typical orbital altitudes). Neutron sensors with enhanced spatial resolution have been proposed, and one with a collimated field of view is scheduled to fly on a mission to measure lunar polar hydrogen. No quantitative studies or analyses have been published that evaluate in detail the detection and sensitivity limits of spatially resolved neutron measurements. Here, we describe two complementary techniques for evaluating the hydrogen sensitivity of spatially resolved neutron sensors: an analytic, closed-form expression that has been validated with Lunar Prospector neutron data, and a three-dimensional modeling technique. The analytic technique, called the Spatially resolved Neutron Analytic Sensitivity Approximation (SNASA), provides a straightforward method to evaluate spatially resolved neutron data from existing instruments as well as to plan for future mission scenarios. We conclude that the existing detector—the Lunar Exploration Neutron Detector (LEND)—scheduled to launch on the Lunar Reconnaissance Orbiter will have hydrogen sensitivities that are over an order of magnitude poorer than previously estimated. We further conclude that a sensor with a geometric factor of ∼100 cm² sr (compared to the LEND geometric factor of ∼10.9 cm² sr) could make substantially improved measurements of the lunar polar hydrogen spatial distribution. Key Words: Planetary instrumentation—Planetary science—Moon—Spacecraft experiments—Hydrogen. Astrobiology 10, 183–200. PMID:20298147
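
    An illustrative counting-statistics sketch (not the SNASA formalism) of why the geometric factor matters: the significance of a hydrogen-driven count suppression grows as the square root of the geometric factor times the integration time, so a roughly tenfold larger geometric factor buys roughly a threefold gain in sigma. The rates and suppression fraction below are invented.

        import math

        def significance(rate_bg, suppression, G_cm2sr, t_s):
            """Sigma of detecting a fractional count-rate dip over Poisson background."""
            counts_bg = rate_bg * G_cm2sr * t_s        # background counts collected
            return suppression * counts_bg / math.sqrt(counts_bg)

        for G in (10.9, 100.0):   # LEND-like vs. proposed geometric factor (cm^2 sr)
            sigma = significance(rate_bg=0.1, suppression=0.05, G_cm2sr=G, t_s=3600)
            print(f"G = {G:6.1f} cm^2 sr  ->  {sigma:.1f} sigma")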

  17. Delivery of Forecasted Atmospheric Ozone and Dust for the New Mexico Environmental Public Health Tracking System - An Open Source Geospatial Solution

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Sanchez-Silva, R.; Cavner, J. A.

    2010-12-01

    New Mexico's Environmental Public Health Tracking System (EPHTS), funded by the Centers for Disease Control and Prevention (CDC) Environmental Public Health Tracking Network (EPHTN), aims to improve health awareness and services by linking health effects data with levels and frequency of environmental exposure. As a public health decision-support system, EPHTS includes state-of-the-art statistical analysis tools; geospatial visualization tools; data discovery, extraction, and delivery tools; and environmental/public health linkage information. As part of its mandate, EPHTS issues public health advisories and forecasts of environmental conditions that have consequences for human health. Through a NASA-funded partnership between the University of New Mexico and the University of Arizona, NASA Earth Science results are fused into two existing models (the Dust Regional Atmospheric Model (DREAM) and the Community Multiscale Air Quality (CMAQ) model) in order to improve forecasts of atmospheric dust, ozone, and aerosols. The results and products derived from the outputs of these models are made available to an Open Source mapping component of the New Mexico EPHTS. In particular, these products are integrated into a Django content management system using GeoDjango, GeoAlchemy, and other OGC-compliant geospatial libraries written in the Python and C++ programming languages. Capabilities of the resultant mapping system include indicator-based thematic mapping, data delivery, and analytical capabilities. DREAM and CMAQ outputs can be inspected, via REST calls, through temporal and spatial subsetting of the atmospheric concentration data across analytical units employed by the public health community. This paper describes details of the architecture and the integration of NASA Earth Science into the EPHTS decision-support system.

  18. 2D-Visualization of metabolic activity with planar optical chemical sensors (optodes)

    NASA Astrophysics Data System (ADS)

    Meier, R. J.; Liebsch, G.

    2015-12-01

    Microbial life plays an outstandingly important role in many hydrologic compartments, such as the benthic community in sediments or biologically active microorganisms in the capillary fringe, in groundwater, or in soil. Oxygen, pH, and CO2 are key factors and indicators of microbial activity. They can be measured using optical chemical sensors, which record the changing fluorescence properties of specific indicator dyes. The signals can be measured in a non-contact mode, even through transparent walls, which is important for many lab experiments. These sensors can measure in closed (transparent) systems without sampling or intruding into the sample. They do not consume the analytes while measuring, are fully reversible, and are able to measure in non-stirred solutions. They can be applied as high-precision fiber-optic sensors (for profiling), robust sensor spots, or planar sensors for 2D visualization (imaging). Imaging makes it possible to record thousands of measurement spots at the same time and to generate 2D analyte maps over a region of interest. It allows different regions within one recorded image to be compared, spatial analyte gradients to be visualized, and, more importantly, hot spots of metabolic activity to be identified. We present ready-to-use portable imaging systems for the analytes oxygen, pH, and CO2. They consist of a detector unit, planar sensor foils, and software for easy data recording and evaluation. Sensor foils for various analytes and measurement ranges enable visualizing metabolic activity or analyte changes in the desired range. Dynamics of metabolic activity can be detected in one shot or over long time periods. We demonstrate the potential of this analytical technique with experiments on benthic disturbance-recovery dynamics in sediments and on microbial degradation of organic material in the capillary fringe. We consider this technique a new tool for further understanding how microbial and geochemical processes are linked in hydrologic (and other) systems.

  19. Quantitative imaging with fluorescent biosensors.

    PubMed

    Okumoto, Sakiko; Jones, Alexander; Frommer, Wolf B

    2012-01-01

    Molecular activities are highly dynamic and can occur locally in subcellular domains or compartments. Neighboring cells in the same tissue can exist in different states. Therefore, quantitative information on the cellular and subcellular dynamics of ions, signaling molecules, and metabolites is critical for a functional understanding of organisms. Mass spectrometry is generally used for monitoring ions and metabolites; however, its temporal and spatial resolution are limited. Fluorescent proteins have revolutionized many areas of biology (e.g., they can report on gene expression or protein localization in real time), yet promoter-based reporters are often slow to report physiologically relevant changes such as calcium oscillations. Therefore, novel tools are required that can be deployed in specific cells and targeted to subcellular compartments in order to quantify target molecule dynamics directly. We require tools that can measure enzyme activities, protein dynamics, and biophysical processes (e.g., membrane potential or molecular tension) with subcellular resolution. Today, we have an extensive suite of tools at our disposal to address these challenges, including translocation sensors, fluorescence-intensity sensors, and Förster resonance energy transfer sensors. This review summarizes sensor design principles, provides a database of sensors for more than 70 different analytes/processes, and gives examples of applications in quantitative live cell imaging.

  20. An analytical SMASH procedure (ASP) for sensitivity-encoded MRI.

    PubMed

    Lee, R F; Westgate, C R; Weiss, R G; Bottomley, P A

    2000-05-01

    The simultaneous acquisition of spatial harmonics (SMASH) method of imaging with detector arrays can reduce the number of phase-encoding steps, and hence MRI scan time, several-fold. The original approach utilized numerical gradient-descent fitting with the coil sensitivity profiles to create a set of composite spatial harmonics to replace the phase-encoding steps. Here, an analytical approach for generating the harmonics is presented. A transform is derived to project the harmonics onto a set of sensitivity profiles. A sequence of Fourier, Hilbert, and inverse Fourier transforms is then applied to analytically eliminate spatially dependent phase errors from the different coils while fully preserving the spatial encoding. By combining the transform and phase correction, the original numerical image reconstruction method can be replaced by an analytical SMASH procedure (ASP). The approach also allows simulation of SMASH imaging, revealing a criterion for the ratio of the detector sensitivity profile width to the detector spacing that produces optimal harmonic generation. When detector geometry is suboptimal, a group of quasi-harmonics arises, which can be corrected and restored to pure harmonics. The simulation also reveals high-order harmonic modulation effects, and a demodulation procedure is presented that enables application of ASP to a large number of detectors. The method is demonstrated on a phantom and in humans using a standard 4-channel phased-array MRI system. Copyright 2000 Wiley-Liss, Inc.
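
    A minimal sketch of the harmonic-generation step that SMASH relies on: least-squares weights that combine coil sensitivity profiles into a spatial harmonic exp(i*2*pi*m*y/FOV). ASP replaces this fit with an analytical transform; the sketch below shows only the underlying idea on synthetic Gaussian coil profiles, and the 0.6 width-to-spacing factor is an arbitrary choice one can vary to see the criterion the abstract mentions.

        import numpy as np

        ny, ncoils, fov = 256, 4, 1.0
        y = np.linspace(0, fov, ny, endpoint=False)
        centers = (np.arange(ncoils) + 0.5) * fov / ncoils
        # Gaussian coil sensitivity profiles, width = 0.6 x coil spacing (assumed).
        C = np.exp(-((y[:, None] - centers[None, :]) / (0.6 * fov / ncoils)) ** 2)

        def harmonic_weights(m):
            """Complex coil weights approximating the m-th spatial harmonic."""
            target = np.exp(1j * 2 * np.pi * m * y / fov)
            w, *_ = np.linalg.lstsq(C.astype(complex), target, rcond=None)
            return w, C @ w  # weights and the achieved composite profile

        w, composite = harmonic_weights(m=1)
        err = np.linalg.norm(composite - np.exp(1j * 2 * np.pi * y / fov)) / np.sqrt(ny)
        print("rms deviation from pure first harmonic:", err)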

  1. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  2. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  3. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster summarizes lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities each apply their own selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusion have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper summarizes the results and indicates a direction for future infusion attempts.

  4. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    NASA Astrophysics Data System (ADS)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends, and spotting anomalies. Although a number of open-source spatial analysis libraries such as geopandas and shapely are available today, most of them are restricted to the manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform, Bluemix. Working in-database reduces the network overload, as the complete data need not be replicated into the user's local system and only a subset of the entire dataset need be fetched into memory in a single instance. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4. The basic architecture of the package consists of three main components: 1) a connection to dashDB represented by the instance IdaDataBase, which uses a middleware API (pypyodbc or jaydebeapi) to establish the database connection via ODBC or JDBC, respectively; 2) an instance representing the spatial data stored in the database as a dataframe in Python, called the IdaGeoDataFrame, with a specific geometry attribute that recognises a planar geometry column in dashDB; and 3) Python wrappers for spatial functions such as within, distance, area, buffer, and more, which dashDB currently supports, to make the querying process from Python much simpler for users. The spatial functions translate well-known geopandas-like syntax into SQL queries, utilising the database connection to perform spatial operations in-database, and can operate on single geometries as well as on two geometries from different IdaGeoDataFrames. The in-database queries strictly follow the standards of the OpenGIS Implementation Specification for Geographic information - Simple feature access for SQL. The results of the operations can be accessed dynamically via interactive Jupyter notebooks from any system that supports Python, without additional dependencies, and can also be combined with other open-source libraries such as matplotlib and folium within Jupyter notebooks for visualization purposes. We built a use case analysing crime hotspots in New York City to validate our implementation and visualized the results as a choropleth map for each borough.
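
    A hedged usage sketch of the workflow described above follows. The class and function names (IdaDataBase, IdaGeoDataFrame, buffer, within) come from the record itself, but the connection details, table and column names, and exact method signatures are assumptions, not a verified walkthrough of the ibmdbpy API.

        from ibmdbpy import IdaDataBase, IdaGeoDataFrame  # import path assumed

        # Connect to dashDB; the ODBC data source name is a placeholder.
        idadb = IdaDataBase(dsn="DASHDB")

        # Wrap tables holding a planar geometry column (names are hypothetical;
        # the geometry keyword mirrors the "geometry attribute" described above).
        crimes = IdaGeoDataFrame(idadb, "NYC_CRIMES", geometry="SHAPE")
        boroughs = IdaGeoDataFrame(idadb, "NYC_BOROUGHS", geometry="SHAPE")

        # geopandas-like calls are translated to SQL and run in-database by the
        # dashDB spatial extender; only the results come back to the client.
        crimes["BUFFERED"] = crimes.buffer(distance=100.0)
        inside = crimes.within(boroughs)
        print(inside.head())

        idadb.close()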

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, the Ransom solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when the results were compared to the existing Ransom analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the effect of the gas phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom transient solutions for the gas phase velocity and pressure are derived under the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom solutions are also presented.
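
    For reference, the classic Ransom transient solution for the liquid phase (the solution this work revisits and extends) can be evaluated directly. The sketch below uses the conventional faucet configuration; the parameter values are the customary ones for this benchmark, not necessarily those of this paper.

        import numpy as np

        g, v0, alpha_l0, L = 9.81, 10.0, 0.8, 12.0   # standard Ransom faucet setup

        def liquid_velocity_and_voidfraction(x, t):
            """Liquid velocity and gas void fraction at distance x below the inlet."""
            x_front = v0 * t + 0.5 * g * t * t        # position of the acceleration front
            # Below the front, mass conservation gives alpha_l * v = alpha_l0 * v0.
            v = np.where(x <= x_front, np.sqrt(v0**2 + 2.0 * g * x), v0 + g * t)
            alpha_l = np.where(x <= x_front, alpha_l0 * v0 / v, alpha_l0)
            return v, 1.0 - alpha_l

        x = np.linspace(0.0, L, 7)
        v, alpha_g = liquid_velocity_and_voidfraction(x, t=0.4)
        for xi, vi, ai in zip(x, v, alpha_g):
            print(f"x = {xi:5.2f} m   v = {vi:6.3f} m/s   alpha_g = {ai:.4f}")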

  6. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  7. LC-MS/MS imaging with thermal film-based laser microdissection.

    PubMed

    Oya, Michiko; Suzuki, Hiromi; Anas, Andrea Roxanne J; Oishi, Koichi; Ono, Kenji; Yamaguchi, Shun; Eguchi, Megumi; Sawada, Makoto

    2018-01-01

    Mass spectrometry (MS) imaging is a useful tool for direct and simultaneous visualization of specific molecules. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) is used to evaluate the abundance of molecules in tissues using sample homogenates. To date, however, LC-MS/MS has not been utilized as an imaging tool because spatial information is lost during sample preparation. Here we report a new approach for LC-MS/MS imaging using a thermal film-based laser microdissection (LMD) technique. To isolate tissue spots, our LMD system uses an 808-nm near-infrared laser, the diameter of which can be freely changed from 2.7 to 500 μm; for imaging purposes in this study, the diameter was fixed at 40 μm, allowing acquisition of LC-MS/MS images at 40-μm resolution. The isolated spots are arranged on a thermal film at 4.5-mm intervals, corresponding to the well spacing on a 384-well plate. Each tissue spot is handled on the film in such a manner as to maintain its spatial information, allowing it to be extracted separately in its individual well. Using analytical LC-MS/MS in combination with the spatial information of each sample, we can reconstruct LC-MS/MS images. With this imaging technique, we successfully obtained the distributions of pilocarpine, glutamate, γ-aminobutyric acid, acetylcholine, and choline in a cross-section of mouse hippocampus. The protocol established in this study is applicable to revealing the neurochemistry of the pilocarpine model of epilepsy. Our system has a wide range of uses in fields such as biology, pharmacology, pathology, and neuroscience.

  8. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  9. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, is composed of organic compounds together with some trace elements. These compounds give insight into the origin, thermal maturity, and paleoenvironmental history of petroleum, which are essential elements of petroleum exploration. The main means of acquiring such geochemical data is analytical techniques. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been addressed. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. The various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used directly, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool is capable of analyzing the most critical path and quantifying the probability of effectiveness of the system as a performance measure.

  11. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  12. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used directly, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool is capable of analyzing the most critical path and quantifying the probability of effectiveness of the system as a performance measure.
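
    A minimal sketch of the network idea (an illustration, not the authors' tool): if each path segment carries an independent detection probability, the most critical adversary path, i.e. the one most likely to evade detection, is a shortest path under edge weights -log(1 - p). The facility layout and probabilities below are invented.

        import math
        import networkx as nx

        G = nx.DiGraph()
        edges = [   # (from, to, probability of detection on that segment)
            ("outside", "fence", 0.30), ("fence", "yard",  0.20),
            ("yard",    "door",  0.50), ("door",  "vault", 0.60),
            ("outside", "gate",  0.70), ("gate",  "door",  0.10),
        ]
        for u, v, p in edges:
            G.add_edge(u, v, w=-math.log(1.0 - p))   # additive surrogate weight

        path = nx.shortest_path(G, "outside", "vault", weight="w")
        p_evade = math.exp(-nx.path_weight(G, path, weight="w"))
        print("most critical path:", path)
        print("probability this path evades detection:", round(p_evade, 3))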

  13. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  14. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  15. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery

  16. Using Learning Analytics to Support Engagement in Collaborative Writing

    ERIC Educational Resources Information Center

    Liu, Ming; Pardo, Abelardo; Liu, Li

    2017-01-01

    Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…

  17. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  18. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  19. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  20. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  1. Design of an omnidirectional single-point photodetector for large-scale spatial coordinate measurement

    NASA Astrophysics Data System (ADS)

    Xie, Hongbo; Mao, Chensheng; Ren, Yongjie; Zhu, Jigui; Wang, Chao; Yang, Lei

    2017-10-01

    In high-precision, large-scale coordinate measurement, one commonly used approach to determining the coordinates of a target point is to exploit the spatial trigonometric relationships between multiple laser transmitter stations and the target point. A light-receiving device at the target point is the key element in large-scale coordinate measurement systems. To ensure high-resolution and highly sensitive spatial coordinate measurement, a high-performance and miniaturized omnidirectional single-point photodetector (OSPD) is greatly desired. We report a design of OSPD using an aspheric lens, which achieves an enhanced reception angle of -5 deg to 45 deg in the vertical and 360 deg in the horizontal. As the heart of our OSPD, the aspheric lens is designed with a geometric model and optimized by the LightTools software, which enables the reflection of a wide-angle incident light beam onto the single-point photodiode. The performance of the home-made OSPD is characterized at working distances from 1 to 13 m and further analyzed using the developed geometric model. The experimental and analytic results verify that our device is highly suitable for large-scale coordinate metrology. The developed device also holds great potential in various applications such as omnidirectional vision sensors, indoor global positioning systems, and optical wireless communication systems.

  2. Investigation of plasma-sheath resonances in low pressure discharges

    NASA Astrophysics Data System (ADS)

    Naggary, Schabnam; Kemaneci, Efe; Brinkmann, Ralf Peter; Megahed, Mustafa

    2016-09-01

    Plasma-sheath resonances (PSR) arise from a periodic exchange between the kinetic electron energy in the plasma bulk and the electric field energy in the sheath, and can easily be excited by the sheath-generated harmonics of the applied RF. In this contribution, we employ a series of models to obtain a well-defined description of these phenomena. In the first part, we use a global model to study the influence of the nonlinear charge-voltage characteristics on the electron dynamics. However, the global model is restricted to the assumption of a spatially constant potential at each driven and grounded electrode and thus delivers only the fundamental mode of the current. To remedy this deficiency, we introduce a spatially resolved model for arbitrary reactor geometries with no assumptions on the homogeneity of the plasma. An exact evaluation of the analytical solution is realized under the assumption of a cylindrical plasma reactor geometry with uniform conductance. Furthermore, the spatially resolved model can be applied to a more realistic CCP reactor geometry and a non-homogeneous plasma, provided the conductance distribution is known. For this purpose, we use the CFD-ACE+ tool. The results show that the proposed multi-mode model provides a significant improvement. The authors gratefully acknowledge the financial support of the ESI Group and the SFB-TR 87.

  3. Hyperspectral imaging using near infrared spectroscopy to monitor coat thickness uniformity in the manufacture of a transdermal drug delivery system.

    PubMed

    Pavurala, Naresh; Xu, Xiaoming; Krishnaiah, Yellela S R

    2017-05-15

    Hyperspectral imaging using near infrared spectroscopy (NIRS) integrates spectroscopy and conventional imaging to obtain both spectral and spatial information about materials. The non-invasive and rapid nature of hyperspectral imaging using NIRS makes it a valuable process analytical technology (PAT) tool for in-process monitoring and control of the manufacturing process for transdermal drug delivery systems (TDS). The focus of this investigation was to develop and validate the use of near infrared (NIR) hyperspectral imaging to monitor coat thickness uniformity, a critical quality attribute (CQA) for TDS. Chemometric analysis was used to process the hyperspectral images, and a partial least squares (PLS) model was developed to predict the coat thickness of the TDS. The goodness of model fit and prediction were 0.9933 and 0.9933, respectively, indicating an excellent fit to the training data and good predictability. The percent prediction error (%PE) for internal and external validation samples was less than 5%, confirming the accuracy of the PLS model developed in the present study. The feasibility of hyperspectral imaging as a real-time process analytical tool for continuous processing was also investigated. When the PLS model was applied to detect deliberate variations in coating thickness, it was able to predict both small and large variations as well as identify coating defects such as non-uniform regions and the presence of air bubbles. Published by Elsevier B.V.
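
    A hedged sketch of the chemometric step described above: a PLS model mapping NIR spectra to coat thickness, with the per-sample percent prediction error. The spectra here are synthetic, and the study's data, preprocessing, and validation design are not reproduced.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(7)
        wavelengths, n = 200, 60
        thickness = rng.uniform(50, 150, n)            # µm, hypothetical training targets
        basis = rng.normal(0, 1, wavelengths)          # a single spectral signature
        X = (thickness[:, None] * basis[None, :] / 100
             + rng.normal(0, 0.05, (n, wavelengths)))  # spectra scale with thickness

        pls = PLSRegression(n_components=3).fit(X, thickness)
        pred = pls.predict(X[:5]).ravel()
        pe = 100 * np.abs(pred - thickness[:5]) / thickness[:5]   # %PE per sample
        print(np.round(pred, 1), np.round(pe, 2))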

  4. Developing semi-analytical solution for multiple-zone transient storage model with spatially non-uniform storage

    NASA Astrophysics Data System (ADS)

    Deng, Baoqing; Si, Yinbing; Wang, Jia

    2017-12-01

    Transient storage may vary along a stream due to stream hydraulic conditions and the characteristics of the storage zones. Analytical solutions of transient storage models in the literature have not covered spatially non-uniform storage. A novel integral transform strategy is presented that simultaneously transforms the concentrations in the stream and in the storage zones by using a single set of eigenfunctions derived from the advection-diffusion equation of the stream. The semi-analytical solution of the multiple-zone transient storage model with spatially non-uniform storage is obtained by applying the generalized integral transform technique to all partial differential equations in the model. The derived semi-analytical solution is validated against field data from the literature, with good agreement between the computed and field data. Illustrative examples demonstrate applications of the present solution. It is shown that solute transport can be greatly affected by variation of the mass exchange coefficient and the ratio of cross-sectional areas. When the ratio of cross-sectional areas is large or the mass exchange coefficient is small, more reaches are recommended for calibrating the parameters.
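
    For orientation, the single-zone transient storage model that this work generalizes is commonly written as follows; the notation is the conventional one, not necessarily the paper's:

        \begin{align}
        \frac{\partial C}{\partial t} &= -u\,\frac{\partial C}{\partial x}
          + D\,\frac{\partial^2 C}{\partial x^2} + \alpha\,(C_s - C), \\
        \frac{\partial C_s}{\partial t} &= \alpha\,\frac{A}{A_s}\,(C - C_s),
        \end{align}

    where C and C_s are the concentrations in the stream and the storage zone, u is the stream velocity, D the dispersion coefficient, alpha the mass exchange coefficient, and A/A_s the ratio of cross-sectional areas, the two parameters whose influence the abstract highlights.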

  5. Electron energy loss spectroscopy on semiconductor heterostructures for optoelectronics and photonics applications.

    PubMed

    Eljarrat, A; López-Conesa, L; Estradé, S; Peiró, F

    2016-05-01

    In this work, we present characterization methods for the analysis of nanometer-sized devices based on silicon and III-V nitride semiconductor materials. These methods are devised to take advantage of the aberration-corrected scanning transmission electron microscope equipped with a monochromator, a set-up that ensures the high spatial and energy resolution necessary for characterizing the smallest structures. Since these experiments aim to obtain chemical and structural information, we use electron energy loss spectroscopy (EELS). The low-loss region of EELS is exploited, which features fundamental electronic properties of semiconductor materials and facilitates high data throughput. We show how the detailed analysis of these spectra, using theoretical models and computational tools, can enhance the analytical power of EELS. In this sense, results from the model-based fit of the plasmon peak are presented first. Moreover, the application of multivariate analysis algorithms to low-loss EELS is explored. Finally, some physical limitations of the technique, such as spatial delocalization, are mentioned. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  6. Cell biochemistry studied by single-molecule imaging.

    PubMed

    Mashanov, G I; Nenasheva, T A; Peckham, M; Molloy, J E

    2006-11-01

    Over the last decade, there have been remarkable developments in live-cell imaging. We can now readily observe individual protein molecules within living cells and this should contribute to a systems level understanding of biological pathways. Direct observation of single fluorophores enables several types of molecular information to be gathered. Temporal and spatial trajectories enable diffusion constants and binding kinetics to be deduced, while analyses of fluorescence lifetime, intensity, polarization or spectra give chemical and conformational information about molecules in their cellular context. By recording the spatial trajectories of pairs of interacting molecules, formation of larger molecular complexes can be studied. In the future, multicolour and multiparameter imaging of single molecules in live cells will be a powerful analytical tool for systems biology. Here, we discuss measurements of single-molecule mobility and residency at the plasma membrane of live cells. Analysis of diffusional paths at the plasma membrane gives information about its physical properties and measurement of temporal trajectories enables rates of binding and dissociation to be derived. Meanwhile, close scrutiny of individual fluorophore trajectories enables ideas about molecular dimerization and oligomerization related to function to be tested directly.
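
    A minimal sketch of how a diffusion constant is extracted from such trajectories: compute the mean squared displacement (MSD) as a function of lag time and fit MSD = 4 D t, the relation for free 2D diffusion in a membrane. The trajectory below is simulated Brownian motion with a known D so the estimate can be checked.

        import numpy as np

        rng = np.random.default_rng(2)
        D_true, dt, n = 0.1, 0.033, 2000          # µm²/s, s (video rate), steps
        steps = rng.normal(0, np.sqrt(2 * D_true * dt), (n, 2))
        traj = np.cumsum(steps, axis=0)           # (x, y) positions in µm

        lags = np.arange(1, 21)
        msd = np.array([np.mean(np.sum((traj[l:] - traj[:-l])**2, axis=1))
                        for l in lags])
        D_est = np.polyfit(lags * dt, msd, 1)[0] / 4.0   # slope / 4 for 2D diffusion
        print(f"true D = {D_true}, estimated D = {D_est:.3f} µm²/s")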

  7. Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries

    NASA Astrophysics Data System (ADS)

    Reeves, H. W.; Fienen, M. N.; Feinstein, D.

    2015-12-01

    Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding, numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may be fulfilling a role often accomplished by application of analytical solutions. The major challenge to transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that it is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that the spatial scale of the numerical model must be appropriately scaled to adequately represent different settings. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.

  8. Boundary element method for 2D materials and thin films.

    PubMed

    Hrtoň, M; Křápek, V; Šikola, T

    2017-10-02

    2D materials emerge as a viable platform for the control of light at the nanoscale. In this context the need has arisen for a fast and reliable tool capable of capturing their strictly 2D nature in 3D light scattering simulations. So far, 2D materials and their patterned structures (ribbons, discs, etc.) have been mostly treated as very thin films of subnanometer thickness with an effective dielectric function derived from their 2D optical conductivity. In this study an extension to the existing framework of the boundary element method (BEM) with 2D materials treated as a conductive interface between two media is presented. The testing of our enhanced method on problems with known analytical solutions reveals that for certain types of tasks the new modification is faster than the original BEM algorithm. Furthermore, the representation of 2D materials as an interface allows us to simulate problems in which their optical properties depend on spatial coordinates. Such spatial dependence can occur naturally or can be tailored artificially to attain new functional properties.

  9. Developing AN Emergency Response Model for Offshore Oil Spill Disaster Management Using Spatial Decision Support System (sdss)

    NASA Astrophysics Data System (ADS)

    Balogun, Abdul-Lateef; Matori, Abdul-Nasir; Wong Toh Kiak, Kelvin

    2018-04-01

    Environmental resources face severe risks during offshore oil spill disasters, and Geographic Information System (GIS) Environmental Sensitivity Index (ESI) maps are increasingly being used as response tools to minimize the huge impacts of these spills. However, ESI maps are generally unable to independently harmonize the diverse preferences of the multiple stakeholders involved in the response process, causing rancour and delays in response time. This paper's Spatial Decision Support System (SDSS) utilizes the Analytic Hierarchy Process (AHP) model to perform trade-offs in determining the most significant resources to be secured, considering the limited resources and time available for the response operation. The AHP approach is used to aggregate the diverse preferences of the stakeholders and reach a consensus. These preferences, represented as priority weights, are incorporated into a GIS platform to generate environmental sensitivity risk (ESR) maps. The ESR maps provide a common operational platform and consistent situational awareness for the multiple parties involved in the emergency response operation, thereby minimizing discord among the response teams and saving the most valuable resources.
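
    For readers unfamiliar with AHP, a minimal sketch of how priority weights can be derived from a pairwise comparison matrix via its principal eigenvector, with the standard consistency check; the criteria and judgments below are hypothetical, not those of the paper:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical resources
# (Saaty 1-9 scale): shoreline habitat vs fisheries vs tourism sites.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights = principal eigenvector, normalized to sum to 1
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency index / ratio (random index RI = 0.58 for n = 3, per Saaty)
n = A.shape[0]
CI = (vals.real[k] - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3), " CR:", round(CR, 3))  # CR < 0.1 is acceptable
```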

  10. A Compressive Sensing Approach for Glioma Margin Delineation Using Mass Spectrometry

    PubMed Central

    Gholami, Behnood; Agar, Nathalie Y. R.; Jolesz, Ferenc A.; Haddad, Wassim M.; Tannenbaum, Allen R.

    2013-01-01

    Surgery, and specifically tumor resection, is the primary treatment for most patients suffering from brain tumors. Medical imaging techniques, in particular magnetic resonance imaging, are currently used in diagnosis as well as in image-guided surgery procedures. However, studies show that computed tomography and magnetic resonance imaging fail to accurately identify the full extent of malignant brain tumors and their microscopic infiltration. Mass spectrometry is a well-known analytical technique used to identify molecules in a given sample based on their mass. A recent study proposed using mass spectrometry as an intraoperative tool for discriminating tumor and non-tumor tissue. Integration of mass spectrometry with the resection module allows for tumor resection and immediate molecular analysis. In this paper, we propose a framework for tumor margin delineation using compressive sensing. Specifically, we show that the spatial distribution of tumor cell concentration can be efficiently reconstructed and updated using mass spectrometry information from the resected tissue. In addition, our proposed framework is model-free, and hence requires no prior information about the spatial distribution of the tumor cell concentration. PMID:22255629
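
    The paper's exact formulation is not reproduced here, but the underlying compressive-sensing idea can be sketched generically: recover a sparse concentration vector from a small number of linear measurements by iterative soft-thresholding (ISTA). All dimensions and values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse "tumor concentration" signal: 200 locations, 8 nonzero
n, k, m = 200, 8, 60
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1, 3, k)

# m << n random linear measurements (stand-in for sampled MS information)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true

# ISTA: minimize 0.5*||Ax - y||^2 + lam*||x||_1
lam, L = 0.02, np.linalg.norm(A, 2)**2   # L = Lipschitz constant of gradient
x = np.zeros(n)
for _ in range(500):
    g = x - (A.T @ (A @ x - y)) / L      # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold

print("relative recovery error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```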

  11. GIS residency footprinting: analyzing the impact of family medicine graduate medical education in Hawai'i.

    PubMed

    Hixon, Allen L; Buenconsejo-Lum, Lee E; Racsa, C Philip

    2012-04-01

    Access to care for patients in Hawai'i is compromised by a significant primary care workforce shortage. Not only are there too few primary care providers, but they often do not practice in locations of high need, such as rural areas on the neighbor islands or in the Pacific. This study used geographic information systems (GIS) spatial analysis to examine practice locations for 86 University of Hawai'i Family Medicine and Community Health graduates from 1993 to 2010. Alumni records were carefully verified and entered into the data set using the street address of each graduate's major employment. The questions to be answered were (1) what percentage of program graduates remain in the state of Hawai'i, and (2) what percentage of graduates practice in health professional shortage areas (HPSAs) throughout the United States. This study found that 73 percent of graduates remain and practice in Hawai'i, with over 36 percent working in health professional shortage areas. Spatial analysis using GIS residency footprinting may be an important analytic tool to ensure that graduate medical education programs are meeting Hawai'i's health workforce needs.

  12. Confined wormlike chains in external fields

    NASA Astrophysics Data System (ADS)

    Morrison, Greg

    The confinement of biomolecules is ubiquitous in nature, such as the spatial constraints of viral encapsulation, histone binding, and chromosomal packing. Advances in microfluidics and nanopore fabrication have permitted powerful new tools in single molecule manipulation and gene sequencing through molecular confinement as well. In order to fully understand and exploit these systems, the ability to predict the structure of spatially confined molecules is essential. In this talk, I describe a mean field approach to determine the properties of stiff polymers confined to cylinders and slits, which is relevant for a variety of biological and experimental conditions. I show that this approach is able to not only reproduce known scaling laws for confined wormlike chains, but also provides an improvement over existing weakly bending rod approximations in determining the detailed chain properties (such as correlation functions). Using this approach, we also show that it is possible to study the effect of an externally applied tension or static electric field in a natural and analytically tractable way. These external perturbations can alter the scaling laws and introduce important new length scales into the system, relevant for histone unbinding and single-molecule analysis of DNA.
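
    As background to the correlation functions mentioned above, a small sketch of the unconfined wormlike chain in 2D, whose tangent-tangent correlation decays as exp(-s/lp); the mean-field treatment described in the talk refines this picture under confinement. The parameterization here is a standard textbook one, not the speaker's:

```python
import numpy as np

rng = np.random.default_rng(2)
lp, ds, N = 50.0, 1.0, 2000          # persistence length, segment length (nm)

# Discrete 2D wormlike chain: bend-angle variance 2*ds/lp per segment,
# which gives <t(s).t(0)> = exp(-s/lp) for the free chain.
theta = np.cumsum(rng.normal(0.0, np.sqrt(2 * ds / lp), N))

def corr(theta, lag):
    """Tangent-tangent correlation averaged over starting points."""
    return np.mean(np.cos(theta[lag:] - theta[:-lag]))

for lag in (10, 50, 100):
    s = lag * ds
    print(s, round(corr(theta, lag), 3), "theory:", round(np.exp(-s / lp), 3))
```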

  13. Women Match Men when Learning a Spatial Skill

    ERIC Educational Resources Information Center

    Spence, Ian; Yu, Jingjie Jessica; Feng, Jing; Marshman, Jeff

    2009-01-01

    Meta-analytic studies have concluded that although training improves spatial cognition in both sexes, the male advantage generally persists. However, because some studies run counter to this pattern, a closer examination of the anomaly is warranted. The authors investigated the acquisition of a basic skill (spatial selective attention) using a…

  14. Spatial autocorrelation analysis of health care hotspots in Taiwan in 2006

    PubMed Central

    2009-01-01

    Background Spatial analytical techniques and models are often used in epidemiology to identify spatial anomalies (hotspots) in disease regions. These analytical approaches can be used not only to identify the locations of such hotspots, but also to characterize their spatial patterns. Methods In this study, we utilize spatial autocorrelation methodologies, including Global Moran's I and Local Getis-Ord statistics, to describe and map spatial clusters, and the areas in which they are situated, for the 20 leading causes of death in Taiwan. In addition, we fit a logistic regression model to test for similarity and dissimilarity by gender. Results Genders are compared in an effort to identify common spatial risks. The local means found by spatial autocorrelation analysis are used to identify spatial cluster patterns. There is naturally great interest in discovering the relationship between the leading causes of death and well-documented spatial risk factors. For example, in Taiwan we found the geographical distribution of tuberculosis-prevalence clusters to correspond closely to the locations of aboriginal townships. Conclusions Cluster mapping helps to clarify issues such as the spatial aspects of both internal and external correlations for leading health care events. This is of great aid in assessing spatial risk factors, which in turn facilitates the planning of the most advantageous types of health care policies and the implementation of effective health care services. PMID:20003460
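
    For reference, Global Moran's I is simple to compute directly; a minimal NumPy sketch on a toy contiguity structure (all values invented, not Taiwanese data):

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x and spatial weight matrix W (n x n)."""
    n = len(x)
    z = x - x.mean()
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# Toy example: 5 districts on a line, rook (adjacent) contiguity weights
x = np.array([10.0, 12.0, 11.0, 30.0, 29.0])   # e.g. mortality rates
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0

print("Moran's I =", round(morans_i(x, W), 3))   # > 0 suggests clustering
```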

  15. A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics

    NASA Technical Reports Server (NTRS)

    Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan

    2013-01-01

    In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not proven adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in the data organization can benefit the efficiency of data post-processing, but it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before writing to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with a critical climate modeling application, GEOS-5, the experimental results on the Jaguar supercomputer demonstrate up to a 73-fold improvement in read performance compared to the original I/O method.
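
    The core layout idea (organizing data so that the time dimension is contiguous on disk) can be illustrated in a few lines; this is only a schematic of the access-pattern argument, not STAR's actual implementation:

```python
import numpy as np

# The same (time, y, x) data stored time-last vs time-first changes how
# contiguous a per-point time-series read is in memory or on disk.
t, ny, nx = 365, 100, 100
data = np.arange(t * ny * nx, dtype=np.float32).reshape(t, ny, nx)

space_major = data                                            # [t, y, x]
time_major = np.ascontiguousarray(data.transpose(1, 2, 0))    # [y, x, t]

# Time series at one grid point: strided gather vs one contiguous block
ts_a = space_major[:, 50, 50]   # stride of ny*nx*4 bytes between samples
ts_b = time_major[50, 50, :]    # single contiguous run of t floats
assert np.array_equal(ts_a, ts_b)
```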

  16. Regulating outdoor advertisement boards; employing spatial decision support system to control urban visual pollution

    NASA Astrophysics Data System (ADS)

    Wakil, K.; Hussnain, MQ; Tahir, A.; Naeem, M. A.

    2016-06-01

    Unmanaged placement, size, location, structure, and content of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in urban centres of Pakistan. As per the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, the responsible authorities are hampered in effective implementation by the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for the regularization of advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on provisions and restrictions prescribed in regulatory documents, while the user interface allows visualization and scenario evaluation to determine whether a new board would affect the existing linear density on a particular road and whether it would violate any buffer restrictions around a particular land use. Technically, the proposed SDSS is a web-based solution built on open geospatial tools such as OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets (the road network, locations of existing billboards, and building parcels with land-use information) to perform the analysis. Locational suitability has been calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS that can assist in solving space-related iterative decision challenges on outdoor advertisements. Employing such a system will result in more effective implementation of regulations, resulting in visual harmony and aesthetic improvement in urban communities.
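
    A minimal sketch of the AHP-weighted linear combination plus buffer test that such an SDSS might perform for one proposed site; the criterion names, weights, and the 150 m buffer distance are hypothetical:

```python
import numpy as np

# Hypothetical criterion scores for a proposed billboard site, each
# normalized to [0, 1] (1 = most suitable).
scores = np.array([
    0.8,   # inverse of existing-board linear density on the road
    0.6,   # distance from school/hospital buffers
    0.9,   # road hierarchy / visibility class
])
weights = np.array([0.5, 0.3, 0.2])     # e.g. from AHP pairwise comparison

suitability = float(weights @ scores)   # weighted linear combination (WLC)

def violates_buffer(board_xy, landuse_xy, min_dist=150.0):
    """True if the proposed board falls inside a restricted buffer (meters)."""
    return np.hypot(*(np.asarray(board_xy) - np.asarray(landuse_xy))) < min_dist

approve = suitability >= 0.7 and not violates_buffer((512, 730), (600, 700))
print(round(suitability, 2), approve)
```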

  17. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, most of the data and assumptions used in seismic hazard studies carry high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means to better capture spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth, and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advancements in both computer software and hardware, and is well structured for implementation with conventional GIS tools.

  18. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    EPA Science Inventory

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  19. Analytical 3D views and virtual globes — scientific results in a familiar spatial context

    NASA Astrophysics Data System (ADS)

    Tiede, Dirk; Lang, Stefan

    In this paper we introduce analytical three-dimensional (3D) views as a means for effective and comprehensible information delivery, using virtual globes and the third dimension as an additional information carrier. Four case studies are presented, in which information extraction results from very high spatial resolution (VHSR) satellite images were conditioned and aggregated or disaggregated to regular spatial units. The case studies were embedded in the contexts of: (1) urban life quality assessment (Salzburg/Austria); (2) post-disaster assessment (Harare/Zimbabwe); (3) emergency response (Lukole/Tanzania); and (4) contingency planning (fictitious crisis scenario/Germany). The results are made available in different virtual globe environments, using the implemented contextual data (such as satellite imagery, aerial photographs, and auxiliary geodata) as valuable additional context information. Both day-to-day users and high-level decision makers are the target audience for this tailored information product. The degree of abstraction required for understanding complex analytical content is balanced with the ease and appeal by which the context is conveyed.

  20. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, TED analytical implementation is described and discussed, and their analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point of care framework.

  1. Visual analytics of geo-social interaction patterns for epidemic control.

    PubMed

    Luo, Wei

    2016-08-10

    Human interaction and population mobility determine the spatio-temporal course of the spread of an airborne disease. This research views such spread as a geo-social interaction problem, because population mobility connects different groups of people across the geographical locations via which viruses transmit. Previous research argued that geo-social interaction patterns identified from population movement data hold great potential for designing effective pandemic mitigation. However, little work has been done to examine the effectiveness of control strategies designed around geo-social interaction patterns. To address this gap, this research proposes a new framework for effective disease control; specifically, the framework proposes that disease control strategies should start by identifying geo-social interaction patterns, then design effective control measures accordingly, and finally evaluate the efficacy of the different control measures. This framework is used to structure the design of a new visual analytic tool, GS-EpiViz, that consists of three components: a reorderable matrix for geo-social mixing patterns, agent-based epidemic models, and combined visualization methods. Using real-world human interaction data from a French primary school as a proof of concept, this research compares the efficacy of vaccination strategies targeted at spatial-social interaction patterns against strategies applied over whole areas. The simulation results show that locally targeted vaccination has the potential to keep infections to a small number and prevent spread to other regions. With some small probability the local control strategies will fail, and in these cases other control strategies will be needed. This research further explores the impact of varying spatial-social scales on the success of local vaccination strategies. The results show that a proper spatial-social scale can help achieve the best control efficacy with a limited number of vaccines. The case study shows how GS-EpiViz supports the design and testing of advanced control scenarios for airborne diseases (e.g., influenza). The geo-social patterns identified through exploring human interaction data can help target critical individuals, locations, and clusters of locations for disease control purposes. Varying spatial-social scales can help geographically and socially prioritize limited resources (e.g., vaccines).
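
    The targeted-versus-uniform comparison can be caricatured with a two-group compartmental model in a few lines; this is not GS-EpiViz's agent-based model, just a sketch of the mixing-matrix logic with invented parameters:

```python
import numpy as np

def epidemic_size(vax, beta, gamma=0.2, days=200, dt=0.1):
    """Final epidemic size for two groups given vaccination fractions."""
    S = np.array([1.0, 1.0]) * (1 - np.asarray(vax))
    I = np.array([0.01, 0.0])          # seed infection in group 0
    R = np.zeros(2)
    for _ in range(int(days / dt)):
        force = beta @ I               # geo-social mixing matrix
        dS = -S * force
        dI = S * force - gamma * I
        S, I, R = S + dS * dt, I + dI * dt, R + gamma * I * dt
    return R.sum()

# Mixing: strong within-group contact, weak between groups
beta = np.array([[0.5, 0.05],
                 [0.05, 0.3]])

print("uniform 25% each :", round(epidemic_size([0.25, 0.25], beta), 3))
print("targeted 50%/0%  :", round(epidemic_size([0.50, 0.00], beta), 3))
```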

  2. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis spectrophotometry was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass, and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280, and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could serve as a Process Analytical Technology (PAT) tool in biorefineries utilizing steam processes or comparable approaches.
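
    The underlying calibration is an ordinary Beer-Lambert linear fit; a minimal sketch with invented standards (the paper's actual calibration data and wavelength-specific coefficients are not reproduced here):

```python
import numpy as np

# Hypothetical calibration standards: lignin concentration (g/L) vs
# absorbance at 280 nm (values illustrative, not from the paper).
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
A280 = np.array([0.11, 0.21, 0.43, 0.85, 1.70])

# Beer-Lambert: A = eps*l*c + b  -> fit a line, then invert for unknowns
slope, intercept = np.polyfit(conc, A280, 1)
r = np.corrcoef(conc, A280)[0, 1]

def quantify(absorbance):
    return (absorbance - intercept) / slope

print(f"slope = {slope:.2f} L/g, R^2 = {r**2:.4f}")
print("unknown sample at 0.62 AU ->", round(quantify(0.62), 3), "g/L")
```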

  3. LASER APPLICATIONS AND OTHER TOPICS IN QUANTUM ELECTRONICS: Higher spatial harmonics of photorefractive gratings written by phase-locked detection

    NASA Astrophysics Data System (ADS)

    Dugin, A. V.; Zel'dovich, Boris Ya; Il'inykh, P. N.; Liberman, V. S.; Nesterkin, O. P.

    1992-11-01

    The higher spatial harmonics of the photorefractive response have been studied theoretically and experimentally for gratings written by phase-locked detection in an alternating external field. The conditions for writing higher spatial harmonics are derived analytically. The amplitude of the second spatial harmonic has been found experimentally as a function of the spatial frequency in two Bi12TiO20 crystals.

  4. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, though sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  5. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues, understanding how detail at one scale makes its signature felt at other scales and how to relate phenomena across scales, cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  6. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted a high interest by the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  7. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  8. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  9. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  10. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  11. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  12. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  13. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  14. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze the aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, data resources and analytical tools that can be used to examine the disease associations of ARSs/AIMPs have been lacking. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARSs/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation, and phosphorylation data of ARSs/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploring the disease associations of ARSs/AIMPs, identifying disease-associated ARS/AIMP interactors, and reconstructing ARS-dependent disease-perturbed network models. IDA therefore provides both comprehensive data resources and analytical tools for understanding the potential roles of ARSs/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  15. Surface Desorption Dielectric-Barrier Discharge Ionization Mass Spectrometry.

    PubMed

    Zhang, Hong; Jiang, Jie; Li, Na; Li, Ming; Wang, Yingying; He, Jing; You, Hong

    2017-07-18

    A variant of dielectric-barrier discharge, named surface desorption dielectric-barrier discharge ionization (SDDBDI) mass spectrometry, was developed for high-efficiency ion transmission and high-spatial-resolution imaging. In SDDBDI, a tungsten nanotip and the inlet of the mass spectrometer are used as electrodes, and a piece of coverslip is used both as a sample plate and as an insulating dielectric barrier, which simplifies the configuration of the instrument and thus its operation. Unlike in volume dielectric-barrier discharge (VDBD), the microdischarges in SDDBDI are generated on the surface, and therefore the plasma density is extremely high. Analyte ions are guided directly into the MS inlet without any deflection. This configuration significantly improves the ion transmission efficiency and thus the sensitivity. The dependence of the sensitivity and spatial resolution of SDDBDI on the operating parameters was systematically investigated. The application of SDDBDI was successfully demonstrated by the analysis of multiple species, including amino acids, pharmaceuticals, putative cancer biomarkers, and mixtures of both fatty acids and hormones. Limits of detection (S/N = 3) were determined to be 0.84 and 0.18 pmol for the analysis of l-alanine and metronidazole, respectively. A spatial resolution of 22 μm was obtained for the analysis of an imprinted cyclophosphamide pattern, and imaging of a "T" character was successfully demonstrated under ambient conditions. These results indicate that SDDBDI offers high-efficiency ion transmission, high sensitivity, and high spatial resolution, which render it a potential tool for mass spectrometry imaging.

  16. Step selection techniques uncover the environmental predictors of space use patterns in flocks of Amazonian birds.

    PubMed

    Potts, Jonathan R; Mokross, Karl; Stouffer, Philip C; Lewis, Mark A

    2014-12-01

    Understanding the behavioral decisions behind animal movement and space use patterns is a key challenge for behavioral ecology. Tools to quantify these patterns from movement and animal-habitat interactions are vital for transforming ecology into a predictive science. This is particularly important in environments undergoing rapid anthropogenic changes, such as the Amazon rainforest, where animals face novel landscapes. Insectivorous bird flocks are key elements of avian biodiversity in the Amazonian ecosystem. Therefore, disentangling and quantifying the drivers behind their movement and space use patterns is of great importance for Amazonian conservation. We use a step selection function (SSF) approach to uncover environmental drivers behind movement choices. This is used to construct a mechanistic model, from which we derive predicted utilization distributions (home ranges) of flocks. We show that movement decisions are significantly influenced by canopy height and topography, but depletion and renewal of resources do not appear to affect movement significantly. We quantify the magnitude of these effects and demonstrate that they are helpful for understanding various heterogeneous aspects of space use. We compare our results to recent analytic derivations of space use, demonstrating that the analytic approximation is only accurate when assuming that there is no persistence in the animals' movement. Our model can be translated into other environments or hypothetical scenarios, such as those given by proposed future anthropogenic actions, to make predictions of spatial patterns in bird flocks. Furthermore, our approach is quite general, so could potentially be used to understand the drivers of movement and spatial patterns for a wide variety of animal communities.
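
    The SSF machinery reduces, for a single stratum, to a conditional-logit weighting of the observed step against sampled available steps; a minimal sketch with invented covariates and coefficients:

```python
import numpy as np

rng = np.random.default_rng(3)

# One observed step plus 10 "available" control steps, each described by
# covariates (canopy height, slope) at the step endpoint (values invented).
X = rng.normal(size=(11, 2))
X[0] += [1.0, -0.5]                 # chosen step: taller canopy, gentler slope
beta = np.array([0.8, -0.6])        # selection coefficients to evaluate

# Conditional-logit (SSF) likelihood of the chosen step within its stratum:
w = np.exp(X @ beta)
print("P(chosen | available) =", round(w[0] / w.sum(), 3))
```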

  17. Step selection techniques uncover the environmental predictors of space use patterns in flocks of Amazonian birds

    PubMed Central

    Potts, Jonathan R; Mokross, Karl; Stouffer, Philip C; Lewis, Mark A

    2014-01-01

    Understanding the behavioral decisions behind animal movement and space use patterns is a key challenge for behavioral ecology. Tools to quantify these patterns from movement and animal–habitat interactions are vital for transforming ecology into a predictive science. This is particularly important in environments undergoing rapid anthropogenic changes, such as the Amazon rainforest, where animals face novel landscapes. Insectivorous bird flocks are key elements of avian biodiversity in the Amazonian ecosystem. Therefore, disentangling and quantifying the drivers behind their movement and space use patterns is of great importance for Amazonian conservation. We use a step selection function (SSF) approach to uncover environmental drivers behind movement choices. This is used to construct a mechanistic model, from which we derive predicted utilization distributions (home ranges) of flocks. We show that movement decisions are significantly influenced by canopy height and topography, but depletion and renewal of resources do not appear to affect movement significantly. We quantify the magnitude of these effects and demonstrate that they are helpful for understanding various heterogeneous aspects of space use. We compare our results to recent analytic derivations of space use, demonstrating that the analytic approximation is only accurate when assuming that there is no persistence in the animals' movement. Our model can be translated into other environments or hypothetical scenarios, such as those given by proposed future anthropogenic actions, to make predictions of spatial patterns in bird flocks. Furthermore, our approach is quite general, so could potentially be used to understand the drivers of movement and spatial patterns for a wide variety of animal communities. PMID:25558353

  18. Analytical Hierarchy Process modeling for malaria risk zones in Vadodara district, Gujarat

    NASA Astrophysics Data System (ADS)

    Bhatt, B.; Joshi, J. P.

    2014-11-01

    The malaria epidemic is one of the most complex spatial problems in the world. According to the WHO, an estimated 627,000 deaths occurred due to malaria in 2012. In many developing nations with diverse ecological regions, it remains a large cause of human mortality. Owing to the incompleteness of epidemiological data and their spatial origin, the quantification of disease incidence is a major constraint on basic public health planning, especially in developing countries. The present study uses an integrated geospatial and multi-criteria evaluation (AHP) technique to determine malaria risk zones. The study covers Vadodara district, comprising 12 talukas, of which 4 are predominantly tribal. Climatic and physical environmental factors, viz. rainfall, hydrogeomorphology, drainage, elevation, and land cover, are scored by their share in the evaluation of malariogenic conditions. These scores were synthesized on the basis of the preference over each factor, and the total weights of each data layer were computed and visualized. The district was divided into three risk zones, viz. high, moderate, and low. It was observed that a geographical area of 1885.2 sq. km, comprising 30.3% of the district, falls in the high-risk zone. The risk zones identified on the basis of these parameters and assigned weights show a close resemblance to ground conditions, as the annual parasite index (API) distribution for 2011, when overlaid, corresponds to the identified risk zones. The study demonstrates the significance and promise of integrating geospatial tools and the Analytical Hierarchy Process for delineating malaria risk zones and understanding the dynamics of malaria transmission.

  19. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  20. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  1. ToF-SIMS imaging of molecular-level alteration mechanisms in Le Bonheur de vivre by Henri Matisse.

    PubMed

    Voras, Zachary E; deGhetaldi, Kristin; Wiggins, Marcie B; Buckley, Barbara; Baade, Brian; Mass, Jennifer L; Beebe, Thomas P

    2015-11-01

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) has recently been shown to be a valuable tool for cultural heritage studies, especially when used in conjunction with established analytical techniques in the field. The ability of ToF-SIMS to simultaneously image inorganic and organic species within a paint cross section at micrometer-level spatial resolution makes it a uniquely qualified analytical technique to aid in further understanding the processes of pigment and binder alteration, as well as pigment-binder interactions. In this study, ToF-SIMS was used to detect and image both molecular and elemental species related to CdS pigment and binding medium alteration on the painting Le Bonheur de vivre (1905-1906, The Barnes Foundation) by Henri Matisse. Three categories of inorganic and organic components were found throughout Le Bonheur de vivre and co-localized in cross-sectional samples using high-spatial-resolution ToF-SIMS analysis: (1) species relating to the preparation and photo-induced oxidation of CdS yellow pigments; (2) varying amounts of long-chain fatty acids present in both the paint and primary ground layer; and (3) specific amino acid fragments, possibly relating to the painting's complex restoration history. ToF-SIMS's ability to discern both organic and inorganic species via cross-sectional imaging was used to compare samples collected from Le Bonheur de vivre to artificially aged reference paints in an effort to gather mechanistic information relating to alteration processes that have been previously explored using μXANES, SR-μXRF, SEM-EDX, and SR-FTIR. The relatively high sensitivity offered by ToF-SIMS imaging, coupled with its high spatial resolution, allowed for the positive identification of degradation products (such as cadmium oxalate) in specific paint regions that had previously gone unobserved. The imaging of organic materials has provided insight into the extent of destruction of the original binding medium, as well as identifying unexpected organic materials in specific paint layers.

  2. ToF-SIMS imaging of molecular-level alteration mechanisms in Le Bonheur de vivre by Henri Matisse

    NASA Astrophysics Data System (ADS)

    Voras, Zachary E.; deGhetaldi, Kristin; Wiggins, Marcie B.; Buckley, Barbara; Baade, Brian; Mass, Jennifer L.; Beebe, Thomas P.

    2015-11-01

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) has recently been shown to be a valuable tool for cultural heritage studies, especially when used in conjunction with established analytical techniques in the field. The ability of ToF-SIMS to simultaneously image inorganic and organic species within a paint cross section at micrometer-level spatial resolution makes it a uniquely qualified analytical technique to aid in further understanding the processes of pigment and binder alteration, as well as pigment-binder interactions. In this study, ToF-SIMS was used to detect and image both molecular and elemental species related to CdS pigment and binding medium alteration on the painting Le Bonheur de vivre (1905-1906, The Barnes Foundation) by Henri Matisse. Three categories of inorganic and organic components were found throughout Le Bonheur de vivre and co-localized in cross-sectional samples using high-spatial-resolution ToF-SIMS analysis: (1) species relating to the preparation and photo-induced oxidation of CdS yellow pigments; (2) varying amounts of long-chain fatty acids present in both the paint and primary ground layer; and (3) specific amino acid fragments, possibly relating to the painting's complex restoration history. ToF-SIMS's ability to discern both organic and inorganic species via cross-sectional imaging was used to compare samples collected from Le Bonheur de vivre to artificially aged reference paints in an effort to gather mechanistic information relating to alteration processes that have been previously explored using μXANES, SR-μXRF, SEM-EDX, and SR-FTIR. The relatively high sensitivity offered by ToF-SIMS imaging, coupled with its high spatial resolution, allowed for the positive identification of degradation products (such as cadmium oxalate) in specific paint regions that had previously gone unobserved. The imaging of organic materials has provided insight into the extent of destruction of the original binding medium, as well as identifying unexpected organic materials in specific paint layers.

  3. A GIS-based hedonic price model for agricultural land

    NASA Astrophysics Data System (ADS)

    Demetriou, Demetris

    2015-06-01

    Land consolidation is a very effective land management planning approach that aims at sustainable rural/agricultural development. Land reallocation, which involves land tenure restructuring, is the most important, complex, and time-consuming component of land consolidation. Land reallocation relies on land valuation, since its fundamental principle provides that, after consolidation, each landowner shall be granted property of an aggregate value approximately the same as the value of the property owned prior to consolidation. Land value is therefore the crucial factor in the land reallocation process, and hence in the success and acceptance of the final land consolidation plan. Land valuation is the process of assigning values to all parcels (and their contents) and is usually carried out by an ad hoc committee. However, the process faces several problems: it is time consuming and hence costly, and its outcomes may be inconsistent, since it is carried out manually and empirically without systematic analytical tools, in particular spatial-analysis and statistical/mathematical techniques. A solution to these problems can be the employment of mass-appraisal land valuation methods using automated valuation models (AVMs) based on international standards. In this context, this paper presents a spatially based linear hedonic price model which has been developed and tested in a case study land consolidation area in Cyprus. Results showed that the AVM is capable of producing land values of acceptable accuracy and reliability, and of reducing the time, and hence the cost, required by around 80%.
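
    A linear hedonic model of this kind is, at its core, an ordinary least squares regression of parcel value on parcel attributes; a minimal sketch with synthetic data (attribute names and implicit prices are illustrative, not the Cyprus case study values):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical parcel attributes: area (ha), road access (0/1),
# slope (%), distance to town (km). Coefficients are illustrative.
n = 500
X = np.column_stack([
    rng.uniform(0.1, 5.0, n),        # area
    rng.integers(0, 2, n),           # road access
    rng.uniform(0, 25, n),           # slope
    rng.uniform(0.5, 15, n),         # distance to town
])
value = (2000 * X[:, 0] + 3500 * X[:, 1] - 120 * X[:, 2]
         - 250 * X[:, 3] + 15000 + rng.normal(0, 800, n))

# Hedonic model: value = X @ b + intercept, fit by ordinary least squares
A = np.column_stack([X, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, value, rcond=None)
print(np.round(coef, 1))   # implicit price of each attribute + intercept
```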

  4. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  5. General properties and analytical approximations of photorefractive solitons

    NASA Astrophysics Data System (ADS)

    Geisler, A.; Homann, F.; Schmidt, H.-J.

    2004-08-01

    We investigate general properties of spatial 1-dimensional bright photorefractive solitons and discuss various analytical approximations for the soliton profile and the half width, both depending on an intensity parameter r. The case of dark solitons is also briefly addressed.

  6. Benchmark solutions for the galactic heavy-ion transport equations with energy and spatial coupling

    NASA Technical Reports Server (NTRS)

    Ganapol, Barry D.; Townsend, Lawrence W.; Lamkin, Stanley L.; Wilson, John W.

    1991-01-01

    Nontrivial benchmark solutions are developed for the galactic heavy-ion transport equations in the straight-ahead approximation with energy and spatial coupling. Analytical representations of the ion fluxes are obtained for a variety of sources under the assumption that the nuclear interaction parameters are energy independent. The method utilizes an analytical Laplace transform inversion to yield a closed-form representation that is computationally efficient. The flux profiles are then used to predict ion dose profiles, which are important for shield design studies.

  7. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to build quality into products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adopt PAT tools in pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is represented diagrammatically.
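
    Partial least squares (PLS) regression is one of the chemometric workhorses alluded to here; a minimal sketch calibrating synthetic spectra against concentration (assuming scikit-learn is available; the spectral data are invented):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)

# Synthetic NIR-like spectra: 80 samples x 200 wavelengths, where the
# analyte concentration drives two broad spectral bands (illustrative).
n, p = 80, 200
conc = rng.uniform(0.1, 1.0, n)
wl = np.linspace(0, 1, p)
band = np.exp(-((wl - 0.3) / 0.05)**2) + 0.5 * np.exp(-((wl - 0.7) / 0.08)**2)
X = np.outer(conc, band) + rng.normal(0, 0.02, (n, p))

# PLS: project spectra onto a few latent variables, then regress
pls = PLSRegression(n_components=3)
pls.fit(X[:60], conc[:60])                  # calibration set
pred = pls.predict(X[60:]).ravel()          # validation set
rmsep = np.sqrt(np.mean((pred - conc[60:])**2))
print(f"RMSEP = {rmsep:.4f}")
```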

  8. Coloc-stats: a unified web interface to perform colocalization analysis of genomic features.

    PubMed

    Simovski, Boris; Kanduri, Chakravarthi; Gundersen, Sveinung; Titov, Dmytro; Domanska, Diana; Bock, Christoph; Bossini-Castillo, Lara; Chikina, Maria; Favorov, Alexander; Layer, Ryan M; Mironov, Andrey A; Quinlan, Aaron R; Sheffield, Nathan C; Trynka, Gosia; Sandve, Geir K

    2018-06-05

    Functional genomics assays produce sets of genomic regions as one of their main outputs. To biologically interpret such region-sets, researchers often use colocalization analysis, where the statistical significance of colocalization (overlap, spatial proximity) between two or more region-sets is tested. Existing colocalization analysis tools vary in the statistical methodology and analysis approaches, thus potentially providing different conclusions for the same research question. As the findings of colocalization analysis are often the basis for follow-up experiments, it is helpful to use several tools in parallel and to compare the results. We developed the Coloc-stats web service to facilitate such analyses. Coloc-stats provides a unified interface to perform colocalization analysis across various analytical methods and method-specific options (e.g. colocalization measures, resolution, null models). Coloc-stats helps the user to find a method that supports their experimental requirements and allows for a straightforward comparison across methods. Coloc-stats is implemented as a web server with a graphical user interface that assists users with configuring their colocalization analyses. Coloc-stats is freely available at https://hyperbrowser.uio.no/coloc-stats/.
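
    The basic shape of such a colocalization test is easy to state: compare the observed overlap of two region sets against a permutation null. A deliberately naive sketch (quadratic-time overlap and a simple random-shift null; the tools wrapped by Coloc-stats use more careful null models):

```python
import numpy as np

rng = np.random.default_rng(6)

def total_overlap(a, b):
    """Total overlapping bases between two interval sets (n x 2 arrays)."""
    total = 0
    for s, e in a:
        for s2, e2 in b:
            total += max(0, min(e, e2) - max(s, s2))
    return total

def permute(regions, genome_len):
    """Null model: shift each region to a random position (lengths kept)."""
    lens = regions[:, 1] - regions[:, 0]
    starts = rng.integers(0, genome_len - lens)
    return np.column_stack([starts, starts + lens])

genome = 1_000_000
a = np.array([[1000, 2000], [50_000, 52_000], [400_000, 401_000]])
b = np.array([[1500, 2500], [51_000, 53_000], [800_000, 801_000]])

obs = total_overlap(a, b)
null = [total_overlap(permute(a, genome), b) for _ in range(1000)]
p = (1 + sum(v >= obs for v in null)) / 1001
print("observed overlap:", obs, " p ~", round(p, 4))
```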

  9. Monitoring the Productivity of Coastal Systems Using PH ...

    EPA Pesticide Factsheets

    The impact of nutrient inputs on the eutrophication of coastal ecosystems has been one of the great themes of coastal ecology. There have been countless studies devoted to quantifying how human sources of nutrients, in particular nitrogen (N), affect coastal water bodies. These studies, which often measure in situ concentrations of nutrients, chlorophyll, and dissolved oxygen, are often spatially and/or temporally intensive and expensive. We provide evidence from experimental mesocosms, coupled with data from the water column of a well-mixed estuary, that pH can be a quick, inexpensive, and integrative measure of net ecosystem metabolism. In some cases, this approach is a more sensitive tracer of production than direct measurements of chlorophyll and carbon-14. Taken together, our data suggest that pH is a sensitive, but often overlooked, tool for monitoring estuarine production. This presentation will explore the potential utility of pH as an indicator of ecosystem productivity. Our data suggest that pH is a sensitive and potentially integrative measure of net ecosystem production. It should not be overlooked that measuring pH is quick, easy, and inexpensive, further increasing its value as an analytical tool.

  10. Student laboratory presentations as a learning tool in anatomy education.

    PubMed

    Chollet, Madeleine B; Teaford, Mark F; Garofalo, Evan M; DeLeon, Valerie B

    2009-01-01

    Previous studies have shown that anatomy students who complete oral laboratory presentations believe they understand the material better and retain it longer than they otherwise would if they only took examinations on the material; however, we have found no studies that empirically test such outcomes. The purpose of this study was to assess the effectiveness of oral presentations through comparisons with other methods of assessment, most notably, examination performance. Specifically, we tested whether students (n = 256) performed better on examination questions on topics covered by their oral presentations than on other topics. Each student completed two graded, 12-minute laboratory presentations on two different assigned topics during the course and took three examinations, each of which covered a third of the course material. Examination questions were characterized by type (memorization, pathway, analytical, spatial). A two-way repeated measures analysis of variance revealed that students performed better on topics covered by their presentations than on topics not covered by their presentations (P < 0.005), regardless of presentation grade (P > 0.05) and question type (P > 0.05). These results demonstrate empirically that oral presentations are an effective learning tool.

  11. On-Line Monitoring the Growth of E. coli or HeLa Cells Using an Annular Microelectrode Piezoelectric Biosensor.

    PubMed

    Tong, Feifei; Lian, Yan; Han, Junliang

    2016-12-18

    In a non-mass-responsive piezoelectric biosensor, biological information is obtained from the interaction between the series detection electrode and the organism, or from the physical field of the biological culture. The electrical parameters of the electrode therefore affect the biosensor signal. The electric field distribution of the microelectrode used in this study was simulated using the COMSOL Multiphysics analytical tool. The simulation showed that the spatial distribution of the electric field is affected by the width of the electrode fingers and the spacing between the electrodes. In addition, the characteristic response of the piezoelectric sensor constructed serially with an annular microelectrode was tested and applied to the continuous detection of Escherichia coli or HeLa cell cultures. Results indicated that the piezoelectric biosensor with an annular microelectrode meets the requirements for the real-time detection of E. coli or HeLa cells in culture. Moreover, this kind of piezoelectric biosensor is more sensitive than a sensor with an interdigital microelectrode. Thus, the piezoelectric biosensor acts as an effective analysis tool for acquiring online cell or microbial culture information.

  12. Report on an Informal Survey of Groundwater Modeling Practitioners About How They Quantify Uncertainty: Which Tools They Use, Why, and Why Not.

    NASA Astrophysics Data System (ADS)

    Ginn, T. R.; Scheibe, T. D.

    2006-12-01

    Hydrogeology is among the most data-limited of the earth sciences, so that uncertainty arises in every aspect of subsurface flow and transport modeling, from conceptual model to spatial discretization to parameter values. Treatment of uncertainty is thus unavoidable, and the literature and conference proceedings are replete with approaches, templates, and paradigms for doing so. However, such tools remain underused, especially those of the stochastic analytic sort, which has recently prompted explicit inquiries into why this is the case, to which entire journal issues have been dedicated. In an effort to continue this discussion in a constructive way, we report on an informal yet extensive survey of hydrogeology practitioners, as the "marketplace" for techniques to deal with uncertainty. The survey includes scientists, engineers, regulators, and others, and reports on quantitative (or not) methods for uncertainty characterization and analysis, frequency and level of usage, and reasons behind the selection or avoidance of available methods. Results shed light on fruitful directions for future research in uncertainty quantification in hydrogeology.

  14. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records, to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities, so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so users can further analyze their data in other analytic tools.

  15. A geovisual analytic approach to understanding geo-social relationships in the international trade network.

    PubMed

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly 'balkanized' (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above.

  16. A Geovisual Analytic Approach to Understanding Geo-Social Relationships in the International Trade Network

    PubMed Central

    Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M.

    2014-01-01

    The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly ‘balkanized’ (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above. PMID:24558409

  17. Spatial allocation of forest recreation value

    Treesearch

    Kenneth A. Baerenklau; Armando Gonzalez-Caban; Catrina Paez; Edgard Chavez

    2009-01-01

    Non-market valuation methods and geographic information systems are useful planning and management tools for public land managers. Recent attention has been given to investigation and demonstration of methods for combining these tools to provide spatially-explicit representations of non-market value. Most of these efforts have focused on spatial allocation of...

  18. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  19. Spatial-Operator Algebra For Robotic Manipulators

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo; Kreutz, Kenneth K.; Milman, Mark H.

    1991-01-01

    Report discusses spatial-operator algebra developed in recent studies of mathematical modeling, control, and design of trajectories of robotic manipulators. Provides succinct representation of mathematically complicated interactions among multiple joints and links of manipulator, thereby relieving analyst of most of tedium of detailed algebraic manipulations. Presents analytical formulation of spatial-operator algebra, describes some specific applications, summarizes current research, and discusses implementation of spatial-operator algebra in the Ada programming language.
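
    To illustrate the flavor of the formalism, here is a minimal sketch (in Python rather than Ada, and not the report's formulation; sign and transpose conventions vary across spatial-operator treatments) of 6x6 operators propagating spatial velocities outward along a two-link arm with hypothetical geometry and joint rates.

    ```python
    import numpy as np

    def skew(v):
        """Cross-product matrix: skew(v) @ w == np.cross(v, w)."""
        return np.array([[0, -v[2], v[1]],
                         [v[2], 0, -v[0]],
                         [-v[1], v[0], 0]])

    def phi(l):
        """6x6 rigid-body operator for link vector l: the angular part
        passes through unchanged, the linear part picks up omega x l."""
        P = np.eye(6)
        P[3:, :3] = -skew(l)
        return P

    h = np.array([0, 0, 1, 0, 0, 0.])   # joint map for a z-axis revolute joint
    links = [np.array([1.0, 0, 0]), np.array([0.8, 0, 0])]
    qd = [0.5, -0.2]                    # hypothetical joint rates (rad/s)

    V = np.zeros(6)                     # spatial velocity [omega; v]
    for l, rate in zip(links, qd):
        V = phi(l) @ V + h * rate       # outward recursion, link by link
    print(V)                            # tip-link spatial velocity
    ```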

  20. FEMA's Earthquake Incident Journal: A Web-Based Data Integration and Decision Support Tool for Emergency Management

    NASA Astrophysics Data System (ADS)

    Jones, M.; Pitts, R.

    2017-12-01

    For emergency managers, government officials, and others who must respond to rapidly changing natural disasters, timely access to detailed information related to affected terrain, population and infrastructure is critical for planning, response and recovery operations. Accessing, analyzing and disseminating such disparate information in near real-time are critical decision support components. However, finding a way to handle a variety of informative yet complex datasets poses a challenge when preparing for and responding to disasters. Here, we discuss the implementation of a web-based data integration and decision support tool for earthquakes developed by the Federal Emergency Management Agency (FEMA) as a solution to some of these challenges. While earthquakes are among the most well-monitored and measured of natural hazards, the spatially broad impacts of shaking, ground deformation, landslides, liquefaction, and even tsunamis are extremely difficult to quantify without accelerated access to data, modeling, and analytics. This web-based application, dubbed the "Earthquake Incident Journal", provides real-time access to authoritative and event-specific data from external (e.g. US Geological Survey, NASA, state and local governments, etc.) and internal (FEMA) data sources. The journal includes a GIS-based model for exposure analytics, allowing FEMA to assess the severity of an event, estimate impacts to structures and population in near real-time, and then apply planning factors to exposure estimates to answer questions such as: What geographic areas are impacted? Will federal support be needed? What resources are needed to support survivors? And which infrastructure elements or essential facilities are threatened? This presentation reviews the development of the Earthquake Incident Journal, detailing the data integration solutions, the methodology behind the GIS-based automated exposure model, and the planning factors as well as other analytical advances that provide near real-time decision support to the federal government.
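
    A minimal sketch of the kind of exposure cross-tabulation such a model performs; the intensity grid, population counts, and MMI binning are hypothetical, not FEMA's actual planning factors.

    ```python
    import numpy as np

    mmi = np.array([[4.2, 5.1, 6.3],        # modeled shaking intensity (MMI)
                    [5.8, 7.2, 7.9],
                    [6.1, 8.3, 6.7]])
    pop = np.array([[1200, 3400, 800],      # population per grid cell
                    [2500, 9000, 4100],
                    [1800, 7600, 900]])

    bins = [5, 6, 7, 8, 10]                 # bin edges for MMI V through VIII+
    labels = ["V", "VI", "VII", "VIII+"]
    for lo, hi, name in zip(bins[:-1], bins[1:], labels):
        exposed = pop[(mmi >= lo) & (mmi < hi)].sum()
        print(f"MMI {name}: {exposed:,} people exposed")
    ```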

  1. Online platform for applying space-time scan statistics for prospectively detecting emerging hot spots of dengue fever.

    PubMed

    Chen, Chien-Chou; Teng, Yung-Chu; Lin, Bo-Cheng; Fan, I-Chun; Chan, Ta-Chien

    2016-11-25

    Cases of dengue fever have increased in areas of Southeast Asia in recent years. Taiwan hit a record-high 42,856 cases in 2015, with the majority in southern Tainan and Kaohsiung Cities. Leveraging spatial statistics and geo-visualization techniques, we aim to design an online analytical tool for local public health workers to prospectively identify ongoing hot spots of dengue fever weekly at the village level. A total of 57,516 confirmed cases of dengue fever in 2014 and 2015 were obtained from the Taiwan Centers for Disease Control (TCDC). Incorporating demographic information as covariates with cumulative cases (365 days) in a discrete Poisson model, we iteratively applied space-time scan statistics with SaTScan software to detect the currently active cluster of dengue fever (reported as relative risk) in each village of Tainan and Kaohsiung every week. A village with a relative risk >1 and p value <0.05 was identified as a dengue-epidemic area. Assuming that ongoing transmission might continue to spread for two consecutive weeks, we estimated the sensitivity and specificity for detecting outbreaks by comparing the scan-based classification (dengue-epidemic vs. dengue-free village) with the true cumulative case numbers from the TCDC's surveillance statistics. Among the 1648 villages in Tainan and Kaohsiung, and across a total of 92 weekly simulations, the overall sensitivity for detecting outbreaks increases as case numbers grow; the specificity behaves inversely to the sensitivity. On average, the mean sensitivity and specificity of 2-week hot spot detection were 0.615 and 0.891 respectively (p value <0.001) for the covariate adjustment model, with the maximum spatial and temporal windows specified as 50% of the total population at risk and 28 days. Dengue-epidemic villages were visualized and explored in an interactive map. We designed an online analytical tool for front-line public health workers to prospectively detect ongoing dengue fever transmission on a weekly basis at the village level using routine surveillance data.
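
    For intuition, a minimal sketch of the scoring inside a discrete Poisson scan statistic (Kulldorff's log-likelihood ratio applied here to single villages; real SaTScan runs scan over space-time windows); all counts are hypothetical.

    ```python
    import numpy as np

    def poisson_llr(c, e, C):
        """Log-likelihood ratio for a candidate cluster with c observed and
        e expected cases, out of C total cases (discrete Poisson model);
        zero when the cluster is not elevated (c <= e)."""
        if c <= e:
            return 0.0
        return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

    cases = np.array([12, 3, 0, 7])        # hypothetical village case counts
    pop = np.array([900, 1200, 400, 800])  # hypothetical populations at risk
    C = cases.sum()
    expected = C * pop / pop.sum()         # expected counts under the null

    rr = cases / expected                  # relative risk per village
    llr = [poisson_llr(c, e, C) for c, e in zip(cases, expected)]
    print(rr, llr)   # villages with RR > 1 and a high LLR flag hot spots
    ```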

  2. Using Exploratory Spatial Data Analysis to Leverage Social Indicator Databases: The Discovery of Interesting Patterns

    ERIC Educational Resources Information Center

    Anselin, Luc; Sridharan, Sanjeev; Gholston, Susan

    2007-01-01

    With the proliferation of social indicator databases, the need for powerful techniques to study patterns of change has grown. In this paper, the utility of spatial data analytical methods such as exploratory spatial data analysis (ESDA) is suggested as a means to leverage the information contained in social indicator databases. The principles…
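
    As a concrete instance of an ESDA statistic, here is a minimal numpy sketch of global Moran's I; the indicator values and contiguity weights are hypothetical.

    ```python
    import numpy as np

    def morans_i(y, W):
        """Global Moran's I for values y and a spatial weights matrix W."""
        z = y - y.mean()
        num = (W * np.outer(z, z)).sum()
        den = (z ** 2).sum()
        return (len(y) / W.sum()) * (num / den)

    y = np.array([2.0, 3.0, 8.0, 9.0])     # hypothetical indicator values
    W = np.array([[0, 1, 0, 0],            # hypothetical contiguity weights
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(morans_i(y, W))   # values near +1 suggest spatial clustering
    ```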

  3. Spatial versus Object Visualisation: The Case of Mathematical Understanding in Three-Dimensional Arrays of Cubes and Nets

    ERIC Educational Resources Information Center

    Pitta-Pantazi, Demetra; Christou, Constantinos

    2010-01-01

    This paper investigates the relations of students' spatial and object visualisation with their analytic, creative and practical abilities in three-dimensional geometry. Fifty-three 11-year-olds were tested using a Greek modified version of the Object-Spatial Imagery Questionnaire (OSIQ) (Blajenkova, Kozhevnikov, & Motes, 2006) and two…

  4. Mapping the magnonic landscape in patterned magnetic structures

    NASA Astrophysics Data System (ADS)

    Davies, C. S.; Poimanov, V. D.; Kruglyak, V. V.

    2017-09-01

    We report the development of a hybrid numerical/analytical model capable of mapping the spatially varying distributions of the local ferromagnetic resonance (FMR) frequency and dynamic magnetic susceptibility in a wide class of patterned and compositionally modulated magnetic structures. Starting from the numerically simulated static micromagnetic state, the magnetization is deliberately deflected orthogonally to its equilibrium orientation, and the magnetic fields generated in response to this deflection are evaluated using micromagnetic software. This allows us to calculate the elements of the effective demagnetizing tensor, which are then used within a linear analytical formalism to map the local FMR frequency and dynamic magnetic susceptibility. To illustrate the typical results that one can obtain using this model, we analyze three micromagnetic systems boasting nonuniformity in either one or two dimensions, and successfully explain the spin-wave emission observed in each case, demonstrating the ubiquitous nature of the Schlömann excitation mechanism underpinning the observations. Finally, the developed model of local FMR frequency can be used to explain how spin waves could be confined and steered using magnetic nonuniformities of various origins, rendering it a powerful tool for the mapping of the graded magnonic index in magnonics.
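
    The last step, turning local effective demagnetizing factors into a local FMR frequency, can be illustrated with the classic Kittel formula; this is a minimal sketch under SI conventions with hypothetical film parameters, not the authors' hybrid model.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi      # vacuum permeability (T m/A)
    GAMMA = 1.76e11         # electron gyromagnetic ratio (rad/(s T))

    def local_fmr(H, Ms, Nx, Ny, Nz):
        """Kittel FMR frequency (Hz) for magnetization along z with local
        effective demagnetizing factors Nx, Ny, Nz (A/m field units)."""
        w = GAMMA * MU0 * np.sqrt((H + (Nx - Nz) * Ms) *
                                  (H + (Ny - Nz) * Ms))
        return w / (2 * np.pi)

    # Hypothetical permalloy-like thin film magnetized in plane.
    print(local_fmr(H=40e3, Ms=800e3, Nx=0.0, Ny=1.0, Nz=0.0) / 1e9, "GHz")
    ```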

  5. Simulation of laser generated ultrasound with application to defect detection

    NASA Astrophysics Data System (ADS)

    Pantano, A.; Cerniglia, D.

    2008-06-01

    Laser generated ultrasound holds substantial promise as a tool for defect detection in remote inspection, thanks to its ability to produce frequencies in the MHz range, enabling fine spatial resolution of defects. Despite the potential impact of laser generated ultrasound in many areas of science and industry, robust tools for studying the phenomenon are lacking, which limits the design and optimization of non-destructive testing and evaluation techniques. The propagation of laser generated ultrasound in complex structures is an intricate phenomenon and is extremely hard to analyze; only simple geometries can be studied analytically. Numerical techniques found in the literature have proved to be limited in their applicability by the MHz-range frequencies and very short wavelengths involved. The objective of this research is to prove that by using an explicit integration rule together with diagonal element mass matrices, instead of the almost universally adopted implicit integration rule, to integrate the equations of motion in a dynamic analysis, it is possible to efficiently and accurately solve ultrasound wave propagation problems with frequencies in the MHz range travelling in relatively large bodies. Presented results on NDE testing of rails demonstrate that the proposed FE technique can provide a valuable tool for studying laser generated ultrasound propagation.
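
    A minimal sketch of the paper's central point, explicit time integration with a lumped (diagonal) mass so no global system is solved per step, applied to a 1D elastic bar; material values, discretization, and the source pulse are hypothetical.

    ```python
    import numpy as np

    E, rho = 210e9, 7850.0       # steel-like modulus (Pa) and density (kg/m^3)
    c = np.sqrt(E / rho)         # bar wave speed
    n, dx = 2000, 1e-4           # 0.2 m bar, 0.1 mm mesh
    dt = 0.9 * dx / c            # CFL-limited stable step (explicit scheme)
    k = E / dx**2 / rho          # stiffness-over-mass factor per node

    u = np.zeros(n)              # displacement
    v = np.zeros(n)              # velocity

    for step in range(3000):
        # Internal acceleration from the central second difference;
        # with a lumped mass this is a pointwise update, no matrix solve.
        a = k * (np.roll(u, -1) - 2 * u + np.roll(u, 1))
        a[0] = a[-1] = 0.0       # crude fixed ends for the sketch
        if step < 50:            # short pulse mimicking laser excitation
            a[1] += 1e9 * np.sin(np.pi * step / 50)
        v += dt * a              # explicit update: MHz content propagates
        u += dt * v              # stably at a fraction of implicit cost
    ```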

  6. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there are urgent needs for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on geographic information system (GIS) platform to bridge this gap. This platform is called Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: data layer (that contains spatial data, socio-economic and environmental data, and analytic data), middle layer (that handles data processing, model management, and GIS operation), and application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  7. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    USGS Publications Warehouse

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.

  8. IBM's Health Analytics and Clinical Decision Support.

    PubMed

    Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W

    2014-08-15

    This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described, with examples of the role of each. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive healthcare transformation; they support it.

  9. Virtual Technologies to Develop Visual-Spatial Ability in Engineering Students

    ERIC Educational Resources Information Center

    Roca-González, Cristina; Martin-Gutierrez, Jorge; García-Dominguez, Melchor; Carrodeguas, Mª del Carmen Mato

    2017-01-01

    The present study assessed a short training experiment to improve spatial abilities using two tools based on virtual technologies: one focused on manipulation of specific geometric virtual pieces, and the other consisting of a virtual orienteering game. The two tools can help improve spatial abilities required for many engineering problem-solving…

  10. Vector solitons in coupled nonlinear Schrödinger equations with spatial stimulated scattering and inhomogeneous dispersion

    NASA Astrophysics Data System (ADS)

    Gromov, E. M.; Malomed, B. A.; Tyutin, V. V.

    2018-01-01

    The dynamics of two-component solitons is studied, analytically and numerically, in the framework of a system of coupled extended nonlinear Schrödinger equations, which incorporate the cross-phase modulation, pseudo-stimulated-Raman-scattering (pseudo-SRS), cross-pseudo-SRS, and spatially inhomogeneous second-order dispersion (SOD). The system models co-propagation of electromagnetic waves with orthogonal polarizations in plasmas. It is shown that the soliton's wavenumber downshift, caused by pseudo-SRS, may be compensated by an upshift, induced by the inhomogeneous SOD, to produce stable stationary two-component solitons. The corresponding approximate analytical solutions for stable solitons are found. Analytical results are well confirmed by their numerical counterparts. Further, the evolution of inputs composed of spatially even and odd components is investigated by means of systematic simulations, which reveal three different outcomes: formation of a breather which keeps opposite parities of the components; splitting into a pair of separating vector solitons; and spreading of the weak odd component into a small-amplitude pedestal with an embedded dark soliton.
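
    A schematic form of such a system, reconstructed from the ingredients the abstract names rather than transcribed from the paper (d(x) is the inhomogeneous SOD coefficient, sigma the XPM coefficient, and mu, nu the self- and cross-pseudo-SRS coefficients):

    ```latex
    \begin{aligned}
    i u_t + \tfrac{1}{2} d(x)\, u_{xx} + \left(|u|^2 + \sigma |v|^2\right) u
      - \mu\, u \left(|u|^2\right)_x - \nu\, u \left(|v|^2\right)_x &= 0,\\
    i v_t + \tfrac{1}{2} d(x)\, v_{xx} + \left(|v|^2 + \sigma |u|^2\right) v
      - \mu\, v \left(|v|^2\right)_x - \nu\, v \left(|u|^2\right)_x &= 0.
    \end{aligned}
    ```

    In this reading, the pseudo-SRS terms produce the wavenumber downshift that the spatially inhomogeneous d(x) can offset to yield stationary solitons.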

  11. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
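
    For illustration, a minimal sketch of one standard disproportionality score, the proportional reporting ratio (PRR), computed from a 2x2 table of drug-event counts; the counts are hypothetical, and the abstract does not specify which score the prototype uses.

    ```python
    # PRR from a 2x2 contingency table of literature-derived drug-event pairs.
    def prr(a, b, c, d):
        """a: reports with drug and event;  b: drug, other events;
           c: other drugs with event;       d: other drugs, other events."""
        return (a / (a + b)) / (c / (c + d))

    print(prr(a=20, b=180, c=40, d=9760))  # PRR >> 1 suggests a safety signal
    ```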

  12. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  13. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less-known techniques that may also prove useful.

  14. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  15. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Unit cost as a function of learning and rate (Womer); learning with forgetting (Benkard), in which learning depreciates over time; discretionary... Analytical Tools for Affordability Analysis, David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, Alexandria, VA.
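
    A minimal sketch of the two cost-model ideas named in the surviving fragments, with hypothetical parameters: a rate-augmented learning curve (Womer) and experience depreciation, i.e., learning with forgetting (Benkard).

    ```python
    def unit_cost(experience, rate, A=100.0, b=0.32, c=0.10):
        """Unit cost falls with cumulative experience and varies with
        production rate (all parameter values hypothetical)."""
        return A * experience ** (-b) * rate ** c

    def step_experience(E_prev, q, delta=0.96):
        """Benkard-style forgetting: past experience depreciates by delta
        each period before this period's output q is added."""
        return delta * E_prev + q

    E, cost_series = 1.0, []
    for q in [10, 10, 0, 0, 10]:        # a production gap lets learning decay
        E = step_experience(E, q)
        cost_series.append(unit_cost(E, rate=max(q, 1)))
    print(cost_series)                  # costs fall, then creep back up
    ```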

  16. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    The analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft is facilitated through the use of AGI’s Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.
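
    For illustration, a minimal sketch of the AHP weighting step (not the thesis's actual criteria or judgments): weights come from the principal eigenvector of a pairwise-comparison matrix, with Saaty's consistency check.

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons among three criteria, Saaty-style:
    # entry [i, j] says how much more important criterion i is than j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)            # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                        # normalized criterion weights

    # Consistency ratio (RI = 0.58 is Saaty's random index for n = 3).
    CI = (vals.real[k] - len(A)) / (len(A) - 1)
    print(w, CI / 0.58)                 # CR < 0.1 is conventionally acceptable
    ```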

  17. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  18. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  19. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    This project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study. Though the study took all three categories into account, most (90%) of the SRA team’s effort focused on identifying and analyzing...

  20. Multi-Scale Approach for Predicting Fish Species Distributions across Coral Reef Seascapes

    PubMed Central

    Pittman, Simon J.; Brown, Kerry A.

    2011-01-01

    Two of the major limitations to effective management of coral reef ecosystems are a lack of information on the spatial distribution of marine species and a paucity of data on the interacting environmental variables that drive distributional patterns. Advances in marine remote sensing, together with the novel integration of landscape ecology and advanced niche modelling techniques provide an unprecedented opportunity to reliably model and map marine species distributions across many kilometres of coral reef ecosystems. We developed a multi-scale approach using three-dimensional seafloor morphology and across-shelf location to predict spatial distributions for five common Caribbean fish species. Seascape topography was quantified from high resolution bathymetry at five spatial scales (5–300 m radii) surrounding fish survey sites. Model performance and map accuracy was assessed for two high performing machine-learning algorithms: Boosted Regression Trees (BRT) and Maximum Entropy Species Distribution Modelling (MaxEnt). The three most important predictors were geographical location across the shelf, followed by a measure of topographic complexity. Predictor contribution differed among species, yet rarely changed across spatial scales. BRT provided ‘outstanding’ model predictions (AUC = >0.9) for three of five fish species. MaxEnt provided ‘outstanding’ model predictions for two of five species, with the remaining three models considered ‘excellent’ (AUC = 0.8–0.9). In contrast, MaxEnt spatial predictions were markedly more accurate (92% map accuracy) than BRT (68% map accuracy). We demonstrate that reliable spatial predictions for a range of key fish species can be achieved by modelling the interaction between the geographical location across the shelf and the topographic heterogeneity of seafloor structure. This multi-scale, analytic approach is an important new cost-effective tool to accurately delineate essential fish habitat and support conservation prioritization in marine protected area design, zoning in marine spatial planning, and ecosystem-based fisheries management. PMID:21637787

  1. Multi-scale approach for predicting fish species distributions across coral reef seascapes.

    PubMed

    Pittman, Simon J; Brown, Kerry A

    2011-01-01

    Two of the major limitations to effective management of coral reef ecosystems are a lack of information on the spatial distribution of marine species and a paucity of data on the interacting environmental variables that drive distributional patterns. Advances in marine remote sensing, together with the novel integration of landscape ecology and advanced niche modelling techniques provide an unprecedented opportunity to reliably model and map marine species distributions across many kilometres of coral reef ecosystems. We developed a multi-scale approach using three-dimensional seafloor morphology and across-shelf location to predict spatial distributions for five common Caribbean fish species. Seascape topography was quantified from high resolution bathymetry at five spatial scales (5-300 m radii) surrounding fish survey sites. Model performance and map accuracy was assessed for two high performing machine-learning algorithms: Boosted Regression Trees (BRT) and Maximum Entropy Species Distribution Modelling (MaxEnt). The three most important predictors were geographical location across the shelf, followed by a measure of topographic complexity. Predictor contribution differed among species, yet rarely changed across spatial scales. BRT provided 'outstanding' model predictions (AUC = >0.9) for three of five fish species. MaxEnt provided 'outstanding' model predictions for two of five species, with the remaining three models considered 'excellent' (AUC = 0.8-0.9). In contrast, MaxEnt spatial predictions were markedly more accurate (92% map accuracy) than BRT (68% map accuracy). We demonstrate that reliable spatial predictions for a range of key fish species can be achieved by modelling the interaction between the geographical location across the shelf and the topographic heterogeneity of seafloor structure. This multi-scale, analytic approach is an important new cost-effective tool to accurately delineate essential fish habitat and support conservation prioritization in marine protected area design, zoning in marine spatial planning, and ecosystem-based fisheries management.
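
    A minimal sketch of the boosted-trees-plus-AUC workflow, using scikit-learn's gradient boosting as a stand-in for the authors' BRT setup; the predictors and presence/absence data are synthetic.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Columns stand in for shelf position and topographic complexity
    # measured at two spatial scales.
    X = rng.normal(size=(500, 3))
    p = 1 / (1 + np.exp(-(1.5 * X[:, 0] + 0.8 * X[:, 1])))
    y = rng.random(500) < p              # synthetic presence/absence

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                     max_depth=3).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, brt.predict_proba(X_te)[:, 1])
    print(f"AUC = {auc:.2f}")            # AUC > 0.9 would rate 'outstanding'
    ```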

  2. Genome wide approaches to identify protein-DNA interactions.

    PubMed

    Ma, Tao; Ye, Zhenqing; Wang, Liguo

    2018-05-29

    Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques for studying genome-wide in vivo protein-DNA interaction. Owing to the rapid development of next-generation sequencing technology, array-based ChIP-chip is deprecated and ChIP-seq has become the most widely used technique for identifying transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to the single nucleotide. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq, and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, and thus each inherently includes its own set of statistical assumptions and biases. Choosing the most appropriate analytic program for a given experiment therefore needs careful consideration. Moreover, most programs have only a command line interface, so their installation and usage require basic computational expertise in Unix/Linux.
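
    To make the statistical core concrete, a minimal sketch of Poisson-based enrichment detection in read-count windows, the idea underlying many ChIP-seq peak callers, though not the algorithm of any particular tool; the counts are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import poisson

    reads = np.array([5, 4, 6, 40, 38, 7, 5, 4])  # reads per 200-bp window
    background = reads.mean()                     # crude background rate

    # One-sided Poisson p-value per window: P(X >= observed | background).
    pvals = poisson.sf(reads - 1, background)
    peaks = np.where(pvals < 1e-5)[0]
    print(peaks, pvals[peaks])                    # windows 3 and 4 stand out
    ```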

  3. Confocal Raman microscope mapping as a tool to describe different mineral and organic phases at high spatial resolution within marine biogenic carbonates: case study on Nerita undata (Gastropoda, Neritopsina)

    NASA Astrophysics Data System (ADS)

    Nehrke, G.; Nouet, J.

    2011-12-01

    Marine biogenic carbonates formed by invertebrates (e.g. corals and mollusks) represent complex composites of one or more mineral phases and organic molecules. This complexity ranges from the macroscopic structures observed with the naked eye down to sub-micrometric structures revealed only by micro-analytical techniques. Understanding to what extent and how organisms can control the formation of these structures requires that the mineral and organic phases be identified and their spatial distributions related. Here we demonstrate the capability of confocal Raman microscopy applied to cross sections of a shell of Nerita undata to describe the distribution of calcite and aragonite, including their crystallographic orientation, with high lateral resolution (~300 nm). Moreover, the spatial distribution of functional groups of organic compounds can be simultaneously acquired, allowing them to be related specifically to the observed microstructures. The data presented in this case study highlight the possible new contributions of this method to the description of modalities of Nerita undata shell formation, and what could be expected of its application to other marine biogenic carbonates. Localization of areas of interest would also allow further investigations using more localized methods, such as TEM, which would provide complementary information on the relation between organic molecules and the crystal lattice.

  4. Confocal Raman microscopy as a tool to describe different mineral and organic phases at high spatial resolution within marine biogenic carbonates: case study on Nerita undata (Gastropoda, Neritopsina)

    NASA Astrophysics Data System (ADS)

    Nehrke, G.; Nouet, J.

    2011-06-01

    Marine biogenic carbonates formed by invertebrates (e.g. corals and mollusk shells) represent complex composites of one or more mineral phases and organic molecules. This complexity ranges from the macroscopic structures observed with the naked eye down to sub-micrometric structures revealed only by micro-analytical techniques. Understanding to what extent and how organisms can control the formation of these structures requires that the mineral and organic phases be identified and their spatial distributions related. Here we demonstrate the capability of confocal Raman microscopy applied to cross sections of a shell of Nerita undata to describe the distribution of calcite and aragonite, including their crystallographic orientation, with high lateral resolution (∼300 nm). Moreover, the spatial distribution of functional groups of organic compounds can be simultaneously acquired, allowing them to be related specifically to the observed microstructures. The data presented in this case study highlight the possible new contributions of this method to the description of modalities of Nerita undata shell formation, and what could be expected of its application to other marine biogenic carbonates. Localization of areas of interest would also allow further investigations using more localized methods, such as TEM, which would provide complementary information on the relation between organic molecules and the crystallographic lattice.

  5. A spatially distributed model for the assessment of land use impacts on stream temperature in small urban watersheds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Ning; Yearsley, John; Voisin, Nathalie

    2015-05-15

    Stream temperatures in urban watersheds are influenced to a high degree by anthropogenic impacts related to changes in landscape, stream channel morphology, and climate. These impacts can occur at small time and length scales, hence require analytical tools that consider the influence of the hydrologic regime, energy fluxes, topography, channel morphology, and near-stream vegetation distribution. Here we describe a modeling system that integrates the Distributed Hydrologic Soil Vegetation Model, DHSVM, with the semi-Lagrangian stream temperature model RBM, which has the capability to simulate the hydrology and water temperature of urban streams at high time and space resolutions, as well as a representation of the effects of riparian shading on stream energetics. We demonstrate the modeling system through application to the Mercer Creek watershed, a small urban catchment near Bellevue, Washington. The results suggest that the model is able both to produce realistic streamflow predictions at fine temporal and spatial scales, and to provide spatially distributed water temperature predictions that are consistent with observations throughout a complex stream network. We use the modeling construct to characterize impacts of land use change and near-stream vegetation change on stream temperature throughout the Mercer Creek system. We then explore the sensitivity of stream temperature to land use changes and modifications in vegetation along the riparian corridor.
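
    A minimal sketch of the kind of single-reach energy-balance update a stream temperature model performs, including a riparian shading factor; flux values and channel geometry are hypothetical, not DHSVM-RBM code.

    ```python
    RHO_W, CP_W = 1000.0, 4184.0   # water density (kg/m^3), heat capacity (J/kg K)

    def temp_update(T, net_flux_wm2, depth_m, dt_s, shade_frac=0.0):
        """Advance reach temperature (deg C) given net surface heat flux
        (W/m^2), reduced by riparian shading, over a column of given depth."""
        absorbed = net_flux_wm2 * (1.0 - shade_frac)
        return T + absorbed * dt_s / (RHO_W * CP_W * depth_m)

    # One hour of strong afternoon heating on a shallow urban reach,
    # with and without 70% riparian shading.
    print(temp_update(18.0, 400.0, 0.3, 3600))                  # unshaded
    print(temp_update(18.0, 400.0, 0.3, 3600, shade_frac=0.7))  # shaded
    ```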

  6. [Spatial variation of soil properties and quality evaluation for arable Ustic Cambosols in central Henan Province].

    PubMed

    Zhang, Xue-Lei; Feng, Wan-Wan; Zhong, Guo-Min

    2011-01-01

    A GIS-based 500 m x 500 m soil sampling point arrangement of 248 points was set out at Wenshu Town of Yuzhou County in central Henan Province, where typical Ustic Cambosols are located. A spatial database was established from digital soil data, from which the latitude and longitude of each sampling point were produced for GPS guidance in the field. Soil samples (0-20 cm) were collected from 202 points; bulk density measurements were conducted on 34 randomly selected points, and the ten soil properties used as factors for soil quality assessment, namely organic matter, available K, available P, pH, total N, total P, soil texture, cation exchange capacity (CEC), slowly available K, and bulk density, were analyzed for the remaining points. The soil property data were checked with statistical tools and then classified against standard criteria from China and abroad. Factor weights were assigned by the analytic hierarchy process (AHP) method, and the spatial variation of the ten major soil properties, as well as the soil quality classes and the areas they occupy, were mapped by Kriging interpolation. The results showed that the arable Ustic Cambosols in the study area are of good quality: over 95% ranked in the good and medium classes and less than 5% in the poor class.
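
    A minimal sketch of the AHP-weighted soil quality index implied by the abstract; the weights and membership scores below are hypothetical stand-ins for the paper's ten-property assessment.

    ```python
    weights = {      # hypothetical AHP-derived factor weights (sum to 1)
        "organic_matter": 0.22, "total_N": 0.16, "available_P": 0.12,
        "available_K": 0.10, "CEC": 0.10, "pH": 0.08, "texture": 0.08,
        "slowly_avail_K": 0.06, "total_P": 0.05, "bulk_density": 0.03,
    }

    def quality_index(scores):
        """Integrated soil quality index: weighted sum of per-property
        membership scores scaled to [0, 1]."""
        return sum(weights[k] * scores[k] for k in weights)

    sample = {k: 0.7 for k in weights}   # hypothetical scores for one point
    sample["pH"] = 0.9
    print(round(quality_index(sample), 3))
    ```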

  7. Identifying environmental drivers of insect phenology across space and time: Culicoides in Scotland as a case study.

    PubMed

    Searle, K R; Blackwell, A; Falconer, D; Sullivan, M; Butler, A; Purse, B V

    2013-04-01

    Interpreting spatial patterns in the abundance of species over time is a fundamental cornerstone of ecological research. For many species, this type of analysis is hampered by datasets that contain a large proportion of zeros, and data that are overdispersed and spatially autocorrelated. This is particularly true for insects, for which abundance data can fluctuate from zero to many thousands in the space of weeks. Increasingly, an understanding of the ways in which environmental variation drives spatial and temporal patterns in the distribution, abundance and phenology of insects is required for management of pests and vector-borne diseases. In this study, we combine the use of smoothing techniques and generalised linear mixed models to relate environmental drivers to key phenological patterns of two species of biting midges, Culicoides pulicaris and C. impunctatus, of which C. pulicaris has been implicated in transmission of bluetongue in Europe. In so doing, we demonstrate analytical tools for linking the phenology of species with key environmental drivers, despite using a relatively small dataset containing overdispersed and zero-inflated data. We demonstrate the importance of landcover and climatic variables in determining the seasonal abundance of these two vector species, and highlight the need for more empirical data on the effects of temperature and precipitation on the life history traits of palearctic Culicoides spp. in Europe.
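
    As a simplified stand-in for the paper's mixed models, a minimal sketch of fitting an overdispersed, zero-heavy count response to climatic and landcover drivers with a negative binomial GLM; the data are synthetic.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 300
    df = pd.DataFrame({
        "temperature": rng.normal(12, 4, n),
        "precipitation": rng.gamma(2, 10, n),
        "landcover": rng.choice(["bog", "pasture", "woodland"], n),
    })
    mu = np.exp(0.5 + 0.15 * df.temperature - 0.01 * df.precipitation)
    # Geometric draws (NB with n=1) give overdispersion and many zeros.
    df["count"] = rng.negative_binomial(1, 1 / (1 + mu))

    model = smf.glm("count ~ temperature + precipitation + C(landcover)",
                    data=df,
                    family=sm.families.NegativeBinomial(alpha=1.0)).fit()
    print(model.summary())
    ```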

  8. Collagen Organization in Facet Capsular Ligaments Varies With Spinal Region and With Ligament Deformation.

    PubMed

    Ban, Ehsan; Zhang, Sijia; Zarei, Vahhab; Barocas, Victor H; Winkelstein, Beth A; Picu, Catalin R

    2017-07-01

    The spinal facet capsular ligament (FCL) is primarily comprised of heterogeneous arrangements of collagen fibers. This complex fibrous structure and its evolution under loading play a critical role in determining the mechanical behavior of the FCL. A lack of analytical tools to characterize the spatial anisotropy and heterogeneity of the FCL's microstructure has limited the current understanding of its structure-function relationships. Here, the collagen organization was characterized using spatial correlation analysis of the FCL's optically obtained fiber orientation field. FCLs from the cervical and lumbar spinal regions were characterized in terms of their structure, as was the reorganization of collagen in stretched cervical FCLs. Higher degrees of intra- and intersample heterogeneity were found in cervical FCLs than in lumbar specimens. In the cervical FCLs, heterogeneity was manifested in the form of curvy patterns formed by collections of collagen fibers or fiber bundles. Tensile stretch, a common injury mechanism for the cervical FCL, significantly increased the spatial correlation length in the stretch direction, indicating an elongation of the observed structural features. Finally, an affine estimation for the change of correlation length under loading was performed which gave predictions very similar to the actual values. These findings provide structural insights for multiscale mechanical analyses of the FCLs from various spinal regions and also suggest methods for quantitative characterization of complex tissue patterns.
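
    A minimal sketch of extracting a correlation length from a fiber orientation field, in the spirit of the spatial correlation analysis described; the synthetic 1D angle profile stands in for the optically measured field.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    theta = np.cumsum(rng.normal(0, 0.08, 400))  # synthetic angle profile (rad)

    def orientation_corr(theta, lag):
        """Correlation of fiber directions at a given lag; the factor of 2
        makes the measure insensitive to head/tail ambiguity of fibers."""
        return np.mean(np.cos(2 * (theta[lag:] - theta[:-lag])))

    lags = np.arange(1, 100)
    corr = np.array([orientation_corr(theta, L) for L in lags])
    # Correlation length: first lag where correlation drops below 1/e.
    xi = lags[np.argmax(corr < 1 / np.e)]
    print(xi)
    ```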

  9. Spatial Representations in Older Adults are Not Modified by Action: Evidence from Tool Use

    PubMed Central

    Costello, Matthew C.; Bloesch, Emily K.; Davoli, Christopher C.; Panting, Nicholas D.; Abrams, Richard A.; Brockmole, James R.

    2015-01-01

    Theories of embodied perception hold that the visual system is calibrated by both the body schema and the action system, allowing for adaptive action-perception responses. One example of embodied perception involves the effects of tool-use on distance perception, in which wielding a tool with the intention to act upon a target appears to bring that object closer. This tool-based spatial compression (i.e., tool-use effect) has been studied exclusively with younger adults, but it is unknown whether the phenomenon exists with older adults. In this study, we examined the effects of tool use on distance perception in younger and older adults in two experiments. In Experiment 1, younger and older adults estimated the distances of targets just beyond peripersonal space while either wielding a tool or pointing with the hand. Younger adults, but not older adults, estimated targets to be closer after reaching with a tool. In Experiment 2, younger and older adults estimated the distance to remote targets while using either a baton or laser pointer. Younger adults displayed spatial compression with the laser pointer compared to the baton, although older adults did not. Taken together, these findings indicate a generalized absence of the tool-use effect in older adults during distance estimation suggesting that the visuomotor system of older adults does not remap from peripersonal to extrapersonal spatial representations during tool use. PMID:26052886

  10. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  11. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  12. Network-scale spatial and temporal variation in Chinook salmon (Oncorhynchus tshawytscha) redd distributions: patterns inferred from spatially continuous replicate surveys

    Treesearch

    Daniel J. Isaak; Russell F. Thurow

    2006-01-01

    Spatially continuous sampling designs, when temporally replicated, provide analytical flexibility and are unmatched in their ability to provide a dynamic system view. We have compiled such a data set by georeferencing the network-scale distribution of Chinook salmon (Oncorhynchus tshawytscha) redds across a large wilderness basin (7330 km2) in...

  13. Application of Characterization, Modeling, and Analytics Towards Understanding Process Structure Linkages in Metallic 3D Printing (Postprint)

    DTIC Science & Technology

    2017-08-01

    ...of metallic additive manufacturing processes, and show that combining experimental data with modelling and advanced data processing and analytics methods will accelerate that... ...geometries, we develop a methodology that couples experimental data and modelling to convert the scan paths into spatially resolved local thermal histories.

  14. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
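
    A minimal sketch of building and cross-validating such a predictive model, using partial least squares regression as a plausible stand-in for the paper's (unspecified) method; the CT features and salt contents are synthetic.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(2)
    X = rng.normal(size=(120, 8))    # CT features, e.g. attenuation histograms
    salt = 2.0 + 0.8 * X[:, 0] + rng.normal(0, 0.4, 120)   # synthetic % salt

    pls = PLSRegression(n_components=3)
    pred = cross_val_predict(pls, X, salt, cv=10).ravel()
    rmsecv = float(np.sqrt(np.mean((pred - salt) ** 2)))
    print(f"RMSECV = {rmsecv:.3f}")  # cf. 0.393 reported for salt content
    ```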

  15. Entanglement spectrum degeneracy and the Cardy formula in 1+1 dimensional conformal field theories

    NASA Astrophysics Data System (ADS)

    Alba, Vincenzo; Calabrese, Pasquale; Tonni, Erik

    2018-01-01

    We investigate the effect of a global degeneracy in the distribution of the entanglement spectrum in conformal field theories in one spatial dimension. We relate the recently found universal expression for the entanglement Hamiltonian to the distribution of the entanglement spectrum. The main tool to establish this connection is the Cardy formula. It turns out that the Affleck-Ludwig non-integer degeneracy, appearing because of the boundary conditions induced at the entangling surface, can be directly read from the entanglement spectrum distribution. We also clarify the effect of the non-integer degeneracy on the spectrum of the partial transpose, which is the central object for quantifying the entanglement in mixed states. We show that the exact knowledge of the entanglement spectrum in some integrable spin-chains provides strong analytical evidences corroborating our results.
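
    For reference, the two standard results the abstract connects, written schematically (not transcribed from the paper): the Cardy density of states for a CFT of central charge c, and the Calabrese-Lefevre mean number of entanglement eigenvalues above a value lambda,

    ```latex
    \rho(\Delta) \sim \exp\!\left(2\pi\sqrt{\frac{c\,\Delta}{6}}\right),
    \qquad
    n(\lambda) = I_0\!\left(2\sqrt{\,b\,\ln(\lambda_{\max}/\lambda)\,}\right),
    \quad b = -\ln\lambda_{\max},
    ```

    where I_0 is the modified Bessel function; per the abstract, the Affleck-Ludwig non-integer boundary degeneracy can be read directly from this distribution.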

  16. Automated Interpretation and Extraction of Topographic Information from Time of Flight Secondary Ion Mass Spectrometry Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ievlev, Anton V.; Belianinov, Alexei; Jesse, Stephen

    Time of flight secondary ion mass spectrometry (ToF SIMS) is one of the most powerful characterization tools, allowing imaging of the chemical properties of various systems and materials. It allows precise studies of chemical composition with sub-100-nm lateral and nanometer depth spatial resolution. However, comprehensive interpretation of ToF SIMS results is challenging because of the data volume and its multidimensionality. Furthermore, investigation of samples with pronounced topographical features is complicated by the spectral shift. In this work we developed an approach for comprehensive ToF SIMS data interpretation based on data analytics and on automated extraction of the sample topography from the time-of-flight shift. We further applied this approach to investigate the correlation between biological function and chemical composition in Arabidopsis roots.

  17. Automated Interpretation and Extraction of Topographic Information from Time of Flight Secondary Ion Mass Spectrometry Data

    DOE PAGES

    Ievlev, Anton V.; Belianinov, Alexei; Jesse, Stephen; ...

    2017-12-06

    Time of flight secondary ion mass spectrometry (ToF SIMS) is one of the most powerful characterization tools, allowing imaging of the chemical properties of various systems and materials. It allows precise studies of chemical composition with sub-100-nm lateral and nanometer depth spatial resolution. However, comprehensive interpretation of ToF SIMS results is challenging because of the data volume and its multidimensionality. Furthermore, investigations of samples with pronounced topographical features are complicated by the spectral shift. In this work we developed an approach for comprehensive ToF SIMS data interpretation based on data analytics and automated extraction of the sample topography from the time-of-flight shift. We further applied this approach to investigate the correlation between biological function and chemical composition in Arabidopsis roots.

  18. Mass Spectrometry Imaging and GC-MS Profiling of the Mammalian Peripheral Sensory-Motor Circuit

    NASA Astrophysics Data System (ADS)

    Rubakhin, Stanislav S.; Ulanov, Alexander; Sweedler, Jonathan V.

    2015-06-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry imaging (MSI) has evolved to become an effective discovery tool in science and clinical diagnostics. Here, chemical imaging approaches are applied to well-defined regions of the mammalian peripheral sensory-motor system, including the dorsal root ganglia (DRG) and adjacent nerves. By combining several MSI approaches, analyte coverage is increased and 195 distinct molecular features are observed. Principal component analysis suggests three chemically different regions within the sensory-motor system, with the DRG and adjacent nerve regions being the most distinct. Investigation of these regions using gas chromatography-mass spectrometry corroborates these findings and reveals important metabolic markers related to the observed differences. The heterogeneity of the structurally, physiologically, and functionally connected regions demonstrates the intricate chemical and spatial regulation of their chemical composition.

  19. Frozen Stiff: Cartographic Design and Permafrost Mapping

    NASA Astrophysics Data System (ADS)

    Nelson, F. E.; Li, J.; Nyland, K. E.

    2016-12-01

    Maps are the primary vehicle used to communicate geographical relationships. Ironically, interest in the formal practice of cartography, the art and science of geographic visualization, has fallen significantly during a period when the sophistication and availability of GIS software has increased dramatically. Although the number of geographically oriented permafrost studies has increased significantly in recent years, little discussion about competing visualization strategies, map accuracy, and the psychophysical impact of cartographic design is evident in geocryological literature. Failure to use the full potential of the tools and techniques that contemporary cartographic and spatial-analytic theory makes possible affects our ability to effectively and accurately communicate the impacts and hazards associated with thawing permafrost, particularly in the context of global climate change. This presentation examines recent permafrost studies involving primarily small-scale (large area) mapping, and suggests cartographic strategies for rectifying existing problems.

  20. High-Speed, Three Dimensional Object Composition Mapping Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishikawa, M Y

    2001-02-14

    This document overviews an entirely new approach to determining the composition--the chemical-elemental, isotopic and molecular make-up--of complex, highly structured objects, with microscopic spatial resolution in all 3 dimensions. The front cover depicts the new type of pulsed laser system at the heart of this novel technology under adjustment by Alexis Wynne, and schematically indicates two of its early uses: swiftly analyzing the 3-D composition-governed structure of a transistor circuit with both optical and mass-spectrometric detectors, and probing fossilized dinosaur and turtle bones at high speed by optical detection means. Studying the composition-cued 3-D micro-structures of advanced composite materials and the microscopic-scale composition-texture of biological tissues are two near-term examples of the rich spectrum of novel applications enabled by this field-opening analytic tool-set.

  1. Bath-induced correlations in an infinite-dimensional Hilbert space

    NASA Astrophysics Data System (ADS)

    Nizama, Marco; Cáceres, Manuel O.

    2017-09-01

    Quantum correlations between two free spinless dissipative distinguishable particles (interacting with a thermal bath) are studied analytically using the quantum master equation and tools of quantum information. Bath-induced coherence and correlations in an infinite-dimensional Hilbert space are shown. We show that for temperature T > 0 the time-evolution of the reduced density matrix cannot be written as the direct product of two independent particles. We have found a time-scale that characterizes the time when the bath-induced coherence is maximum before being wiped out by dissipation (purity, relative entropy, spatial dispersion, and mirror correlations are studied). The Wigner function associated with the Wannier lattice (where the dissipative quantum walks move) is studied as an indirect measure of the induced correlations among particles. We have supported the quantum character of the correlations by analyzing the geometric quantum discord.

  2. Spatial Moment Equations for a Groundwater Plume with Degradation and Rate-Limited Sorption

    EPA Science Inventory

    In this note, we analytically derive the solution for the spatial moments of groundwater solute concentration distributions simulated by a one-dimensional model that assumes advective-dispersive transport with first-order degradation and rate-limited sorption. Sorption kinetics...
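
    Since the record is truncated, the standard definitions of the quantities involved may help: for a concentration profile C(x, t), the spatial moments and the derived plume statistics are

    $$ m_n(t) = \int_{-\infty}^{\infty} x^{n}\, C(x,t)\, dx, \qquad \mu(t) = \frac{m_1}{m_0}, \qquad \sigma^2(t) = \frac{m_2}{m_0} - \mu^2, $$

    so that m_0 tracks total mass (decaying under first-order degradation), μ the plume centroid, and σ² the plume spreading, whose growth rate reflects dispersion and rate-limited sorption.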

  3. Urban Health Indicator Tools of the Physical Environment: a Systematic Review.

    PubMed

    Pineo, Helen; Glonti, Ketevan; Rutter, Harry; Zimmermann, Nici; Wilkinson, Paul; Davies, Michael

    2018-04-16

    Urban health indicator (UHI) tools provide evidence about the health impacts of the physical urban environment which can be used in built environment policy and decision-making. Where UHI tools provide data at the neighborhood (and lower) scale, they can provide valuable information about health inequalities and environmental deprivation. This review performs a census of UHI tools and explores their nature and characteristics (including how they represent, simplify or address complex systems) to increase understanding of their potential use by municipal built environment policy and decision-makers. We searched seven bibliographic databases, four key journals and six practitioner websites, and conducted Google searches between January 27, 2016 and February 24, 2016 for UHI tools. We extracted data from primary studies and online indicator systems. We included 198 documents which identified 145 UHI tools comprising 8006 indicators, from which we developed a taxonomy. Our taxonomy classifies the significant diversity of UHI tools with respect to topic, spatial scale, format, scope and purpose. The proportions of UHI tools which measure data at the neighborhood and lower scale, and present data via interactive maps, have both increased over time. This is particularly relevant to built environment policy and decision-makers, reflects growing analytical capability, and offers the potential for improved understanding of the complexity of influences on urban health (an aspect noted as a particular challenge by some indicator producers). The relation between urban health indicators and health impacts attributable to modifiable environmental characteristics is often indirect. Furthermore, the use of UHI tools in policy and decision-making appears to be limited, raising questions about the continued development of such tools by multiple organisations duplicating scarce resources. Further research is needed to understand the requirements of built environment policy and decision-makers, public health professionals and local communities regarding the form and presentation of indicators which support their varied objectives.

  4. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  5. The "Forgotten" Pseudomomenta and Gauge Changes in Generalized Landau Level Problems: Spatially Nonuniform Magnetic and Temporally Varying Electric Fields

    NASA Astrophysics Data System (ADS)

    Konstantinou, Georgios; Moulopoulos, Konstantinos

    2017-05-01

    By perceiving gauge invariance as an analytical tool in order to get insight into the states of the "generalized Landau problem" (a charged quantum particle moving inside a magnetic, and possibly electric, field), and motivated by an early article that correctly warns against a naive use of gauge transformation procedures in the usual Landau problem (i.e. with the magnetic field being static and uniform), we first show how to bypass the complications pointed out in that article by solving the problem in full generality through gauge transformation techniques in a more appropriate manner. Our solution provides in simple and closed analytical forms all Landau Level-wavefunctions without the need to specify a particular vector potential. This we do by proper handling of the so-called pseudomomentum vector K (or of a quantity that we term pseudo-angular momentum L_z), a method that is crucially different from the old warning argument, but also from standard treatments in textbooks and in research literature (where the usual Landau-wavefunctions are employed, labeled with canonical momenta quantum numbers). Most importantly, we go further by showing that a similar procedure can be followed in the more difficult case of spatially-nonuniform magnetic fields: in such a case we define K and L_z as plausible generalizations of the previous ordinary case, namely as appropriate line integrals of the inhomogeneous magnetic field, our method providing closed analytical expressions for all stationary state wavefunctions in an easy manner and in a broad set of geometries and gauges. It can thus be viewed as complementary to the few existing works on inhomogeneous magnetic fields, which have so far mostly focused on determining the energy eigenvalues rather than the corresponding eigenkets (and have claimed that, even in the simplest cases, it is not possible to obtain the associated wavefunctions in closed form). The analytical forms derived here for these wavefunctions enable us to also provide explicit Berry's phase calculations and a quick study of their connection to probability currents and to some recent interesting issues in elementary Quantum Mechanics and Condensed Matter Physics. As an added feature, we also show how the possible presence of an additional electric field can be treated through a further generalization of pseudomomenta and their proper handling.
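
    As a concrete anchor for the ordinary (static, uniform field) case discussed above, the conserved pseudomomentum of a particle of charge q is the textbook quantity

    $$ \vec{K} = \vec{p} - q\vec{A} + q\,\vec{B} \times \vec{r}, \qquad \frac{d\vec{K}}{dt} = 0, \qquad [K_x, K_y] = -\,i\hbar\, q B_z, $$

    whose components commute with the Hamiltonian but not with each other; the paper's generalization replaces this with appropriate line integrals of the inhomogeneous field.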

  6. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  7. Web-based GIS: the vector-borne disease airline importation risk (VBD-AIR) tool

    PubMed Central

    2012-01-01

    Background Over the past century, the size and complexity of the air travel network has increased dramatically. Nowadays, there are 29.6 million scheduled flights per year and around 2.7 billion passengers are transported annually. The rapid expansion of the network increasingly connects regions of endemic vector-borne disease with the rest of the world, resulting in challenges to health systems worldwide in terms of vector-borne pathogen importation and disease vector invasion events. Here we describe the development of a user-friendly Web-based GIS tool: the Vector-Borne Disease Airline Importation Risk Tool (VBD-AIR), to help better define the roles of airports and airlines in the transmission and spread of vector-borne diseases. Methods Spatial datasets on modeled global disease and vector distributions, as well as climatic and air network traffic data were assembled. These were combined to derive relative risk metrics via air travel for imported infections, imported vectors and onward transmission, and incorporated into a three-tier server architecture in a Model-View-Controller framework with distributed GIS components. A user-friendly web-portal was built that enables dynamic querying of the spatial databases to provide relevant information. Results The VBD-AIR tool constructed enables the user to explore the interrelationships among modeled global distributions of vector-borne infectious diseases (malaria, dengue, yellow fever and chikungunya) and international air service routes to quantify seasonally changing risks of vector and vector-borne disease importation and spread by air travel, forming an evidence base to help plan mitigation strategies. The VBD-AIR tool is available at http://www.vbd-air.com. Conclusions VBD-AIR supports a data flow that generates analytical results from disparate but complementary datasets into an organized cartographical presentation on a web map for the assessment of vector-borne disease movements on the air travel network. The framework built provides a flexible and robust informatics infrastructure by separating the modules of functionality through an ontological model for vector-borne disease. The VBD-AIR tool is designed as an evidence base for visualizing the risks of vector-borne disease by air travel for a wide range of users, including planners and decision makers based in state and local government, and in particular, those at international and domestic airports tasked with planning for health risks and allocating limited resources. PMID:22892045

  8. Web-based GIS: the vector-borne disease airline importation risk (VBD-AIR) tool.

    PubMed

    Huang, Zhuojie; Das, Anirrudha; Qiu, Youliang; Tatem, Andrew J

    2012-08-14

    Over the past century, the size and complexity of the air travel network has increased dramatically. Nowadays, there are 29.6 million scheduled flights per year and around 2.7 billion passengers are transported annually. The rapid expansion of the network increasingly connects regions of endemic vector-borne disease with the rest of the world, resulting in challenges to health systems worldwide in terms of vector-borne pathogen importation and disease vector invasion events. Here we describe the development of a user-friendly Web-based GIS tool: the Vector-Borne Disease Airline Importation Risk Tool (VBD-AIR), to help better define the roles of airports and airlines in the transmission and spread of vector-borne diseases. Spatial datasets on modeled global disease and vector distributions, as well as climatic and air network traffic data were assembled. These were combined to derive relative risk metrics via air travel for imported infections, imported vectors and onward transmission, and incorporated into a three-tier server architecture in a Model-View-Controller framework with distributed GIS components. A user-friendly web-portal was built that enables dynamic querying of the spatial databases to provide relevant information. The VBD-AIR tool constructed enables the user to explore the interrelationships among modeled global distributions of vector-borne infectious diseases (malaria, dengue, yellow fever and chikungunya) and international air service routes to quantify seasonally changing risks of vector and vector-borne disease importation and spread by air travel, forming an evidence base to help plan mitigation strategies. The VBD-AIR tool is available at http://www.vbd-air.com. VBD-AIR supports a data flow that generates analytical results from disparate but complementary datasets into an organized cartographical presentation on a web map for the assessment of vector-borne disease movements on the air travel network. The framework built provides a flexible and robust informatics infrastructure by separating the modules of functionality through an ontological model for vector-borne disease. The VBD-AIR tool is designed as an evidence base for visualizing the risks of vector-borne disease by air travel for a wide range of users, including planners and decision makers based in state and local government, and in particular, those at international and domestic airports tasked with planning for health risks and allocating limited resources.

  9. Analytical solutions for a soil vapor extraction model that incorporates gas phase dispersion and molecular diffusion

    NASA Astrophysics Data System (ADS)

    Huang, Junqi; Goltz, Mark N.

    2017-06-01

    The equations describing radial advective/dispersive transport to an extraction well in a porous medium typically neglect molecular diffusion in order to greatly simplify their solution. While this simplification is appropriate for simulating transport in the saturated zone, it can result in significant errors when modeling gas phase transport in the vadose zone, as when simulating a soil vapor extraction (SVE) system to remediate vadose zone contamination. A new analytical solution for the equations describing radial gas phase transport of a sorbing contaminant to an extraction well is presented. The equations model advection, dispersion (including both mechanical dispersion and molecular diffusion), and rate-limited mass transfer of dissolved, separate phase, and sorbed contaminants into the gas phase. The model equations are solved analytically by using the Laplace transform with respect to time. The solutions are represented by confluent hypergeometric functions in the Laplace domain. The Laplace domain solutions are then evaluated using a numerical Laplace inversion algorithm. The solutions can be used to simulate the spatial distribution and the temporal evolution of contaminant concentrations during operation of a soil vapor extraction well. Results of model simulations show that the effect of gas phase molecular diffusion upon concentrations at the extraction well is relatively small, although the effect upon the distribution of concentrations in space is significant. This study provides a tool that can be useful in designing SVE remediation strategies, as well as in verifying numerical models used to simulate SVE system performance.
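
    The record does not name the inversion algorithm; a common choice for this class of problem is the Gaver-Stehfest method, sketched here for illustration on a transform whose inverse is known:

    ```python
    import math

    def stehfest_coeffs(N):
        """Gaver-Stehfest weights V_k (N must be even)."""
        V = []
        for k in range(1, N + 1):
            s = 0.0
            for j in range((k + 1) // 2, min(k, N // 2) + 1):
                s += (j ** (N // 2) * math.factorial(2 * j)) / (
                    math.factorial(N // 2 - j) * math.factorial(j)
                    * math.factorial(j - 1) * math.factorial(k - j)
                    * math.factorial(2 * j - k))
            V.append((-1) ** (k + N // 2) * s)
        return V

    def invert_laplace(F, t, N=12):
        """Approximate f(t) from its Laplace transform F(s)."""
        a = math.log(2.0) / t
        V = stehfest_coeffs(N)
        return a * sum(V[k - 1] * F(k * a) for k in range(1, N + 1))

    # check: F(s) = 1/(s + 1) inverts to f(t) = exp(-t)
    print(invert_laplace(lambda s: 1.0 / (s + 1.0), t=1.0))  # ~0.3679
    ```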

  10. Learner Dashboards a Double-Edged Sword? Students' Sense-Making of a Collaborative Critical Reading and Learning Analytics Environment for Fostering 21st-Century Literacies

    ERIC Educational Resources Information Center

    Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon

    2017-01-01

    The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…

  11. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  12. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  13. Quantifying the Temporal Inequality of Nutrient Loads with a Novel Metric

    NASA Astrophysics Data System (ADS)

    Gall, H. E.; Schultz, D.; Rao, P. S.; Jawitz, J. W.; Royer, M.

    2015-12-01

    Inequality is an emergent property of many complex systems. For a given series of stochastic events, some events generate a disproportionately large contribution to system responses compared to other events. In catchments, such responses cause streamflow and solute loads to exhibit strong temporal inequality, with the vast majority of discharge and solute loads exported during short periods of time during which high-flow events occur. These periods are commonly referred to as "hot moments". Although this temporal inequality is widely recognized, there is currently no uniform metric for assessing it. We used a novel application of Lorenz inequality, a method commonly used in economics to quantify income inequality, to quantify the spatial and temporal inequality of streamflow and nutrient (nitrogen and phosphorus) loads exported to the Chesapeake Bay. Lorenz inequality and the corresponding Gini coefficient provide an analytical tool for quantifying inequality that can be applied at any temporal or spatial scale. The Gini coefficient (G) is a formal measure of inequality that varies from 0 to 1, with a value of 0 indicating perfect equality (i.e., fluxes and loads are constant in time) and 1 indicating perfect inequality (i.e., all of the discharge and solute loads are exported during one instant in time). G is therefore a simple yet powerful tool for providing insight into the temporal inequality of nutrient transport. We will present the results of our detailed analysis of streamflow and nutrient time series data collected since the early 1980s at 30 USGS gauging stations in the Chesapeake Bay watershed. The analysis is conducted at an annual time scale, enabling trends and patterns to be assessed both temporally (over time at each station) and spatially (for the same period of time across stations). The results of this analysis have the potential to create a transformative new framework for identifying "hot moments", improving our ability to temporally and spatially target the implementation of best management practices to ultimately improve water quality in the Chesapeake Bay. This method also provides insight into the temporal scales at which hydrologic and biogeochemical variability dominate nutrient export dynamics.
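
    The metric lends itself to a very short implementation; a minimal sketch using the standard sorted-rank formula for G, with synthetic daily loads rather than the study's data:

    ```python
    import numpy as np

    def gini(loads):
        """Gini coefficient: 0 = perfectly steady export, 1 = all export in one instant."""
        x = np.sort(np.asarray(loads, dtype=float))
        n = x.size
        ranks = np.arange(1, n + 1)
        return 2.0 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1.0) / n

    steady = np.ones(365)                       # constant daily load
    flashy = np.zeros(365); flashy[:10] = 50.0  # ten storm days carry the whole load
    print(gini(steady))  # ~0.0
    print(gini(flashy))  # ~0.97, i.e. strongly "hot moment" dominated
    ```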

  14. Image Quality Assessment Using the Joint Spatial/Spatial-Frequency Representation

    NASA Astrophysics Data System (ADS)

    Beghdadi, Azeddine; Iordache, Răzvan

    2006-12-01

    This paper demonstrates the usefulness of spatial/spatial-frequency representations in image quality assessment by introducing a new image dissimilarity measure based on the 2D Wigner-Ville distribution (WVD). The properties of the 2D WVD are briefly reviewed, and the important issue of choosing the analytic image is emphasized. The WVD-based measure is shown to be correlated with subjective human evaluation, which is a prerequisite for developing an image quality assessor on this principle.
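
    For context, the standard definition of the Wigner-Ville distribution of an image f, evaluated on its analytic version, is

    $$ W_f(\mathbf{x}, \mathbf{u}) = \int f\!\left(\mathbf{x} + \tfrac{\boldsymbol{\tau}}{2}\right) f^{*}\!\left(\mathbf{x} - \tfrac{\boldsymbol{\tau}}{2}\right) e^{-j\,2\pi\,\mathbf{u}\cdot\boldsymbol{\tau}}\, d\boldsymbol{\tau}, $$

    a joint function of position x and spatial frequency u; using the analytic image suppresses the cross-term artifacts that negative frequencies would otherwise introduce.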

  15. Mass Spectrometry Imaging of low Molecular Weight Compounds in Garlic (Allium sativum L.) with Gold Nanoparticle Enhanced Target.

    PubMed

    Misiorek, Maria; Sekuła, Justyna; Ruman, Tomasz

    2017-11-01

    Garlic (Allium sativum) is the subject of many studies due to its numerous beneficial properties. Although the compounds of garlic have been studied by various analytical methods, their tissue distributions are still unclear. Mass spectrometry imaging (MSI) appears to be a very powerful tool for identifying the localisation of compounds within a garlic clove. The aim was the visualisation of the spatial distribution of garlic low-molecular-weight compounds with nanoparticle-based MSI. Compounds occurring on the cross-section of sprouted garlic have been transferred to a gold-nanoparticle enhanced target (AuNPET) by imprinting. The imprint was then subjected to MSI analysis. The results suggest that low molecular weight compounds, such as amino acids, dipeptides, fatty acids, and organosulphur and organoselenium compounds, are distributed within the garlic clove in a characteristic manner. This can be connected with their biological functions and metabolic properties in the plant. The new methodology for the visualisation of low molecular weight compounds allowed a correlation to be made between their spatial distribution within a sprouted garlic clove and their biological function. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Femtosecond laser ablation-based mass spectrometry. An ideal tool for stoichiometric analysis of thin films

    DOE PAGES

    LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; ...

    2015-08-19

    An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science where a material’s properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determination of the stoichiometry of thin films down to the nanometer scale. The use of femtosecond laser ablation allows for precise removal of material with high spatial and depth resolution that can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPd_xSb_2 and T′-La_2CuO_4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations.

  17. Multiscale Interactive Communication: Inside and Outside Thun Castle

    NASA Astrophysics Data System (ADS)

    Massari, G. A.; Luce, F.; Pellegatta, C.

    2011-09-01

    For professionals, the applications of informatics to architecture have become a great tool for managing analytical phases and project activities; for the general public, they offer new ways of communication that can directly relate present, past and future facts. Museums in historic buildings, their installations and the recent experiences of eco-museums located throughout the territory provide a privileged field of experimentation for technical and digital representation. On the one hand, the safeguarding and functional adaptation of buildings use 3D computer graphics models that are real spatially related databases: in them the results of archival, artistic-historical, diagnostic, and technological-structural studies, together with the assumptions and feasibility of interventions, are ordered, viewed and interpreted. On the other hand, the disclosure of things and knowledge linked to collective memory relies on interactive maps and hypertext systems that provide access to authentic virtual museums; at the architectural scale this produces a sort of multimedia extension of the exhibition hall, but at the landscape scale the result is a so-far-unseen instrument of cultural development: works that are separated in direct perception find in the zenith view of a map a synthetic relation, tied both to spatial parameters and to temporal interpretations.

  18. Approximate Algorithms for Computing Spatial Distance Histograms with Accuracy Guarantees

    PubMed Central

    Grupcev, Vladimir; Yuan, Yongke; Tu, Yi-Cheng; Huang, Jin; Chen, Shaoping; Pandit, Sagar; Weng, Michael

    2014-01-01

    Particle simulation has become an important research tool in many scientific and engineering fields. Data generated by such simulations pose great challenges for database storage and query processing. One of the queries against particle simulation data, the spatial distance histogram (SDH) query, is the building block of many high-level analytics, and requires quadratic time to compute using a straightforward algorithm. Previous work has developed efficient algorithms that compute exact SDHs. While they beat the naive solution, such algorithms are still not practical for processing SDH queries against large-scale simulation data. In this paper, we take a different path to tackle this problem by focusing on approximate algorithms with provable error bounds. We first present a solution derived from the aforementioned exact SDH algorithm, whose running time is independent of the system size N. We also develop a mathematical model to analyze the mechanism that leads to errors in the basic approximate algorithm. Our model provides insights on how the algorithm can be improved to achieve higher accuracy and efficiency. Such insights give rise to a new approximate algorithm with improved time/accuracy tradeoff. Experimental results confirm our analysis. PMID:24693210
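
    For orientation, the quadratic-time baseline that this line of work improves on is simply a histogram over all pairwise distances; a minimal sketch with illustrative parameters:

    ```python
    import numpy as np

    def sdh_naive(points, bucket_width, num_buckets):
        """Naive O(N^2) spatial distance histogram."""
        hist = np.zeros(num_buckets, dtype=np.int64)
        for i in range(len(points)):
            # distances from point i to every later point
            d = np.linalg.norm(points[i + 1:] - points[i], axis=1)
            idx = np.minimum((d / bucket_width).astype(int), num_buckets - 1)
            np.add.at(hist, idx, 1)
        return hist

    rng = np.random.default_rng(0)
    particles = rng.random((1000, 3))  # 1000 simulated particles in a unit cube
    print(sdh_naive(particles, bucket_width=0.1, num_buckets=18))
    ```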

  19. II Spatial metaphors and somatic communication: the embodiment of multigenerational experiences of helplessness and futility in an obese patient.

    PubMed

    2013-06-01

    This paper explores the analysis of an obese woman who came to experience her flesh as a bodying forth of personal and multigenerational family and cultural experiences of helplessness. The paper discusses the ideas and images that formed the basis of how I engaged with these themes as they presented countertransferentially. My thesis is that clinical approaches which draw on spatial metaphors for the psyche offer valuable tools for working with people whose inner world expresses itself somatically because such metaphors can be used to engage simultaneously with the personal, cultural, and ancestral dimensions of these unconscious communications. The paper builds on Jung's view of the psyche as comprised of pockets of inner otherness (complexes), on Redfearn's image of psyche as landscape-like and on Samuels' thinking on embodied countertransference and on the political psyche. It also draws on Butler's work on the body as a social phenomenon and on the theme of being a helpless non-person or nobody as explored in Tom Stoppard's play Rosencrantz and Guildenstern are Dead which retells Shakespeare's Hamlet from the perspective of two of the play's 'bit' characters. © 2013, The Society of Analytical Psychology.

  20. The Conformations of Confined Polymers in an External Potential

    NASA Astrophysics Data System (ADS)

    Morrison, Greg

    The confinement of biomolecules is ubiquitous in nature, such as the spatial constraints of viral encapsulation, histone binding, and chromosomal packing. Advances in microfluidics and nanopore fabrication have permitted powerful new tools in single molecule manipulation and gene sequencing through molecular confinement as well. In order to fully understand and exploit these systems, the ability to predict the structure of spatially confined molecules is essential. In this talk, I describe a mean field approach to determine the properties of stiff polymers confined to cylinders and slits, which is relevant for a variety of biological and experimental conditions. I show that this approach is able to not only reproduce known scaling laws for confined wormlike chains, but also provides an improvement over existing weakly bending rod approximations in determining the detailed chain properties (such as correlation functions). Using this approach, we also show that it is possible to study the effect of an externally applied tension or static electric field in a natural and analytically tractable way. These external perturbations can alter the scaling laws and introduce important new length scales into the system, relevant for histone unbinding and single-molecule analysis of DNA.

  1. Spatial and temporal patterns in concentrations of perfluorinated compounds in bald eagle nestlings in the Upper Midwestern United States

    USGS Publications Warehouse

    Route, William T.; Russell, Robin E.; Lindstrom, Andrew B.; Strynor, Mark J.; Key, Rebecca L.

    2014-01-01

    Perfluorinated chemicals (PFCs) are of concern due to their widespread use, persistence in the environment, tendency to accumulate in animal tissues, and growing evidence of toxicity. Between 2006 and 2011 we collected blood plasma from 261 bald eagle nestlings in six study areas from the upper Midwestern United States. Samples were assessed for levels of 16 different PFCs. We used regression analysis in a Bayesian framework to evaluate spatial and temporal trends for these analytes. We found levels as high as 7370 ng/mL for the sum of all 16 PFCs (∑PFCs). Perfluorooctanesulfonate (PFOS) and perfluorodecanesulfonate (PFDS) were the most abundant analytes, making up 67% and 23% of the PFC burden, respectively. Levels of ∑PFC, PFOS, and PFDS were highest in more urban and industrial areas, moderate on Lake Superior, and low on the remote upper St. Croix River watershed. We found evidence of declines in ∑PFCs and seven analytes, including PFOS, PFDS, and perfluorooctanoic acid (PFOA); no trend in two analytes; and increases in two analytes. We argue that PFDS, a long-chained PFC with potential for high bioaccumulation and toxicity, should be considered for future animal and human studies.

  2. Scaling Law for Cross-stream Diffusion in Microchannels under Combined Electroosmotic and Pressure Driven Flow.

    PubMed

    Song, Hongjun; Wang, Yi; Pant, Kapil

    2013-01-01

    This paper presents an analytical study of the cross-stream diffusion of an analyte in a rectangular microchannel under combined electroosmotic flow (EOF) and pressure driven flow to investigate the heterogeneous transport behavior and the spatially-dependent diffusion scaling law. An analytical model capable of accurately describing 3D steady-state convection-diffusion in microchannels with arbitrary aspect ratios is developed based on the assumption of a thin Electric Double Layer (EDL). The model is verified against high-fidelity numerical simulation in terms of flow velocity and analyte concentration profiles with excellent agreement (<0.5% relative error). An extensive parametric analysis is then undertaken to interrogate the effect of the combined flow velocity field on the transport behavior in both the positive pressure gradient (PPG) and negative pressure gradient (NPG) cases. For the first time, the evolution from the spindle-shaped concentration profile in the NPG case, via the stripe-shaped profile (pure EOF), to the butterfly-shaped profile in the PPG case is obtained using the analytical model, along with a quantitative depiction of the spatially-dependent diffusion layer thickness and scaling law across a wide range of the parameter space.

  3. Scaling Law for Cross-stream Diffusion in Microchannels under Combined Electroosmotic and Pressure Driven Flow

    PubMed Central

    Song, Hongjun; Wang, Yi; Pant, Kapil

    2012-01-01

    This paper presents an analytical study of the cross-stream diffusion of an analyte in a rectangular microchannel under combined electroosmotic flow (EOF) and pressure driven flow to investigate the heterogeneous transport behavior and the spatially-dependent diffusion scaling law. An analytical model capable of accurately describing 3D steady-state convection-diffusion in microchannels with arbitrary aspect ratios is developed based on the assumption of a thin Electric Double Layer (EDL). The model is verified against high-fidelity numerical simulation in terms of flow velocity and analyte concentration profiles with excellent agreement (<0.5% relative error). An extensive parametric analysis is then undertaken to interrogate the effect of the combined flow velocity field on the transport behavior in both the positive pressure gradient (PPG) and negative pressure gradient (NPG) cases. For the first time, the evolution from the spindle-shaped concentration profile in the NPG case, via the stripe-shaped profile (pure EOF), to the butterfly-shaped profile in the PPG case is obtained using the analytical model, along with a quantitative depiction of the spatially-dependent diffusion layer thickness and scaling law across a wide range of the parameter space. PMID:23554584

  4. Numerical Treatment of the Boltzmann Equation for Self-Propelled Particle Systems

    NASA Astrophysics Data System (ADS)

    Thüroff, Florian; Weber, Christoph A.; Frey, Erwin

    2014-10-01

    Kinetic theories constitute one of the most promising tools to decipher the characteristic spatiotemporal dynamics in systems of actively propelled particles. In this context, the Boltzmann equation plays a pivotal role, since it provides a natural translation between a particle-level description of the system's dynamics and the corresponding hydrodynamic fields. Yet, the intricate mathematical structure of the Boltzmann equation substantially limits the progress toward a full understanding of this equation by solely analytical means. Here, we propose a general framework to numerically solve the Boltzmann equation for self-propelled particle systems in two spatial dimensions and with arbitrary boundary conditions. We discuss potential applications of this numerical framework to active matter systems and use the algorithm to give a detailed analysis of a model system of self-propelled particles with polar interactions. In accordance with previous studies, we find that spatially homogeneous isotropic and broken-symmetry states populate two distinct regions in parameter space, which are separated by a narrow region of spatially inhomogeneous, density-segregated moving patterns. We find clear evidence that these three regions in parameter space are connected by first-order phase transitions and that the transition between the spatially homogeneous isotropic and polar ordered phases bears striking similarities to liquid-gas phase transitions in equilibrium systems. Within the density-segregated parameter regime, we find a novel stable limit-cycle solution of the Boltzmann equation, which consists of parallel lanes of polar clusters moving in opposite directions, so as to render the overall symmetry of the system's ordered state nematic, despite purely polar interactions on the level of single particles.
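
    Schematically (a generic form supplied for context, not the paper's exact notation), the equation being solved is a kinetic equation for the one-particle density f over position r and orientation θ,

    $$ \partial_t f(\mathbf{r}, \theta, t) + v_0\, \mathbf{e}(\theta) \cdot \nabla_{\mathbf{r}} f = \mathcal{I}_{\mathrm{diff}}[f] + \mathcal{I}_{\mathrm{coll}}[f, f], $$

    with e(θ) the heading of a particle of speed v_0 and the right-hand side collecting rotational-diffusion and binary-collision integrals; the hydrodynamic density and polarization fields follow as angular moments of f.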

  5. An interactive visualization tool for mobile objects

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuo

    Recent advancements in mobile devices---such as the Global Positioning System (GPS), cellular phones, car navigation systems, and radio-frequency identification (RFID)---have greatly influenced the nature and volume of data about individual-based movement in space and time. Due to the prevalence of mobile devices, vast amounts of mobile objects data are being produced and stored in databases, overwhelming the capacity of traditional spatial analytical methods. There is a growing need for discovering unexpected patterns, trends, and relationships that are hidden in the massive mobile objects data. Geographic visualization (GVis) and knowledge discovery in databases (KDD) are two major research fields associated with knowledge discovery and construction. Their major research challenges are the integration of GVis and KDD, enhancing the ability to handle large volumes of mobile objects data, and high interactivity between the computer and the users of GVis and KDD tools. This dissertation proposes a visualization toolkit to enable highly interactive visual data exploration for mobile objects datasets. Vector algebraic representation and online analytical processing (OLAP) are utilized for managing and querying the mobile objects data to accomplish high interactivity of the visualization tool. In addition, reconstructing trajectories at user-defined levels of temporal granularity with time aggregation methods allows exploration of the individual objects at different levels of movement generality. At a given level of generality, individual paths can be combined into synthetic summary paths based on three similarity measures, namely locational similarity, directional similarity, and geometric similarity functions. A visualization toolkit based on the space-time cube concept exploits these functionalities to create a user-interactive environment for exploring mobile objects data. Furthermore, the characteristics of visualized trajectories are exported to be utilized for data mining, which leads to the integration of GVis and KDD. Case studies using three movement datasets (a personal travel survey in Lexington, Kentucky; wild chicken movement data in Thailand; and self-tracking data in Utah) demonstrate the potential of the system to extract meaningful patterns from otherwise difficult-to-comprehend collections of space-time trajectories.
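
    As a toy illustration of the kind of path-similarity measures named above (the dissertation's exact definitions may differ; these are minimal stand-ins for the locational and directional functions):

    ```python
    import numpy as np

    def locational_similarity(a, b):
        """Mean pointwise distance between two equally sampled paths (lower = more similar)."""
        return float(np.mean(np.linalg.norm(a - b, axis=1)))

    def directional_similarity(a, b):
        """Mean cosine similarity between successive step directions (1 = same headings)."""
        da, db = np.diff(a, axis=0), np.diff(b, axis=0)
        cos = np.sum(da * db, axis=1) / (
            np.linalg.norm(da, axis=1) * np.linalg.norm(db, axis=1))
        return float(np.mean(cos))

    t = np.linspace(0, 2 * np.pi, 50)
    a = np.c_[np.cos(t), np.sin(t)]   # a circular path
    b = a + np.array([0.1, 0.1])      # the same path, offset
    print(locational_similarity(a, b))   # ~0.14
    print(directional_similarity(a, b))  # 1.0
    ```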

  6. Analytical solution for multi-species contaminant transport in finite media with time-varying boundary conditions

    USDA-ARS?s Scientific Manuscript database

    Most analytical solutions available for the equations governing the advective-dispersive transport of multiple solutes undergoing sequential first-order decay reactions have been developed for infinite or semi-infinite spatial domains and steady-state boundary conditions. In this work we present an ...
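
    Since the record is truncated, the standard form of the governing equations may help: for species i in a sequential first-order decay chain transported in one dimension,

    $$ R_i \frac{\partial c_i}{\partial t} = D \frac{\partial^2 c_i}{\partial x^2} - v \frac{\partial c_i}{\partial x} - k_i c_i + k_{i-1} c_{i-1}, \qquad i = 1, 2, \ldots, \quad k_0 \equiv 0, $$

    with retardation factor R_i, dispersion coefficient D, pore velocity v, and decay rates k_i; each species is produced by the decay of its parent and lost to its own decay.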

  7. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Partridge Jr, William P.; Choi, Jae-Soon

    By directly resolving spatial and temporal species distributions within operating honeycomb monolith catalysts, spatially resolved capillary inlet mass spectrometry (SpaciMS) provides a uniquely enabling perspective for advancing automotive catalysis. Specifically, the ability to follow the spatiotemporal evolution of reactions throughout the catalyst is a significant advantage over inlet-and-effluent-limited analysis. Intracatalyst resolution elucidates numerous catalyst details including the network and sequence of reactions, clarifying reaction pathways; the relative rates of different reactions and impacts of operating conditions and catalyst state; and reaction dynamics and intermediate species that exist only within the catalyst. These details provide a better understanding of how the catalyst functions and have basic and practical benefits; e.g., catalyst system design; strategies for on-road catalyst state assessment, control, and on-board diagnostics; and creating robust and accurate predictive catalyst models. Moreover, such spatiotemporally distributed data provide for critical model assessment, and identification of improvement opportunities that might not be apparent from effluent assessment; i.e., while an incorrectly formulated model may provide correct effluent predictions, one that can accurately predict the spatiotemporal evolution of reactions along the catalyst channels will be more robust, accurate, and reliable. In such ways, intracatalyst diagnostics comprehensively enable improved design and development tools, and faster and lower-cost development of more efficient and durable automotive catalyst systems. Beyond these direct contributions, SpaciMS has spawned and been applied to enable other analytical techniques for resolving transient distributed intracatalyst performance. This chapter focuses on SpaciMS applications and associated catalyst insights and improvements, with specific sections related to lean NOx traps, selective catalytic reduction catalysts, oxidation catalysts, and particulate filters. The objective is to promote broader use and development of intracatalyst analytical methods, and thereby expand the insights resulting from this detailed perspective for advancing automotive catalyst technologies.

  9. An Automated Algorithm for Producing Land Cover Information from Landsat Surface Reflectance Data Acquired Between 1984 and Present

    NASA Astrophysics Data System (ADS)

    Rover, J.; Goldhaber, M. B.; Holen, C.; Dittmeier, R.; Wika, S.; Steinwand, D.; Dahal, D.; Tolk, B.; Quenzer, R.; Nelson, K.; Wylie, B. K.; Coan, M.

    2015-12-01

    Multi-year land cover mapping from remotely sensed data poses challenges. Producing land cover products at the spatial and temporal scales required for assessing longer-term trends in land cover change is typically a resource-limited process. A recently developed approach utilizes open source software libraries to automatically generate datasets, decision tree classifications, and data products while requiring minimal user interaction. Users are only required to supply coordinates for an area of interest, land cover from an existing source such as the National Land Cover Database and percent slope from a digital terrain model for the same area of interest, two target acquisition year-day windows, and the years of interest between 1984 and present. The algorithm queries the Landsat archive for Landsat data intersecting the area and dates of interest. Cloud-free pixels meeting the user's criteria are mosaicked to create composite images for training and applying the classifiers. Stratification of training data is determined by the user and redefined during an iterative process of reviewing classifiers and resulting predictions. The algorithm outputs include yearly land cover raster format data, graphics, and supporting databases for further analysis. Additional analytical tools are also incorporated into the automated land cover system and enable statistical analysis after data are generated. Applications tested include assessing the impacts of land cover change and water permanence. For example, land cover conversions in areas where shrubland and grassland were replaced by shale oil pads during hydrofracking of the Bakken Formation were quantified. Analysis of spatial and temporal changes in surface water included identifying wetlands in the Prairie Pothole Region of North Dakota with potential connectivity to ground water, indicating subsurface permeability and geochemistry.
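
    A minimal sketch of the decision-tree classification step described above, with synthetic stand-ins for the composited reflectance bands, percent slope, and NLCD-style training labels:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(42)
    n_train, n_bands = 10_000, 7
    X_train = rng.random((n_train, n_bands + 1))   # bands + percent slope
    y_train = rng.integers(0, 8, size=n_train)     # land cover class codes

    clf = DecisionTreeClassifier(max_depth=12).fit(X_train, y_train)

    X_scene = rng.random((500 * 500, n_bands + 1))       # one composited scene, flattened
    land_cover = clf.predict(X_scene).reshape(500, 500)  # yearly land cover raster
    print(np.unique(land_cover, return_counts=True))
    ```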

  10. Analysis of land suitability for urban development in Ahwaz County in southwestern Iran using fuzzy logic and analytic network process (ANP).

    PubMed

    Malmir, Maryam; Zarkesh, Mir Masoud Kheirkhah; Monavari, Seyed Masoud; Jozi, Seyed Ali; Sharifi, Esmail

    2016-08-01

    The ever-increasing development of cities due to population growth and migration has led to unplanned constructions and great changes in urban spatial structure, especially the physical development of cities in unsuitable places, which requires conscious guidance and fundamental organization. It is therefore necessary to identify suitable sites for future development of cities and prevent urban sprawl, one of the main concerns of urban managers and planners. In this study, to determine the suitable sites for urban development in the county of Ahwaz, the effective biophysical and socioeconomic criteria (including 27 sub-criteria) were initially determined based on a literature review and interviews with certified experts. In the next step, a database of criteria and sub-criteria was prepared. Standardization of values and unification of scales in map layers were done using fuzzy logic. The criteria and sub-criteria were weighted by the analytic network process (ANP) in the Super Decisions software. Next, the map layers were overlaid using weighted linear combination (WLC) in the GIS software. According to the research findings, the final land suitability map was prepared with five suitability classes of very high (5.86 %), high (31.93 %), medium (38.61 %), low (17.65 %), and very low (5.95 %). Also, in terms of spatial distribution, suitable lands for urban development are mainly located in the central and southern parts of the Ahwaz County. It is expected that the integration of fuzzy logic and the ANP model will provide a better decision support tool compared with other models. The developed model can also be used in the land suitability analysis of other cities.
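
    A minimal sketch of the weighted linear combination (WLC) overlay step, assuming the criterion layers have already been fuzzy-standardized to [0, 1] (layer names and weights are illustrative, not the study's 27 sub-criteria or its ANP-derived weights):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    shape = (200, 200)  # raster grid over the county
    layers = {name: rng.random(shape) for name in ("slope", "water", "roads")}
    weights = {"slope": 0.5, "water": 0.3, "roads": 0.2}  # ANP-derived in the study

    suitability = sum(weights[k] * layers[k] for k in layers)  # WLC overlay
    # classify into five suitability classes by equal intervals
    classes = np.digitize(suitability, np.linspace(0, 1, 6)[1:-1])
    print(np.bincount(classes.ravel()))
    ```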

  11. a Geo-Visual Analytics Approach to Biological Shepherding: Modelling Animal Movements and Impacts

    NASA Astrophysics Data System (ADS)

    Benke, K. K.; Sheth, F.; Betteridge, K.; Pettit, C. J.; Aurambout, J.-P.

    2012-07-01

    The lamb industry in Victoria is a significant component of the state economy, with annual exports in the vicinity of $1 billion. GPS and visualisation tools can be used to monitor grazing animal movements at the farm scale and observe interactions with the environment. Modelling the spatial-temporal movements of grazing animals in response to environmental conditions provides input for the design of paddocks with the aim of improving management procedures, animal performance and animal welfare. The term "biological shepherding" is associated with the re-design of environmental conditions and the analysis of responses from grazing animals. The combination of biological shepherding with geo-visual analytics (geo-spatial data analysis with visualisation) provides a framework for improving landscape design and supports research in grazing behaviour in variable landscapes, heat stress avoidance behaviour during summer months, and modelling excreta distributions (with respect to nitrogen emissions and nitrogen return for fertilising the paddock). Nitrogen losses due to excreta are mainly in the form of gaseous emissions to the atmosphere and leaching into the groundwater. In this study, background and context are provided on biological shepherding and the tracking of animal movements. Examples are provided of recent applications in regional Australia and New Zealand. Based on experimental data and computer simulation, and using data visualisation and feature extraction, it was demonstrated that livestock excreta are not always randomly located, but concentrated around localised gathering points, sometimes separated by the nature of the excretion. Farmers require information on nitrogen losses in order to reduce emissions to meet local and international nitrogen leaching and greenhouse gas targets and to improve the efficiency of nutrient management.

  12. Data-Driven Geospatial Visual Analytics for Real-Time Urban Flooding Decision Support

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D.; Rodriguez, A.; Marini, L.; Kooper, R.; Myers, J.; Wu, X.; Minsker, B. S.

    2009-12-01

    Urban flooding is responsible for the loss of life and property as well as the release of pathogens and other pollutants into the environment. Previous studies have shown that the spatial distribution of intense rainfall significantly impacts the triggering and behavior of urban flooding. However, no general purpose tools yet exist for deriving rainfall data and rendering them in real-time at the resolution of the hydrologic units used for analyzing urban flooding. This paper presents a new visual analytics system that derives and renders rainfall data from the NEXRAD weather radar system at the sewershed (i.e. urban hydrologic unit) scale in real-time for a Chicago stormwater management project. We introduce a lightweight Web 2.0 approach which takes advantage of scientific workflow management and publishing capabilities developed at NCSA (National Center for Supercomputing Applications), a streaming data-aware semantic content management repository, web-based Google Earth/Maps, and time-aware KML (Keyhole Markup Language). A collection of polygon-based virtual sensors is created from the NEXRAD Level II data using spatial, temporal and thematic transformations at the sewershed level in order to produce persistent virtual rainfall data sources for the animation. The animated, color-coded rainfall map for the sewersheds can be played in real-time as a movie using time-aware KML inside the web browser-based Google Earth for visually analyzing the spatiotemporal patterns of rainfall intensity. Such a system provides valuable information for situational awareness and improved decision support during extreme storm events in an urban area. Our further work includes incorporating additional data (such as basement flooding events data) or physics-based predictive models that can be used for more integrated data-driven decision support.

  13. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  14. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  15. Difet: Distributed Feature Extraction Tool for High Spatial Resolution Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Eken, S.; Aydın, E.; Sayar, A.

    2017-11-01

    In this paper, we propose a distributed feature extraction tool for high spatial resolution remote sensing images. The tool is based on the Apache Hadoop framework and the Hadoop Image Processing Interface. Two corner detection algorithms (Harris and Shi-Tomasi) and five feature descriptors (SIFT, SURF, FAST, BRIEF, and ORB) are considered. The robustness of the tool in the task of feature extraction from Landsat-8 imagery is evaluated in terms of horizontal scalability.
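
    A single-node sketch of one of the detector/descriptor pairs named above (ORB, via OpenCV); DIFET distributes this step across Hadoop nodes, which is omitted here, and the tile is a synthetic stand-in for a Landsat-8 image:

    ```python
    import cv2
    import numpy as np

    tile = (np.random.rand(512, 512) * 255).astype(np.uint8)  # stand-in image tile
    orb = cv2.ORB_create(nfeatures=500)
    keypoints, descriptors = orb.detectAndCompute(tile, None)
    print(len(keypoints), None if descriptors is None else descriptors.shape)
    ```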

  16. Multi-objective spatial tools to inform maritime spatial planning in the Adriatic Sea.

    PubMed

    Depellegrin, Daniel; Menegon, Stefano; Farella, Giulio; Ghezzo, Michol; Gissi, Elena; Sarretta, Alessandro; Venier, Chiara; Barbanti, Andrea

    2017-12-31

    This research presents a set of multi-objective spatial tools for sea planning and environmental management in the Adriatic Sea Basin. The tools address four objectives: 1) assessment of cumulative impacts from anthropogenic sea uses on environmental components of marine areas; 2) analysis of sea use conflicts; 3) 3-D hydrodynamic modelling of nutrient dispersion (nitrogen and phosphorus) from riverine sources in the Adriatic Sea Basin; and 4) marine ecosystem services capacity assessment from seabed habitats based on an ES matrix approach. Geospatial modelling results were illustrated, analysed and compared at the country level and for three biogeographic subdivisions (Northern, Central and Southern Adriatic Sea). The paper discusses the model results in terms of their spatial implications, relevance for sea planning and limitations, and concludes with an outlook on the need for more integrated, multi-functional tool development for sea planning. Copyright © 2017. Published by Elsevier B.V.

  17. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial barrier in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  18. Multispectral scanner system parameter study and analysis software system description, volume 2

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.

    1978-01-01

    The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), whose flexibility and versatility were superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated system performance. The spatial path consisted of satellite and/or aircraft data, a data correlation analyzer, scanner IFOV, and a random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.

  19. Data Basin Aquatic Center: expanding access to aquatic conservation data, analysis tools, people and practical answers

    NASA Astrophysics Data System (ADS)

    Osborne-Gowey, J.; Strittholt, J.; Bergquist, J.; Ward, B. C.; Sheehan, T.; Comendant, T.; Bachelet, D. M.

    2009-12-01

    The world’s aquatic resources are experiencing anthropogenic pressures on an unprecedented scale and aquatic organisms are experiencing widespread population changes and ecosystem-scale habitat alterations. Climate change is likely to exacerbate these threats, in some cases reducing the range of native North American fishes by 20-100% (depending on the location of the population and the model assumptions). Scientists around the globe are generating large volumes of data that vary in quality, format, supporting documentation, and accessibility. Moreover, diverse models are being run at various temporal and spatial scales as scientists attempt to understand previous (and project future) human impacts to aquatic species and their habitats. Conservation scientists often struggle to synthesize this wealth of information for developing practical on-the-ground management strategies. As a result, the best available science is often not utilized in the decision-making and adaptive management processes. As aquatic conservation problems around the globe become more serious and the demand to solve them grows more urgent, scientists and land-use managers need a new way to bring strategic, science-based, and action-oriented approaches to aquatic conservation. The Conservation Biology Institute (CBI), with partners such as ESRI, is developing an Aquatic Center as part of a dynamic, web-based resource (Data Basin; http://databasin.org) that centralizes usable aquatic datasets and provides analytical tools to visualize, analyze, and communicate findings for practical applications. To illustrate its utility, we present example datasets of varying spatial scales and synthesize multiple studies to arrive at novel solutions to aquatic threats.

  20. The WATERS Network Conceptual Design

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Schnoor, J. L.; Haas, C. N.; Minsker, B.; Bales, R. C.; Hooper, R. P.

    2007-12-01

    The Water and Environmental Research Systems (WATERS) Network is a collaboration between the water-related Earth science and environmental engineering communities around a series of grand-challenge and strategic research questions. The vision of the WATERS Network is to transform our ability to predict the quality, quantity and use of our nation's waters. The real transformative power of the WATERS Network lies in its ability to put sustained, spatially extensive, high-frequency information in the hands of researchers, information that will resolve how natural and engineered systems respond to perturbations. This knowledge then improves process understanding and provides better predictive capabilities. To this end, the WATERS Network will create a national network of observatories equipped with multimedia sensors located across a range of different climatic and geographic regions and linked together by a common cyberinfrastructure. The network will incorporate existing and new environmental and socioeconomic data at various spatial and temporal scales. Data will include physical, chemical, and biological information to characterize surface water, ground water and land, as well as socioeconomic and behavioral information to better frame human influences. Real-time data resources will be assimilated into an information system (cyberinfrastructure) that supports analytical tools and models, networking tools, and education and outreach services. The WATERS Network is an Environmental Observatory initiative of the U.S. National Science Foundation, developed in response to community planning over the past 10 years. The foundation's Engineering and Geosciences Directorates are developing it jointly for funding consideration through the foundation's Major Research Equipment and Facilities Construction (MREFC) account. This presentation will summarize the current status of planning for the WATERS Network.

  1. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).

  2. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd... inventory level of 1K Polysiloxane in support of the fleet. ... As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane

  3. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14... by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51... SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.

  4. Visualization and Analytics Software Tools for Peregrine System

    Science.gov Websites

    Learn about the visualization and analytics software tools available for the Peregrine system. R is a language and environment for statistical computing and graphics; go to the R web site for more information. For remote display of OpenGL-based applications, please go to the FastX page. ParaView is an open-source data analysis and visualization application.

  5. Dynamic Vision for Control

    DTIC Science & Technology

    2006-07-27

    The goal of this project was to develop analytical and computational tools to make vision a viable sensor for the control of dynamical systems... sensors. We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry

  6. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Study Team Working Paper 3: Research Methods Discussion for the Study Team. Methods... Generating Empirical Materials: in grounded theory... research I have conducted using these methods. Analytical Tools for the Application of Operational Culture: A Case Study in the... "Survey and a Case Study," Kjeller, Norway: FFI. Glaser, B. G. & Strauss, A. L. (1967). "The Discovery of Grounded Theory

  7. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  8. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  9. Orbital and spin parts of energy currents for electromagnetic waves through spatially inhomogeneous media

    NASA Astrophysics Data System (ADS)

    Lee, Hyoung-In; Mok, Jinsik

    2018-05-01

    We investigate electromagnetic waves propagating through non-magnetic and loss-free dielectric media, but with spatially inhomogeneous refractive indices. We hence derive a set of analytic formulae for the conservation laws and the energy-current (Poynting) vector. As a result, we deduce that the energy-current vector cannot be neatly separated into its orbital and spin parts, in contrast to the case of spatially homogeneous media. In addition, we present physical interpretations of the two additional terms due to spatial material inhomogeneity.
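    For context, in a spatially homogeneous, lossless medium the separation that the abstract says breaks down takes the standard form below for time-harmonic fields. This is the textbook (canonical) decomposition quoted here for reference only, not the authors' inhomogeneous-medium formulae, and prefactor conventions vary between references:

      % Time-averaged momentum (energy-current) density of a monochromatic field,
      % split into orbital and spin parts (homogeneous, lossless medium):
      \[
        \mathbf{p} = \mathbf{p}_{\mathrm{orb}} + \mathbf{p}_{\mathrm{spin}},
      \]
      \[
        \mathbf{p}_{\mathrm{orb}} = \frac{1}{4\omega}\,\operatorname{Im}\!\left[
          \varepsilon\,\mathbf{E}^{*}\!\cdot(\nabla)\mathbf{E}
          + \mu\,\mathbf{H}^{*}\!\cdot(\nabla)\mathbf{H}\right],
        \qquad
        \mathbf{p}_{\mathrm{spin}} = \tfrac{1}{2}\,\nabla\times\mathbf{s},
      \]
      \[
        \mathbf{s} = \frac{1}{4\omega}\,\operatorname{Im}\!\left[
          \varepsilon\,\mathbf{E}^{*}\times\mathbf{E}
          + \mu\,\mathbf{H}^{*}\times\mathbf{H}\right],
        \qquad
        \mathbf{E}^{*}\!\cdot(\nabla)\mathbf{E} \equiv \sum_{i} E_{i}^{*}\,\nabla E_{i}.
      \]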

  10. Development of high-spatial and high-mass resolution mass spectrometric imaging (MSI) and its application to the study of small metabolites and endogenous molecules of plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jun, Ji Hyun

    High-spatial and high-mass resolution laser desorption ionization (LDI) mass spectrometric (MS) imaging technology was developed for the attainment of MS images of higher quality containing more information on the relevant cellular and molecular biology in unprecedented depth. The distribution of plant metabolites is asymmetric throughout cells and tissues, and therefore an increase in spatial resolution was pursued to reveal the localization of plant metabolites at the cellular level by MS imaging. For achieving high spatial resolution, the laser beam size was reduced by utilizing an optical fiber with a small core diameter (25 μm) in a vacuum matrix-assisted laser desorption ionization-linear ion trap (vMALDI-LTQ) mass spectrometer. Matrix application was greatly improved using an oscillating capillary nebulizer. As a result, single-cell-level spatial resolution of ~12 μm was achieved. MS imaging at this high spatial resolution was directly applied to a whole Arabidopsis flower, and the substructures of an anther and single pollen grains at the stigma and anther were successfully visualized. MS imaging at high spatial resolution was also demonstrated on the secondary roots of Arabidopsis thaliana, and a high degree of localization of the detected metabolites was successfully unveiled. This was the first MS imaging of molecular species on the root. MS imaging with high mass resolution was also achieved by utilizing the LTQ-Orbitrap mass spectrometer for the direct identification of surface metabolites on the Arabidopsis stem and root and the differentiation of isobaric ions having the same nominal mass with no need for tandem mass spectrometry (MS/MS). MS imaging at high spatial and high mass resolution was also applied to the cer1 mutant of the model system Arabidopsis thaliana to demonstrate its usefulness in biological studies and reveal associated metabolite changes in terms of spatial distribution and/or abundance compared to the wild type. The spatial distribution of targeted metabolites, mainly waxes and flavonoids, was systematically explored on various organs, including flowers, leaves, stems, and roots, at a high spatial resolution of ~12-50 μm, and the changes in the abundance of these metabolites were monitored in the cer1 mutant with respect to the wild type. This study revealed the metabolic biology of the CER1 gene at the individual-organ level in very detailed, high spatial resolution. Separate MS images of isobaric metabolites, i.e. C29 alkane vs. C28 aldehyde, could be constructed for both genotypes from MS imaging at high mass resolution. This allows tracking of abundance changes for those compounds along with the genetic mutation, which is not achievable with low-mass-resolution mass spectrometry. This study supported the previous hypothesis that the molecular function of the CER1 gene is an aldehyde decarbonylase, especially by displaying hyperaccumulation of aldehydes and C30 fatty acid and a decrease in the abundance of alkanes and ketones in several plant organs of the cer1 mutant. The scope of analytes was further directed from the surface metabolites of the plant toward internal cell metabolites. MS profiling and imaging of internal cell metabolites were performed on a vibratome section of an Arabidopsis leaf. Vibratome sectioning of the leaf was first conducted to remove the surface cuticle layer, followed by enzymatic treatment of the section to induce digestion of the primary cell walls and middle lamella and expose the internal cells underneath to the surface for detection with the laser by LDI-MS. Subsequent MS imaging of the enzymatically treated vibratome section allowed us to map the distribution of metabolites in the internal cell layers, linolenic acid (C18:3 FA) and linoleic acid (C18:2 FA). The development of an assay for relative quantification of analytes at the single subcellular/organelle level by LDI-MS imaging was attempted, and both its plausibility and significant obstacles were seen. As a test system, a native plant organelle, chloroplasts isolated from spinach leaves, was used, and the localization of isolated chloroplasts dispersed on the target plate at low density was monitored by detecting the ion signal of chlorophyll a (Chl a) degradation products such as pheophytin a and pheophorbide a by LDI-MS imaging in combination with fluorescence microscopy. The number of chloroplasts and their localization visualized in the MS image exactly matched those in the fluorescence image, especially at low density, which first shows the plausibility of single-organelle-level quantification of analytes by LDI-MS. The accumulation level of Chl a within a single chloroplast detected by LDI-MS was compared to the fluorescence signal on a pixel-to-pixel basis to further confirm the correlation of the accumulation levels measured by the two methods. A proportional correlation was observed only for chloroplasts that did not show significant leakage of chlorophyll, indicated by the MS ion signal of Chl a degradation products and the fluorescence signal, which was presumably caused by the prior fluorescence measurement before MS imaging. Further investigation is necessary to make this method more complete and to develop LDI-MS imaging into an effective analytical tool for evaluating the relative accumulation of analytes of interest at the single subcellular/organelle level.

  11. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014; Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets for the Moon as well as Mercury. Other Solar System bodies will become progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as can derived imagery colour-combination products, dynamically generated and also accessed through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open-source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine, and the array database Rasdaman Community Edition as its core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible at http://planetserver.eu. All its code base is going to be available on GitHub, at https://github.com/planetserver References: Baumann, P., et al. (2015) Big Data Analytics for Earth Sciences: the EarthServer approach, International Journal of Digital Earth, doi: 10.1080/17538947.2014.1003106. Cantini, F., et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784. Gaddis, L., and T. Hare (2015), Status of tools and data for planetary research, Eos, 96, doi: 10.1029/2015EO041125. Hogan, P., 2011. NASA World Wind: Infrastructure for Spatial Data. Technical report. Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. Oosthoek, J.H.P., et al. (2013) Advances in Space Research. doi: 10.1016/j.asr.2013.07.002. Rossi, A. P., et al. (2014) PlanetServer/EarthServer: Big Data analytics in Planetary Science. Geophysical Research Abstracts, Vol. 16, #EGU2014-5149.

  12. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons, which is important for a detailed study of structure-function relationships. For example, neocortex, which can be subdivided into six layers based on cell density and cell types, can also be analyzed in terms of the organizational principles distinguishing the layers. PMID:23658544
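    The statistic at the core of this tool is easy to state: Ripley's K counts, for each radius r, the expected number of neighbors within r of a typical point, normalized by point density. A minimal Python sketch for 3D points on synthetic data, without the edge corrections and hypothesis tests the published MATLAB tool adds on top:

      # Naive 3D Ripley's K: K(r) = V/(n*(n-1)) * #{ordered pairs with d_ij <= r}.
      # No edge correction, so estimates are biased near the window boundary.
      import numpy as np

      rng = np.random.default_rng(0)
      pts = rng.uniform(0.0, 100.0, size=(500, 3))   # synthetic cell positions (um)
      volume = 100.0 ** 3                             # observation window volume

      def ripley_k(points, radii, volume):
          n = len(points)
          diff = points[:, None, :] - points[None, :, :]
          d = np.sqrt((diff ** 2).sum(-1))            # all pairwise distances
          np.fill_diagonal(d, np.inf)                 # exclude self-pairs
          return np.array([volume * (d <= r).sum() / (n * (n - 1)) for r in radii])

      radii = np.linspace(1.0, 25.0, 25)
      k = ripley_k(pts, radii, volume)
      # Under complete spatial randomness, K(r) ~ (4/3)*pi*r^3.
      csr = 4.0 / 3.0 * np.pi * radii ** 3
      print(np.c_[radii, k, csr][:5])

    Values of K(r) above the complete-spatial-randomness curve indicate clustering at scale r; values below indicate regularity.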

  13. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    ... reduce paperwork, and improve customer service. Analytical Methods for Interconnection: many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability...

  14. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    NASA Astrophysics Data System (ADS)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets has been a big issue, and not only since the introduction of the spatial data infrastructure INSPIRE. Extracting and combining spatial data from heterogeneous source formats, transforming those data to obtain the required quality for particular purposes, and loading them into a data store are common tasks. This procedure of Extraction, Transformation and Loading of data is called the ETL process. Geographic Information Systems (GIS) can take over many of these tasks, but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance because of a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identifying errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that most tasks require no or only little scripting skill, so that researchers without a programming background can also easily work with them. Investigations of ETL tools for business applications have been available for a long time. However, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis, the instrument data have been complemented by other georeferenced data provided by the local environmental authorities, including both vector and raster data on e.g. land use categories or building heights, extracted from flat files and OGC-compliant web services. The requirements on the ETL tools include, for instance, extracting different input datasets such as Web Feature Services or vector datasets and loading them into databases. The tools also have to manage transformations on spatial datasets, such as applying spatial functions (e.g. intersection, union) or changing spatial reference systems. Preliminary results suggest that many complex transformation tasks can be accomplished with the existing set of components from both software tools, while there are still many gaps in the range of available features. The two ETL tools differ in functionality and in how various steps are implemented. For some tasks no predefined components are available at all, which can partly be compensated by use of the respective API (freely configurable components in Java or JavaScript).
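    The spatial ETL steps evaluated here (format extraction, reprojection, geometric joins, database loading) can also be scripted directly when an ETL tool lacks a predefined component. A minimal sketch using the GeoPandas library, with hypothetical file names, column names and an EPSG code chosen purely for illustration:

      # Minimal spatial ETL: extract two vector datasets, transform (reproject,
      # spatially join), and load the result. Paths and columns are placeholders.
      import geopandas as gpd

      stations = gpd.read_file("air_quality_stations.geojson")   # extract
      landuse = gpd.read_file("landuse.shp")

      stations = stations.to_crs(epsg=25833)                     # transform: shared CRS
      landuse = landuse.to_crs(epsg=25833)

      # Attach the land-use category each station falls within.
      joined = gpd.sjoin(stations, landuse[["category", "geometry"]],
                         how="left", predicate="within")

      joined.to_file("stations_with_landuse.gpkg", driver="GPKG")  # load
      print(joined[["station_id", "category"]].head())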

  15. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques obtained over multiple spatial and spectral scales including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.

  16. Analytic score distributions for a spatially continuous tridirectional Monte Carlo transport problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booth, T.E.

    1996-01-01

    The interpretation of the statistical error estimates produced by Monte Carlo transport codes is still somewhat of an art. Empirically, there are variance reduction techniques whose error estimates are almost always reliable, and there are variance reduction techniques whose error estimates are often unreliable. Unreliable error estimates usually result from inadequate large-score sampling from the score distribution's tail. Statisticians believe that more accurate confidence interval statements are possible if the general nature of the score distribution can be characterized. Here, the analytic score distribution for the exponential transform applied to a simple, spatially continuous Monte Carlo transport problem is provided. Anisotropic scattering and implicit capture are included in the theory. In large part, the analytic score distributions that are derived provide the basis for the ten new statistical quality checks in MCNP.

  17. Analytic double product integrals for all-frequency relighting.

    PubMed

    Wang, Rui; Pan, Minghao; Chen, Weifeng; Ren, Zhong; Zhou, Kun; Hua, Wei; Bao, Hujun

    2013-07-01

    This paper presents a new technique for real-time relighting of static scenes with all-frequency shadows from complex lighting and highly specular reflections from spatially varying BRDFs. The key idea is to depict the boundaries of visible regions using piecewise linear functions, and convert the shading computation into double product integrals—the integral of the product of lighting and BRDF on visible regions. By representing lighting and BRDF with spherical Gaussians and approximating their product using Legendre polynomials locally in visible regions, we show that such double product integrals can be evaluated in an analytic form. Given the precomputed visibility, our technique computes the visibility boundaries on the fly at each shading point, and performs the analytic integral to evaluate the shading color. The result is a real-time all-frequency relighting technique for static scenes with dynamic, spatially varying BRDFs, which can generate more accurate shadows than the state-of-the-art real-time PRT methods.
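    Schematically, the quantity the method evaluates in closed form is the following double product integral over the visible region. The notation here (L for lighting, ρ for the BRDF slice toward the viewer, Ω_vis for the visible region with piecewise-linear boundaries) is chosen for illustration and is not taken from the paper:

      % Shading as a double product integral restricted to the visible region:
      \[
        I(\mathbf{x}, \omega_{o})
          = \int_{\Omega_{\mathrm{vis}}(\mathbf{x})}
              L(\omega)\,\rho(\omega, \omega_{o})\,\mathrm{d}\omega ,
      \]
      % with L and rho represented by spherical Gaussians and their product
      % approximated locally by Legendre polynomials, so the integral admits
      % an analytic evaluation on each piecewise-linear boundary segment.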

  18. Analytical theory for the dark-soliton interaction in nonlocal nonlinear materials with an arbitrary degree of nonlocality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kong, Qian; Wang, Q.

    2010-07-15

    We investigate theoretically the interaction of dark solitons in materials with a spatially nonlocal nonlinearity. In particular, we do this analytically and for an arbitrary degree of nonlocality. We employ the variational technique to show that nonlocality induces an attractive force in the otherwise repulsive soliton interaction.

  19. Local Spatial Obesity Analysis and Estimation Using Online Social Network Sensors.

    PubMed

    Sun, Qindong; Wang, Nan; Li, Shancang; Zhou, Hongyi

    2018-03-15

    Recently, online social networks (OSNs) have received considerable attention as a revolutionary platform that offers massive social interaction and enables users to be more involved in their own healthcare. OSNs have also prompted increasing interest in analytical data models in health informatics. This paper aims at developing an obesity identification, analysis, and estimation model in which each individual user is regarded as an online social network 'sensor' that can provide valuable health information. The OSN-based obesity analytic model requires each sensor node in an OSN to provide associated features, including dietary habit, physical activity, integral/incidental emotions, and self-consciousness. Based on detailed measurements of the correlation between obesity and the proposed features, the OSN obesity analytic model is able to estimate the obesity rate in certain urban areas, and the experimental results demonstrate a high estimation success rate. The measurement and estimation findings produced by the proposed obesity analytic model show that online social networks can be used to analyze local spatial obesity problems effectively. Copyright © 2018. Published by Elsevier Inc.

  20. A semi-analytical study of positive corona discharge in wire-plane electrode configuration

    NASA Astrophysics Data System (ADS)

    Yanallah, K.; Pontiga, F.; Chen, J. H.

    2013-08-01

    Wire-to-plane positive corona discharge in air has been studied using an analytical model with two species (electrons and positive ions). The spatial distributions of the electric field and charged species are obtained by integrating Gauss's law and the continuity equations of the species along the Laplacian field lines. The experimental values of corona current intensity and applied voltage, together with Warburg's law, have been used to formulate the boundary condition for the electron density on the corona wire. To test the accuracy of the model, the approximate electric field distribution has been compared with the exact numerical solution obtained from a finite element analysis. A parametric study of wire-to-plane corona discharge has then been undertaken using the approximate semi-analytical solutions. Thus, the spatial distributions of the electric field and charged particles have been computed for different values of the gas pressure, wire radius and electrode separation. Also, the two-dimensional distribution of ozone density has been obtained using a simplified plasma chemistry model. The approximate semi-analytical solutions can be evaluated in negligible computational time, yet provide precise estimates of the corona discharge variables.

  1. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists, the decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
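    The ranking mechanics behind PROMETHEE II can be shown in a few lines. A minimal sketch with made-up scores for three hypothetical procedures on three benefit criteria using the "usual" preference function; the paper's actual criteria, expert weights and preference functions are not reproduced here:

      # PROMETHEE II net outranking flows with the "usual" preference function:
      # P_j(a, b) = 1 if a beats b on criterion j, else 0 (benefit criteria).
      import numpy as np

      scores = np.array([[0.9, 0.2, 0.7],    # procedure A  (hypothetical)
                         [0.5, 0.8, 0.6],    # procedure B
                         [0.3, 0.9, 0.1]])   # procedure C
      weights = np.array([0.5, 0.3, 0.2])    # hypothetical weights, sum to 1

      n = len(scores)
      pi = np.zeros((n, n))                  # aggregated preference pi(a, b)
      for a in range(n):
          for b in range(n):
              if a != b:
                  pi[a, b] = weights[scores[a] > scores[b]].sum()

      phi_plus = pi.sum(axis=1) / (n - 1)    # positive (leaving) flow
      phi_minus = pi.sum(axis=0) / (n - 1)   # negative (entering) flow
      phi = phi_plus - phi_minus             # net flow: higher ranks better

      for name, f in zip("ABC", phi):
          print(f"procedure {name}: net flow {f:+.3f}")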

  2. Analytical treatment of self-phase-modulation beyond the slowly varying envelope approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syrchin, M.S.; Zheltikov, A.M.

    Analytical treatment of the self-phase-modulation of an ultrashort light pulse is extended beyond the slowly varying envelope approximation. The resulting wave equation is modified to include corrections to self-phase-modulation due to higher-order spatial and temporal derivatives. Analytical solutions are found in the limiting regimes of high nonlinearities and very short pulses. Our results reveal features that can significantly impact both pulse shape and the evolution of the phase.

  3. The Role of Motor Learning in Spatial Adaptation near a Tool

    PubMed Central

    Brown, Liana E.; Doole, Robert; Malfait, Nicole

    2011-01-01

    Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool, the passive training group received visual experience with the tool, but no motor experience, and finally, a no-training control group received neither visual nor motor experience using the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near or far from the target display. Only the active training group detected targets more quickly when the tool was placed near, rather than far, from the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented. PMID:22174944

  4. Template based rotation: A method for functional connectivity analysis with a priori templates☆

    PubMed Central

    Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.

    2014-01-01

    Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network-level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection), are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based analyses. This flexibility owes to the reduced spatial and temporal orthogonality constraints of template based rotation as compared to dual regression. These results suggest that template based rotation can provide a useful alternative to existing fcMRI analytic methods, particularly in clinical trial settings where predefined outcome measures and conserved network descriptions across groups are at a premium. PMID:25150630

  5. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    ...three possibilities: AKR, B6, and BALB_B) and MUP Protein (containing two possibilities: Intact and Denatured), then you can view a plot of the Strain... the tags for the last two labels. Again, if the attribute Strain has three tags: AKR, B6, ... AFRL-RH-WP-TR-2014-0131, A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

  6. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we have developed a suite of analytical tools to support an integrated, data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available, along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  7. Modeling space-time correlations of velocity fluctuations in wind farms

    NASA Astrophysics Data System (ADS)

    Lukassen, Laura J.; Stevens, Richard J. A. M.; Meneveau, Charles; Wilczek, Michael

    2018-07-01

    An analytical model for the streamwise velocity space-time correlations in turbulent flows is derived and applied to the special case of velocity fluctuations in large wind farms. The model is based on the Kraichnan-Tennekes random sweeping hypothesis, capturing the decorrelation in time while including a mean wind velocity in the streamwise direction. In the resulting model, the streamwise velocity space-time correlation is expressed as a convolution of the pure space correlation with an analytical temporal decorrelation kernel. Hence, the spatio-temporal structure of velocity fluctuations in wind farms can be derived from the spatial correlations only. We then explore the applicability of the model to predict spatio-temporal correlations in turbulent flows in wind farms. Comparisons of the model with data from a large eddy simulation of flow in a large, spatially periodic wind farm are performed, where needed model parameters such as spatial and temporal integral scales and spatial correlations are determined from the large eddy simulation. Good agreement is obtained between the model and large eddy simulation data showing that spatial data may be used to model the full temporal structure of fluctuations in wind farms.
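    The model's structure (a space-time correlation obtained by convolving the spatial correlation with a temporal decorrelation kernel, advected by the mean wind) is easy to prototype numerically. A minimal sketch with a Gaussian spatial correlation and Gaussian sweeping kernel chosen purely for illustration; the paper's calibrated parameters come from LES and are not reproduced here:

      # Random-sweeping-style model: R(r, tau) is the spatial correlation R0
      # convolved with a Gaussian kernel of width v_rms*tau, shifted by U*tau.
      # All parameter values below are illustrative placeholders.
      import numpy as np

      L = 100.0            # spatial integral scale (m), placeholder
      U = 8.0              # mean advection velocity (m/s), placeholder
      v_rms = 1.5          # sweeping (rms) velocity (m/s), placeholder

      r = np.linspace(-1000.0, 1000.0, 2001)
      dr = r[1] - r[0]
      R0 = np.exp(-r**2 / (2.0 * L**2))          # model spatial correlation

      def space_time_corr(tau):
          """R(r, tau): convolve R0 with sweeping kernel, then advect by U*tau."""
          sigma = max(v_rms * abs(tau), 1e-9)
          kernel = np.exp(-r**2 / (2.0 * sigma**2))
          kernel /= kernel.sum() * dr                 # normalize to unit area
          swept = np.convolve(R0, kernel, mode="same") * dr
          return np.interp(r, r + U * tau, swept)     # shift peak to r = U*tau

      for tau in (0.0, 10.0, 30.0):
          Rt = space_time_corr(tau)
          print(f"tau={tau:5.1f}s  peak={Rt.max():.3f}  at r={r[np.argmax(Rt)]:.0f} m")

    The printed output shows the two effects the abstract describes: the correlation peak is advected downstream at the mean velocity while its amplitude decays due to random sweeping.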

  8. Resolution of VTI anisotropy with elastic full-waveform inversion: theory and basic numerical examples

    NASA Astrophysics Data System (ADS)

    Podgornova, O.; Leaney, S.; Liang, L.

    2018-07-01

    Extracting medium properties from seismic data faces some limitations due to the finite frequency content of the data and restricted spatial positions of the sources and receivers. Some distributions of the medium properties make low impact on the data (including none). If these properties are used as the inversion parameters, then the inverse problem becomes overparametrized, leading to ambiguous results. We present an analysis of multiparameter resolution for the linearized inverse problem in the framework of elastic full-waveform inversion. We show that the spatial and multiparameter sensitivities are intertwined and non-sensitive properties are spatial distributions of some non-trivial combinations of the conventional elastic parameters. The analysis accounts for the Hessian information and frequency content of the data; it is semi-analytical (in some scenarios analytical), easy to interpret and enhances results of the widely used radiation pattern analysis. Single-type scattering is shown to have limited sensitivity, even for full-aperture data. Finite-frequency data lose multiparameter sensitivity at smooth and fine spatial scales. Also, we establish ways to quantify a spatial-multiparameter coupling and demonstrate that the theoretical predictions agree well with the numerical results.

  9. Spatial data analytics on heterogeneous multi- and many-core parallel architectures using python

    USGS Publications Warehouse

    Laura, Jason R.; Rey, Sergio J.

    2017-01-01

    Parallel vector spatial analysis concerns the application of parallel computational methods to facilitate vector-based spatial analysis. The history of parallel computation in spatial analysis is reviewed, and this work is placed into the broader context of high-performance computing (HPC) and parallelization research. The rise of cyber infrastructure and its manifestation in spatial analysis as CyberGIScience is seen as a main driver of renewed interest in parallel computation in the spatial sciences. Key problems in spatial analysis that have been the focus of parallel computing are covered. Chief among these are spatial optimization problems, computational geometric problems including polygonization and spatial contiguity detection, the use of Monte Carlo Markov chain simulation in spatial statistics, and parallel implementations of spatial econometric methods. Future directions for research on parallelization in computational spatial analysis are outlined.
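    As a flavor of the parallel vector-based computation surveyed here, a minimal Python sketch that splits a nearest-neighbor distance computation over a point pattern across worker processes; the data are toy placeholders, not the spatial-analysis libraries the authors discuss:

      # Chunked nearest-neighbor distances across worker processes, a common
      # pattern in parallel vector spatial analysis. Toy point data.
      import numpy as np
      from multiprocessing import Pool

      rng = np.random.default_rng(1)
      points = rng.uniform(0, 1, size=(20000, 2))   # toy point pattern

      def nn_for_chunk(idx):
          """Nearest-neighbor distance for each point index in `idx`."""
          out = np.empty(len(idx))
          for k, i in enumerate(idx):
              d = np.hypot(points[:, 0] - points[i, 0],
                           points[:, 1] - points[i, 1])
              d[i] = np.inf                          # exclude the point itself
              out[k] = d.min()
          return out

      if __name__ == "__main__":
          chunks = np.array_split(np.arange(len(points)), 8)
          with Pool(processes=4) as pool:
              parts = pool.map(nn_for_chunk, chunks)
          nn = np.concatenate(parts)
          print(f"mean nearest-neighbor distance: {nn.mean():.5f}")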

  10. Exploring hyperspectral imaging data sets with topological data analysis.

    PubMed

    Duponchel, Ludovic

    2018-02-13

    Analytical chemistry is rapidly changing. Indeed, we acquire ever more data in order to go ever further in the exploration of complex samples. Hyperspectral imaging has not escaped this trend. It quickly became a tool of choice for molecular characterisation of complex samples in many scientific domains, the main reason being that it simultaneously provides spectral and spatial information. As a result, chemometrics provided many exploration tools (PCA, clustering, MCR-ALS, etc.) well suited to such data structures at an early stage. However, we are today facing a new challenge considering the ever-increasing number of pixels in the data cubes we have to manage. The idea is therefore to introduce the new paradigm of Topological Data Analysis in order to explore hyperspectral imaging data sets, highlighting its nice properties and specific features. With this paper, we shall also point out the fact that conventional chemometric methods are often based on variance analysis or simply impose a data model which implicitly defines the geometry of the data set, and we will show that this is not always appropriate in the framework of hyperspectral imaging data set exploration. Copyright © 2017 Elsevier B.V. All rights reserved.
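    For contrast with the TDA approach, the conventional variance-based exploration the author critiques amounts to unfolding the data cube and applying PCA. A minimal sketch on a synthetic cube whose dimensions and contents are placeholders:

      # Conventional chemometric exploration of a hyperspectral cube:
      # unfold (rows, cols, bands) -> (pixels, bands), run PCA, refold scores.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(2)
      cube = rng.normal(size=(64, 64, 200))          # synthetic image cube

      pixels = cube.reshape(-1, cube.shape[-1])      # (4096, 200) spectra matrix
      pca = PCA(n_components=3)
      scores = pca.fit_transform(pixels)             # per-pixel component scores

      score_maps = scores.reshape(64, 64, 3)         # refold into score images
      print("explained variance ratios:", pca.explained_variance_ratio_)

    The implicit assumption, as the abstract notes, is that directions of maximal variance capture the chemically meaningful structure, which TDA does not require.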

  11. Ecotracer: analyzing concentration of contaminants and radioisotopes in an aquatic spatial-dynamic food web model.

    PubMed

    Walters, William J; Christensen, Villy

    2018-01-01

    Ecotracer is a tool in the Ecopath with Ecosim (EwE) software package used to simulate and analyze the transport of contaminants such as methylmercury or radiocesium through aquatic food webs. Ecotracer solves the contaminant dynamic equations simultaneously with the biomass dynamic equations in Ecosim/Ecospace. In this paper, we give a detailed description of the Ecotracer module and analyze its performance on two problems of differing complexity. Ecotracer was modified from previous versions to more accurately model contaminant excretion, and new numerical integration algorithms were implemented to increase accuracy and robustness. To test the mathematical robustness of the computational algorithm, Ecotracer was tested on a simple problem for which an analytical solution is known. These results demonstrated the effectiveness of the program numerics. A much more complex model, the release of the cesium radionuclide 137Cs from the Fukushima Dai-ichi nuclear accident, was also modeled and analyzed. A comparison of the Ecotracer results to sampled 137Cs measurements in the coastal ocean area around Fukushima shows the promise of the tool but also highlights some important limitations. Copyright © 2017 Elsevier Ltd. All rights reserved.
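    The kind of equation such a module solves alongside the biomass dynamics is a per-group contaminant balance. A deliberately simplified two-group sketch (uptake from water plus diet, minus excretion and decay), with made-up rate constants; the actual EwE formulation tracks contaminant flows through the full food web and is not reproduced here:

      # Toy contaminant balance for a prey/predator pair, Euler-integrated:
      # dC/dt = uptake_from_water + dietary_uptake - (excretion + decay) * C.
      # All rates are illustrative placeholders, not EwE parameters.
      import numpy as np

      dt, t_end = 0.01, 50.0                  # years
      steps = int(t_end / dt)
      C_water = 1.0                           # ambient concentration (arbitrary units)

      u1, u2 = 0.20, 0.05                     # direct uptake rates from water
      feed = 0.30                             # predator ingestion of prey burden
      k_ex = 0.10                             # excretion rate
      k_dec = 0.02                            # radioactive/abiotic decay rate

      C = np.zeros((steps + 1, 2))            # [prey, predator] burdens
      for n in range(steps):
          prey, pred = C[n]
          dprey = u1 * C_water - (k_ex + k_dec) * prey
          dpred = u2 * C_water + feed * prey - (k_ex + k_dec) * pred
          C[n + 1] = prey + dt * dprey, pred + dt * dpred

      print(f"burdens after {t_end:.0f} y: prey={C[-1, 0]:.3f}, predator={C[-1, 1]:.3f}")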

  12. An improved method for estimating capillary pressure from 3D microtomography images and its application to the study of disconnected nonwetting phase

    NASA Astrophysics Data System (ADS)

    Li, Tianyi; Schlüter, Steffen; Dragila, Maria Ines; Wildenschild, Dorthe

    2018-04-01

    We present an improved method for estimating interfacial curvatures from x-ray computed microtomography (CMT) data that significantly advances the potential for this tool to unravel the mechanisms and phenomena associated with multi-phase fluid motion in porous media. CMT data, used to analyze the spatial distribution and capillary pressure-saturation (Pc-S) relationships of liquid phases, requires accurate estimates of interfacial curvature. Our improved method for curvature estimation combines selective interface modification and distance weighting approaches. It was verified against synthetic (analytical computer-generated) and real image data sets, demonstrating a vast improvement over previous methods. Using this new tool on a previously published data set (multiphase flow) yielded important new insights regarding the pressure state of the disconnected nonwetting phase during drainage and imbibition. The trapped and disconnected non-wetting phase delimits its own hysteretic Pc-S curve that inhabits the space within the main hysteretic Pc-S loop of the connected wetting phase. Data suggests that the pressure of the disconnected, non-wetting phase is strongly modified by the pore geometry rather than solely by the bulk liquid phase that surrounds it.

  13. Modeling of the Global Water Cycle - Analytical Models

    Treesearch

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  14. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  15. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  16. Stability of equilibrium solutions of Hamiltonian systems with n-degrees of freedom and single resonance in the critical case

    NASA Astrophysics Data System (ADS)

    dos Santos, Fabio; Vidal, Claudio

    2018-04-01

    In this paper we give new results for the stability of one equilibrium solution of an autonomous analytic Hamiltonian system in a neighborhood of the equilibrium point with n-degrees of freedom. Our Main Theorem generalizes several results existing in the literature, and mainly we give information in the critical cases (i.e., where the condition for stability or instability is not fulfilled). In particular, our Main Theorem provides necessary and sufficient conditions for stability of the equilibrium solutions under the existence of a single resonance. Using tools analogous to those used in the Main Theorem for the critical case, we study the stability or instability of degenerate equilibrium points in Hamiltonian systems with one degree of freedom. We apply our results to the stability of Hamiltonians of the type arising in cosmological models, in both the planar and the spatial cases.
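
    For orientation on the one-degree-of-freedom case mentioned above, recall the classical criterion for a natural Hamiltonian; this is standard background, not the paper's general n-degree-of-freedom theorem:

    $$ H(x,y) = \tfrac{1}{2}y^2 + V(x), \qquad V'(x_0) = 0, $$

    where the equilibrium (x_0, 0) is Lyapunov stable precisely when x_0 is an isolated local minimum of V. At a degenerate equilibrium (V''(x_0) = 0), stability is decided by the first non-vanishing higher derivative: V^(2k)(x_0) > 0 gives stability, and instability follows otherwise.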

  17. Investigation of burn effect on skin using simultaneous Raman-Brillouin spectroscopy, and fluorescence microspectroscopy

    NASA Astrophysics Data System (ADS)

    Coker, Zachary; Meng, Zhaokai; Troyanova-Wood, Maria; Traverso, Andrew; Ballmann, Charles; Petrov, Georgi; Ibey, Bennett L.; Yakovlev, Vladislav

    2017-02-01

    Burns are thermal injuries that can completely destroy, or at least compromise, the protective function of skin and affect the ability of tissues to manage moisture. Burn-damaged tissues exhibit lower elasticity than healthy tissues, due to significantly reduced water concentrations and plasma retention. Current methods for determining burn intensity are limited to visual inspection and, potentially, hospital X-ray examination. We present a unique confocal microscope capable of measuring Raman and Brillouin spectra simultaneously, with concurrent fluorescence investigation from a single spatial location, and demonstrate its application by investigating and characterizing the properties of burn-afflicted tissue on a chicken-skin model. Raman and Brillouin scattering offer complementary information about a material's chemical and mechanical structure, while fluorescence can serve as a useful diagnostic indicator and imaging tool. The developed instrument has potential for very diverse analytical applications in basic biomedical science and in biomedical diagnostics and imaging.

  18. What changed during the axial age: Cognitive styles or reward systems?

    PubMed Central

    Baumard, Nicolas; Hyafil, Alexandre; Boyer, Pascal

    2015-01-01

    The ‘Axial Age’ (500–300 BCE) refers to the period during which most of the main religious and spiritual traditions emerged in Eurasian societies. Although the Axial Age has recently been the focus of increasing interest [1-5], its existence is still very much in dispute. The main reason for questioning the existence of the Axial Age is that its nature, as well as its spatial and temporal boundaries, remain very much unclear. The standard approach to the Axial Age defines it as a change of cognitive style, from a narrative and analogical style to a more analytical and reflective style, probably due to the increasing use of external memory tools. Our recent research suggests an alternative hypothesis, namely a change in reward orientation, from a short-term materialistic orientation to a long-term spiritual one [6]. Here, we briefly discuss these two alternative definitions of the Axial Age. PMID:27066164

  19. Sample Preparation for Mass Spectrometry Imaging of Plant Tissues: A Review

    PubMed Central

    Dong, Yonghui; Li, Bin; Malitsky, Sergey; Rogachev, Ilana; Aharoni, Asaph; Kaftan, Filip; Svatoš, Aleš; Franceschi, Pietro

    2016-01-01

    Mass spectrometry imaging (MSI) is a mass spectrometry based molecular ion imaging technique. It provides the means for ascertaining the spatial distribution of a large variety of analytes directly on tissue sample surfaces without any labeling or staining agents. These advantages make it an attractive molecular histology tool in medical, pharmaceutical, and biological research. Likewise, MSI has started gaining popularity in plant sciences; yet, information regarding sample preparation methods for plant tissues is still limited. Sample preparation is a crucial step that is directly associated with the quality and authenticity of the imaging results; it therefore demands in-depth studies based on the characteristics of plant samples. In this review, a sample preparation pipeline is discussed in detail and illustrated through selected practical examples. In particular, special concerns regarding sample preparation for plant imaging are critically evaluated. Finally, the applications of MSI techniques in plants are reviewed according to different classes of plant metabolites. PMID:26904042

  20. Mass Spectrometry Analyses of Multicellular Tumor Spheroids.

    PubMed

    Acland, Mitchell; Mittal, Parul; Lokman, Noor A; Klingler-Hoffmann, Manuela; Oehler, Martin K; Hoffmann, Peter

    2018-05-01

    Multicellular tumor spheroids (MCTS) are a powerful biological in vitro model, which closely mimics the 3D structure of primary avascularized tumors. Mass spectrometry (MS) has established itself as a powerful analytical tool, not only to better understand and describe the complex structure of MCTS, but also to monitor their response to cancer therapeutics. The first part of this review focuses on traditional mass spectrometry approaches, with an emphasis on elucidating the molecular characteristics of these structures. Then the mass spectrometry imaging (MSI) approaches used to obtain spatially defined information from MCTS are described. Finally, the analysis of primary spheroids, such as those present in ovarian cancer, is discussed, along with the great potential that mass spectrometry analysis of these structures holds for improved understanding of cancer progression and for personalized in vitro therapeutic testing. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Atomic force microscopy and spectroscopy to probe single membrane proteins in lipid bilayers.

    PubMed

    Sapra, K Tanuj

    2013-01-01

    The atomic force microscope (AFM) has opened vast avenues hitherto inaccessible to the biological scientist. The high temporal (millisecond) and spatial (nanometer) resolutions of the AFM are suited for studying many biological processes in their native conditions. The AFM cantilever stylus is aptly termed a "lab on a tip" owing to its versatility as an imaging tool as well as a handle to manipulate single bonds and proteins. Recent examples assert that the AFM can be used to study the mechanical properties and monitor processes of single proteins and single cells, thus affording insight into important mechanistic details. This chapter specifically focuses on practical and analytical protocols of single-molecule AFM methodologies related to high-resolution imaging and single-molecule force spectroscopy of membrane proteins. Both techniques are operator-dependent and require specialized working knowledge of the instrument as well as theoretical and practical skills.

  2. The Common Evolution of Geometry and Architecture from a Geodetic Point of View

    NASA Astrophysics Data System (ADS)

    Bellone, T.; Fiermonte, F.; Mussio, L.

    2017-05-01

    Throughout history the link between geometry and architecture has been strong, and while architects have used mathematics to construct their buildings, geometry has always been the essential tool allowing them to choose spatial shapes that are aesthetically appropriate. Sometimes it is geometry that drives architectural choices; at other times it is architectural innovation that facilitates the emergence of new ideas in geometry. Among the best-known types of geometry (Euclidean, projective, analytical, topological, descriptive, fractal, …), those most frequently employed in architectural design are Euclidean geometry, projective geometry, and the non-Euclidean geometries. Entire architectural periods are linked to specific types of geometry. Euclidean geometry, for example, was the basis for architectural styles from Antiquity through to the Romanesque period. Perspective and projective geometry, for their part, were important from the Gothic period through the Renaissance and into the Baroque and Neo-classical eras, while non-Euclidean geometries characterize modern architecture.

  3. Abstract - Cooperative Research and Development Agreement between Environmental Defense Fund and National Energy Technology Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Kelly K.; Zavala-Zraiza, Daniel

    Here, we summarize an effort to develop a global oil and gas infrastructure (GOGI) taxonomy and geodatabase, using a combination of big data computing, custom search and data integration algorithms, and expert driven spatio-temporal analytics to identify, access, and evaluate open oil and gas data resources and uncertainty trends worldwide. This approach leveraged custom National Energy Technology Laboratory (NETL) tools and capabilities in collaboration with Environmental Defense Fund (EDF) and Carbon Limits subject matter expertise, to identify over 380 datasets and integrate more than 4.8 million features into the GOGI database. In addition to acquisition of open oil and gas infrastructure data, information was collected and analyzed to assess the spatial, temporal, and source quality of these resources, and estimate their completeness relative to the top 40 hydrocarbon producing and consuming countries.

  4. Wildfire risk as a socioecological pathology

    USGS Publications Warehouse

    Fischer, A. Paige; Spies, Thomas A; Steelman, Toddi A; Moseley, Cassandra; Johnson, Bart R.; Bailey, John D.; Ager, Alan A; Bourgeron, Patrick S.; Charnley, Susan; Collins, Brandon M.; Kline, Jeffrey D; Leahy, Jessica E; Littell, Jeremy; Millington, James D. A.; Nielsen-Pincus, Max; Olsen, Christine S; Paveglio, Travis B; Roos, Christopher I.; Steen-Adams, Michelle M; Stevens, Forrest R; Vukomanovic, Jelena; White, Eric M; Bowman, David M J S

    2016-01-01

    Wildfire risk in temperate forests has become a nearly intractable problem that can be characterized as a socioecological “pathology”: that is, a set of complex and problematic interactions among social and ecological systems across multiple spatial and temporal scales. Assessments of wildfire risk could benefit from recognizing and accounting for these interactions in terms of socioecological systems, also known as coupled natural and human systems (CNHS). We characterize the primary social and ecological dimensions of the wildfire risk pathology, paying particular attention to the governance system around wildfire risk, and suggest strategies to mitigate the pathology through innovative planning approaches, analytical tools, and policies. We caution that even with a clear understanding of the problem and possible solutions, the system by which human actors govern fire-prone forests may evolve incrementally in imperfect ways and can be expected to resist change even as we learn better ways to manage CNHS.

  5. Application of statistical classification methods for predicting the acceptability of well-water quality

    NASA Astrophysics Data System (ADS)

    Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.

    2018-06-01

    The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
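
    The workflow described above lends itself to a compact sketch. The snippet below is a hedged illustration, not the authors' code: it trains a generic classifier to predict chloride exceedance from hypothetical well attributes on synthetic data; the feature names, the 250 mg/L limit, and the model choice are all assumptions.

    ```python
    # Minimal sketch of classifying well-water acceptability (synthetic data).
    # Feature names, the 250 mg/L chloride limit, and the model choice are
    # illustrative assumptions, not taken from the paper.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.uniform(0, 10, n),      # easting (km)
        rng.uniform(0, 10, n),      # northing (km)
        rng.uniform(20, 200, n),    # well depth (m)
    ])
    # Synthetic chloride field: saline influence grows toward one side.
    chloride = 50 + 40 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(0, 30, n)
    y = (chloride > 250).astype(int)  # 1 = unfit for the intended use

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    ```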

  6. Web-Based Geographic Information System Tool for Accessing Hanford Site Environmental Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triplett, Mark B.; Seiple, Timothy E.; Watson, David J.

    Data volume, complexity, and access issues pose severe challenges for analysts, regulators and stakeholders attempting to efficiently use legacy data to support decision making at the U.S. Department of Energy’s (DOE) Hanford Site. DOE has partnered with the Pacific Northwest National Laboratory (PNNL) on the PHOENIX (PNNL-Hanford Online Environmental Information System) project, which seeks to address data access, transparency, and integration challenges at Hanford to provide effective decision support. PHOENIX is a family of spatially-enabled web applications providing quick access to decades of valuable scientific data and insight through intuitive query, visualization, and analysis tools. PHOENIX realizes broad, public accessibility by relying only on ubiquitous web-browsers, eliminating the need for specialized software. It accommodates a wide range of users with intuitive user interfaces that require little or no training to quickly obtain and visualize data. Currently, PHOENIX is actively hosting three applications focused on groundwater monitoring, groundwater clean-up performance reporting, and in-tank monitoring. PHOENIX-based applications are being used to streamline investigative and analytical processes at Hanford, saving time and money. But more importantly, by integrating previously isolated datasets and developing relevant visualization and analysis tools, PHOENIX applications are enabling DOE to discover new correlations hidden in legacy data, allowing them to more effectively address complex issues at Hanford.

  7. Development and Evaluation of a Web Map Mind Tool Environment with the Theory of Spatial Thinking and Project-Based Learning Strategy

    ERIC Educational Resources Information Center

    Hou, Huei-Tse; Yu, Tsai-Fang; Wu, Yi-Xuan; Sung, Yao-Ting; Chang, Kuo-En

    2016-01-01

    The theory of spatial thinking is relevant to the learning and teaching of many academic domains. One promising method to facilitate learners' higher-order thinking is to utilize a web map mind tool to assist learners in applying spatial thinking to cooperative problem solving. In this study, an environment is designed based on the theory of…

  8. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
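
    The core abstraction described above, arrays of values keyed to genomic positions, can be sketched independently of Plastid itself. The following is a hedged toy illustration of that idiom in plain numpy; it deliberately does not use or mimic Plastid's actual API.

    ```python
    # Toy position-array idiom: per-nucleotide counts over a chromosome,
    # queried through a spliced "transcript" (a list of exon intervals).
    # This illustrates the concept only; it is not Plastid's API.
    import numpy as np

    chrom_counts = np.zeros(10_000, dtype=int)       # one slot per nucleotide
    alignments = [(1200, 1230), (1510, 1540), (1515, 1545)]  # read start/end
    for start, end in alignments:
        chrom_counts[start:end] += 1                 # naive full-read mapping

    exons = [(1190, 1260), (1500, 1560)]             # a spliced feature
    transcript_profile = np.concatenate(
        [chrom_counts[s:e] for s, e in exons]        # genome -> feature coords
    )
    print("feature length:", transcript_profile.size)
    print("total counts over feature:", transcript_profile.sum())
    ```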

  9. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  10. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  11. Crosstalk and transitions between multiple spatial maps in an attractor neural network model of the hippocampus: Collective motion of the activity

    NASA Astrophysics Data System (ADS)

    Monasson, R.; Rosay, S.

    2014-03-01

    The dynamics of a neural model for hippocampal place cells storing spatial maps is studied. In the absence of external input, depending on the number of cells and on the values of control parameters (number of environments stored, level of neural noise, average level of activity, connectivity of place cells), a "clump" of spatially localized activity can diffuse or remain pinned due to crosstalk between the environments. In the single-environment case, the macroscopic coefficient of diffusion of the clump and its effective mobility are calculated analytically from first principles and corroborated by numerical simulations. In the multienvironment case the heights and the widths of the pinning barriers are analytically characterized with the replica method; diffusion within one map is then in competition with transitions between different maps. Possible mechanisms enhancing mobility are proposed and tested.
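
    As a hedged methodological aside, the diffusion coefficient of such a clump can be estimated numerically from its trajectory via the mean-squared displacement. The sketch below does this for a generic one-dimensional random walk standing in for the clump's center of mass; the dynamics here are illustrative, not the paper's attractor-network model.

    ```python
    # Estimate a diffusion coefficient from a trajectory via the MSD.
    # The random-walk trajectory stands in for the clump's center of mass;
    # it is a generic illustration, not the paper's network dynamics.
    import numpy as np

    rng = np.random.default_rng(1)
    dt, steps, D_true = 0.01, 100_000, 0.5
    x = np.cumsum(rng.normal(0.0, np.sqrt(2 * D_true * dt), steps))

    lags = np.arange(1, 200)
    msd = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])
    # In 1-D, MSD(t) = 2 D t, so D is half the slope of MSD vs. time.
    D_est = np.polyfit(lags * dt, msd, 1)[0] / 2
    print(f"true D = {D_true}, estimated D = {D_est:.3f}")
    ```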

  12. Fresnel coefficients and Fabry-Perot formula for spatially dispersive metallic layers

    NASA Astrophysics Data System (ADS)

    Pitelet, Armel; Mallet, Émilien; Centeno, Emmanuel; Moreau, Antoine

    2017-07-01

    The repulsion between free electrons inside a metal makes its optical response spatially dispersive, so that it is not described by Drude's model but by a hydrodynamic model. We give here fully analytic results for a metallic slab in this framework, thanks to a two-mode cavity formalism leading to a Fabry-Perot formula, and show that a simplification can be made that preserves the accuracy of the results while allowing much simpler analytic expressions. For metallic layers thicker than 2.7 nm modified Fresnel coefficients can actually be used to accurately predict the response of any multilayer with spatially dispersive metals (for reflection, transmission, or the guided modes). Finally, this explains why adding a small dielectric layer [Y. Luo et al., Phys. Rev. Lett. 111, 093901 (2013), 10.1103/PhysRevLett.111.093901] allows one to reproduce the effects of nonlocality in many cases, and especially for multilayers.
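
    For orientation, the classical single-mode Fabry-Perot reflection coefficient of a slab of thickness d reads

    $$ r = \frac{r_{12} + r_{23}\, e^{2 i k_z d}}{1 + r_{12}\, r_{23}\, e^{2 i k_z d}}, $$

    where r_{12} and r_{23} are the interface Fresnel coefficients and k_z is the normal wave-vector component inside the layer. This is only the standard local-response baseline; the paper's contribution is a two-mode generalization of this formula that also accounts for the longitudinal, spatially dispersive mode.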

  13. Visualizing statistical significance of disease clusters using cartograms.

    PubMed

    Kronenfeld, Barry J; Wong, David W S

    2017-05-15

    Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, there are no existing guidelines for the visual assessment of statistical uncertainty. To address this shortcoming, we develop techniques for the visual determination of the statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference for aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of the statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and of the appropriate framework for visually assessing the statistical significance of spatial clusters.
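
    To make the core idea concrete: on a density-equalizing cartogram the at-risk population of a region is proportional to its drawn area, so the expected case count under the null hypothesis follows directly from area, and a Poisson test gives significance. The sketch below is a hedged illustration; the rate, area, and population-per-unit-area constant are invented, not taken from the paper.

    ```python
    # Hedged sketch: Poisson significance of a disease cluster on a cartogram,
    # where drawn area is proportional to population at risk. All numbers are
    # invented for illustration.
    from scipy.stats import poisson

    background_rate = 12 / 100_000       # cases per person (null hypothesis)
    people_per_unit_area = 50_000        # cartogram scaling constant (assumed)

    cluster_area = 4.0                   # area of the candidate cluster
    observed_cases = 45

    expected = background_rate * people_per_unit_area * cluster_area
    p_value = poisson.sf(observed_cases - 1, expected)  # P(X >= observed)
    print(f"expected {expected:.1f} cases, observed {observed_cases}, "
          f"p = {p_value:.4f}")
    ```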

  14. Assessment of uncertainty in the numerical simulation of solar irradiance over inclined PV panels: New algorithms using measurements and modeling tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit; Dooraghi, Mike

    Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using the clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances of various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaics (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study have opened the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.

  15. Assessment of uncertainty in the numerical simulation of solar irradiance over inclined PV panels: New algorithms using measurements and modeling tools

    DOE PAGES

    Xie, Yu; Sengupta, Manajit; Dooraghi, Mike

    2018-03-20

    Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using the clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances of various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaics (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study have opened the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.
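
    As a hedged reference point for what a transposition model computes, the classic isotropic-sky model, one of the simpler model families evaluated here, combines beam, sky-diffuse, and ground-reflected components. The snippet below implements that textbook formula with invented inputs; it is standard background, not the paper's new algorithms.

    ```python
    # Textbook isotropic-sky transposition of irradiance to a tilted plane.
    # This is the classical Liu-Jordan-style baseline, not the paper's new
    # algorithms; all input values are invented for illustration.
    import math

    def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
        tilt = math.radians(tilt_deg)
        aoi = math.radians(aoi_deg)
        beam = dni * max(math.cos(aoi), 0.0)              # direct on the plane
        sky = dhi * (1 + math.cos(tilt)) / 2              # isotropic sky diffuse
        ground = ghi * albedo * (1 - math.cos(tilt)) / 2  # ground-reflected
        return beam + sky + ground

    # Example: clear mid-morning values (W/m^2), 30-degree tilt, 40-degree AOI.
    print(f"POA = {poa_isotropic(700.0, 120.0, 650.0, 30.0, 40.0):.1f} W/m^2")
    ```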

  16. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  17. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to obtain optimum performance in hard turning. Various models of hard turning with a cubic boron nitride tool have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be used according to user requirements in hard turning.
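
    For concreteness, Usui's wear-rate model mentioned above is commonly quoted in the Arrhenius-type form (this is the standard textbook statement, given here as background rather than taken from the paper; A and B are calibration constants for the tool-workpiece pair):

    $$ \frac{dW}{dt} = A\, \sigma_n V_s \exp\!\left(-\frac{B}{\theta}\right), $$

    where σ_n is the normal stress on the tool face, V_s the sliding velocity, and θ the absolute interface temperature.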

  18. Substrate-Mediated Laser Ablation under Ambient Conditions for Spatially-Resolved Tissue Proteomics

    PubMed Central

    Fatou, Benoit; Wisztorski, Maxence; Focsa, Cristian; Salzet, Michel; Ziskind, Michael; Fournier, Isabelle

    2015-01-01

    Numerous applications of ambient Mass Spectrometry (MS) have been demonstrated over the past decade. They promoted the emergence of various micro-sampling techniques such as Laser Ablation/Droplet Capture (LADC). LADC consists of the ablation of analytes from a surface and their subsequent capture in a solvent droplet, which can then be analyzed by MS. LADC is generally performed in the UV or IR range, using a wavelength at which the analytes or the matrix absorb. In this work, we explore the potential of visible-range LADC (532 nm) as a micro-sampling technology for large-scale proteomics analyses. We demonstrate that biomolecule analyses using 532 nm LADC are possible, despite the low absorbance of biomolecules at this wavelength. This is due to the preponderance of an indirect substrate-mediated ablation mechanism at low laser energy, which contrasts with the conventional direct ablation driven by sample absorption. Using our custom LADC system and taking advantage of this substrate-mediated ablation mechanism, we were able to perform large-scale proteomic analyses of micro-sampled tissue sections and demonstrated the possible identification of proteins with relevant biological functions. Consequently, the 532 nm LADC technique offers a new tool for biological and clinical applications. PMID:26674367

  19. Combining Laser Ablation/Liquid Phase Collection Surface Sampling and High-Performance Liquid Chromatography Electrospray Ionization Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovchinnikova, Olga S; Kertesz, Vilmos; Van Berkel, Gary J

    This paper describes the coupling of ambient pressure transmission geometry laser ablation with a liquid phase sample collection method for surface sampling and ionization with subsequent mass spectral analysis. A commercially available autosampler was adapted to produce a liquid droplet at the end of the syringe injection needle while in close proximity to the surface to collect the sample plume produced by laser ablation. The sample collection was followed by either flow injection or a high performance liquid chromatography (HPLC) separation of the extracted components and detection with electrospray ionization mass spectrometry (ESI-MS). To illustrate the analytical utility of this coupling, thin films of a commercial ink sample containing rhodamine 6G and of mixed isobaric rhodamine B and 6G dyes on glass microscope slides were analyzed. The flow injection and HPLC/ESI-MS analysis revealed successful laser ablation, capture and, with HPLC, the separation of the two compounds. The ablated circular area was about 70 μm in diameter for these experiments. The spatial sampling resolution afforded by the laser ablation, as well as the ability to use sample processing methods like HPLC between the sample collection and ionization steps, makes this combined surface sampling/ionization technique a highly versatile analytical tool.

  20. An analytical approach to the CMB polarization in a spatially closed background

    NASA Astrophysics Data System (ADS)

    Niazy, Pedram; Abbassi, Amir H.

    2018-03-01

    The scalar mode polarization of the cosmic microwave background is derived in a spatially closed universe from the Boltzmann equation using the line of sight integral method. The EE and TE multipole coefficients have been extracted analytically by making some tolerable approximations, such as treating the evolution of perturbations hydrodynamically and assuming a sudden transition from opacity to transparency at the time of last scattering. As the major advantage of analytic expressions, C^S_{EE,ℓ} and C_{TE,ℓ} explicitly show the dependencies on the baryon density Ω_B, matter density Ω_M, curvature Ω_K, primordial spectral index n_s, primordial power spectrum amplitude A_s, optical depth τ_reion, recombination width σ_t and recombination time t_L. Using a realistic set of cosmological parameters taken from a fit to data from Planck, the closed-universe EE and TE power spectra in the scalar mode are compared with numerical results from the CAMB code and also with the latest observational data. The analytic results agree with the numerical ones on large and moderate scales. The peak positions are in good agreement with the numerical results on these scales, while the peak heights agree to within 20%, owing to the approximations made in the derivation. Also, several interesting properties of CMB polarization are revealed by the analytic spectra.

  1. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, supporting real-time co-design; tracking how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and capturing and retrieving collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  2. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data.

    PubMed

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H

    2012-11-06

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging has high potential to support image-based computer aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response times, which faces two major challenges: the "big data" challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce.
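
    The partition-merge idea described above can be miniaturized outside MapReduce. The sketch below is a hedged toy: it assigns bounding boxes to grid tiles (the "map" side), joins within each tile (the "reduce" side), and deduplicates pairs that straddle tile boundaries. It illustrates the approach only, not the authors' system.

    ```python
    # Toy partition-merge spatial join: grid partition ("map"), per-tile
    # bounding-box join ("reduce"), then de-duplication of cross-tile pairs.
    # Illustrates the approach only; not the paper's MapReduce framework.
    from collections import defaultdict
    from itertools import product

    def tiles(box, size):
        x0, y0, x1, y1 = box
        return product(range(int(x0 // size), int(x1 // size) + 1),
                       range(int(y0 // size), int(y1 // size) + 1))

    def overlaps(a, b):
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    def spatial_join(boxes_a, boxes_b, tile_size=10.0):
        grid = defaultdict(lambda: ([], []))
        for i, box in enumerate(boxes_a):          # map: scatter to tiles
            for t in tiles(box, tile_size):
                grid[t][0].append(i)
        for j, box in enumerate(boxes_b):
            for t in tiles(box, tile_size):
                grid[t][1].append(j)
        pairs = set()                              # merge: join inside tiles
        for ids_a, ids_b in grid.values():
            for i in ids_a:
                for j in ids_b:
                    if overlaps(boxes_a[i], boxes_b[j]):
                        pairs.add((i, j))          # set() dedupes straddlers
        return pairs

    nuclei = [(0, 0, 3, 3), (12, 12, 15, 15)]
    vessels = [(2, 2, 5, 5), (14, 14, 18, 18)]
    print(spatial_join(nuclei, vessels))           # {(0, 0), (1, 1)}
    ```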

  3. Towards Building a High Performance Spatial Query System for Large Scale Medical Imaging Data

    PubMed Central

    Aji, Ablimit; Wang, Fusheng; Saltz, Joel H.

    2013-01-01

    Support of high performance queries on large volumes of scientific spatial data is becoming increasingly important in many applications. This growth is driven not only by geospatial problems in numerous fields, but also by emerging scientific applications that are increasingly data- and compute-intensive. For example, digital pathology imaging has become an emerging field during the past decade, where examination of high resolution images of human tissue specimens enables more effective diagnosis, prediction and treatment of diseases. Systematic analysis of large-scale pathology images generates tremendous amounts of spatially derived quantifications of micro-anatomic objects, such as nuclei, blood vessels, and tissue regions. Analytical pathology imaging has high potential to support image-based computer aided diagnosis. One major requirement for this is effective querying of such enormous amounts of data with fast response times, which faces two major challenges: the “big data” challenge and high computational complexity. In this paper, we present our work towards building a high performance spatial query system for querying massive spatial data on MapReduce. Our framework takes an on-demand index building approach for processing spatial queries and a partition-merge approach for building parallel spatial query pipelines, which fits nicely with the computing model of MapReduce. We demonstrate our framework on supporting multi-way spatial joins for algorithm evaluation and nearest neighbor queries for microanatomic objects. To reduce query response time, we propose cost based query optimization to mitigate the effect of data skew. Our experiments show that the framework can efficiently support complex analytical spatial queries on MapReduce. PMID:24501719

  4. Spatial Characterization of Radio Propagation Channel in Urban Vehicle-to-Infrastructure Environments to Support WSNs Deployment

    PubMed Central

    Granda, Fausto; Azpilicueta, Leyre; Vargas-Rosales, Cesar; Lopez-Iturri, Peio; Aguirre, Erik; Astrain, Jose Javier; Villandangos, Jesus; Falcone, Francisco

    2017-01-01

    Vehicular ad hoc Networks (VANETs) enable vehicles to communicate with each other as well as with roadside units (RSUs). Although there is a significant research effort in radio channel modeling focused on vehicle-to-vehicle (V2V) communication, not much work has been done for vehicle-to-infrastructure (V2I) communication using 3D ray-tracing tools. This work evaluates some important parameters of a V2I wireless channel link, such as large-scale path loss and multipath metrics, in a typical urban scenario using a deterministic simulation model based on an in-house 3D Ray-Launching (3D-RL) algorithm at 5.9 GHz. Results show the strong impact that spatial distance, link frequency, placement of RSUs, and factors such as roundabouts and the geometry and relative position of obstacles have on the V2I propagation channel. A detailed spatial path loss characterization of the V2I channel along the streets and avenues is presented. The 3D-RL results show high accuracy when compared with measurements, and represent the propagation phenomena more reliably than analytical path loss models. Performance metrics for a real test scenario, implemented ad hoc with a VANET wireless sensor network, are also described. These results constitute a starting point in the design phase of Wireless Sensor Network (WSN) radio-planning for urban V2I deployment in terms of coverage. PMID:28590429

  5. Spatial Characterization of Radio Propagation Channel in Urban Vehicle-to-Infrastructure Environments to Support WSNs Deployment.

    PubMed

    Granda, Fausto; Azpilicueta, Leyre; Vargas-Rosales, Cesar; Lopez-Iturri, Peio; Aguirre, Erik; Astrain, Jose Javier; Villandangos, Jesus; Falcone, Francisco

    2017-06-07

    Vehicular ad hoc Networks (VANETs) enable vehicles to communicate with each other as well as with roadside units (RSUs). Although there is a significant research effort in radio channel modeling focused on vehicle-to-vehicle (V2V) communication, not much work has been done for vehicle-to-infrastructure (V2I) communication using 3D ray-tracing tools. This work evaluates some important parameters of a V2I wireless channel link, such as large-scale path loss and multipath metrics, in a typical urban scenario using a deterministic simulation model based on an in-house 3D Ray-Launching (3D-RL) algorithm at 5.9 GHz. Results show the strong impact that spatial distance, link frequency, placement of RSUs, and factors such as roundabouts and the geometry and relative position of obstacles have on the V2I propagation channel. A detailed spatial path loss characterization of the V2I channel along the streets and avenues is presented. The 3D-RL results show high accuracy when compared with measurements, and represent the propagation phenomena more reliably than analytical path loss models. Performance metrics for a real test scenario, implemented ad hoc with a VANET wireless sensor network, are also described. These results constitute a starting point in the design phase of Wireless Sensor Network (WSN) radio-planning for urban V2I deployment in terms of coverage.
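
    For context on the analytical path loss models used as the comparison baseline, the standard log-distance model predicts mean path loss from distance alone. The sketch below implements it with assumed parameter values; it is generic background, not the paper's 3D-RL algorithm.

    ```python
    # Log-distance path loss baseline: PL(d) = PL(d0) + 10*n*log10(d/d0).
    # The exponent and reference values are assumed for illustration; this is
    # the analytical baseline class, not the paper's 3D ray-launching model.
    import math

    def path_loss_db(d_m, n=2.7, d0_m=1.0, f_hz=5.9e9, c=3e8):
        pl_d0 = 20 * math.log10(4 * math.pi * d0_m * f_hz / c)  # free space at d0
        return pl_d0 + 10 * n * math.log10(d_m / d0_m)

    for d in (10, 50, 100, 200):
        print(f"{d:4d} m -> {path_loss_db(d):6.1f} dB")
    ```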

  6. Gambling score in earthquake prediction analysis

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.

    2011-03-01

    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has been recently suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of the prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.
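
    To illustrate the scoring idea, one common formulation in the gambling-score literature (quoted here as an assumption, not necessarily the exact parametrization used in this paper) credits a successful alarm in a bin with reference probability p0 with (1 − p0)/p0 points and debits 1 point for a false alarm, so that a trivial random strategy earns nothing on average. A hedged toy with invented numbers:

    ```python
    # Toy gambling score: reward (1 - p0)/p0 for a hit, -1 for a false alarm,
    # where p0 is the reference probability of the target event in each alarm
    # bin. Both the formulation and the numbers are illustrative assumptions.
    def gambling_score(alarms):
        """alarms: list of (p0, hit) pairs, one per declared alarm."""
        return sum((1 - p0) / p0 if hit else -1.0 for p0, hit in alarms)

    declared = [(0.01, True), (0.02, False), (0.05, True), (0.01, False)]
    print(f"gambling score = {gambling_score(declared):.1f}")  # 116.0
    ```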

  7. Spatial Approaches for Ecological Screening and Exposure Assessment of Chemicals and Radionclides

    EPA Science Inventory

    This presentation details a tool, SADA, available for use in environmental assessments of chemicals that can also be used for radiological assessments of the environment. Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from e...

  8. New integrable models and analytical solutions in f (R ) cosmology with an ideal gas

    NASA Astrophysics Data System (ADS)

    Papagiannopoulos, G.; Basilakos, Spyros; Barrow, John D.; Paliathanasis, Andronikos

    2018-01-01

    In the context of f (R ) gravity with a spatially flat FLRW metric containing an ideal fluid, we use the method of invariant transformations to specify families of models which are integrable. We find three families of f (R ) theories for which new analytical solutions are given and closed-form solutions are provided.
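
    As background for this setup (standard definitions, not results specific to the paper), the metric f(R) action and the corresponding Friedmann-type constraint for a spatially flat FLRW metric with a fluid of energy density ρ are

    $$ S = \frac{1}{2\kappa}\int d^4x\, \sqrt{-g}\, f(R) + S_m, \qquad 3 F H^2 = \kappa \rho + \frac{F R - f}{2} - 3 H \dot F, $$

    with F ≡ df/dR; the system closes once an equation of state for the fluid (here, an ideal gas) is specified.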

  9. CD control with defect inspection: you can teach an old dog a new trick

    NASA Astrophysics Data System (ADS)

    Utzny, Clemens; Ullrich, Albrecht; Heumann, Jan; Mohn, Elias; Meusemann, Stefan; Seltmann, Rolf

    2012-11-01

    Achieving the required critical dimensions (CD) with the best possible uniformity (CDU) on photomasks has always played a pivotal role in enabling chip technology. Current control strategies are based on scanning electron microscopy (SEM) measurements, implying a sparse spatial resolution on the order of ~10⁻² m to 10⁻¹ m. A higher spatial resolution could be reached with an adequate measurement sampling; however, the increase in the number of measurements makes this approach unfeasible in a production environment. With the advent of more powerful defect inspection tools, a significantly higher spatial resolution of 10⁻⁴ m can be achieved by also measuring CD during the regular defect inspection. This method is not limited to the measurement of specific metrology features, thus paving the way to a CD assessment of all electrically relevant mask patterns. Enabling such a CD measurement opens new realms of CD control. Deterministic short-range CD effects that were previously interpreted as noise can be resolved and addressed by CD compensation methods. This in turn can lead to substantial improvements in CD uniformity. Thus, defect-inspection-mediated CD control closes a substantial gap in the mask manufacturing process by allowing the control of short-range CD effects that were until now beyond the reach of regular CD-SEM-based control strategies. The increase in spatial resolution also counters the decrease in measurement precision due to the use of an optical system. In this paper we present detailed results on a) the CD data generated during the inspection process, b) the analytical tools needed to relate these data to CD SEM measurements and c) how the CD inspection process enables a new dimension of CD compensation within the mask manufacturing process. We find that the inspection-based CD measurement typically generates around 500,000 measurements with homogeneous coverage of the active mask area. In comparing the CD inspection results with CD SEM measurements on a single-measurement-point basis, we find that optical limitations of the inspection tool play a substantial role in the photon-based inspection process. Once these shifts are characterized and removed, a correlation coefficient of 0.9 between the two CD measurement techniques is found. This finding agrees well with a signature-based matching approach. Based on these findings, we set up a dedicated pooling algorithm that performs outlier removal for all CD inspections together with a clustering of the data according to feature-specific tool-induced shifts. In this way tool-induced shift effects can be removed and CD signature computation is enabled. A statistical model of the CD signatures relates the mask design parameters on the relevant length scales to CD effects, enabling the computation of CD compensation maps. The compensation maps address CD effects on various distinct length scales, and we show that both long- and short-range contributions to the CD variation are decreased. We find that the CD uniformity is improved by 25% using this novel CD compensation strategy.
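
    A hedged sketch of the shift-removal step described above: subtract each feature type's mean offset between inspection CD and SEM CD, then correlate. The feature labels and values are invented, and this is an illustration of the described analysis, not the authors' code.

    ```python
    # Remove feature-specific tool-induced shifts before correlating
    # inspection-based CD with CD-SEM values. Data are invented; this
    # illustrates the described analysis, not the authors' pipeline.
    import numpy as np

    features = np.array(["line", "line", "space", "space", "line", "space"])
    cd_sem  = np.array([100.0, 102.0,  80.0,  79.0, 101.0,  81.0])   # nm
    cd_insp = np.array([103.1, 105.0,  75.9,  75.2, 104.2,  77.0])   # nm

    corrected = cd_insp.copy()
    for f in np.unique(features):                 # per-feature mean offset
        sel = features == f
        corrected[sel] -= (cd_insp[sel] - cd_sem[sel]).mean()

    r = np.corrcoef(corrected, cd_sem)[0, 1]
    print(f"correlation after shift removal: r = {r:.3f}")
    ```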

  10. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Nino-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance - accounting for spatial, temporal, and intensity differences - that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more meaningful information that can be used in decision-making and planning. Future extensions and applications of these tools in a climate context will be considered.
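
    A hedged toy of the object-based idea (the generic concept only, not the MODE implementation): threshold a field, label connected components as "objects", and compare object centroids between model and reference to expose a spatial displacement that a grid-point score would blur.

    ```python
    # Object-based comparison in miniature: threshold, label connected
    # "objects", and measure centroid displacement between two fields.
    # Illustrates the object-based idea only; this is not the MODE tool.
    import numpy as np
    from scipy import ndimage

    def objects(field, thresh=1.0):
        labels, n = ndimage.label(field > thresh)
        return ndimage.center_of_mass(field, labels, range(1, n + 1))

    ref = np.zeros((50, 50)); ref[10:15, 10:15] = 2.0   # observed feature
    mod = np.zeros((50, 50)); mod[10:15, 18:23] = 2.0   # same feature, shifted

    (cy_r, cx_r), = objects(ref)
    (cy_m, cx_m), = objects(mod)
    print(f"centroid displacement: {np.hypot(cy_m - cy_r, cx_m - cx_r):.1f} "
          "grid cells")   # ~8 cells; a gridpoint RMSE alone would hide this
    ```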

  11. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  12. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  13. New tools for linking human and earth system models: The Toolbox for Human-Earth System Interaction & Scaling (THESIS)

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.; Kauffman, B.; Lawrence, P.

    2016-12-01

    Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water, and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined to carry out more complex analyses. We illustrate their application to both the exposure of population to climate extremes and the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.
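
    The exposure example in the abstract amounts to overlaying a gridded population field with a gridded climate-extremes indicator on a shared grid. The sketch below is a hypothetical illustration of that overlay, not an actual THESIS tool; the grids, threshold, and function name are invented:

    ```python
    import numpy as np

    def population_exposure(population, extreme_days, threshold_days=10):
        """Total population in grid cells exceeding an extreme-days threshold.

        population:   2-D array, people per grid cell.
        extreme_days: 2-D array, days per year above some extreme-heat
                      criterion, on the same grid as population.
        """
        exposed_cells = extreme_days > threshold_days
        return population[exposed_cells].sum()

    # Toy 2x2 grids: two cells exceed the 10-day threshold
    pop = np.array([[1200,  300],
                    [4500,   80]])
    days = np.array([[12, 4],
                     [15, 9]])
    print(population_exposure(pop, days))   # 1200 + 4500 = 5700 exposed
    ```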

  14. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  15. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.
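
    The separation DIAS draws between domain-entity objects and the models that express their dynamic behaviors is a classic object-oriented pattern. The following schematic Python sketch illustrates that pattern only; the entity, state variable, and model names are invented, and DIAS itself couples such objects to internal or external (e.g., legacy FORTRAN/C) models rather than using code like this:

    ```python
    from abc import ABC, abstractmethod

    class BehaviorModel(ABC):
        """A simulation model expressing one dynamic behavior of an entity."""
        @abstractmethod
        def step(self, entity, dt): ...

    class Evapotranspiration(BehaviorModel):
        def step(self, entity, dt):
            entity.state["soil_moisture"] -= 0.01 * dt   # placeholder dynamics

    class DomainEntity:
        """A real-world entity in the problem space (atmosphere, watershed, ...)."""
        def __init__(self, name):
            self.name = name
            self.state = {}
            self.behaviors = []   # models can be swapped without recoding the entity

        def register(self, model):
            self.behaviors.append(model)

        def step(self, dt):
            for model in self.behaviors:
                model.step(self, dt)

    watershed = DomainEntity("watershed")
    watershed.state["soil_moisture"] = 1.0
    watershed.register(Evapotranspiration())
    watershed.step(dt=0.5)
    print(watershed.state)   # {'soil_moisture': 0.995}
    ```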

  16. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  17. Semi-Analytical Models of CO2 Injection into Deep Saline Aquifers: Evaluation of the Area of Review and Leakage through Abandoned Wells

    EPA Science Inventory

    This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...

  18. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  19. TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

    The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...

  20. Toward Accessing Spatial Structure from Building Information Models

    NASA Astrophysics Data System (ADS)

    Schultz, C.; Bhatt, M.

    2011-08-01

    Data about building designs and layouts is becoming increasingly available. In the near future, service personnel (such as maintenance staff or emergency rescue workers) arriving at a building site will have immediate real-time access to enormous amounts of data relating to structural properties, utilities, materials, temperature, and so on. The critical problem for users is the taxing and error-prone task of interpreting such a large body of facts in order to extract salient information. This is necessary for comprehending a situation and deciding on a plan of action, and is a particularly serious issue in time-critical and safety-critical activities such as firefighting. Current unifying building models such as the Industry Foundation Classes (IFC), while being comprehensive, do not directly provide data structures that focus on spatial reasoning and spatial modalities that are required for high-level analytical tasks. The aim of the research presented in this paper is to provide computational tools for higher level querying and reasoning that shift the cognitive burden of dealing with enormous amounts of data away from the user. The user can then spend more energy and time in planning and decision making in order to accomplish the tasks at hand. We present an overview of our framework that provides users with an enhanced model of "built-up space". In order to test our approach using realistic design data (in terms of both scale and the nature of the building models) we describe how our system interfaces with IFC, and we conduct timing experiments to determine the practicality of our approach. We discuss general computational approaches for deriving higher-level spatial modalities by focusing on the example of route graphs. Finally, we present a firefighting scenario with alternative route graphs to motivate the application of our framework.
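
    A route graph of the kind derived here can be pictured as rooms for nodes and door connections for edges, over which alternative routes are computed. A minimal illustration with invented room names follows; it is not the authors' IFC-backed implementation, which derives the graph from building-model geometry:

    ```python
    from collections import deque

    # Hypothetical, simplified input: pairs of rooms connected by doors,
    # as might be extracted from IFC space and opening elements.
    doors = [("lobby", "corridor"), ("corridor", "office"),
             ("corridor", "stairwell"), ("stairwell", "roof")]

    route_graph = {}
    for a, b in doors:
        route_graph.setdefault(a, set()).add(b)
        route_graph.setdefault(b, set()).add(a)

    def route(graph, start, goal):
        """Breadth-first search for a shortest room-to-room route."""
        frontier, seen = deque([[start]]), {start}
        while frontier:
            path = frontier.popleft()
            if path[-1] == goal:
                return path
            for nxt in graph[path[-1]] - seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
        return None

    print(route(route_graph, "lobby", "roof"))
    # ['lobby', 'corridor', 'stairwell', 'roof']
    ```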

  1. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data comprised the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating imprecision as the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that sigma metrics are a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
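
    The arithmetic behind the study is compact: the sigma metric is conventionally computed as (TEa% − bias%) / CV%, and the quality goal index as bias% / (1.5 × CV%), with QGI < 0.8 pointing to imprecision and QGI > 1.2 to inaccuracy. A small sketch using these standard formulas; the analyte values are hypothetical, since the abstract does not list its allowable total error (TEa) targets:

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (TEa% - bias%) / CV%, all on a percentage scale."""
        return (tea_pct - bias_pct) / cv_pct

    def quality_goal_index(bias_pct, cv_pct):
        """QGI = bias% / (1.5 * CV%): <0.8 imprecision, >1.2 inaccuracy."""
        return bias_pct / (1.5 * cv_pct)

    # Hypothetical analyte: TEa 10%, bias 2%, CV 1.2%
    print(round(sigma_metric(10, 2, 1.2), 1))     # 6.7 -> ideal (>= 6 sigma)
    print(round(quality_goal_index(2, 1.2), 2))   # 1.11 -> between 0.8 and 1.2
    ```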

  2. Fast and simultaneous monitoring of organic pollutants in a drinking water treatment plant by a multi-analyte biosensor followed by LC-MS validation.

    PubMed

    Rodriguez-Mozaz, Sara; de Alda, Maria J López; Barceló, Damià

    2006-04-15

    This work describes the application of an optical biosensor (RIver ANALyser, RIANA) to the simultaneous analysis of three relevant environmental organic pollutants, namely the pesticides atrazine and isoproturon and the estrogen estrone, in real water samples. This biosensor is based on an indirect inhibition immunoassay which takes place at a chemically modified optical transducer chip. The spatially resolved modification of the transducer surface allows the simultaneous determination of selected target analytes by means of "total internal reflection fluorescence" (TIRF). The performance of the immunosensor method developed was evaluated against a well-accepted traditional method based on solid-phase extraction followed by liquid chromatography-mass spectrometry (LC-MS). The chromatographic method was superior in terms of linearity, sensitivity, and accuracy, and the biosensor method in terms of repeatability, speed, cost, and automation. The application of both methods in parallel to determine the occurrence and removal of atrazine, isoproturon, and estrone throughout the treatment process (sand filtration, ozonation, activated carbon filtration, and chlorination) in a waterworks showed an overestimation of results in the case of the biosensor, which was partially attributed to matrix and cross-reactivity effects, in spite of the addition of ovalbumin to the sample to minimize matrix interferences. Based on the comparative performance of both techniques, the biosensor emerges as a suitable tool for fast, simple, and automated screening of water pollutants without sample pretreatment. To the authors' knowledge, this is the first description of the application of the RIANA biosensor in the multi-analyte configuration for the regular monitoring of pollutants in a waterworks.

  3. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  4. A Study on Re-entry Predictions of Uncontrolled Space Objects for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Choi, Eun-Jung; Cho, Sungki; Lee, Deok-Jin; Kim, Siwoo; Jo, Jung Hyun

    2017-12-01

    The key risk analysis technologies for the re-entry of space objects into Earth's atmosphere are divided into four categories: cataloguing and databases of the re-entry of space objects, lifetime and re-entry trajectory predictions, break-up models after re-entry and multiple debris distribution predictions, and ground impact probability models. In this study, we focused on re-entry prediction, including orbital lifetime assessments, for space situational awareness systems. Re-entry predictions are very difficult and are affected by various sources of uncertainty. In particular, during uncontrolled re-entry, large spacecraft may break into several pieces of debris, and the surviving fragments can be a significant hazard for persons and property on the ground. In recent years, specific methods and procedures have been developed to provide clear information for predicting and analyzing the re-entry of space objects and for ground-risk assessments. Representative tools include the object reentry survival analysis tool (ORSAT) and the debris assessment software (DAS) developed by the National Aeronautics and Space Administration (NASA), the spacecraft atmospheric re-entry and aerothermal break-up (SCARAB) and debris risk assessment and mitigation analysis (DRAMA) tools developed by the European Space Agency (ESA), and the semi-analytic tool for end of life analysis (STELA) developed by the Centre National d'Etudes Spatiales (CNES). In this study, various surveys of existing re-entry space objects are reviewed, and an efficient re-entry prediction technique is suggested based on STELA, the life-cycle analysis tool for satellites, and DRAMA, a re-entry analysis tool. To verify the proposed method, the re-entry of the Tiangong-1 Space Lab, which is expected to re-enter Earth's atmosphere shortly, was simulated. Eventually, these results will provide a basis for space situational awareness risk analyses of the re-entry of space objects.

  5. Generalized Analysis Tools for Multi-Spacecraft Missions

    NASA Astrophysics Data System (ADS)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data", published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly, Eds., 1998). On the one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1] but is not convenient for analytical computation, especially for error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2] but appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or zero weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI SR-001, 1998. [2] Chanteur, G.: Spatial Interpolation for Four Spacecraft: Theory, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 371-393, ISSI SR-001, 1998. [3] Chanteur, G.: Accuracy of field gradient estimations by Cluster: Explanation of its dependency upon elongation and planarity of the tetrahedron, pp. 265-268, ESA SP-449, 2000. [4] Vogt, J., Paschmann, G., and Chanteur, G.: Reciprocal Vectors, pp. 33-46, ISSI SR-008, 2008.
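
    For the classical four-spacecraft case of [2], the reciprocal vectors and the resulting linear gradient estimator fit in a few lines. The sketch below illustrates only that baseline (the weighted generalization introduced in the abstract is not reproduced), using the standard construction q_a = (r_bc × r_bd) / (r_ba · (r_bc × r_bd)):

    ```python
    import numpy as np

    def reciprocal_vectors(r):
        """Reciprocal vectors of a tetrahedron of 4 positions r (shape 4x3).

        For (a, b, c, d) a cyclic permutation of (0, 1, 2, 3):
        q_a = (r_bc x r_bd) / (r_ba . (r_bc x r_bd)).
        For a non-coplanar cluster they satisfy sum_a q_a = 0 and
        sum_a outer(q_a, r_a) = identity.
        """
        q = np.empty((4, 3))
        for a in range(4):
            b, c, d = (a + 1) % 4, (a + 2) % 4, (a + 3) % 4
            n = np.cross(r[c] - r[b], r[d] - r[b])
            q[a] = n / np.dot(r[a] - r[b], n)
        return q

    def gradient_estimate(r, f):
        """Linear gradient estimate from scalar samples f at positions r."""
        return reciprocal_vectors(r).T @ f

    # Check on an exactly linear field f(x) = g . x: the estimator recovers g
    rng = np.random.default_rng(1)
    r = rng.standard_normal((4, 3))       # non-coplanar with probability 1
    g = np.array([1.0, -2.0, 0.5])
    print(gradient_estimate(r, r @ g))    # ~[ 1.  -2.   0.5]
    ```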

  6. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    DOT National Transportation Integrated Search

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
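
    At the core of an AHP-based tool set is the derivation of priority weights from a pairwise-comparison matrix, conventionally via its principal eigenvector. A generic sketch of that computation follows, with hypothetical ABC decision criteria; it is not the FHWA tool itself:

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Priority weights from an AHP pairwise-comparison matrix
        (principal eigenvector, normalized to sum to 1)."""
        eigvals, eigvecs = np.linalg.eig(pairwise)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        return w / w.sum()

    # Hypothetical criteria: cost, schedule, traffic impact.
    # pairwise[i, j] = importance of criterion i relative to j (Saaty scale).
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    print(ahp_priorities(M).round(3))   # ~[0.648 0.23  0.122]
    ```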

  7. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  8. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  9. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  10. Chemical Transport in a Fissured Rock: Verification of a Numerical Model

    NASA Astrophysics Data System (ADS)

    Rasmuson, A.; Narasimhan, T. N.; Neretnieks, I.

    1982-10-01

    Numerical models for simulating chemical transport in fissured rocks constitute powerful tools for evaluating the acceptability of geological nuclear waste repositories. Due to the very long-term, high toxicity of some nuclear waste products, the models are required to predict, in certain cases, the spatial and temporal distribution of chemical concentrations at levels below 0.001% of the concentration released from the repository. Whether numerical models can provide such accuracies is a major question addressed in the present work. To this end we have verified a numerical model, TRUMP, which solves the advective diffusion equation in general three dimensions, with or without decay and source terms. The method is based on an integrated finite difference approach. The model was verified against the known analytic solution of the one-dimensional advection-diffusion problem, as well as the problem of advection-diffusion in a system of parallel fractures separated by spherical particles. The studies show that as long as the magnitude of advectance is equal to or less than that of conductance for the closed surface bounding any volume element in the region (that is, numerical Peclet number <2), the numerical method can indeed match the analytic solution within errors of ±10⁻³% or less. The realistic input parameters used in the sample calculations suggest that such a range of Peclet numbers is indeed likely to characterize deep groundwater systems in granitic and ancient argillaceous systems. Thus TRUMP in its present form does provide a viable tool for use in nuclear waste evaluation studies. A sensitivity analysis based on the analytic solution suggests that the errors in prediction introduced by uncertainties in input parameters are likely to be larger than the computational inaccuracies introduced by the numerical model. Currently, a disadvantage of the TRUMP model is that the iterative method of solving the set of simultaneous equations is rather slow when time constants vary widely over the flow region. Although the iterative solution may be very desirable for large three-dimensional problems in order to minimize computer storage, it seems desirable to use a direct solver technique in conjunction with the mixed explicit-implicit approach whenever possible. Work in this direction is in progress.
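
    The accuracy criterion quoted above, advectance no greater than conductance across the surface of each volume element, is equivalent to keeping the grid Peclet number Pe = v·Δx/D below 2, which can be checked per mesh before running a simulation. A sketch with illustrative parameter values (not taken from the paper):

    ```python
    def grid_peclet(velocity, dx, dispersion):
        """Grid Peclet number Pe = v * dx / D for one volume element."""
        return velocity * dx / dispersion

    # Hypothetical deep-groundwater values: v = 1e-7 m/s, D = 1e-9 m^2/s
    v, D = 1e-7, 1e-9
    for dx in (0.005, 0.01, 0.02, 0.05):
        pe = grid_peclet(v, dx, D)
        verdict = "ok (Pe < 2)" if pe < 2 else "refine the mesh"
        print(f"dx = {dx:5.3f} m -> Pe = {pe:4.1f}  {verdict}")
    ```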

  11. MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.

    PubMed

    Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui

    2015-12-12

    Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays, that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include: limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay-development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus's source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
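
    As a generic illustration of two of the listed measures: the limit of detection and lower limit of quantification are often estimated from a calibration curve as 3.3·s/S and 10·s/S, where s is the residual standard deviation of the fit and S its slope. The sketch below uses invented calibration data and this ICH-style convention; it is not MRMPlus's algorithm:

    ```python
    import numpy as np

    def lod_lloq(conc, response):
        """Calibration-curve LOD and LLOQ (ICH-style 3.3*s/S and 10*s/S)."""
        slope, intercept = np.polyfit(conc, response, 1)
        resid = response - (slope * conc + intercept)
        s = resid.std(ddof=2)              # two fitted parameters
        return 3.3 * s / slope, 10 * s / slope

    # Hypothetical calibration points (e.g., fmol on column vs. peak area)
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    resp = np.array([0.9, 2.1, 3.9, 10.2, 19.8, 40.5])
    lod, lloq = lod_lloq(conc, resp)
    print(f"LOD ~ {lod:.2f}, LLOQ ~ {lloq:.2f}")
    ```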

  12. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of organic, relatively lipophilic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements, and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task.

  13. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom, and to bridge the knowledge gap and the application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows.

  14. OpenMSI Arrayed Analysis Tools v2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BOWEN, BENJAMIN; RUEBEL, OLIVER; DE ROND, TRISTAN

    2017-02-07

    Mass spectrometry imaging (MSI) enables high-resolution spatial mapping of biomolecules in samples and is a valuable tool for the analysis of tissues from plants and animals, microbial interactions, high-throughput screening, drug metabolism, and a host of other applications. This is accomplished by desorbing molecules from the surface at spatially defined locations, using a laser or ion beam. These ions are analyzed by a mass spectrometer and collected into an MSI 'image', a dataset containing unique mass spectra from the sampled spatial locations. MSI is used in a diverse and increasing number of biological applications. The OpenMSI Arrayed Analysis Tool (OMAAT) is a new software method that addresses the challenges of analyzing spatially defined samples in large MSI datasets by providing support for automatic sample position optimization and ion selection.

  15. Emerging tools and technologies in watershed management

    Treesearch

    D. Phillip Guertin; Scott N. Miller; David C. Goodrich

    2000-01-01

    The field of watershed management is highly dependent on spatially distributed data. Over the past decade, significant advances have been made toward the capture, storage, and use of spatial data. Emerging tools and technologies hold great promise for improving the scientific understanding of watershed processes and are already revolutionizing watershed research....

  16. SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING AND RISK ASSESSMENT (SLIDE PRESENTATION)

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  17. MEETING IN CHICAGO: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND ENVIRONMENTAL RISK ASSESSMENT

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  18. MEETING IN CZECH REPUBLIC: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND RISK ASSESSMENT

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  19. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy, or Roadmap, for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.

  20. 76 FR 70517 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...
