Sample records for visual analytics techniques

  1. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
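
    A minimal sketch of the rolling-window analytic scope mentioned above, assuming a toy stream of (timestamp, text) pairs: term frequencies are maintained incrementally and old messages are evicted as the window slides. The function name and synthetic data are illustrative, not part of the reference architecture described in the paper.

```python
# Hedged sketch: a rolling-window term-frequency analytic over a text stream.
from collections import Counter, deque
from typing import Iterable, Tuple

def rolling_term_frequencies(stream: Iterable[Tuple[float, str]],
                             window_seconds: float = 60.0):
    """Yield (timestamp, Counter) with term counts over a sliding time window."""
    window = deque()          # (timestamp, tokens) pairs currently in the window
    counts = Counter()        # incrementally maintained counts
    for ts, text in stream:
        tokens = text.lower().split()
        window.append((ts, tokens))
        counts.update(tokens)
        # Evict messages that have fallen out of the rolling window.
        while window and window[0][0] < ts - window_seconds:
            _, old_tokens = window.popleft()
            counts.subtract(old_tokens)
            counts += Counter()   # drop zero/negative entries
        yield ts, counts.copy()

if __name__ == "__main__":
    fake_stream = [(0, "flood warning downtown"),
                   (30, "flood waters rising"),
                   (120, "traffic cleared downtown")]
    for ts, counts in rolling_term_frequencies(fake_stream, window_seconds=90):
        print(ts, counts.most_common(3))
```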

  2. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  3. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under the rules for authentication, hearsay, the best evidence rule, and privilege. Additional admissibility issues arise when exploring digital data, concerning what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics, to identify how visual analytics techniques fit into the digital forensics analysis process, to show how visual analytics techniques can improve the legal admissibility of digital data, and to identify what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer, and the discussions are not meant to be inclusive of all differences in laws between states and countries.

  4. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
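
    A minimal sketch of the progressive workflow described above, under the assumption that the analytic can be chunked: a generator emits partial results after each chunk and accepts a steering hint between chunks. The column-mean analytic and the .send()-based steering are illustrative stand-ins, not the Progressive Insights implementation.

```python
# Hedged sketch: a chunked analytic that yields partial results and can be steered.
import numpy as np

def progressive_column_means(data: np.ndarray, chunk_size: int = 1000):
    """Generator yielding running per-column means; callers may .send() a
    list of column indices to restrict subsequent updates to."""
    n_rows, n_cols = data.shape
    totals = np.zeros(n_cols)
    seen = np.zeros(n_cols)
    focus = np.arange(n_cols)              # columns currently prioritized
    for start in range(0, n_rows, chunk_size):
        chunk = data[start:start + chunk_size]
        totals[focus] += chunk[:, focus].sum(axis=0)
        seen[focus] += chunk.shape[0]
        partial = np.divide(totals, seen, out=np.full(n_cols, np.nan),
                            where=seen > 0)
        hint = yield partial               # analyst can steer between chunks
        if hint is not None:
            focus = np.asarray(hint)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gen = progressive_column_means(rng.normal(size=(10_000, 5)), chunk_size=2500)
    partial = next(gen)
    print("after 1 chunk:", np.round(partial, 3))
    partial = gen.send([0, 1])             # prioritize the first two columns
    print("after steering:", np.round(partial, 3))
```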

  5. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  6. Visual Analytics and Storytelling through Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Perrine, Kenneth A.; Mackey, Patrick S.

    2005-10-31

    This paper supplements a video clip submitted to the Video Track of IEEE Symposium on Information Visualization 2005. The original video submission applies a two-way storytelling approach to demonstrate the visual analytics capabilities of a new visualization technique. The paper presents our video production philosophy, describes the plot of the video, explains the rationale behind the plot, and finally, shares our production experiences with our readers.

  7. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  8. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  9. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging, as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI who are interested in visual analytics will also find this information useful, along with a discussion of changes that need to be made to current HCI practices to make them more suitable for visual analytics. A history of analysis and analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently, the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term "users" in almost all situations, there is a wide variety of users who all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  10. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
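
    A hedged sketch of the two atomic operators named above, selection and aggregation, over a simple edge-list graph. The operator signatures and the tuple-based graph representation are illustrative assumptions, not the authors' formal algebra.

```python
# Hedged sketch: selection and aggregation operators over an edge-list graph.
from collections import defaultdict

def select(graph, predicate):
    """Selection: keep only edges whose attributes satisfy the predicate."""
    return [(u, v, a) for (u, v, a) in graph if predicate(a)]

def aggregate(graph, key):
    """Aggregation: collapse nodes into groups given by key(node),
    summing edge weights between groups."""
    weights = defaultdict(float)
    for u, v, a in graph:
        weights[(key(u), key(v))] += a.get("weight", 1.0)
    return [(gu, gv, {"weight": w}) for (gu, gv), w in weights.items()]

if __name__ == "__main__":
    g = [("a1", "b1", {"weight": 2.0, "year": 2011}),
         ("a2", "b1", {"weight": 1.0, "year": 2012}),
         ("a1", "a2", {"weight": 0.5, "year": 2012})]
    recent = select(g, lambda a: a["year"] >= 2012)
    grouped = aggregate(recent, key=lambda n: n[0])   # group nodes by prefix
    print(grouped)
```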

  11. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  12. [Application of analytical transmission electron microscopy techniques for detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis was performed of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, under dark-field scanning transmission electron microscopy, and under high-angle annular dark-field scanning transmission electron microscopy. For identification of nanoparticles in the cells, the analytical techniques of energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both when acquiring energy spectra from individual particles and when performing element mapping. It was shown that the method of electron tomography is applicable to confirm that nanoparticles are localized in the sample and not coated by contamination. The possibilities and fields of application of different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.

  13. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    PubMed

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  14. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often, business people want to compare this year's monthly sales with last year's sales to make decisions, and data warehouse administrators (DBAs) want to know their daily data loading job performance and to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in long time series data, and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications, mining enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.
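
    A hedged sketch in the spirit of the color cell-based charts described above: each value of a long series becomes one colored cell, arranged so that one row covers a fixed period, letting significant changes stand out as color shifts. The layout (hours by days), colormap, and injected anomaly are illustrative assumptions, not the paper's exact technique.

```python
# Hedged sketch: a color cell chart for a long time series (one cell per value).
import numpy as np
import matplotlib.pyplot as plt

def cell_chart(values: np.ndarray, cells_per_row: int, ax=None):
    n_rows = int(np.ceil(len(values) / cells_per_row))
    grid = np.full(n_rows * cells_per_row, np.nan)
    grid[:len(values)] = values
    grid = grid.reshape(n_rows, cells_per_row)
    ax = ax or plt.gca()
    im = ax.imshow(grid, aspect="auto", cmap="RdYlGn_r")
    ax.set_xlabel("hour of day")
    ax.set_ylabel("day")
    plt.colorbar(im, ax=ax, label="metric value")
    return ax

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    series = rng.normal(size=24 * 60)           # 60 days of hourly values
    series[24 * 40:24 * 41] += 4.0              # inject an anomalous day
    cell_chart(series, cells_per_row=24)
    plt.show()
```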

  15. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  16. SnapShot: Visualization to Propel Ice Hockey Analytics.

    PubMed

    Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T

    2012-12-01

    Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data, yet given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
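
    A hedged sketch of a radial heat map of shot data as described above: shots are binned by angle and distance from the goal and each cell is colored by shot count. The bin counts, synthetic shot distribution, and geometry are illustrative assumptions, not SnapShot's implementation.

```python
# Hedged sketch: radial binning of shot locations, rendered on a polar axis.
import numpy as np
import matplotlib.pyplot as plt

def radial_heat_map(angles_rad, distances, n_angle_bins=18, n_dist_bins=8):
    counts, theta_edges, r_edges = np.histogram2d(
        angles_rad, distances, bins=[n_angle_bins, n_dist_bins])
    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    theta, r = np.meshgrid(theta_edges, r_edges, indexing="ij")
    pcm = ax.pcolormesh(theta, r, counts, cmap="viridis")
    fig.colorbar(pcm, ax=ax, label="shot count")
    return ax

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    angles = rng.uniform(-np.pi / 2, np.pi / 2, size=500)   # shots in front of goal
    dists = rng.gamma(shape=3.0, scale=8.0, size=500)       # shot length (feet)
    radial_heat_map(angles, dists)
    plt.show()
```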

  17. Big data in medical informatics: improving education through visual analytics.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    A continuous effort to improve healthcare education today is driven by the need to create competent health professionals able to meet healthcare demands. There is limited research reporting how the manipulation of educational data can help improve healthcare education. The emerging research field of visual analytics has the advantage of combining big data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognise visual patterns. The aim of this study was therefore to explore novel ways of representing curriculum and educational data using visual analytics. Three approaches to the visualization and representation of educational data were presented. Five competencies addressed in courses at the undergraduate medical program level were identified as corresponding inaccurately to higher education board competencies. Different visual representations seem to have the potential to impact the ability to perceive entities and connections in the curriculum data.

  18. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  19. Qualitative evaluation of water displacement in simulated analytical breaststroke movements.

    PubMed

    Martens, Jonas; Daly, Daniel

    2012-05-01

    One purpose of evaluating a swimmer is to establish the individualized optimal technique. A swimmer's particular body structure and the resulting movement pattern will cause the surrounding water to react in differing ways. Consequently, an assessment method based on flow visualization was developed, complementary to movement analysis and body structure quantification. A fluorescent dye was used to make the water displaced by the body visible on video. To examine the hypothesis on the propulsive mechanisms applied in breaststroke swimming, we analyzed the movements of the surrounding water during 4 analytical breaststroke movements using the flow visualization technique.

  20. BiSet: Semantic Edge Bundling with Biclusters for Sensemaking.

    PubMed

    Sun, Maoyuan; Mi, Peng; North, Chris; Ramakrishnan, Naren

    2016-01-01

    Identifying coordinated relationships is an important task in data analytics. For example, an intelligence analyst might want to discover three suspicious people who all visited the same four cities. Existing techniques that display individual relationships, such as between lists of entities, require repetitious manual selection and significant mental aggregation in cluttered visualizations to find coordinated relationships. In this paper, we present BiSet, a visual analytics technique to support interactive exploration of coordinated relationships. In BiSet, we model coordinated relationships as biclusters and algorithmically mine them from a dataset. Then, we visualize the biclusters in context as bundled edges between sets of related entities. Thus, bundles enable analysts to infer task-oriented semantic insights about potentially coordinated activities. We make bundles first-class objects and add a new layer, "in-between", to contain these bundle objects. Based on this, bundles serve to organize entities represented in lists and visually reveal their membership. Users can interact with edge bundles to organize related entities, and vice versa, for sensemaking purposes. With a usage scenario, we demonstrate how BiSet supports the exploration of coordinated relationships in text analytics.
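
    A hedged sketch of the bicluster idea described above: given a binary relation between two entity lists (e.g., people and the cities they visited), enumerate maximal groups of entities that share the same related set. The brute-force enumeration below is an illustration only, not the mining algorithm used by BiSet.

```python
# Hedged sketch: naive enumeration of maximal biclusters in a binary relation.
from itertools import combinations

def biclusters(relation, min_rows=2, min_cols=2):
    """relation: dict mapping row entity -> set of column entities."""
    found = set()
    rows = list(relation)
    for size in range(min_rows, len(rows) + 1):
        for group in combinations(rows, size):
            shared = set.intersection(*(relation[r] for r in group))
            if len(shared) >= min_cols:
                found.add((frozenset(group), frozenset(shared)))
    # Keep only maximal biclusters (not contained in a larger one).
    maximal = [bc for bc in found
               if not any(bc != other and bc[0] <= other[0] and bc[1] <= other[1]
                          for other in found)]
    return maximal

if __name__ == "__main__":
    visits = {"p1": {"paris", "rome", "oslo", "lima"},
              "p2": {"paris", "rome", "oslo", "lima"},
              "p3": {"paris", "rome", "oslo", "lima"},
              "p4": {"cairo", "rome"}}
    for people, cities in biclusters(visits, min_rows=3, min_cols=4):
        print(sorted(people), "all visited", sorted(cities))
```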

  1. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  2. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  3. The Relationship between Visual Analysis and Five Statistical Analyses in a Simple AB Single-Case Research Design

    ERIC Educational Resources Information Center

    Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi

    2006-01-01

    This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…

  4. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  5. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, allowing a desired process state to be maintained and guaranteeing the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  6. Finding Waldo: Learning about Users from their Interactions.

    PubMed

    Brown, Eli T; Ottley, Alvitta; Zhao, Helen; Quan Lin; Souvenir, Richard; Endert, Alex; Chang, Remco

    2014-12-01

    Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 83% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time: in one case 95% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
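
    A hedged sketch of the analysis style described above: simple per-user features derived from interaction logs are fed to an off-the-shelf classifier to predict whether a user completes the task quickly. The features, synthetic data, and choice of an SVM are assumptions for illustration, not the paper's exact encodings or results.

```python
# Hedged sketch: classifying users as fast/slow from interaction-log features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_users = 60
# Illustrative per-user features: clicks, hovers, mean pause between actions.
X = np.column_stack([
    rng.poisson(40, n_users),          # number of clicks
    rng.poisson(120, n_users),         # number of hovers
    rng.gamma(2.0, 1.5, n_users),      # mean pause between actions (seconds)
])
y = (X[:, 2] + rng.normal(0, 0.5, n_users) < 3.0).astype(int)  # 1 = fast user

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```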

  7. A Grammar-based Approach for Modeling User Interactions and Generating Suggestions During the Data Exploration Process.

    PubMed

    Dabek, Filip; Caban, Jesus J

    2017-01-01

    Despite the recent popularity of visual analytics focusing on big data, little is known about how to support users that use visualization techniques to explore multi-dimensional datasets and accomplish specific tasks. Our lack of models that can assist end-users during the data exploration process has made it challenging to learn from the user's interactive and analytical process. The ability to model how a user interacts with a specific visualization technique and what difficulties they face are paramount in supporting individuals with discovering new patterns within their complex datasets. This paper introduces the notion of visualization systems understanding and modeling user interactions with the intent of guiding a user through a task thereby enhancing visual data exploration. The challenges faced and the necessary future steps to take are discussed; and to provide a working example, a grammar-based model is presented that can learn from user interactions, determine the common patterns among a number of subjects using a K-Reversible algorithm, build a set of rules, and apply those rules in the form of suggestions to new users with the goal of guiding them along their visual analytic process. A formal evaluation study with 300 subjects was performed showing that our grammar-based model is effective at capturing the interactive process followed by users and that further research in this area has the potential to positively impact how users interact with a visualization system.
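
    A hedged sketch of the general idea described above: learn common interaction patterns from previous users' action sequences and suggest a likely next action to a new user. A simple bigram frequency model stands in for the paper's K-Reversible grammar induction; the action names and sessions are illustrative.

```python
# Hedged sketch: suggesting a next action from observed interaction sequences.
from collections import Counter, defaultdict

def train_bigram_model(sessions):
    """sessions: list of action sequences (lists of strings)."""
    next_counts = defaultdict(Counter)
    for actions in sessions:
        for current, nxt in zip(actions, actions[1:]):
            next_counts[current][nxt] += 1
    return next_counts

def suggest_next(model, current_action, k=2):
    """Return up to k most common follow-up actions seen after current_action."""
    return [a for a, _ in model[current_action].most_common(k)]

if __name__ == "__main__":
    past_sessions = [
        ["load_data", "filter", "scatterplot", "zoom", "select"],
        ["load_data", "filter", "histogram", "filter", "scatterplot"],
        ["load_data", "scatterplot", "zoom", "select"],
    ]
    model = train_bigram_model(past_sessions)
    print("after 'filter', suggest:", suggest_next(model, "filter"))
```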

  8. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
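
    A hedged sketch of the alerting idea described above: fit a simple trend to notional sensor health telemetry and estimate when the metric will cross a critical threshold. The linear trend and synthetic data are illustrative assumptions; the study itself uses richer Big Data tooling and predictive models.

```python
# Hedged sketch: predicting when a degrading sensor metric crosses a threshold.
import numpy as np

def predict_threshold_crossing(times, values, threshold):
    """Return the estimated time at which a linear trend fit to (times, values)
    crosses the threshold, or None if the trend moves away from it."""
    slope, intercept = np.polyfit(times, values, 1)
    if slope >= 0:
        return None                       # metric not degrading
    return (threshold - intercept) / slope

if __name__ == "__main__":
    t = np.arange(0, 30, dtype=float)                      # days of telemetry
    health = 100.0 - 0.8 * t + np.random.default_rng(4).normal(0, 1.0, t.size)
    eta = predict_threshold_crossing(t, health, threshold=70.0)
    if eta is not None:
        print(f"predicted to fall below threshold around day {eta:.1f}")
```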

  9. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  10. Noninvasive studies of human visual cortex using neuromagnetic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aine, C.J.; George, J.S.; Supek, S.

    1990-01-01

    The major goals of noninvasive studies of the human visual cortex are: to increase knowledge of the functional organization of cortical visual pathways; and to develop noninvasive clinical tests for the assessment of cortical function. Noninvasive techniques suitable for studies of the structure and function of human visual cortex include magnetic resonance imaging (MRI), positron emission tomography (PET), single photon emission tomography (SPECT), scalp recorded event-related potentials (ERPs), and event-related magnetic fields (ERFs). The primary challenge faced by noninvasive functional measures is to optimize the spatial and temporal resolution of the measurement and analytic techniques in order to effectively characterize the spatial and temporal variations in patterns of neuronal activity. In this paper we review the use of neuromagnetic techniques for this purpose. 8 refs., 3 figs.

  11. Shaded computer graphic techniques for visualizing and interpreting analytic fluid flow models

    NASA Technical Reports Server (NTRS)

    Parke, F. I.

    1981-01-01

    Mathematical models which predict the behavior of fluid flow in different experiments are simulated using digital computers. The simulations predict values of parameters of the fluid flow (pressure, temperature, and velocity vector) at many points in the fluid. Visualization of the spatial variation in the values of these parameters is important to comprehend and check the data generated, to identify the regions of interest in the flow, and for effectively communicating information about the flow to others. State-of-the-art imaging techniques developed in the field of three-dimensional shaded computer graphics are applied to the visualization of fluid flow. The use of an imaging technique known as 'SCAN' for visualizing fluid flow is studied and the results are presented.

  12. Finding Waldo: Learning about Users from their Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Eli T.; Ottley, Alvitta; Zhao, Helen

    Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and we apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 96% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time; in some cases, 82% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.

  13. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  14. Learning Visualization Strategies: A qualitative investigation

    NASA Astrophysics Data System (ADS)

    Halpern, Daniel; Oh, Kyong Eun; Tremaine, Marilyn; Chiang, James; Bemis, Karen; Silver, Deborah

    2015-12-01

    The following study investigates the range of strategies individuals develop to infer and interpret cross-sections of three-dimensional objects. We focus on the identification of mental representations and problem-solving processes made by 11 individuals with the goal of building training applications that integrate the strategies developed by the participants in our study. Our results suggest that although spatial transformation and perspective-taking techniques are useful for visualizing cross-section problems, these visual processes are augmented by analytical thinking. Further, our study shows that participants employ general analytic strategies for extended periods which evolve through practice into a set of progressively more expert strategies. Theoretical implications are discussed and five main findings are recommended for integration into the design of education software that facilitates visual learning and comprehension.

  15. 2D-Visualization of metabolic activity with planar optical chemical sensors (optodes)

    NASA Astrophysics Data System (ADS)

    Meier, R. J.; Liebsch, G.

    2015-12-01

    Microbial activity plays an outstandingly important role in many hydrologic compartments, for example in the benthic community in sediments or among biologically active microorganisms in the capillary fringe, in groundwater, or in soil. Oxygen, pH, and CO2 are key factors and indicators for microbial activity. They can be measured using optical chemical sensors. These sensors record the changing fluorescence properties of specific indicator dyes. The signals can be measured in a non-contact mode, even through transparent walls, which is important for many lab experiments. They can measure in closed (transparent) systems without sampling or intruding into the sample. They do not consume the analytes while measuring, are fully reversible, and are able to measure in non-stirred solutions. These sensors can be applied as high-precision fiber-optic sensors (for profiling), robust sensor spots, or planar sensors for 2D visualization (imaging). Imaging makes it possible to detect thousands of measurement spots at the same time and to generate 2D analyte maps over a region of interest. It allows for comparing different regions within one recorded image, visualizing spatial analyte gradients, or, more importantly, identifying hot spots of metabolic activity. We present ready-to-use portable imaging systems for the analytes oxygen, pH, and CO2. They consist of a detector unit, planar sensor foils, and software for easy data recording and evaluation. Sensor foils for various analytes and measurement ranges enable visualizing metabolic activity or analyte changes in the desired range. Dynamics of metabolic activity can be detected in one shot or over long time periods. We demonstrate the potential of this analytical technique by presenting experiments on benthic disturbance-recovery dynamics in sediments and microbial degradation of organic material in the capillary fringe. We think this technique is a new tool to further understand how microbial and geochemical processes are linked in (not solely) hydrologic systems.

  16. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  17. The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.

    2014-10-01

    Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of analytical questions to be addressed, data that illustrates the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.

  18. Storyline Visualizations of Eye Tracking of Movie Viewing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.

    Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existent spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.

  19. SmartAdP: Visual Analytics of Large-scale Taxi Trajectories for Selecting Billboard Locations.

    PubMed

    Liu, Dongyu; Weng, Di; Li, Yuhong; Bao, Jie; Zheng, Yu; Qu, Huamin; Wu, Yingcai

    2017-01-01

    The problem of quickly formulating and comparing candidate billboard placements has long plagued advertising planners, owing to the lack of efficient tools for the in-depth analyses needed to make informed decisions. In this study, we employ visual analytics that combines state-of-the-art mining and visualization techniques to tackle this problem using large-scale GPS trajectory data. In particular, we present SmartAdP, an interactive visual analytics system that addresses two major challenges: finding good solutions in a huge solution space and comparing the solutions in a visual and intuitive manner. An interactive framework that integrates a novel visualization-driven data mining model enables advertising planners to formulate good candidate solutions effectively and efficiently. In addition, we propose a set of coupled visualizations: a solution view with metaphor-based glyphs to visualize the correlation between different solutions; a location view to display billboard locations in a compact manner; and a ranking view to present multi-typed rankings of the solutions. The system has been demonstrated through case studies with a real-world dataset and domain-expert interviews. Our approach can be adapted to other location selection problems, such as siting retail stores or restaurants using trajectory data.

  20. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A key need for next-generation analysts is an appropriate balance between machine analytics over raw data and the user's ability to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and we examine how well they match analyst opportunities to recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with working analysts, documented trends in technical development, and methods for engaging future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  1. Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Drouhard, Margaret MEG G; Beaver, Justin M

    Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity push the limits of static analytical approaches. In this paper, we describe an interactive visual analytics system, called Matisse, that aids the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.

  2. Clustervision: Visual Supervision of Unsupervised Clustering.

    PubMed

    Kwon, Bum Chul; Eysenbach, Ben; Verma, Janu; Ng, Kenney; De Filippi, Christopher; Stewart, Walter F; Perer, Adam

    2018-01-01

    Clustering, the process of grouping similar items into distinct partitions, is a common type of unsupervised machine learning that can be useful for summarizing and aggregating complex multi-dimensional data. However, data can be clustered in many ways, and there exists a large body of algorithms designed to reveal different patterns. While having access to a wide variety of algorithms is helpful, in practice it is quite difficult for data scientists to choose and parameterize algorithms to obtain clustering results relevant to their dataset and analytical tasks. To alleviate this problem, we built Clustervision, a visual analytics tool that helps data scientists find the right clustering among the many available techniques and parameters. Our system clusters data using a variety of clustering techniques and parameters and then ranks clustering results using five quality metrics. In addition, users can guide the system to produce more relevant results by providing task-relevant constraints on the data. Our visual user interface allows users to find high-quality clustering results, explore the clusters using several coordinated visualization techniques, and select the cluster result that best suits their task. We demonstrate this novel approach using a case study with a team of researchers in the medical domain and show that our system empowers users to choose an effective representation of their complex data.
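
    As a rough illustration of the workflow described above, the sketch below generates several candidate clusterings and ranks them with internal quality metrics. It is not Clustervision itself: the algorithms, parameter range, and the three metrics shown are assumed stand-ins (the paper's five metrics are not listed here), and the data are synthetic.

```python
# Hypothetical sketch: run several clustering configurations and rank the
# results by internal quality metrics, in the spirit of Clustervision.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.datasets import make_blobs
from sklearn.metrics import (silhouette_score,
                             calinski_harabasz_score,
                             davies_bouldin_score)

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)  # synthetic data

candidates = []
for k in range(2, 7):
    for name, algo in (("kmeans", KMeans(n_clusters=k, n_init=10, random_state=0)),
                       ("agglomerative", AgglomerativeClustering(n_clusters=k))):
        labels = algo.fit_predict(X)
        candidates.append({
            "config": f"{name}, k={k}",
            "silhouette": silhouette_score(X, labels),                 # higher is better
            "calinski_harabasz": calinski_harabasz_score(X, labels),   # higher is better
            "davies_bouldin": davies_bouldin_score(X, labels),         # lower is better
        })

# Rank candidate clusterings by silhouette score as a simple proxy for quality.
for c in sorted(candidates, key=lambda c: c["silhouette"], reverse=True)[:3]:
    print(c)
```

    In an interactive tool, the ranked list would feed coordinated views rather than a printout, but the generate-then-rank structure is the same.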

  3. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is nonetheless important for their continued growth and study. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw helped participants analyze the data and identify an embedded threat. We describe the different analysis strategies used by study participants and how computational support (or the lack thereof) influenced those strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based on them. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  4. Propeller flow visualization techniques

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.

    1982-01-01

    Propeller flow visualization techniques were tested. The actual operating blade shape, which determines the propeller's actual performance and noise, was established. The ability to photographically determine advanced propeller blade tip deflections and local flow field conditions, and to gain insight into aeroelastic instability, is demonstrated. The analytical prediction methods under development can be compared with these experimental data. Such comparisons contribute to the verification of the improved methods and support the design of future advanced propellers with enhanced performance and noise characteristics.

  5. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  6. Cultural Parallax and Content Analysis: Images of Black Women in High School History Textbooks

    ERIC Educational Resources Information Center

    Woyshner, Christine; Schocker, Jessica B.

    2015-01-01

    This study investigates the representation of Black women in high school history textbooks. To examine the extent to which Black women are represented visually and to explore how they are portrayed, the authors use a mixed-methods approach that draws on analytical techniques in content analysis and from visual culture studies. Their findings…

  7. TopicLens: Efficient Multi-Level Visual Topic Exploration of Large-Scale Document Collections.

    PubMed

    Kim, Minjeong; Kang, Kyeongpil; Park, Deokgun; Choo, Jaegul; Elmqvist, Niklas

    2017-01-01

    Topic modeling, which reveals underlying topics of a document corpus, has been actively adopted in visual analytics for large-scale document collections. However, due to its significant processing time and non-interactive nature, topic modeling has so far not been tightly integrated into a visual analytics workflow. Instead, most such systems are limited to utilizing a fixed, initial set of topics. Motivated by this gap in the literature, we propose a novel interaction technique called TopicLens that allows a user to dynamically explore data through a lens interface where topic modeling and the corresponding 2D embedding are efficiently computed on the fly. To support this interaction in real time while maintaining view consistency, we propose a novel efficient topic modeling method and a semi-supervised 2D embedding algorithm. Our work is based on improving state-of-the-art methods such as nonnegative matrix factorization and t-distributed stochastic neighbor embedding. Furthermore, we have built a web-based visual analytics system integrated with TopicLens. We use this system to measure the performance and the visualization quality of our proposed methods. We provide several scenarios showcasing the capability of TopicLens using real-world datasets.
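
    The abstract names nonnegative matrix factorization and t-SNE as the building blocks being improved. The sketch below shows those baseline steps with off-the-shelf scikit-learn components on a public corpus; it is a hedged illustration of the pipeline, not the on-the-fly TopicLens method, and the corpus, feature counts, and topic number are arbitrary choices.

```python
# Baseline NMF topic modeling plus a 2D t-SNE embedding of document-topic
# vectors, i.e., the non-interactive pipeline TopicLens is designed to speed up.
from sklearn.datasets import fetch_20newsgroups          # downloads a small public corpus
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
from sklearn.manifold import TSNE

docs = fetch_20newsgroups(remove=("headers", "footers", "quotes")).data[:500]

tfidf = TfidfVectorizer(max_features=2000, stop_words="english")
X = tfidf.fit_transform(docs)

nmf = NMF(n_components=10, init="nndsvd", random_state=0)
doc_topics = nmf.fit_transform(X)          # document-by-topic matrix

# Top terms per topic, e.g., for labeling lens regions.
terms = tfidf.get_feature_names_out()
for t, weights in enumerate(nmf.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {t}: {', '.join(top)}")

# 2D embedding of documents in topic space for a scatterplot view.
coords = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(doc_topics)
print(coords.shape)
```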

  8. Toward interactive search in remote sensing imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, Reid B; Hush, Do; Harvey, Neal

    2010-01-01

    To move from data to information in almost all science and defense applications requires a human in the loop to validate information products, resolve inconsistencies, and account for incomplete and potentially deceptive sources of information. This is a key motivation for visual analytics, which aims to develop techniques that complement and empower human users. By contrast, the vast majority of algorithms developed in machine learning aim to replace human users in data exploitation. In this paper we describe a recently introduced machine learning problem, called rare category detection, which may be a better match to visual analytic environments. We describe a new design criterion for this problem and present comparisons to existing techniques with both synthetic and real-world datasets. We conclude by describing an application in broad-area search of remote sensing imagery.

  9. Visualizing statistical significance of disease clusters using cartograms.

    PubMed

    Kronenfeld, Barry J; Wong, David W S

    2017-05-15

    Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, no guidelines exist for the visual assessment of statistical uncertainty on such maps. To address this shortcoming, we develop techniques for the visual determination of the statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area, and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference for aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction, and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of the statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and of the appropriate framework for visually assessing the statistical significance of spatial clusters.
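
    The paper's own formulae relating rate, area, and shape are not reproduced in the abstract. As background only, the sketch below shows the elementary one-sided Poisson test that this kind of significance reasoning typically rests on; the rate, population, and case count are invented numbers, not values from the study.

```python
# Hedged illustration: is an observed case count in a region significantly
# above what a background disease rate would predict?
from scipy.stats import poisson

background_rate = 4.2e-5      # cases per person (assumed overall rate)
population_at_risk = 85_000   # at-risk population covered by the region (assumed)
observed_cases = 9

expected = background_rate * population_at_risk
# P(X >= observed) under the null hypothesis of the background rate.
p_value = poisson.sf(observed_cases - 1, expected)
print(f"expected={expected:.2f}, observed={observed_cases}, p={p_value:.4f}")
```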

  10. Steady-State Visual Evoked Potentials and Phase Synchronization in Migraine Patients

    NASA Astrophysics Data System (ADS)

    Angelini, L.; Tommaso, M. De; Guido, M.; Hu, K.; Ivanov, P. Ch.; Marinazzo, D.; Nardulli, G.; Nitti, L.; Pellicoro, M.; Pierro, C.; Stramaglia, S.

    2004-07-01

    We investigate phase synchronization in EEG recordings from migraine patients. We use the analytic signal technique, based on the Hilbert transform, and find that migraine brains are characterized by enhanced alpha band phase synchronization in the presence of visual stimuli. Our findings show that migraine patients have an overactive regulatory mechanism that renders them more sensitive to external stimuli.
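
    A minimal sketch of the analytic-signal technique the abstract refers to is given below: band-pass two synthetic signals to the alpha band, extract instantaneous phase with the Hilbert transform, and compute a phase-locking value. The sampling rate, band edges, and signals are assumptions for illustration, not details from the study.

```python
# Phase synchronization via the analytic signal (Hilbert transform) on
# synthetic data standing in for two EEG channels.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 256.0                              # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)

# Band-pass filter to the alpha band (8-12 Hz).
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)

# Instantaneous phase from the analytic signal.
phase_x = np.angle(hilbert(xf))
phase_y = np.angle(hilbert(yf))

# Phase-locking value: 1 = perfect synchronization, 0 = none.
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"alpha-band PLV: {plv:.3f}")
```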

  11. Application of radar chart array analysis to visualize effects of formulation variables on IgG1 particle formation as measured by multiple analytical techniques

    PubMed Central

    Kalonia, Cavan; Kumru, Ozan S.; Kim, Jae Hyun; Middaugh, C. Russell; Volkin, David B.

    2013-01-01

    This study presents a novel method to visualize protein aggregate and particle formation data in order to rapidly evaluate the effect of solution and stress conditions on the physical stability of an IgG1 monoclonal antibody (mAb). Radar chart arrays were designed so that hundreds of Microflow Digital Imaging (MFI) solution measurements, evaluating different mAb formulations under varying stresses, could be presented in a single figure with minimal loss of data resolution. These MFI radar charts show measured changes in subvisible particle number, size, and morphology distribution as changes in the shape of polygons. Radar charts were also created to visualize mAb aggregate and particle formation across a wide size range by combining data sets from size exclusion chromatography (SEC), Archimedes resonant mass measurements, and MFI. We found that the environmental/mechanical stress condition (e.g., heat vs. agitation) was the most important factor influencing the particle size and morphology distribution for this IgG1 mAb. Additionally, the presence of NaCl exhibited pH- and stress-dependent behavior, resulting in promotion or inhibition of mAb particle formation. This data visualization technique provides a comprehensive analysis of the aggregation tendencies of this IgG1 mAb in different formulations under varying stresses, as measured by different analytical techniques. PMID:24122556
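
    To make the radar-chart encoding concrete, the sketch below draws a single polygon for one hypothetical formulation/stress condition using matplotlib; the study arrays hundreds of such cells, and the metric names and values here are invented.

```python
# One cell of a radar (spider) chart array: each axis is a particle metric,
# and the polygon shape summarizes one formulation/stress condition.
import numpy as np
import matplotlib.pyplot as plt

metrics = ["particles 2-10 um", "particles 10-25 um", "particles >25 um",
           "mean diameter", "aspect ratio", "intensity"]
values = [0.8, 0.55, 0.3, 0.6, 0.4, 0.7]    # normalized 0-1 per metric (assumed)

angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False).tolist()
values_closed = values + values[:1]          # close the polygon
angles_closed = angles + angles[:1]

ax = plt.subplot(111, polar=True)
ax.plot(angles_closed, values_closed, linewidth=1)
ax.fill(angles_closed, values_closed, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(metrics, fontsize=7)
ax.set_yticks([])
plt.title("One cell of a radar chart array (illustrative)")
plt.savefig("radar_cell.png", dpi=150)
```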

  12. LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.

    PubMed

    Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin

    2014-12-01

    The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, it also poses a great challenge to analyze this behavior and glean insights from such complex, large-scale data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from vast amounts of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a visual summary of the loyalty dynamics of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and interviews with domain experts demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.

  13. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education

    PubMed Central

    Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research. PMID:25469323

  14. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research.

  15. SOMFlow: Guided Exploratory Cluster Analysis with Self-Organizing Maps and Analytic Provenance.

    PubMed

    Sacha, Dominik; Kraus, Matthias; Bernard, Jurgen; Behrisch, Michael; Schreck, Tobias; Asano, Yuki; Keim, Daniel A

    2018-01-01

    Clustering is a core building block for data analysis, aiming to extract otherwise hidden structures and relations from raw datasets, such as particular groups that can be effectively related, compared, and interpreted. A plethora of visual-interactive cluster analysis techniques has been proposed to date; however, arriving at useful clusterings often requires several rounds of user interaction to fine-tune the data preprocessing and algorithms. We present a multi-stage Visual Analytics (VA) approach for iterative cluster refinement, together with an implementation (SOMFlow) that uses Self-Organizing Maps (SOM) to analyze time series data. It supports exploration by offering the analyst a visual platform to analyze intermediate results, adapt the underlying computations, iteratively partition the data, and reflect on previous analytical activities. The history of previous decisions is explicitly visualized within a flow graph, allowing earlier cluster refinements to be compared and their relations explored. We further leverage quality and interestingness measures to guide the analyst in the discovery of useful patterns, relations, and data partitions. We conducted two pair analytics experiments together with a subject matter expert in speech intonation research to demonstrate that the approach is effective for interactive data analysis, supporting enhanced understanding of clustering results as well as of the interactive process itself.
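
    For readers unfamiliar with the underlying model, the sketch below is a minimal, self-contained Self-Organizing Map trained on synthetic feature vectors, ending with the best-matching-unit assignment that a SOM-based grouping step produces. It is not the SOMFlow implementation; the map size, iteration count, and decay schedules are arbitrary.

```python
# Minimal SOM: learn a small 2D grid of prototype vectors, then assign each
# item to its best-matching unit (the basic grouping a SOM provides).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))               # 200 items, 8 features (synthetic)
rows, cols = 4, 4                           # map size (assumed)
weights = rng.normal(size=(rows, cols, X.shape[1]))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, sigma0, lr0 = 2000, 2.0, 0.5
for it in range(n_iter):
    x = X[rng.integers(len(X))]
    # Best-matching unit: node whose prototype is closest to x.
    d = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    # Decay learning rate and neighborhood radius over time.
    frac = it / n_iter
    lr = lr0 * (1 - frac)
    sigma = sigma0 * (1 - frac) + 0.5
    # Gaussian neighborhood pulls nodes near the BMU toward x.
    dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
    weights += lr * h * (x - weights)

# Map each item to its best-matching unit (a cell on the map).
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=2)), (rows, cols))
        for x in X]
print(bmus[:10])
```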

  16. Interactive visual exploration and analysis of origin-destination data

    NASA Astrophysics Data System (ADS)

    Ding, Linfang; Meng, Liqiu; Yang, Jian; Krisp, Jukka M.

    2018-05-01

    In this paper, we propose a visual analytics approach for the exploration of spatiotemporal interaction patterns of massive origin-destination data. Firstly, we visually query the movement database for data at certain time windows. Secondly, we conduct interactive clustering to allow the users to select input variables/features (e.g., origins, destinations, distance, and duration) and to adjust clustering parameters (e.g. distance threshold). The agglomerative hierarchical clustering method is applied for the multivariate clustering of the origin-destination data. Thirdly, we design a parallel coordinates plot for visualizing the precomputed clusters and for further exploration of interesting clusters. Finally, we propose a gradient line rendering technique to show the spatial and directional distribution of origin-destination clusters on a map view. We implement the visual analytics approach in a web-based interactive environment and apply it to real-world floating car data from Shanghai. The experiment results show the origin/destination hotspots and their spatial interaction patterns. They also demonstrate the effectiveness of our proposed approach.
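
    A hedged sketch of the clustering step described above follows: agglomerative clustering of origin-destination records on user-selected features, cut by a distance threshold rather than a fixed cluster count. The synthetic data, the feature scaling, and the threshold value are assumptions, not details from the paper.

```python
# Agglomerative hierarchical clustering of origin-destination records with a
# distance threshold, mirroring the interactive clustering step described above.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(1)
n = 500
# Columns: origin_x, origin_y, dest_x, dest_y, distance_km, duration_min
od = np.column_stack([
    rng.uniform(0, 50, n), rng.uniform(0, 50, n),
    rng.uniform(0, 50, n), rng.uniform(0, 50, n),
    rng.uniform(1, 30, n), rng.uniform(5, 90, n),
])

features = StandardScaler().fit_transform(od)    # put selected features on a common scale
model = AgglomerativeClustering(n_clusters=None, distance_threshold=3.0, linkage="ward")
labels = model.fit_predict(features)
print(f"{model.n_clusters_} clusters from {n} OD records")
```

    In the interactive setting, the feature subset and the threshold are the knobs the user adjusts; each change reruns the clustering and refreshes the parallel coordinates and map views.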

  17. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, which will not be sufficient for a future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision for improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  18. Hyperspectral imaging for non-contact analysis of forensic traces.

    PubMed

    Edelman, G J; Gaston, E; van Leeuwen, T G; Cullen, P J; Aalders, M C G

    2012-11-30

    Hyperspectral imaging (HSI) integrates conventional imaging and spectroscopy, to obtain both spatial and spectral information from a specimen. This technique enables investigators to analyze the chemical composition of traces and simultaneously visualize their spatial distribution. HSI offers significant potential for the detection, visualization, identification and age estimation of forensic traces. The rapid, non-destructive and non-contact features of HSI mark its suitability as an analytical tool for forensic science. This paper provides an overview of the principles, instrumentation and analytical techniques involved in hyperspectral imaging. We describe recent advances in HSI technology motivating forensic science applications, e.g. the development of portable and fast image acquisition systems. Reported forensic science applications are reviewed. Challenges are addressed, such as the analysis of traces on backgrounds encountered in casework, concluded by a summary of possible future applications. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.

    PubMed

    Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio

    2009-12-01

    Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.

  20. A comparative review of optical surface contamination assessment techniques

    NASA Technical Reports Server (NTRS)

    Heaney, James B.

    1987-01-01

    This paper will review the relative sensitivities and practicalities of the common surface analytical methods that are used to detect and identify unwelcome adsorbants on optical surfaces. The compared methods include visual inspection, simple reflectometry and transmissiometry, ellipsometry, infrared absorption and attenuated total reflectance spectroscopy (ATR), Auger electron spectroscopy (AES), scanning electron microscopy (SEM), secondary ion mass spectrometry (SIMS), and mass accretion determined by quartz crystal microbalance (QCM). The discussion is biased toward those methods that apply optical thin film analytical techniques to spacecraft optical contamination problems. Examples are cited from both ground based and in-orbit experiments.

  1. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    PubMed Central

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  2. The Effect of Multispectral Image Fusion Enhancement on Human Efficiency

    DTIC Science & Technology

    2017-03-20

    human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish...applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows...augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of

  3. Visualization rhetoric: framing effects in narrative visualization.

    PubMed

    Hullman, Jessica; Diakopoulos, Nicholas

    2011-12-01

    Narrative visualizations combine conventions of communicative and exploratory information visualization to convey an intended story. We demonstrate visualization rhetoric as an analytical framework for understanding how design techniques that prioritize particular interpretations in visualizations that "tell a story" can significantly affect end-user interpretation. We draw a parallel between narrative visualization interpretation and evidence from framing studies in political messaging, decision-making, and literary studies. Devices for understanding the rhetorical nature of narrative information visualizations are presented, informed by the rigorous application of concepts from critical theory, semiotics, journalism, and political theory. We draw attention to how design tactics represent additions or omissions of information at various levels (the data, visual representation, textual annotations, and interactivity) and how visualizations denote and connote phenomena with reference to unstated viewing conventions and codes. Classes of rhetorical techniques identified via a systematic analysis of recent narrative visualizations are presented and characterized according to their rhetorical contribution to the visualization. We describe how designers and researchers can benefit from the potentially positive aspects of visualization rhetoric in designing engaging, layered narrative visualizations and how our framework can shed light on how a visualization design prioritizes specific interpretations. We identify areas where future inquiry into visualization rhetoric can improve understanding of visualization interpretation. © 2011 IEEE

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A

    Interactive data visualization leverages human visual perception and cognition to improve the accuracy and effectiveness of data analysis. When combined with automated data analytics, data visualization systems orchestrate the strengths of humans with the computational power of machines to solve problems neither approach can manage in isolation. In the intelligent transportation system domain, such systems are necessary to support decision making over large and complex data streams. In this chapter, we provide an introduction to several key topics related to the design of data visualization systems. In addition to an overview of key techniques and strategies, we describe practical design principles. The chapter concludes with a detailed case study involving the design of a multivariate visualization tool.

  5. Learning Dashboards

    ERIC Educational Resources Information Center

    Charleer, Sven; Klerkx, Joris; Duval, Erik

    2014-01-01

    This article explores how information visualization techniques can be applied to learning analytics data to help teachers and students deal with the abundance of learner traces. We also investigate how the affordances of large interactive surfaces can facilitate a collaborative sense-making environment for multiple students and teachers to explore…

  6. An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska

    USGS Publications Warehouse

    Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.

    2009-01-01

    Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly-used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentration by each analytical method along the 2007 traverse. A simple matrix by element versus method, populated with a value based on the significance classification, provides a method for ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate the several possible interpretations of the data.

  7. The Human is the Loop: New Directions for Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Hossain, Shahriar H.; Ramakrishnan, Naren

    2014-01-28

    Visual analytics is the science of marrying interactive visualizations and analytic algorithms to support exploratory knowledge discovery in large datasets. We argue for a shift from a ‘human in the loop’ philosophy for visual analytics to a ‘human is the loop’ viewpoint, where the focus is on recognizing analysts’ work processes, and seamlessly fitting analytics into that existing interactive process. We survey a range of projects that provide visual analytic support contextually in the sensemaking loop, and outline a research agenda along with future challenges.

  8. An Affordance-Based Framework for Human Computation and Human-Computer Collaboration.

    PubMed

    Crouser, R J; Chang, R

    2012-12-01

    Visual Analytics is "the science of analytical reasoning facilitated by visual interactive interfaces". The goal of this field is to develop tools and methodologies for approaching problems whose size and complexity render them intractable without the close coupling of both human and machine analysis. Researchers have explored this coupling in many venues: VAST, Vis, InfoVis, CHI, KDD, IUI, and more. While there have been myriad promising examples of human-computer collaboration, there exists no common language for comparing systems or describing the benefits afforded by designing for such collaboration. We argue that this area would benefit significantly from consensus about the design attributes that define and distinguish existing techniques. In this work, we have reviewed 1,271 papers from many of the top-ranking conferences in visual analytics, human-computer interaction, and visualization. From these, we have identified 49 papers that are representative of the study of human-computer collaborative problem-solving, and provide a thorough overview of the current state-of-the-art. Our analysis has uncovered key patterns of design hinging on human and machine-intelligence affordances, and also indicates unexplored avenues in the study of this area. The results of this analysis provide a common framework for understanding these seemingly disparate branches of inquiry, which we hope will motivate future work in the field.

  9. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oesterling, Patrick; Heine, Christian; Weber, Gunther H.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  10. Multi-focus and multi-level techniques for visualization and analysis of networks with thematic data

    NASA Astrophysics Data System (ADS)

    Cossalter, Michele; Mengshoel, Ole J.; Selker, Ted

    2013-01-01

    Information-rich data sets bring several challenges in the areas of visualization and analysis, even when associated with node-link network visualizations. This paper presents an integration of multi-focus and multi-level techniques that enable interactive, multi-step comparisons in node-link networks. We describe NetEx, a visualization tool that enables users to simultaneously explore different parts of a network and its thematic data, such as time series or conditional probability tables. NetEx, implemented as a Cytoscape plug-in, has been applied to the analysis of electrical power networks, Bayesian networks, and the Enron e-mail repository. In this paper we briefly discuss visualization and analysis of the Enron social network, but focus on data from an electrical power network. Specifically, we demonstrate how NetEx supports the analytical task of electrical power system fault diagnosis. Results from a user study with 25 subjects suggest that NetEx enables more accurate isolation of complex faults compared to an especially designed software tool.

  11. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  12. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application, and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies, and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  13. VAST Challenge 2016: Streaming Visual Analytics

    DTIC Science & Technology

    2016-10-25

    understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time...received. Mini-Challenge 1: Design Challenge Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia...Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries of what constitutes a visual analytics system, and to

  14. General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems

    PubMed Central

    Haghighi, Maryam; Rezaei, Karamatollah

    2012-01-01

    Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of the systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques including experimental setups for small and large deformation rheological measurements and microstructural image analysis were presented in more details. PMID:22645484

  15. Combined use of optical and electron microscopic techniques for the measurement of hygroscopic property, chemical composition, and morphology of individual aerosol particles.

    PubMed

    Ahn, Kang-Ho; Kim, Sun-Man; Jung, Hae-Jin; Lee, Mi-Jung; Eom, Hyo-Jin; Maskey, Shila; Ro, Chul-Un

    2010-10-01

    In this work, an analytical method for the characterization of the hygroscopic property, chemical composition, and morphology of individual aerosol particles is introduced. The method, which is based on the combined use of optical and electron microscopic techniques, is simple and easy to apply. An optical microscopic technique was used to perform the visual observation of the phase transformation and hygroscopic growth of aerosol particles on a single particle level. A quantitative energy-dispersive electron probe X-ray microanalysis, named low-Z particle EPMA, was used to perform a quantitative chemical speciation of the same individual particles after the measurement of the hygroscopic property. To validate the analytical methodology, the hygroscopic properties of artificially generated NaCl, KCl, (NH(4))(2)SO(4), and Na(2)SO(4) aerosol particles of micrometer size were investigated. The practical applicability of the analytical method for studying the hygroscopic property, chemical composition, and morphology of ambient aerosol particles is demonstrated.

  16. Query2Question: Translating Visualization Interaction into Natural Language.

    PubMed

    Nafari, Maryam; Weaver, Chris

    2015-06-01

    Richly interactive visualization tools are increasingly popular for data exploration and analysis in a wide variety of domains. Existing systems and techniques for recording provenance of interaction focus either on comprehensive automated recording of low-level interaction events or on idiosyncratic manual transcription of high-level analysis activities. In this paper, we present the architecture and translation design of a query-to-question (Q2Q) system that automatically records user interactions and presents them semantically using natural language (written English). Q2Q takes advantage of domain knowledge and uses natural language generation (NLG) techniques to translate and transcribe a progression of interactive visualization states into a visual log of styled text that complements and effectively extends the functionality of visualization tools. We present Q2Q as a means to support a cross-examination process in which questions, rather than interactions, are the focus of analytic reasoning and action. We describe the architecture and implementation of the Q2Q system, discuss key design factors and variations that affect question generation, and present several visualizations that incorporate Q2Q for analysis in a variety of knowledge domains.
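
    As a toy illustration of the query-to-question idea, the sketch below maps a recorded interaction event to an English question through a template. The event schema and template are invented for the example and do not reflect the Q2Q system's actual API or grammar.

```python
# Template-based translation of a recorded filter interaction into a question.
from dataclasses import dataclass

@dataclass
class FilterEvent:
    view: str
    field: str
    low: float
    high: float

TEMPLATES = {
    "filter": "Which records in the {view} view have {field} between {low} and {high}?",
}

def to_question(event: FilterEvent) -> str:
    # A fuller system would select templates by event type and style the output text.
    return TEMPLATES["filter"].format(view=event.view, field=event.field,
                                      low=event.low, high=event.high)

print(to_question(FilterEvent(view="scatterplot", field="delay", low=10, high=45)))
```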

  17. Developing Guidelines for Assessing Visual Analytics Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytics environments based on a synthesis of reviews of the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews, we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems: the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the types of tasks for which the visual analytic systems used in these studies were designed. More research and refinement are needed by the visual analytics community to provide additional evaluation guidelines for different types of visual analytic environments.

  18. 3D ToF-SIMS Analysis of Peptide Incorporation into MALDI Matrix Crystals with Sub-micrometer Resolution.

    PubMed

    Körsgen, Martin; Pelster, Andreas; Dreisewerd, Klaus; Arlinghaus, Heinrich F

    2016-02-01

    The analytical sensitivity in matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is largely affected by the specific analyte-matrix interaction, in particular by the possible incorporation of the analytes into crystalline MALDI matrices. Here we used time-of-flight secondary ion mass spectrometry (ToF-SIMS) to visualize the incorporation of three peptides with different hydrophobicities, bradykinin, Substance P, and vasopressin, into two classic MALDI matrices, 2,5-dihydroxybenzoic acid (DHB) and α-cyano-4-hydroxycinnamic acid (HCCA). For depth profiling, an Ar cluster ion beam was used to gradually sputter through the matrix crystals without causing significant degradation of matrix or biomolecules. A pulsed Bi3 ion cluster beam was used to image the lateral analyte distribution in the center of the sputter crater. Using this dual beam technique, the 3D distribution of the analytes and spatial segregation effects within the matrix crystals were imaged with sub-μm resolution. The technique could in the future enable matrix-enhanced (ME)-ToF-SIMS imaging of peptides in tissue slices at ultra-high resolution.

  19. 3D ToF-SIMS Analysis of Peptide Incorporation into MALDI Matrix Crystals with Sub-micrometer Resolution

    NASA Astrophysics Data System (ADS)

    Körsgen, Martin; Pelster, Andreas; Dreisewerd, Klaus; Arlinghaus, Heinrich F.

    2016-02-01

    The analytical sensitivity in matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is largely affected by the specific analyte-matrix interaction, in particular by the possible incorporation of the analytes into crystalline MALDI matrices. Here we used time-of-flight secondary ion mass spectrometry (ToF-SIMS) to visualize the incorporation of three peptides with different hydrophobicities, bradykinin, Substance P, and vasopressin, into two classic MALDI matrices, 2,5-dihydroxybenzoic acid (DHB) and α-cyano-4-hydroxycinnamic acid (HCCA). For depth profiling, an Ar cluster ion beam was used to gradually sputter through the matrix crystals without causing significant degradation of matrix or biomolecules. A pulsed Bi3 ion cluster beam was used to image the lateral analyte distribution in the center of the sputter crater. Using this dual beam technique, the 3D distribution of the analytes and spatial segregation effects within the matrix crystals were imaged with sub-μm resolution. The technique could in the future enable matrix-enhanced (ME)-ToF-SIMS imaging of peptides in tissue slices at ultra-high resolution.

  20. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.
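
    As a small illustration of the kind of preprocessing such a tool performs, the sketch below aligns two irregularly sampled sensor streams on a common time base and downsamples them for an overview; the column names, time span, and one-second grid are assumptions, not Falcon's actual pipeline.

```python
# Align irregularly sampled sensor streams and downsample to a regular grid
# suitable for an overview visualization.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

def make_stream(name, n):
    # Irregular timestamps over a ten-minute window, one value per timestamp.
    ts = pd.to_datetime("2017-01-01") + pd.to_timedelta(np.sort(rng.uniform(0, 600, n)), unit="s")
    return pd.DataFrame({name: rng.normal(size=n)}, index=ts)

temp = make_stream("melt_pool_temp", 4000)   # hypothetical high-rate stream
power = make_stream("beam_power", 900)       # hypothetical low-rate stream

# Merge on time, then resample to a regular 1-second grid with mean aggregation,
# filling short gaps by interpolation for plotting purposes only.
merged = temp.join(power, how="outer")
overview = merged.resample("1s").mean().interpolate(limit=5)
print(overview.describe())
```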

  1. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  2. Visual communication in the psychoanalytic situation.

    PubMed

    Kanzer, M

    1980-01-01

    The relationship between verbal and visual aspects of the analytic proceedings shows them blended integrally in the experiences of both patient and analyst and in contributing to the insights derived during the treatment. Areas in which the admixture of the verbal and visual occurs are delineated. Awareness of the visual aspects gives substance to the operations of empathy, intuition, acting out, working through, etc. Some typical features of visual "language" are noted and related to the analytic situation. As such they can be translated with the use of logic and consciousness on the analyst's part, not mere random eruptions of intuition. The original significance of dreams as a royal road to the unconscious is confirmed, but we also find in them insights to be derived with higher mental processes. Finally, dyadic aspects of the formation and aims of dreams during analysis are pointed out, with important implications for the analyst's own self-supervision of his techniques and "real personality" and their effects upon the patient. How remarkable that Dora's dreams, all too belatedly teaching Freud about their transference implications, still have so much more to communicate that derives from his capacity to record faithfully observations he was not yet ready to explain.

  3. Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays

    NASA Astrophysics Data System (ADS)

    Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko

    The increasing prevalence of distributed human microtasking (crowdsourcing) has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra high resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real-time with and was guided by researchers in the remote visual analytics laboratory, swiftly sifting through incoming crowdsourced data to identify target locations that were identified as viable archaeological sites.
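    As an illustration of the kind of consensus filtering such a pipeline might apply, the sketch below aggregates noisy crowdsourced point tags into grid cells and keeps only cells confirmed by several distinct contributors. This is an assumed workflow, not the authors' system; the coordinates, cell size, and vote threshold are hypothetical.

```python
# Minimal sketch (assumed workflow, not the authors' system): filter noisy
# crowdsourced point tags by requiring agreement from several distinct
# contributors within the same spatial grid cell.
from collections import defaultdict
from datetime import datetime

# Each record: (latitude, longitude, contributor_id, timestamp) -- hypothetical data.
tags = [
    (43.51, 103.82, "u1", datetime(2012, 7, 1, 9, 0)),
    (43.51, 103.82, "u2", datetime(2012, 7, 1, 9, 5)),
    (43.52, 103.83, "u3", datetime(2012, 7, 1, 9, 7)),
    (41.00, 100.00, "u4", datetime(2012, 7, 2, 8, 0)),  # likely noise: single vote
]

CELL_DEG = 0.02      # grid cell size in degrees (illustrative)
MIN_VOTES = 2        # minimum number of distinct contributors per cell

def cell(lat, lon):
    # Snap a coordinate to its grid cell index.
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

votes = defaultdict(set)
for lat, lon, user, _ts in tags:
    votes[cell(lat, lon)].add(user)

candidate_cells = {c for c, users in votes.items() if len(users) >= MIN_VOTES}
print(f"{len(candidate_cells)} candidate site cell(s):", candidate_cells)
```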

  4. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  5. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    A new field of research, visual analytics, has recently been introduced. This has been defined as "the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.

  7. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    NASA Astrophysics Data System (ADS)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text over the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be available in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, which constitute non-trivial knowledge, from massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing, and analyzing unstructured text data and visualizing cleaned text data into multiple forms such as Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
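    A minimal sketch of the core artifacts the abstract names, a Document Term Matrix, term frequencies, and a document dendrogram, is shown below using scikit-learn and SciPy. This is not the VisualUrText implementation; the toy corpus and parameter choices are illustrative only.

```python
# Minimal sketch (not the VisualUrText tool itself): build a Document Term
# Matrix, corpus term frequencies, and a document dendrogram from a toy corpus.
from scipy.cluster.hierarchy import dendrogram, linkage
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "text mining finds patterns in unstructured text",
    "social networking applications generate unstructured text",
    "data mining and machine learning discover trends",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)              # Document Term Matrix (sparse)
terms = vectorizer.get_feature_names_out()

# Term frequencies across the corpus (basis for a frequency graph or word cloud).
freqs = dict(zip(terms, dtm.sum(axis=0).A1))
print(sorted(freqs.items(), key=lambda kv: -kv[1])[:5])

# Hierarchical clustering of documents for a dendrogram.
Z = linkage(dtm.toarray(), method="ward")
tree = dendrogram(Z, no_plot=True, labels=[f"doc{i}" for i in range(len(docs))])
print(tree["ivl"])   # leaf ordering of the documents
```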

  8. Social Media Visual Analytics for Events

    NASA Astrophysics Data System (ADS)

    Diakopoulos, Nicholas; Naaman, Mor; Yazdani, Tayebeh; Kivran-Swaine, Funda

    For large-scale multimedia events such as televised debates and speeches, the amount of content on social media channels such as Facebook or Twitter can easily become overwhelming, yet still contain information that may aid and augment understanding of the multimedia content via individual social media items, or aggregate information from the crowd's response. In this work we discuss this opportunity in the context of a social media visual analytic tool, Vox Civitas, designed to help journalists, media professionals, or other researchers make sense of large-scale aggregations of social media content around multimedia broadcast events. We discuss the design of the tool, present and evaluate the text analysis techniques used to enable the presentation, and detail the visual and interaction design. We provide an exploratory evaluation based on a user study in which journalists interacted with the system to analyze and report on a dataset of over 100,000 Twitter messages collected during the broadcast of the U.S. State of the Union presidential address in 2010.

  9. Data Analytics and Visualization for Large Army Testing Data

    DTIC Science & Technology

    2013-09-01

    ...and relationships in the data that would otherwise remain hidden. Bibliography (excerpt): 1. Goodall, J. R.; Tesone, D. R. Visual Analytics for Network... Software Visualization, 2003, pp 143–149. 3. Goodall, J. R.; Sowul, M. VIAssist: Visual Analytics for Cyber Defense, IEEE Conference on Technologies...

  10. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on analyzing the relationships between familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists where we compare the variation of the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust that is comparable to that in the conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  11. EEG synchronization and migraine

    NASA Astrophysics Data System (ADS)

    Stramaglia, Sebastiano; Angelini, Leonardo; Pellicoro, Mario; Hu, Kun; Ivanov, Plamen Ch.

    2004-03-01

    We investigate phase synchronization in EEG recordings from migraine patients. We use the analytic signal technique, based on the Hilbert transform, and find that migraine brains are characterized by enhanced alpha band phase synchronization in the presence of visual stimuli. Our findings show that migraine patients have an overactive regulatory mechanism that renders them more sensitive to external stimuli.
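    The analytic signal technique mentioned above can be illustrated with a short sketch: band-pass two channels, take the Hilbert transform to obtain instantaneous phases, and compute a phase-locking value. The synthetic signals, filter band, and sampling rate below are illustrative assumptions, not the study's data or parameters.

```python
# Minimal sketch of the analytic-signal technique: extract instantaneous phases
# with the Hilbert transform and compute a phase-locking value (PLV) between
# two synthetic "channels". All signal parameters are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                              # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
alpha = 10.0                            # alpha-band frequency (Hz)
x = np.sin(2 * np.pi * alpha * t) + 0.5 * np.random.randn(t.size)
y = np.sin(2 * np.pi * alpha * t + 0.3) + 0.5 * np.random.randn(t.size)

# Band-pass both channels to the alpha band (8-12 Hz) before phase extraction.
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)

# Analytic signal -> instantaneous phase.
phase_x = np.angle(hilbert(xf))
phase_y = np.angle(hilbert(yf))

# Phase-locking value: 1 = perfect synchronization, 0 = no stable phase relation.
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"alpha-band PLV: {plv:.3f}")
```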

  12. Preparation and Analysis of Potassium Tris(Oxalato)Ferrate(III)Trihydrate: A General Chemistry Experiment.

    ERIC Educational Resources Information Center

    Olmsted, John

    1984-01-01

    Describes a five-period experiment which: (1) integrates preparative and analytical techniques; (2) utilizes a photochemical reaction that excites student interest both from visual impact and as an introduction to photoinduced processes; (3) provides accurate results; and (4) costs less than $0.20 per student per laboratory session. (JN)

  13. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely-used methods based on probabilistic modeling have drawbacks in terms of consistency from multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
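    For readers unfamiliar with NMF-based topic modeling, the sketch below shows a plain scikit-learn NMF factorization of a TF-IDF matrix into document-topic and topic-term factors. UTOPIAN's semi-supervised, interactively steerable formulation is not reproduced; the corpus and parameters are illustrative.

```python
# Minimal sketch of NMF-based topic modeling (plain scikit-learn NMF; UTOPIAN's
# semi-supervised, user-steerable formulation is not reproduced here).
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "interactive visualization of large document collections",
    "topic modeling of document collections with matrix factorization",
    "nonnegative matrix factorization converges consistently across runs",
    "user feedback steers interactive topic models",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

n_topics = 2
nmf = NMF(n_components=n_topics, init="nndsvda", random_state=0)
W = nmf.fit_transform(X)          # document-topic weights
H = nmf.components_               # topic-term weights

terms = tfidf.get_feature_names_out()
for k, topic in enumerate(H):
    top = terms[topic.argsort()[::-1][:4]]
    print(f"topic {k}: {' '.join(top)}")
```

    In UTOPIAN the factors W and H are additionally constrained by user-supplied reference information, which is what makes the result steerable; the plain factorization above is only the starting point.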

  14. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time sampled data sets, providing an integrated mechanism that aids the user in collaboratively exploring, presenting and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4 hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and provide a snapshot of the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, includes multiple windows which interactively respond to the user selections, so that when selecting an object and changing it in one window, it will automatically update in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data, via online access, which also allows its remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities, allowing users to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information, on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization that incorporates the ideas of storytelling, web-based linked views, and other visualization techniques, for a 4 hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.
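    The snapshot mechanism described above can be approximated by serializing the current view state into a shareable link. The sketch below is a hypothetical illustration of that idea, not the GAV framework's actual API; all field names and the URL are invented.

```python
# Minimal sketch of a storytelling "snapshot" (field names and URL are
# hypothetical, not the GAV framework's API): capture the current view state
# so it can be shared as a hyperlink and recreated by another user.
import base64
import json

def make_snapshot(time_step, highlighted_regions, color_classes, embedded_data=None):
    state = {
        "time_step": time_step,                  # e.g. minutes into the flood event
        "highlighted_regions": highlighted_regions,
        "color_classes": color_classes,          # class breaks for the colour legend
        "embedded_data": embedded_data,          # optional small data payload
    }
    payload = base64.urlsafe_b64encode(json.dumps(state).encode()).decode()
    return f"https://example.org/flood-story#snapshot={payload}"

def load_snapshot(url):
    payload = url.split("#snapshot=", 1)[1]
    return json.loads(base64.urlsafe_b64decode(payload))

link = make_snapshot(120, ["Alcântara", "Baixa"], [0.0, 0.5, 1.0, 2.0])
print(link)
print(load_snapshot(link)["highlighted_regions"])
```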

  15. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  16. Visual Analytics 101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    This course will introduce the field of Visual Analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  17. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  18. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  19. Changes in Visual/Spatial and Analytic Strategy Use in Organic Chemistry with the Development of Expertise

    ERIC Educational Resources Information Center

    Vlacholia, Maria; Vosniadou, Stella; Roussos, Petros; Salta, Katerina; Kazi, Smaragda; Sigalas, Michael; Tzougraki, Chryssa

    2017-01-01

    We present two studies that investigated the adoption of visual/spatial and analytic strategies by individuals at different levels of expertise in the area of organic chemistry, using the Visual Analytic Chemistry Task (VACT). The VACT allows the direct detection of analytic strategy use without drawing inferences about underlying mental…

  20. The case for visual analytics of arsenic concentrations in foods.

    PubMed

    Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R

    2010-05-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) an introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.

  1. The Case for Visual Analytics of Arsenic Concentrations in Foods

    PubMed Central

    Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.

    2010-01-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic contaminated water may result in manifestations of toxicity in all systems of the body. Visual Analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary in foods, making it impractical to provide a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) an introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005

  2. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
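    The paper's engine targets cluster architectures; the sketch below only illustrates the underlying data-parallel pattern on a single machine, using Python's multiprocessing to compute per-document term counts and merge them. The corpus content and worker count are illustrative assumptions.

```python
# Minimal single-node sketch of the data-parallel idea (the paper targets
# cluster architectures; this only illustrates map-then-merge text processing
# with Python's multiprocessing on one machine).
import re
from collections import Counter
from multiprocessing import Pool

def term_counts(doc: str) -> Counter:
    # Lowercase, tokenize, and count terms for one document.
    return Counter(re.findall(r"[a-z]+", doc.lower()))

def merge(counters):
    total = Counter()
    for c in counters:
        total.update(c)
    return total

if __name__ == "__main__":
    docs = ["Visual analytics of massive text", "Massive textual datasets"] * 1000
    with Pool(processes=4) as pool:
        partial = pool.map(term_counts, docs, chunksize=100)
    print(merge(partial).most_common(3))
```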

  3. Exploring the Micro-Social Geography of Children's Interactions in Preschool: A Long-Term Observational Study and Analysis Using Geographic Information Technologies

    ERIC Educational Resources Information Center

    Torrens, Paul M.; Griffin, William A.

    2013-01-01

    The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…

  4. Measuring the Effectiveness of Visual Analytics and Data Fusion Techniques on Situation Awareness in Cyber-Security

    ERIC Educational Resources Information Center

    Giacobe, Nicklaus A.

    2013-01-01

    Cyber-security involves monitoring a complex network of inter-related computers to prevent, identify, and remediate undesired actions. This work is performed in organizations by human analysts. These analysts monitor cyber-security sensors to develop and maintain situation awareness (SA) of both normal and abnormal activities that occur on…

  5. Visual analysis of online social media to open up the investigation of stance phenomena

    PubMed Central

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2015-01-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool. PMID:29249903

  6. Visual analysis of online social media to open up the investigation of stance phenomena.

    PubMed

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2016-04-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool.

  7. Visualizing Mobility of Public Transportation System.

    PubMed

    Zeng, Wei; Fu, Chi-Wing; Arisona, Stefan Müller; Erath, Alexander; Qu, Huamin

    2014-12-01

    Public transportation systems (PTSs) play an important role in modern cities, providing shared/massive transportation services that are essential for the general public. However, due to their increasing complexity, designing effective methods to visualize and explore PTS is highly challenging. Most existing techniques employ network visualization methods and focus on showing the network topology across stops while ignoring various mobility-related factors such as riding time, transfer time, waiting time, and round-the-clock patterns. This work aims to visualize and explore passenger mobility in a PTS with a family of analytical tasks based on inputs from transportation researchers. After exploring different design alternatives, we come up with an integrated solution with three visualization modules: isochrone map view for geographical information, isotime flow map view for effective temporal information comparison and manipulation, and OD-pair journey view for detailed visual analysis of mobility factors along routes between specific origin-destination pairs. The isotime flow map linearizes a flow map into a parallel isoline representation, maximizing the visualization of mobility information along the horizontal time axis while presenting clear and smooth pathways from origin to destinations. Moreover, we devise several interactive visual query methods for users to easily explore the dynamics of PTS mobility over space and time. Lastly, we also construct a PTS mobility model from millions of real passenger trajectories, and evaluate our visualization techniques with assorted case studies with the transportation researchers.
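    One of the mobility views described above, the isochrone map, can be illustrated with a small sketch: given travel times on a transit graph, find all stops reachable from an origin within a time budget. The graph, edge times, and budget below are hypothetical and do not reflect the authors' data or system.

```python
# Minimal sketch (not the authors' system): compute an "isochrone" set of
# public-transport stops reachable within a time budget, using travel times
# on a small hypothetical transit graph.
import networkx as nx

G = nx.DiGraph()
# Edges: (from_stop, to_stop, minutes) -- riding or transfer times, illustrative.
edges = [
    ("A", "B", 4), ("B", "C", 6), ("C", "D", 5),
    ("A", "E", 10), ("E", "D", 3), ("B", "E", 7),
]
for u, v, minutes in edges:
    G.add_edge(u, v, time=minutes)

def isochrone(graph, origin, budget_minutes):
    # Shortest travel time from the origin to every reachable stop.
    times = nx.single_source_dijkstra_path_length(graph, origin, weight="time")
    return {stop: t for stop, t in times.items() if t <= budget_minutes}

print(isochrone(G, "A", 12))   # stops reachable from A within 12 minutes
```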

  8. Beyond Control Panels: Direct Manipulation for Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Bradel, Lauren; North, Chris

    2013-07-19

    Information Visualization strives to provide visual representations through which users can think about and gain insight into information. By leveraging the visual and cognitive systems of humans, complex relationships and phenomena occurring within datasets can be uncovered by exploring information visually. Interaction metaphors for such visualizations are designed to give users direct control over the filters, queries, and other parameters controlling how the data is visually represented. Through the evolution of information visualization, more complex mathematical and data analytic models are being used to visualize relationships and patterns in data – creating the field of Visual Analytics. However, the expectations for how users interact with these visualizations have remained largely unchanged – focused primarily on the direct manipulation of parameters of the underlying mathematical models. In this article we present an opportunity to evolve the methodology for user interaction from the direct manipulation of parameters through visual control panels, to interactions designed specifically for visual analytic systems. Instead of focusing on traditional direct manipulation of mathematical parameters, the evolution of the field can be realized through direct manipulation within the visual representation – where users can not only gain insight, but also interact. This article describes future directions and research challenges that fundamentally change the meaning of direct manipulation with regards to visual analytics, advancing the Science of Interaction.

  9. Visualizing the Big (and Large) Data from an HPC Resource

    NASA Astrophysics Data System (ADS)

    Sisneros, R.

    2015-10-01

    Supercomputers are built to endure painfully large simulations and contend with resulting outputs. These are characteristics that scientists are all too willing to test the limits of in their quest for science at scale. The data generated during a scientist's workflow through an HPC center (large data) is the primary target for analysis and visualization. However, the hardware itself is also capable of generating volumes of diagnostic data (big data); this presents compelling opportunities to deploy analogous analytic techniques. In this paper we will provide a survey of some of the many ways in which visualization and analysis may be crammed into the scientific workflow as well as utilized on machine-specific data.

  10. Interaction Junk: User Interaction-Based Evaluation of Visual Analytic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; North, Chris

    2012-10-14

    With the growing need for visualization to aid users in understanding large, complex datasets, the ability for users to interact and explore these datasets is critical. As visual analytic systems have advanced to leverage powerful computational models and data analytics capabilities, the modes by which users engage and interact with the information are limited. Often, users are taxed with directly manipulating parameters of these models through traditional GUIs (e.g., using sliders to directly manipulate the value of a parameter). However, the purpose of user interaction in visual analytic systems is to enable visual data exploration – where users can focus on their task, as opposed to the tool or system. As a result, users can engage freely in data exploration and decision-making, for the purpose of gaining insight. In this position paper, we discuss how evaluating visual analytic systems can be approached through user interaction analysis, where the goal is to minimize the cognitive translation between the visual metaphor and the mode of interaction (i.e., reducing the "interaction junk"). We motivate this concept through a discussion of traditional GUIs used in visual analytics for direct manipulation of model parameters, and the importance of designing interactions that support visual data exploration.

  11. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    PubMed

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  12. Visual analysis of large heterogeneous social networks by semantic and structural abstraction.

    PubMed

    Shen, Zeqian; Ma, Kwan-Liu; Eliassi-Rad, Tina

    2006-01-01

    Social network analysis is an active area of study beyond sociology. It uncovers the invisible relationships between actors in a network and provides understanding of social processes and behaviors. It has become an important technique in a variety of application areas such as the Web, organizational studies, and homeland security. This paper presents a visual analytics tool, OntoVis, for understanding large, heterogeneous social networks, in which nodes and links could represent different concepts and relations, respectively. These concepts and relations are related through an ontology (also known as a schema). OntoVis is named such because it uses information in the ontology associated with a social network to semantically prune a large, heterogeneous network. In addition to semantic abstraction, OntoVis also allows users to do structural abstraction and importance filtering to make large networks manageable and to facilitate analytic reasoning. All these unique capabilities of OntoVis are illustrated with several case studies.
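    The semantic abstraction step can be illustrated with a short sketch: given a heterogeneous graph whose nodes carry ontology types, keep only the selected types and take the induced subgraph. This is an assumed simplification of the idea, not OntoVis itself; the node types and graph are invented.

```python
# Minimal sketch of semantic abstraction (not OntoVis itself): keep only the
# node types selected from the ontology and drop everything else, so a large
# heterogeneous network becomes a smaller, type-focused view.
import networkx as nx

G = nx.Graph()
# Heterogeneous nodes: each node carries an ontology concept as its "type".
G.add_nodes_from([
    ("alice", {"type": "person"}), ("bob", {"type": "person"}),
    ("acme", {"type": "organization"}), ("paper1", {"type": "document"}),
])
G.add_edges_from([("alice", "bob"), ("alice", "acme"),
                  ("bob", "paper1"), ("acme", "paper1")])

def semantic_prune(graph, keep_types):
    # Induced subgraph on nodes whose ontology type is in keep_types.
    keep = [n for n, d in graph.nodes(data=True) if d["type"] in keep_types]
    return graph.subgraph(keep).copy()

people_only = semantic_prune(G, {"person"})
print(people_only.nodes(), people_only.edges())
```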

  13. Visual Reconciliation of Alternative Similarity Spaces in Climate Modeling.

    PubMed

    Poco, Jorge; Dasgupta, Aritra; Wei, Yaxing; Hargrove, William; Schwalm, Christopher R; Huntzinger, Deborah N; Cook, Robert; Bertini, Enrico; Silva, Claudio T

    2014-12-01

    Visual data analysis often requires grouping of data objects based on their similarity. In many application domains researchers use algorithms and techniques like clustering and multidimensional scaling to extract groupings from data. While extracting these groups using a single similarity criterion is relatively straightforward, comparing alternative criteria poses additional challenges. In this paper we define visual reconciliation as the problem of reconciling multiple alternative similarity spaces through visualization and interaction. We derive this problem from our work on model comparison in climate science where climate modelers are faced with the challenge of making sense of alternative ways to describe their models: one through the output they generate, another through the large set of properties that describe them. Ideally, they want to understand whether groups of models with similar spatio-temporal behaviors share similar sets of criteria or, conversely, whether similar criteria lead to similar behaviors. We propose a visual analytics solution based on linked views that addresses this problem by allowing the user to dynamically create, modify and observe the interaction among groupings, thereby making the potential explanations apparent. We present case studies that demonstrate the usefulness of our technique in the area of climate science.

  14. Direct imaging of isofrequency contours in photonic structures

    DOE PAGES

    Regan, E. C.; Igarashi, Y.; Zhen, B.; ...

    2016-11-25

    The isofrequency contours of a photonic crystal are important for predicting and understanding exotic optical phenomena that are not apparent from high-symmetry band structure visualizations. We demonstrate a method to directly visualize the isofrequency contours of high-quality photonic crystal slabs that show quantitatively good agreement with numerical results throughout the visible spectrum. Our technique relies on resonance-enhanced photon scattering from generic fabrication disorder and surface roughness, so it can be applied to general photonic and plasmonic crystals or even quasi-crystals. We also present an analytical model of the scattering process, which explains the observation of isofrequency contours in our technique. Furthermore, the isofrequency contours provide information about the characteristics of the disorder and therefore serve as a feedback tool to improve fabrication processes.

  15. Analytical and Theranostic Applications of Gold Nanoparticles and Multifunctional Nanocomposites

    PubMed Central

    Khlebtsov, Nikolai; Bogatyrev, Vladimir; Dykman, Lev; Khlebtsov, Boris; Staroverov, Sergey; Shirokov, Alexander; Matora, Larisa; Khanadeev, Vitaly; Pylaev, Timofey; Tsyganova, Natalia; Terentyuk, Georgy

    2013-01-01

    Gold nanoparticles (GNPs) and GNP-based multifunctional nanocomposites are the subject of intensive studies and biomedical applications. This minireview summarizes our recent efforts in analytical and theranostic applications of engineered GNPs and nanocomposites by using plasmonic properties of GNPs and various optical techniques. Specifically, we consider analytical biosensing; visualization and bioimaging of bacterial, mammalian, and plant cells; photodynamic treatment of pathogenic bacteria; and photothermal therapy of xenografted tumors. In addition to recently published reports, we discuss new data on dot immunoassay diagnostics of mycobacteria, multiplexed immunoelectron microscopy analysis of Azospirillum brasilense, materno-embryonic transfer of GNPs in pregnant rats, and combined photodynamic and photothermal treatment of rat xenografted tumors with gold nanorods covered by a mesoporous silica shell doped with hematoporphyrin. PMID:23471188

  16. Dissociable meta-analytic brain networks contribute to coordinated emotional processing.

    PubMed

    Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R

    2018-06-01

    Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli are integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.

  17. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to major spatio-temporal data sources from major vendors such as USGS, NOAA, World Bank and World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and inform how others may freely access the tool.

  18. Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs

    NASA Astrophysics Data System (ADS)

    Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan

    2016-04-01

    Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. An ensemble of forecasts partially addresses these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in the decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to get new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work in the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small-multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. It provides the user with a more accurate measure of forecast uncertainty that could result in better decision-making. It offers different levels of abstraction to help with the recalibration of the RAR method. It also has an inspection tool that displays the selected analogs, their observations and statistical data. It gives the users access to inner parts of the method, unveiling hidden information. References [GR05] GNEITING T., RAFTERY A. E.: Weather forecasting with ensemble methods. Science 310, 5746, 248-249, 2005. [KAL03] KALNAY E.: Atmospheric modeling, data assimilation and predictability. Cambridge University Press, 2003. [PH06] PALMER T., HAGEDORN R.: Predictability of weather and climate. Cambridge University Press, 2006. [HW06] HAMILL T. M., WHITAKER J. S.: Probabilistic quantitative precipitation forecasts based on reforecast analogs: Theory and application. Monthly Weather Review 134, 11, 3209-3229, 2006. [DE06] DEITRICK S., EDSALL R.: The influence of uncertainty visualization on decision making: An empirical evaluation. Springer, 2006. [KMS08] KEIM D. A., MANSMANN F., SCHNEIDEWIND J., THOMAS J., ZIEGLER H.: Visual analytics: Scope and challenges. Springer, 2008.
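    The analog idea behind the RAR technique can be sketched as follows: find the historical forecasts most similar to today's forecast and estimate an exceedance probability from the corresponding observations. This is a simplified illustration, not Hamill and Whitaker's full method or Albero's implementation; the archive and thresholds are synthetic.

```python
# Minimal sketch of the analog idea behind the RAR technique (not the full
# method): find the historical forecasts closest to today's forecast and
# estimate a precipitation probability from their matching observations.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical archive: 5000 past forecasts (mm) and what was actually observed.
past_forecasts = rng.gamma(shape=2.0, scale=3.0, size=5000)
past_observations = past_forecasts * rng.lognormal(0.0, 0.4, size=5000)

def analog_probability(todays_forecast_mm, threshold_mm, k=50):
    # k nearest analogs by absolute forecast difference.
    idx = np.argsort(np.abs(past_forecasts - todays_forecast_mm))[:k]
    analog_obs = past_observations[idx]
    # Probability = fraction of analog observations exceeding the threshold.
    return float(np.mean(analog_obs > threshold_mm))

print(analog_probability(todays_forecast_mm=8.0, threshold_mm=10.0))
```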

  19. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163

  20. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit.

  1. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.

    2015-07-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploration and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  2. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques obtained over multiple spatial and spectral scales including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.
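    As an illustration of the multivariate step mentioned above, a minimal PCA pass over a stack of spectra via SVD (synthetic data only; MIND's actual pipeline, file formats, and co-registration steps are not shown):

      import numpy as np

      def pca_scores(spectra, n_components=3):
          """Project mean-centered spectra onto their leading principal components.

          spectra: (n_samples, n_channels) array, e.g. Raman intensities per spectral channel.
          Returns (scores, explained_variance_ratio).
          """
          X = spectra - spectra.mean(axis=0)
          # SVD of the centered data gives the principal axes directly
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          scores = X @ Vt[:n_components].T
          var_ratio = (s[:n_components] ** 2) / np.sum(s ** 2)
          return scores, var_ratio

      rng = np.random.default_rng(1)
      spectra = rng.normal(size=(200, 1024)) + np.linspace(0, 1, 1024)  # toy spectra
      scores, ratio = pca_scores(spectra)
      print(scores.shape, ratio)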

  3. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, where known patterns can be mined for. Also, with a human in the loop, it can bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we will focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  4. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  5. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand, perform and thus, hard to make efficient and effective informed decisions. The management involves a multi-faceted operation that requires the most robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs from the following aspects: 1) Better understanding and enforcement of complex inspection process that can bridge the gap between evidence gathering and decision making through the implementation of ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos and ground-mounted LIDAR etc.) with domain application knowledge to support machine understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through the flexible Service-oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.

  6. Pathways to Identity: Aiding Law Enforcement in Identification Tasks With Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruce, Joseph R.; Scholtz, Jean; Hodges, Duncan

    The nature of identity has changed dramatically in recent years and has grown in complexity. Identities are defined in multiple domains: biological and psychological elements strongly contribute, but biographical and cyber elements are also necessary to complete the picture. Law enforcement is beginning to adjust to these changes, recognizing their importance in criminal justice. The SuperIdentity project seeks to aid law enforcement officials in their identification tasks through research of techniques for discovering identity traits, generation of statistical models of identity, and analysis of identity traits through visualization. We present use cases compiled through user interviews in multiple fields, including law enforcement, as well as the modeling and visualization tools designed to aid in those use cases.

  7. Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies

    DTIC Science & Technology

    2011-10-01

    is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System...and filtering for rapid identification of relevant documents - Graphical environment for visual evidence marshaling - Interactive linking and...analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies

  8. [Application of the technique of analytical structure of project for the sub-project of websites catalog of the Virtual Health Library-Nursing].

    PubMed

    dos Santos, Luís Augusto; Marin, Heimar de Fátima; Marques, Isaac Rosa; Cunha, Isabel Cristina Kowal Olm

    2007-01-01

    This work intends, in a didactic form, to explain the benefits of using a project management technique named Work Breakdown Structure: a graphical tool for identifying the main results to be developed in a project. Real examples are drawn from a sub-project of the Virtual Health Library in Nursing (BVS-Enfermagem) for the development of its Websites Catalog. The benefits of graphical visualization for greater agreement among professionals of different expertise are presented.

  9. TimeBench: a data model and software library for visual analytics of time-oriented data.

    PubMed

    Rind, Alexander; Lammarsch, Tim; Aigner, Wolfgang; Alsallakh, Bilal; Miksch, Silvia

    2013-12-01

    Time-oriented data play an essential role in many Visual Analytics scenarios such as extracting medical insights from collections of electronic health records or identifying emerging problems and vulnerabilities in network traffic. However, many software libraries for Visual Analytics treat time as a flat numerical data type and insufficiently tackle the complexity of the time domain such as calendar granularities and intervals. Therefore, developers of advanced Visual Analytics designs need to implement temporal foundations in their application code over and over again. We present TimeBench, a software library that provides foundational data structures and algorithms for time-oriented data in Visual Analytics. Its expressiveness and developer accessibility have been evaluated through application examples demonstrating a variety of challenges with time-oriented data and long-term developer studies conducted in the scope of research and student projects.
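    A small sketch of the kind of temporal primitives the abstract refers to, i.e. intervals and calendar granularities treated as first-class objects rather than flat numbers (illustrative Python; TimeBench itself is a Java library and its API differs):

      from dataclasses import dataclass
      from datetime import datetime, timedelta

      @dataclass(frozen=True)
      class Interval:
          """A time interval with inclusive start and exclusive end."""
          start: datetime
          end: datetime

          def overlaps(self, other: "Interval") -> bool:
              return self.start < other.end and other.start < self.end

      def align_to_granularity(t: datetime, granularity: str) -> Interval:
          """Map an instant to the calendar granule (day or hour) that contains it."""
          if granularity == "day":
              start = t.replace(hour=0, minute=0, second=0, microsecond=0)
              return Interval(start, start + timedelta(days=1))
          if granularity == "hour":
              start = t.replace(minute=0, second=0, microsecond=0)
              return Interval(start, start + timedelta(hours=1))
          raise ValueError(f"unsupported granularity: {granularity}")

      visit = datetime(2013, 12, 1, 14, 37)
      print(align_to_granularity(visit, "day").overlaps(align_to_granularity(visit, "hour")))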

  10. TelCoVis: Visual Exploration of Co-occurrence in Urban Human Mobility Based on Telco Data.

    PubMed

    Wu, Wenchao; Xu, Jiayi; Zeng, Haipeng; Zheng, Yixian; Qu, Huamin; Ni, Bing; Yuan, Mingxuan; Ni, Lionel M

    2016-01-01

    Understanding co-occurrence in urban human mobility (i.e. people from two regions visit an urban place during the same time span) is of great value in a variety of applications, such as urban planning, business intelligence, social behavior analysis, as well as containing contagious diseases. In recent years, the widespread use of mobile phones brings an unprecedented opportunity to capture large-scale and fine-grained data to study co-occurrence in human mobility. However, due to the lack of systematic and efficient methods, it is challenging for analysts to carry out in-depth analyses and extract valuable information. In this paper, we present TelCoVis, an interactive visual analytics system, which helps analysts leverage their domain knowledge to gain insight into the co-occurrence in urban human mobility based on telco data. Our system integrates visualization techniques with new designs and combines them in a novel way to enhance analysts' perception for a comprehensive exploration. In addition, we propose to study the correlations in co-occurrence (i.e. people from multiple regions visit different places during the same time span) by means of biclustering techniques that allow analysts to better explore coordinated relationships among different regions and identify interesting patterns. The case studies based on a real-world dataset and interviews with domain experts have demonstrated the effectiveness of our system in gaining insights into co-occurrence and facilitating various analytical tasks.
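    A hedged sketch of the correlation-in-co-occurrence idea: build a region-by-place co-occurrence matrix and apply a standard biclustering routine (scikit-learn's SpectralCoclustering is used here as a stand-in; TelCoVis's own biclustering method may differ):

      import numpy as np
      from sklearn.cluster import SpectralCoclustering

      rng = np.random.default_rng(2)
      # Toy co-occurrence matrix: rows are home regions, columns are visited places,
      # entries count people from a region observed at a place in the same time span.
      cooc = rng.poisson(1.0, size=(20, 15)).astype(float)
      cooc[:10, :7] += 8    # plant one block of strongly co-occurring regions/places
      cooc[10:, 7:] += 8    # and a second block

      model = SpectralCoclustering(n_clusters=2, random_state=0).fit(cooc)
      print("region groups:", model.row_labels_)
      print("place groups: ", model.column_labels_)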

  11. Visual and Analytic Strategies in Geometry

    ERIC Educational Resources Information Center

    Kospentaris, George; Vosniadou, Stella; Kazic, Smaragda; Thanou, Emilian

    2016-01-01

    We argue that there is an increasing reliance on analytic strategies compared to visuospatial strategies, which is related to geometry expertise and not to individual differences in cognitive style. A Visual/Analytic Strategy Test (VAST) was developed to investigate the use of visuo-spatial and analytic strategies in geometry in 30 mathematics…

  12. Characterization and measurement of polymer wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.; Aron, P. R.

    1984-01-01

    Analytical tools which characterize the polymer wear process are discussed. The techniques discussed include visual observation of polymer wear with SEM, quantification of wear with surface profilometry and ellipsometry, study of wear chemistry with AES, XPS and SIMS, determination of interfacial polymer orientation and bonding with QUARTIR, characterization of polymer state with Raman spectroscopy, and measurement of stresses that develop in polymer films using an X-ray double crystal camera technique.

  13. Comparative analysis of methods for real-time analytical control of chemotherapies preparations.

    PubMed

    Bazin, Christophe; Cassard, Bruno; Caudron, Eric; Prognon, Patrice; Havard, Laurent

    2015-10-15

    Control of chemotherapy preparations is now an obligation in France, though analytical control is compulsory. Several methods are available, and none of them is presumed ideal. We wanted to compare them so as to determine which one could be the best choice. We compared non-analytical (visual and video-assisted, gravimetric) and analytical (HPLC/FIA, UV/FT-IR, UV/Raman, Raman) methods on the basis of our experience and a SWOT analysis. The results of the analysis show great differences between the techniques, but, as expected, none of them is without defects. However, they can probably be used in synergy. Overall, for the pharmacist willing to get involved, the implementation of control for chemotherapy preparations must be anticipated well in advance, with every parameter listed, and it remains, in our view, an analyst's job. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Figure analysis: A teaching technique to promote visual literacy and active learning.

    PubMed

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016. © 2016 The International Union of Biochemistry and Molecular Biology.

  15. CollaborationViz: Interactive Visual Exploration of Biomedical Research Collaboration Networks

    PubMed Central

    Bian, Jiang; Xie, Mengjun; Hudson, Teresa J.; Eswaran, Hari; Brochhausen, Mathias; Hanna, Josh; Hogan, William R.

    2014-01-01

    Social network analysis (SNA) helps us understand patterns of interaction between social entities. A number of SNA studies have shed light on the characteristics of research collaboration networks (RCNs). Especially, in the Clinical Translational Science Award (CTSA) community, SNA provides us a set of effective tools to quantitatively assess research collaborations and the impact of CTSA. However, descriptive network statistics are difficult for non-experts to understand. In this article, we present our experiences of building meaningful network visualizations to facilitate a series of visual analysis tasks. The basis of our design is multidimensional, visual aggregation of network dynamics. The resulting visualizations can help uncover hidden structures in the networks, elicit new observations of the network dynamics, compare different investigators and investigator groups, determine critical factors to the network evolution, and help direct further analyses. We applied our visualization techniques to explore the biomedical RCNs at the University of Arkansas for Medical Sciences – a CTSA institution. And, we created CollaborationViz, an open-source visual analytical tool to help network researchers and administration apprehend the network dynamics of research collaborations through interactive visualization. PMID:25405477

  16. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  17. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide the information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk provides insights about big data analytics methods in the context of science within various communities and offers different views of how correlation- and causality-based approaches complement each other in advancing science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  18. Lack of habituation of evoked visual potentials in analytic information processing style: evidence in healthy subjects.

    PubMed

    Buonfiglio, Marzia; Toscano, M; Puledda, F; Avanzini, G; Di Clemente, L; Di Sabato, F; Di Piero, V

    2015-03-01

    Habituation is considered one of the most basic mechanisms of learning. Habituation deficit to several sensory stimulations has been defined as a trait of the migraine brain and has also been observed in other disorders. On the other hand, the analytic information processing style is characterized by the habit of continually evaluating stimuli, and it has been associated with migraine. We investigated a possible correlation between lack of habituation of visual evoked potentials and analytic cognitive style in healthy subjects. According to the Sternberg-Wagner self-assessment inventory, 15 healthy volunteers (HV) with a high analytic score and 15 HV with a high global score were recruited. Both groups underwent visual evoked potential recordings after psychological evaluation. We observed a significant lack of habituation in analytical individuals compared to the global group. In conclusion, a reduced habituation of visual evoked potentials has been observed in analytic subjects. Our results suggest that further research should be undertaken regarding the relationship between analytic cognitive style and lack of habituation in both physiological and pathophysiological conditions.

  19. StreamSqueeze: a dynamic stream visualization for monitoring of event data

    NASA Astrophysics Data System (ADS)

    Mansmann, Florian; Krstajic, Milos; Fischer, Fabian; Bertini, Enrico

    2012-01-01

    While automated analytical solutions for data streams are already in place for clear-cut situations, only a few visual approaches have been proposed in the literature for exploratory analysis tasks on dynamic information. However, due to the competitive or security-related advantages that real-time information gives in domains such as finance, business, or networking, we are convinced that there is a need for exploratory visualization tools for data streams. Under the conditions that new events have higher relevance and that smooth transitions enable traceability of items, we propose a novel dynamic stream visualization called StreamSqueeze. In this technique the degree of interest of recent items is expressed through an increase in size, and thus recent events can be shown with more detail. The technique has two main benefits: first, the layout algorithm arranges items in several lists of various sizes and optimizes the positions within each list so that the transition of an item from one list to the other triggers the fewest visual changes; second, the animation scheme ensures that for 50 percent of the time an item has a static screen position, where reading is most effective, and then continuously shrinks and moves to its next static position in the subsequent list. To demonstrate the capability of our technique, we apply it to large, high-frequency news and syslog streams and show how it maintains optimal stability of the layout under the conditions given above.
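    A minimal sketch of a recency-based degree-of-interest assignment of the kind the description implies, with newer items allotted larger display sizes (the decay function and constants are assumptions; StreamSqueeze's actual layout and animation are more involved):

      def degree_of_interest(ages_seconds, max_size=120, min_size=8, half_life=300.0):
          """Map item age to a display size that decays smoothly with recency."""
          sizes = []
          for age in ages_seconds:
              weight = 0.5 ** (age / half_life)          # exponential decay of interest
              sizes.append(min_size + (max_size - min_size) * weight)
          return sizes

      # Newest events get the largest squares; older ones shrink toward min_size.
      print([round(s) for s in degree_of_interest([0, 60, 300, 1800])])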

  20. Service Bundle Recommendation for Person-Centered Care Planning in Cities.

    PubMed

    Kotoulas, Spyros; Daly, Elizabeth; Tommasi, Pierpaolo; Kishimoto, Akihiro; Lopez, Vanessa; Stephenson, Martin; Botea, Adi; Sbodio, Marco; Marinescu, Radu; Rooney, Ronan

    2016-01-01

    Providing appropriate support for the most vulnerable individuals carries enormous societal significance and economic burden. Yet, finding the right balance between costs, estimated effectiveness, and the experience of the care recipient is a daunting task that requires considering vast amounts of information. We present a system that helps care teams choose the optimal combination of providers for a set of services. We draw on techniques from Open Data processing, semantic processing, faceted exploration, visual analytics, transportation analytics, and multi-objective optimization. We present an implementation of the system using data from New York City and illustrate the feasibility of these technologies for guiding care workers in care planning.

  1. Liquid Microjunction Surface Sampling Probe Fluid Dynamics: Computational and Experimental Analysis of Coaxial Intercapillary Positioning Effects on Sample Manipulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ElNaggar, Mariam S; Barbier, Charlotte N; Van Berkel, Gary J

    A coaxial geometry liquid microjunction surface sampling probe (LMJ-SSP) enables direct extraction of analytes from surfaces for subsequent analysis by techniques like mass spectrometry. Solution dynamics at the probe-to-sample surface interface in the LMJ-SSP have been suspected to influence sampling efficiency and dispersion but have not been rigorously investigated. The effect on flow dynamics and analyte transport to the mass spectrometer caused by coaxial retraction of the inner and outer capillaries from each other and from the surface during sampling with a LMJ-SSP was investigated using computational fluid dynamics and experimentation. A transparent LMJ-SSP was constructed to provide the means for visual observation of the dynamics of the surface sampling process. Visual observation, computational fluid dynamics (CFD) analysis, and experimental results revealed that inner capillary axial retraction from the flush position relative to the outer capillary transitioned the probe from a continuous sampling and injection mode, through an intermediate regime, to a sample plug formation mode caused by eddy currents at the sampling end of the probe. The potential for analytical implementation of these newly discovered probe operational modes is discussed.

  2. Combined imaging and chemical sensing using a single optical imaging fiber.

    PubMed

    Bronk, K S; Michael, K L; Pantano, P; Walt, D R

    1995-09-01

    Despite many innovations and developments in the field of fiber-optic chemical sensors, optical fibers have not been employed to both view a sample and concurrently detect an analyte of interest. While chemical sensors employing a single optical fiber or a noncoherent fiberoptic bundle have been applied to a wide variety of analytical determinations, they cannot be used for imaging. Similarly, coherent imaging fibers have been employed only for their originally intended purpose, image transmission. We herein report a new technique for viewing a sample and measuring surface chemical concentrations that employs a coherent imaging fiber. The method is based on the deposition of a thin, analyte-sensitive polymer layer on the distal surface of a 350-microns-diameter imaging fiber. We present results from a pH sensor array and an acetylcholine biosensor array, each of which contains approximately 6000 optical sensors. The acetylcholine biosensor has a detection limit of 35 microM and a fast (< 1 s) response time. In association with an epifluorescence microscope and a charge-coupled device, these modified imaging fibers can display visual information of a remote sample with 4-microns spatial resolution, allowing for alternating acquisition of both chemical analysis and visual histology.

  3. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies

    PubMed Central

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-01-01

    Background The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. Objective To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results Application of the visual analytic and visualisation platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. Conclusions Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006

  4. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 4 : web-based bridge information database--visualization analytics and distributed sensing.

    DOT National Transportation Integrated Search

    2012-03-01

    This report introduces the design and implementation of a Web-based bridge information visual analytics system. This project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result combines a GIS ...

  5. Instrumental images: the visual rhetoric of self-presentation in Hevelius's Machina Coelestis.

    PubMed

    Vertesi, Janet

    2010-06-01

    This article places the famous images of Johannes Hevelius's instruments in his Machina Coelestis (1673) in the context of Hevelius's contested cometary observations and his debate with Hooke over telescopic sights. Seen thus, the images promote a crafted vision of Hevelius's astronomical practice and skills, constituting a careful self-presentation to his distant professional network and a claim as to which instrumental techniques guarantee accurate observations. Reviewing the reception of the images, the article explores how visual rhetoric may be invoked and challenged in the context of controversy, and suggests renewed analytical attention to the role of laboratory imagery in instrumental cultures in the history of science.

  6. Visual analytics for semantic queries of TerraSAR-X image content

    NASA Astrophysics Data System (ADS)

    Espinoza-Molina, Daniela; Alonso, Kevin; Datcu, Mihai

    2015-10-01

    With the continuous image product acquisition of satellite missions, the size of the image archives is considerably increasing every day, as well as the variety and complexity of their content, surpassing the end-user capacity to analyse and exploit them. Advances in the image retrieval field have contributed to the development of tools for interactive exploration and extraction of images from huge archives using different parameters like metadata, keywords, and basic image descriptors. Even though we count on more powerful tools for automated image retrieval and data analysis, we still face the problem of understanding and analyzing the results. Thus, a systematic computational analysis of these results is required in order to provide the end-user with a summary of the archive content in comprehensible terms. In this context, visual analytics combines automated analysis with interactive visualization techniques for effective understanding, reasoning, and decision making on the basis of very large and complex datasets. Moreover, several current research efforts focus on associating the content of the images with semantic definitions for describing the data in a format easily understood by the end-user. In this paper, we present our approach for computing visual analytics and semantically querying the TerraSAR-X archive. Our approach is mainly composed of four steps: 1) the generation of a data model, formed by primitive descriptors and metadata entries, that explains the information contained in a TerraSAR-X product; 2) the storage of this model in a database system; 3) the semantic definition of the image content based on machine learning algorithms and relevance feedback; and 4) querying the image archive using semantic descriptors as query parameters and computing the statistical analysis of the query results. The experimental results show that, with the help of visual analytics and semantic definitions, we are able to explain the image content using semantic terms and the relations between them, answering questions such as "What is the percentage of urban area in a region?" or "What is the distribution of water bodies in a city?"
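    A small illustration of the kind of semantic query the abstract describes, posed here against a toy table of labelled image patches with pandas (the real system queries a database of TerraSAR-X descriptors and semantic annotations; the table and column names are assumptions):

      import pandas as pd

      # Toy patch catalogue: each row is an image patch with a semantic label and an area.
      patches = pd.DataFrame({
          "region": ["A", "A", "A", "B", "B", "B"],
          "label":  ["urban", "water", "urban", "forest", "water", "urban"],
          "area_km2": [4.0, 1.0, 5.0, 6.0, 2.0, 2.0],
      })

      # "What is the percentage of urban area in a region?"
      total = patches.groupby("region")["area_km2"].sum()
      urban = patches[patches["label"] == "urban"].groupby("region")["area_km2"].sum()
      print((100 * urban / total).round(1))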

  7. Visualisation and Analytic Strategies for Anticipating the Folding of Nets

    ERIC Educational Resources Information Center

    Wright, Vince

    2016-01-01

    Visual and analytic strategies are features of students' schemes for spatial tasks. The strategies used by six students to anticipate the folding of nets were investigated. Evidence suggested that visual and analytic strategies were strongly connected in competent performance.

  8. Dynamic Visualization of Co-expression in Systems Genetics Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    New, Joshua Ryan; Huang, Jian; Chesler, Elissa J

    2008-01-01

    Biologists hope to address grand scientific challenges by exploring the abundance of data made available through modern microarray technology and other high-throughput techniques. The impact of this data, however, is limited unless researchers can effectively assimilate such complex information and integrate it into their daily research; interactive visualization tools are called for to support the effort. Specifically, typical studies of gene co-expression require novel visualization tools that enable the dynamic formulation and fine-tuning of hypotheses to aid the process of evaluating sensitivity of key parameters. These tools should allow biologists to develop an intuitive understanding of the structure of biological networks and discover genes which reside in critical positions in networks and pathways. By using a graph as a universal data representation of correlation in gene expression data, our novel visualization tool employs several techniques that, when used in an integrated manner, provide innovative analytical capabilities. Our tool for interacting with gene co-expression data integrates techniques such as: graph layout, qualitative subgraph extraction through a novel 2D user interface, quantitative subgraph extraction using graph-theoretic algorithms or by querying an optimized b-tree, dynamic level-of-detail graph abstraction, and template-based fuzzy classification using neural networks. We demonstrate our system using a real-world workflow from a large-scale, systems genetics study of mammalian gene co-expression.
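    A hedged sketch of the universal graph representation mentioned above: compute pairwise expression correlations, keep edges above a threshold, and extract the subgraph around a gene of interest (networkx on synthetic data; the paper's tool adds interactive extraction, b-tree queries, and level-of-detail features not shown here):

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(3)
      expression = rng.normal(size=(30, 12))          # 30 genes x 12 conditions (toy data)
      corr = np.corrcoef(expression)

      G = nx.Graph()
      genes = [f"gene{i}" for i in range(expression.shape[0])]
      for i in range(len(genes)):
          for j in range(i + 1, len(genes)):
              if abs(corr[i, j]) >= 0.7:               # qualitative threshold on |correlation|
                  G.add_edge(genes[i], genes[j], weight=corr[i, j])

      # Quantitative subgraph extraction: the neighborhood of one gene of interest
      if "gene0" in G:
          print(sorted(G.neighbors("gene0")))
      print(G.number_of_nodes(), "genes,", G.number_of_edges(), "edges above threshold")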

  9. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Analysis, simulation and visualization of 1D tapping via reduced dynamical models

    NASA Astrophysics Data System (ADS)

    Blackmore, Denis; Rosato, Anthony; Tricoche, Xavier; Urban, Kevin; Zou, Luo

    2014-04-01

    A low-dimensional center-of-mass dynamical model is devised as a simplified means of approximately predicting some important aspects of the motion of a vertical column comprised of a large number of particles subjected to gravity and periodic vertical tapping. This model is investigated first as a continuous dynamical system using analytical, simulation and visualization techniques. Then, by employing an approach analogous to that used to approximate the dynamics of a bouncing ball on an oscillating flat plate, it is modeled as a discrete dynamical system and analyzed to determine bifurcations and transitions to chaotic motion along with other properties. The predictions of the analysis are then compared, primarily qualitatively, with visualization and simulation results of the reduced continuous model, and ultimately with simulations of the complete system dynamics.
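    For the bouncing-ball analogy mentioned above, a minimal iteration of one common simplified (high-bounce) ball-on-vibrating-plate map, the kind of reduced discrete model whose bifurcations one would scan; the parameter names and map form are the usual textbook ones, not those of the paper:

      import math

      def bouncing_ball_map(theta, v, restitution=0.5, forcing=3.0, n_steps=10):
          """Iterate a simplified (high-bounce) ball-on-vibrating-plate map, treated as a formal map.

          theta : impact phase of the plate oscillation
          v     : nondimensional impact velocity
          """
          trajectory = []
          for _ in range(n_steps):
              theta = (theta + v) % (2 * math.pi)
              v = restitution * v + forcing * math.cos(theta)
              trajectory.append((theta, v))
          return trajectory

      for phase, vel in bouncing_ball_map(0.1, 2.0):
          print(f"phase={phase:5.2f}  velocity={vel:6.2f}")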

  11. Analytical evaluation of two motion washout techniques

    NASA Technical Reports Server (NTRS)

    Young, L. R.

    1977-01-01

    Practical tools were developed which extend the state of the art of moving base flight simulation for research and training purposes. The use of visual and vestibular cues to minimize the actual motion of the simulator itself was a primary consideration. The investigation consisted of optimum programming of motion cues based on a physiological model of the vestibular system to yield 'ideal washout logic' for any given simulator constraints.

  12. Guided Text Search Using Adaptive Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Symons, Christopher T; Senter, James K

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques that facilitate individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analyst's information overload that can be directly attributed to the deluge of information that must be considered in the search and investigative analysis of textual information.
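    A hedged sketch of interaction-driven re-ranking of the sort described: records the analyst touched become positive labels, records skipped become negatives, a simple model scores the rest, and unlabeled records are re-ordered (scikit-learn logistic regression is used here as a stand-in for Gryffin's semi-supervised learner; features and labels are synthetic):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      features = rng.normal(size=(200, 10))        # toy term-frequency features per record
      labels = np.full(200, -1)                    # -1 means "not yet labeled by interaction"
      labels[:8] = 1                               # records the analyst opened or flagged
      labels[8:16] = 0                             # records the analyst skipped

      labeled = labels >= 0
      model = LogisticRegression(max_iter=1000).fit(features[labeled], labels[labeled])

      unlabeled_idx = np.flatnonzero(~labeled)
      relevance = model.predict_proba(features[unlabeled_idx])[:, 1]
      reranked = unlabeled_idx[np.argsort(-relevance)]
      print("top candidates to surface next:", reranked[:5])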

  13. Using business intelligence for efficient inter-facility patient transfer.

    PubMed

    Haque, Waqar; Derksen, Beth Ann; Calado, Devin; Foster, Lee

    2015-01-01

    In the context of inter-facility patient transfer, a transfer operator must be able to objectively identify a destination which meets the needs of a patient, while keeping in mind each facility's limitations. We propose a solution which uses Business Intelligence (BI) techniques to analyze data related to healthcare infrastructure and services, and provides a web based system to identify optimal destination(s). The proposed inter-facility transfer system uses a single data warehouse with an Online Analytical Processing (OLAP) cube built on top that supplies analytical data to multiple reports embedded in web pages. The data visualization tool includes map based navigation of the health authority as well as an interactive filtering mechanism which finds facilities meeting the selected criteria. The data visualization is backed by an intuitive data entry web form which safely constrains the data, ensuring consistency and a single version of truth. The overall time required to identify the destination for inter-facility transfers is reduced from hours to a few minutes with this interactive solution.

  14. Visual analytics of large multidimensional data using variable binned scatter plots

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Sharma, Ratnesh K.; Keim, Daniel A.; Janetzko, Halldór

    2010-01-01

    The scatter plot is a well-known method of visualizing pairs of two-dimensional continuous variables. Multidimensional data can be depicted in a scatter plot matrix. Scatter plots are intuitive and easy to use, but often have a high degree of overlap which may occlude a significant portion of the data. In this paper, we propose variable binned scatter plots to allow the visualization of large amounts of data without overlapping. The basic idea is to use a non-uniform (variable) binning of the x and y dimensions and to plot all the data points that fall within each bin into corresponding squares. Further, we map a third attribute to color for visualizing clusters. Analysts are able to interact with individual data points for record-level information. We have applied these techniques to solve real-world problems in credit card fraud and data center energy consumption, visualizing their data distribution and cause-effect relations among multiple attributes. A comparison of our methods with two recent well-known variants of the scatter plot is included.
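    A minimal sketch of the variable (non-uniform) binning step: quantile-based bin edges in x and y so that dense regions get narrow bins, with each point assigned to its bin; the counts per bin would drive square size and color (the paper's actual plotting and color mapping are not reproduced here):

      import numpy as np

      def variable_bins(values, n_bins=8):
          """Quantile-based (non-uniform) bin edges: narrow where data are dense."""
          return np.quantile(values, np.linspace(0.0, 1.0, n_bins + 1))

      rng = np.random.default_rng(5)
      x = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)   # skewed, overlap-prone data
      y = rng.normal(size=10_000)

      x_edges = variable_bins(x)
      y_edges = variable_bins(y)
      # Assign each point to a (row, col) bin; per-bin counts drive the square rendering.
      x_bin = np.clip(np.searchsorted(x_edges, x, side="right") - 1, 0, len(x_edges) - 2)
      y_bin = np.clip(np.searchsorted(y_edges, y, side="right") - 1, 0, len(y_edges) - 2)
      counts = np.zeros((len(y_edges) - 1, len(x_edges) - 1), dtype=int)
      np.add.at(counts, (y_bin, x_bin), 1)
      print(counts)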

  15. Measurement and Visualization of Mass Transport for the Flowing Atmospheric Pressure Afterglow (FAPA) Ambient Mass-Spectrometry Source

    PubMed Central

    Pfeuffer, Kevin P.; Ray, Steven J.; Hieftje, Gary M.

    2014-01-01

    Ambient desorption/ionization mass spectrometry (ADI-MS) has developed into an important analytical field over the last nine years. The ability to analyze samples under ambient conditions while retaining the sensitivity and specificity of mass spectrometry has led to numerous applications and a corresponding jump in the popularity of this field. Despite the great potential of ADI-MS, problems remain in the areas of ion identification and quantification. Difficulties with ion identification can be solved through modified instrumentation, including accurate-mass or MS/MS capabilities for analyte identification. More difficult problems include quantification due to the ambient nature of the sampling process. To characterize and improve sample volatilization, ionization, and introduction into the mass-spectrometer interface, a method of visualizing mass transport into the mass spectrometer is needed. Schlieren imaging is a well-established technique that renders small changes in refractive index visible. Here, schlieren imaging was used to visualize helium flow from a plasma-based ADI-MS source into a mass spectrometer while ion signals were recorded. Optimal sample positions for melting-point capillary and transmission-mode (stainless steel mesh) introduction were found to be near (within 1 mm of) the mass spectrometer inlet. Additionally, the orientation of the sampled surface plays a significant role. More efficient mass transport resulted for analyte deposits directly facing the MS inlet. Different surfaces (glass slide and rough surface) were also examined; for both it was found that the optimal position is immediately beneath the MS inlet. PMID:24658804

  16. Measurement and visualization of mass transport for the flowing atmospheric pressure afterglow (FAPA) ambient mass-spectrometry source.

    PubMed

    Pfeuffer, Kevin P; Ray, Steven J; Hieftje, Gary M

    2014-05-01

    Ambient desorption/ionization mass spectrometry (ADI-MS) has developed into an important analytical field over the last 9 years. The ability to analyze samples under ambient conditions while retaining the sensitivity and specificity of mass spectrometry has led to numerous applications and a corresponding jump in the popularity of this field. Despite the great potential of ADI-MS, problems remain in the areas of ion identification and quantification. Difficulties with ion identification can be solved through modified instrumentation, including accurate-mass or MS/MS capabilities for analyte identification. More difficult problems include quantification because of the ambient nature of the sampling process. To characterize and improve sample volatilization, ionization, and introduction into the mass spectrometer interface, a method of visualizing mass transport into the mass spectrometer is needed. Schlieren imaging is a well-established technique that renders small changes in refractive index visible. Here, schlieren imaging was used to visualize helium flow from a plasma-based ADI-MS source into a mass spectrometer while ion signals were recorded. Optimal sample positions for melting-point capillary and transmission-mode (stainless steel mesh) introduction were found to be near (within 1 mm of) the mass spectrometer inlet. Additionally, the orientation of the sampled surface plays a significant role. More efficient mass transport resulted for analyte deposits directly facing the MS inlet. Different surfaces (glass slide and rough surface) were also examined; for both it was found that the optimal position is immediately beneath the MS inlet.

  17. Measurement and Visualization of Mass Transport for the Flowing Atmospheric Pressure Afterglow (FAPA) Ambient Mass-Spectrometry Source

    NASA Astrophysics Data System (ADS)

    Pfeuffer, Kevin P.; Ray, Steven J.; Hieftje, Gary M.

    2014-05-01

    Ambient desorption/ionization mass spectrometry (ADI-MS) has developed into an important analytical field over the last 9 years. The ability to analyze samples under ambient conditions while retaining the sensitivity and specificity of mass spectrometry has led to numerous applications and a corresponding jump in the popularity of this field. Despite the great potential of ADI-MS, problems remain in the areas of ion identification and quantification. Difficulties with ion identification can be solved through modified instrumentation, including accurate-mass or MS/MS capabilities for analyte identification. More difficult problems include quantification because of the ambient nature of the sampling process. To characterize and improve sample volatilization, ionization, and introduction into the mass spectrometer interface, a method of visualizing mass transport into the mass spectrometer is needed. Schlieren imaging is a well-established technique that renders small changes in refractive index visible. Here, schlieren imaging was used to visualize helium flow from a plasma-based ADI-MS source into a mass spectrometer while ion signals were recorded. Optimal sample positions for melting-point capillary and transmission-mode (stainless steel mesh) introduction were found to be near (within 1 mm of) the mass spectrometer inlet. Additionally, the orientation of the sampled surface plays a significant role. More efficient mass transport resulted for analyte deposits directly facing the MS inlet. Different surfaces (glass slide and rough surface) were also examined; for both it was found that the optimal position is immediately beneath the MS inlet.

  18. Mining patterns in persistent surveillance systems with smart query and visual analytics

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.; Shirkhodaie, Amir

    2013-05-01

    In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps analysts take pre-emptive steps to counter an adversary's actions. An interactive Visual Analytics (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. Identifying and offsetting these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data carry a degree of inherent uncertainty and imprecision, and must be filtered before being processed further. In this paper, we introduce an approach based on a Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of arrays of attributes extracted from semantically annotated messages generated by sensors. To facilitate similarity matching and detection of time-varying spatiotemporal patterns, a temporal Dynamic Time Warping (DTW) method combined with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is used to detect event patterns in neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. We also present a new visual analytic tool for testing and evaluating group activities detected under this scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovering and matching subsequences within the sequentially generated pattern space of our experiments.
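    As an illustration of the similarity-matching step described above, the following is a minimal sketch (not the authors' implementation) of a dynamic time warping distance between two sequences of activity feature vectors, using only NumPy; the feature arrays and their dimensions are hypothetical.

    ```python
    import numpy as np

    def dtw_distance(seq_a, seq_b):
        """Dynamic time warping distance between two sequences of feature vectors.

        seq_a: (n, d) array, seq_b: (m, d) array. Euclidean cost between frames.
        """
        n, m = len(seq_a), len(seq_b)
        # cost[i, j] = best cumulative cost of aligning seq_a[:i] with seq_b[:j]
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]

    # Hypothetical semantic-frame feature sequences (10 and 12 frames, 5 attributes each).
    rng = np.random.default_rng(0)
    pattern = rng.random((10, 5))
    observed = rng.random((12, 5))
    print(dtw_distance(pattern, observed))
    ```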

  19. High Performance Visualization using Query-Driven Visualizationand Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. These new capabilities, akin to finding needles in haystacks, result from combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.

  20. Chemical imaging of drug delivery systems with structured surfaces-a combined analytical approach of confocal raman microscopy and optical profilometry.

    PubMed

    Kann, Birthe; Windbergs, Maike

    2013-04-01

    Confocal Raman microscopy is an analytical technique with a steadily increasing impact in the field of pharmaceutics, as the instrumental setup allows for nondestructive visualization of component distribution within drug delivery systems. Here, the attention is mainly focused on classic solid carrier systems like tablets, pellets, or extrudates. Due to the opacity of these systems, Raman analysis is restricted either to exterior surfaces or cross sections. As Raman spectra are only recorded from one focal plane at a time, the sample is usually altered to create a smooth and even surface. However, this manipulation can lead to misinterpretation of the analytical results. Here, we present a trendsetting approach to overcome these analytical pitfalls with a combination of confocal Raman microscopy and optical profilometry. By acquiring a topography profile of the sample area of interest prior to Raman spectroscopy, the profile height information allows the focal plane to be leveled to the sample surface for each spectrum acquisition. We first demonstrated the basic principle of this complementary approach in a case study using a tilted silica wafer. In a second step, we successfully adapted the two techniques to investigate an extrudate and a lyophilisate as two exemplary solid drug carrier systems. Component distribution analysis with the novel analytical approach was hampered neither by the curvature of the cylindrical extrudate nor by the highly structured surface of the lyophilisate. Therefore, the combined analytical approach has great potential to be implemented in diversified fields of pharmaceutical sciences.

  1. Evaluating the decision accuracy and speed of clinical data visualizations.

    PubMed

    Pieczkiewicz, David S; Finkelstein, Stanley M

    2010-01-01

    Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available, free, and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.
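    For readers unfamiliar with linear mixed models in this MRMC setting, the sketch below shows one plausible way (not taken from the paper or its appendix) to model decision latency with a fixed effect for display type and a random intercept per reader, using statsmodels; the column names and toy values are hypothetical.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format MRMC data: one row per reader x case x display condition.
    df = pd.DataFrame({
        "reader":  ["r1", "r1", "r2", "r2", "r3", "r3", "r4", "r4"],
        "display": ["table", "graph"] * 4,
        "latency": [42.0, 31.5, 55.2, 40.1, 38.7, 29.9, 47.3, 35.6],  # seconds
    })

    # A random intercept for each reader accounts for between-reader variability,
    # while the fixed effect estimates the display-type difference in latency.
    model = smf.mixedlm("latency ~ display", df, groups=df["reader"])
    result = model.fit()
    print(result.summary())
    ```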

  2. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies.

    PubMed

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-04-01

    The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. This paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. The visual analytic and visualisation platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. Applications of data visualisation concepts and visual analytic platforms could play a key role in shaping the next generation of injury surveillance by improving data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand their use in injury prevention and control.

  3. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    EPA Science Inventory

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  4. A Visual Analytics Paradigm Enabling Trillion-Edge Graph Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Haglin, David J.; Gillen, David S.

    We present a visual analytics paradigm and a system prototype for exploring web-scale graphs. A web-scale graph is described as a graph with ~one trillion edges and ~50 billion vertices. While there is an aggressive R&D effort in processing and exploring web-scale graphs among internet vendors such as Facebook and Google, visualizing a graph of that scale still remains an underexplored R&D area. The paper describes a nontraditional peek-and-filter strategy that facilitates the exploration of a graph database of unprecedented size for visualization and analytics. We demonstrate that our system prototype can 1) preprocess a graph with ~25 billion edges in less than two hours and 2) support database query and visualization on the processed graph database afterward. Based on our computational performance results, we argue that we most likely will achieve the one trillion edge mark (a computational performance improvement of 40 times) for graph visual analytics in the near future.

  5. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education through the realization of variables capable of enhancing human perception and cognition of complex curriculum data. The positive results, derived from our small-scale evaluation of a medical curriculum, signify the need to expand this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a new promising direction in medical education informatics research.

  6. MeRy-B: a web knowledgebase for the storage, visualization, analysis and annotation of plant NMR metabolomic profiles

    PubMed Central

    2011-01-01

    Background Improvements in the techniques for metabolomics analyses and growing interest in metabolomic approaches are resulting in the generation of increasing numbers of metabolomic profiles. Platforms are required for profile management, as a function of experimental design, and for metabolite identification, to facilitate the mining of the corresponding data. Various databases have been created, including organism-specific knowledgebases and analytical technique-specific spectral databases. However, there is currently no platform meeting the requirements for both profile management and metabolite identification for nuclear magnetic resonance (NMR) experiments. Description MeRy-B, the first platform for plant 1H-NMR metabolomic profiles, is designed (i) to provide a knowledgebase of curated plant profiles and metabolites obtained by NMR, together with the corresponding experimental and analytical metadata, (ii) for queries and visualization of the data, (iii) to discriminate between profiles with spectrum visualization tools and statistical analysis, (iv) to facilitate compound identification. It contains lists of plant metabolites and unknown compounds, with information about experimental conditions, the factors studied and metabolite concentrations for several plant species, compiled from more than one thousand annotated NMR profiles for various organs or tissues. Conclusion MeRy-B manages all the data generated by NMR-based plant metabolomics experiments, from description of the biological source to identification of the metabolites and determinations of their concentrations. It is the first database allowing the display and overlay of NMR metabolomic profiles selected through queries on data or metadata. MeRy-B is available from http://www.cbib.u-bordeaux2.fr/MERYB/index.php. PMID:21668943

  7. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  8. Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.

    PubMed

    Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min

    2013-12-01

    Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical since videos may not be semantically annotated sufficiently, there may be a lack of suitable training data, and the search requirements of the user may frequently change for different tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing the need to watch videos, and visualizing aggregated information of the search results. We demonstrate the system for searching spatiotemporal attributes from sports video to identify key instances of team and player performance.

  9. Visualization by discharge illumination technique and modification by plasma actuator of rarefied Mach 2 airflow around a cylinder

    NASA Astrophysics Data System (ADS)

    Leger, L.; Sellam, M.; Barbosa, E.; Depussay, E.

    2013-06-01

    The use of plasma actuators for flow control has received considerable attention in recent years. This kind of device appears to be an appropriate means of improving flow-control capability thanks to fully electric operation, no moving parts and a fast response time. The experimental work presented here shows, firstly, the non-intrusive character of the visualization of the density field of an airflow around a cylinder obtained using a plasma luminescence technique. Experiments are made in a continuous supersonic wind tunnel. The static pressure in the flow is 8 Pa, the mean free path is about 0.3 mm and the airflow velocity is 510 m s⁻¹. Pressure measurements obtained by means of a glass Pitot tube without the visualization discharge are presented. Measured and simulated pressure profiles are in good agreement in the region near the cylinder. There is good correlation between numerical simulations of the supersonic flow field, analytical model predictions and experimental flow visualizations obtained by the plasma luminescence technique. Consequently, we show that the plasma luminescence technique is non-intrusive. Secondly, the effect of a dc discharge on a supersonic rarefied airflow around a cylinder is studied. An electrode is flush mounted on the cylinder. Stagnation pressure profiles are examined for different electrode positions on the cylinder. A shock wave modification depending on the electrode location is observed. The discharge placed at the upstream stagnation point induces an upstream shift of the bow shock, whereas a modification of the shock wave shape is observed when it is placed at 45° or 90°.

  10. Lateral flow devices

    DOEpatents

    Mazumdar, Debapriya; Liu, Juewen; Lu, Yi

    2010-09-21

    An analytical test for an analyte comprises (a) a base, having a reaction area and a visualization area, (b) a capture species, on the base in the visualization area, comprising nucleic acid, and (c) analysis chemistry reagents, on the base in the reaction area. The analysis chemistry reagents comprise (i) a substrate comprising nucleic acid and a first label, and (ii) a reactor comprising nucleic acid. The analysis chemistry reagents can react with a sample comprising the analyte and water, to produce a visualization species comprising nucleic acid and the first label, and the capture species can bind the visualization species.

  11. Visual Analytics for Heterogeneous Geoscience Data

    NASA Astrophysics Data System (ADS)

    Pan, Y.; Yu, L.; Zhu, F.; Rilee, M. L.; Kuo, K. S.; Jiang, H.; Yu, H.

    2017-12-01

    Geoscience data obtained from diverse sources have been routinely leveraged by scientists to study various phenomena. The principal data sources include observations and model simulation outputs. These data are characterized by spatiotemporal heterogeneity originating from different instrument design specifications and/or computational model requirements used in the data generation processes. Such inherent heterogeneity poses several challenges in exploring and analyzing geoscience data. First, scientists often wish to identify features or patterns co-located among multiple data sources to derive and validate certain hypotheses. Heterogeneous data make it a tedious task to search for such features in dissimilar datasets. Second, features of geoscience data are typically multivariate. It is challenging to tackle the high dimensionality of geoscience data and explore the relations among multiple variables in a scalable fashion. Third, there is a lack of transparency in traditional automated approaches, such as feature detection or clustering, in that scientists cannot intuitively interact with their analysis processes and interpret results. To address these issues, we present a new scalable approach that can assist scientists in analyzing voluminous and diverse geoscience data. We expose a high-level query interface that allows users to easily express customized queries to search for features of interest across multiple heterogeneous datasets. For identified features, we develop a visualization interface that enables interactive exploration and analytics in a linked-view manner. Specific visualization techniques, ranging from scatter plots to parallel coordinates, are employed in each view to allow users to explore various aspects of the features. Different views are linked and refreshed according to user interactions in any individual view. In this manner, a user can interactively and iteratively gain understanding of the data through a variety of visual analytics operations. We demonstrate with use cases how scientists can combine the query and visualization interfaces to enable a customized workflow facilitating studies using heterogeneous geoscience datasets.

  12. Big, Deep, and Smart Data in Scanning Probe Microscopy

    DOE PAGES

    Kalinin, Sergei V.; Strelcov, Evgheni; Belianinov, Alex; ...

    2016-09-27

    Scanning probe microscopy techniques open the door to nanoscience and nanotechnology by enabling imaging and manipulation of the structure and functionality of matter on nanometer and atomic scales. We analyze the discovery process in SPM in terms of information flow, from the tip-surface junction to knowledge adoption by the scientific community. Furthermore, we discuss the challenges and opportunities offered by merging SPM with advanced data mining, visual analytics, and knowledge discovery technologies.

  13. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

    The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center, resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image-based searching fused with text-based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  14. Reducing the Analytical Bottleneck for Domain Scientists: Lessons from a Climate Data Visualization Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Poco, Jorge; Bertini, Enrico

    2016-01-01

    The gap between the rate of large-scale data production and the rate of generation of data-driven scientific insights has led to an analytical bottleneck in scientific domains like climate and biology. This is primarily due to the lack of innovative analytical tools that can help scientists efficiently analyze and explore alternative hypotheses about the data, and communicate their findings effectively to a broad audience. In this paper, by reflecting on a set of successful collaborative research efforts between a group of climate scientists and visualization researchers, we examine how interactive visualization can help reduce the analytical bottleneck for domain scientists.

  15. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 4 : use of knowledge integrated visual analytics system in supporting bridge management.

    DOT National Transportation Integrated Search

    2009-12-01

    The goals of integration should be: supporting domain-oriented data analysis through the use of a knowledge-augmented visual analytics system. In this project, we focus on providing interactive data exploration for bridge management. ...

  16. Application and further development of diffusion based 2D chemical imaging techniques in the rhizosphere

    NASA Astrophysics Data System (ADS)

    Hoefer, Christoph; Santner, Jakob; Borisov, Sergey; Kreuzeder, Andreas; Wenzel, Walter; Puschenreiter, Markus

    2015-04-01

    Two-dimensional chemical imaging of root processes refers to novel in situ methods to investigate and map solutes at high spatial resolution (sub-mm). The visualization of these solutes reveals new insights into soil biogeochemistry and root processes. We derive chemical images using data from DGT-LA-ICP-MS (Diffusive Gradients in Thin Films and Laser Ablation Inductively Coupled Plasma Mass Spectrometry) and POS (Planar Optode Sensors). Both technologies have shown promising results when applied in aqueous environments but need to be refined and improved for imaging at the soil-plant interface. Our focus is on co-localized mapping using combined DGT and POS technologies and on the development of new gel combinations. DGTs are smart, thin (<0.4 mm) hydrogels containing a binding resin for the targeted analytes (e.g. trace metals, phosphate, sulphide or radionuclides). The measurement principle is passive and diffusion based: the analytes present diffuse into the gel and are bound by the resin, which acts as a zero sink. After application, DGTs are retrieved, dried, and analysed using LA-ICP-MS. The data are then normalized by an internal standard (e.g. 13C), calibrated using in-house standards, and chemical images of the target area are plotted using imaging software. POS are, similar to DGT, thin sensor foils carrying a fluorophore coating chosen for the target analyte. The measurement principle is based on excitation of the fluorophore at a specific wavelength and emission that depends on the presence of the analyte. The emitted signal is captured using optical filters and a DSLR camera. While DGT analysis is destructive, POS measurements can be performed continuously during the application. Both semi-quantitative techniques allow in situ application to visualize chemical processes directly at the soil-plant interface. Here, we present a summary of results from rhizotron experiments with different plants in metal-contaminated and agricultural soils.
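    The normalization and calibration step described here can be pictured as a small array pipeline: divide each analyte map by the co-registered internal-standard map, convert to concentration through a linear calibration, and render the result as an image. The sketch below is a generic illustration under those assumptions (hypothetical arrays and calibration constants, not the authors' processing chain).

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical co-registered LA-ICP-MS intensity maps (counts) for one rhizotron area.
    rng = np.random.default_rng(3)
    zn_counts = rng.gamma(5.0, 200.0, size=(80, 120))    # analyte channel (e.g. 66Zn)
    c13_counts = rng.gamma(50.0, 100.0, size=(80, 120))  # internal standard channel (13C)

    # Normalize the analyte signal by the internal standard to correct for ablation yield.
    normalized = zn_counts / c13_counts

    # Hypothetical linear calibration from in-house gel standards:
    # concentration (ug/g) = slope * normalized_signal + intercept
    slope, intercept = 850.0, 2.0
    conc = slope * normalized + intercept

    plt.imshow(conc, cmap="viridis", origin="lower")
    plt.colorbar(label="Zn (ug/g), hypothetical calibration")
    plt.title("Chemical image sketch: DGT-LA-ICP-MS")
    plt.savefig("chemical_image_sketch.png", dpi=150)
    ```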

  17. Automated indirect immunofluorescence evaluation of antinuclear autoantibodies on HEp-2 cells.

    PubMed

    Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried

    2012-01-01

    Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, the automated pattern recognition was compared to conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 out of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability.

  18. Automated Indirect Immunofluorescence Evaluation of Antinuclear Autoantibodies on HEp-2 Cells

    PubMed Central

    Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried

    2012-01-01

    Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although some automation solutions for IIF incubation are already on the market, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, the automated pattern recognition was compared to conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 out of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability. PMID:23251220
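    The agreement statistics reported in the two records above can be reproduced directly from the confusion counts implied by the abstract; a brief worked sketch follows.

    ```python
    # Confusion counts implied by the abstract: all 272 antibody-positive sera detected
    # (no misses), and 77 of 79 visually negative sera classified negative.
    tp, fn = 272, 0   # automated positive / negative among visually positive sera
    tn, fp = 77, 2    # automated negative / positive among visually negative sera
    total = tp + fn + tn + fp

    sensitivity = tp / (tp + fn)              # 1.000
    specificity = tn / (tn + fp)              # 0.975
    observed_agreement = (tp + tn) / total    # 349/351 = 0.994

    # Cohen's kappa: chance-corrected agreement between the two readings.
    p_auto_pos, p_auto_neg = (tp + fp) / total, (tn + fn) / total
    p_vis_pos, p_vis_neg = (tp + fn) / total, (tn + fp) / total
    expected_agreement = p_auto_pos * p_vis_pos + p_auto_neg * p_vis_neg
    kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)

    print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} kappa={kappa:.3f}")
    # -> sensitivity=1.000 specificity=0.975 kappa=0.984
    ```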

  19. Big, Deep, and Smart Data in Scanning Probe Microscopy.

    PubMed

    Kalinin, Sergei V; Strelcov, Evgheni; Belianinov, Alex; Somnath, Suhas; Vasudevan, Rama K; Lingerfelt, Eric J; Archibald, Richard K; Chen, Chaomei; Proksch, Roger; Laanait, Nouamane; Jesse, Stephen

    2016-09-27

    Scanning probe microscopy (SPM) techniques have opened the door to nanoscience and nanotechnology by enabling imaging and manipulation of the structure and functionality of matter at nanometer and atomic scales. Here, we analyze the scientific discovery process in SPM by following the information flow from the tip-surface junction, to knowledge adoption by the wider scientific community. We further discuss the challenges and opportunities offered by merging SPM with advanced data mining, visual analytics, and knowledge discovery technologies.

  20. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by establishing the overall parameters for the visual assessment task, then defines the primary requirements of the various sets of analytical tools to be used. Finally,...

  1. Comparative Characterization of Crofelemer Samples Using Data Mining and Machine Learning Approaches With Analytical Stability Data Sets.

    PubMed

    Nariya, Maulik K; Kim, Jae Hyun; Xiong, Jian; Kleindl, Peter A; Hewarathna, Asha; Fisher, Adam C; Joshi, Sangeeta B; Schöneich, Christian; Forrest, M Laird; Middaugh, C Russell; Volkin, David B; Deeds, Eric J

    2017-11-01

    There is growing interest in generating physicochemical and biological analytical data sets to compare complex mixture drugs, for example, products from different manufacturers. In this work, we compare various crofelemer samples prepared from a single lot by filtration with varying molecular weight cutoffs combined with incubation for different times at different temperatures. The 2 preceding articles describe experimental data sets generated from analytical characterization of fractionated and degraded crofelemer samples. In this work, we use data mining techniques such as principal component analysis and mutual information scores to help visualize the data and determine discriminatory regions within these large data sets. The mutual information score identifies chemical signatures that differentiate crofelemer samples. These signatures, in many cases, would likely be missed by traditional data analysis tools. We also found that supervised learning classifiers robustly discriminate samples with around 99% classification accuracy, indicating that mathematical models of these physicochemical data sets are capable of identifying even subtle differences in crofelemer samples. Data mining and machine learning techniques can thus identify fingerprint-type attributes of complex mixture drugs that may be used for comparative characterization of products. Copyright © 2017 American Pharmacists Association®. All rights reserved.
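    A minimal sketch of the kind of data-mining workflow described here, PCA for visualization plus per-feature mutual information to flag discriminatory regions, is shown below using scikit-learn; the feature matrix, labels, and shapes are hypothetical, not the crofelemer data sets.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import mutual_info_classif

    # Hypothetical analytical fingerprints: 60 samples x 500 spectral/chromatographic
    # features, with an integer label per sample encoding its preparation condition.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 500))
    y = np.repeat([0, 1, 2], 20)
    X[y == 2, 40:45] += 1.5   # inject a subtle condition-specific signature

    # Project onto the first two principal components for visual inspection.
    scores = PCA(n_components=2).fit_transform(X)

    # Mutual information between each feature and the condition label highlights
    # regions of the fingerprint that discriminate the samples.
    mi = mutual_info_classif(X, y, random_state=0)
    top = np.argsort(mi)[::-1][:10]
    print("top discriminatory feature indices:", top)
    ```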

  2. Visualizing Molecular Diffusion through Passive Permeability Barriers in Cells: Conventional and Novel Approaches

    PubMed Central

    Lin, Yu-Chun; Phua, Siew Cheng; Lin, Benjamin; Inoue, Takanari

    2013-01-01

    Diffusion barriers are universal solutions for cells to achieve distinct organizations, compositions, and activities within a limited space. The influence of diffusion barriers on the spatiotemporal dynamics of signaling molecules often determines cellular physiology and functions. Over the years, the passive permeability barriers in various subcellular locales have been characterized using elaborate analytical techniques. In this review, we will summarize the current state of knowledge on the various passive permeability barriers present in mammalian cells. We will conclude with a description of several conventional techniques and one new approach based on chemically-inducible diffusion trap (C-IDT) for probing permeable barriers. PMID:23731778

  3. Multiplexed Paper Analytical Device for Quantification of Metals using Distance-Based Detection

    PubMed Central

    Cate, David M.; Noblitt, Scott D.; Volckens, John; Henry, Charles S.

    2015-01-01

    Exposure to metal-containing aerosols has been linked with adverse health outcomes for almost every organ in the human body. Commercially available techniques for quantifying particulate metals are time-intensive, laborious, and expensive; sample analysis often exceeds $100. We report a simple technique, based upon a distance-based detection motif, for quantifying metal concentrations of Ni, Cu, and Fe in airborne particulate matter using microfluidic paper-based analytical devices. Paper substrates are used to create sensors that are self-contained, self-timing, and require only a drop of sample for operation. Unlike other colorimetric approaches in paper microfluidics that rely on optical instrumentation for analysis, with distance-based detection the analyte is quantified visually based on the distance of a colorimetric reaction, similar to reading temperature on a thermometer. To demonstrate the effectiveness of this approach, Ni, Cu, and Fe were measured individually in single-channel devices; detection limits as low as 0.1, 0.1, and 0.05 µg were reported for Ni, Cu, and Fe, respectively. Multiplexed analysis of all three metals was achieved with detection limits of 1, 5, and 1 µg for Ni, Cu, and Fe. We also extended the dynamic range for multi-analyte detection by printing concentration gradients of colorimetric reagents using an off-the-shelf inkjet printer. Analyte selectivity was demonstrated for common interferences. To demonstrate the utility of the method, Ni, Cu, and Fe were measured in samples of certified welding fume; levels measured with the paper sensors matched known values determined gravimetrically. PMID:26009988
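    Because the read-out in distance-based detection is simply a developed color length, quantification typically reduces to a calibration curve of distance versus analyte mass; the sketch below illustrates that idea with hypothetical values (not the paper's calibration data) using NumPy.

    ```python
    import numpy as np

    # Hypothetical calibration: color-development distance (mm) for known Ni masses (ug).
    mass_ug = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    distance_mm = np.array([3.1, 6.0, 11.8, 23.5, 46.2])

    # Fit a linear calibration (distance = slope * mass + intercept).
    slope, intercept = np.polyfit(mass_ug, distance_mm, 1)

    def mass_from_distance(d_mm):
        """Invert the calibration to read analyte mass from a measured distance."""
        return (d_mm - intercept) / slope

    print(f"slope={slope:.2f} mm/ug, intercept={intercept:.2f} mm")
    print(f"18 mm of color development -> {mass_from_distance(18.0):.2f} ug Ni")
    ```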

  4. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    ERIC Educational Resources Information Center

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors…

  5. Literature and Product Review of Visual Analytics for Maritime Awareness

    DTIC Science & Technology

    2009-10-28

    ... the user’s knowledge and experience. • Riveiro et al [107] provide a useful discussion of the cognitive process of anomaly detection based on ... changes over time can be seen visually. • Wilkinson et al [140] suggest that we need visual analytics for three principal purposes: checking raw data ... Predictions within the Current Plot. • Yue et al [146] describe an AI blackboard-based agent that leverages interactive visualization and mixed ...

  6. Podium: Ranking Data Using Mixed-Initiative Visual Analytics.

    PubMed

    Wall, Emily; Das, Subhajit; Chawla, Ravish; Kalidindi, Bharath; Brown, Eli T; Endert, Alex

    2018-01-01

    People often rank and order data points as a vital part of making decisions. Multi-attribute ranking systems are a common tool used to make these data-driven decisions. Such systems often take the form of a table-based visualization in which users assign weights to the attributes representing the quantifiable importance of each attribute to a decision, which the system then uses to compute a ranking of the data. However, these systems assume that users are able to quantify their conceptual understanding of how important particular attributes are to a decision. This is not always easy or even possible for users to do. Rather, people often have a more holistic understanding of the data. They form opinions that data point A is better than data point B but do not necessarily know which attributes are important. To address these challenges, we present a visual analytic application to help people rank multi-variate data points. We developed a prototype system, Podium, that allows users to drag rows in the table to rank order data points based on their perception of the relative value of the data. Podium then infers a weighting model using Ranking SVM that satisfies the user's data preferences as closely as possible. Whereas past systems help users understand the relationships between data points based on changes to attribute weights, our approach helps users to understand the attributes that might inform their understanding of the data. We present two usage scenarios to describe some of the potential uses of our proposed technique: (1) understanding which attributes contribute to a user's subjective preferences for data, and (2) deconstructing attributes of importance for existing rankings. Our proposed approach makes powerful machine learning techniques more usable to those who may not have expertise in these areas.
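    One common way to realize the "infer weights from a user-supplied ordering" step is the pairwise reduction used in Ranking SVM: turn every ordered pair into a difference vector and fit a linear classifier whose coefficients become the attribute weights. The sketch below is a generic version of that reduction (not Podium's code) using scikit-learn; the data and the ranking are hypothetical.

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC

    # Hypothetical table: rows are data points, columns are attributes.
    X = np.array([
        [0.9, 0.2, 0.4],
        [0.6, 0.8, 0.1],
        [0.3, 0.5, 0.9],
        [0.1, 0.9, 0.7],
    ])
    # User's drag-to-rank order: index 0 is ranked best, index 3 worst.
    ranking = [0, 1, 2, 3]

    # Pairwise reduction: for every (better, worse) pair, add the difference vector
    # with label +1 and its negation with label -1.
    pairs, labels = [], []
    for i, better in enumerate(ranking):
        for worse in ranking[i + 1:]:
            pairs.append(X[better] - X[worse]); labels.append(+1)
            pairs.append(X[worse] - X[better]); labels.append(-1)

    clf = LinearSVC(fit_intercept=False, C=1.0).fit(np.array(pairs), np.array(labels))
    weights = clf.coef_.ravel()
    print("inferred attribute weights:", weights / np.abs(weights).sum())
    ```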

  7. Visualization and Time-Series Analysis of Ground-Water Data for C-Area, Savannah River Site, South Carolina, 1984-2004

    USGS Publications Warehouse

    Conrads, Paul; Roehl, Edwin A.; Daamen, Ruby C.; Chapelle, Francis H.; Lowery, Mark A.; Mundry, Uwe H.

    2007-01-01

    In 2004, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, initiated a study of historical ground-water data of C-Area on the Savannah River Site in South Carolina. The soils and ground water at C-Area are contaminated with high concentrations of trichloroethylene and lesser amounts of tetrachloroethylene. The objectives of the investigation were (1) to analyze the historical data to determine if data-mining techniques could be applied to the historical database to ascertain whether natural attenuation of recalcitrant contaminants, such as volatile organic compounds, is occurring and (2) to determine whether inferential (surrogate) analytes could be used for more cost-effective monitoring. Twenty-one years of data (1984-2004) were collected from 396 wells in the study area and converted from record data to time-series data for analysis. A Ground-Water Data Viewer was developed to allow users to spatially and temporally visualize the analyte data. Overall, because the data were temporally and spatially sparse, data analysis was limited to only qualitative descriptions.

  8. An Empirical Study of Chronic Diseases in the United States: A Visual Analytics Approach to Public Health

    PubMed Central

    Raghupathi, Wullianallur; Raghupathi, Viju

    2018-01-01

    In this research we explore the current state of chronic diseases in the United States, using data from the Centers for Disease Control and Prevention and applying visualization and descriptive analytics techniques. Five main categories of variables are studied, namely chronic disease conditions, behavioral health, mental health, demographics, and overarching conditions. These are analyzed in the context of regions and states within the U.S. to discover possible correlations between variables in several categories. There are widespread variations in the prevalence of diverse chronic diseases, the number of hospitalizations for specific diseases, and the diagnosis and mortality rates for different states. Identifying such correlations is fundamental to developing insights that will help in the creation of targeted management, mitigation, and preventive policies, ultimately minimizing the risks and costs of chronic diseases. As the population ages and individuals suffer from multiple conditions, or comorbidity, it is imperative that the various stakeholders, including the government, non-governmental organizations (NGOs), policy makers, health providers, and society as a whole, address these adverse effects in a timely and efficient manner. PMID:29494555

  9. Physiological and Anatomical Visual Analytics (PAVA) Background

    EPA Pesticide Factsheets

    The need to efficiently analyze human chemical disposition data from in vivo studies or in silico PBPK modeling efforts, and to see complex disposition data in a logical manner, has created a unique opportunity for visual analytics applied to PAD.

  10. BiNA: A Visual Analytics Tool for Biological Network Data

    PubMed Central

    Gerasch, Andreas; Faber, Daniel; Küntzer, Jan; Niermann, Peter; Kohlbacher, Oliver; Lenhof, Hans-Peter; Kaufmann, Michael

    2014-01-01

    Interactive visual analysis of biological high-throughput data in the context of the underlying networks is an essential task in modern biomedicine with applications ranging from metabolic engineering to personalized medicine. The complexity and heterogeneity of data sets require flexible software architectures for data analysis. Concise and easily readable graphical representation of data and interactive navigation of large data sets are essential in this context. We present BiNA - the Biological Network Analyzer - a flexible open-source software for analyzing and visualizing biological networks. Highly configurable visualization styles for regulatory and metabolic network data offer sophisticated drawings and intuitive navigation and exploration techniques using hierarchical graph concepts. The generic projection and analysis framework provides powerful functionalities for visual analyses of high-throughput omics data in the context of networks, in particular for the differential analysis and the analysis of time series data. A direct interface to an underlying data warehouse provides fast access to a wide range of semantically integrated biological network databases. A plugin system allows simple customization and integration of new analysis algorithms or visual representations. BiNA is available under the 3-clause BSD license at http://bina.unipax.info/. PMID:24551056

  11. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background: We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods: The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results: Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions: IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  12. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  13. Just-in-time Time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected, but most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
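    A toy sketch of the co-scheduling idea, polling the simulation's output directory and launching a plotting job for each newly completed file, is given below; the directory layout, file pattern, and plotting command are hypothetical, and the real framework submits jobs through the batch system rather than a local subprocess.

    ```python
    import glob
    import subprocess
    import time

    OUTPUT_GLOB = "/lustre/run42/history/*.nc"   # hypothetical simulation output files
    POLL_SECONDS = 300

    processed = set()
    while True:
        for path in sorted(glob.glob(OUTPUT_GLOB)):
            if path in processed:
                continue
            # Launch a concurrent analytics/plotting job for the new output file,
            # leaving the running simulation untouched.
            subprocess.Popen(["python", "make_plots.py", path])
            processed.add(path)
        time.sleep(POLL_SECONDS)
    ```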

  14. SeeDB: Efficient Data-Driven Visualization Recommendations to Support Visual Analytics

    PubMed Central

    Vartak, Manasi; Rahman, Sajjadur; Madden, Samuel; Parameswaran, Aditya; Polyzotis, Neoklis

    2015-01-01

    Data analysts often build visualizations as the first step in their analytical workflow. However, when working with high-dimensional datasets, identifying visualizations that show relevant or desired trends in data can be laborious. We propose SeeDB, a visualization recommendation engine to facilitate fast visual analysis: given a subset of data to be studied, SeeDB intelligently explores the space of visualizations, evaluates promising visualizations for trends, and recommends those it deems most “useful” or “interesting”. The two major obstacles in recommending interesting visualizations are (a) scale: evaluating a large number of candidate visualizations while responding within interactive time scales, and (b) utility: identifying an appropriate metric for assessing interestingness of visualizations. For the former, SeeDB introduces pruning optimizations to quickly identify high-utility visualizations and sharing optimizations to maximize sharing of computation across visualizations. For the latter, as a first step, we adopt a deviation-based metric for visualization utility, while indicating how we may be able to generalize it to other factors influencing utility. We implement SeeDB as a middleware layer that can run on top of any DBMS. Our experiments show that our framework can identify interesting visualizations with high accuracy. Our optimizations lead to multiple orders of magnitude speedup on relational row and column stores and provide recommendations at interactive time scales. Finally, we demonstrate via a user study the effectiveness of our deviation-based utility metric and the value of recommendations in supporting visual analytics. PMID:26779379

  15. SeeDB: Efficient Data-Driven Visualization Recommendations to Support Visual Analytics.

    PubMed

    Vartak, Manasi; Rahman, Sajjadur; Madden, Samuel; Parameswaran, Aditya; Polyzotis, Neoklis

    2015-09-01

    Data analysts often build visualizations as the first step in their analytical workflow. However, when working with high-dimensional datasets, identifying visualizations that show relevant or desired trends in data can be laborious. We propose SeeDB, a visualization recommendation engine to facilitate fast visual analysis: given a subset of data to be studied, SeeDB intelligently explores the space of visualizations, evaluates promising visualizations for trends, and recommends those it deems most "useful" or "interesting". The two major obstacles in recommending interesting visualizations are (a) scale: evaluating a large number of candidate visualizations while responding within interactive time scales, and (b) utility: identifying an appropriate metric for assessing interestingness of visualizations. For the former, SeeDB introduces pruning optimizations to quickly identify high-utility visualizations and sharing optimizations to maximize sharing of computation across visualizations. For the latter, as a first step, we adopt a deviation-based metric for visualization utility, while indicating how we may be able to generalize it to other factors influencing utility. We implement SeeDB as a middleware layer that can run on top of any DBMS. Our experiments show that our framework can identify interesting visualizations with high accuracy. Our optimizations lead to multiple orders of magnitude speedup on relational row and column stores and provide recommendations at interactive time scales. Finally, we demonstrate via a user study the effectiveness of our deviation-based utility metric and the value of recommendations in supporting visual analytics.
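    The deviation-based utility metric can be pictured as follows: for a candidate visualization (a group-by attribute plus an aggregate), compare the normalized aggregate distribution on the queried subset against the same distribution on the full table, and score the visualization by how far the two differ. The sketch below is a simplified reading of that idea (not the SeeDB implementation) using pandas; the column names and data are hypothetical.

    ```python
    import numpy as np
    import pandas as pd

    def deviation_utility(df, subset_mask, group_col, measure_col):
        """Score a candidate bar chart: distance between the subset's and the
        reference (full-table) normalized aggregate distributions."""
        ref = df.groupby(group_col)[measure_col].mean()
        tgt = df[subset_mask].groupby(group_col)[measure_col].mean()
        ref, tgt = ref.align(tgt, fill_value=0.0)
        ref_p = ref / ref.sum()
        tgt_p = tgt / tgt.sum()
        return np.linalg.norm(tgt_p - ref_p)          # Euclidean deviation

    # Hypothetical table of flight records.
    df = pd.DataFrame({
        "carrier": ["AA", "AA", "UA", "UA", "DL", "DL"] * 20,
        "delay":   np.random.default_rng(2).gamma(2.0, 10.0, 120),
        "weekend": [False, True] * 60,
    })
    score = deviation_utility(df, df["weekend"], "carrier", "delay")
    print(f"utility of 'mean delay by carrier' for weekend flights: {score:.3f}")
    ```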

  16. Effects of Using Dynamic Mathematics Software on Preservice Mathematics Teachers' Spatial Visualization Skills: The Case of Spatial Analytic Geometry

    ERIC Educational Resources Information Center

    Kösa, Temel

    2016-01-01

    The purpose of this study was to investigate the effects of using dynamic geometry software on preservice mathematics teachers' spatial visualization skills and to determine whether spatial visualization skills can be a predictor of success in learning analytic geometry of space. The study used a quasi-experimental design with a control group.…

  17. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  18. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Visual Analytic Judgments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    2017-05-08

    Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: i) relationships among scientists’ familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

  19. The 2D analytic signal for envelope detection and feature extraction on ultrasound images.

    PubMed

    Wachinger, Christian; Klein, Tassilo; Navab, Nassir

    2012-08-01

    The fundamental property of the analytic signal is the split of identity, meaning the separation of qualitative and quantitative information in the form of the local phase and the local amplitude, respectively. The structural representation given by the local phase, which is independent of brightness and contrast, is especially interesting for numerous image processing tasks. Recently, an extension of the analytic signal from 1D to 2D, which also covers intrinsically 2D structures, was proposed. We show the advantages of this improved concept on ultrasound RF and B-mode images. Specifically, we use the 2D analytic signal for the envelope detection of RF data, which improves the extraction of the information-bearing signal from the modulated carrier wave. We illustrate this, first, by visual assessment of the images and, second, by performing goodness-of-fit tests against a Nakagami distribution, indicating a clear improvement of the statistical properties. The evaluation is performed for multiple window sizes and parameter estimation techniques. Finally, we show that the 2D analytic signal allows for improved estimation of local features on B-mode images. Copyright © 2012 Elsevier B.V. All rights reserved.
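
    For readers unfamiliar with analytic-signal envelope detection, the sketch below shows the conventional 1D case on a synthetic RF line using the Hilbert transform; the paper's contribution is the 2D extension evaluated on real ultrasound data, which this toy does not implement, and all signal parameters are illustrative.

    ```python
    # Envelope detection with the conventional 1D analytic signal (Hilbert transform).
    # Classical 1D case only; the paper evaluates a 2D extension on ultrasound data.
    import numpy as np
    from scipy.signal import hilbert

    fs = 20e6                                  # sampling rate of a synthetic RF line (Hz)
    t = np.arange(0, 5e-6, 1 / fs)
    carrier = np.sin(2 * np.pi * 5e6 * t)      # 5 MHz carrier
    modulation = np.exp(-((t - 2.5e-6) ** 2) / (2 * (0.5e-6) ** 2))
    rf_line = modulation * carrier             # modulated "RF" signal

    analytic = hilbert(rf_line)                # complex analytic signal
    envelope = np.abs(analytic)                # local amplitude (quantitative part)
    local_phase = np.angle(analytic)           # local phase (qualitative/structural part)

    print(envelope.max(), modulation.max())    # envelope closely tracks the modulation
    ```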

  20. Finite-difference time-domain modelling of through-the-Earth radio signal propagation

    NASA Astrophysics Data System (ADS)

    Ralchenko, M.; Svilans, M.; Samson, C.; Roper, M.

    2015-12-01

    This research seeks to extend the knowledge of how a very low frequency (VLF) through-the-Earth (TTE) radio signal behaves as it propagates underground, by calculating and visualizing the strength of the electric and magnetic fields for an arbitrary geology through numeric modelling. To achieve this objective, a new software tool has been developed using the finite-difference time-domain method. This technique is particularly well suited to visualizing the distribution of electromagnetic fields in an arbitrary geology. The frequency range of TTE radio (400-9000 Hz) and the geometrical scales involved (1 m resolution for domains a few hundred metres in size) require processing a grid composed of millions of cells for thousands of time steps, which is computationally expensive. Graphics processing unit acceleration was used to reduce execution time from days and weeks to minutes and hours. Results from the new modelling tool were compared to three cases for which an analytic solution is known, showing good agreement between numeric and analytic results; deviations were likely caused by numeric artifacts from the model boundaries. However, in a TTE application in field conditions, the uncertainty in the conductivity of the various geologic formations will greatly outweigh these small numeric errors. Two further case studies featured complex geologic environments relevant to TTE communications that cannot be solved analytically.
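
    The core of any FDTD solver is the leapfrog update of interleaved electric and magnetic fields. The toy 1D loop below shows that structure for a pulse entering a lossy half-space; it is only a structural sketch (an optical-scale pulse on a 1 m grid), not the 3D, GPU-accelerated VLF modelling tool described above, and the source, conductivity, and grid size are made-up values.

    ```python
    # Minimal 1D FDTD (Yee) update loop for an electromagnetic pulse entering a lossy half-space.
    # Illustrative toy only; not the 3D, GPU-accelerated VLF tool described in the abstract.
    import numpy as np

    nz, nt = 400, 1200
    dz = 1.0                               # 1 m cells, matching the paper's resolution
    c0 = 3e8
    dt = 0.5 * dz / c0                     # Courant-stable time step
    eps0, mu0 = 8.854e-12, 4e-7 * np.pi
    sigma = np.zeros(nz)
    sigma[200:] = 1e-3                     # conductive "ground" (1 mS/m) below cell 200

    ez = np.zeros(nz)
    hy = np.zeros(nz - 1)
    loss = sigma * dt / (2 * eps0)
    ca, cb = (1 - loss) / (1 + loss), (dt / (eps0 * dz)) / (1 + loss)

    for n in range(nt):
        hy += (dt / (mu0 * dz)) * (ez[1:] - ez[:-1])                     # update H from curl E
        ez[1:-1] = ca[1:-1] * ez[1:-1] + cb[1:-1] * (hy[1:] - hy[:-1])   # update E from curl H
        ez[50] += np.exp(-((n - 60) ** 2) / 200.0)                       # soft Gaussian source

    print(float(np.abs(ez[250])))          # field sampled inside the conductive region
    ```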

  1. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. There is a need by the public and private sectors for vastly improved data management and visualization tools. This is the WetDATA opportunity - to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation, and to collaborate with other US and global water technology hubs. ROADMAP: a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; leveraging the Western States Water Council Water Data Exchange database; and development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. TOOLS: Education - information on water issues and risks at the local, state, national, and global scale. Visualizations - data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics - use of publicly available water databases and machine learning to develop water availability forecasting tools, and time-lapse images to support city and urban planning.

  2. Electrospun polyvinyl alcohol ultra-thin layer chromatography of amino acids.

    PubMed

    Lu, Tian; Olesik, Susan V

    2013-01-01

    Electrospun polyvinyl alcohol (PVA) ultrathin-layer chromatographic (UTLC) plates were fabricated using an in situ crosslinking electrospinning technique. These UTLC plates were characterized using the separation of fluorescein isothiocyanate (FITC)-labeled amino acids and the separation of amino acids followed by visualization using ninhydrin. The in situ crosslinked electrospun PVA plates showed enhanced stability in water and were stable when used for the UTLC study. The selectivity for FITC-labeled amino acids on the PVA plate was compared with that on a commercial Si-Gel plate. The efficiency of the separation varied with analyte concentration, size of the capillary analyte applicator, analyte volume, and mat thickness. A concentration of 7 mM or less, a 50 μm i.d. capillary applicator, a minimum volume of analyte solution, and a three-layered mat provided the best efficiency for FITC-labeled amino acids on the PVA UTLC plate. The efficiency on the PVA plate was greatly improved compared to that on the Si-Gel HPTLC plate. The hydrolysis products of aspartame in diet coke, aspartic acid and phenylalanine, were also successfully analyzed using the PVA UTLC plate. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Visualizing molecular diffusion through passive permeability barriers in cells: conventional and novel approaches.

    PubMed

    Lin, Yu-Chun; Phua, Siew Cheng; Lin, Benjamin; Inoue, Takanari

    2013-08-01

    Diffusion barriers are universal solutions for cells to achieve distinct organizations, compositions, and activities within a limited space. The influence of diffusion barriers on the spatiotemporal dynamics of signaling molecules often determines cellular physiology and functions. Over the years, the passive permeability barriers in various subcellular locales have been characterized using elaborate analytical techniques. In this review, we will summarize the current state of knowledge on the various passive permeability barriers present in mammalian cells. We will conclude with a description of several conventional techniques and one new approach based on chemically inducible diffusion trap (CIDT) for probing permeable barriers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Fast detection and visualization of minced lamb meat adulteration using NIR hyperspectral imaging and multivariate image analysis.

    PubMed

    Kamruzzaman, Mohammed; Sun, Da-Wen; ElMasry, Gamal; Allen, Paul

    2013-01-15

    Many studies have been carried out in developing non-destructive technologies for predicting meat adulteration, but there has been no attempt at non-destructive detection and quantification of adulteration in minced lamb meat. The main goal of this study was to develop and optimize a rapid analytical technique based on near-infrared (NIR) hyperspectral imaging to detect the level of adulteration in minced lamb. An initial investigation was carried out using principal component analysis (PCA) to identify the most likely adulterant in minced lamb. Minced lamb meat samples were then adulterated with minced pork in the range 2-40% (w/w) at approximately 2% increments. Spectral data were used to develop a partial least squares regression (PLSR) model to predict the level of adulteration in minced lamb. A good prediction model was obtained using the whole spectral range (910-1700 nm) with a coefficient of determination (R(2)(cv)) of 0.99 and a root-mean-square error estimated by cross validation (RMSECV) of 1.37%. Four important wavelengths (940, 1067, 1144 and 1217 nm) were selected using weighted regression coefficients (Bw), and a multiple linear regression (MLR) model was then established using these wavelengths to predict adulteration. The MLR model resulted in a coefficient of determination (R(2)(cv)) of 0.98 and an RMSECV of 1.45%. The developed MLR model was then applied to each pixel in the image to obtain prediction maps visualizing the distribution of adulteration in the tested samples. The results demonstrated that laborious and time-consuming traditional analytical techniques could be replaced by spectral data to provide a rapid, low-cost, and non-destructive testing technique for adulterant detection in minced lamb meat. Copyright © 2012 Elsevier B.V. All rights reserved.
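
    The modelling chain described above (PLSR over the full 910-1700 nm range, then MLR on a handful of selected wavelengths, both judged by RMSECV) can be sketched as follows on synthetic spectra; the data, number of latent variables, and wavelength-selection step are placeholders, not the study's.

    ```python
    # Sketch of the modelling steps on synthetic spectra: PLSR on the full spectral
    # range, then MLR on a few "important" wavelengths. Data are made up for illustration.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(1)
    wavelengths = np.linspace(910, 1700, 200)          # nm, matching the study's range
    adulteration = rng.uniform(2, 40, 120)             # % pork in minced lamb (w/w)
    # Synthetic spectra: baseline + adulteration-dependent absorption band + noise.
    band = np.exp(-0.5 * ((wavelengths[None, :] - 1200) / 60) ** 2)
    spectra = 0.3 + 0.01 * adulteration[:, None] * band \
              + 0.005 * rng.standard_normal((120, 200))

    pls = PLSRegression(n_components=8)
    pred_pls = cross_val_predict(pls, spectra, adulteration, cv=10).ravel()
    print(f"PLSR RMSECV: {np.sqrt(np.mean((pred_pls - adulteration) ** 2)):.2f} %")

    # MLR restricted to four selected wavelengths (the Bw selection step is not reproduced here).
    selected = [np.argmin(np.abs(wavelengths - w)) for w in (940, 1067, 1144, 1217)]
    mlr = LinearRegression()
    pred_mlr = cross_val_predict(mlr, spectra[:, selected], adulteration, cv=10)
    print(f"MLR  RMSECV: {np.sqrt(np.mean((pred_mlr - adulteration) ** 2)):.2f} %")
    ```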

  5. Visualization and Analytics Software Tools for Peregrine System

    Science.gov Websites

    This page lists visualization and analytics software tools available on the Peregrine system. R is a language and environment for statistical computing and graphics; see the R web site for more information. FastX supports visualization for OpenGL-based applications; for more information, go to the FastX page. ParaView is an open-source data analysis and visualization application.

  6. Modeling human comprehension of data visualizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzen, Laura E.; Haass, Michael Joseph; Divis, Kristin Marie

    This project was inspired by two needs. The first is a need for tools to help scientists and engineers to design effective data visualizations for communicating information, whether to the user of a system, an analyst who must make decisions based on complex data, or in the context of a technical report or publication. Most scientists and engineers are not trained in visualization design, and they could benefit from simple metrics to assess how well their visualization's design conveys the intended message. In other words, will the most important information draw the viewer's attention? The second is the need for cognition-based metrics for evaluating new types of visualizations created by researchers in the information visualization and visual analytics communities. Evaluating visualizations is difficult even for experts. However, all visualization methods and techniques are intended to exploit the properties of the human visual system to convey information efficiently to a viewer. Thus, developing evaluation methods that are rooted in the scientific knowledge of the human visual system could be a useful approach. In this project, we conducted fundamental research on how humans make sense of abstract data visualizations, and how this process is influenced by their goals and prior experience. We then used that research to develop a new model, the Data Visualization Saliency Model, that can make accurate predictions about which features in an abstract visualization will draw a viewer's attention. The model is an evaluation tool that can address both of the needs described above, supporting both visualization research and Sandia mission needs.

  7. Visual Analytics for MOOC Data.

    PubMed

    Qu, Huamin; Chen, Qing

    2015-01-01

    With the rise of massive open online courses (MOOCs), tens of millions of learners can now enroll in more than 1,000 courses via MOOC platforms such as Coursera and edX. As a result, a huge amount of data has been collected. Compared with traditional education records, the data from MOOCs has much finer granularity and also contains new pieces of information. It is the first time in history that such comprehensive data related to learning behavior has become available for analysis. What roles can visual analytics play in this MOOC movement? The authors survey the current practice and argue that MOOCs provide an opportunity for visualization researchers and that visual analytics systems for MOOCs can benefit a range of end users such as course instructors, education researchers, students, university administrators, and MOOC providers.

  8. Mass diffusion coefficient measurement for vitreous humor using FEM and MRI

    NASA Astrophysics Data System (ADS)

    Rattanakijsuntorn, Komsan; Penkova, Anita; Sadha, Satwindar S.

    2018-01-01

    In early studies, the ‘contour method’ for determining the diffusion coefficient of the vitreous humor was developed. This technique relied on careful injection of an MRI contrast agent (surrogate drug) into the vitreous humor of fresh bovine eyes, and on tracking the contours of the contrast agent in time. In addition, an analytical solution was developed for the theoretical contours, built on a point-source model for the injected surrogate drug. Matching the theoretical and experimental contours in a least-squares fit, while floating the diffusion coefficient, yielded the value of the diffusion coefficient. The method was limited in that the initial injection of the surrogate had to be spherical or ellipsoidal, because the analytical result was based on the point-source model. With the new finite element model used for the analysis in this study, the technique is much less restrictive and handles irregular shapes of the initial bolus. Fresh bovine eyes were used for the drug diffusion study in the vitreous, and three contrast agents of different molecular masses were used as drug surrogates to visualize the diffusion process by MRI: gadolinium-diethylenetriaminepentaacetic acid (Gd-DTPA, 938 Da), non-ionic gadoteridol (Prohance, 559 Da), and bovine albumin conjugated with gadolinium (Galbumin, 74 kDa). A 3D finite element model was developed to determine the diffusion coefficients of these surrogates from the MRI images. This method can be used for other types of bioporous media provided the concentration profile can be visualized (by methods such as MRI or fluorescence).
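
    The parameter-fitting idea behind the earlier contour method, recovering the diffusion coefficient by least-squares matching of modelled and observed concentrations, can be sketched as below using the point-source solution; the study's actual approach replaces this analytical model with a 3D finite element simulation, and all values here are illustrative.

    ```python
    # Sketch of diffusion-coefficient fitting: recover D by least-squares matching of
    # modelled and "observed" concentration profiles. Uses the simple point-source
    # analytical solution (the earlier method), not the study's finite element model.
    import numpy as np
    from scipy.optimize import least_squares

    def point_source_conc(r, t, D, M=1.0):
        """Concentration of mass M released at the origin after time t (isotropic diffusion)."""
        return M / (4 * np.pi * D * t) ** 1.5 * np.exp(-r**2 / (4 * D * t))

    r = np.linspace(0.5e-3, 6e-3, 40)           # radial positions (m)
    t_obs = 3600.0                              # 1 hour after injection (s)
    D_true = 5e-10                              # m^2/s, small-molecule order of magnitude
    observed = point_source_conc(r, t_obs, D_true)
    observed *= 1 + 0.02 * np.random.default_rng(2).standard_normal(r.size)  # MRI-like noise

    fit = least_squares(lambda logD: point_source_conc(r, t_obs, 10**logD[0]) - observed,
                        x0=[-9.0])
    print(f"recovered D = {10**fit.x[0]:.2e} m^2/s (true {D_true:.2e})")
    ```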

  9. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545

  10. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  11. Application of Genomic In Situ Hybridization in Horticultural Science

    PubMed Central

    Ramzan, Fahad; Lim, Ki-Byung

    2017-01-01

    Molecular cytogenetic techniques such as in situ hybridization are valuable tools for analyzing genomic structure and function, chromosome constituents, recombination patterns, alien gene introgression, genome evolution, aneuploidy, and polyploidy, as well as for visualizing genome constitution and discriminating chromosomes from different genomes in the allopolyploids of various horticultural crops. Multicolor GISH detection is a significant approach for analyzing the small and numerous chromosomes in fruit species, for example Diospyros hybrids. The technique has proved to be an exact and effective way to confirm hybrid status and helps to distinguish donor parental genomes in ornamental hybrids such as Clivia, Rhododendron, and Lycoris. Genome characterization facilitates the selection of hybrids with desirable characteristics during early hybridization breeding, because the technique expedites the detection of introgressed chromosome sequences. This review summarizes applications and advances of genomic in situ hybridization (GISH) techniques in horticultural plants. PMID:28459054

  12. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  13. Collaborative visual analytics of radio surveys in the Big Data era

    NASA Astrophysics Data System (ADS)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.

  14. Atomic force microscopic investigation of commercial pressure sensitive adhesives for forensic analysis.

    PubMed

    Canetta, Elisabetta; Adya, Ashok K

    2011-07-15

    Pressure sensitive adhesives (PSAs), such as those used in packaging and adhesive tapes, are very often encountered in forensic investigations. In criminal activities, packaging tapes may be used for sealing packets containing drugs, explosive devices, or questioned documents, while adhesive and electrical tapes are used occasionally in kidnapping cases. In this work, the potential of using atomic force microscopy (AFM) in both imaging and force mapping (FM) modes to derive additional analytical information from PSAs is demonstrated. AFM was first used to illustrate differences in the ultrastructural and nanomechanical properties of three visually distinguishable commercial PSAs, to test the feasibility of the technique. Subsequently, AFM was used to detect nanoscopic differences between three visually indistinguishable PSAs. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. Ultrasensitive Visual Detection of HIV DNA Biomarkers via a Multi-amplification Nanoplatform.

    PubMed

    Long, Yuyin; Zhou, Cuisong; Wang, Congmin; Cai, Honglian; Yin, Cuiyun; Yang, Qiufang; Xiao, Dan

    2016-04-01

    Methodologies to detect disease biomarkers at ultralow concentrations can potentially improve the standard of living. A facile and label-free multi-amplification strategy is proposed for the ultrasensitive visual detection of HIV DNA biomarkers in real physiological media. This multi-amplification strategy not only exhibits a significantly low detection limit, down to 4.8 pM, but also provides a label-free, cost-effective and facile technique for visualizing a few molecules of nucleic acid analyte with the naked eye. Importantly, the biosensor is capable of discriminating single-base mismatches at concentrations lower than 5.0 nM in human serum samples. Moreover, the visual sensing platform exhibits excellent specificity, acceptable reusability and long-term stability. All these advantages can be attributed to the nanofibrous sensing platform, which 1) has a high surface-area-to-volume ratio provided by the electrospun nanofibrous membrane, and 2) combines glucose oxidase (GOx) biocatalysis, DNAzyme-catalyzed colorimetric reaction and catalytic hairpin assembly (CHA) recycling amplification together. This multi-amplification nanoplatform promises label-free and visual single-base mismatch DNA monitoring with high sensitivity and specificity, suggesting wide applications ranging from virus detection to genetic disease diagnosis.

  16. Streaming Visual Analytics Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Burtner, Edwin R.; Kritzstein, Brian P.

    How can we best enable users to understand complex emerging events and make appropriate assessments from streaming data? This was the central question addressed at a three-day workshop on streaming visual analytics. This workshop was organized by Pacific Northwest National Laboratory for a government sponsor. It brought together forty researchers and subject matter experts from government, industry, and academia. This report summarizes the outcomes from that workshop. It describes elements of the vision for a streaming visual analytic environment and a set of important research directions needed to achieve this vision. Streaming data analysis is in many ways the analysis and understanding of change. However, current visual analytics systems usually focus on static data collections, meaning that dynamically changing conditions are not appropriately addressed. The envisioned mixed-initiative streaming visual analytics environment creates a collaboration between the analyst and the system to support the analysis process. It raises the level of discourse from low-level data records to higher-level concepts. The system supports the analyst’s rapid orientation and reorientation as situations change. It provides an environment to support the analyst’s critical thinking. It infers tasks and interests based on the analyst’s interactions. The system works as both an assistant and a devil’s advocate, finding relevant data and alerts as well as considering alternative hypotheses. Finally, the system supports sharing of findings with others. Making such an environment a reality requires research in several areas. The workshop discussions focused on four broad areas: support for critical thinking, visual representation of change, mixed-initiative analysis, and the use of narratives for analysis and communication.

  17. A technique system for the measurement, reconstruction and character extraction of rice plant architecture

    PubMed Central

    Li, Xumeng; Wang, Xiaohui; Wei, Hailin; Zhu, Xinguang; Peng, Yulin; Li, Ming; Li, Tao; Huang, Huang

    2017-01-01

    This study developed a technique system for the measurement, reconstruction, and trait extraction of rice canopy architectures, which have challenged functional–structural plant modeling for decades and have become the foundation of the design of ideo-plant architectures. The system uses the location-separation-measurement method (LSMM) for the collection of data on the canopy architecture and the analytic geometry method for the reconstruction and visualization of the three-dimensional (3D) digital architecture of the rice plant. It also uses the virtual clipping method for extracting the key traits of the canopy architecture such as the leaf area, inclination, and azimuth distribution in spatial coordinates. To establish the technique system, we developed (i) simple tools to measure the spatial position of the stem axis and azimuth of the leaf midrib and to capture images of tillers and leaves; (ii) computer software programs for extracting data on stem diameter, leaf nodes, and leaf midrib curves from the tiller images and data on leaf length, width, and shape from the leaf images; (iii) a database of digital architectures that stores the measured data and facilitates the reconstruction of the 3D visual architecture and the extraction of architectural traits; and (iv) computation algorithms for virtual clipping to stratify the rice canopy, to extend the stratified surface from the horizontal plane to a general curved surface (including a cylindrical surface), and to implement in silico. Each component of the technique system was quantitatively validated and visually compared to images, and the sensitivity of the virtual clipping algorithms was analyzed. This technique is inexpensive and accurate and provides high throughput for the measurement, reconstruction, and trait extraction of rice canopy architectures. The technique provides a more practical method of data collection to serve functional–structural plant models of rice and for the optimization of rice canopy types. Moreover, the technique can be easily adapted for other cereal crops such as wheat, which has numerous stems and leaves sheltering each other. PMID:28558045
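
    The virtual-clipping step, stratifying the reconstructed canopy and accumulating leaf area per layer, can be approximated with the toy below, which assigns each leaf triangle to the stratum of its centroid; the real algorithm splits triangles at layer boundaries and supports curved clipping surfaces, and the synthetic triangles here are placeholders.

    ```python
    # Toy sketch of "virtual clipping": stratify a reconstructed canopy into horizontal
    # layers and accumulate leaf area per layer. Real virtual clipping splits triangles
    # that cross layer boundaries; here each triangle is simply assigned to the layer of
    # its centroid.
    import numpy as np

    def triangle_area(p0, p1, p2):
        return 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0))

    def leaf_area_profile(triangles, layer_edges):
        """triangles: (n, 3, 3) array of vertex coordinates (x, y, z) in metres."""
        areas = np.zeros(len(layer_edges) - 1)
        for tri in triangles:
            z_centroid = tri[:, 2].mean()
            layer = np.searchsorted(layer_edges, z_centroid) - 1
            if 0 <= layer < areas.size:
                areas[layer] += triangle_area(*tri)
        return areas

    # Example: random synthetic "leaf" triangles between 0 and 1 m height.
    rng = np.random.default_rng(3)
    tris = rng.uniform(0, 1, size=(500, 3, 3)) * np.array([0.3, 0.3, 1.0])
    edges = np.linspace(0.0, 1.0, 6)                  # five 0.2 m strata
    print(leaf_area_profile(tris, edges))             # leaf area (m^2) per stratum
    ```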

  18. Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.

    PubMed

    Endert, A; Fiaux, P; North, C

    2012-12-01

    Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.
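
    One way to read the keyword-weighting mechanism is sketched below: when the analyst drags two documents together, the terms they share are up-weighted, so a weighted similarity (and hence the spatial layout) moves them closer. This is a loose illustration with made-up documents and update rules, not ForceSPIRE's actual model.

    ```python
    # Illustrative sketch of semantic interaction: dragging two documents together
    # boosts the weights of their shared keywords, which increases their model similarity.
    import numpy as np

    vocab = ["budget", "election", "flu", "hospital", "vaccine"]
    # Term-frequency vectors for three documents (rows) over the vocabulary (columns).
    docs = np.array([
        [0, 1, 3, 0, 2],    # doc 0: flu outbreak story with a political angle
        [0, 0, 2, 2, 1],    # doc 1: flu cases at a hospital
        [3, 2, 0, 0, 0],    # doc 2: budget/election story
    ], dtype=float)
    weights = np.ones(len(vocab))

    def weighted_similarity(a, b, w):
        aw, bw = a * w, b * w
        return aw @ bw / (np.linalg.norm(aw) * np.linalg.norm(bw))

    def reinforce_shared_terms(i, j, weights, rate=0.5):
        """User dragged documents i and j together: boost the terms they share."""
        shared = (docs[i] > 0) & (docs[j] > 0)
        weights = weights.copy()
        weights[shared] += rate
        return weights / weights.sum() * len(weights)   # keep weights normalized

    print(weighted_similarity(docs[0], docs[1], weights))
    weights = reinforce_shared_terms(0, 1, weights)       # analyst groups doc 0 and doc 1
    print(weighted_similarity(docs[0], docs[1], weights)) # similarity increases
    ```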

  19. Big data and visual analytics in anaesthesia and health care.

    PubMed

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  1. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  2. The Visual Analogue Scale for Rating, Ranking and Paired-Comparison (VAS-RRP): A new technique for psychological measurement.

    PubMed

    Sung, Yao-Ting; Wu, Jeng-Shin

    2018-04-17

    Traditionally, the visual analogue scale (VAS) has been proposed to overcome the limitations of ordinal measures from Likert-type scales. However, the ability of VASs to overcome the limitations of response styles in Likert-type scales has not yet been addressed. Previous research using ranking and paired comparisons to compensate for the response styles of Likert-type scales has suffered from limitations, such as that the total score of ipsative measures is a constant that cannot be analyzed by means of many common statistical techniques. In this study we propose a new scale, called the Visual Analogue Scale for Rating, Ranking, and Paired-Comparison (VAS-RRP), which can be used to collect rating, ranking, and paired-comparison data simultaneously, while avoiding the limitations of each of these data collection methods. The characteristics, use, and analytic method of VAS-RRPs, as well as how they overcome the disadvantages of Likert-type scales, ranking, and VASs, are discussed. On the basis of analyses of simulated and empirical data, this study showed that VAS-RRPs improved reliability and parameter recovery and reduced response-style bias. Finally, we designed a VAS-RRP Generator with which researchers can construct and administer their own VAS-RRPs.

  3. Visual analytics of anomaly detection in large data streams

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.; Sharma, Ratnesh K.; Mehta, Abhay

    2009-01-01

    Most data streams usually are multi-dimensional, high-speed, and contain massive volumes of continuous information. They are seen in daily applications, such as telephone calls, retail sales, data center performance, and oil production operations. Many analysts want insight into the behavior of this data. They want to catch the exceptions in flight to reveal the causes of the anomalies and to take immediate action. To guide the user in finding the anomalies in the large data stream quickly, we derive a new automated neighborhood threshold marking technique, called AnomalyMarker. This technique is built on cell-based data streams and user-defined thresholds. We extend the scope of the data points around the threshold to include the surrounding areas. The idea is to define a focus area (marked area) which enables users to (1) visually group the interesting data points related to the anomalies (i.e., problems that occur persistently or occasionally) for observing their behavior; (2) discover the factors related to the anomaly by visualizing the correlations between the problem attribute with the attributes of the nearby data items from the entire multi-dimensional data stream. Mining results are quickly presented in graphical representations (i.e., tooltip) for the user to zoom into the problem regions. Different algorithms are introduced which try to optimize the size and extent of the anomaly markers. We have successfully applied this technique to detect data stream anomalies in large real-world enterprise server performance and data center energy management.
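
    A minimal reading of neighborhood threshold marking is sketched below: points above a user-defined threshold are flagged as anomalies, and points inside a band just below it are marked as the surrounding focus area. The band width, stream, and labels are illustrative assumptions; the paper's algorithms adapt the marker size and also correlate neighbouring attributes across the multi-dimensional stream.

    ```python
    # Minimal sketch of neighborhood threshold marking on a data stream: flag not only
    # points that cross the user-defined threshold, but also points in a "focus area"
    # just below it, so the analyst can observe behaviour around each anomaly.
    import random

    def mark_anomalies(stream, threshold, neighborhood=0.1):
        """Yield (value, label) pairs: 'anomaly' above the threshold,
        'near' inside the marked focus area just below it, else 'normal'."""
        band = neighborhood * threshold
        for value in stream:
            if value > threshold:
                yield value, "anomaly"
            elif value > threshold - band:
                yield value, "near"
            else:
                yield value, "normal"

    # Example: synthetic server-utilisation stream with occasional spikes.
    random.seed(4)
    stream = (min(100, random.gauss(60, 10) + (40 if random.random() < 0.03 else 0))
              for _ in range(200))
    flagged = [(round(v, 1), lab) for v, lab in mark_anomalies(stream, threshold=90)
               if lab != "normal"]
    print(flagged)
    ```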

  4. Combustion of bulk titanium in oxygen

    NASA Technical Reports Server (NTRS)

    Clark, A. F.; Moulder, J. C.; Runyan, C. C.

    1975-01-01

    The combustion of bulk titanium in one atmosphere oxygen is studied using laser ignition and several analytical techniques. These were high-speed color cinematography, time and space resolved spectra in the visible region, metallography (including SEM) of specimens quenched in argon gas, X-ray and chemical product analyses, and a new optical technique, the Hilbert transform method. The cinematographic application of this technique for visualizing phase objects in the combustion zone is described. The results indicate an initial vapor phase reaction immediately adjacent to the molten surface but as the oxygen uptake progresses the evaporation approaches the point of congruency and a much reduced evaporation rate. This and the accumulation of the various soluble oxides soon drive the reaction zone below the surface where gas formation causes boiling and ejection of particles. The buildup of rutile cuts off the oxygen supply and the reaction ceases.

  5. Immersive Visual Analytics for Transformative Neutron Scattering Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Daniel, Jamison R; Drouhard, Margaret

    The ORNL Spallation Neutron Source (SNS) provides the most intense pulsed neutron beams in the world for scientific research and development across a broad range of disciplines. SNS experiments produce large volumes of complex data that are analyzed by scientists with varying degrees of experience using 3D visualization and analysis systems. However, it is notoriously difficult to achieve proficiency with 3D visualizations. Because 3D representations are key to understanding the neutron scattering data, scientists are unable to analyze their data in a timely fashion resulting in inefficient use of the limited and expensive SNS beam time. We believe a more intuitive interface for exploring neutron scattering data can be created by combining immersive virtual reality technology with high performance data analytics and human interaction. In this paper, we present our initial investigations of immersive visualization concepts as well as our vision for an immersive visual analytics framework that could lower the barriers to 3D exploratory data analysis of neutron scattering data at the SNS.

  6. A Software Developer’s Guide to Informal Evaluation of Visual Analytics Environments Using VAST Challenge Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Scholtz, Jean; Whiting, Mark A.

    The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions and submissions. This paper details the various types of evaluations that may be conducted using the Repository information. In this paper we describe how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.

  7. Early visual analysis tool using magnetoencephalography for treatment and recovery of neuronal dysfunction.

    PubMed

    Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon

    2017-10-01

    Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization that exploits the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values, allowing personalized threshold levels, and the computation of a default model from MEG data of a control population. The default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with the international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
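
    The MSC step can be sketched as below: compute magnitude squared coherence between sensor pairs with Welch's method and keep pairs whose band-limited coherence exceeds a personalised threshold as connectogram edges. The channel names, frequency band, and threshold are illustrative assumptions, not values from the tool.

    ```python
    # Sketch of the magnitude squared coherence (MSC) step: compute MSC between sensor
    # pairs and keep pairs above a personalised threshold as connectogram edges.
    import numpy as np
    from scipy.signal import coherence
    from itertools import combinations

    fs = 1000.0                                  # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(5)
    common = np.sin(2 * np.pi * 10 * t)          # shared 10 Hz alpha-band component
    channels = {
        "MEG0111": common + 0.5 * rng.standard_normal(t.size),
        "MEG0121": common + 0.5 * rng.standard_normal(t.size),
        "MEG2641": rng.standard_normal(t.size),  # unrelated channel
    }

    edges = []
    for (name_a, a), (name_b, b) in combinations(channels.items(), 2):
        f, cxy = coherence(a, b, fs=fs, nperseg=1024)
        band = (f >= 8) & (f <= 12)              # alpha band
        msc = cxy[band].max()                    # peak coherence in the band
        if msc > 0.5:                            # personalised threshold
            edges.append((name_a, name_b, round(float(msc), 2)))

    print(edges)   # pairs that would be drawn as arcs in the connectogram
    ```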

  8. IDP camp evolvement analysis in Darfur using VHSR optical satellite image time series and scientific visualization on virtual globes

    NASA Astrophysics Data System (ADS)

    Tiede, Dirk; Lang, Stefan

    2010-11-01

    In this paper we focus on the application of transferable, object-based image analysis algorithms for dwelling extraction in a camp for internally displaced people (IDP) in Darfur, Sudan, along with innovative means for scientific visualisation of the results. Three very high spatial resolution satellite images (QuickBird: 2002, 2004, 2008) were used for: (1) extracting different types of dwellings and (2) calculating and visualizing added-value products such as dwelling density and camp structure. The results were visualized on virtual globes (Google Earth and ArcGIS Explorer) as analytical 3D views, with the analysis results transformed into the third dimension (z-value). Data formats depend on the virtual globe software and include KML/KMZ (keyhole mark-up language) and ESRI 3D shapefiles streamed as an ArcGIS Server-based globe service. In addition, means for improving the overall performance of the automated extraction of dwelling structures using grid computing techniques are discussed, with examples from a similar study.

  9. Theoretical investigation of confocal microscopy using an elliptically polarized cylindrical vector laser beam: Visualization of quantum emitters near interfaces

    NASA Astrophysics Data System (ADS)

    Boichenko, Stepan

    2018-04-01

    We theoretically study laser-scanning confocal fluorescence microscopy using elliptically polarized cylindrical vector excitation light as a tool for visualization of arbitrarily oriented single quantum dipole emitters located (1) near planar surfaces enhancing fluorescence, (2) in a thin supported polymer film, (3) in a freestanding polymer film, and (4) in a dielectric planar microcavity. It is shown analytically that, by using a tightly focused azimuthally polarized beam, it is possible to completely exclude the orientational dependence of the image intensity maximum of a quantum emitter that absorbs light as a pair of incoherent, independent linear dipoles. For linear dipole quantum emitters, an orientational independence degree higher than 0.9 can normally be achieved (a value of 1 corresponds to completely excluded orientational dependence) if the collection efficiency of the microscope objective and the emitter's total quantum yield are not strongly orientationally dependent. Thus, the visualization of arbitrarily oriented single quantum emitters by means of the studied technique can be performed quite efficiently.

  10. The Role of Visual Learning in Improving Students' High-Order Thinking Skills

    ERIC Educational Resources Information Center

    Raiyn, Jamal

    2016-01-01

    Various concepts have been introduced to improve students' analytical thinking skills based on problem based learning (PBL). This paper introduces a new concept to increase students' analytical thinking skills based on a visual learning strategy. Such a strategy has three fundamental components: a teacher, a student, and a learning process. The…

  11. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  12. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.

  13. Investigation of new radar-data-reduction techniques used to determine drag characteristics of a free-flight vehicle

    NASA Technical Reports Server (NTRS)

    Woodbury, G. E.; Wallace, J. W.

    1974-01-01

    An investigation was conducted of new techniques used to determine the complete transonic drag characteristics of a series of free-flight drop-test models using principally radar tracking data. The full capabilities of the radar tracking and meteorological measurement systems were utilized. In addition, preflight trajectory design, exact kinematic equations, and visual-analytical filtering procedures were employed. The results of this study were compared with the results obtained from analysis of the onboard, accelerometer and pressure sensor data of the only drop-test model that was instrumented. The accelerometer-pressure drag curve was approximated by the radar-data drag curve. However, a small amplitude oscillation on the latter curve precluded a precise definition of its drag rise.

  14. Unlocking Proteomic Heterogeneity in Complex Diseases through Visual Analytics

    PubMed Central

    Bhavnani, Suresh K.; Dang, Bryant; Bellala, Gowtham; Divekar, Rohit; Visweswaran, Shyam; Brasier, Allan; Kurosky, Alex

    2015-01-01

    Despite years of preclinical development, biological interventions designed to treat complex diseases like asthma often fail in phase III clinical trials. These failures suggest that current methods to analyze biomedical data might be missing critical aspects of biological complexity such as the assumption that cases and controls come from homogeneous distributions. Here we discuss why and how methods from the rapidly evolving field of visual analytics can help translational teams (consisting of biologists, clinicians, and bioinformaticians) to address the challenge of modeling and inferring heterogeneity in the proteomic and phenotypic profiles of patients with complex diseases. Because a primary goal of visual analytics is to amplify the cognitive capacities of humans for detecting patterns in complex data, we begin with an overview of the cognitive foundations for the field of visual analytics. Next, we organize the primary ways in which a specific form of visual analytics called networks have been used to model and infer biological mechanisms, which help to identify the properties of networks that are particularly useful for the discovery and analysis of proteomic heterogeneity in complex diseases. We describe one such approach called subject-protein networks, and demonstrate its application on two proteomic datasets. This demonstration provides insights to help translational teams overcome theoretical, practical, and pedagogical hurdles for the widespread use of subject-protein networks for analyzing molecular heterogeneities, with the translational goal of designing biomarker-based clinical trials, and accelerating the development of personalized approaches to medicine. PMID:25684269
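
    A subject-protein network of the kind described above can be sketched as a bipartite graph: subjects are linked to the proteins that are highly expressed in them, and projecting onto subjects exposes candidate subgroups. The protein names, the "high expression" rule, and the grouping step below are illustrative assumptions, not the authors' method.

    ```python
    # Toy sketch of a subject-protein network: connect each subject to the proteins
    # highly expressed in that subject, then look for groups of subjects that share
    # proteins (a crude proxy for the heterogeneity analysis described above).
    import numpy as np
    import networkx as nx
    from networkx.algorithms import bipartite

    rng = np.random.default_rng(6)
    subjects = [f"S{i}" for i in range(8)]
    proteins = ["IL6", "IL8", "TNF", "CRP", "MUC5AC"]        # illustrative names
    expression = rng.gamma(2.0, 1.0, size=(len(subjects), len(proteins)))

    G = nx.Graph()
    G.add_nodes_from(subjects, bipartite="subject")
    G.add_nodes_from(proteins, bipartite="protein")
    threshold = np.median(expression)            # "high expression" = above the overall median
    for i, s in enumerate(subjects):
        for j, p in enumerate(proteins):
            if expression[i, j] > threshold:
                G.add_edge(s, p, weight=float(expression[i, j]))

    # Project onto subjects: two subjects are linked if they share a high-expression protein.
    subject_net = bipartite.weighted_projected_graph(G, subjects)
    for group in nx.connected_components(subject_net):
        print(sorted(group))                     # candidate subject subgroups
    ```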

  15. The science of visual analysis at extreme scale

    NASA Astrophysics Data System (ADS)

    Nowell, Lucy T.

    2011-01-01

    Driven by market forces and spanning the full spectrum of computational devices, computer architectures are changing in ways that present tremendous opportunities and challenges for data analysis and visual analytic technologies. Leadership-class high performance computing systems will have as many as a million cores by 2020 and will support 10 billion-way concurrency, while laptop computers are expected to have as many as 1,000 cores by 2015. At the same time, data of all types are increasing exponentially and automated analytic methods are essential for all disciplines. Many existing analytic technologies do not scale to make full use of current platforms, and fewer still are likely to scale to the systems that will be operational by the end of this decade. Furthermore, on the new architectures and for data at extreme scales, validating the accuracy and effectiveness of analytic methods, including visual analysis, will be increasingly important.

  16. Developing Visual Thinking in the Electronic Health Record.

    PubMed

    Boyd, Andrew D; Young, Christine D; Amatayakul, Margret; Dieter, Michael G; Pawola, Lawrence M

    2017-01-01

    The purpose of this vision paper is to identify how data visualization could transform healthcare. Electronic Health Records (EHRs) are maturing with new technology and tools being applied. Researchers are reaping the benefits of data visualization to better access compilations of EHR data for enhanced clinical research. Data visualization, while still primarily the domain of clinical researchers, is beginning to show promise for other stakeholders. A non-exhaustive review of the literature indicates that, relative to the growth and development of the EHR, the maturity of data visualization in healthcare is in its infancy. Visual analytics has been only cursorily applied to healthcare. A fundamental issue contributing to fragmentation and poor coordination of healthcare delivery is that each member of the healthcare team, including patients, has a different view. Summarizing all of this care comprehensively for any member of the healthcare team is a "wickedly hard" visual analytics and data visualization problem to solve.

  17. A Data-Driven Approach to Interactive Visualization of Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Jun

    Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools that assist grid operators in performing mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving business practices in today's electric power industry. The conducted investigation, however, has revealed that existing commercial power grid visualization tools rely heavily on human designers, hindering users' ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain the pre-designed visual displays. This project proposes a data-driven approach to overcome these common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on the fly. It does not require a visualization design stage, completely eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project is divided into two phases. The first phase (Phases I and II) focuses on developing data-driven techniques for visualization of the power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams and fuzzy-model-based rich data visualization for situational awareness. The R&D conducted during the second phase (Phase IIB) focuses on enhancing the prototyped data-driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototype developed during the first phase into a commercial-grade product. We use one of the identified application areas as an example to demonstrate how research results achieved in this project address an emerging industry need. In summary, the data-driven visualization approach developed in this project has proven promising for building next-generation power grid visualization tools. Application of this approach has resulted in a state-of-the-art commercial tool currently being leveraged by more than 60 utility organizations in North America and Europe.

  18. Scientific Visualization and Computational Science: Natural Partners

    NASA Technical Reports Server (NTRS)

    Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Scientific visualization is developing rapidly, stimulated by computational science, which is gaining acceptance as a third mode of inquiry alongside theory and experiment. Computational science is based on numerical simulations of mathematical models derived from theory. But each individual simulation is like a hypothetical experiment: initial conditions are specified, and the result is a record of the observed conditions. Experiments can be simulated for situations that cannot really be created or controlled. Results impossible to measure can be computed. Even for observable values, computed samples are typically much denser. Numerical simulations also extend scientific exploration where the mathematics is analytically intractable. Numerical simulations are used to study phenomena from subatomic to intergalactic scales and from abstract mathematical structures to pragmatic engineering of everyday objects. But computational science methods would be almost useless without visualization. The obvious reason is that the huge amounts of data produced require the high bandwidth of the human visual system, and interactivity adds to the power. Visualization systems also provide a single context for all the activities involved, from debugging the simulations, to exploring the data, to communicating the results. Many of today's visual presentations have their roots in image processing, where the fundamental task is: given an image, extract information about the scene. Visualization has developed from computer graphics and the inverse task: given a scene description, make an image. Visualization extends the graphics paradigm by expanding the possible input. The goal is still to produce images; the difficulty is that the input is not a scene description displayable by standard graphics methods. Visualization techniques must either transform the data into a scene description or extend graphics techniques to display this unconventional input. Computational science is a fertile field for visualization research because the results vary so widely and include things that have no known appearance. The amount of data creates additional challenges for both hardware and software systems. Evaluations of visualization should ultimately reflect the insight gained into the scientific phenomena, so making good visualizations requires consideration of the characteristics of the user and the purpose of the visualization. Knowledge about human perception and graphic design is also relevant. It is this breadth of knowledge that stimulates proposals for multidisciplinary visualization teams and intelligent visualization assistant software. Visualization is an immature field, but computational science is stimulating research on a broad front.

  19. SmartR: an open-source platform for interactive visual analytics for translational research data

    PubMed Central

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-01-01

    Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291

  20. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  1. Focusing on Environmental Biofilms With Variable-Pressure Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Joubert, L.; Wolfaardt, G. M.; Du Plessis, K.

    2006-12-01

    Since the term biofilm was coined almost 30 years ago, visualization has formed an integral part of investigations on microbial attachment. Electron microscopic (EM) biofilm studies, however, have been limited by the hydrated extracellular matrix, which loses structural integrity with conventional preparative techniques and under the required high-vacuum conditions, resulting in a loss of information on the spatial relationships and distribution of biofilm microbes. Recent advances in EM technology enable the application of Variable Pressure Scanning Electron Microscopy (VP SEM) to biofilms, allowing a low vacuum and a hydrated chamber atmosphere during visualization. Environmental biofilm samples can be viewed in situ, unfixed and fully hydrated, with gold sputter-coating applied only to increase image resolution. As the impact of microbial biofilms can be both hazardous and beneficial to humans and their environment, recognition of biofilms as a natural form of microbial existence is needed to fully assess the potential influence of microbial communities on technology. The integration of multiple techniques to elucidate biofilm processes has become imperative for unraveling the complex phenotypic adaptations of this microbial lifestyle. We applied VP SEM as an integrative technique alongside traditional and novel analytical techniques to (1) localize lignocellulosic microbial consortia applied for producing alternative bio-energy sources in the mining wastewater industry, (2) characterize and visualize wetland microbial communities in the treatment of winery wastewater, and (3) determine the impact of recombinant technology on yeast biofilm behavior. Visualization of microbial attachment to a lignocellulose substrate, and of the degradation of exposed plant tissue, gave insight into fiber degradation and volatile fatty acid production for biological sulphate removal from mining wastewater. Also, the 3D architecture of complex biofilms developing in constructed wetlands was correlated with molecular fingerprints of wetland communities using tRFLP (Terminal Restriction Fragment Length Polymorphism), giving evidence of temporal and spatial variation in a wetland system that could potentially be applied as a management tool in wastewater treatment. Visualization of differences in biofilm development by wild and recombinant yeast strains furthermore supported real-time quantitative data on biofilm development by Cryptococcus laurentii and Saccharomyces yeast strains. In all cases VP SEM allowed a more holistic interpretation of biofilm processes than afforded by quantitative empirical data only.

  2. Visual Analytics of Surveillance Data on Foodborne Vibriosis, United States, 1973–2010

    PubMed Central

    Sims, Jennifer N.; Isokpehi, Raphael D.; Cooper, Gabrielle A.; Bass, Michael P.; Brown, Shyretha D.; St John, Alison L.; Gulig, Paul A.; Cohly, Hari H.P.

    2011-01-01

    Foodborne illnesses caused by microbial and chemical contaminants in food are a substantial health burden worldwide. In 2007, human vibriosis (non-cholera Vibrio infections) became a notifiable disease in the United States. In addition, Vibrio species are among the 31 major known pathogens transmitted through food in the United States. Diverse surveillance systems for foodborne pathogens also track outbreaks, illnesses, hospitalization and deaths due to non-cholera vibrios. Considering the recognition of vibriosis as a notifiable disease in the United States and the availability of diverse surveillance systems, there is a need for the development of easily deployed visualization and analysis approaches that can combine diverse data sources in an interactive manner. Current efforts to address this need are still limited. Visual analytics is an iterative process conducted via visual interfaces that involves collecting information, data preprocessing, knowledge representation, interaction, and decision making. We have utilized public domain outbreak and surveillance data sources covering 1973 to 2010, as well as visual analytics software to demonstrate integrated and interactive visualizations of data on foodborne outbreaks and surveillance of Vibrio species. Through the data visualization, we were able to identify unique patterns and/or novel relationships within and across datasets regarding (i) causative agent; (ii) foodborne outbreaks and illness per state; (iii) location of infection; (iv) vehicle (food) of infection; (v) anatomical site of isolation of Vibrio species; (vi) patients and complications of vibriosis; (vii) incidence of laboratory-confirmed vibriosis and V. parahaemolyticus outbreaks. The additional use of emerging visual analytics approaches for interaction with data on vibriosis, including non-foodborne related disease, can guide disease control and prevention as well as ongoing outbreak investigations. PMID:22174586

  3. Bridging Theory with Practice: An Exploratory Study of Visualization Use and Design for Climate Model Comparison

    DOE PAGES

    Dasgupta, Aritra; Poco, Jorge; Wei, Yaxing; ...

    2015-03-16

    Evaluation methodologies in visualization have mostly focused on how well the tools and techniques cater to the analytical needs of the user. While this is important in determining the effectiveness of the tools and advancing the state-of-the-art in visualization research, a key area that has mostly been overlooked is how well established visualization theories and principles are instantiated in practice. This is especially relevant when domain experts, and not visualization researchers, design visualizations for analysis of their data or for broader dissemination of scientific knowledge. There is very little research on exploring the synergistic capabilities of cross-domain collaboration between domain experts and visualization researchers. To fill this gap, in this paper we describe the results of an exploratory study of climate data visualizations conducted in tight collaboration with a pool of climate scientists. The study analyzes a large set of static climate data visualizations for identifying their shortcomings in terms of visualization design. The outcome of the study is a classification scheme that categorizes the design problems in the form of a descriptive taxonomy. The taxonomy is a first attempt at systematically categorizing the types, causes, and consequences of design problems in visualizations created by domain experts. We demonstrate the use of the taxonomy for a number of purposes, such as improving the existing climate data visualizations, reflecting on the impact of the problems for enabling domain experts in designing better visualizations, and also learning about the gaps and opportunities for future visualization research. We demonstrate the applicability of our taxonomy through a number of examples and discuss the lessons learnt and implications of our findings.

  4. Bridging Theory with Practice: An Exploratory Study of Visualization Use and Design for Climate Model Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Poco, Jorge; Wei, Yaxing

    Evaluation methodologies in visualization have mostly focused on how well the tools and techniques cater to the analytical needs of the user. While this is important in determining the effectiveness of the tools and advancing the state-of-the-art in visualization research, a key area that has mostly been overlooked is how well established visualization theories and principles are instantiated in practice. This is especially relevant when domain experts, and not visualization researchers, design visualizations for analysis of their data or for broader dissemination of scientific knowledge. There is very little research on exploring the synergistic capabilities of cross-domain collaboration between domain experts and visualization researchers. To fill this gap, in this paper we describe the results of an exploratory study of climate data visualizations conducted in tight collaboration with a pool of climate scientists. The study analyzes a large set of static climate data visualizations for identifying their shortcomings in terms of visualization design. The outcome of the study is a classification scheme that categorizes the design problems in the form of a descriptive taxonomy. The taxonomy is a first attempt at systematically categorizing the types, causes, and consequences of design problems in visualizations created by domain experts. We demonstrate the use of the taxonomy for a number of purposes, such as improving the existing climate data visualizations, reflecting on the impact of the problems for enabling domain experts in designing better visualizations, and also learning about the gaps and opportunities for future visualization research. We demonstrate the applicability of our taxonomy through a number of examples and discuss the lessons learnt and implications of our findings.

  5. Revealing 3D Ultrastructure and Morphology of Stem Cell Spheroids by Electron Microscopy.

    PubMed

    Jaros, Josef; Petrov, Michal; Tesarova, Marketa; Hampl, Ales

    2017-01-01

    Cell culture methods have been developed in efforts to produce biologically relevant systems for developmental and disease modeling, and appropriate analytical tools are essential. Knowledge of ultrastructural characteristics provides the basis for revealing, in situ, the cellular morphology, cell-cell interactions, organelle distribution, the niches in which cells reside, and much more. The traditional method for 3D visualization of ultrastructural components, serial sectioning using transmission electron microscopy (TEM), is very labor-intensive owing to the demanding preparation of TEM slices and the subsequent image processing of the whole collection. In this chapter, we present serial block-face scanning electron microscopy, together with the complete methodology for spheroid formation, contrasting of cellular compartments, image processing, and 3D visualization. The described technique is effective for detailed morphological analysis of stem cell spheroids, organoids, as well as organotypic cell cultures.

  6. Frozen Stiff: Cartographic Design and Permafrost Mapping

    NASA Astrophysics Data System (ADS)

    Nelson, F. E.; Li, J.; Nyland, K. E.

    2016-12-01

    Maps are the primary vehicle used to communicate geographical relationships. Ironically, interest in the formal practice of cartography, the art and science of geographic visualization, has fallen significantly during a period when the sophistication and availability of GIS software has increased dramatically. Although the number of geographically oriented permafrost studies has increased significantly in recent years, little discussion about competing visualization strategies, map accuracy, and the psychophysical impact of cartographic design is evident in geocryological literature. Failure to use the full potential of the tools and techniques that contemporary cartographic and spatial-analytic theory makes possible affects our ability to effectively and accurately communicate the impacts and hazards associated with thawing permafrost, particularly in the context of global climate change. This presentation examines recent permafrost studies involving primarily small-scale (large area) mapping, and suggests cartographic strategies for rectifying existing problems.

  7. How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?

    NASA Astrophysics Data System (ADS)

    Wachowicz, Monica

    2000-04-01

    This paper proposes the integration of knowledge discovery in databases (KDD), geographic visualization (GVis), and spatio-temporal databases (STDB) as a long-term strategy that will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at integrating these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (a perceptual-cognitive process) and automated information processing (a computer-analytical process).

  8. Visualizing the BEC-BCS crossover in a two-dimensional Fermi gas: Pairing gaps and dynamical response functions from ab initio computations

    NASA Astrophysics Data System (ADS)

    Vitali, Ettore; Shi, Hao; Qin, Mingpu; Zhang, Shiwei

    2017-12-01

    Experiments with ultracold atoms provide a highly controllable laboratory setting with many unique opportunities for precision exploration of quantum many-body phenomena. The nature of such systems, with strong interaction and quantum entanglement, makes reliable theoretical calculations challenging. Especially difficult are excitation and dynamical properties, which are often the most directly relevant to experiment. We carry out exact numerical calculations, by Monte Carlo sampling of imaginary-time propagation of Slater determinants, to compute the pairing gap in the two-dimensional Fermi gas from first principles. Applying state-of-the-art analytic continuation techniques, we obtain the spectral function and the density and spin structure factors providing unique tools to visualize the BEC-BCS crossover. These quantities will allow for a direct comparison with experiments.

  9. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    PubMed

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. First, the chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times to induce changes in water content. By analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is a highly promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
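
    As a hedged illustration of the per-pixel quantification step described above (partial least-squares regression mapping NIR spectra to water content), the following Python sketch uses scikit-learn with invented array names, shapes, and placeholder data; it is not the authors' code or calibration.

```python
# Illustrative sketch only: PLS regression predicting water content per pixel of a
# hyperspectral NIR image. All data, shapes, and the number of latent variables are
# assumptions for demonstration, not values from the study.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Calibration set: spectra of samples with known (reference) water content.
X_cal = np.random.rand(50, 200)      # 50 calibration spectra x 200 wavelengths (placeholder)
y_cal = np.random.rand(50) * 5.0     # reference water content, % w/w (placeholder)

pls = PLSRegression(n_components=5)  # latent variables would be chosen by cross-validation
pls.fit(X_cal, y_cal)

# Chemical image: H x W pixels, each carrying a 200-point NIR spectrum.
cube = np.random.rand(64, 64, 200)   # placeholder hyperspectral cube
pixels = cube.reshape(-1, 200)
water_map = pls.predict(pixels).reshape(64, 64)  # predicted water content per pixel
```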

  10. Counterfeit drugs: analytical techniques for their identification.

    PubMed

    Martino, R; Malet-Martino, M; Gilard, V; Balayssac, S

    2010-09-01

    In recent years, the number of counterfeit drugs has increased dramatically, including not only "lifestyle" products but also vital medicines. Besides the threat to public health, the financial and reputational damage to pharmaceutical companies is substantial. The lack of robust information on the prevalence of fake drugs is an obstacle in the fight against drug counterfeiting. It is generally accepted that approximately 10% of drugs worldwide could be counterfeit, but it is also well known that this number covers very different situations depending on the country, the places where the drugs are purchased, and the definition of what constitutes a counterfeit drug. The chemical analysis of drugs suspected to be fake is a crucial step as counterfeiters are becoming increasingly sophisticated, rendering visual inspection insufficient to distinguish the genuine products from the counterfeit ones. This article critically reviews the recent analytical methods employed to control the quality of drug formulations, using as an example artemisinin derivatives, medicines particularly targeted by counterfeiters. Indeed, a broad panel of techniques have been reported for their analysis, ranging from simple and cheap in-field ones (colorimetry and thin-layer chromatography) to more advanced laboratory methods (mass spectrometry, nuclear magnetic resonance, and vibrational spectroscopies) through chromatographic methods, which remain the most widely used. The conclusion section of the article highlights the questions to be posed before selecting the most appropriate analytical approach.

  11. Four-dimensional modeling of recent vertical movements in the area of the southern California uplift

    USGS Publications Warehouse

    Vanicek, Petr; Elliot, Michael R.; Castle, Robert O.

    1979-01-01

    This paper describes an analytical technique that utilizes scattered geodetic relevelings and tide-gauge records to portray Recent vertical crustal movements that may have been characterized by spasmodic changes in velocity. The technique is based on the fitting of a time-varying algebraic surface of prescribed degree to the geodetic data treated as tilt elements and to tide-gauge readings treated as point movements. Desired variations in time can be selected as any combination of powers of vertical movement velocity and episodic events. The state of the modeled vertical displacement can be shown for any number of dates for visual display. Statistical confidence limits of the modeled displacements, derived from the density of measurements in both space and time, line length, and accuracy of input data, are also provided. The capabilities of the technique are demonstrated on selected data from the region of the southern California uplift. 
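
    To make the surface-fitting idea concrete, here is a much-simplified sketch, assuming point estimates of vertical velocity rather than the tilt elements and tide-gauge movements used by the authors: a degree-2 polynomial velocity surface is fit by ordinary least squares and evaluated on a grid for display. All variable names and data are illustrative.

```python
# Simplified sketch (not the authors' formulation): least-squares fit of a degree-2
# polynomial vertical-velocity surface v(x, y) to scattered point observations.
import numpy as np

x = np.random.rand(30)     # station easting (placeholder)
y = np.random.rand(30)     # station northing (placeholder)
v = np.random.rand(30)     # observed vertical velocity, mm/yr (placeholder)

# Design matrix for v(x, y) = a0 + a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, v, rcond=None)

# Evaluate the fitted surface on a grid; displacement at epoch t would scale as v * t.
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
G = np.column_stack([np.ones(gx.size), gx.ravel(), gy.ravel(),
                     gx.ravel() ** 2, gx.ravel() * gy.ravel(), gy.ravel() ** 2])
v_grid = (G @ coeffs).reshape(gx.shape)
```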

  12. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; it will also support uncertainty visualization, the exploration of data provenance, and machine learning discoveries, and will render diverse types of geospatial data to facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, as well as other custom-built software modules.
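
    As a small, hedged illustration of one of the listed capabilities, the sketch below implements a textbook dynamic time warping distance in Python; it is generic example code with invented series, not part of the platform described above.

```python
# Minimal, textbook O(n*m) dynamic time warping (DTW) distance between two 1-D series.
# The example series are invented placeholders.
import numpy as np

def dtw_distance(a, b):
    """Return the DTW distance between 1-D sequences a and b using absolute cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

water_demand = np.array([1.0, 1.2, 1.5, 1.4, 1.1])
energy_use = np.array([0.9, 1.1, 1.6, 1.3])
print(dtw_distance(water_demand, energy_use))
```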

  13. A Paper-Based Electrochromic Array for Visualized Electrochemical Sensing.

    PubMed

    Zhang, Fengling; Cai, Tianyi; Ma, Liang; Zhan, Liyuan; Liu, Hong

    2017-01-31

    We report a battery-powered, paper-based electrochromic array for visualized electrochemical sensing. The paper-based sensing system consists of six parallel electrochemical cells, which are powered by an aluminum-air battery. Each single electrochemical cell uses a Prussian Blue spot electrodeposited on an indium-doped tin oxide thin film as the electrochromic indicator. Each electrochemical cell is preloaded with increasing amounts of analyte. The sample activates the battery for the sensing. Both the preloaded analyte and the analyte in the sample initiate the color change of Prussian Blue to Prussian White. With a reaction time of 60 s, the number of electrochemical cells with complete color changes is correlated to the concentration of analyte in the sample. As a proof-of-concept analyte, lactic acid was detected semi-quantitatively using the naked eye.
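
    The semi-quantitative readout logic, counting how many cells switch colour completely and reading that count as a concentration bracket, can be sketched as follows; the preload values and switching threshold are invented for illustration and are not taken from the paper.

```python
# Illustrative readout logic for a preloaded electrochromic array. Preload amounts and
# the switching threshold are assumptions chosen only to show the counting principle.
PRELOADED = [0.0, 1.0, 2.0, 4.0, 8.0, 16.0]  # analyte preloaded in each of six cells (arbitrary units)
THRESHOLD = 20.0                              # total analyte needed for a complete colour change (assumed)

def read_array(sample_conc, preloaded=PRELOADED, threshold=THRESHOLD):
    """Return the number of fully switched cells and the implied concentration bracket."""
    switched = sum(1 for pre in preloaded if pre + sample_conc >= threshold)
    ordered = sorted(preloaded, reverse=True)
    if switched == 0:
        return switched, (0.0, threshold - ordered[0])
    lower = threshold - ordered[switched - 1]
    upper = threshold - ordered[switched] if switched < len(preloaded) else float("inf")
    return switched, (lower, upper)

print(read_array(13.0))  # -> (2, (12.0, 16.0)): two cells switch, so 12 <= concentration < 16
```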

  14. Insight solutions are correct more often than analytic solutions

    PubMed Central

    Salvi, Carola; Bricolo, Emanuela; Kounios, John; Bowden, Edward; Beeman, Mark

    2016-01-01

    How accurate are insights compared to analytical solutions? In four experiments, we investigated how participants’ solving strategies influenced their solution accuracies across different types of problems, including one that was linguistic, one that was visual and two that were mixed visual-linguistic. In each experiment, participants’ self-judged insight solutions were, on average, more accurate than their analytic ones. We hypothesised that insight solutions have superior accuracy because they emerge into consciousness in an all-or-nothing fashion when the unconscious solving process is complete, whereas analytic solutions can be guesses based on conscious, prematurely terminated, processing. This hypothesis is supported by the finding that participants’ analytic solutions included relatively more incorrect responses (i.e., errors of commission) than timeouts (i.e., errors of omission) compared to their insight responses. PMID:27667960

  15. Multi-Resolution Climate Ensemble Parameter Analysis with Nested Parallel Coordinates Plots.

    PubMed

    Wang, Junpeng; Liu, Xiaotong; Shen, Han-Wei; Lin, Guang

    2017-01-01

    Due to the uncertain nature of weather prediction, climate simulations are usually performed multiple times with different spatial resolutions. The outputs of simulations are multi-resolution spatial temporal ensembles. Each simulation run uses a unique set of values for multiple convective parameters. Distinct parameter settings from different simulation runs in different resolutions constitute a multi-resolution high-dimensional parameter space. Understanding the correlation between the different convective parameters, and establishing a connection between the parameter settings and the ensemble outputs are crucial to domain scientists. The multi-resolution high-dimensional parameter space, however, presents a unique challenge to the existing correlation visualization techniques. We present Nested Parallel Coordinates Plot (NPCP), a new type of parallel coordinates plots that enables visualization of intra-resolution and inter-resolution parameter correlations. With flexible user control, NPCP integrates superimposition, juxtaposition and explicit encodings in a single view for comparative data visualization and analysis. We develop an integrated visual analytics system to help domain scientists understand the connection between multi-resolution convective parameters and the large spatial temporal ensembles. Our system presents intricate climate ensembles with a comprehensive overview and on-demand geographic details. We demonstrate NPCP, along with the climate ensemble visualization system, based on real-world use-cases from our collaborators in computational and predictive science.
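
    For readers unfamiliar with the base idiom, the sketch below draws an ordinary (non-nested) parallel coordinates plot of convective parameter settings coloured by resolution, using pandas and matplotlib with invented values; the nested, multi-resolution extension described in the paper is not reproduced here.

```python
# Plain parallel-coordinates view of ensemble parameter settings, coloured by resolution.
# Parameter names and values are invented placeholders.
import matplotlib.pyplot as plt
import pandas as pd
from pandas.plotting import parallel_coordinates

runs = pd.DataFrame({
    "entrainment": [0.10, 0.20, 0.15, 0.30],
    "cape_timescale": [1800, 3600, 2400, 1200],
    "precip_efficiency": [0.5, 0.7, 0.6, 0.8],
    "resolution": ["25 km", "25 km", "100 km", "100 km"],  # class column used for colouring
})

parallel_coordinates(runs, class_column="resolution")
plt.title("Ensemble parameter settings by resolution")
plt.show()
```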

  16. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
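
    The standard workflow mentioned above, extracting network measures from a correlation-based climate network, can be sketched briefly as follows; the gridded series, the correlation threshold, and the chosen measures are placeholders, not the authors' data or settings.

```python
# Illustrative sketch: build a threshold-based correlation network from gridded time
# series and extract simple network measures with networkx. All data are placeholders.
import numpy as np
import networkx as nx

series = np.random.rand(50, 120)    # 50 grid points x 120 time steps (placeholder)
corr = np.corrcoef(series)          # pairwise correlations between grid points

G = nx.Graph()
G.add_nodes_from(range(corr.shape[0]))
THRESHOLD = 0.5                     # assumed significance threshold
for i in range(corr.shape[0]):
    for j in range(i + 1, corr.shape[0]):
        if abs(corr[i, j]) >= THRESHOLD:
            G.add_edge(i, j, weight=float(corr[i, j]))

degree = dict(G.degree())                     # degree field over grid points
betweenness = nx.betweenness_centrality(G)    # shortest-path betweenness per node
```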

  17. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  18. Investigations on flexural wave propagation and attenuation in a modified one-dimensional acoustic black hole using a laser excitation technique

    NASA Astrophysics Data System (ADS)

    Ji, Hongli; Luo, Jing; Qiu, Jinhao; Cheng, Li

    2018-05-01

    Acoustic Black Holes (ABHs), as a new type of passive structure for vibration damping enhancement and noise attenuation, have been drawing increasing attention from many researchers. Due to the difficulty of manufacturing the sharp edges required by ABH structures, it is important to understand the wave propagation and attenuation process in the presence of damping layers in non-ideal ABHs with a truncated edge. In this paper, an analytical expression for the wave reflection coefficient in a modified one-dimensional ABH is derived, and a time-domain experimental method based on a laser excitation technique is used to visualize the wave propagation. In the experimental studies, the flexural waves in the ABH were excited by a scanning pulse laser and measured by a Laser Doppler Vibrometer (LDV). The incident and reflected waves were separated from the measured wave field, and the decrease of the wave velocity in the ABH was observed. The reflection coefficient was calculated from the ratio of the amplitude of the reflected wave to that of the incident wave for different ABH parameters and different thicknesses of the damping layer. The measured reflection coefficients were used to identify the unknown coefficients in the theoretical formula. The results confirm that there exists an optimal thickness of the damping layer that leads to minimum wave reflection. Based on the laser-induced visualization technique and various signal processing and feature extraction methods, the entire process of wave propagation in a non-ideal one-dimensional ABH structure can be visualized and scrutinized.
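
    As a small numerical aside, once the incident and reflected flexural waves have been separated, the reflection coefficient is simply the ratio of their amplitudes; the sketch below demonstrates this with synthetic wave packets, which are placeholders rather than measured signals.

```python
# Synthetic illustration of estimating a reflection coefficient as the amplitude ratio
# of separated reflected and incident wave packets. Signal parameters are invented.
import numpy as np

t = np.linspace(0.0, 1e-3, 4096)  # time axis, s
incident = np.sin(2 * np.pi * 20e3 * t) * np.exp(-(((t - 3e-4) / 5e-5) ** 2))
reflected = 0.3 * np.sin(2 * np.pi * 20e3 * t) * np.exp(-(((t - 7e-4) / 5e-5) ** 2))

R = np.abs(reflected).max() / np.abs(incident).max()  # peak-amplitude estimate of |R|
print(f"Estimated reflection coefficient: {R:.2f}")
```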

  19. Methods for the analysis of azo dyes employed in food industry--A review.

    PubMed

    Yamjala, Karthik; Nainar, Meyyanathan Subramania; Ramisetti, Nageswara Rao

    2016-02-01

    A wide variety of azo dyes are generally added to color food products, not only to make them visually appealing but also to reinstate the original appearance lost during processing. However, many countries have banned the use of most azo dyes in food, and their usage is highly regulated in domestic and export food supplies. Regulatory authorities and food analysts adopt highly sensitive and selective analytical methods for monitoring as well as assuring the quality and safety of food products. This manuscript presents a comprehensive review of the various analytical techniques used in the analysis of azo dyes employed in food industries in different parts of the world. A brief description of the use of different extraction methods, such as liquid-liquid, solid-phase, and membrane extraction, is also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Systematic Assessment of the Hemolysis Index: Pros and Cons.

    PubMed

    Lippi, Giuseppe

    2015-01-01

    Preanalytical quality is as important as the analytical and postanalytical quality in laboratory diagnostics. After decades of visual inspection to establish whether or not a diagnostic sample may be suitable for testing, automated assessment of hemolysis index (HI) has now become available in a large number of laboratory analyzers. Although most national and international guidelines support systematic assessment of sample quality via HI, there is widespread perception that this indication has not been thoughtfully acknowledged. Potential explanations include concern of increased specimen rejection rate, poor harmonization of analytical techniques, lack of standardized units of measure, differences in instrument-specific cutoff, negative impact on throughput, organization and laboratory economics, and lack of a reliable quality control system. Many of these concerns have been addressed. Evidence now supports automated HI in improving quality and patient safety. These will be discussed. © 2015 Elsevier Inc. All rights reserved.

  1. The structure of separated flow regions occurring near the leading edge of airfoils - including transition

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Laser Doppler Velocimeter data, static pressure data, and smoke flow visualization data were obtained and analyzed to correlate with separation bubble data. The study focused on the Eppler 387 airfoil at a chord Reynolds number of 100,000 and an angle of attack of 2 deg. Additional data were also obtained from the NACA 663-018 airfoil at a chord Reynolds number of 160,000 and an angle of attack of 12 deg. The structure and behavior of the transition separation bubble were documented, along with the redeveloping boundary layer after reattachment, over an airfoil at low Reynolds numbers. The aim was to better understand these complex flow phenomena so that analytic methods for predicting their formation and development can be improved. These analytic techniques have applications in the design and performance prediction of airfoils operating in the low Reynolds number flight regime.

  2. Generation of electrochemiluminescence at bipolar electrodes: concepts and applications.

    PubMed

    Bouffier, Laurent; Arbault, Stéphane; Kuhn, Alexander; Sojic, Neso

    2016-10-01

    Bipolar electrochemistry (BPE) is an unconventional technique in which a conducting object is addressed electrochemically in an electrolyte without any wired connection to an external power supply. BPE has been known for decades but remained limited to only a couple of niche applications. However, it is now undergoing a true renewal of interest, especially in the context of analytical chemistry. The bipolar electrode exhibits two distinct poles of opposite polarization with respect to the solution, which makes it possible to separate the location of the sensing elements from that of the reporting ones. Also, arrays of bipolar microelectrodes can be addressed simultaneously to perform parallel analyses. Among several reporting strategies, the combination of BPE with electrochemiluminescence (ECL) is the most frequent choice owing to the very simple visual readout provided by ECL. This article reviews the field from the initial reports to the most recent ones, revealing numerous opportunities, including novel analytical strategies for the detection of small molecular analytes and biorelevant molecules such as DNA, RNA, peptides, or other biomarkers. Graphical Abstract: Principle of electrochemiluminescence generation at one extremity of a bipolar electrode.

  3. Experimental investigation of recirculating cells in laminar coaxial jets.

    NASA Technical Reports Server (NTRS)

    Warpinski, N. R.; Nagib, H. M.; Lavan, Z.

    1972-01-01

    Utilizing several unique means of introducing smoke into the flow field for careful visualization, in addition to hot-wire techniques, experiments are performed in a specially designed facility producing laminar flows up to considerably high Reynolds numbers. Characteristics of the cells and the flow conditions that bring them about are documented by smoke photographs in the Reynolds number-velocity ratio plane, and the results are compared with previous analytical predictions. The cells are found to fall into three categories with different flow characteristics, involving unsteadiness in position and shear-layer instabilities that result in higher mixing with the outer streams.

  4. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  5. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  6. Highly selective visual monitoring of hazardous fluoride ion in aqueous media using thiobarbituric-capped gold nanoparticles.

    PubMed

    Boken, Jyoti; Thatai, Sheenam; Khurana, Parul; Prasad, Surendra; Kumar, Dinesh

    2015-01-01

    The rapid, selective and sensitive measurement and monitoring of hazardous materials as analytes are central themes in the development of any successful analytical technique. With this aim, we have synthesized thiobarbituric-capped gold nanoparticles (TBA-capped Au NPs) by chemical reduction of HAuCl4 using 2-thiobarbituric acid (TBA) as a reducing and capping agent. The morphology of the TBA-capped Au NPs was confirmed using transmission electron microscope images. For the first time, this article reports that the developed TBA-capped Au NPs display selective, ultrafast and sensitive colorimetric detection of fluoride ion in aqueous samples. The detection of fluoride ion was confirmed by the disappearance of the localized surface plasmon resonance (LSPR) band at 554 nm using UV-vis spectroscopy. The interaction of F(-) with TBA-capped Au NPs in aqueous solution has also been confirmed by Raman and FTIR spectroscopy. One of the most exciting accomplishments is that the visual detection limit for fluoride ion is 10 mM in the commonly acceptable water pH range of 7-8. The whole detection procedure takes no more than 40 s, with excellent selectivity, providing a sample throughput of more than 60 per hour. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Leveraging multidisciplinarity in a visual analytics graduate course.

    PubMed

    Elmqvist, Niklas; Ebert, David S

    2012-01-01

    Demand is growing in engineering, business, science, research, and industry for students with visual analytics expertise. However, teaching VA is challenging owing to the multidisciplinary nature of the topic, students' diverse backgrounds, and the corresponding requirements for instructors. This article reports best practices from a VA graduate course at Purdue University, where instructors leveraged these challenges to their advantage instead of trying to mitigate them.

  8. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    ERIC Educational Resources Information Center

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  9. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    The number of network cameras has grown rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and its Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g., different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
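
    The paper's claim that a single-frame analysis program needs only slight changes can be illustrated with a hypothetical sketch; the function and helper names below are invented for illustration and are not the system's actual API.

```python
# Hypothetical sketch only: a user-supplied single-frame analysis wrapped in a loop over
# camera URLs. Names are invented; this is not the real API of the described system.
import cv2  # OpenCV, assumed to be installed

def analyze_frame(frame):
    """User's per-frame analysis: here, count strong edges as a crude activity proxy."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    return int((edges > 0).sum())

def run_over_cameras(camera_urls, analyze):
    """Fetch one frame per camera and apply the user's analysis (sequential, for clarity)."""
    results = {}
    for url in camera_urls:
        cap = cv2.VideoCapture(url)
        ok, frame = cap.read()
        cap.release()
        if ok:
            results[url] = analyze(frame)
    return results

# results = run_over_cameras(["http://example.com/cam1.mjpg"], analyze_frame)  # hypothetical URL
```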

  10. In Situ Visualization of Block Copolymer Self‐Assembly in Organic Media by Super‐Resolution Fluorescence Microscopy

    PubMed Central

    Boott, Charlotte E.; Laine, Romain F.; Mahou, Pierre; Finnegan, John R.; Leitao, Erin M.

    2015-01-01

    Analytical methods that enable visualization of nanomaterials derived from solution self‐assembly processes in organic solvents are highly desirable. Herein, we demonstrate the use of stimulated emission depletion microscopy (STED) and single molecule localization microscopy (SMLM) to map living crystallization‐driven block copolymer (BCP) self‐assembly in organic media at the sub‐diffraction scale. Four different dyes were successfully used for single‐colour super‐resolution imaging of the BCP nanostructures allowing micelle length distributions to be determined in situ. Dual‐colour SMLM imaging was used to measure and compare the rate of addition of red fluorescent BCP to the termini of green fluorescent seed micelles to generate block comicelles. Although well‐established for aqueous systems, the results highlight the potential of super‐resolution microscopy techniques for the interrogation of self‐assembly processes in organic media. PMID:26477697

  11. Twelve tips to promote successful development of a learner performance dashboard within a medical education program.

    PubMed

    Boscardin, Christy; Fergus, Kirkpatrick B; Hellevig, Bonnie; Hauer, Karen E

    2017-11-09

    Easily accessible and interpretable performance data constitute critical feedback for learners, facilitating informed self-assessment and learning planning. To provide this feedback, educational dashboards have proliferated in recent years. An educational (learner) dashboard systematically delivers timely and continuous feedback on performance and can provide easily visualized and interpreted performance data. In this paper, we provide practical tips for developing a functional, user-friendly individual learner performance dashboard, together with a literature review of dashboard development, assessment theory, and users' perspectives. Considering key design principles and maximizing current technological advances in data visualization techniques can increase dashboard utility and enhance the user experience. By bridging current technology with assessment strategies that support learning, educators can continue to improve the field of learning analytics and the design of information management tools such as dashboards in support of improved learning outcomes.

  12. New developments of X-ray fluorescence imaging techniques in laboratory

    NASA Astrophysics Data System (ADS)

    Tsuji, Kouichi; Matsuno, Tsuyoshi; Takimoto, Yuki; Yamanashi, Masaki; Kometani, Noritsugu; Sasaki, Yuji C.; Hasegawa, Takeshi; Kato, Shuichi; Yamada, Takashi; Shoji, Takashi; Kawahara, Naoki

    2015-11-01

    X-ray fluorescence (XRF) analysis is a well-established analytical technique with a long research history. Many applications have been reported in various fields, such as the environmental, archeological, biological, and forensic sciences, as well as in industry, because XRF has the unique advantage of being a nondestructive analytical tool with good precision for quantitative analysis. Recent advances in XRF analysis have been realized through the development of new x-ray optics and x-ray detectors. Advanced x-ray focusing optics enable the creation of a micro x-ray beam, leading to micro-XRF analysis and XRF imaging. A confocal micro-XRF technique has been applied for the visualization of elemental distributions inside samples. This technique was applied to liquid samples and to monitoring chemical reactions such as the corrosion of steel samples in NaCl solutions. In addition, principal component analysis was applied to reduce the background intensity in XRF spectra obtained during XRF mapping, leading to improved spatial resolution of confocal micro-XRF images. In parallel, the authors have proposed a wavelength dispersive XRF (WD-XRF) imaging spectrometer for fast elemental imaging. A new two-dimensional x-ray detector, the Pilatus detector, was applied for WD-XRF imaging. Fast XRF imaging in 1 s or even less was demonstrated for Euro coins and industrial samples. This review paper introduces these recent advances in XRF imaging, with an emphasis on the laboratory setting.
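
    One of the steps mentioned above, using principal component analysis to suppress background in mapped XRF spectra, can be sketched generically as reconstructing each spectrum from a few leading components; the data shapes and component count below are assumptions, not the authors' settings.

```python
# Generic PCA-based denoising of a stack of XRF spectra: keep the leading components
# and reconstruct. Shapes and the component count are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA

spectra = np.random.rand(4096, 1024)      # 64 x 64 map pixels by 1024 energy channels (placeholder)

pca = PCA(n_components=8)                 # assume a few components carry the real signal
scores = pca.fit_transform(spectra)
denoised = pca.inverse_transform(scores)  # reduced-background spectra for elemental maps
```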

  13. Visual motherese? Signal-to-noise ratios in toddler-directed television

    PubMed Central

    Wass, Sam V; Smith, Tim J

    2015-01-01

    Younger brains are noisier information processing systems; this means that information for younger individuals has to allow clearer differentiation between those aspects that are required for the processing task in hand (the ‘signal’) and those that are not (the ‘noise’). We compared toddler-directed and adult-directed TV programmes (TotTV/ATV). We examined how low-level visual features (that previous research has suggested influence gaze allocation) relate to semantic information, namely the location of the character speaking in each frame. We show that this relationship differs between TotTV and ATV. First, we conducted receiver operating characteristic (ROC) analyses and found that feature congestion predicted speaking character location in TotTV but not ATV. Second, we used multiple analytical strategies to show that luminance differentials (flicker) predict face location more strongly in TotTV than ATV. Our results suggest that TotTV designers have intuited techniques for controlling toddler attention using low-level visual cues. The implications of these findings for structuring childhood learning experiences away from a screen are discussed. PMID:24702791
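
    The ROC analysis referred to above can be illustrated with a generic sketch that tests whether a low-level feature map discriminates pixels belonging to the speaking character from other pixels; the arrays below are random placeholders, not the study's stimuli.

```python
# Generic ROC example: does a per-pixel feature (e.g. flicker) predict which pixels
# contain the speaking character? Data are random placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

flicker = np.random.rand(10000)            # feature value per pixel (placeholder)
is_speaker = np.random.rand(10000) > 0.9   # True where the speaking character is (placeholder)

auc = roc_auc_score(is_speaker, flicker)   # 0.5 = chance; higher means the feature is predictive
print(f"ROC AUC: {auc:.2f}")
```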

  14. The Flushtration Count Illusion: Attribute substitution tricks our interpretation of a simple visual event sequence.

    PubMed

    Thomas, Cyril; Didierjean, André; Kuhn, Gustav

    2018-04-17

    When faced with a difficult question, people sometimes work out an answer to a related, easier question without realizing that a substitution has taken place (e.g., Kahneman, 2011, Thinking, fast and slow. New York: Farrar, Straus and Giroux). In two experiments, we investigated whether this attribute substitution effect can also affect the interpretation of a simple visual event sequence. We used a magic trick called the 'Flushtration Count Illusion', which involves a technique used by magicians to give the illusion of having seen multiple cards with identical backs, when in fact only the back of one card (the bottom card) is repeatedly shown. In Experiment 1, we demonstrated that most participants are susceptible to the illusion, even if they have the visual and analytical reasoning capacity to correctly process the sequence. In Experiment 2, we demonstrated that participants construct a biased and simplified representation of the Flushtration Count by substituting some attributes of the event sequence. We discuss the psychological processes underlying this attribute substitution effect. © 2018 The British Psychological Society.

  15. Visual motherese? Signal-to-noise ratios in toddler-directed television.

    PubMed

    Wass, Sam V; Smith, Tim J

    2015-01-01

    Younger brains are noisier information processing systems; this means that information for younger individuals has to allow clearer differentiation between those aspects that are required for the processing task in hand (the 'signal') and those that are not (the 'noise'). We compared toddler-directed and adult-directed TV programmes (TotTV/ATV). We examined how low-level visual features (that previous research has suggested influence gaze allocation) relate to semantic information, namely the location of the character speaking in each frame. We show that this relationship differs between TotTV and ATV. First, we conducted Receiver Operating Characteristic (ROC) analyses and found that feature congestion predicted speaking character location in TotTV but not ATV. Second, we used multiple analytical strategies to show that luminance differentials (flicker) predict face location more strongly in TotTV than ATV. Our results suggest that TotTV designers have intuited techniques for controlling toddler attention using low-level visual cues. The implications of these findings for structuring childhood learning experiences away from a screen are discussed. © 2014 The Authors. Developmental Science Published by John Wiley & Sons Ltd.

  16. A Visual Analytics Approach for Station-Based Air Quality Data

    PubMed Central

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-01-01

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover linear and periodic patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt to changes in data size and granularity in the trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision-making support. PMID:28029117

  17. A Visual Analytics Approach for Station-Based Air Quality Data.

    PubMed

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-12-24

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover linear and periodic patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt to changes in data size and granularity in the trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision-making support.

  18. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  19. Heat as a tracer to determine streambed water exchanges

    USGS Publications Warehouse

    Constantz, J.

    2010-01-01

    This work reviews the use of heat as a tracer of shallow groundwater movement and describes current temperature-based approaches for estimating streambed water exchanges. Four common hydrologic conditions in stream channels are graphically depicted with the expected underlying streambed thermal responses, and techniques are discussed for installing and monitoring temperature and stage equipment for a range of hydrological environments. These techniques are divided into direct-measurement techniques in streams and streambeds, groundwater techniques relying on traditional observation wells, and remote sensing and other large-scale advanced temperature-acquisition techniques. A review of relevant literature suggests that researchers often graphically visualize temperature data to enhance conceptual models of heat and water flow in the near-stream environment and to determine site-specific approaches of data analysis. Common visualizations of stream and streambed temperature patterns include thermographs, temperature envelopes, and one-, two-, and three-dimensional temperature contour plots. Heat and water transport governing equations are presented for the case of transport in streambeds, followed by methods of streambed data analysis, including simple heat-pulse arrival time and heat-loss procedures, analytical and time series solutions, and heat and water transport simulation models. A series of applications of these methods are presented for a variety of stream settings ranging from arid to continental climates. Progressive successes in quantifying both streambed fluxes and the spatial extent of streambeds indicate that heat-tracing tools help define the streambed as a spatially distinct field (analogous to soil science), rather than simply the lower boundary in stream research or an amorphous zone beneath the stream channel.
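
    As one concrete reference point for the governing equations mentioned above, the one-dimensional conduction-advection heat transport equation commonly used for vertical flow beneath streambeds can be written in its standard form as follows (the notation here is supplied for illustration and is not quoted from the review):

```latex
% 1-D conduction-advection heat transport beneath a streambed (standard form)
\frac{\partial T}{\partial t}
  = \kappa_e \frac{\partial^2 T}{\partial z^2}
  - \frac{q\,\rho_w c_w}{\rho c}\,\frac{\partial T}{\partial z}
```

    Here T is temperature, t is time, z is depth below the streambed, \kappa_e is the effective thermal diffusivity of the saturated sediment, q is the vertical water flux (positive downward), \rho_w c_w is the volumetric heat capacity of water, and \rho c is that of the sediment-water matrix.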

  20. Model-based ultrasound temperature visualization during and following HIFU exposure.

    PubMed

    Ye, Guoliang; Smith, Penny Probert; Noble, J Alison

    2010-02-01

    This paper describes the application of signal processing techniques to improve the robustness of ultrasound feedback for displaying changes in temperature distribution in treatment using high-intensity focused ultrasound (HIFU), especially at the low signal-to-noise ratios that might be expected in in vivo abdominal treatment. Temperature estimation is based on the local displacements in ultrasound images taken during HIFU treatment, and a method to improve robustness to outliers is introduced. The main contribution of the paper is in the application of a Kalman filter, a statistical signal processing technique, which uses a simple analytical temperature model of heat dispersion to improve the temperature estimation from the ultrasound measurements during and after HIFU exposure. To reduce the sensitivity of the method to previous assumptions on the material homogeneity and signal-to-noise ratio, an adaptive form is introduced. The method is illustrated using data from HIFU exposure of ex vivo bovine liver. A particular advantage of the stability it introduces is that the temperature can be visualized not only in the intervals between HIFU exposure but also, for some configurations, during the exposure itself. 2010 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
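
    The general idea of combining a simple heat-dissipation model with noisy displacement-derived measurements can be illustrated with a scalar Kalman filter. The sketch below is an assumption-laden illustration, not the paper's adaptive formulation: the relaxation constant, noise variances, and synthetic data are hypothetical.

```python
# Minimal scalar Kalman filter: smooth noisy temperature estimates for one location
# using a simple relax-toward-ambient prediction model. All constants are illustrative.
import numpy as np

def kalman_smooth(measurements, ambient=37.0, relax=0.95,
                  process_var=0.05, meas_var=4.0):
    """Filter a sequence of noisy temperature estimates for one location."""
    x = measurements[0]          # state: current temperature estimate
    p = meas_var                 # state variance
    out = []
    for z in measurements:
        # Predict: first-order relaxation toward ambient temperature
        x = ambient + relax * (x - ambient)
        p = relax**2 * p + process_var
        # Update with the displacement-derived measurement z
        k = p / (p + meas_var)   # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

# Synthetic example: a cooling curve after HIFU exposure, corrupted by noise
rng = np.random.default_rng(2)
true = 37.0 + 20.0 * np.exp(-0.05 * np.arange(100))
noisy = true + rng.normal(0, 2, size=100)
smoothed = kalman_smooth(noisy)
```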

  1. KNMI DataLab experiences in serving data-driven innovations

    NASA Astrophysics Data System (ADS)

    Noteboom, Jan Willem; Sluiter, Raymond

    2016-04-01

    Climate change research and innovations in weather forecasting rely more and more on (Big) data. Besides increasing data from traditional sources (such as observation networks, radars and satellites), the use of open data, crowd-sourced data and the Internet of Things (IoT) is emerging. To deploy these sources of data optimally in our services and products, KNMI has established a DataLab to serve data-driven innovations in collaboration with public and private sector partners. Big data management, data integration, data analytics including machine learning and data visualization techniques are playing an important role in the DataLab. Cross-domain data-driven innovations that arise from public-private collaborative projects and research programmes can be explored, experimented with, and/or piloted by the KNMI DataLab. Furthermore, advice can be requested on (Big) data techniques and data sources. In support of collaborative (Big) data science activities, scalable environments are offered with facilities for data integration, data analysis and visualization. In addition, Data Science expertise is provided directly or from a pool of internal and external experts. At the EGU conference, experiences gained and best practices in operating the KNMI DataLab to optimally serve data-driven innovations for weather and climate applications are presented.

  2. Toward a Shared Vocabulary for Visual Analysis: An Analytic Toolkit for Deconstructing the Visual Design of Graphic Novels

    ERIC Educational Resources Information Center

    Connors, Sean P.

    2012-01-01

    Literacy educators might advocate using graphic novels to develop students' visual literacy skills, but teachers who lack a vocabulary for engaging in close analysis of visual texts may be reluctant to teach them. Recognizing this, teacher educators should equip preservice teachers with a vocabulary for analyzing visual texts. This article…

  3. Application of Andrew's Plots to Visualization of Multidimensional Data

    ERIC Educational Resources Information Center

    Grinshpun, Vadim

    2016-01-01

    Importance: The article addresses the visual representation of big data, which is increasingly in demand in many scientific and real-life applications, and analyzes the particulars of visualizing multi-dimensional data, giving examples of related visual analytics problems. Objectives: The purpose of this paper is to study application…
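
    For readers unfamiliar with the technique, the sketch below shows a minimal Andrews plot built with the pandas helper for Andrews curves; each multi-dimensional observation becomes a single curve, and observations from the same class trace similar curves. The synthetic data and class labels are illustrative assumptions.

```python
# Minimal Andrews plot of synthetic 4-dimensional data with three labeled clusters.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import andrews_curves

rng = np.random.default_rng(6)
frame = pd.DataFrame(rng.normal(size=(60, 4)), columns=["x1", "x2", "x3", "x4"])
frame["cluster"] = np.repeat(["A", "B", "C"], 20)          # class column used for coloring
frame.loc[frame["cluster"] == "B", ["x1", "x2"]] += 2.0    # make one cluster separable

# Each row becomes the curve f(t) = x1/sqrt(2) + x2*sin(t) + x3*cos(t) + x4*sin(2t).
andrews_curves(frame, class_column="cluster")
plt.show()
```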

  4. GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration

    PubMed Central

    Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng

    2015-01-01

    The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315

  5. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace concentrations of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review are instrumental and non-instrumental techniques, namely gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), yeast estrogen screen (YES) assay, and human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    NASA Astrophysics Data System (ADS)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying Video Analytics in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to address these challenges. A three-phase approach to enable VA deployment within an operational agency is presented and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house-built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which automatically detects a "Visual Event", and EventBrowser, which displays and allows perusal of the "Visual Details" captured at the "Visual Event". To deal with open-architecture as well as closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  7. A Strategy for Uncertainty Visualization Design

    DTIC Science & Technology

    2009-10-01

  8. Urban Space Explorer: A Visual Analytics System for Urban Planning.

    PubMed

    Karduni, Alireza; Cho, Isaac; Wessel, Ginette; Ribarsky, William; Sauda, Eric; Dou, Wenwen

    2017-01-01

    Understanding people's behavior is fundamental to many planning professions (including transportation, community development, economic development, and urban design) that rely on data about frequently traveled routes, places, and social and cultural practices. Based on the results of a practitioner survey, the authors designed Urban Space Explorer, a visual analytics system that utilizes mobile social media to enable interactive exploration of public-space-related activity along spatial, temporal, and semantic dimensions.

  9. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Beaver, Justin M; BogenII, Paul L.

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  10. Visual analytics as a translational cognitive science.

    PubMed

    Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard

    2011-07-01

    Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.

  11. Visual Analytics for Pattern Discovery in Home Care

    PubMed Central

    Monsen, Karen A.; Bae, Sung-Heui; Zhang, Wenhui

    2016-01-01

    Background: Visualization can reduce the cognitive load of information, allowing users to easily interpret and assess large amounts of data. The purpose of our study was to examine home health data using visual analysis techniques to discover clinically salient associations between patient characteristics and problem-oriented health outcomes of older adult home health patients during the home health service period. Methods: Knowledge, Behavior, and Status ratings at discharge, as well as change from admission to discharge, coded using the Omaha System, were collected from a dataset of 988 de-identified patient records from 15 home health agencies. SPSS Visualization Designer v1.0 was used to visually analyze patterns between independent and outcome variables using heat maps and histograms. Visualizations suggesting clinical salience were tested for significance using correlation analysis. Results: The mean age of the patients was 80 years, with the majority female (66%). Of the 150 visualizations, 69 potentially meaningful patterns were statistically evaluated through bivariate associations, revealing 21 significant associations. Further, 14 associations of episode length and the Charlson co-morbidity index, mainly with urinary-related diagnoses and problems, remained significant after adjustment analyses. Through visual analysis, the adverse association of longer home health episode length and higher Charlson co-morbidity index with behavior or status outcomes for patients with impaired urinary function was revealed. Conclusions: We have demonstrated the use of visual analysis to discover novel patterns that describe high-needs subgroups among the older home health patient population. The effective presentation of these data patterns can allow clinicians to identify areas of patient improvement, and time periods that are most effective for implementing home health interventions to improve patient outcomes. PMID:27466053

  12. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
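
    The general flow of event encoding followed by motif counting can be illustrated with a deliberately simplified sketch: each multivariate sample is mapped to a discrete symbol, and recurring fixed-length subsequences are counted. The encoding, window length, and thresholds below are illustrative assumptions, not the authors' algorithm.

```python
# Minimal sketch: encode a multivariate time series as events, then count frequent motifs.
import numpy as np
from collections import Counter

def encode_events(series, n_bins=4):
    """Map each multivariate sample to a single symbol by binning its mean value."""
    values = series.mean(axis=1)
    edges = np.quantile(values, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(values, edges)              # array of event symbols

def frequent_motifs(events, length=3, min_count=5):
    """Count all length-`length` subsequences and keep the frequent ones."""
    windows = [tuple(events[i:i + length]) for i in range(len(events) - length + 1)]
    counts = Counter(windows)
    return {motif: c for motif, c in counts.items() if c >= min_count}

rng = np.random.default_rng(3)
data = rng.normal(size=(1000, 5))                  # 1000 time steps, 5 variables
motifs = frequent_motifs(encode_events(data), length=3, min_count=10)
```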

  13. TrajGraph: A Graph-Based Visual Analytics Approach to Studying Urban Network Centralities Using Taxi Trajectory Data.

    PubMed

    Huang, Xiaoke; Zhao, Ye; Yang, Jing; Zhang, Chong; Ma, Chao; Ye, Xinyue

    2016-01-01

    We propose TrajGraph, a new visual analytics method, for studying urban mobility patterns by integrating graph modeling and visual analysis with taxi trajectory data. A special graph is created to store and manifest real traffic information recorded by taxi trajectories over city streets. It conveys urban transportation dynamics which can be discovered by applying graph analysis algorithms. To support interactive, multiscale visual analytics, a graph partitioning algorithm is applied to create region-level graphs which have smaller size than the original street-level graph. Graph centralities, including Pagerank and betweenness, are computed to characterize the time-varying importance of different urban regions. The centralities are visualized by three coordinated views including a node-link graph view, a map view and a temporal information view. Users can interactively examine the importance of streets to discover and assess city traffic patterns. We have implemented a fully working prototype of this approach and evaluated it using massive taxi trajectories of Shenzhen, China. TrajGraph's capability in revealing the importance of city streets was evaluated by comparing the calculated centralities with the subjective evaluations from a group of drivers in Shenzhen. Feedback from a domain expert was collected. The effectiveness of the visual interface was evaluated through a formal user study. We also present several examples and a case study to demonstrate the usefulness of TrajGraph in urban transportation analysis.
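
    A hedged sketch of the computational core described above is given below: coarsen a street-level graph into a region-level graph, then compute PageRank and betweenness centralities on the regions. It uses a networkx community partition as a stand-in for the paper's graph partitioning algorithm, and a toy graph in place of real taxi-trajectory data.

```python
# Minimal sketch: region-level graph centralities in the spirit of the described approach.
import networkx as nx

streets = nx.karate_club_graph()   # stand-in for a street-level graph built from trajectories

# Coarsen to a "region-level" graph using a community partition (illustrative choice).
communities = nx.algorithms.community.greedy_modularity_communities(streets)
region_of = {node: i for i, comm in enumerate(communities) for node in comm}

regions = nx.Graph()
for u, v in streets.edges():
    ru, rv = region_of[u], region_of[v]
    if ru != rv:
        w = regions[ru][rv]["weight"] + 1 if regions.has_edge(ru, rv) else 1
        regions.add_edge(ru, rv, weight=w)

pagerank = nx.pagerank(regions, weight="weight")    # time-varying in the real system
betweenness = nx.betweenness_centrality(regions)    # importance of regions as connectors
```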

  14. Fusion Analytics: A Data Integration System for Public Health and Medical Disaster Response Decision Support

    PubMed Central

    Passman, Dina B.

    2013-01-01

    Objective: The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction: Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods: Fusion Analytics serves as a one-stop-shop for the web-based data visualizations of multiple real-time data sources within ASPR. The 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions: We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending to support leveraging this data for decision support with robust analytics and visualizations. Fusion Analytics provides an opportunity for attendees to see how various types of data are integrated into a single application for population health decision support. It also can provide them with ideas of how they can use their own staff to create analyses and reports that support their public health activities.

  15. Visualization of the Mode Shapes of Pressure Oscillation in a Cylindrical Cavity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Xin; Qi, Yunliang; Wang, Zhi

    Our work describes a novel experimental method to visualize the mode shapes of pressure oscillation in a cylindrical cavity. Acoustic resonance in a cavity is a grand old problem that has been under investigation (using both analytical and numerical methods) for more than a century. In this article, a novel method based on high-speed imaging of combustion chemiluminescence was presented to visualize the mode shapes of pressure oscillation in a cylindrical cavity. By generating high-temperature combustion gases and strong pressure waves simultaneously in a cylindrical cavity, the pressure oscillation can be inferred from the chemiluminescence emissions of the combustion products. We then visualized the mode shapes by reconstructing the images based on the amplitudes of the luminosity spectrum at the corresponding resonant frequencies. Up to 11 resonant mode shapes were clearly visualized, each matching very well with the analytical solutions.

  16. Combining computational analyses and interactive visualization for document exploration and sensemaking in jigsaw.

    PubMed

    Görg, Carsten; Liu, Zhicheng; Kihm, Jaeyeon; Choo, Jaegul; Park, Haesun; Stasko, John

    2013-10-01

    Investigators across many disciplines and organizations must sift through large collections of text documents to understand and piece together information. Whether they are fighting crime, curing diseases, deciding what car to buy, or researching a new field, inevitably investigators will encounter text documents. Taking a visual analytics approach, we integrate multiple text analysis algorithms with a suite of interactive visualizations to provide a flexible and powerful environment that allows analysts to explore collections of documents while sensemaking. Our particular focus is on the process of integrating automated analyses with interactive visualizations in a smooth and fluid manner. We illustrate this integration through two example scenarios: an academic researcher examining InfoVis and VAST conference papers and a consumer exploring car reviews while pondering a purchase decision. Finally, we provide lessons learned toward the design and implementation of visual analytics systems for document exploration and understanding.

  17. Integration and Visualization of Translational Medicine Data for Better Understanding of Human Diseases

    PubMed Central

    Satagopam, Venkata; Gu, Wei; Eifes, Serge; Gawron, Piotr; Ostaszewski, Marek; Gebel, Stephan; Barbosa-Silva, Adriano; Balling, Rudi; Schneider, Reinhard

    2016-01-01

    Translational medicine is a domain turning results of basic life science research into new tools and methods in a clinical environment, for example, as new diagnostics or therapies. Nowadays, the process of translation is supported by large amounts of heterogeneous data ranging from medical data to a whole range of -omics data. It is not only a great opportunity but also a great challenge, as translational medicine big data is difficult to integrate and analyze, and requires the involvement of biomedical experts for data processing. We show here that visualization and interoperable workflows, combining multiple complex steps, can address at least parts of the challenge. In this article, we present an integrated workflow for exploration, analysis, and interpretation of translational medicine data in the context of human health. Three Web services—tranSMART, a Galaxy Server, and a MINERVA platform—are combined into one big data pipeline. Native visualization capabilities enable the biomedical experts to get a comprehensive overview and control over separate steps of the workflow. The capabilities of tranSMART enable a flexible filtering of multidimensional integrated data sets to create subsets suitable for downstream processing. A Galaxy Server offers visually aided construction of analytical pipelines, with the use of existing or custom components. A MINERVA platform supports the exploration of health and disease-related mechanisms in a contextualized analytical visualization system. We demonstrate the utility of our workflow by illustrating its subsequent steps using an existing data set, for which we propose a filtering scheme, an analytical pipeline, and a corresponding visualization of analytical results. The workflow is available as a sandbox environment, where readers can work with the described setup themselves. Overall, our work shows how visualization and interfacing of big data processing services facilitate exploration, analysis, and interpretation of translational medicine data. PMID:27441714

  18. Modeling and evaluating user behavior in exploratory visual analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
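
    The Markov chain formulation described above can be made concrete with a small sketch that estimates a transition matrix from a coded sequence of user states. The state names and the example session are illustrative assumptions, not data from the paper.

```python
# Minimal sketch: estimate transition probabilities between coded analysis states.
import numpy as np

states = ["mental", "interaction", "computational"]
index = {s: i for i, s in enumerate(states)}

# Hypothetical coded session, e.g. derived from log files and verbal transcripts.
session = ["mental", "interaction", "computational", "mental",
           "interaction", "interaction", "computational", "mental"]

counts = np.zeros((len(states), len(states)))
for current, nxt in zip(session, session[1:]):
    counts[index[current], index[nxt]] += 1

# Row-normalize to obtain P(next state | current state).
transition = counts / counts.sum(axis=1, keepdims=True)
print(transition)
```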

  19. Visualizing Qualitative Information

    ERIC Educational Resources Information Center

    Slone, Debra J.

    2009-01-01

    The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…

  20. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Analytical Judgment Using Visualizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    Scientists working in a particular domain often adhere to conventional data analysis and presentation methods and this leads to familiarity with these methods over time. But does high familiarity always lead to better analytical judgment? This question is especially relevant when visualizations are used in scientific tasks, as there can be discrepancies between visualization best practices and domain conventions. However, there is little empirical evidence of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their effect on scientific judgment. To address this gap and to study these factors, we focus on the climate science domain, specifically on visualizations used for comparison of model performance. We present a comprehensive user study with 47 climate scientists where we explored the following factors: i) relationships between scientists’ familiarity, their perceived levels of comfort, confidence, accuracy, and objective measures of accuracy, and ii) relationships among domain experience, visualization familiarity, and post-study preference.

  1. Thinking graphically: Connecting vision and cognition during graph comprehension.

    PubMed

    Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A

    2008-03-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved

  2. Electroencephalographic monitoring of complex mental tasks

    NASA Technical Reports Server (NTRS)

    Guisado, Raul; Montgomery, Richard; Montgomery, Leslie; Hickey, Chris

    1992-01-01

    Outlined here is the development of neurophysiological procedures to monitor operators during the performance of cognitive tasks. Our approach included the use of electroencephalographic (EEG) and rheoencephalographic (REG) techniques to determine changes in cortical function associated with cognition in the operator's state. A two-channel tetrapolar REG, a single-channel forearm impedance plethysmograph, a Lead I electrocardiogram (ECG), and a 21-channel EEG were used to measure subject responses to various visual-motor cognitive tasks. Testing, analytical, and display procedures for EEG and REG monitoring were developed that extend the state of the art and provide a valuable tool for the study of cerebral circulatory and neural activity during cognition.

  3. Forensic applications of chemical imaging: latent fingerprint detection using visible absorption and luminescence.

    PubMed

    Exline, David L; Wallace, Christie; Roux, Claude; Lennard, Chris; Nelson, Matthew P; Treado, Patrick J

    2003-09-01

    Chemical imaging technology is a rapid examination technique that combines molecular spectroscopy and digital imaging, providing information on morphology, composition, structure, and concentration of a material. Among many other applications, chemical imaging offers an array of novel analytical testing methods, which limits sample preparation and provides high-quality imaging data essential in the detection of latent fingerprints. Luminescence chemical imaging and visible absorbance chemical imaging have been successfully applied to ninhydrin, DFO, cyanoacrylate, and luminescent dye-treated latent fingerprints, demonstrating the potential of this technology to aid forensic investigations. In addition, visible absorption chemical imaging has been applied successfully to visualize untreated latent fingerprints.

  4. How I Learned to Swim: The Visual Journal as a Companion to Creative Inquiry

    ERIC Educational Resources Information Center

    Scott Shields, Sara

    2016-01-01

    In this paper, I discuss my engagement with a visual journal as a companion to creative research practice during my dissertation research. Grounded in arts-based research methodologies, I explore visual journals in relation to research, reflection, and analytic processes. I begin with a discussion of the visual journal as an artifact of…

  5. An Interactive Visual Analytics Framework for Multi-Field Data in a Geo-Spatial Context

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhiyuan; Tong, Xiaonan; McDonnell, Kevin T.

    2013-04-01

    Climate research produces a wealth of multivariate data. These data often have a geospatial reference and so it is of interest to show them within their geospatial context. One can consider this configuration as a multi-field visualization problem, where the geospace provides the expanse of the field. However, there is a limit on the amount of multivariate information that can be fit within a certain spatial location, and the use of linked multivariate information displays has previously been devised to bridge this gap. In this paper we focus on the interactions in the geographical display, present an implementation that uses Google Earth, and demonstrate it within a tightly linked parallel coordinates display. Several other visual representations, such as pie and bar charts, are integrated into the Google Earth display and can be interactively manipulated. Further, we also demonstrate new brushing and visualization techniques for parallel coordinates, such as fixed-window brushing and correlation-enhanced display. We conceived our system with a team of climate researchers, who already made a few important discoveries using it. This demonstrates our system’s great potential to enable scientific discoveries, possibly also in other domains where data have a geospatial reference.

  6. High-Throughput Biophysical Analysis and Data Visualization of Conformational Stability of an IgG1 Monoclonal Antibody (mAb) After Deglycosylation

    PubMed Central

    Alsenaidy, Mohammad A.; Kim, Jae Hyun; Majumdar, Ranajoy; Weis, David D.; Joshi, Sangeeta B.; Tolbert, Thomas J.; Middaugh, C. Russell; Volkin, David B.

    2013-01-01

    The structural integrity and conformational stability of an IgG1 monoclonal antibody (mAb), after partial and complete enzymatic removal of the N-linked Fc glycan, was compared to the untreated mAb over a wide range of temperature (10° to 90°C) and solution pH (3 to 8) using circular dichroism, fluorescence spectroscopy, and static light scattering combined with data visualization employing empirical phase diagrams (EPDs). Subtle to larger stability differences between the different glycoforms were observed. Improved detection of physical stability differences was then demonstrated over narrower pH range (4.0-6.0) using smaller temperature increments, especially when combined with an alternative data visualization method (radar plots). Differential scanning calorimetry and differential scanning fluorimetry were then utilized and also showed an improved ability to detect differences in mAb glycoform physical stability. Based on these results, a two-step methodology was used in which mAb glycoform conformational stability is first screened with a wide variety of instruments and environmental stresses, followed by a second evaluation with optimally sensitive experimental conditions, analytical techniques and data visualization methods. With this approach, high-throughput biophysical analysis to assess relatively subtle conformational stability differences in protein glycoforms is demonstrated. PMID:24114789

  7. Penetrating the Fog: Analytics in Learning and Education

    ERIC Educational Resources Information Center

    Siemens, George; Long, Phil

    2011-01-01

    Attempts to imagine the future of education often emphasize new technologies--ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that people cannot actually touch or see: "big data and analytics." Learning analytics is still in…

  8. Be the Data: Embodied Visual Analytics

    ERIC Educational Resources Information Center

    Chen, Xin; Self, Jessica Zeitz; House, Leanna; Wenskovitch, John; Sun, Maoyuan; Wycoff, Nathan; Evia, Jane Robertson; Leman, Scotland; North, Chris

    2018-01-01

    With the rise of big data, it is becoming increasingly important to educate groups of students at many educational levels about data analytics. In particular, students without a strong mathematical background may have an unenthusiastic attitude towards high-dimensional data and find it challenging to understand relevant complex analytical methods,…

  9. VIGOR: Interactive Visual Exploration of Graph Query Results.

    PubMed

    Pienta, Robert; Hohman, Fred; Endert, Alex; Tamersoy, Acar; Roundy, Kevin; Gates, Chris; Navathe, Shamkant; Chau, Duen Horng

    2018-01-01

    Finding patterns in graphs has become a vital challenge in many domains, from biological systems and network security to finance (e.g., finding money laundering rings of bankers and business owners). While there is significant interest in graph databases and querying techniques, less research has focused on helping analysts make sense of underlying patterns within a group of subgraph results. Visualizing graph query results is challenging, requiring effective summarization of a large number of subgraphs, each having potentially shared node-values, rich node features, and flexible structure across queries. We present VIGOR, a novel interactive visual analytics system, for exploring and making sense of query results. VIGOR uses multiple coordinated views, leveraging different data representations and organizations to streamline analysts' sensemaking process. VIGOR contributes: (1) an exemplar-based interaction technique, where an analyst starts with a specific result and relaxes constraints to find other similar results or starts with only the structure (i.e., without node value constraints), and adds constraints to narrow in on specific results; and (2) a novel feature-aware subgraph result summarization. Through a collaboration with Symantec, we demonstrate how VIGOR helps tackle real-world problems through the discovery of security blindspots in a cybersecurity dataset with over 11,000 incidents. We also evaluate VIGOR with a within-subjects study, demonstrating VIGOR's ease of use over a leading graph database management system, and its ability to help analysts understand their results at higher speed and make fewer errors.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kris A.; Scholtz, Jean; Whiting, Mark A.

    The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions and submissions. This paper details the various types of evaluations that may be conducted using the Repository information. In this paper we describe how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.

  11. The Top 10 Challenges in Extreme-Scale Visual Analytics

    PubMed Central

    Wong, Pak Chung; Shen, Han-Wei; Johnson, Christopher R.; Chen, Chaomei; Ross, Robert B.

    2013-01-01

    In this issue of CG&A, researchers share their R&D findings and results on applying visual analytics (VA) to extreme-scale data. Having surveyed these articles and other R&D in this field, we’ve identified what we consider the top challenges of extreme-scale VA. To cater to the magazine’s diverse readership, our discussion evaluates challenges in all areas of the field, including algorithms, hardware, software, engineering, and social issues. PMID:24489426

  12. Imaging of oxygen and hypoxia in cell and tissue samples.

    PubMed

    Papkovsky, Dmitri B; Dmitriev, Ruslan I

    2018-05-14

    Molecular oxygen (O2) is a key player in cell mitochondrial function, redox balance and oxidative stress, normal tissue function and many common disease states. Various chemical, physical and biological methods have been proposed for measurement, real-time monitoring and imaging of O2 concentration, the state of decreased O2 (hypoxia) and related parameters in cells and tissue. Here, we review the established and emerging optical microscopy techniques that allow visualization of O2 levels in cells and tissue samples, mostly under in vitro and ex vivo settings, but also in vivo. Particular examples include fluorescent hypoxia stains, fluorescent protein reporter systems, phosphorescent probes and nanosensors of different types. These techniques allow high-resolution mapping of O2 gradients in live or post-mortem tissue, in 2D or 3D, qualitatively or quantitatively. They enable control and monitoring of oxygenation conditions and their correlation with other biomarkers of cell and tissue function. Comparison of these techniques and corresponding imaging setups, their analytical capabilities and typical applications are given.

  13. Word aligned bitmap compression method, data structure, and apparatus

    DOEpatents

    Wu, Kesheng; Shoshani, Arie; Otoo, Ekow

    2004-12-14

    The Word-Aligned Hybrid (WAH) bitmap compression method and data structure is a relatively efficient method for searching and performing logical, counting, and pattern location operations upon large datasets. The technique is comprised of a data structure and methods that are optimized for computational efficiency by using the WAH compression method, which typically takes advantage of the target computing system's native word length. WAH is particularly apropos to infrequently varying databases, including those found in the on-line analytical processing (OLAP) industry, due to the increased computational efficiency of the WAH compressed bitmap index. Some commercial database products already include some version of a bitmap index, which could possibly be replaced by the WAH bitmap compression techniques for potentially increased operation speed, as well as increased efficiencies in constructing compressed bitmaps. Combined together, this technique may be particularly useful for real-time business intelligence. Additional WAH applications may include scientific modeling, such as climate and combustion simulations, to minimize search time for analysis and subsequent data visualization.
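
    A simplified sketch of the word-aligned idea is shown below for 32-bit words: each output word is either a literal holding 31 raw bits (top bit 0) or a fill word (top bit 1, next bit the fill value, low 30 bits the number of consecutive all-zero or all-one 31-bit groups). This is an illustrative reading of the scheme, not the patented implementation.

```python
# Simplified WAH-style bitmap compression sketch (32-bit words, 31 bits per group).
def wah_compress(bits):
    """bits: list of 0/1 values; returns a list of 32-bit compressed words."""
    groups = [bits[i:i + 31] for i in range(0, len(bits), 31)]
    if groups and len(groups[-1]) < 31:
        groups[-1] = groups[-1] + [0] * (31 - len(groups[-1]))   # pad the final group

    words, run_val, run_len = [], None, 0

    def flush_run():
        nonlocal run_len
        if run_len:
            words.append((1 << 31) | (run_val << 30) | run_len)  # fill word
            run_len = 0

    for g in groups:
        if all(b == g[0] for b in g):          # uniform group: extend or start a run
            if run_len and run_val != g[0]:
                flush_run()
            run_val, run_len = g[0], run_len + 1
        else:                                  # mixed group: emit a literal word
            flush_run()
            literal = 0
            for b in g:
                literal = (literal << 1) | b   # top bit stays 0 for literals
            words.append(literal)
    flush_run()
    return words

compressed = wah_compress([0] * 310 + [1, 0, 1] + [1] * 62)
```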

  14. Comprehensive Analysis of LC/MS Data Using Pseudocolor Plots

    NASA Astrophysics Data System (ADS)

    Crutchfield, Christopher A.; Olson, Matthew T.; Gourgari, Evgenia; Nesterova, Maria; Stratakis, Constantine A.; Yergey, Alfred L.

    2013-02-01

    We have developed new applications of the pseudocolor plot for the analysis of LC/MS data. These applications include spectral averaging, analysis of variance, differential comparison of spectra, and qualitative filtering by compound class. These applications have been motivated by the need to better understand LC/MS data generated from analysis of human biofluids. The examples presented use data generated to profile steroid hormones in urine extracts from a Cushing's disease patient relative to a healthy control, but are general to any discovery-based scanning mass spectrometry technique. In addition to new visualization techniques, we introduce a new metric of variance: the relative maximum difference from the mean. We also introduce the concept of substructure-dependent analysis of steroid hormones using precursor ion scans. These new analytical techniques provide an alternative approach to traditional untargeted metabolomics workflow. We present an approach to discovery using MS that essentially eliminates alignment or preprocessing of spectra. Moreover, we demonstrate the concept that untargeted metabolomics can be achieved using low mass resolution instrumentation.
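
    One plausible reading of the "relative maximum difference from the mean" metric named above is sketched here: for each m/z-retention-time bin, take the largest absolute deviation of any replicate from the bin mean, divided by that mean. The exact definition in the paper may differ, and the array shapes are illustrative.

```python
# Hedged sketch of a "relative maximum difference from the mean" variance metric.
import numpy as np

def relative_max_diff_from_mean(intensities, axis=0, eps=1e-12):
    """intensities: replicate runs stacked along `axis` (e.g., runs x bins)."""
    mean = intensities.mean(axis=axis)
    max_abs_diff = np.abs(intensities - np.expand_dims(mean, axis)).max(axis=axis)
    return max_abs_diff / (mean + eps)

rng = np.random.default_rng(4)
runs = rng.gamma(shape=2.0, scale=100.0, size=(6, 500))   # 6 runs x 500 intensity bins
metric = relative_max_diff_from_mean(runs)
```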

  15. Enabling fluorescent biosensors for the forensic identification of body fluids.

    PubMed

    Frascione, Nunzianda; Gooch, James; Daniel, Barbara

    2013-11-12

    The search for body fluids often forms a crucial element of many forensic investigations. Confirming fluid presence at a scene can not only support or refute the circumstantial claims of a victim, suspect or witness, but may additionally provide a valuable source of DNA for further identification purposes. However, current biological fluid testing techniques are impaired by a number of well-characterised limitations; they often give false positives, cannot be used simultaneously, are sample destructive and lack the ability to visually locate fluid depositions. These disadvantages can negatively affect the outcome of a case through missed or misinterpreted evidence. Biosensors are devices able to transduce a biological recognition event into a measurable signal, resulting in real-time analyte detection. The use of innovative optical sensing technology may enable the highly specific and non-destructive detection of biological fluid depositions through interaction with several fluid-endogenous biomarkers. Despite considerable impact in a variety of analytical disciplines, biosensor application within forensic analyses may be considered extremely limited. This article aims to explore a number of prospective biosensing mechanisms and to outline the challenges associated with their adaptation towards detection of fluid-specific analytes.

  16. Enhancement of PET Images

    NASA Astrophysics Data System (ADS)

    Davis, Paul B.; Abidi, Mongi A.

    1989-05-01

    PET is the only imaging modality that provides doctors with early analytic and quantitative biochemical assessment and precise localization of pathology. In PET images, boundary information as well as local pixel intensity are both crucial for manual and/or automated feature tracing, extraction, and identification. Unfortunately, the present PET technology does not provide the necessary image quality from which such precise analytic and quantitative measurements can be made. PET images suffer from significantly high levels of radial noise present in the form of streaks caused by the inexactness of the models used in image reconstruction. In this paper, our objective is to model PET noise and remove it without altering dominant features in the image. The ultimate goal here is to enhance these dominant features to allow for automatic computer interpretation and classification of PET images by developing techniques that take into consideration PET signal characteristics, data collection, and data reconstruction. We have modeled the noise streaks in PET images in both rectangular and polar representations and have shown both analytically and through computer simulation that it exhibits consistent mapping patterns. A class of filters was designed and applied successfully. Visual inspection of the filtered images shows clear enhancement over the original images.

  17. Baselining PMU Data to Find Patterns and Anomalies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amidan, Brett G.; Follum, James D.; Freeman, Kimberly A.

    This paper looks at the application of situational awareness methodologies with respect to power grid data. These methodologies establish baselines that look for typical patterns and atypical behavior in the data. The objectives of the baselining analyses are to provide: real-time analytics, the capability to look at historical trends and events, and reliable predictions of the near-future state of the grid. Multivariate algorithms were created to establish normal baseline behavior and then score each moment in time according to its variance from the baseline. Detailed multivariate analytical techniques are described in this paper that produced ways to identify typical patterns and atypical behavior. In this case, atypical behavior is behavior that is unenvisioned. Visualizations were also produced to help explain the behavior that was identified mathematically. Examples are shown to help describe how to read and interpret the analyses and visualizations. Preliminary work has been performed on PMU data sets from BPA (Bonneville Power Administration) and EI (Eastern Interconnect). Actual results are not fully shown here because of confidentiality issues. Comparisons between atypical events found mathematically and actual events showed that many of the actual events are also atypical events; however, there are many atypical events that do not correlate to any actual events. Additional work needs to be done to help classify the atypical events into actual events, so that the importance of the events can be better understood.
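
    The abstract does not disclose the multivariate algorithms themselves, so the sketch below shows one generic baseline-and-score scheme of the kind described: fit a mean and covariance on a baseline window and score every time step by its Mahalanobis distance from that baseline. The window length, the regularization term, and the synthetic PMU record are assumptions made for illustration only.

      import numpy as np

      def atypicality_scores(X, baseline_len=900):
          """Score each time step of a multivariate record (rows = time,
          columns = signals) by its Mahalanobis distance from a baseline
          fitted on the first `baseline_len` samples."""
          X = np.asarray(X, dtype=float)
          base = X[:baseline_len]
          mu = base.mean(axis=0)
          cov = np.cov(base, rowvar=False)
          cov += 1e-6 * np.eye(cov.shape[0])        # guard against collinear channels
          inv_cov = np.linalg.inv(cov)
          d = X - mu
          return np.sqrt(np.einsum("ij,jk,ik->i", d, inv_cov, d))

      # Example: one hour at 30 samples/s, 8 synthetic PMU channels.
      rng = np.random.default_rng(0)
      record = rng.normal(size=(108000, 8))
      record[50000:50030] += 5.0                    # injected atypical event
      scores = atypicality_scores(record)
      print(scores[50000:50005])                    # elevated around the event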

  18. Visualizing Time-Varying Phenomena In Numerical Simulations Of Unsteady Flows

    NASA Technical Reports Server (NTRS)

    Lane, David A.

    1996-01-01

    Streamlines, contour lines, vector plots, and volume slices (cutting planes) are commonly used for flow visualization. These techniques are sometimes referred to as instantaneous flow visualization techniques because calculations are based on an instant of the flowfield in time. Although instantaneous flow visualization techniques are effective for depicting phenomena in steady flows, they sometimes do not adequately depict time-varying phenomena in unsteady flows. Streaklines and timelines are effective visualization techniques for depicting vortex shedding, vortex breakdown, and shock waves in unsteady flows. These techniques are examples of time-dependent flow visualization techniques, which are based on many instants of the flowfields in time. This paper describes the algorithms for computing streaklines and timelines. Using numerically simulated unsteady flows, streaklines and timelines are compared with streamlines, contour lines, and vector plots. It is shown that streaklines and timelines reveal vortex shedding and vortex breakdown more clearly than instantaneous flow visualization techniques.
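
    As a companion to the description above, the sketch below shows the standard streakline construction: a new particle is released from the seed location at every time step and all released particles are advected through the time-varying velocity field. A first-order (forward Euler) integrator and a toy analytic flow field are used for brevity; the paper's actual algorithms and simulation data are not reproduced here.

      import numpy as np

      def streakline(velocity, seed, t0, t1, dt):
          """Release a particle from `seed` every time step and advect all
          released particles with the unsteady field velocity(x, t) -> dx/dt
          (forward Euler). Returns particle positions at t1, ordered by
          release time, which traces the streakline."""
          particles = []
          t = t0
          while t < t1:
              particles.append(np.array(seed, dtype=float))   # new release
              for p in particles:                             # advect everything
                  p += dt * np.asarray(velocity(p, t))
              t += dt
          return np.array(particles)

      # Toy unsteady field: uniform flow in x plus a transverse oscillation.
      def flow(x, t):
          return np.array([1.0, 0.3 * np.sin(2.0 * np.pi * t)])

      line = streakline(flow, seed=(0.0, 0.0), t0=0.0, t1=5.0, dt=0.05)
      print(line.shape)   # (number of released particles, 2)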

  19. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence (IIF) images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automation of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automation of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable with state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
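
    The validation strategy mentioned above (leave-one-out cross-validation over benchmark images) can be sketched in a few lines of scikit-learn; the feature matrix, labels, and classifier below are placeholders, not ANAlyte's actual features or models.

      import numpy as np
      from sklearn.model_selection import LeaveOneOut
      from sklearn.svm import SVC
      from sklearn.metrics import accuracy_score

      # Placeholder data: one feature vector and one pattern label per image.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 16))
      y = rng.integers(0, 4, size=60)

      predictions = np.empty_like(y)
      for train_idx, test_idx in LeaveOneOut().split(X):
          clf = SVC(kernel="rbf").fit(X[train_idx], y[train_idx])
          predictions[test_idx] = clf.predict(X[test_idx])

      print("leave-one-out accuracy:", accuracy_score(y, predictions))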

  20. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, are composed of organic compounds along with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main tools for acquiring these geochemical data are analytical techniques. Due to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  2. Prototyping Visual Learning Analytics Guided by an Educational Theory Informed Goal

    ERIC Educational Resources Information Center

    Hillaire, Garron; Rappolt-Schlichtmann, Gabrielle; Ducharme, Kim

    2016-01-01

    Prototype work can support the creation of data visualizations throughout the research and development process through paper prototypes with sketching, designed prototypes with graphic design tools, and functional prototypes to explore how the implementation will work. One challenging aspect of data visualization work is coordinating the expertise…

  3. An Automated Directed Spectral Search Methodology for Small Target Detection

    NASA Astrophysics Data System (ADS)

    Grossman, Stanley I.

    Much of the current effort in remote sensing tackles macro-level problems such as determining the extent of wheat in a field, the general health of vegetation or the extent of mineral deposits in an area. However, for many of the remote sensing challenges currently being studied, such as border protection, drug smuggling, treaty verification, and the war on terror, most targets are very small in nature - a vehicle or even a person. While in typical macro-level problems the vegetation of interest is known to be in the scene, for small target detection problems it is not usually known whether the desired small target even exists in the scene, never mind finding it in abundance. The ability to find specific small targets, such as vehicles, typifies this problem. Complicating the analyst's task, the growing number of available sensors is generating mountains of imagery, outstripping the analysts' ability to visually peruse them. This work presents the important factors influencing spectral exploitation using multispectral data and suggests a different approach to small target detection. The methodology of directed search is presented, including the use of scene-modeled spectral libraries, various search algorithms, and traditional statistical and ROC curve analysis. The work suggests a new metric to calibrate analysis, labeled the analytic sweet spot, as well as an estimation method for identifying the sweet spot threshold for an image. It also suggests a new visualization aid for highlighting the target in its entirety, called nearest neighbor inflation (NNI). It brings these all together to propose that these additions to the target detection arena allow for the construction of a fully automated target detection scheme. The dissertation next details experiments to support the hypothesis that the optimum detection threshold is the analytic sweet spot and that the estimation method adequately predicts it. Experimental results and analysis are presented for the proposed directed search techniques for spectral-image-based small target detection. It offers evidence of the functionality of the NNI visualization and also provides evidence that the increased spectral dimensionality of the 8-band Worldview-2 datasets provides noteworthy improvement in results over traditional 4-band multispectral datasets. The final experiment presents the results from a prototype fully automated target detection scheme in support of the overarching premise. This work establishes the analytic sweet spot as the optimum threshold, defined as the point where the error rate curves -- false detections vs. missed detections -- cross. At this point the errors are minimized while the detection rate is maximized. It then demonstrates that taking the first moment statistic of the histogram of calculated target detection values, from a detection search with the test threshold set arbitrarily high, will estimate the analytic sweet spot for that image. It also demonstrates that directed search techniques -- when utilized with appropriate scene-specific modeled signatures and atmospheric compensations -- perform at least as well as in-scene search techniques 88% of the time and grossly under-perform only 11% of the time; the in-scene search performs as well or better only 50% of the time. It further demonstrates the clear advantage that increased multispectral dimensionality brings to detection searches, improving performance in 50% of the cases while performing at least as well 72% of the time. Lastly, it presents evidence that a fully automated prototype performs as anticipated, laying the groundwork for further research into fully automated processes for small target detection.
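
    The two quantities defined above, the analytic sweet spot (the threshold where the false-detection and missed-detection error curves cross) and its first-moment estimate (the mean of the detection values from a search run with an arbitrarily high test threshold), can be sketched as follows. The per-class error rates, the synthetic score distributions, and the labels are assumptions made for illustration; they are not the dissertation's data or exact formulation.

      import numpy as np

      def sweet_spot(scores, is_target):
          """Threshold at which the false-detection and missed-detection rate
          curves cross, i.e. where the two error rates balance."""
          best_t, best_gap = None, np.inf
          for t in np.sort(np.unique(scores)):
              detected = scores >= t
              false_det = (detected & ~is_target).sum() / max((~is_target).sum(), 1)
              missed = (~detected & is_target).sum() / max(is_target.sum(), 1)
              if abs(false_det - missed) < best_gap:
                  best_t, best_gap = t, abs(false_det - missed)
          return best_t

      def sweet_spot_estimate(scores):
          """First-moment estimate: mean of all detection values returned by a
          detection search run with the test threshold set arbitrarily high."""
          return float(np.mean(scores))

      rng = np.random.default_rng(2)
      scores = np.concatenate([rng.normal(0.2, 0.1, 990),    # background pixels
                               rng.normal(0.8, 0.1, 10)])    # small-target pixels
      truth = np.zeros(1000, dtype=bool)
      truth[-10:] = True
      print(sweet_spot(scores, truth), sweet_spot_estimate(scores))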

  4. Encounter Detection Using Visual Analytics to Improve Maritime Domain Awareness

    DTIC Science & Technology

    2015-06-01

    Only fragments of this DTIC record survive extraction. Records are assigned for processing to record sets consisting of all records within a one-degree-of-latitude by one-degree-of-longitude box; a refined prototype using a one-degree-of-latitude by one-tenth-of-a-degree-of-longitude grid further reduces the processing load. The abstract itself begins: "A visual analytics process ..."

  5. Minimally-invasive Laser Ablation Inductively Coupled Plasma Mass Spectrometry analysis of model ancient copper alloys

    NASA Astrophysics Data System (ADS)

    Walaszek, Damian; Senn, Marianne; Wichser, Adrian; Faller, Markus; Wagner, Barbara; Bulska, Ewa; Ulrich, Andrea

    2014-09-01

    This work describes an evaluation of a strategy for multi-elemental analysis of typical ancient bronzes (copper, lead bronze and tin bronze) by means of laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS). The samples, originating from archeological experiments on ancient metal smelting processes using direct reduction in a ‘bloomery' furnace as well as historical casting techniques, were investigated with the use of the previously proposed analytical procedure, including metallurgical observation and preliminary visual estimation of the homogeneity of the samples. The results of LA-ICPMS analysis were compared to the results of bulk composition obtained by X-ray fluorescence spectrometry (XRF) and by inductively coupled plasma mass spectrometry (ICPMS) after acid digestion. These results were coherent for most of the elements, confirming the usefulness of the proposed analytical procedure; however, the reliability of the quantitative information about the content of the most heterogeneously distributed elements is also discussed in more detail.

  6. Scanning electron microscopy as an analytical tool for the study of calcified intrauterine contraceptive devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, S.R.; Wilkinson, E.J.

    Within the endometrial cavity, intrauterine contraceptive devices (IUDs) become encrusted with cellular, acellular, and fibrillar substances. Scanning electron microscopy was used to study the crust. Cellular material consisted mainly of blood cells and various types of bacteria. The fibrillar material appeared to be fibrin, which was omnipresent in the crust and formed a thin layer immediately over the IUD surface. X-ray microanalysis of the acellular component of the crust revealed the presence of calcium. No other major peaks were identified. Near the IUD surface, characteristic calcium phosphate crystals were present. Their microanalysis showed peaks for calcium and phosphorus. X-ray diffraction of the crust, however, showed it to contain only calcite. It is through the use of scanning electron microscopy that calcium phosphate has been detected in the IUD crust and a fibrillar layer has been visualized on the IUD surface. This study further demonstrates the effectiveness of SEM analytical techniques in the area of biomedical research.

  7. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art in which a new product idea is created rather than a visual representation that would be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps to establish the importance ratings of the DRs.
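
    The AHP step mentioned above (priority ratings of the customer requirements) is conventionally computed as the principal eigenvector of a pairwise comparison matrix. A minimal sketch is shown below; the 3x3 comparison matrix is illustrative and is not taken from the research vessel case study.

      import numpy as np

      def ahp_priorities(pairwise):
          """Priority weights from an AHP pairwise-comparison matrix: the
          principal eigenvector, normalized to sum to one, plus Saaty's
          consistency index CI = (lambda_max - n) / (n - 1)."""
          pairwise = np.asarray(pairwise, dtype=float)
          eigvals, eigvecs = np.linalg.eig(pairwise)
          k = np.argmax(eigvals.real)
          weights = eigvecs[:, k].real
          weights = weights / weights.sum()
          n = pairwise.shape[0]
          ci = (eigvals.real[k] - n) / (n - 1)
          return weights, ci

      # Illustrative comparison of three customer requirements (Saaty 1-9 scale).
      A = np.array([[1.0,   3.0,   5.0],
                    [1/3.0, 1.0,   2.0],
                    [1/5.0, 1/2.0, 1.0]])
      w, ci = ahp_priorities(A)
      print(np.round(w, 3), round(ci, 4))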

  8. A study of the laminar separation bubble on an airfoil at low Reynolds numbers using flow visualization techniques

    NASA Technical Reports Server (NTRS)

    Schmidt, Gordon S.; Mueller, Thomas J.

    1987-01-01

    The use of flow visualization to study separation bubbles is evaluated. The wind tunnel, two NACA 66(3)-018 airfoil models, and the kerosene vapor, titanium tetrachloride, and surface flow visualization techniques are described. The application of the three visualization techniques to the two airfoil models reveals that the smoke and vapor techniques provide data on the location of laminar separation and the onset of transition, and the surface method produces information about the location of turbulent boundary layer separation. The data obtained with the three flow visualization techniques are compared to pressure distribution data and good correlation is detected. It is noted that flow visualization is an effective technique for examining separation bubbles.

  9. DIVE: A Graph-based Visual Analytics Framework for Big Data

    PubMed Central

    Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie

    2014-01-01

    The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197

  10. Planetary Surface Visualization and Analytics

    NASA Astrophysics Data System (ADS)

    Law, E. S.; Solar System Treks Team

    2018-04-01

    An introduction and update of the Solar System Treks Project which provides a suite of interactive visualization and analysis tools to enable users (engineers, scientists, public) to access large amounts of mapped planetary data products.

  11. Analytic information processing style in epilepsy patients.

    PubMed

    Buonfiglio, Marzia; Di Sabato, Francesco; Mandillo, Silvia; Albini, Mariarita; Di Bonaventura, Carlo; Giallonardo, Annateresa; Avanzini, Giuliano

    2017-08-01

    Relevant to the study of epileptogenesis is learning processing, given the pivotal role that neuroplasticity assumes in both mechanisms. Recently, evoked potential analyses showed a link between analytic cognitive style and altered neural excitability in both migraine and healthy subjects, regardless of cognitive impairment or psychological disorders. In this study we evaluated the analytic/global and visual/auditory perceptual dimensions of cognitive style in patients with epilepsy. Twenty-five cryptogenic temporal lobe epilepsy (TLE) patients, 25 matched idiopathic generalized epilepsy (IGE) sufferers and 25 healthy volunteers were recruited and participated in three cognitive style tests: the "Sternberg-Wagner Self-Assessment Inventory", the C. Cornoldi test series called AMOS, and the Mariani Learning Style Questionnaire. Our results demonstrate a significant association between analytic cognitive style and both IGE and TLE, with a predominantly auditory and visual analytic style, respectively (ANOVA: p values <0.0001). These findings should encourage further research to investigate information processing style and its neurophysiological correlates in epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Oak Ridge Bio-surveillance Toolkit (ORBiT): Integrating Big-Data Analytics with Visual Analysis for Public Health Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A

    In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets and (3) presenting the results from the analytics as a visual interface for the end-user to interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near real time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease-susceptible regions.

  13. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  14. Equilibrium relations and bipolar cognitive mapping for online analytical processing with applications in international relations and strategic decision support.

    PubMed

    Zhang, Wen-Ran

    2003-01-01

    Bipolar logic, bipolar sets, and equilibrium relations are proposed for bipolar cognitive mapping and visualization in online analytical processing (OLAP) and online analytical mining (OLAM). As cognitive models, cognitive maps (CMs) hold great potential for clustering and visualization. Due to the lack of a formal mathematical basis, however, CM-based OLAP and OLAM have not gained popularity. Compared with existing approaches, bipolar cognitive mapping has a number of advantages. First, bipolar CMs are formal logical models as well as cognitive models. Second, equilibrium relations (with polarized reflexivity, symmetry, and transitivity), as bipolar generalizations and fusions of equivalence relations, provide a theoretical basis for bipolar visualization and coordination. Third, an equilibrium relation or CM induces bipolar partitions that distinguish disjoint coalition subsets not involved in any conflict, disjoint coalition subsets involved in a conflict, disjoint conflict subsets, and disjoint harmony subsets. Finally, equilibrium energy analysis leads to harmony and stability measures for strategic decision and multiagent coordination. Thus, this work bridges a gap for CM-based clustering and visualization in OLAP and OLAM. Basic ideas are illustrated with example CMs in international relations.

  15. [Utilization of Big Data in Medicine and Future Outlook].

    PubMed

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

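    A minimal sketch of the k-means step described in the preceding abstract: rows are the 51 secondary medical areas, columns are DPC disease/procedure categories, and areas are grouped by profile similarity. The synthetic data matrix, the scaling, and the number of clusters are assumptions made for illustration; real open DPC tables would be loaded in their place.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.cluster import KMeans

      # Hypothetical matrix: one row per secondary medical area, one column per
      # DPC disease/procedure category (case counts), standing in for open DPC data.
      rng = np.random.default_rng(3)
      area_profiles = rng.poisson(lam=50, size=(51, 20)).astype(float)

      X = StandardScaler().fit_transform(area_profiles)
      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

      for k in range(5):
          print(f"cluster {k}: areas {np.where(labels == k)[0].tolist()}")
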
  16. Chemical mapping of pharmaceutical cocrystals using terahertz spectroscopic imaging.

    PubMed

    Charron, Danielle M; Ajito, Katsuhiro; Kim, Jae-Young; Ueno, Yuko

    2013-02-19

    Terahertz (THz) spectroscopic imaging is a promising technique for distinguishing pharmaceuticals of similar molecular composition but differing crystal structures. Physicochemical properties, for instance bioavailability, are manipulated by altering a drug's crystal structure through methods such as cocrystallization. Cocrystals are molecular complexes having crystal structures different from those of their pure components. A technique for identifying the two-dimensional distribution of these alternate forms is required. Here we present the first demonstration of THz spectroscopic imaging of cocrystals. THz spectra of caffeine-oxalic acid cocrystal measured at low temperature exhibit sharp peaks, enabling us to visualize the cocrystal distribution in nonuniform tablets. The cocrystal distribution was clearly identified using THz spectroscopic data, and the cocrystal concentration was calculated with 0.3-1.3% w/w error from the known total concentration. From this result, THz spectroscopy allows quantitative chemical mapping of cocrystals and offers researchers and drug developers a new analytical tool.
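
    The abstract reports calculating cocrystal concentration from the THz spectra but does not give the calculation; one common route to this kind of quantitative mapping is per-pixel non-negative least-squares unmixing against reference spectra, sketched below. The reference spectra, peak positions, and mixture are synthetic placeholders, not the measured caffeine-oxalic acid data or the authors' calibration.

      import numpy as np
      from scipy.optimize import nnls

      def unmix(pixel_spectrum, references):
          """Fit a measured spectrum as a non-negative mixture of reference
          spectra (rows of `references`); return normalized abundances."""
          weights, _ = nnls(references.T, pixel_spectrum)
          total = weights.sum()
          return weights / total if total > 0 else weights

      # Synthetic reference spectra: cocrystal, caffeine, oxalic acid.
      freqs = np.linspace(0.5, 3.0, 200)
      peak = lambda c, w=0.05: np.exp(-((freqs - c) ** 2) / (2 * w ** 2))
      refs = np.vstack([peak(1.2) + peak(2.1),   # cocrystal
                        peak(1.6),               # caffeine
                        peak(2.6)])              # oxalic acid

      mixture = 0.7 * refs[0] + 0.3 * refs[1]
      mixture += 0.01 * np.random.default_rng(4).normal(size=freqs.size)
      print(np.round(unmix(mixture, refs), 3))   # roughly [0.7, 0.3, 0.0]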

  17. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  18. Decision exploration lab: a visual analytics solution for decision management.

    PubMed

    Broeksema, Bertjan; Baudel, Thomas; Telea, Arthur G; Crisafulli, Paolo

    2013-12-01

    We present a visual analytics solution designed to address prevalent issues in the area of Operational Decision Management (ODM). In ODM, which has its roots in Artificial Intelligence (Expert Systems) and Management Science, it is increasingly important to align business decisions with business goals. In our work, we consider decision models (executable models of the business domain) as ontologies that describe the business domain, and production rules that describe the business logic of decisions to be made over this ontology. Executing a decision model produces an accumulation of decisions made over time for individual cases. We are interested, first, in gaining insight into the decision logic and the accumulated facts by themselves. Second, and more importantly, we want to see how the accumulated facts reveal potential divergences between the reality as captured by the decision model and the reality as captured by the executed decisions. We illustrate the motivation, the added value for visual analytics, and our proposed solution and tooling through a business case from the car insurance industry.

  19. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures for policy relevant outcomes of interest, specifically for our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.
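
    A statistical complement to the visual interrogation described above is to fit a classifier over the scenario factors and rank them by how strongly they predict a user-specified outcome (for example, high versus low mitigation cost). The sketch below uses a random forest on a synthetic factorial ensemble; the factor names, levels, and toy outcome are assumptions standing in for the actual GCAM scenario database.

      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestClassifier

      # Synthetic stand-in for a factorial scenario ensemble: each row is a
      # scenario, each column an assumption level, target = "high mitigation cost".
      rng = np.random.default_rng(5)
      n = 5000
      scenarios = pd.DataFrame({
          "ag_productivity": rng.integers(0, 3, n),
          "population_growth": rng.integers(0, 3, n),
          "economic_growth": rng.integers(0, 3, n),
          "fossil_resource": rng.integers(0, 3, n),
          "ccs_availability": rng.integers(0, 2, n),
      })
      cost = (2 * scenarios.population_growth + 2 * scenarios.economic_growth
              - scenarios.ag_productivity + rng.normal(0, 1, n))
      high_cost = (cost > cost.median()).astype(int)

      model = RandomForestClassifier(n_estimators=200, random_state=0)
      model.fit(scenarios, high_cost)
      for name, imp in sorted(zip(scenarios.columns, model.feature_importances_),
                              key=lambda item: -item[1]):
          print(f"{name:20s} {imp:.3f}")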

  20. Data-driven exploration of copper mineralogy and its application to Earth's near-surface oxidation

    NASA Astrophysics Data System (ADS)

    Morrison, S. M.; Eleish, A.; Runyon, S.; Prabhu, A.; Fox, P. A.; Ralph, J.; Golden, J. J.; Downs, R. T.; Liu, C.; Meyer, M.; Hazen, R. M.

    2017-12-01

    Earth's atmospheric composition has changed radically throughout geologic history [1,2]. The oxidation of our atmosphere, driven by biology, began with the Great Oxidation Event (GOE) 2.5 Ga and has heavily influenced Earth's near-surface mineralogy. Therefore, temporal trends in mineral occurrence elucidate large- and small-scale geologic and biologic processes. Cu, and other first-row transition elements, are of particular interest due to their variation in valence state and sensitivity to ƒO2. Widespread formation of oxidized Cu mineral species (Cu2+) would not have been possible prior to the GOE, and we have found that the proportion of oxidized Cu minerals increased steadily with the increase in atmospheric O2 on Earth's surface (see Fig. 1). To better characterize the changes in Cu mineralogy through time, we have employed advanced analytical and visualization methods. These techniques rely on large and growing mineral databases (e.g., rruff.info, mindat.org, earthchem.org, usgs.gov) and allow us to quantify and visualize multi-dimensional trends [5].

  1. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  2. The Preference of Visualization in Teaching and Learning Absolute Value

    ERIC Educational Resources Information Center

    Konyalioglu, Alper Cihan; Aksu, Zeki; Senel, Esma Ozge

    2012-01-01

    Visualization is mostly despised although it complements and--sometimes--guides the analytical process. This study mainly investigates teachers' preferences concerning the use of the visualization method and determines the extent to which they encourage their students to make use of it within the problem-solving process. This study was conducted…

  3. DIA2: Web-based Cyberinfrastructure for Visual Analysis of Funding Portfolios.

    PubMed

    Madhavan, Krishna; Elmqvist, Niklas; Vorvoreanu, Mihaela; Chen, Xin; Wong, Yuetling; Xian, Hanjun; Dong, Zhihua; Johri, Aditya

    2014-12-01

    We present a design study of the Deep Insights Anywhere, Anytime (DIA2) platform, a web-based visual analytics system that allows program managers and academic staff at the U.S. National Science Foundation to search, view, and analyze their research funding portfolio. The goal of this system is to facilitate users' understanding of both past and currently active research awards in order to make more informed decisions of their future funding. This user group is characterized by high domain expertise yet not necessarily high literacy in visualization and visual analytics-they are essentially casual experts-and thus require careful visual and information design, including adhering to user experience standards, providing a self-instructive interface, and progressively refining visualizations to minimize complexity. We discuss the challenges of designing a system for casual experts and highlight how we addressed this issue by modeling the organizational structure and workflows of the NSF within our system. We discuss each stage of the design process, starting with formative interviews, prototypes, and finally live deployments and evaluation with stakeholders.

  4. DICON: interactive visual analysis of multidimensional clusters.

    PubMed

    Cao, Nan; Gotz, David; Sun, Jimeng; Qu, Huamin

    2011-12-01

    Clustering as a fundamental data analysis technique has been widely used in many analytic applications. However, it is often difficult for users to understand and evaluate multidimensional clustering results, especially the quality of clusters and their semantics. For large and complex data, high-level statistical information about the clusters is often needed for users to evaluate cluster quality while a detailed display of multidimensional attributes of the data is necessary to understand the meaning of clusters. In this paper, we introduce DICON, an icon-based cluster visualization that embeds statistical information into a multi-attribute display to facilitate cluster interpretation, evaluation, and comparison. We design a treemap-like icon to represent a multidimensional cluster, and the quality of the cluster can be conveniently evaluated with the embedded statistical information. We further develop a novel layout algorithm which can generate similar icons for similar clusters, making comparisons of clusters easier. User interaction and clutter reduction are integrated into the system to help users more effectively analyze and refine clustering results for large datasets. We demonstrate the power of DICON through a user study and a case study in the healthcare domain. Our evaluation shows the benefits of the technique, especially in support of complex multidimensional cluster analysis. © 2011 IEEE

  5. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    PubMed

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges according to associated data values. We demonstrated the advantages of these new capabilities through three biological network visualization case studies: human disease association network, drug-target interaction network and protein-peptide mapping network. The architectural design of ProteoLens makes it suitable for bioinformatics expert data analysts who are experienced with relational database management to perform large-scale integrated network visual explorations. ProteoLens is a promising visual analytic platform that will facilitate knowledge discoveries in future network and systems biology studies.
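
    The "network data association rule" idea described above (an SQL mapping from node or edge IDs to data attributes, applied before visual annotation) can be sketched roughly as follows. The sketch substitutes sqlite3 for the Oracle/PostgreSQL back ends named in the abstract and uses a hypothetical table and gene names; it is not ProteoLens code.

      import sqlite3
      import networkx as nx

      # Hypothetical expression table standing in for a database view that an
      # association rule would point at.
      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE expression (gene TEXT, level REAL)")
      db.executemany("INSERT INTO expression VALUES (?, ?)",
                     [("TP53", 2.4), ("BRCA1", -1.1), ("EGFR", 0.7)])

      G = nx.Graph()
      G.add_edges_from([("TP53", "BRCA1"), ("TP53", "EGFR")])

      # "Association rule": a SQL query mapping node IDs to an attribute, applied
      # to annotate the graph before visual mapping (e.g., color nodes by level).
      for gene, level in db.execute("SELECT gene, level FROM expression"):
          if gene in G:
              G.nodes[gene]["expression"] = level

      print(nx.get_node_attributes(G, "expression"))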

  6. Visual-area coding technique (VACT): optical parallel implementation of fuzzy logic and its visualization with the digital-halftoning process

    NASA Astrophysics Data System (ADS)

    Konishi, Tsuyoshi; Tanida, Jun; Ichioka, Yoshiki

    1995-06-01

    A novel technique, the visual-area coding technique (VACT), for the optical implementation of fuzzy logic with the capability of visualization of the results is presented. This technique is based on the microfont method and is considered to be an instance of digitized analog optical computing. Huge amounts of data can be processed in fuzzy logic with the VACT. In addition, real-time visualization of the processed result can be accomplished.

  7. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  8. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.

    PubMed

    Nummenmaa, Lauri; Calvo, Manuel G

    2015-04-01

    Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques for resolving this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random effects meta-analysis was conducted to estimate effect sizes at population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. Robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.
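
    The pooling procedure named above (random-effects meta-analysis of r-based effect sizes) is conventionally done by Fisher z-transforming the correlations and estimating the between-study variance with the DerSimonian-Laird method; a compact sketch follows. The study correlations and sample sizes are made up for illustration and are not the paper's data.

      import numpy as np

      def random_effects_from_r(r, n):
          """DerSimonian-Laird random-effects pooling of correlation effect
          sizes via Fisher's z transform; returns pooled r and its 95% CI."""
          r, n = np.asarray(r, float), np.asarray(n, float)
          z = np.arctanh(r)                    # Fisher's z
          v = 1.0 / (n - 3.0)                  # within-study variance of z
          w = 1.0 / v
          z_fixed = np.sum(w * z) / np.sum(w)
          q = np.sum(w * (z - z_fixed) ** 2)   # heterogeneity statistic
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(z) - 1)) / c)
          w_star = 1.0 / (v + tau2)
          z_re = np.sum(w_star * z) / np.sum(w_star)
          se = np.sqrt(1.0 / np.sum(w_star))
          return np.tanh(z_re), np.tanh([z_re - 1.96 * se, z_re + 1.96 * se])

      # Made-up study correlations and sample sizes.
      pooled, ci = random_effects_from_r([0.30, 0.45, 0.25, 0.50], [120, 80, 200, 60])
      print(round(float(pooled), 3), np.round(ci, 3))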

  9. The 3D widgets for exploratory scientific visualization

    NASA Technical Reports Server (NTRS)

    Herndon, Kenneth P.; Meyer, Tom

    1995-01-01

    Computational fluid dynamics (CFD) techniques are used to simulate flows of fluids like air or water around such objects as airplanes and automobiles. These techniques usually generate very large amounts of numerical data which are difficult to understand without using graphical scientific visualization techniques. There are a number of commercial scientific visualization applications available today which allow scientists to control visualization tools via textual and/or 2D user interfaces. However, these user interfaces are often difficult to use. We believe that 3D direct-manipulation techniques for interactively controlling visualization tools will provide opportunities for powerful and useful interfaces with which scientists can more effectively explore their datasets. A few systems have been developed which use these techniques. In this paper, we will present a variety of 3D interaction techniques for manipulating parameters of visualization tools used to explore CFD datasets, and discuss in detail various techniques for positioning tools in a 3D scene.

  10. New solutions for climate network visualization

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Buschmann, Stefan; Donges, Jonathan F.; Marwan, Norbert

    2016-04-01

    An increasing amount of climate and climate impact research deals with geo-referenced networks, including energy, trade, supply-chain, disease dissemination and climatic tele-connection networks. At the same time, the size and complexity of these networks increase, resulting in networks of more than a hundred thousand or even millions of edges, which are often temporally evolving, carry additional data at nodes and edges, and can consist of multiple layers, even in real 3D. This poses challenges to both the static representation and the interactive exploration of these networks, first of all avoiding edge clutter ("edge spaghetti") and allowing interactivity even for unfiltered networks. Within this presentation, we illustrate potential solutions to these challenges. We therefore give a glimpse of a questionnaire performed with climate and complex-system scientists with respect to their network visualization requirements, and of a review of available state-of-the-art visualization techniques and tools for this purpose (see as well Nocke et al., 2015). In the main part, we present alternative visualization solutions for several use cases (global, regional, and multi-layered climate networks), including alternative geographic projections, edge bundling, and 3-D network support (based on the CGV and GTX tools), and implementation details to reach interactive frame rates. References: Nocke, T., S. Buschmann, J. F. Donges, N. Marwan, H.-J. Schulz, and C. Tominski: Review: Visual analytics of climate networks, Nonlinear Processes in Geophysics, 22, 545-570, doi:10.5194/npg-22-545-2015, 2015

  11. [On pre-requirements to the use of audio-visual techniques in deviant children. Thoughts from the psychoanalytical treatment of a thirteen months aged child (author's transl)].

    PubMed

    Lefort, R

    1977-01-01

    This paper deals with the analytic treatment of a child between 13 and 21 months of age who had been hospitalized since birth. In a first phase one sees the start of a relationship, then the object relationship to primary objects: on the one hand the food object and on the other the therapist within the transference. These two objects, impossible in the dimension of reality from the start, are progressively marked by "neantisation" (reduction to nothingness), which can take on a symbolic tone. A scene in front of the picture of a child on a nurse's knees progressively introduces Nadia to the third register: the imaginary one. Her behaviour is exemplary during the 17 sessions at the mirror, in which she progressively comes to assume the image of her unified body under the gaze of the other, demonstrating that she can only do so by symbolizing the primary objects, in particular "drinking nothing". The symbolization acquired at the oral level makes it possible to verbalize, in the same mode, her relationship to the anal object, i.e. in a non-destructive symbolic mode. This treatment raises the question of the use of audio-visual techniques with psychotic and autistic subjects, and rejects the pedagogical use of the image for such subjects, given the primacy of the symbolic function over the imaginary function demonstrated in this paper. This is a prerequisite for any research in the field of audio-visual techniques with psychotics.

  12. From Streaming Data to Streaming Insights: The Impact of Data Velocities on Mental Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Pike, William A.; Cook, Kristin A.

    The rise of Big Data has influenced the design and technical implementation of visual analytic tools required to handle the increased volumes, velocities, and varieties of data. This has required a set of data management and computational advancements to allow us to store and compute on such datasets. However, as the ultimate goal of visual analytic technology is to enable the discovery and creation of insights from the users, an under-explored area is understanding how these datasets impact their mental models. That is, how have the analytic processes and strategies of users changed? How have users changed their perception of how to leverage, and ask questions of, these datasets?

  13. Imaging of Selenium by Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) in 2-D Electrophoresis Gels and Biological Tissues.

    PubMed

    Cruz, Elisa Castañeda Santa; Susanne Becker, J; Sabine Becker, J; Sussulini, Alessandra

    2018-01-01

    Selenium and selenoproteins are important components of living organisms that play a role in different biological processes. Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is a powerful analytical technique that has been employed to obtain distribution maps of selenium in biological tissues in a direct manner, as well as in selenoproteins, previously separated by their molecular masses and isoelectric points using two-dimensional polyacrylamide gel electrophoresis (2-D PAGE). In this chapter, we present the protocols to perform LA-ICP-MS imaging experiments, allowing the distribution visualization and determination of selenium and/or selenoproteins in biological systems.

  14. DNA origami nanopillars as standards for three-dimensional superresolution microscopy.

    PubMed

    Schmied, Jürgen J; Forthmann, Carsten; Pibiri, Enrico; Lalkens, Birka; Nickels, Philipp; Liedl, Tim; Tinnefeld, Philip

    2013-02-13

    Nanopillars are promising nanostructures composed of various materials that bring new functionalities for applications ranging from photovoltaics to analytics. We developed DNA nanopillars with a height of 220 nm and a diameter of ~14 nm using the DNA origami technique. Modifying the base of the nanopillars with biotins allowed selective, upright, and rigid immobilization on solid substrates. With the help of site-selective dye labels, we visualized the structure and determined the orientation of the nanopillars by three-dimensional fluorescence superresolution microscopy. Because of their rigidity and nanometer-precise addressability, DNA origami nanopillars qualify as scaffold for the assembly of plasmonic devices as well as for three-dimensional superresolution standards.

  15. Sensory techniques for measuring differences in California navel oranges treated with doses of gamma-radiation below 0.6 kGray

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Mahony, M.; Goldstein, L.R.

    Navel oranges from California were given low post-harvest doses of gamma radiation: 0.32-0.37 and 0.52-0.60 kGy (32-37 and 52-60 krad); they were compared with nonirradiated controls for visual appearance, flavor by mouth, odor, taste and taste after sweetness suppression by Gymnema sylvestre. Practiced judges were used as an analytical tool, with minimum cross-sensory interference, while untrained subjects were used to determine whether changes might be distinguished by nonexperts. Differences were found in appearance, flavor, taste and odor, although they were less extreme at the lower dose. Untrained judges could discriminate the juice at the higher irradiation level only.

  16. Making the Dzyaloshinskii-Moriya interaction visible

    NASA Astrophysics Data System (ADS)

    Hrabec, A.; Belmeguenai, M.; Stashkevich, A.; Chérif, S. M.; Rohart, S.; Roussigné, Y.; Thiaville, A.

    2017-06-01

    Brillouin light spectroscopy is a powerful and robust technique for measuring the interfacial Dzyaloshinskii-Moriya interaction in thin films with broken inversion symmetry. Here, we show that the magnon visibility, i.e., the intensity of the inelastically scattered light, strongly depends on the thickness of the dielectric seed material, SiO2. By using both analytical thin-film optics and numerical calculations, we reproduce the experimental data. We therefore provide a guideline for maximization of the signal by adapting the substrate properties to the geometry of the measurement. Such a boost of the signal eases magnon visualization in ultrathin magnetic films, speeds up the measurement and increases the reliability of the data.

  17. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  18. Slushy weightings for the optimal pilot model. [considering visual tracking task

    NASA Technical Reports Server (NTRS)

    Dillow, J. D.; Picha, D. G.; Anderson, R. O.

    1975-01-01

    A pilot model is described which accounts for the effect of motion cues in a well-defined visual tracking task. The effects of visual and motion cues are accounted for in the model in two ways. First, the observation matrix in the pilot model is structured to account for the visual and motion inputs presented to the pilot. Second, the weightings in the quadratic cost function associated with the pilot model are modified to account for the pilot's perception of the variables he considers important in the task. Analytic results obtained using the pilot model are compared to experimental results and in general good agreement is demonstrated. The analytic model yields small improvements in tracking performance with the addition of motion cues for easily controlled task dynamics and large improvements in tracking performance with the addition of motion cues for difficult task dynamics.

  19. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  20. A knowledge based system for scientific data visualization

    NASA Technical Reports Server (NTRS)

    Senay, Hikmet; Ignatius, Eve

    1992-01-01

    A knowledge-based system, called visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations which decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set by using principles of visual perception, the system also allows users to interactively modify the design and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques having applicability in earth and space sciences, although it may easily be extended to include other techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis and medical imaging.

  1. The Elicitation Interview Technique: Capturing People's Experiences of Data Representations.

    PubMed

    Hogan, Trevor; Hinrichs, Uta; Hornecker, Eva

    2016-12-01

    Information visualization has become a popular tool to facilitate sense-making, discovery and communication in a large range of professional and casual contexts. However, evaluating visualizations is still a challenge. In particular, we lack techniques to help understand how visualizations are experienced by people. In this paper we discuss the potential of the Elicitation Interview technique to be applied in the context of visualization. The Elicitation Interview is a method for gathering detailed and precise accounts of human experience. We argue that it can be applied to help understand how people experience and interpret visualizations as part of exploration and data analysis processes. We describe the key characteristics of this interview technique and present a study we conducted to exemplify how it can be applied to evaluate data representations. Our study illustrates the types of insights this technique can bring to the fore, for example, evidence for deep interpretation of visual representations and the formation of interpretations and stories beyond the represented data. We discuss general visualization evaluation scenarios where the Elicitation Interview technique may be beneficial and specify what needs to be considered when applying this technique in a visualization context specifically.

  2. Integrating Heterogeneous Healthcare Datasets and Visual Analytics for Disease Bio-surveillance and Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A

    2013-01-01

    In this paper, we present an overview of the big data challenges in disease bio-surveillance and then discuss the use of visual analytics for integrating data and turning it into knowledge. We will explore two integration scenarios: (1) combining text and multimedia sources to improve situational awareness and (2) enhancing disease spread model data with real-time bio-surveillance data. Together, the proposed integration methodologies can improve awareness about when, where and how emerging diseases can affect wide geographic regions.

  3. Semantic Interaction for Visual Analytics: Toward Coupling Cognition and Computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander

    2014-07-01

    The dissertation discussed in this article [1] was written in the midst of an era of digitization. The world is becoming increasingly instrumented with sensors, monitoring, and other methods for generating data describing social, physical, and natural phenomena. Thus, data exist with the potential of being analyzed to uncover, or discover, the phenomena from which they were created. However, as the analytic models leveraged to analyze these data continue to increase in complexity and computational capability, how can visualizations and user interaction methodologies adapt and evolve to continue to foster discovery and sensemaking?

  4. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  5. Visual business ecosystem intelligence: lessons from the field.

    PubMed

    Basole, Rahul C

    2014-01-01

    Macroscopic insight into business ecosystems is becoming increasingly important. With the emergence of new digital business data, opportunities exist to develop rich, interactive visual-analytics tools. Georgia Institute of Technology researchers have been developing and implementing visual business ecosystem intelligence tools in corporate settings. This article discusses the challenges they faced, the lessons learned, and opportunities for future research.

  6. A Virtual World of Visualization

    NASA Technical Reports Server (NTRS)

    1998-01-01

    In 1990, Sterling Software, Inc., developed the Flow Analysis Software Toolkit (FAST) for NASA Ames on contract. FAST is a workstation-based modular analysis and visualization tool. It is used to visualize and animate grids and grid-oriented data, typically generated by finite difference, finite element, and other analytical methods. FAST is now available through COSMIC, NASA's software storehouse.

  7. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    PubMed

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

    Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase. Small organic acids and nitrogen-containing compounds are of particular interest. The efficient derivatization reagent methyl chloroformate was used to make analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles possible. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes comprising small organic acids, pyrazines, phenol, and cyclic ketones were quantified. An additional three analytes were pseudoquantified with use of standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. Method validation was evaluated with repeatability, and spike recoveries of all 29 analytes were obtained. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from recirculation experiments involving the aqueous phase. The results indicated when the steady-state condition of the flow reactor was reached and showed the effects of recirculation. The validated method will be especially useful for investigations of the effect of small organic acids on the hydrothermal liquefaction process.
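
    As an illustration of the response-surface step described above, the short Python sketch below fits a full second-order model to a small central composite design and locates the predicted optimum. The factor levels and responses are invented placeholders, not data from the study.

    # Response-surface sketch for two coded factors (e.g. methyl chloroformate and
    # NaOH volumes); the design points and responses below are invented.
    import numpy as np

    def quadratic_design_matrix(x1, x2):
        """Second-order model terms: intercept, linear, interaction, quadratic."""
        return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

    x1 = np.array([-1, -1, 1, 1, -1.41, 1.41, 0, 0, 0, 0], dtype=float)
    x2 = np.array([-1, 1, -1, 1, 0, 0, -1.41, 1.41, 0, 0], dtype=float)
    y = np.array([40, 55, 48, 70, 35, 60, 42, 66, 58, 57], dtype=float)  # peak areas

    X = quadratic_design_matrix(x1, x2)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Evaluate the fitted surface on a grid and report the predicted optimum.
    g1, g2 = np.meshgrid(np.linspace(-1.5, 1.5, 61), np.linspace(-1.5, 1.5, 61))
    pred = quadratic_design_matrix(g1.ravel(), g2.ravel()) @ coef
    best = int(pred.argmax())
    print("predicted optimum (coded factors):", g1.ravel()[best], g2.ravel()[best])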

  9. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
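
    The idea of inferring candidate modeling techniques from dataset metadata can be illustrated with a toy rule set. The following Python sketch is only a caricature of that inference step; the metadata fields and rules are invented and are not taken from the Analytics Ontology or the SCALATION framework.

    # Toy metadata-driven model suggestion, illustrating semi-automated selection.
    # Fields and rules are invented for illustration only.
    def suggest_techniques(meta: dict) -> list[str]:
        suggestions = []
        if meta.get("response_type") == "categorical":
            suggestions += ["logistic regression", "random forest classifier"]
        elif meta.get("response_type") == "continuous":
            if meta.get("n_predictors", 0) > meta.get("n_rows", 1):
                suggestions.append("ridge or lasso regression")  # p > n: regularize
            else:
                suggestions.append("multiple linear regression")
        if meta.get("temporal", False):
            suggestions.append("time-series model (e.g. ARIMA)")
        return suggestions

    print(suggest_techniques({"response_type": "continuous",
                              "n_predictors": 500, "n_rows": 200, "temporal": False}))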

  10. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  11. Coping with Volume and Variety in Temporal Event Sequences: Strategies for Sharpening Analytic Focus.

    PubMed

    Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam

    2017-06-01

    The growing volume and variety of data present both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics, it is the number of events in the data and the variety of temporal sequence patterns that challenge users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of its use and its impact on volume and/or variety. Examples are selected from 20 case studies gathered from our own work, the literature, or email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
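
    Two of the strategy families named above can be shown on toy event records: an extraction step that keeps only the event categories of interest, and a temporal folding step that collapses exact timestamps onto a coarser unit. The records, categories, and folding choice in this Python sketch are invented for illustration.

    # Extraction and temporal folding on toy event records (all data invented).
    from datetime import datetime
    from collections import Counter

    events = [
        ("patient-1", "2016-03-07 09:15", "ADMIT"),
        ("patient-1", "2016-03-07 11:40", "LAB"),
        ("patient-1", "2016-03-09 08:05", "DISCHARGE"),
        ("patient-2", "2016-03-12 22:30", "ADMIT"),
        ("patient-2", "2016-03-13 01:10", "ICU"),
    ]

    # Extraction strategy: keep only the event categories relevant to the question.
    of_interest = {"ADMIT", "ICU"}
    extracted = [e for e in events if e[2] in of_interest]

    # Temporal folding strategy: fold timestamps onto day-of-week, reducing variety.
    folded = Counter(
        (datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%A"), kind)
        for _, ts, kind in extracted
    )
    print(folded)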

  12. Identification of microplastics by FTIR and Raman microscopy: a novel silicon filter substrate opens the important spectral range below 1300 cm(-1) for FTIR transmission measurements.

    PubMed

    Käppler, Andrea; Windrich, Frank; Löder, Martin G J; Malanin, Mikhail; Fischer, Dieter; Labrenz, Matthias; Eichhorn, Klaus-Jochen; Voit, Brigitte

    2015-09-01

    The presence of microplastics in aquatic ecosystems is a topical problem and leads to the need for appropriate and reliable analytical methods to distinctly identify and quantify these particles in environmental samples. As an example, transmission Fourier transform infrared (FTIR) imaging can be used to analyze samples directly on filters without any visual presorting, provided the environmental sample has first been extracted, purified, and filtered. However, this analytical approach is strongly restricted by the limited IR transparency of conventional filter materials. Within this study, we describe a novel silicon (Si) filter substrate produced by photolithographic microstructuring, which guarantees sufficient transparency for the broad mid-infrared region of 4000-600 cm(-1). This filter type features holes with a diameter of 10 μm and exhibits adequate mechanical stability. Furthermore, it will be shown that our Si filter substrate allows a distinct identification of the most common microplastics, polyethylene (PE) and polypropylene (PP), in the characteristic fingerprint region (1400-600 cm(-1)). Moreover, using the Si filter substrate, a differentiation of microparticles of polyesters having quite similar chemical structure, like polyethylene terephthalate (PET) and polybutylene terephthalate (PBT), is now possible, which facilitates a visualization of their distribution within a microplastic sample by FTIR imaging. Finally, this Si filter can also be used as a substrate for Raman microscopy, a second complementary spectroscopic technique, to identify microplastic samples.

  13. Recovery of Anisakid larvae by means of chloro-peptic digestion and proposal of the method for the official control.

    PubMed

    Fraulo, Pasquale; Morena, Carmelo; Costa, Antonella

    2014-10-01

    Anisakid larvae belonging to the genera Anisakis and Pseudoterranova are the parasites most often responsible for the zoonosis transmitted by fish products (anisakidosis). Acquired through the consumption of raw or undercooked marine fish or squid, the anisakid larvae may cause diseases such as gastric or intestinal anisakiasis and gastro-allergic disorders. In accordance with current EU legislation, fresh fish products must be inspected visually in order to detect the possible presence of visible parasites. It is recognized that the visual method is not accurate enough to detect the larvae of parasites in food preparations containing raw or practically raw seafood, and it clearly emerges that the official control system needs a more efficient analytical technique. In this work, the authors have drawn up and validated an analytical method, based on EU Regulation No. 2075/2005, which involves artificial digestion and the use of a heated magnetic stirrer. The isolated larvae are then subjected to morphological identification at genus level using an optical microscope. The method proved to be suitable for the detection of live and dead anisakid larvae in ready-to-eat foodstuffs containing raw fish or cephalopods, and it is fast and accurate. The method showed high levels of sensitivity and specificity, and the suitability of its use in official food control was confirmed. Its use should be incorporated systematically into specific monitoring programs for the control of foodstuffs containing raw fish products.

  14. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants that is usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurements (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET), and amperometric titration) to determine the superior technique. The commercially available analytical techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior analytical technique was determined against pre-defined criteria. To discern the effectiveness of this superior technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed a significantly higher level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which remained adequate in all matrices. This study is a step towards proper disinfection monitoring, and it can confidently assist engineers with chlorine dioxide disinfection system planning and management.
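
    The accuracy and precision comparison underlying such a study can be sketched as follows, using mean recovery against a known spike level and relative standard deviation across replicates. The replicate values in this Python fragment are invented placeholders and are not data from the study.

    # Accuracy (recovery) and precision (%RSD) per technique; values are invented.
    import numpy as np

    reference_mg_L = 1.00  # nominal ClO2 concentration of the spiked sample
    replicates = {
        "chronoamperometry":      [0.99, 1.01, 1.00, 0.98, 1.02],
        "DPD":                    [1.08, 0.94, 1.05, 0.90, 1.10],
        "Lissamine Green B":      [0.93, 0.96, 0.91, 0.95, 0.97],
        "amperometric titration": [1.03, 0.97, 1.05, 0.99, 1.04],
    }

    for name, vals in replicates.items():
        v = np.asarray(vals)
        recovery = 100 * v.mean() / reference_mg_L   # accuracy, %
        rsd = 100 * v.std(ddof=1) / v.mean()         # precision, %RSD
        print(f"{name:24s} recovery {recovery:6.1f}%   RSD {rsd:4.1f}%")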

  15. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs, and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) and a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives compared with the invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
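
    The non-parametric inter-method correlation step mentioned above can be illustrated with Spearman's rank correlation between paired concentration readings from two methods. The paired values in this Python sketch are invented placeholders.

    # Rank correlation between a reference method (HPLC) and a candidate method
    # (Raman); the paired concentrations below are invented.
    from scipy.stats import spearmanr

    hplc_mg_mL  = [0.52, 1.05, 2.10, 3.95, 5.02, 7.48, 10.1]
    raman_mg_mL = [0.55, 1.00, 2.20, 3.80, 5.10, 7.30, 10.3]

    rho, p_value = spearmanr(hplc_mg_mL, raman_mg_mL)
    print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")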

  16. Miniaturized Temperature-Controlled Planar Chromatography (Micro-TLC) as a Versatile Technique for Fast Screening of Micropollutants and Biomarkers Derived from Surface Water Ecosystems and During Technological Processes of Wastewater Treatment.

    PubMed

    Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2017-07-01

    There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography can serve as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates, under visible light, fluorescence, and fluorescence quenching conditions, and using the visualization reagent phosphomolybdic acid, enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly and green chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.

  17. Using Text Analytics of AJPE Article Titles to Reveal Trends In Pharmacy Education Over the Past Two Decades.

    PubMed

    Pedrami, Farnoush; Asenso, Pamela; Devi, Sachin

    2016-08-25

    Objective. To identify trends in pharmacy education during the last two decades using text mining. Methods. Articles published in the American Journal of Pharmaceutical Education (AJPE) in the past two decades were compiled in a database. Custom text analytics software was written in the Visual Basic programming language using the Visual Basic for Applications (VBA) editor of Excel 2007. The frequency of words appearing in article titles was calculated using the custom VBA software. Data were analyzed to identify emerging trends in pharmacy education. Results. Three educational trends emerged: active learning, interprofessional, and cultural competency. Conclusion. The text analytics program successfully identified trends in article topics and may be a useful compass to predict the future course of pharmacy education.
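
    The title word-frequency step described above (implemented by the authors in VBA) can be reproduced in a few lines of Python; the sample titles and stop-word list below are illustrative only.

    # Word frequencies across article titles (sample titles are invented).
    import re
    from collections import Counter

    titles = [
        "Active Learning Strategies in a Pharmacotherapy Course",
        "Assessment of Cultural Competency in a Doctor of Pharmacy Curriculum",
        "An Interprofessional Education Experience for Pharmacy Students",
    ]
    stop_words = {"a", "an", "the", "in", "of", "for", "and"}

    words = (w for t in titles for w in re.findall(r"[a-z]+", t.lower()))
    freq = Counter(w for w in words if w not in stop_words)
    print(freq.most_common(5))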

  18. Presentation-Oriented Visualization Techniques.

    PubMed

    Kosara, Robert

    2016-01-01

    Data visualization research focuses on data exploration and analysis, yet the vast majority of visualizations people see were created for a different purpose: presentation. Whether we are talking about charts showing data to help make a presenter's point, data visuals created to accompany a news story, or the ubiquitous infographics, many more people consume charts than make them. Traditional visualization techniques treat presentation as an afterthought, but are there techniques uniquely suited to data presentation but not necessarily ideal for exploration and analysis? This article focuses on presentation-oriented techniques, considering their usefulness for presentation first and any other purposes as secondary.

  19. Electrokinetic dispersion in microfluidic separation systems

    NASA Astrophysics Data System (ADS)

    Molho, Joshua Irving

    Numerous efforts have focused on engineering miniaturized chemical analysis devices that are faster, more portable and consume smaller volumes of expensive reagents than their macroscale counterparts. Many of these analysis devices employ electrokinetic effects to transport picoliter volumes of liquids and to separate chemical species from an initially mixed sample volume. In these microfluidic separation systems, dispersion must be minimized to obtain the highest resolution separation possible. This work focuses on modeling, simulation and experimental measurement of two electrokinetic dispersion mechanisms that can reduce the effectiveness of microfluidic separation systems: dispersion resulting from non-uniform wall zeta-potential, and dispersion caused by microchannel turns. When the surface of a microchannel has non-uniform zeta-potential (e.g., if the surface charge varies along the length of the microchannel), an applied electric field creates both electroosmotic and pressure-driven flow. A caged-fluorescence imaging technique was used to visualize the dispersion caused by this electrokinetically induced pressure-driven flow. A simple model for a single channel with an axially varying surface charge is presented and compared to experimental measurements. Microchannel turns have been shown to create dispersion of electrokinetically transported analyte bands. Using a method of moments analysis, a model is developed that quantifies this dispersion and identifies the conditions under which turn dispersion limits the resolution of a microfluidic separation system. Measurements using the caged-fluorescence visualization technique were used to verify this model. New turn geometries are presented and were optimized using both a reduced parameter technique as well as a more generalized, numerical shape optimization approach. These improved turn designs were manufactured using two fabrication techniques and then tested experimentally. The turn optimization approaches and resulting turn geometries described here are shown to reduce turn dispersion to less than 1% of the dispersion caused by unoptimized, constant-width turns.

  20. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without including the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
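
    The correction idea (change the orbital elements at the sphere of influence until the perturbed backward propagation reproduces the intended periapsis elements) can be shown schematically. In the Python sketch below, the propagation routines are placeholder stubs with an invented constant "perturbation" offset; only the structure of the bias-and-iterate loop reflects the technique described in the abstract.

    # Schematic bias-and-iterate loop; the two propagation stubs are fake stand-ins
    # for the closed-form expressions of the actual method.
    import numpy as np

    FAKE_PERTURBATION_OFFSET = np.array([5.0e-4, -2.0e-5, 1.0e-6])  # invented

    def unperturbed_patched_conic_design(target_periapsis):
        # Stub for the iterative patched conic solution without perturbations.
        return target_periapsis.copy()

    def propagate_back_with_perturbations(soi_elements):
        # Stub for analytical backward propagation with J2 and third-body terms;
        # here it just adds a fixed offset so the loop can run end to end.
        return soi_elements + FAKE_PERTURBATION_OFFSET

    def biased_patched_conic(target_periapsis, tol=1e-9, max_iter=20):
        soi_elements = unperturbed_patched_conic_design(target_periapsis)
        for _ in range(max_iter):
            achieved = propagate_back_with_perturbations(soi_elements)
            deviation = target_periapsis - achieved
            if np.max(np.abs(deviation)) < tol:
                break
            soi_elements = soi_elements + deviation  # bias SOI elements to cancel error
        return soi_elements

    target = np.array([6678.0, 0.95, 0.49])  # illustrative periapsis elements
    print(biased_patched_conic(target))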

  1. Survey of Network Visualization Tools

    DTIC Science & Technology

    2007-12-01

    Dimensionality: 2D. Deployment type: components for tool building; standalone tool. OS: Windows. Extensibility: ActiveX ... Visual Basic. Interoperability: Daisy is fully compliant with Microsoft's ActiveX; therefore, other Windows-based programs can ... other functions that improve analytic decision making. Available in ActiveX, C++, Java, and .NET editions. Tom Sawyer Visualization: enables you to ...

  2. Real time en face Fourier-domain optical coherence tomography with direct hardware frequency demodulation

    PubMed Central

    Biedermann, Benjamin R.; Wieser, Wolfgang; Eigenwillig, Christoph M.; Palte, Gesa; Adler, Desmond C.; Srinivasan, Vivek J.; Fujimoto, James G.; Huber, Robert

    2009-01-01

    We demonstrate en face swept source optical coherence tomography (ss-OCT) without requiring a Fourier transformation step. The electronic optical coherence tomography (OCT) interference signal from a k-space linear Fourier domain mode-locked laser is mixed with an adjustable local oscillator, yielding the analytic reflectance signal from one image depth for each frequency sweep of the laser. Furthermore, a method for arbitrarily shaping the spectral intensity profile of the laser is presented, without requiring the step of numerical apodization. In combination, these two techniques enable sampling of the in-phase and quadrature signal with a slow analog-to-digital converter and allow for real-time display of en face projections even for highest axial scan rates. Image data generated with this technique is compared to en face images extracted from a three-dimensional OCT data set. This technique can allow for real-time visualization of arbitrarily oriented en face planes for the purpose of alignment, registration, or operator-guided survey scans while simultaneously maintaining the full capability of high-speed volumetric ss-OCT functionality. PMID:18978919
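
    The demodulation principle (mixing the fringe signal with an in-phase/quadrature local oscillator and low-pass filtering to obtain the reflectance at the depth selected by the oscillator frequency) can be imitated in software. The Python sketch below uses a synthetic fringe signal with invented parameters; the actual system performs this step in analog hardware.

    # Software imitation of I/Q demodulation of a single-depth OCT fringe signal.
    import numpy as np

    fs = 1.0e6                      # samples per sweep (arbitrary units)
    t = np.arange(4096) / fs
    f_depth = 80e3                  # fringe frequency for one image depth
    reflectance = 0.3               # amplitude to be recovered
    fringe = reflectance * np.cos(2 * np.pi * f_depth * t + 0.7)  # detector signal

    f_lo = 80e3                     # local oscillator tuned to the same depth
    i_mixed = fringe * np.cos(2 * np.pi * f_lo * t)
    q_mixed = fringe * np.sin(2 * np.pi * f_lo * t)

    # Low-pass filtering approximated by averaging over the sweep.
    I, Q = i_mixed.mean(), q_mixed.mean()
    amplitude = 2 * np.hypot(I, Q)
    print(f"recovered reflectance amplitude: {amplitude:.3f}")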

  3. Quantitative Phase Fraction Detection in Organic Photovoltaic Materials through EELS Imaging

    DOE PAGES

    Dyck, Ondrej; Hu, Sheng; Das, Sanjib; ...

    2015-11-24

    Organic photovoltaic materials have recently seen intense interest from the research community. Improvements in device performance are occurring at an impressive rate; however, visualization of the active layer phase separation still remains a challenge. Our paper outlines the application of two electron energy-loss spectroscopic (EELS) imaging techniques that can complement and enhance current phase detection techniques. Specifically, the bulk plasmon peak position, often used to produce contrast between phases in energy filtered transmission electron microscopy (EFTEM), is quantitatively mapped across a sample cross section. One complementary spectrum image capturing the carbon and sulfur core loss edges is compared with the plasmon peak map and found to agree quite well, indicating that carbon and sulfur density differences between the two phases also allow phase discrimination. Additionally, an analytical technique for determining absolute atomic areal density is used to produce an absolute carbon and sulfur areal density map. We also show how these maps may be re-interpreted as a phase ratio map, giving quantitative information about the purity of the phases within the junction.

  4. Ammonium Sulfate Improves Detection of Hydrophilic Quaternary Ammonium Compounds through Decreased Ion Suppression in Matrix-Assisted Laser Desorption/Ionization Imaging Mass Spectrometry.

    PubMed

    Sugiyama, Eiji; Masaki, Noritaka; Matsushita, Shoko; Setou, Mitsutoshi

    2015-11-17

    Hydrophilic quaternary ammonium compounds (QACs) include derivatives of carnitine (Car) or choline, which are known to have essential bioactivities. Here we developed a technique for improving the detection of hydrophilic QACs using ammonium sulfate (AS) in matrix-assisted laser desorption/ionization-imaging mass spectrometry (MALDI-IMS). In MALDI mass spectrometry for brain homogenates, the addition of AS greatly increased the signal intensities of Car, acetylcarnitine (AcCar), and glycerophosphocholine (GPC) by approximately 300-, 700-, and 2500-fold. The marked improvement required a higher AS concentration than that needed for suppressing the potassium adduction on phosphatidylcholine and 2,5-dihydroxybenzoic acid. Adding AS also increased the signal intensities of Car, AcCar, and GPC by approximately 10-, 20-, and 40-fold in MALDI-IMS. Consequently, the distributions of five hydrophilic QACs (Car, AcCar, GPC, choline, and phosphocholine) were simultaneously visualized by this technique. The distinct mechanism from other techniques such as improved matrix application, derivatization, or postionization suggests the great potential of AS addition to achieve higher sensitivity of MALDI-IMS for various analytes.

  5. Giving raw data a chance to talk: a demonstration of exploratory visual analytics with a pediatric research database using Microsoft Live Labs Pivot to promote cohort discovery, research, and quality assessment.

    PubMed

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses.

  6. Giving Raw Data a Chance to Talk: A Demonstration of Exploratory Visual Analytics with a Pediatric Research Database Using Microsoft Live Labs Pivot to Promote Cohort Discovery, Research, and Quality Assessment

    PubMed Central

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V. Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses. PMID:24808811

  7. Estimating Starch Content in Roots of Deciduous Trees--A Visual Technique

    Treesearch

    Philip M. Wargo

    1975-01-01

    A visual technique for determining starch content in roots of forest trees, based on iodine-staining of starch granules, was compared with a chemical method. Although the chemical method was more precise, roots could be sorted with the visual method into groups that are probably biologically important. The visual technique is simple and can be adapted for use in the...

  8. Evaluation of Visual Analytics Environments: The Road to the Visual Analytics Science and Technology Challenge Evaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.

    The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation not only involves assessing the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user's infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005], it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found in [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult as now one aspect of the test methodology is access to representative end-users to participate in the evaluation. In many cases the sensitive nature of data and tasks and difficult access to busy analysts puts even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to go to work on the type of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed.
Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology which would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.

  9. MO-C-BRCD-03: The Role of Informatics in Medical Physics and Vice Versa.

    PubMed

    Andriole, K

    2012-06-01

    Like Medical Physics, Imaging Informatics encompasses concepts touching every aspect of the imaging chain from image creation, acquisition, management and archival, to image processing, analysis, display and interpretation. The two disciplines are in fact quite complementary, with similar goals to improve the quality of care provided to patients using an evidence-based approach, to assure safety in the clinical and research environments, to facilitate efficiency in the workplace, and to accelerate knowledge discovery. Use-cases describing several areas of informatics activity will be given to illustrate current limitations that would benefit from medical physicist participation, and conversely areas in which informaticists may contribute to the solution. Topics to be discussed include radiation dose monitoring, process management and quality control, display technologies, business analytics techniques, and quantitative imaging. Quantitative imaging is increasingly becoming an essential part of biomedical research as well as being incorporated into clinical diagnostic activities. Referring clinicians are asking for more objective information to be gleaned from the imaging tests that they order so that they may make the best clinical management decisions for their patients. Medical Physicists may be called upon to identify existing issues as well as develop, validate and implement new approaches and technologies to help move the field further toward quantitative imaging methods for the future. Biomedical imaging informatics tools and techniques such as standards, integration, data mining, cloud computing and new systems architectures, ontologies and lexicons, data visualization and navigation tools, and business analytics applications can be used to overcome some of the existing limitations. 1. Describe what is meant by Medical Imaging Informatics and understand why the medical physicist should care. 2. Identify existing limitations in information technologies with respect to Medical Physics, and conversely see how Informatics may assist the medical physicist in filling some of the current gaps in their activities. 3. Understand general informatics concepts and areas of investigation including imaging and workflow standards, systems integration, computing architectures, ontologies, data mining and business analytics, data visualization and human-computer interface tools, and the importance of quantitative imaging for the future of Medical Physics and Imaging Informatics. 4. Become familiar with on-going efforts to address current challenges facing future research into and clinical implementation of quantitative imaging applications. © 2012 American Association of Physicists in Medicine.

  10. Integrated genome browser: visual analytics platform for genomics.

    PubMed

    Freese, Nowlan H; Norris, David C; Loraine, Ann E

    2016-07-15

    Genome browsers that support fast navigation through vast datasets and provide interactive visual analytics functions can help scientists achieve deeper insight into biological systems. Toward this end, we developed Integrated Genome Browser (IGB), a highly configurable, interactive and fast open source desktop genome browser. Here we describe multiple updates to IGB, including all-new capabilities to display and interact with data from high-throughput sequencing experiments. To demonstrate, we describe example visualizations and analyses of datasets from RNA-Seq, ChIP-Seq and bisulfite sequencing experiments. Understanding results from genome-scale experiments requires viewing the data in the context of reference genome annotations and other related datasets. To facilitate this, we enhanced IGB's ability to consume data from diverse sources, including Galaxy, Distributed Annotation and IGB-specific Quickload servers. To support future visualization needs as new genome-scale assays enter wide use, we transformed the IGB codebase into a modular, extensible platform for developers to create and deploy all-new visualizations of genomic data. IGB is open source and is freely available from http://bioviz.org/igb. Contact: aloraine@uncc.edu. © The Author 2016. Published by Oxford University Press.

  11. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Geographic Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N; Piburn, Jesse O; Sorokine, Alexandre

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings. Acknowledgment: Prepared by Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831-6285, managed by UT-Battelle, LLC for the U.S. Department of Energy under contract no. DEAC05-00OR22725. Copyright: This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  12. Damage states in laminated composite three-point bend specimens: An experimental-analytical correlation study

    NASA Technical Reports Server (NTRS)

    Starbuck, J. Michael; Guerdal, Zafer; Pindera, Marek-Jerzy; Poe, Clarence C.

    1990-01-01

    Damage states in laminated composites were studied by considering the model problem of a laminated beam subjected to three-point bending. A combination of experimental and theoretical research techniques was used to correlate the experimental results with the analytical stress distributions. The analytical solution procedure was based on the stress formulation approach of the mathematical theory of elasticity. The solution procedure is capable of calculating the ply-level stresses and beam displacements for any laminated beam of finite length using the generalized plane deformation or plane stress state assumption. Prior to conducting the experimental phase, the results from preliminary analyses were examined. Significant effects in the ply-level stress distributions were seen depending on the fiber orientation, aspect ratio, and whether a grouped or interspersed stacking sequence was used. The experimental investigation was conducted to determine the different damage modes in laminated three-point bend specimens. The test matrix consisted of three-point bend specimens of 0 deg unidirectional, cross-ply, and quasi-isotropic stacking sequences. The dependence of the damage initiation loads and ultimate failure loads was studied, and their relation to the damage susceptibility and damage tolerance of the beam configuration was discussed. Damage modes were identified by visual inspection of the damaged specimens using an optical microscope. The four fundamental damage mechanisms identified were delaminations, matrix cracking, fiber breakage, and crushing. The correlation study between the experimental and analytical results was performed for the midspan deflection, indentation, damage modes, and damage susceptibility.

  13. Some new features of Direct Analysis in Real Time mass spectrometry utilizing the desorption at an angle option.

    PubMed

    Chernetsova, Elena S; Revelsky, Alexander I; Morlock, Gertrud E

    2011-08-30

    The present study is a first step towards the unexplored capabilities of Direct Analysis in Real Time (DART) mass spectrometry (MS) arising from the possibility of desorption at an angle: scanning analysis of surfaces, including the coupling of thin-layer chromatography (TLC) with DART-MS, and more sensitive analysis due to the preliminary concentration of analytes dissolved in large volumes of liquid on glass surfaces. In order to select the most favorable conditions for DART-MS analysis, proper positioning of samples is important. Therefore, a simple and cheap technique for visualizing the impact region of the DART gas stream on a substrate was developed. A filter paper or TLC plate, previously loaded with the analyte, was immersed in a derivatization solution. On this substrate, owing to the impact of the hot DART gas, reaction of the analyte to a colored product occurred. An improved capability of detection by DART-MS for the analysis of liquids was demonstrated by applying large volumes of model solutions of coumaphos to small glass vessels and drying these solutions prior to DART-MS analysis under ambient conditions. This allowed the introduction of quantities of analyte increased by up to more than two orders of magnitude compared with conventional DART-MS analysis of liquids. Through this improved detectability, the capabilities of DART-MS in trace analysis could be strengthened. Copyright © 2011 John Wiley & Sons, Ltd.

  14. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantifying analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.

  15. Experimental and Analytic Evaluation of the Effects of Visual and Motion Simulation in SH-3 Helicopter Training. Technical Report 85-002.

    ERIC Educational Resources Information Center

    Pfeiffer, Mark G.; Scott, Paul G.

    A fly-only group (N=16) of Navy replacement pilots undergoing fleet readiness training in the SH-3 helicopter was compared with groups pre-trained on Device 2F64C with: (1) visual only (N=13); (2) no visual/no motion (N=14); and (3) visual plus motion (N=19). Groups were compared for their SH-3 helicopter performance in the transition…

  16. Liquid-to-gel transition for visual and tactile detection of biological analytes.

    PubMed

    Fedotova, Tatiana A; Kolpashchikov, Dmitry M

    2017-11-23

    So far all visual and instrument-free methods have been based on a color change. However, colorimetric assays cannot be used by blind or color-blind people. Here we introduce a liquid-to-gel transition as a general output platform. The signal output (a piece of gel) can be unambiguously distinguished from liquid both visually and by touch. This approach promises to contribute to the development of an accessible environment for visually impaired persons.

  17. A framework for inference about carnivore density from unstructured spatial sampling of scat using detector dogs

    USGS Publications Warehouse

    Thompson, Craig M.; Royle, J. Andrew; Garner, James D.

    2012-01-01

    Wildlife management often hinges upon an accurate assessment of population density. Although undeniably useful, many of the traditional approaches to density estimation such as visual counts, livetrapping, or mark–recapture suffer from a suite of methodological and analytical weaknesses. Rare, secretive, or highly mobile species exacerbate these problems through the reality of small sample sizes and movement on and off study sites. In response to these difficulties, there is growing interest in the use of non-invasive survey techniques, which provide the opportunity to collect larger samples with minimal increases in effort, as well as the application of analytical frameworks that are not reliant on large sample size arguments. One promising survey technique, the use of scat detecting dogs, offers a greatly enhanced probability of detection while at the same time generating new difficulties with respect to non-standard survey routes, variable search intensity, and the lack of a fixed survey point for characterizing non-detection. In order to account for these issues, we modified an existing spatially explicit, capture–recapture model for camera trap data to account for variable search intensity and the lack of fixed, georeferenced trap locations. We applied this modified model to a fisher (Martes pennanti) dataset from the Sierra National Forest, California, and compared the results (12.3 fishers/100 km2) to more traditional density estimates. We then evaluated model performance using simulations at 3 levels of population density. Simulation results indicated that estimates based on the posterior mode were relatively unbiased. We believe that this approach provides a flexible analytical framework for reconciling the inconsistencies between detector dog survey data and density estimation procedures.

  18. ID-Viewer: a visual analytics architecture for infectious diseases surveillance and response management in Pakistan.

    PubMed

    Ali, M A; Ahsan, Z; Amin, M; Latif, S; Ayyaz, A; Ayyaz, M N

    2016-05-01

    Globally, disease surveillance systems are playing a significant role in outbreak detection and response management for Infectious Diseases (IDs). However, in developing countries like Pakistan, epidemic outbreaks are difficult to detect due to the scarcity of public health data and the absence of automated surveillance systems. Our research is intended to formulate an integrated service-oriented visual analytics architecture for ID surveillance, identify key constituents, and set a baseline for easy reproducibility of such systems in the future. This research focuses on the development of ID-Viewer, a visual analytics decision support system for ID surveillance. It is a blend of intelligent approaches that make use of real-time streaming data from Emergency Departments (EDs) for early outbreak detection, health care resource allocation, and epidemic response management. We have developed a robust service-oriented visual analytics architecture for ID surveillance, which provides automated mechanisms for ID data acquisition, outbreak detection, and epidemic response management. Classification of chief complaints is accomplished using a dynamic classification module, which employs neural networks and fuzzy logic to categorize syndromes. Standard routines from the Centers for Disease Control and Prevention (CDC), i.e. C1-C3 (C1-mild, C2-medium and C3-ultra), and spatial scan statistics are employed for the detection of temporal and spatio-temporal disease outbreaks, respectively. Prediction of imminent disease threats is accomplished using support vector regression for early warnings and response planning. Geographical visual analytics displays are developed that allow interactive visualization of syndromic clusters, monitoring of disease spread patterns, and identification of spatio-temporal risk zones. We analysed the performance of the surveillance framework using ID data for the years 2011-2015. The dynamic syndromic classifier is able to assign chief complaints to appropriate syndromes with high classification accuracy. The outbreak detection methods are able to detect ID outbreaks at the start of the epidemic time zones. The prediction model is able to forecast the dengue trend 20 weeks ahead with a nominal normalized root mean square error of 0.29. Interactive geo-spatiotemporal displays, i.e. heat maps and choropleth maps, are shown in the respective sections. The proposed framework will set a standard and provide the necessary details for future implementation of such systems in resource-constrained regions. It will improve early detection of outbreaks attributable to natural and man-made biological threats, allow monitoring of spatio-temporal epidemic trends, and provide assurance that an outbreak has, or has not, occurred. Advanced analytics features will be beneficial for the timely formulation of health management policies, disease control activities, and efficient health care resource allocation. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
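
    The prediction step (support vector regression on recent weekly counts, scored with a normalized root mean square error) can be sketched as follows. The synthetic weekly series, lag length, and model settings in this Python fragment are invented and only illustrate the workflow, not the ID-Viewer implementation.

    # SVR forecast of a weekly disease-count series with NRMSE scoring (toy data).
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    weeks = np.arange(200)
    cases = 50 + 40 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 5, weeks.size)

    lag = 4  # predict the next week from the previous four weeks
    X = np.array([cases[i:i + lag] for i in range(len(cases) - lag)])
    y = cases[lag:]

    split = 150
    model = SVR(kernel="rbf", C=100, epsilon=1.0).fit(X[:split], y[:split])
    pred = model.predict(X[split:])

    rmse = np.sqrt(mean_squared_error(y[split:], pred))
    nrmse = rmse / (y[split:].max() - y[split:].min())
    print(f"NRMSE on held-out weeks: {nrmse:.2f}")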

  19. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  20. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  1. Serum proteins by capillary zone electrophoresis: approaches to the definition of reference values.

    PubMed

    Petrini, C; Alessio, M G; Scapellato, L; Brambilla, S; Franzini, C

    1999-10-01

    The Paragon CZE 2000 (Beckman Analytical, Milan, Italy) is an automatic dedicated capillary zone electrophoresis (CZE) system, producing a five-zone serum protein pattern with quantitative estimation of the zones. With the view of substituting this instrument for two previously used serum protein electrophoresis techniques, we planned to produce reference values for the "new" systems leading to compatible interpretation of the results. High resolution cellulose acetate electrophoresis with visual inspection and descriptive reporting (HR-CAE) and five-zone cellulose acetate electrophoresis with densitometry (CAE-D) were the previously used techniques. Serum samples (n = 167) giving "normal pattern" with HR-CAE were assayed with the CZE system, and the results were statistically assessed to yield 0.95 reference intervals. One thousand normal and pathological serum samples were then assayed with the CAE-D and the CZE techniques, and the regression equations of the CAE-D values over the CZE values for the five zones were used to transform the CAE-D reference limits into the CZE reference limits. The two sets of reference values thereby produced were in good agreement with each other and also with reference values previously reported for the CZE system. Thus, reference values for the CZE techniques permit interpretation of results coherent with the previously used techniques and reporting modes.
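    A minimal sketch of the limit-transfer idea described above, not the Paragon CZE 2000 software: regress the old-method (CAE-D) values on the new-method (CZE) values for one protein zone, then map the old reference limits onto the new scale by inverting the fitted line. The paired measurements and the reference interval here are simulated placeholders.

    ```python
    # Hypothetical sketch of transforming reference limits between two methods.
    import numpy as np

    rng = np.random.default_rng(1)
    cze = rng.normal(12.0, 2.0, 1000)                        # new-method values for one zone (%)
    cae_d = 1.5 + 0.9 * cze + rng.normal(0, 0.5, cze.size)   # paired old-method values

    b, a = np.polyfit(cze, cae_d, 1)                         # fitted line: CAE-D ~ a + b * CZE

    old_limits = (9.0, 17.0)                                 # invented CAE-D 0.95 reference interval
    new_limits = tuple((lim - a) / b for lim in old_limits)  # inverted onto the CZE scale
    print(f"CAE-D limits {old_limits} -> CZE limits ({new_limits[0]:.1f}, {new_limits[1]:.1f})")
    ```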

  2. Spin-mapping of Coal Structures with ESE and ENDOR

    DOE R&D Accomplishments Database

    Belford, R. L.; Clarkson, R. B.

    1989-12-01

    The broad goals of this project are to determine by nondestructive magnetic resonance methods chemical and physical structural characteristics of organic parts of native and treated coals. In this project period, we have begun to explore a technique which promises to enable us to follow the course of coal cleaning processes with microscopic spatial resolution. For the past five years, our laboratory has worked on extensions of the EPR technique as applied to coal to address these analytical problems. In this report we (1) describe the world's first nuclear magnetic resonance imaging results from an Illinois #6 coal and (2) transmit a manuscript describing how organic sulfur affects the very-high-frequency EPR spectra of coals. Magnetic resonance imaging (MRI) is a non-destructive technique that has found wide medical application as a means of visualizing the interior of human bodies. We have used MRI techniques to study the diffusion of an organic solvent (DMSO) into the pores of Illinois #6 coal. Proton MRI images reveal that this solvent at room temperature does not penetrate approximately 30% of the coal volume. Regions of the coal that exclude solvent could be related to inertinite and mineral components. A multi-technique imaging program is contemplated.

  3. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. To address this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axis, multi-factor stacked bar charts and interactive semi-automated filtering for input and output data together with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach with two examples from gas storage applications. For the first example our Visual Analytics approach enabled the analyst to observe how the element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. For the second example our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen. We determine that this reaction is thermodynamically favorable under a broad range of conditions. This includes low temperatures and the absence of microbial catalysts. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.
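    As a rough sketch of the kind of screening that sits behind such interactive filtering (not the authors' tool or simulator), one can rank input parameters of a model by the strength of their monotone association with an output variable. The mock "model", parameter names, and ranges below are placeholders.

    ```python
    # Hypothetical sensitivity screening: Spearman correlation of inputs vs. one output.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    n_runs = 2000
    params = {
        "pyrite_fraction":     rng.uniform(0.0, 0.1, n_runs),
        "temperature_C":       rng.uniform(20.0, 80.0, n_runs),
        "partial_pressure_H2": rng.uniform(0.1, 10.0, n_runs),
    }
    # Mock output, e.g. a dissolved-species concentration for each parameter combination.
    output = (5.0 * params["pyrite_fraction"]
              + 0.01 * params["temperature_C"]
              + 0.3 * np.log(params["partial_pressure_H2"])
              + rng.normal(0, 0.05, n_runs))

    ranking = sorted(
        ((abs(spearmanr(values, output)[0]), name) for name, values in params.items()),
        reverse=True,
    )
    for score, name in ranking:
        print(f"{name:22s} |rho| = {score:.2f}")
    ```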

  4. Analyzing a 35-Year Hourly Data Record: Why So Difficult?

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    At the Goddard Distributed Active Archive Center, we have recently added a 35-Year record of output data from the North American Land Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers a variety of data summarization and visualization to users that operate at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data has proven surprisingly resistant to application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations both at the algorithm and system level for 35 years of hourly data. Failures arose, sometimes unexpectedly, from command line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These serve as an early warning sign for the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid the issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving the user communities that try to scale up their current techniques to analyze Big Data.

  5. In-line monitoring of pellet coating thickness growth by means of visual imaging.

    PubMed

    Oman Kadunc, Nika; Sibanc, Rok; Dreu, Rok; Likar, Boštjan; Tomaževič, Dejan

    2014-08-15

    Coating thickness is the most important attribute of coated pharmaceutical pellets as it directly affects release profiles and stability of the drug. Quality control of the coating process of pharmaceutical pellets is thus of utmost importance for assuring the desired end product characteristics. A visual imaging technique is presented and examined as a process analytical technology (PAT) tool for noninvasive continuous in-line and real time monitoring of coating thickness of pharmaceutical pellets during the coating process. Images of pellets were acquired during the coating process through an observation window of a Wurster coating apparatus. Image analysis methods were developed for fast and accurate determination of pellets' coating thickness during a coating process. The accuracy of the results for pellet coating thickness growth obtained in real time was evaluated through comparison with an off-line reference method and a good agreement was found. Information about the inter-pellet coating uniformity was gained from further statistical analysis of the measured pellet size distributions. Accuracy and performance analysis of the proposed method showed that visual imaging is feasible as a PAT tool for in-line and real time monitoring of the coating process of pharmaceutical pellets. Copyright © 2014 Elsevier B.V. All rights reserved.
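    A sketch of the underlying geometry only, not the authors' image-analysis pipeline: if the core diameter and the current mean pellet diameter are known, the coating thickness is half the diameter difference, and the spread across pellets gives a simple inter-pellet uniformity measure. The numbers are invented for illustration.

    ```python
    # Hypothetical coating-thickness arithmetic from measured pellet diameters.
    import numpy as np

    rng = np.random.default_rng(3)
    core_diameter_um = 800.0
    measured_diameters_um = rng.normal(860.0, 6.0, 500)   # diameters estimated from images

    thickness_um = (measured_diameters_um - core_diameter_um) / 2.0
    print(f"mean coating thickness: {thickness_um.mean():.1f} um")
    print(f"inter-pellet CV:        {100 * thickness_um.std() / thickness_um.mean():.1f} %")
    ```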

  6. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. By using this technique the following two problems can be overcome. First, a problem may have an analytical solution, but the cost of running an experiment to solve it is high in terms of money and lives. Second, a problem exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutations to form pseudo sampling distributions that will lead to the solution of the problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, being used to verify analytical solutions in inference. This paper also discusses the resampling techniques as simulation techniques. The misunderstandings about these two techniques are examined. Successful uses of both techniques are also explained.
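    A minimal sketch of the resampling idea discussed above: build a bootstrap pseudo sampling distribution for a statistic (here the 20% trimmed mean) whose sampling distribution is awkward to derive analytically. The data are simulated and the statistic is chosen only for illustration.

    ```python
    # Bootstrap percentile confidence interval for a trimmed mean.
    import numpy as np
    from scipy.stats import trim_mean

    rng = np.random.default_rng(4)
    sample = rng.lognormal(mean=1.0, sigma=0.8, size=60)

    boot = np.array([
        trim_mean(rng.choice(sample, size=sample.size, replace=True), 0.2)
        for _ in range(5000)
    ])
    low, high = np.percentile(boot, [2.5, 97.5])
    print(f"20% trimmed mean: {trim_mean(sample, 0.2):.2f}, "
          f"95% bootstrap percentile CI: ({low:.2f}, {high:.2f})")
    ```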

  7. Visual Thinking and Gender Differences in High School Calculus

    ERIC Educational Resources Information Center

    Haciomeroglu, Erhan Selcuk; Chicken, Eric

    2012-01-01

    This study sought to examine calculus students' mathematical performances and preferences for visual or analytic thinking regarding derivative and antiderivative tasks presented graphically. It extends previous studies by investigating factors mediating calculus students' mathematical performances and their preferred modes of thinking. Data were…

  8. Visual Basic programs for spreadsheet analysis.

    PubMed

    Hunt, Bruce

    2005-01-01

    A collection of Visual Basic programs, entitled Function.xls, has been written for ground water spreadsheet calculations. This collection includes programs for calculating mathematical functions and for evaluating analytical solutions in ground water hydraulics and contaminant transport. Several spreadsheet examples are given to illustrate their use.
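    The original Function.xls collection is written in Visual Basic; as a Python sketch of the sort of analytical ground water solution such add-ins evaluate, the snippet below computes the Theis well function W(u) and the corresponding confined-aquifer drawdown. The pumping rate, transmissivity, and storativity values are arbitrary examples.

    ```python
    # Theis drawdown via the exponential integral E1 (W(u) = E1(u)).
    import numpy as np
    from scipy.special import exp1

    def theis_drawdown(r, t, Q, T, S):
        """Drawdown s = Q/(4*pi*T) * W(u), with u = r**2 * S / (4*T*t)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # Example: pumping 500 m3/day, T = 250 m2/day, S = 2e-4, observation well at 30 m.
    t_days = np.array([0.01, 0.1, 1.0, 10.0])
    print(theis_drawdown(r=30.0, t=t_days, Q=500.0, T=250.0, S=2e-4))
    ```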

  9. Report: New analytical and statistical approaches for interpreting the relationships among environmental stressors and biomarkers

    EPA Science Inventory

    The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical, and statistical perspectives (Pleil et al. 2014; Sobus et al. 2011...

  10. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and influence factors to its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  11. Investigation of physical and chemical stability of ointment with herbals.

    PubMed

    Zdoryk, Oleksandr A; Khokhlova, Kateryna O; Georgiyants, Victoriya A; Vyshnevska, Liliia I

    2014-01-01

    The physical and chemical stability of a stock preparation ointment with active ingredients-herbal tinctures of calendula and arnica-for the treatment of hemorrhoids was studied. Evaluations for physical and chemical stability were performed initially and throughout the storage period. Physical stability of the ointment was assessed by means of visual observation in normal room light. Throughout the study period, the physical appearance of the ointment did not change. The chemical stability of the ointment was evaluated by means of a stability-indicating, thin-layer chromatography analytical technique. The shelf-life was found to be one month at 25 degrees C +/- 2 degrees C/60% RH and two months at 5 degrees C +/- 3 degrees C, when protected from light.

  12. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on (rhoz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  13. Quality of the antimalarial medicine artemether - lumefantrine in eight cities of the Democratic Republic of the Congo.

    PubMed

    Mufusama, Jean-Pierre; Ioset, Karine Ndjoko; Feineis, Doris; Hoellein, Ludwig; Holzgrabe, Ulrike; Bringmann, Gerhard

    2018-06-12

    In the context of post-marketing surveillance supporting public-health authorities to take evidence-based decisions to fight the spread of poor-quality medicines, the quality of antimalarial artemether-lumefantrine (AL) medicines was assessed in the Democratic Republic of the Congo (DRC). A total of 150 samples of AL containing products was collected from private pharmaceutical outlets in eight main cities: Goma, Kikwit, Kinshasa, Kisangani, Lubumbashi, Matadi, Mbandaka, and Mbuji-Mayi. All drug samples were successively analyzed by visual inspection, thin-layer chromatography (TLC), and high-performance liquid chromatography (HPLC) following The International Pharmacopoeia. Out of the 150 collected drug samples, three (2%) failed the visual inspection as they had shelf lives different from those of other samples with the same brand name. Four samples (2.7%) did not pass the TLC test as they contained only one or even none of the two declared active pharmaceutical ingredients (APIs). HPLC assays showed that 46 (30.7%) samples had artemether contents below 90% and 17 (11.3%) above 110% of the content claimed on the label. For lumefantrine, 32 (21.7%) samples had contents below 90%, and eight (5.3%) had contents above 110%. This survey in DRC gives evidence that poor-quality antimalarial medicines are widely present. Based on three detection techniques, the study shows the necessity to equip developing countries with modern techniques such as HPLC, which, if combined with affordable techniques like TLC, could provide a pertinent analytical strategy to combat drug counterfeiting and poor manufacturing. This article is protected by copyright. All rights reserved.
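    A sketch of the content-check arithmetic behind the HPLC figures above: express each assayed content as a percentage of the declared (label) amount and flag samples outside the 90-110% window. The sample identifiers and assay values are invented; only the 20/120 mg label strengths and the 90-110% criterion follow the abstract's framing.

    ```python
    # Hypothetical percent-of-label-claim check for artemether-lumefantrine tablets.
    label_mg = {"artemether": 20.0, "lumefantrine": 120.0}
    assayed_mg = [  # (sample id, artemether mg/tablet, lumefantrine mg/tablet)
        ("DRC-001", 17.2, 118.5),
        ("DRC-002", 20.4, 103.9),
        ("DRC-003", 23.1, 135.0),
    ]

    for sample_id, art, lum in assayed_mg:
        for api, value in (("artemether", art), ("lumefantrine", lum)):
            pct = 100.0 * value / label_mg[api]
            verdict = "PASS" if 90.0 <= pct <= 110.0 else "FAIL"
            print(f"{sample_id} {api:12s} {pct:6.1f}% of label  {verdict}")
    ```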

  14. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  15. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  16. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    NASA Astrophysics Data System (ADS)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environment (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification, and (3) thermal optical reflectance (TOR) organic carbon (OC) and elemental carbon (EC) predictions. The discrepancy rate for a four-cluster solution is 10 %. For all functional groups but carboxylic COH the discrepancy is ≤ 10 %. Performance metrics obtained from TOR OC and EC predictions (R2 ≥ 0.94 %, bias ≤ 0.01 µg m-3, and error ≤ 0.04 µg m-3) are on a par with those obtained from uncorrected and PB-corrected spectra. The proposed protocol leads to visually and analytically similar estimates as those generated by the polynomial method. More importantly, the automated solution allows us and future users to evaluate its analytical reproducibility while minimizing reducible user bias. We anticipate the protocol will enable FT-IR researchers and data analysts to quickly and reliably analyze a large amount of data and connect them to a variety of available statistical learning methods to be applied to analyte absorbances isolated in atmospheric aerosol samples.
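    A rough sketch of the baseline idea described above, not the authors' full protocol: fit a smoothing spline only to wavenumber regions treated as PTFE background and use it to predict, then subtract, the baseline under an analyte region. The synthetic spectrum, subregion boundaries, and smoothing factor below are all placeholders.

    ```python
    # Hypothetical smoothing-spline baseline correction of a synthetic FT-IR spectrum.
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    wavenumber = np.linspace(4000, 1500, 2000)
    baseline_true = 0.2 + 1e-8 * (wavenumber - 2600) ** 2          # smooth PTFE-like drift
    peak = 0.15 * np.exp(-0.5 * ((wavenumber - 2920) / 15) ** 2)   # an analyte band
    spectrum = baseline_true + peak + np.random.default_rng(5).normal(0, 0.002, wavenumber.size)

    analyte = (wavenumber > 2800) & (wavenumber < 3000)            # region to protect
    background = ~analyte

    spline = UnivariateSpline(wavenumber[background][::-1],        # x must be increasing
                              spectrum[background][::-1], s=0.5)
    corrected = spectrum - spline(wavenumber)
    print(f"max corrected absorbance in analyte region: {corrected[analyte].max():.3f}")
    ```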

  17. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    NASA Astrophysics Data System (ADS)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing spatiotemporal distribution patterns and its dynamics of different industries can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis process is challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support the visual analytics of big data on the fly. Due to this reason, this research proposes a web-based framework to enable such a visual analytics requirement. The framework uses standard deviational ellipse (SDE) and shifting route of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is paralleled using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015 that contains fine-grain location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of this framework. The experiment result shows that the developed visual analytics method is helpful to understand the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any nature and social spatiotemporal point process with large data volume, such as crime and disease.
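    The sketch below illustrates the two summary statistics used in the framework above, without the Spark parallelization: the gravity center of a set of point locations and an ellipse describing their directional spread, here obtained from the eigen-decomposition of the coordinate covariance matrix (one common way to compute a standard-deviational-ellipse-style summary). The coordinates are random placeholders, not the enterprise registration dataset.

    ```python
    # Hypothetical gravity center and dispersion ellipse for point locations.
    import numpy as np

    rng = np.random.default_rng(6)
    xy = rng.multivariate_normal(mean=[114.3, 30.6],             # lon, lat
                                 cov=[[0.8, 0.3], [0.3, 0.2]], size=10000)

    center = xy.mean(axis=0)                                     # gravity center
    cov = np.cov(xy, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)

    semi_axes = np.sqrt(eigvals)                                 # one-sigma semi-axes
    angle_deg = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))  # major-axis orientation
    print(f"center: {center}, semi-axes: {semi_axes}, orientation: {angle_deg:.1f} deg")
    ```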

  18. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  19. Teaching Science through Pictorial Models during Read-Alouds

    ERIC Educational Resources Information Center

    Oliveira, Alandeom W.; Rivera, Seema; Glass, Rory; Mastroianni, Michael; Wizner, Francine; Amodeo, Vincent

    2013-01-01

    This study examines how three elementary teachers refer to pictorial models (photographs, drawings, and cartoons) during science read-alouds. While one teacher used realistic photographs for the purpose of visually verifying facts about crystals, another employed analytical diagrams as heuristic tools to help students visualize complex target…

  20. 75 FR 53262 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... a new Privacy Act system of records, JUSTICE/FBI- 021, the Data Integration and Visualization System... Act system of records, the Data Integration and Visualization System (DIVS), Justice/FBI-021. The... investigative mission by enabling access, search, integration, and analytics across multiple existing databases...

  1. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    EPA Science Inventory

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...

  2. Reimagining Khan Analytics for Student Coaches

    ERIC Educational Resources Information Center

    Cunningham, Jim

    2015-01-01

    In this paper, I describe preliminary work on a new research project in learning analytics at Arizona State University. In conjunction with an innovative remedial mathematics course using Khan Academy and student coaches, this study seeks to measure the effectiveness of visualized data in assisting student coaches as they help remedial math…

  3. Instrumentation: Photodiode Array Detectors in UV-VIS Spectroscopy. Part II.

    ERIC Educational Resources Information Center

    Jones, Dianna G.

    1985-01-01

    A previous part (Analytical Chemistry; v57 n9 p1057A) discussed the theoretical aspects of diode array ultraviolet-visible (UV-VIS) spectroscopy. This part describes the applications of diode arrays in analytical chemistry, also considering spectroelectrochemistry, high performance liquid chromatography (HPLC), HPLC data processing, stopped flow, and…

  4. The Evolution of Three Dimensional Visualization for Commanding the Mars Rovers

    NASA Technical Reports Server (NTRS)

    Hartman, Frank R.; Wright, John; Cooper, Brian

    2014-01-01

    NASA's Jet Propulsion Laboratory has built and operated four rovers on the surface of Mars. Two and three dimensional visualization has been extensively employed to command both the mobility and robotic arm operations of these rovers. Stereo visualization has been an important component in this set of visualization techniques. This paper discusses the progression of the implementation and use of visualization techniques for in-situ operations of these robotic missions. Illustrative examples will be drawn from the results of using these techniques over more than ten years of surface operations on Mars.

  5. Curriculum Mapping with Academic Analytics in Medical and Healthcare Education.

    PubMed

    Komenda, Martin; Víta, Martin; Vaitsis, Christos; Schwarz, Daniel; Pokorná, Andrea; Zary, Nabil; Dušek, Ladislav

    2015-01-01

    No universal solution, based on an approved pedagogical approach, exists to parametrically describe, effectively manage, and clearly visualize a higher education institution's curriculum, including tools for unveiling relationships inside curricular datasets. We aim to solve the issue of medical curriculum mapping to improve understanding of the complex structure and content of medical education programs. Our effort is based on the long-term development and implementation of an original web-based platform, which supports an outcomes-based approach to medical and healthcare education and is suitable for repeated updates and adoption to curriculum innovations. We adopted data exploration and visualization approaches in the context of medical curriculum innovations in higher education institutions domain. We have developed a robust platform, covering detailed formal metadata specifications down to the level of learning units, interconnections, and learning outcomes, in accordance with Bloom's taxonomy and direct links to a particular biomedical nomenclature. Furthermore, we used selected modeling techniques and data mining methods to generate academic analytics reports from medical curriculum mapping datasets. We present a solution that allows users to effectively optimize a curriculum structure that is described with appropriate metadata, such as course attributes, learning units and outcomes, a standardized vocabulary nomenclature, and a tree structure of essential terms. We present a case study implementation that includes effective support for curriculum reengineering efforts of academics through a comprehensive overview of the General Medicine study program. Moreover, we introduce deep content analysis of a dataset that was captured with the use of the curriculum mapping platform; this may assist in detecting any potentially problematic areas, and hence it may help to construct a comprehensive overview for the subsequent global in-depth medical curriculum inspection. We have proposed, developed, and implemented an original framework for medical and healthcare curriculum innovations and harmonization, including: planning model, mapping model, and selected academic analytics extracted with the use of data mining.

  6. Curriculum Mapping with Academic Analytics in Medical and Healthcare Education

    PubMed Central

    Komenda, Martin; Víta, Martin; Vaitsis, Christos; Schwarz, Daniel; Pokorná, Andrea; Zary, Nabil; Dušek, Ladislav

    2015-01-01

    Background No universal solution, based on an approved pedagogical approach, exists to parametrically describe, effectively manage, and clearly visualize a higher education institution’s curriculum, including tools for unveiling relationships inside curricular datasets. Objective We aim to solve the issue of medical curriculum mapping to improve understanding of the complex structure and content of medical education programs. Our effort is based on the long-term development and implementation of an original web-based platform, which supports an outcomes-based approach to medical and healthcare education and is suitable for repeated updates and adoption to curriculum innovations. Methods We adopted data exploration and visualization approaches in the context of medical curriculum innovations in higher education institutions domain. We have developed a robust platform, covering detailed formal metadata specifications down to the level of learning units, interconnections, and learning outcomes, in accordance with Bloom’s taxonomy and direct links to a particular biomedical nomenclature. Furthermore, we used selected modeling techniques and data mining methods to generate academic analytics reports from medical curriculum mapping datasets. Results We present a solution that allows users to effectively optimize a curriculum structure that is described with appropriate metadata, such as course attributes, learning units and outcomes, a standardized vocabulary nomenclature, and a tree structure of essential terms. We present a case study implementation that includes effective support for curriculum reengineering efforts of academics through a comprehensive overview of the General Medicine study program. Moreover, we introduce deep content analysis of a dataset that was captured with the use of the curriculum mapping platform; this may assist in detecting any potentially problematic areas, and hence it may help to construct a comprehensive overview for the subsequent global in-depth medical curriculum inspection. Conclusions We have proposed, developed, and implemented an original framework for medical and healthcare curriculum innovations and harmonization, including: planning model, mapping model, and selected academic analytics extracted with the use of data mining. PMID:26624281

  7. Emergent 1d Ising Behavior in AN Elementary Cellular Automaton Model

    NASA Astrophysics Data System (ADS)

    Kassebaum, Paul G.; Iannacchione, Germano S.

    The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The emergent CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates, interacting with simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced through difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble visualizing the evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable (in dimension, complexity, and size) and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
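    For reference, the snippet below is a compact Metropolis simulation of the 1D Ising chain with an external field, i.e. the textbook dynamics whose behavior such CA models are compared against; it is not the authors' CA update rule, and the coupling, field, and temperature values are arbitrary.

    ```python
    # Standard Metropolis dynamics for a 1D Ising chain with field h.
    import numpy as np

    rng = np.random.default_rng(7)
    N, J, h, T, sweeps = 200, 1.0, 0.1, 1.5, 2000
    spins = rng.choice([-1, 1], size=N)

    for _ in range(sweeps * N):
        i = rng.integers(N)
        # Energy change of flipping spin i with periodic nearest neighbors and field h.
        dE = 2 * spins[i] * (J * (spins[(i - 1) % N] + spins[(i + 1) % N]) + h)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i] = -spins[i]

    print(f"magnetization per spin after {sweeps} sweeps: {spins.mean():.3f}")
    ```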

  8. Observation of the development of secondary features in a Richtmyer–Meshkov instability driven flow

    DOE PAGES

    Bernard, Tennille; Truman, C. Randall; Vorobieff, Peter; ...

    2014-09-10

    Richtmyer–Meshkov instability (RMI) has long been the subject of interest for analytical, numerical, and experimental studies. In comparing results of experiment with numerics, it is important to understand the limitations of experimental techniques inherent in the chosen method(s) of data acquisition. We discuss results of an experiment where a laminar, gravity-driven column of heavy gas is injected into surrounding light gas and accelerated by a planar shock. A popular and well-studied method of flow visualization (using glycol droplet tracers) does not produce a flow pattern that matches the numerical model of the same conditions, while revealing the primary feature of the flow developing after shock acceleration: the pair of counter-rotating vortex columns. However, visualization using fluorescent gaseous tracer confirms the presence of features suggested by the numerics; in particular, a central spike formed due to shock focusing in the heavy-gas column. Furthermore, the streamwise growth rate of the spike appears to exhibit the same scaling with Mach number as that of the counter-rotating vortex pair (CRVP).

  9. Chemical compounds and mechanisms involved in the formation and stabilization of foam in sparkling wines.

    PubMed

    Kemp, Belinda; Condé, Bruna; Jégou, Sandrine; Howell, Kate; Vasserot, Yann; Marchal, Richard

    2018-02-08

    The visual properties of sparkling wine including foam and bubbles are an indicator of sparkling wine quality. Foam properties, particularly foam height (FH) and foam stability (TS), are significantly influenced by the chemical composition of the wine. This review investigates our current knowledge of specific chemical compounds and the mechanisms by which they influence the foam properties of sparkling wines. Grape and yeast proteins, amino acids, polysaccharides, phenolic compounds, organic acids, fatty acids, ethanol and sugar are examined with respect to their contribution to foam characteristics in sparkling wines made with the Traditional, Transfer, Charmat, and carbonation methods. Contradictory results have been identified that appear to be due to the analytical methods used to measure and quantify compounds and foam. Biopolymer complexes are discussed, and knowledge gaps with regard to thaumatin-like proteins (TLPs), polysaccharides, amino acids, oak-derived phenolic compounds and organic acids are identified. Future research is also likely to concentrate on visual analysis of sparkling wines by in-depth imaging analysis and specific sensory analysis techniques.

  10. Situation exploration in a persistent surveillance system with multidimensional data

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.

    2013-03-01

    There is an emerging need for fusing hard and soft sensor data in an efficient surveillance system to provide accurate estimation of situation awareness. These mostly abstract, multi-dimensional and multi-sensor data pose a great challenge to the user in performing analysis of multi-threaded events efficiently and cohesively. To address this concern an interactive Visual Analytics (VA) application is developed for rapid assessment and evaluation of different hypotheses based on a context-sensitive ontology spawned from taxonomies describing human/human and human/vehicle/object interactions. A methodology is described here for generating relevant ontologies in a Persistent Surveillance System (PSS), and it is demonstrated how they can be utilized in the context of PSS to track and identify group activities pertaining to potential threats. The proposed VA system allows for visual analysis of raw data as well as metadata that have spatiotemporal representation and content-based implications. Additionally in this paper, a technique for rapid search of tagged information contingent on ranking and confidence is explained for analysis of multi-dimensional data. Lastly the issue of uncertainty associated with processing and interpretation of heterogeneous data is also addressed.

  11. Evaluation of unconfined-aquifer parameters from pumping test data by nonlinear least squares

    NASA Astrophysics Data System (ADS)

    Heidari, Manoutchehr; Moench, Allen

    1997-05-01

    Nonlinear least squares (NLS) with automatic differentiation was used to estimate aquifer parameters from drawdown data obtained from published pumping tests conducted in homogeneous, water-table aquifers. The method is based on a technique that seeks to minimize the squares of residuals between observed and calculated drawdown subject to bounds that are placed on the parameter of interest. The analytical model developed by Neuman for flow to a partially penetrating well of infinitesimal diameter situated in an infinite, homogeneous and anisotropic aquifer was used to obtain calculated drawdown. NLS was first applied to synthetic drawdown data from a hypothetical but realistic aquifer to demonstrate that the relevant hydraulic parameters (storativity, specific yield, and horizontal and vertical hydraulic conductivity) can be evaluated accurately. Next the method was used to estimate the parameters at three field sites with widely varying hydraulic properties. NLS produced unbiased estimates of the aquifer parameters that are close to the estimates obtained with the same data using a visual curve-matching approach. Small differences in the estimates are a consequence of subjective interpretation introduced in the visual approach.
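    A mechanics-only sketch of the bounded nonlinear least-squares step, not the authors' implementation: fit aquifer parameters to drawdown-versus-time data. The paper uses Neuman's partially penetrating unconfined-aquifer model; for brevity this sketch substitutes the simpler Theis solution, and the "observed" data are synthetic.

    ```python
    # Hypothetical bounded least-squares fit of T and S to synthetic drawdown data.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import exp1

    r, Q = 30.0, 500.0                     # observation distance (m), pumping rate (m3/day)

    def drawdown(t, T, S):
        return Q / (4.0 * np.pi * T) * exp1(r**2 * S / (4.0 * T * t))

    t_obs = np.logspace(-2, 1, 25)         # days
    rng = np.random.default_rng(8)
    s_obs = drawdown(t_obs, 250.0, 2e-4) + rng.normal(0, 0.01, t_obs.size)

    popt, _ = curve_fit(drawdown, t_obs, s_obs, p0=(100.0, 1e-3),
                        bounds=([1.0, 1e-6], [1e4, 1e-1]))
    print(f"estimated T = {popt[0]:.0f} m2/day, S = {popt[1]:.1e}")
    ```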

  12. Evaluation of unconfined-aquifer parameters from pumping test data by nonlinear least squares

    USGS Publications Warehouse

    Heidari, M.; Moench, A.

    1997-01-01

    Nonlinear least squares (NLS) with automatic differentiation was used to estimate aquifer parameters from drawdown data obtained from published pumping tests conducted in homogeneous, water-table aquifers. The method is based on a technique that seeks to minimize the squares of residuals between observed and calculated drawdown subject to bounds that are placed on the parameter of interest. The analytical model developed by Neuman for flow to a partially penetrating well of infinitesimal diameter situated in an infinite, homogeneous and anisotropic aquifer was used to obtain calculated drawdown. NLS was first applied to synthetic drawdown data from a hypothetical but realistic aquifer to demonstrate that the relevant hydraulic parameters (storativity, specific yield, and horizontal and vertical hydraulic conductivity) can be evaluated accurately. Next the method was used to estimate the parameters at three field sites with widely varying hydraulic properties. NLS produced unbiased estimates of the aquifer parameters that are close to the estimates obtained with the same data using a visual curve-matching approach. Small differences in the estimates are a consequence of subjective interpretation introduced in the visual approach.

  13. Visual detection of Brucella in bovine biological samples using DNA-activated gold nanoparticles

    PubMed Central

    Kumar, Satish; Kaur, Gurpreet; Ali, Syed Atif; Shrivastava, Sameer; Gupta, Praveen K.; Cooper, Jonathan M.; Chaudhuri, Pallab

    2017-01-01

    Brucellosis is a bacterial disease, which, although affecting cattle primarily, has been associated with human infections, making its detection an important challenge. The existing gold standard diagnosis relies on the culture of bacteria which is a lengthy and costly process, taking up to 45 days. New technologies based on molecular diagnosis have been proposed, either through dip-stick, immunological assays, which have limited specificity, or using nucleic acid tests, which enable to identify the pathogen, but are impractical for use in the field, where most of the reservoir cases are located. Here we demonstrate a new test based on hybridization assays with metal nanoparticles, which, upon detection of a specific pathogen-derived DNA sequence, yield a visual colour change. We characterise the components used in the assay with a range of analytical techniques and show sensitivities down to 1000 cfu/ml for the detection of Brucella. Finally, we demonstrate that the assay works in a range of bovine samples including semen, milk and urine, opening up the potential for its use in the field, in low-resource settings. PMID:28719613

  14. Multimodal Perception and Multicriterion Control of Nested Systems. 3; A Functional Visual Assessment Test for Human Health Maintenance and Countermeasure Evaluation

    NASA Technical Reports Server (NTRS)

    Riccio, Gary E.; McDonald, P. Vernon; Bloomberg, Jacob

    1999-01-01

    Our theoretical and empirical research on the whole-body coordination during locomotion led to a Phase 1 SBIR grant from NASA JSC. The purpose of the SBIR grant was to design an innovative system for evaluating eye-head-trunk coordination during whole-body perturbations that are characteristic of locomotion. The approach we used to satisfy the Phase 1 objectives was based on a structured methodology for the development of human-systems technology. Accordingly the project was broken down into a number of tasks and subtasks. In sequence, the major tasks were: (1) identify needs for functional assessment of visual acuity under conditions involving whole-body perturbation within the NASA Space Medical Monitoring and Countermeasures (SMMaC) program and in other related markets; (2) analyze the needs into the causes and symptoms of impaired visual acuity under conditions involving whole-body perturbation; (3) translate the analyzed needs into technology requirements for the Functional Visual Assessment Test (FVAT); (4) identify candidate technology solutions and implementations of FVAT; and (5) prioritize and select technology solutions. The work conducted in these tasks is described in this final volume of the series on Multimodal Perception and Multicriterion Control of Nested Systems. While prior volumes (1 and 2) in the series focus on theoretical foundations and novel data-analytic techniques, this volume addresses technology that is necessary for minimally intrusive data collection and near-real-time data analysis and display.

  15. Analytic modeling of aerosol size distributions

    NASA Technical Reports Server (NTRS)

    Deepack, A.; Box, G. P.

    1979-01-01

    Mathematical functions commonly used for representing aerosol size distributions are studied parametrically. Methods for obtaining best fit estimates of the parameters are described. A catalog of graphical plots depicting the parametric behavior of the functions is presented along with procedures for obtaining analytical representations of size distribution data by visual matching of the data with one of the plots. Examples of fitting the same data with equal accuracy by more than one analytic model are also given.
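    A small sketch of the parametric-fitting idea above: represent an aerosol number size distribution with a single lognormal mode and estimate its parameters by least squares, a common alternative to the visual plot-matching procedure described. The "measured" distribution below is synthetic.

    ```python
    # Hypothetical lognormal fit to a synthetic aerosol size distribution.
    import numpy as np
    from scipy.optimize import curve_fit

    def lognormal_dNdlnD(D, N_total, D_median, sigma_g):
        """dN/dlnD for a single lognormal mode (D and D_median in micrometres)."""
        return (N_total / (np.sqrt(2.0 * np.pi) * np.log(sigma_g))
                * np.exp(-0.5 * (np.log(D / D_median) / np.log(sigma_g)) ** 2))

    D = np.logspace(-2, 1, 60)                       # 0.01-10 um
    rng = np.random.default_rng(9)
    measured = lognormal_dNdlnD(D, 1500.0, 0.15, 1.8) * rng.lognormal(0, 0.05, D.size)

    popt, _ = curve_fit(lognormal_dNdlnD, D, measured, p0=(1000.0, 0.1, 2.0))
    print(f"N = {popt[0]:.0f} cm-3, median D = {popt[1]:.2f} um, sigma_g = {popt[2]:.2f}")
    ```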

  16. Seeking Information with an Information Visualization System: A Study of Cognitive Styles

    ERIC Educational Resources Information Center

    Yuan, Xiaojun; Zhang, Xiangman; Chen, Chaomei; Avery, Joshua M.

    2011-01-01

    Introduction: This study investigated the effect of cognitive styles on users' information-seeking task performance using a knowledge domain information visualization system called CiteSpace. Method: Sixteen graduate students participated in a user experiment. Each completed an extended cognitive style analysis wholistic-analytic test (the…

  17. Improving Student Performance Using Nudge Analytics

    ERIC Educational Resources Information Center

    Feild, Jacqueline

    2015-01-01

    Providing students with continuous and personalized feedback on their performance is an important part of encouraging self regulated learning. As part of our higher education platform, we built a set of data visualizations to provide feedback to students on their assignment performance. These visualizations give students information about how they…

  18. Visualizing the Solute Vaporization Interference in Flame Atomic Absorption Spectroscopy

    ERIC Educational Resources Information Center

    Dockery, Christopher R.; Blew, Michael J.; Goode, Scott R.

    2008-01-01

    Every day, tens of thousands of chemists use analytical atomic spectroscopy in their work, often without knowledge of possible interferences. We present a unique approach to study these interferences by using modern response surface methods to visualize an interference in which aluminum depresses the calcium atomic absorption signal. Calcium…

  19. Innovative Didactic Designs: Visual Analytics and Visual Literacy in School

    ERIC Educational Resources Information Center

    Stenliden, Linnéa; Nissen, Jörgen; Bodén, Ulrika

    2017-01-01

    In a world of massively mediated information and communication, students must learn to handle rapidly growing information volumes inside and outside school. Pedagogy attuned to processing this growing production and communication of information is needed. However, ordinary educational models often fail to support students, trialing neither…

  20. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction techniques find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument and the isolation of analytes from the sample matrix. In this work the information about novel methodological and instrumental solutions in relation to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE) is presented, including practical applications of these techniques and a critical discussion about their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during the sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. The identification of van Hiele level students on the topic of space analytic geometry

    NASA Astrophysics Data System (ADS)

    Yudianto, E.; Sunardi; Sugiarti, T.; Susanto; Suharto; Trapsilasiwi, D.

    2018-03-01

    Geometry topics are still considered difficult by most students. Therefore, this study focused on identifying students' van Hiele levels. The tasks used resulted from the development of questions related to the analytic geometry of space. The results for the 78 students who worked on these questions were: 11.54% (nine students) classified at the visual level; 5.13% (four students) at the analysis level; 1.28% (one student) at the informal deduction level; 2.56% (two students) at the deduction level; 2.56% (two students) at the rigor level; and 76.93% (sixty students) classified at the pre-visualization level.

  2. T.Rex Visual Analytics for Transactional Exploration

    ScienceCinema

    None

    2018-01-16

    T.Rex is PNNL's visual analytics tool that specializes in tabular structured data, like you might open with Excel. It's a client-server application, allowing the server to do a lot of the heavy lifting and the client to open spreadsheets with millions of rows. With datasets of that size, especially if you're unfamiliar with the contents, it's very hard to get a good grasp of what's in it using traditional tools. With T.Rex, the multiple views allow you to see categorical, temporal, numerical, relational, and summary data. The interactivity lets you look across your data and see how things relate to each other.

  3. T.Rex Visual Analytics for Transactional Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-07-01

    T.Rex is PNNL's visual analytics tool that specializes in tabular structured data, like you might open with Excel. It's a client-server application, allowing the server to do a lot of the heavy lifting and the client to open spreadsheets with millions of rows. With datasets of that size, especially if you're unfamiliar with the contents, it's very hard to get a good grasp of what's in it using traditional tools. With T.Rex, the multiple views allow you to see categorical, temporal, numerical, relational, and summary data. The interactivity lets you look across your data and see how things relate to each other.

  4. AUVA - Augmented Reality Empowers Visual Analytics to explore Medical Curriculum Data.

    PubMed

    Nifakos, Sokratis; Vaitsis, Christos; Zary, Nabil

    2015-01-01

    Medical curriculum data play a key role in the structure and the organization of medical programs in Universities around the world. The effective processing and usage of these data may improve the educational environment of medical students. As a consequence, the new generation of health professionals would have improved skills compared with previous ones. This study introduces the process of enhancing curriculum data by the use of augmented reality technology as a management and presentation tool. The final goal is to enrich the information presented from a visual analytics approach applied to medical curriculum data while keeping these data easy to understand.

  5. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, using the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which uses the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as temperature, turbulence, and analyte concentration, the flow-through system for generating standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of passive sampling in the flow-through system showed that the effect of the environmental variables was successfully compensated by the kinetic calibration technique and that all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also demonstrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the adoption of the kinetic calibration method and will also be useful for other microextraction techniques.
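
    As a rough illustration of kinetic calibration, the sketch below applies the standard isotropy relation (n/ne + Q/q0 = 1) to estimate a sample concentration from the desorbed fraction of a preloaded calibrant. It is a generic sketch of the underlying relation, not the paper's one-calibrant workflow, and all symbol values (n, Q, q0, Kfs, Vf) are hypothetical.

```python
# Hedged sketch of SPME kinetic calibration (generic relation, not the paper's
# specific one-calibrant workflow). Symbols follow the usual notation:
#   q0  - amount of standard preloaded on the fiber
#   Q   - amount of standard remaining after sampling time t
#   n   - amount of analyte extracted during the same exposure
#   Kfs - fiber/sample distribution coefficient of the analyte (assumed known)
#   Vf  - fiber coating volume
# Isotropy of absorption and desorption gives n/ne + Q/q0 = 1, so the equilibrium
# amount ne, and from it the sample concentration Cs, can be estimated without
# waiting for equilibrium.

def kinetic_calibration_concentration(n, Q, q0, Kfs, Vf):
    desorbed_fraction = 1.0 - Q / q0           # fraction of calibrant lost
    if desorbed_fraction <= 0:
        raise ValueError("no measurable desorption; cannot calibrate")
    ne = n / desorbed_fraction                 # extrapolated equilibrium amount
    return ne / (Kfs * Vf)                     # Cs = ne / (Kfs * Vf)

# Example with hypothetical numbers (amounts in ng, volume in mL):
print(kinetic_calibration_concentration(n=5.0, Q=40.0, q0=50.0, Kfs=1.2e4, Vf=6.6e-4))
```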

  6. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.
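
    To make the idea of scoring and cluster extraction concrete, here is a minimal sketch that scores hypothetical memory-access samples by how far their latency exceeds a nominal cache-hit cost and aggregates the scores per source line; the sample format, threshold, and weighting are assumptions for illustration, not MemAxes' actual scoring.

```python
from collections import defaultdict

# Hypothetical memory-access samples: (source_line, latency_cycles).
samples = [("kernel.c:42", 300), ("kernel.c:42", 12), ("kernel.c:57", 180),
           ("util.c:10", 8), ("kernel.c:57", 220), ("kernel.c:42", 310)]

NOMINAL_HIT_LATENCY = 10  # assumed cache-hit cost in cycles (illustrative only)

def score(latency):
    # Penalize accesses that are much slower than a cache hit.
    return max(0.0, (latency - NOMINAL_HIT_LATENCY) / NOMINAL_HIT_LATENCY)

per_line = defaultdict(float)
for line, latency in samples:
    per_line[line] += score(latency)

# Lines with the highest aggregate score are the "clusters of interest".
for line, s in sorted(per_line.items(), key=lambda kv: -kv[1]):
    print(f"{line:12s} score={s:7.1f}")
```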

  7. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    PubMed

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation, for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics (VMD), a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and offers two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students who are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.

  8. The role of the putamen in language: a meta-analytic connectivity modeling study.

    PubMed

    Viñas-Guasch, Nestor; Wu, Yan Jing

    2017-12-01

    The putamen is a subcortical structure that forms part of the dorsal striatum of the basal ganglia, and has traditionally been associated with reinforcement learning and motor control, including speech articulation. However, recent studies have shown involvement of the left putamen in other language functions, such as bilingual language processing (Abutalebi et al. 2012) and production, with some authors arguing for functional segregation of the anterior and posterior putamen (Oberhuber et al. 2013). A further step in exploring the role of the putamen in language would involve identifying the network of coactivations of not only the left, but also the right putamen, given the involvement of the right hemisphere in higher-order language functions (Vigneau et al. 2011). Here, a meta-analytic connectivity modeling technique was used to determine the patterns of coactivation of anterior and bilateral putamen in the language domain. Based on previous evidence, we hypothesized that left putamen coactivations would include brain regions directly associated with language processing, whereas right putamen coactivations would encompass regions involved in broader semantic processes, such as memory and visual imagery. The results showed that the left anterior putamen coactivated with clusters predominantly in the left hemisphere, encompassing regions directly associated with language processing, whereas a left posterior putamen network spanned both hemispheres and the cerebellum. For the right putamen, coactivations were found in both hemispheres, in regions associated with visual and orthographic processing. These results confirm the differential involvement of the right and left putamen in different language components, thus highlighting the need for further research into the role of the putamen in language.

  9. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS formats (on-line and off-line), sorbents, experimental protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  10. A survey of visualization systems for network security.

    PubMed

    Shiravi, Hadi; Shiravi, Ali; Ghorbani, Ali A

    2012-08-01

    Security Visualization is a very young term. It expresses the idea that common visualization techniques have been designed for use cases that are not supportive of security-related data, demanding novel techniques fine-tuned for the purpose of thorough analysis. A significant amount of work has been published in this area, but little work has been done to study this emerging visualization discipline. We offer a comprehensive review of network security visualization and provide a taxonomy in the form of five use-case classes encompassing nearly all recent works in this area. We outline the incorporated visualization techniques and data sources and provide an informative table to display our findings. From the analysis of these systems, we examine issues and concerns regarding network security visualization and provide guidelines and directions for future researchers and visual system developers.

  11. The Generation of Novel MR Imaging Techniques to Visualize Inflammatory/Degenerative Mechanisms and the Correlation of MR Data with 3D Microscopic Changes

    DTIC Science & Technology

    2013-09-01

    ... existing MR scanning systems, providing the ability to visualize structures that are impossible with current methods. Using techniques to concurrently stain... a unique system for analysis of affected brain regions, coupled with other imaging techniques and molecular measurements, holds significant...

  12. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  13. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  14. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.
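
    For readers unfamiliar with how such stability parameters are used, the sketch below shows the generic final step of a linearized analysis: assemble the coefficients into a state matrix and declare the system dynamically stable when every eigenvalue has a negative real part. The matrix entries are invented for illustration and are not the balloon's actual derivatives.

```python
import numpy as np

# Hypothetical linearized system x' = A x built from stability derivatives
# (values are illustrative only, not the 7.64-m balloon's actual coefficients).
A = np.array([[-0.52,  0.10,  0.00],
              [-1.30, -0.75,  9.81],
              [ 0.00, -0.04, -0.20]])

eigvals = np.linalg.eigvals(A)
print("eigenvalues:", eigvals)
print("dynamically stable:", bool(np.all(eigvals.real < 0)))
```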

  15. Predicting the Development of Analytical and Creative Abilities in Upper Elementary Grades

    ERIC Educational Resources Information Center

    Gubbels, Joyce; Segers, Eliane; Verhoeven, Ludo

    2017-01-01

    In some models, intelligence has been described as a multidimensional construct comprising both analytical and creative abilities. In addition, intelligence is considered to be dynamic rather than static. A structural equation model was used to examine the predictive role of cognitive (visual short-term memory, verbal short-term memory, selective…

  16. Evidence for Holistic Representations of Ignored Images and Analytic Representations of Attended Images

    ERIC Educational Resources Information Center

    Thoma, Volker; Hummel, John E.; Davidoff, Jules

    2004-01-01

    According to the hybrid theory of object recognition (J. E. Hummel, 2001), ignored object images are represented holistically, and attended images are represented both holistically and analytically. This account correctly predicts patterns of visual priming as a function of translation, scale (B. J. Stankiewicz & J. E. Hummel, 2002), and…

  17. Large High Resolution Displays for Co-Located Collaborative Sensemaking: Display Usage and Territoriality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradel, Lauren; Endert, Alexander; Koch, Kristen

    2013-08-01

    Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. From an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.

  18. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that visualization is a precursor to thorough quantitative analysis and to a subsequent physical understanding of these flow fields.

  19. Use of multiple colorimetric indicators for paper-based microfluidic devices.

    PubMed

    Dungchai, Wijitar; Chailapakul, Orawon; Henry, Charles S

    2010-08-03

    We report here the use of multiple indicators for a single analyte for paper-based microfluidic devices (microPAD) in an effort to improve the ability to visually discriminate between analyte concentrations. In existing microPADs, a single dye system is used for the measurement of a single analyte. In our approach, devices are designed to simultaneously quantify analytes using multiple indicators for each analyte improving the accuracy of the assay. The use of multiple indicators for a single analyte allows for different indicator colors to be generated at different analyte concentration ranges as well as increasing the ability to better visually discriminate colors. The principle of our devices is based on the oxidation of indicators by hydrogen peroxide produced by oxidase enzymes specific for each analyte. Each indicator reacts at different peroxide concentrations and therefore analyte concentrations, giving an extended range of operation. To demonstrate the utility of our approach, the mixture of 4-aminoantipyrine and 3,5-dichloro-2-hydroxy-benzenesulfonic acid, o-dianisidine dihydrochloride, potassium iodide, acid black, and acid yellow were chosen as the indicators for simultaneous semi-quantitative measurement of glucose, lactate, and uric acid on a microPAD. Our approach was successfully applied to quantify glucose (0.5-20 mM), lactate (1-25 mM), and uric acid (0.1-7 mM) in clinically relevant ranges. The determination of glucose, lactate, and uric acid in control serum and urine samples was also performed to demonstrate the applicability of this device for biological sample analysis. Finally results for the multi-indicator and single indicator system were compared using untrained readers to demonstrate the improvements in accuracy achieved with the new system. 2010 Elsevier B.V. All rights reserved.

  20. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  1. Exploratory Visual Analytics of a Dynamically Built Network of Nodes in a WebGL-Enabled Browser

    DTIC Science & Technology

    2014-01-01

    Keywords: dimensionality reduction, feature extraction, high-dimensional data, t-distributed stochastic neighbor embedding, neighbor retrieval visualizer, visual... WebGL-enabled rendering is supported natively by browsers such as the latest Mozilla Firefox, Google Chrome, and Microsoft Internet Explorer 11. ... The resultant 26-node network is displayed in a Mozilla Firefox browser in figure 2 (also see appendix B).

  2. Advanced Video Activity Analytics (AVAA): Human Factors Evaluation

    DTIC Science & Technology

    2015-05-01

    ... video, and 3) creating and saving annotations (Fig. 11). (The logging program was updated after the pilot to also capture search clicks.) Playing and... the visual search task and the auditory task together and thus automatically focused on the visual task. Alternatively, the operator may have intentionally... affect performance on the primary task; however, in the current test there was no apparent effect on the operator's performance in the visual search task.

  3. HyFinBall: A Two-Handed, Hybrid 2D/3D Desktop VR Interface for Visualization

    DTIC Science & Technology

    2013-01-01

    ... the user interface (hardware and software), the design space, as well as preliminary results of a formal user study. This is done in the context of a rich visual analytics interface containing coordinated views with 2D and 3D visualizations and... Keywords: virtual reality, user interface, two-handed interface, hybrid user interface, multi-touch, gesture,

  4. One-Way Particle Transport Using Oscillatory Flow in Asymmetric Traps.

    PubMed

    Lee, Jaesung; Burns, Mark A

    2018-03-01

    One challenge of integrating passive microparticle manipulation techniques into multifunctional microfluidic devices is coupling the continuous-flow format of most systems with the often batch-type operation of particle separation systems. Here, a passive fluidic technique, one-way particle transport, that can conduct microparticle operations in a closed fluidic circuit is presented. Exploiting pass/capture interactions between microparticles and asymmetric traps, this technique accomplishes a net displacement of particles in an oscillatory flow field. One-way particle transport is achieved through four kinds of trap-particle interactions: mechanical capture of the particle, asymmetric interactions between the trap and the particle, physical collision of the particle with an obstacle, and lateral shift of the particle into a particle-trapping stream. The critical dimensions for those four conditions are found by numerically solving analytical mass balance equations formulated using the characteristics of the flow field in periodic obstacle arrays. Visual observation of experimental trap-particle dynamics in low Reynolds number flow (<0.01) confirms the validity of the theoretical predictions. This technique can transport hundreds of microparticles across trap rows in only a few fluid oscillations (<500 ms per oscillation) and separate particles by their size differences. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Enhanced visualization of MR angiogram with modified MIP and 3D image fusion

    NASA Astrophysics Data System (ADS)

    Kim, JongHyo; Yeon, Kyoung M.; Han, Man Chung; Lee, Dong Hyuk; Cho, Han I.

    1997-05-01

    We have developed a 3D image processing and display technique that includes image resampling, a modified MIP, volume rendering, and fusion of the MIP image with the volume-rendered image. This technique facilitates the visualization of the 3D spatial relationship between the vasculature and surrounding organs by overlaying the MIP image on the volume-rendered image of the organ. We applied this technique to MR brain image data to produce an MR angiogram overlaid on a 3D volume-rendered image of the brain. The MIP technique was used to visualize the vasculature of the brain, and volume rendering was used to visualize the other structures of the brain. The two images are fused after adjusting the contrast and brightness levels of each image in such a way that both the vasculature and the brain structure are well visualized, either by selecting the maximum value of each image or by assigning a different color table to each image. The resultant image visualizes both the brain structure and the vasculature simultaneously, allowing physicians to inspect their relationship more easily. The presented technique will be useful for surgical planning in neurosurgery.
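
    The projection-and-fusion step described above can be illustrated compactly: compute a maximum-intensity projection of the angiographic volume, normalize it together with a pre-rendered anatomical view, and take the per-pixel maximum. The array names, toy data, and simple normalization below are assumptions, not the authors' exact pipeline.

```python
import numpy as np

def normalize(img):
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min() + 1e-9)

def fuse_mip_with_rendering(angio_volume, rendered_view, axis=0):
    """Project vessels with MIP along `axis`, then fuse with an anatomy rendering."""
    mip = normalize(angio_volume.max(axis=axis))      # maximum-intensity projection
    anatomy = normalize(rendered_view)
    return np.maximum(mip, anatomy)                   # per-pixel maximum fusion

# Toy data: a 64x128x128 "volume" with a bright synthetic vessel.
volume = np.random.rand(64, 128, 128) * 0.2
volume[:, 60:64, :] = 1.0                             # fake vessel running across slices
rendering = np.random.rand(128, 128) * 0.5            # stand-in for a volume rendering
fused = fuse_mip_with_rendering(volume, rendering)
print(fused.shape, fused.min(), fused.max())
```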

  6. How the viewing of familiar landscapes prior to release allows pigeons to home faster: evidence from GPS tracking.

    PubMed

    Biro, Dora; Guilford, Tim; Dell'Omo, Giacomo; Lipp, Hans-Peter

    2002-12-01

    Providing homing pigeons with a 5 min preview of the landscape at familiar sites prior to release reliably improves the birds' subsequent homing speeds. This phenomenon has been taken to suggest that the visual panorama is involved in familiar-site recognition, yet the exact nature of the improvement has never been elucidated. We employed newly developed miniature Global Positioning System (GPS) tracking technology to investigate how access to visual cues prior to release affects pigeons' flight along the length of the homing route. By applying a variety of novel analytical techniques enabled by the high-resolution GPS data (track efficiency, virtual vanishing bearings, orientation threshold), we localised the preview effect to the first 1000 m of the journey. Birds denied preview of a familiar landscape for 5 min before take-off flew an initially more tortuous path, including a high incidence of circling, possibly as part of an information-gathering strategy to determine their position. Beyond the first 1000 m, no differences were found in the performance of birds with or without preview. That the effect of the visual treatment was evident only in the early part of the journey suggests that lack of access to visual cues prior to release does not result in a non-specific effect on behaviour that is maintained throughout the flight. Instead, it seems that at least some decisions regarding the direction of home can be made prior to release and that such decisions are delayed if visual access to the landscape is denied. Overall, the variety of approaches applied here clearly highlight the potential for future applications of GPS tracking technology in navigation studies.
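
    To make the track-based measures concrete, the following sketch computes a track efficiency (straight-line release-to-home distance divided by the distance actually flown) and a virtual vanishing bearing (bearing from the release point to the first crossing of a chosen radius) from a list of GPS fixes, using a simple flat-earth approximation. The radius, coordinates, and approximation are illustrative assumptions rather than the authors' exact definitions.

```python
import math

def to_xy(lat, lon, lat0, lon0):
    """Flat-earth approximation: metres east/north of a reference point."""
    r = 6371000.0
    x = math.radians(lon - lon0) * r * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * r
    return x, y

def track_metrics(track, home, radius=1000.0):
    lat0, lon0 = track[0]
    pts = [to_xy(lat, lon, lat0, lon0) for lat, lon in track]
    hx, hy = to_xy(*home, lat0, lon0)

    path_len = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
    efficiency = math.dist(pts[0], (hx, hy)) / path_len if path_len else 0.0

    bearing = None
    for x, y in pts:
        if math.hypot(x, y) >= radius:                      # first crossing of the radius
            bearing = math.degrees(math.atan2(x, y)) % 360  # clockwise from north
            break
    return efficiency, bearing

track = [(51.7500, -1.2500), (51.7520, -1.2480), (51.7555, -1.2470), (51.7600, -1.2400)]
print(track_metrics(track, home=(51.8000, -1.2000), radius=500.0))
```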

  7. What's Going on in This Picture? Visual Thinking Strategies and Adult Learning

    ERIC Educational Resources Information Center

    Landorf, Hilary

    2006-01-01

    The Visual Thinking Strategies (VTS) curriculum and teaching method uses art to help students think critically, listen attentively, communicate, and collaborate. VTS has been proven to enhance reading, writing, comprehension, and creative and analytical skills among students of all ages. The origins and procedures of the VTS curriculum are…

  8. Uncertainty Representation in Visualizations of Learning Analytics for Learners: Current Approaches and Opportunities

    ERIC Educational Resources Information Center

    Demmans Epp, Carrie; Bull, Susan

    2015-01-01

    Adding uncertainty information to visualizations is becoming increasingly common across domains since its addition helps ensure that informed decisions are made. This work has shown the difficulty that is inherent to representing uncertainty. Moreover, the representation of uncertainty has yet to be thoroughly explored in educational domains even…

  9. Interpreting Visual Texts: Use of an Analytic Rubric to Evaluate Student Responses to Writing Prompts

    ERIC Educational Resources Information Center

    Sundeen, Todd H.; O'Neil, Kathleen; Fanselow, Stephanie A.

    2017-01-01

    Younger students' visual texts are statements and stories conveyed through drawings or other artwork and often convey meaning beyond the child's capability to communicate with written expression. Although opportunities for expression through drawing are routinely offered to children in the initial and middle stages of early childhood literacy…

  10. Communicating Science Concepts through Art: 21st-Century Skills in Practice

    ERIC Educational Resources Information Center

    Buczynski, Sandy; Ireland, Kathleen; Reed, Sherri; Lacanienta, Evelyn

    2012-01-01

    There is a dynamic synergy between the visual arts and the natural sciences. For example, science relies heavily on individuals with visual-art skills to render detailed illustrations, depicting everything from atoms to zebras. Likewise, artists apply analytic, linear, and logical thinking to compose and scale their work of art. These parallel…

  11. Domain Coloring and the Argument Principle

    ERIC Educational Resources Information Center

    Farris, Frank A.

    2017-01-01

    The "domain-coloring algorithm" allows us to visualize complex-valued functions on the plane in a single image--an alternative to before-and-after mapping diagrams. It helps us see when a function is analytic and aids in understanding contour integrals. The culmination of this article is a visual discovery and subsequent proof of the…

  12. Explorative visual analytics on interval-based genomic data and their metadata.

    PubMed

    Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano

    2017-12-04

    With the spread of public repositories of NGS processed data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format, containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE allows interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians in making sense of heterogeneous genomic datasets. By means of an explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward in the analysis steps and comparative visualizations of heatmaps. The effective application and practical usefulness of GeMSE are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/, and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.

  13. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes.

    PubMed

    Ragan, Eric D; Endert, Alex; Sanyal, Jibonananda; Chen, Jian

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  14. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric; Alex, Endert; Sanyal, Jibonananda

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  15. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE PAGES

    Ragan, Eric; Alex, Endert; Sanyal, Jibonananda; ...

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  16. A visual analytics approach for pattern-recognition in patient-generated data.

    PubMed

    Feller, Daniel J; Burgermaster, Marissa; Levine, Matthew E; Smaldone, Arlene; Davidson, Patricia G; Albers, David J; Mamykina, Lena

    2018-06-13

    To develop and test a visual analytics tool to help clinicians identify systematic and clinically meaningful patterns in patient-generated data (PGD) while decreasing perceived information overload. Participatory design was used to develop Glucolyzer, an interactive tool featuring hierarchical clustering and a heatmap visualization to help registered dietitians (RDs) identify associative patterns between blood glucose levels and per-meal macronutrient composition for individuals with type 2 diabetes (T2DM). Ten RDs participated in a within-subjects experiment to compare Glucolyzer to a static logbook format. For each representation, participants had 25 minutes to examine 1 month of diabetes self-monitoring data captured by an individual with T2DM and identify clinically meaningful patterns. We compared the quality and accuracy of the observations generated using each representation. Participants generated 50% more observations when using Glucolyzer (98) than when using the logbook format (64) without any loss in accuracy (69% accuracy vs 62%, respectively, p = .17). Participants identified more observations that included ingredients other than carbohydrates using Glucolyzer (36% vs 16%, p = .027). Fewer RDs reported feelings of information overload using Glucolyzer compared to the logbook format. Study participants displayed variable acceptance of hierarchical clustering. Visual analytics have the potential to mitigate provider concerns about the volume of self-monitoring data. Glucolyzer helped dietitians identify meaningful patterns in self-monitoring data without incurring perceived information overload. Future studies should assess whether similar tools can support clinicians in personalizing behavioral interventions that improve patient outcomes.
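
    The pairing of hierarchical clustering with a heatmap that the abstract describes can be sketched in a few lines; the snippet below clusters synthetic per-meal records and reorders the heatmap rows accordingly. It is a generic illustration with made-up data and column names, not Glucolyzer's implementation.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, leaves_list

# Hypothetical per-meal records: columns = carbs (g), protein (g), fat (g),
# post-meal glucose rise (mg/dL). Values are synthetic, not patient data.
rng = np.random.default_rng(0)
meals = rng.normal(loc=[60, 20, 15, 40], scale=[25, 8, 6, 20], size=(30, 4))

# Standardize columns, cluster meals hierarchically, and reorder rows so that
# similar meals sit next to each other in the heatmap.
z = (meals - meals.mean(axis=0)) / meals.std(axis=0)
order = leaves_list(linkage(z, method="ward"))

plt.imshow(z[order], aspect="auto", cmap="RdBu_r")
plt.colorbar(label="z-score")
plt.xticks(range(4), ["carbs", "protein", "fat", "glucose rise"], rotation=30)
plt.ylabel("meal (clustered order)")
plt.show()
```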

  17. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  18. Incidence loss for fan turbine rotor blade in two-dimensional cascade

    NASA Technical Reports Server (NTRS)

    Kline, J. F.; Moffitt, T. P.; Stabe, R. G.

    1983-01-01

    The effect of incidence angle on the aerodynamic performance of a fan turbine rotor blade was investigated experimentally in a two dimensional cascade. The test covered a range of incidence angles from -15 deg to 10 deg and exit ideal critical velocity ratios from 0.75 to 0.95. The principal measurements were blade-surface static pressures and cross-channel survey of exit total pressure, static pressure, and flow angle. Flow adjacent to surfaces was examined using a visualization technique. The results of the investigation include blade-surface velocity distribution and overall kinetic energy loss coefficients for the incidence angles and exit velocity ratios tested. The measured losses are compared with those from a reference core turbine rotor blade and also with two common analytical methods of predicting incidence loss.

  19. Geodynamics for Everyone: Robust Finite-Difference Heat Transfer Models using MS Excel 2007 Spreadsheets

    NASA Astrophysics Data System (ADS)

    Grose, C. J.

    2008-05-01

    Numerical geodynamics models of heat transfer are typically thought of as specialized topics of research requiring knowledge of specialized modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques with Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems for use by students, teachers, and practicing scientists without specialty in geodynamics modelling techniques and applications. While implementation of equations for use in Excel spreadsheets is occasionally cumbersome, once case boundary structure and node equations are developed, spreadsheet manipulation becomes routine. Model experimentation by modifying parameter values, geometry, and grid resolution makes Excel a useful tool whether in the classroom at the undergraduate or graduate level or for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it ideal for first and occasionally higher order geodynamics simulations to better understand and constrain the results of professional field research in a setting that does not require the constraints of state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium to better understand the confusing notations of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
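
    As a point of comparison, the cell-update rule one would place in such a spreadsheet can be written out directly; the sketch below advances a one-dimensional explicit finite-difference model of cooling oceanic lithosphere and checks it against the analytical half-space solution T = Ts + (Tm - Ts) erf(z / (2 sqrt(kappa t))). Grid spacing, time step, and material values are typical illustrative choices, not those of the distributed spreadsheets.

```python
import numpy as np
from math import erf, sqrt

# 1-D cooling of oceanic lithosphere: explicit finite differences (the same update
# one would place in each spreadsheet cell), checked against the analytical
# half-space solution. Parameter values are typical textbook numbers.
kappa, Ts, Tm = 1e-6, 0.0, 1300.0          # diffusivity (m^2/s), surface/mantle T (deg C)
dz, nz = 2000.0, 101                       # 2 km grid spacing, 200 km depth
dt = 0.4 * dz**2 / kappa                   # time step within the explicit stability limit
age = 50e6 * 3.15e7                        # 50 Myr in seconds

T = np.full(nz, Tm)
T[0] = Ts                                  # cold surface over a hot half-space
for _ in range(int(age / dt)):
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0], T[-1] = Ts, Tm                   # fixed-temperature boundaries

z = np.arange(nz) * dz
T_exact = Ts + (Tm - Ts) * np.array([erf(zi / (2 * sqrt(kappa * age))) for zi in z])
print("max |FD - analytic| =", np.abs(T - T_exact).max(), "deg C")
```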

  20. Enhanced identification of trace element fingerprint of prehistoric pigments by PIXE mapping

    NASA Astrophysics Data System (ADS)

    Lebon, M.; Pichon, L.; Beck, L.

    2018-02-01

    The elemental composition of Fe-rich rocks used as pigments during prehistoric periods can provide valuable information about the type of material used and its geological origin. However, these materials present several analytical constraints, since their patrimonial value requires the use of non-invasive techniques while maintaining high sensitivity for the detection and quantification of trace elements. Micro-beam techniques also require taking into account the heterogeneity of these geomaterials from the macroscopic to the microscopic scale. Several previous studies have demonstrated that PIXE analysis satisfies these analytical conditions. However, application of micro-PIXE analysis is still complex when a thin and discontinuous layer of pigment is deposited on the surface of other materials such as rocks or bones. In such cases, PIXE imaging could improve the ability to take into account the high heterogeneity of such archaeological objects. In this study, we used the PIXE imaging system developed at the NewAGLAE facility to visualize the distribution of elements associated with the iron-rich pigment phase. The results obtained show that PIXE maps can improve the identification of the main trace elements specific to the iron mineral phase. By grouping pixels of iron-rich areas and performing quantitative treatment, it was possible to reveal additional trace elements associated with the pigment. This study highlights the contribution of PIXE imaging to the identification of elements associated with mineral phases of interest and to their use as proxies to discriminate different geological materials used in archaeological contexts.

  1. Hybridization chain reaction: a versatile molecular tool for biosensing, bioimaging, and biomedicine.

    PubMed

    Bi, Sai; Yue, Shuzhen; Zhang, Shusheng

    2017-07-17

    Developing powerful, simple and low-cost DNA amplification techniques is of great significance to bioanalysis and biomedical research. Thus far, many signal amplification strategies have been developed, such as polymerase chain reaction (PCR), rolling circle amplification (RCA), and DNA strand displacement amplification (SDA). In particular, hybridization chain reaction (HCR), a type of toehold-mediated strand displacement (TMSD) reaction, has attracted great interest because of its enzyme-free nature, isothermal conditions, simple protocols, and excellent amplification efficiency. In a typical HCR, an analyte initiates the cross-opening of two DNA hairpins, yielding nicked double helices that are analogous to alternating copolymers. As an efficient amplification platform, HCR has been utilized for the sensitive detection of a wide variety of analytes, including nucleic acids, proteins, small molecules, and cells. In recent years, more complicated sets of monomers have been designed to develop nonlinear HCR, such as branched HCR and even dendritic systems, achieving quadratic and exponential growth mechanisms. In addition, HCR has attracted enormous attention in the fields of bioimaging and biomedicine, including applications in fluorescence in situ hybridization (FISH) imaging, live cell imaging, and targeted drug delivery. In this review, we introduce the fundamentals of HCR and examine the visualization and analysis techniques for HCR products in detail. The most recent HCR developments in biosensing, bioimaging, and biomedicine are subsequently discussed with selected examples. Finally, the review provides insight into the challenges and future perspectives of HCR.

  2. Chemical-mineralogical characterization of C&D waste recycled aggregates from São Paulo, Brazil.

    PubMed

    Angulo, S C; Ulsen, C; John, V M; Kahn, H; Cincotto, M A

    2009-02-01

    This study presents a methodology for the characterization of construction and demolition (C&D) waste recycled aggregates based on a combination of analytical techniques (X-ray fluorescence (XRF), soluble ions, semi-quantitative X-ray diffraction (XRD), thermogravimetric analysis (TGA-DTG) and hydrochloric acid (HCl) selective dissolution). These combined analytical techniques allow for the estimation of the amount of cement paste, its most important hydrated and carbonated phases, as well as the amount of clay and micas. Details of the methodology are presented here and the results of three representative C&D samples taken from the São Paulo region in Brazil are discussed. Chemical compositions of mixed C&D aggregate samples have mostly been influenced by particle size rather than the visual classification of C&D into red or grey and geographical origin. The amount of measured soluble salts in C&D aggregates (0.15-25.4mm) is lower than the usual limits for mortar and concrete production. The content of porous cement paste in the C&D aggregates is around 19.3% (w/w). However, this content is significantly lower than the 43% detected for the C&D powders (<0.15 mm). The clay content of the powders was also high, potentially resulting from soil intermixed with the C&D waste, as well as poorly burnt red ceramic. Since only about 50% of the measured CaO is combined with CO(2), the powders have potential use as raw materials for the cement industry.

  3. Comparing the onset of maxillary infiltration local anaesthesia and pain experience using the conventional technique vs. the Wand in children.

    PubMed

    Kandiah, P; Tahmassebi, J F

    2012-11-01

    This prospective, randomised, parallel, controlled study was conducted firstly to compare the onset of local anaesthesia (LA) when using the conventional technique versus the Wand computer-controlled LA and secondly to assess the pain experience in children. Thirty children were randomly allocated to the treatment group (Wand) or the control group (conventional). Lidocaine 2% with adrenaline (1:80,000) was given as a buccal infiltration. The onset of pulpal anaesthesia was tested using an analytic electric pulp tester (EPT). The pain experience during the LA was recorded using a modified visual analogue score (VAS). Median time for the onset of LA was 6.30 minutes for the control and 7.25 minutes for the Wand group. Mean pain experience score for the control group was 9.78% as opposed to 8.46% in the Wand group. Statistical analysis showed that there was no statistically significant difference in the onset of LA (p = 0.486) and the pain experience (p = 0.713) between the two groups. When placing a buccal infiltration on upper first permanent molars, the onset of LA and the pain experience was no different using the Wand and the conventional technique.

  4. Lamb wave detection of limpet mines on ship hulls.

    PubMed

    Bingham, Jill; Hinders, Mark; Friedman, Adam

    2009-12-01

    This paper describes the use of ultrasonic guided waves for identifying the mass loading due to underwater limpet mines on ship hulls. The Dynamic Wavelet Fingerprint Technique (DWFT) is used to render the guided wave mode information in two-dimensional binary images because the waveform features of interest are too subtle to identify in the time domain. The use of wavelets allows both time and scale features from the original signals to be retained, and image processing can be used to automatically extract features that correspond to the arrival times of the guided wave modes. For further understanding of how the guided wave modes propagate through the real structures, a parallel-processing 3D elastic wave simulation is developed using the elastodynamic finite integration technique (EFIT). This full-field technique models situations that are too complex for analytical solutions, such as built-up 3D structures. The simulations have produced informative visualizations of the guided wave modes in the structures as well as mimicking directly the output from sensors placed in the simulation space for direct comparison to experiments. Results from both drydock and in-water experiments with dummy mines are also shown.

  5. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics. These are often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastics mass based on the DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size was shown to have a particular influence on the qualitative and quantitative performance of the DSC signals. Both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were shown to be affected by particle size. As a result, a proper sample treatment which includes sieving of suspended particles is particularly required for this analytical approach.
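
    The mass-quantitation step based on heat flow can be illustrated as follows: integrate the melting endotherm above a linear baseline to obtain the peak enthalpy and divide by a reference specific enthalpy of fusion. The synthetic curve, the crystallinity factor, and the reference value (about 293 J/g for fully crystalline polyethylene) are illustrative assumptions, not the calibration used in the paper.

```python
import numpy as np

# Hedged sketch of DSC-based mass quantitation: integrate the melting endotherm
# (heat flow above a linear baseline) to get the peak enthalpy in joules, then
# divide by a reference specific enthalpy of fusion to estimate polymer mass.

def peak_mass(time_s, heatflow_mW, i_start, i_end, delta_h_ref_J_per_g):
    t = time_s[i_start:i_end]
    hf = heatflow_mW[i_start:i_end] / 1000.0                 # mW -> W
    baseline = np.linspace(hf[0], hf[-1], hf.size)           # linear baseline
    peak_area_J = np.trapz(hf - baseline, t)                 # integrate W over s
    return peak_area_J / delta_h_ref_J_per_g                 # grams of polymer

t = np.linspace(0, 600, 601)                                 # time in s
signal = 0.05 + 2.0 * np.exp(-((t - 300) / 20.0) ** 2)       # synthetic endotherm, mW
mass_g = peak_mass(t, signal, 200, 400, delta_h_ref_J_per_g=0.6 * 293.0)
print(f"estimated mass: {mass_g * 1e3:.3f} mg")
```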

  6. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
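
    A minimal example of the kind of calculation surveyed, Monte Carlo integration, is sketched below: the integral of f over [a, b] is estimated as the interval length times the average of f at uniformly sampled points, with statistical error shrinking roughly as 1/sqrt(n).

```python
import random

# Monte Carlo integration: estimate the integral of f(x) = x**2 over [0, 1]
# (exact value 1/3) by averaging f at uniformly sampled points.
def mc_integrate(f, a, b, n=100_000, seed=1):
    rng = random.Random(seed)
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

print(mc_integrate(lambda x: x * x, 0.0, 1.0))   # ~0.333, error ~ 1/sqrt(n)
```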

  7. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition, by a variety of analytical techniques where water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in a portable gas monitoring instrumentation.

  8. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
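
    The core idea, computing on data that no single party ever sees, can be illustrated with additive secret sharing; the toy sketch below lets three parties learn only the total of their private counts. This is a teaching example of the general principle, not the protocol or software used in the paper's pilots.

```python
import random

# Toy additive secret sharing: three hospitals learn the total number of cases
# without any party seeing another's count. Each value is split into random
# shares modulo a public prime; only the sum of all shares is reconstructed.
P = 2**61 - 1  # public modulus

def share(value, n_parties, rng):
    shares = [rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

rng = random.Random(42)
private_counts = [120, 75, 310]                    # each hospital's secret input
all_shares = [share(v, 3, rng) for v in private_counts]

# Party i locally adds the i-th share of every input; the partial sums are then combined.
partial_sums = [sum(s[i] for s in all_shares) % P for i in range(3)]
print("joint total:", sum(partial_sums) % P)       # 505, no individual count revealed
```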

  9. Flow Charts: Visualization of Vector Fields on Arbitrary Surfaces

    PubMed Central

    Li, Guo-Shi; Tricoche, Xavier; Weiskopf, Daniel; Hansen, Charles

    2009-01-01

    We introduce a novel flow visualization method called Flow Charts, which uses a texture atlas approach for the visualization of flows defined over curved surfaces. In this scheme, the surface and its associated flow are segmented into overlapping patches, which are then parameterized and packed in the texture domain. This scheme allows accurate particle advection across multiple charts in the texture domain, providing a flexible framework that supports various flow visualization techniques. The use of surface parameterization enables flow visualization techniques requiring the global view of the surface over long time spans, such as Unsteady Flow LIC (UFLIC), particle-based Unsteady Flow Advection Convolution (UFAC), or dye advection. It also prevents visual artifacts normally associated with view-dependent methods. Represented as textures, Flow Charts can be naturally integrated into hardware accelerated flow visualization techniques for interactive performance. PMID:18599918

  10. 4D imaging and quantification of pore structure modifications inside natural building stones by means of high resolution X-ray CT.

    PubMed

    Dewanckele, J; De Kock, T; Boone, M A; Cnudde, V; Brabant, L; Boone, M N; Fronteau, G; Van Hoorebeke, L; Jacobs, P

    2012-02-01

    Weathering processes have been studied in detail for many natural building stones. The most commonly used analytical techniques in these studies are thin-section petrography, SEM, XRD and XRF. Most of these techniques are valuable for chemical and mineralogical analysis of the weathering patterns. However, to obtain crucial quantitative information on structural evolutions like porosity changes and growth of weathering crusts as a function of time, non-destructive techniques become necessary. In this study, a Belgian historical calcareous sandstone, the Lede stone, was exposed to gaseous SO2 under wet surface conditions according to the European Standard NBN EN 13919 (2003). Before, during and after the strong acid test, high-resolution X-ray tomography was performed to visualize gypsum crust formation and to yield a better insight into the effects of gaseous SO2 on pore modification in 3D. The tomographic scans were taken at the Centre for X-ray Tomography at Ghent University (UGCT). With the aid of image analysis, partial porosity changes were calculated at different stages of the process. Increasing porosity was observed visually and quantitatively below the newly formed superficial layer of gypsum crystals. In some cases micro-cracks and dissolution zones were detected on the grain boundaries of quartz. By using Morpho+, an in-house developed image analysis program, radial porosity, partial porosity, the ratio of open and closed porosity and the equivalent diameter of individual pore structures were calculated. The results obtained in this study are promising for a better understanding of gypsum weathering mechanisms, porosity changes and patterns on natural building stones in four dimensions. Copyright © 2011 Elsevier B.V. All rights reserved.
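
    The porosity bookkeeping described above reduces to counting pore voxels after segmentation; the sketch below thresholds a synthetic grayscale volume and reports the pore-voxel fraction per depth slice before and after a simulated weathering step. The threshold and data are stand-ins for real reconstructions and for the Morpho+ analysis.

```python
import numpy as np

# Hedged sketch of partial-porosity quantification from a segmented CT volume.
rng = np.random.default_rng(3)
before = rng.random((40, 64, 64))                        # grayscale volume, 0..1
after = before.copy()
after[:10] = np.minimum(after[:10] + 0.15, 1.0)          # pretend near-surface dissolution

def partial_porosity(volume, pore_threshold=0.85):
    pores = volume > pore_threshold                      # bright voxels treated as pores
    return pores.reshape(volume.shape[0], -1).mean(axis=1)

print("porosity of the first 5 slices before:", partial_porosity(before)[:5].round(3))
print("porosity of the first 5 slices after: ", partial_porosity(after)[:5].round(3))
```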

  11. Non-Inferential Multi-Subject Study of Functional Connectivity during Visual Stimulation.

    PubMed

    Esposito, F; Cirillo, M; Aragri, A; Caranci, F; Cirillo, L; Di Salle, F; Cirillo, S

    2007-01-31

    Independent component analysis (ICA) is a powerful technique for the multivariate, non-inferential, data-driven analysis of functional magnetic resonance imaging (fMRI) datasets. The non-inferential nature of ICA makes it a suitable technique for the study of complex mental states whose temporal evolution would be difficult to describe analytically in terms of classical statistical regressors. Taking advantage of this feature, ICA can extract a number of functional connectivity patterns regardless of the task executed by the subject. The technique is so powerful that functional connectivity patterns can be derived even when the subject is just resting in the scanner, opening the opportunity for functional investigation of the human mind at its basal "default" state, which has been proposed to be altered in several brain disorders. However, one major drawback of ICA lies in the difficulty of managing its results, which are not represented by a single functional image as in inferential studies. This creates the need to classify ICA results and exacerbates the difficulty of obtaining group "averaged" functional connectivity patterns while preserving the interpretation of individual differences. Addressing the subject-level variability within the very same "grouping" framework appears to be a favourable approach towards the clinical evaluation and application of ICA-based methodologies. Here we present a novel strategy for group-level ICA analyses, namely the self-organizing group-level ICA (sog-ICA), which is used on visual activation fMRI data from a block-design experiment repeated on six subjects. We propose the sog-ICA as a multi-subject analysis tool for grouping ICA data while assessing the similarity and variability of the fMRI results of individual subject decompositions.
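    To make the data-driven character of ICA concrete, here is a minimal sketch using scikit-learn's FastICA on synthetic "fMRI-like" data: a few sparse spatial patterns mixed with random time courses are recovered without any stimulus regressor. This only illustrates plain single-dataset ICA, not the sog-ICA grouping strategy of the record; all sizes and data are invented.

        # Minimal sketch: recover mixed components from a time-by-voxel data matrix.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(42)
        n_timepoints, n_voxels, n_sources = 120, 500, 3

        # Sparse spatial sources (connectivity "maps") and random time courses.
        sources = rng.random((n_sources, n_voxels)) * (rng.random((n_sources, n_voxels)) > 0.9)
        time_courses = rng.standard_normal((n_timepoints, n_sources))
        data = time_courses @ sources + 0.1 * rng.standard_normal((n_timepoints, n_voxels))

        ica = FastICA(n_components=n_sources, random_state=0)
        component_time_courses = ica.fit_transform(data)   # (timepoints, components)
        component_maps = ica.mixing_.T                     # (components, voxels)
        print(component_time_courses.shape, component_maps.shape)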

  12. VisOHC: Designing Visual Analytics for Online Health Communities

    PubMed Central

    Kwon, Bum Chul; Kim, Sung-Hee; Lee, Sukwon; Choo, Jaegul; Huh, Jina; Yi, Ji Soo

    2015-01-01

    Through online health communities (OHCs), patients and caregivers exchange their illness experiences and strategies for overcoming the illness, and provide emotional support. To facilitate healthy and lively conversations in these communities, their members should be continuously monitored and nurtured by OHC administrators. The main challenge of OHC administrators' tasks lies in understanding the diverse dimensions of conversation threads that lead to productive discussions in their communities. In this paper, we present a design study, conducted with three groups of domain experts (an OHC researcher and two OHC administrators of online health communities), aimed at finding a visual analytics solution. Through our design study, we characterized the domain goals of OHC administrators and derived tasks to achieve these goals. As a result of this study, we propose a system called VisOHC, which visualizes individual OHC conversation threads as collapsed boxes, a visual metaphor of conversation threads. In addition, we augmented the posters' reply authorship network with marks and/or beams to show conversation dynamics within threads. We also developed unique measures tailored to the characteristics of OHCs, which can be encoded for thread visualizations at the users' requests. Our observation of the two administrators while using VisOHC showed that it supports their tasks and reveals interesting insights into online health communities. Finally, we share our methodological lessons on probing visual designs together with domain experts by allowing them to freely encode measurements into visual variables. PMID:26529688
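    As a rough sketch of the general pattern described above (per-thread measures encoded into visual variables on demand), the example below computes two simple measures from reply records and maps them to a box width and colour shade. The measures, thresholds, and encodings are hypothetical stand-ins, not VisOHC's actual measures or code.

        # Minimal sketch: hypothetical thread measures and their visual encoding.
        from dataclasses import dataclass

        @dataclass
        class Reply:
            thread_id: str
            author: str

        def thread_measures(replies):
            """Aggregate reply count and unique-author count per thread."""
            acc = {}
            for r in replies:
                m = acc.setdefault(r.thread_id, {"replies": 0, "authors": set()})
                m["replies"] += 1
                m["authors"].add(r.author)
            return {tid: {"replies": m["replies"], "unique_authors": len(m["authors"])}
                    for tid, m in acc.items()}

        def encode(measure, width_per_reply=8, max_width=200):
            """Map measures to visual variables: box width <- replies, shade <- authors."""
            width = min(measure["replies"] * width_per_reply, max_width)
            shade = min(measure["unique_authors"] / 10.0, 1.0)  # 0..1 colour intensity
            return {"width_px": width, "shade": shade}

        replies = [Reply("t1", "alice"), Reply("t1", "bob"), Reply("t1", "alice"), Reply("t2", "carol")]
        for tid, m in thread_measures(replies).items():
            print(tid, m, encode(m))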

  13. VisOHC: Designing Visual Analytics for Online Health Communities.

    PubMed

    Kwon, Bum Chul; Kim, Sung-Hee; Lee, Sukwon; Choo, Jaegul; Huh, Jina; Yi, Ji Soo

    2016-01-01

    Through online health communities (OHCs), patients and caregivers exchange their illness experiences and strategies for overcoming the illness, and provide emotional support. To facilitate healthy and lively conversations in these communities, their members should be continuously monitored and nurtured by OHC administrators. The main challenge of OHC administrators' tasks lies in understanding the diverse dimensions of conversation threads that lead to productive discussions in their communities. In this paper, we present a design study, conducted with three groups of domain experts (an OHC researcher and two OHC administrators of online health communities), aimed at finding a visual analytics solution. Through our design study, we characterized the domain goals of OHC administrators and derived tasks to achieve these goals. As a result of this study, we propose a system called VisOHC, which visualizes individual OHC conversation threads as collapsed boxes, a visual metaphor of conversation threads. In addition, we augmented the posters' reply authorship network with marks and/or beams to show conversation dynamics within threads. We also developed unique measures tailored to the characteristics of OHCs, which can be encoded for thread visualizations at the users' requests. Our observation of the two administrators while using VisOHC showed that it supports their tasks and reveals interesting insights into online health communities. Finally, we share our methodological lessons on probing visual designs together with domain experts by allowing them to freely encode measurements into visual variables.

  14. Profiling Oman education data using data visualization technique

    NASA Astrophysics Data System (ADS)

    Alalawi, Sultan Juma Sultan; Shaharanee, Izwan Nizal Mohd; Jamil, Jastini Mohd

    2016-10-01

    This research work presents an innovative data visualization technique to understand and visualize the information in Oman's education data generated from the Ministry of Education Oman "Educational Portal". The Ministry of Education in the Sultanate of Oman has huge databases containing massive amounts of information. The volume of data in the database increases yearly as many students, teachers and employees are added to the database. The task of discovering and analyzing these vast volumes of data becomes increasingly difficult. Information visualization and data mining offer better ways of dealing with large volumes of information. In this paper, an innovative information visualization technique is developed to visualize the complex multidimensional educational data. Microsoft Excel Dashboard, Visual Basic for Applications (VBA) and Pivot Tables are utilized to visualize the data. Findings from the summarization of the data are presented, and it is argued that information visualization can help related stakeholders become aware of hidden and interesting information in the large amounts of data residing in their educational portal.
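    The record's workflow is built on Excel Dashboards, VBA and Pivot Tables; as a language-neutral illustration of the same pivot-table summarization step, here is a minimal pandas sketch. The column names, regions and figures are invented for the example, and pandas is a stand-in for the Excel tooling, not what the authors used.

        # Minimal sketch: pivot-table style summarization of education records.
        import pandas as pd

        records = pd.DataFrame({
            "region":    ["Muscat", "Muscat", "Dhofar", "Dhofar", "Al Batinah"],
            "year":      [2014, 2015, 2014, 2015, 2015],
            "enrolment": [1200, 1350, 800, 860, 950],
        })

        # Pivot: regions as rows, years as columns, total enrolment in the cells.
        summary = records.pivot_table(index="region", columns="year",
                                      values="enrolment", aggfunc="sum", fill_value=0)
        print(summary)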

  15. Two Geo-Arithmetic Representations of n³: Sum of Hex Numbers

    ERIC Educational Resources Information Center

    Unal, Husan

    2009-01-01

    Studies have shown that students' understanding is typically analytic rather than visual. Two possible reasons for this are that the analytic mode, rather than the graphic mode, is most frequently used in instruction, and that students or teachers hold the belief that mathematics consists simply of skillful manipulation of symbols and numbers. The…
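    For context on the identity behind the record's title (the cube n³ as a sum of hexagonal numbers), the standard telescoping argument for centered hexagonal numbers is stated below in LaTeX; it is not quoted from the record.

        % Centered hexagonal numbers h_k = 3k^2 - 3k + 1; their partial sums telescope to n^3.
        \[
          \sum_{k=1}^{n} h_k \;=\; \sum_{k=1}^{n} \bigl(3k^2 - 3k + 1\bigr)
          \;=\; \sum_{k=1}^{n} \bigl(k^3 - (k-1)^3\bigr) \;=\; n^3 .
        \]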

  16. Understanding Fluorescence Measurements through a Guided-Inquiry and Discovery Experiment in Advanced Analytical Laboratory

    ERIC Educational Resources Information Center

    Wilczek-Vera, Grazyna; Salin, Eric Dunbar

    2011-01-01

    An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for a student in the process of visualization and discovery of concepts.…

  17. Effects of Learning Analytics Dashboard: Analyzing the Relations among Dashboard Utilization, Satisfaction, and Learning Achievement

    ERIC Educational Resources Information Center

    Kim, Jeonghyun; Jo, Il-Hyun; Park, Yeonjeong

    2016-01-01

    The learning analytics dashboard (LAD) is a newly developed learning support tool for virtual classrooms that is believed to allow students to review their online learning behavior patterns intuitively through the provision of visual information. The purpose of this study was to empirically validate the effects of LAD. An experimental study was…

  18. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories, subjective and analytical, depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
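    As an illustration of the evaluation described above, the sketch below computes the three performance measures named in the record and runs a Kruskal-Wallis test on hydrographers' daily errors with scipy. The discharge values are hypothetical and the record lengths are shortened for the example; this is not the study's actual data or code.

        # Minimal sketch: performance measures plus a Kruskal-Wallis comparison.
        import numpy as np
        from scipy.stats import kruskal

        def performance_measures(estimated, baseline):
            """Average discharge for the ice-affected period, plus the mean and
            standard deviation of the daily errors (estimated - baseline)."""
            errors = np.asarray(estimated) - np.asarray(baseline)
            return {"avg_discharge": float(np.mean(estimated)),
                    "mean_error": float(np.mean(errors)),
                    "std_error": float(np.std(errors, ddof=1))}

        baseline = np.array([10.2, 9.8, 9.5, 9.9, 10.4])          # m^3/s, hypothetical
        hydrographers = {
            "A": np.array([10.0, 9.6, 9.7, 10.1, 10.2]),
            "B": np.array([10.5, 10.1, 9.2, 9.7, 10.9]),
            "C": np.array([9.8, 9.9, 9.4, 10.0, 10.3]),
        }
        for name, est in hydrographers.items():
            print(name, performance_measures(est, baseline))

        # Do the hydrographers' daily errors differ significantly?
        stat, p = kruskal(*[est - baseline for est in hydrographers.values()])
        print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.3f}")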

  19. Visualization of 3D CT-based anatomical models

    NASA Astrophysics Data System (ADS)

    Alaytsev, Innokentiy K.; Danilova, Tatyana V.; Manturov, Alexey O.; Mareev, Gleb O.; Mareev, Oleg V.

    2018-04-01

    Biomedical volumetric data visualization techniques for exploration purposes are well developed. Most of the known methods are inappropriate for surgery simulation systems due to a lack of realism. Segmented data visualization is a well-known approach for visualizing structured volumetric data. This research focuses on improving the segmented data visualization technique by resolving aliasing problems and by using material transparency modeling for better rendering of semitransparent structures.
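    To make the transparency-modeling idea concrete, here is a minimal sketch of standard front-to-back alpha compositing along one ray through a segmented volume, with a per-material colour and opacity so inner structures show through semitransparent outer ones. The material table and sample values are invented; this is not the authors' renderer.

        # Minimal sketch: front-to-back compositing of material samples along a ray.
        import numpy as np

        # material id -> (RGB colour, opacity per sample); values are hypothetical
        MATERIALS = {
            0: (np.array([0.0, 0.0, 0.0]), 0.00),   # background: fully transparent
            1: (np.array([0.9, 0.8, 0.7]), 0.05),   # outer shell, semitransparent
            2: (np.array([0.8, 0.2, 0.2]), 0.60),   # inner structure, mostly opaque
        }

        def composite_ray(material_ids):
            """Accumulate colour front-to-back until the ray is effectively opaque."""
            colour = np.zeros(3)
            alpha = 0.0
            for m in material_ids:
                c, a = MATERIALS[m]
                colour += (1.0 - alpha) * a * c
                alpha += (1.0 - alpha) * a
                if alpha > 0.99:           # early ray termination
                    break
            return colour, alpha

        # A ray passing through the shell, then the inner structure, then the shell again.
        print(composite_ray([0, 0, 1, 1, 1, 2, 2, 2, 1, 1, 0]))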

  20. Advanced Visualization and Interactive Display Rapid Innovation and Discovery Evaluation Research (VISRIDER) Program Task 6: Point Cloud Visualization Techniques for Desktop and Web Platforms

    DTIC Science & Technology

    2017-04-01

    This report (period of performance OCT 2013 – SEP 2014) evaluates various point cloud visualization techniques for viewing large-scale LiDAR datasets and assesses their potential use for thick-client desktop platforms.
