Sample records for analytical tools needed

  1. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analyzing exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
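
    The abstract names selection and aggregation as the algebra's atomic operators. As a rough illustration only, and not the authors' formalism, the sketch below implements attribute-based selection and group-by aggregation over an attributed graph with the networkx library; the operator signatures and the example graph are assumptions.

    ```python
    # Hypothetical sketch of "selection" and "aggregation" operators over an
    # attributed graph (illustration only, not the paper's algebra).
    import networkx as nx

    def select(g, predicate):
        """Selection: keep only nodes whose attributes satisfy the predicate."""
        keep = [n for n, attrs in g.nodes(data=True) if predicate(attrs)]
        return g.subgraph(keep).copy()

    def aggregate(g, key):
        """Aggregation: collapse nodes sharing an attribute value into super-nodes."""
        groups = {}
        for n, attrs in g.nodes(data=True):
            groups.setdefault(attrs.get(key), []).append(n)
        agg = nx.Graph()
        for value, members in groups.items():
            agg.add_node(value, size=len(members))
        for u, v in g.edges():
            gu, gv = g.nodes[u].get(key), g.nodes[v].get(key)
            if gu != gv:
                agg.add_edge(gu, gv)
        return agg

    # Tiny example: three people in two departments.
    g = nx.Graph()
    g.add_nodes_from([(1, {"dept": "A"}), (2, {"dept": "A"}), (3, {"dept": "B"})])
    g.add_edges_from([(1, 2), (2, 3)])
    print(aggregate(select(g, lambda a: True), "dept").nodes(data=True))
    ```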

  2. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  3. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Roadmap: (1) a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; (2) initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; (3) leveraging the Western States Water Council Water Data Exchange database; (4) development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. Tools: Education - information on water issues and risks at the local, state, national, and global scale. Visualizations - data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive analytics - accessing publicly available water databases and using machine learning to develop water availability forecasting tools, and time-lapse images to support city and urban planning.

  4. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: (1) provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; (2) bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; (3) perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data analytics expertise needed to perform Earth science data analytics; and (4) seek graduate data analytics and data science student internship opportunities.

  5. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  6. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  7. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  8. Modeling of the Global Water Cycle - Analytical Models

    Treesearch

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  9. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple, and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
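
    A minimal sketch of the monitoring idea described above, assuming a pandas DataFrame of patient results; the column names, the resampling rule, and the 2.0% allowable-bias limit are placeholders for illustration, not the authors' procedure.

    ```python
    # Illustrative only: monthly patient medians compared against a long-term
    # baseline median, flagging months exceeding an assumed allowable bias.
    import pandas as pd

    def monthly_median_control(results: pd.DataFrame, allowable_bias_pct: float = 2.0):
        """results needs a 'date' (datetime) column and a 'value' (numeric) column."""
        monthly = results.set_index("date")["value"].resample("M").median()
        baseline = monthly.median()  # long-term reference median
        deviation_pct = 100.0 * (monthly - baseline) / baseline
        return pd.DataFrame({
            "median": monthly,
            "deviation_pct": deviation_pct,
            "flag": deviation_pct.abs() > allowable_bias_pct,
        })
    ```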

  10. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  11. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  12. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.

  13. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
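
    As a small, hypothetical sketch of the gap analysis described above, the snippet below maps available data sources to the ATT&CK tactics they can illuminate and lists the tactics left uncovered; the tactic names follow the public ATT&CK matrix, while the data sources and the mapping itself are invented for illustration.

    ```python
    # Invented example of mapping analyst data sources onto ATT&CK tactics and
    # surfacing coverage gaps (not the authors' data mapping).
    tactics = ["Initial Access", "Execution", "Persistence",
               "Lateral Movement", "Exfiltration"]
    coverage = {
        "email gateway logs": ["Initial Access"],
        "endpoint process logs": ["Execution", "Persistence"],
        "netflow": ["Lateral Movement"],
    }
    covered = {t for sources in coverage.values() for t in sources}
    gaps = [t for t in tactics if t not in covered]
    print("tactics without a supporting data source:", gaps)  # -> ['Exfiltration']
    ```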

  14. Truck trip generation data

    DOT National Transportation Integrated Search

    2001-01-01

    The increased importance of truck activity in both transportation engineering and planning : has created a need for truck-oriented analytical tools. A particular planning need is for trip : generation data that can be used to estimate truck traffic p...

  15. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
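
    Plastid's own API is documented at the URL above; purely as a generic illustration of the "array of values per genomic position" idea the abstract describes (not Plastid code), the numpy sketch below builds a per-nucleotide coverage vector and slices it into feature-centric coordinates for a spliced feature.

    ```python
    # Generic, hypothetical illustration of nucleotide-resolution coverage arrays;
    # not the Plastid API.
    import numpy as np

    def coverage_vector(chrom_length, read_alignments):
        """read_alignments: iterable of (start, end) half-open genomic intervals."""
        cov = np.zeros(chrom_length, dtype=np.int32)
        for start, end in read_alignments:
            cov[start:end] += 1
        return cov

    def feature_vector(cov, exons, strand="+"):
        """Concatenate exon slices into feature-centric coordinates (reversed on '-')."""
        vec = np.concatenate([cov[s:e] for s, e in exons])
        return vec[::-1] if strand == "-" else vec

    cov = coverage_vector(1000, [(10, 40), (20, 60), (500, 520)])
    print(feature_vector(cov, [(10, 30), (500, 510)], strand="+").sum())  # 40
    ```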

  16. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most of the data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, in which case the dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, which were visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, which was employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The results of the geovisual analytics identified periodical patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We prove that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, and offers space-time visualisation capabilities, animations, and communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
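
    The abstract notes that the final data are stored as Keyhole Markup Language (KML) files for display in Google Earth. As a small hand-rolled sketch (the place names, coordinates, and values are invented, and this is not the authors' pipeline), the snippet below writes incidence records as KML placemarks that Google Earth can open.

    ```python
    # Illustrative KML export of incidence values as placemarks (invented data).
    def to_kml(records):
        """records: iterable of (name, lon, lat, incidence)."""
        placemarks = []
        for name, lon, lat, incidence in records:
            placemarks.append(
                f"<Placemark><name>{name}</name>"
                f"<description>weekly incidence: {incidence}</description>"
                f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
            )
        return ('<?xml version="1.0" encoding="UTF-8"?>'
                '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
                + "".join(placemarks) + "</Document></kml>")

    with open("incidence_week01.kml", "w") as fh:
        fh.write(to_kml([("Olomouc", 17.25, 49.59, 12.4)]))
    ```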

  17. Analysis of past NBI ratings to determine future bridge preservation needs.

    DOT National Transportation Integrated Search

    2004-01-01

    Bridge Management System (BMS) needs an analytical tool that can predict bridge element deterioration and answer questions related to bridge preservation. PONTIS, a comprehensive BMS software, was developed to serve this purpose. However, the intensi...

  18. Visualizing Qualitative Information

    ERIC Educational Resources Information Center

    Slone, Debra J.

    2009-01-01

    The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…

  19. Making Sense of Game-Based User Data: Learning Analytics in Applied Games

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2015-01-01

    Digital learning games are useful educational tools with high motivational potential. With the application of games for instruction there comes the need of acknowledging learning game experiences also in the context of educational assessment. Learning analytics provides new opportunities for supporting assessment in and of educational games. We…

  20. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
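
    As a hypothetical illustration of the kind of chemometric model that commonly accompanies PAT spectral tools (not an example from the paper), the sketch below fits a partial least squares (PLS) regression from synthetic spectra to a quality attribute with scikit-learn; the data, component count, and scoring choices are placeholders.

    ```python
    # Synthetic PLS calibration sketch: spectra -> quality attribute (illustration only).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(60, 200))               # 60 samples x 200 wavelengths
    concentration = 2.0 * spectra[:, 50] + rng.normal(scale=0.1, size=60)

    pls = PLSRegression(n_components=3)
    scores = cross_val_score(pls, spectra, concentration, cv=5, scoring="r2")
    print("cross-validated R^2:", scores.mean())
    ```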

  1. Rapid deletion plasmid construction methods for protoplast and Agrobacterium based fungal transformation systems

    USDA-ARS?s Scientific Manuscript database

    Increasing availability of genomic data and sophistication of analytical methodology in fungi has elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...

  2. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for both (co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise), although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth science data analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that encompass the science domain and data preparation, reduction, and analysis techniques from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from the point of view of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined and information to be manipulated, so that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the scientific eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  3. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  4. IBM's Health Analytics and Clinical Decision Support.

    PubMed

    Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W

    2014-08-15

    This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation.

  5. Novel tools for in situ detection of biodiversity and function of dechlorinating and uranium-reducing bacteria in contaminated environments

    USDA-ARS?s Scientific Manuscript database

    Toxic heavy metals and radionuclides pose a growing, global threat to the environment. For an intelligent remediation design, reliable analytical tools for detection of relevant species are needed, such as PCR. However, PCR cannot visualize its targets and thus provide information about the morpholo...

  6. Laboratory, Field, and Analytical Procedures for Using Passive Sampling in the Evaluation of Contaminated Sediments: User's Manual

    EPA Science Inventory

    Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sedi...

  7. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patients' safety, from a statistical quality control focus on the analytical phase to total laboratory processes. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defect-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown in our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
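
    The sigma metric equation referred to above is conventionally written as sigma = (TEa - |bias|) / CV, with allowable total error (TEa), bias, and imprecision (CV) all in percent; the helper below and its example numbers are illustrative assumptions, not the author's data.

    ```python
    # Conventional sigma metric: sigma = (TEa - |bias|) / CV, all terms in percent.
    def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
        return (tea_pct - abs(bias_pct)) / cv_pct

    # e.g. allowable total error 10%, observed bias 1.5%, CV 2% -> 4.25 sigma
    print(sigma_metric(10.0, 1.5, 2.0))
    ```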

  8. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the lack of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available and still needed to support ESDA.

  9. National Facilities Study. Volume 1: Facilities Inventory

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The inventory activity was initiated to solve the critical need for a single source of site-specific descriptive and parametric data on major public and privately held aeronautics and aerospace related facilities. This was a challenging undertaking due to the scope of the effort and the short lead time in which to assemble the inventory and have it available to support the task group study needs. The inventory remains dynamic, as sites are being added and the data are accessed and refined as the study progresses. The inventory activity also included the design and implementation of a computer database and analytical tools to simplify access to the data. This volume describes the steps which were taken to define the data requirements, select sites, and solicit and acquire data from them. A discussion of the inventory structure and analytical tools is also provided.

  10. Information and communication technology to support self-management of patients with mild acquired cognitive impairments: systematic review.

    PubMed

    Eghdam, Aboozar; Scholl, Jeremiah; Bartfai, Aniko; Koch, Sabine

    2012-11-19

    Mild acquired cognitive impairment (MACI) is a new term used to describe a subgroup of patients with mild cognitive impairment (MCI) who are expected to reach a stable cognitive level over time. This patient group is generally young and have acquired MCI from a head injury or mild stroke. Although the past decade has seen a large amount of research on how to use information and communication technology (ICT) to support self-management of patients with chronic diseases, MACI has not received much attention. Therefore, there is a lack of information about what tools have been created and evaluated that are suitable for self-management of MACI patients, and a lack of clear direction on how best to proceed with ICT tools to support self-management of MACI patients. This paper aims to provide direction for further research and development of tools that can support health care professionals in assisting MACI patients with self-management. An overview of studies reporting on the design and/or evaluation of ICT tools for assisting MACI patients in self-management is presented. We also analyze the evidence of benefit provided by these tools, and how their functionality matches MACI patients' needs to determine areas of interest for further research and development. A review of the existing literature about available assistive ICT tools for MACI patients was conducted using 8 different medical, scientific, engineering, and physiotherapy library databases. The functionality of tools was analyzed using an analytical framework based on the International Classification of Functioning, Disability and Health (ICF) and a subset of common and important problems for patients with MACI created by MACI experts in Sweden. A total of 55 search phrases applied in the 8 databases returned 5969 articles. After review, 7 articles met the inclusion criteria. Most articles reported case reports and exploratory research. Out of the 7 articles, 4 (57%) studies had less than 10 participants, 5 (71%) technologies were memory aids, and 6 studies were mobile technologies. All 7 studies fit the profile for patients with MACI as described by our analytical framework. However, several areas in the framework important for meeting patient needs were not covered by the functionality in any of the ICT tools. This study shows a lack of ICT tools developed and evaluated for supporting self-management of MACI patients. Our analytical framework was a valuable tool for providing an overview of how the functionality of these tools matched patient needs. There are a number of important areas for MACI patients that are not covered by the functionality of existing tools, such as support for interpersonal interactions and relationships. Further research on ICT tools to support self-management for patients with MACI is needed.

  11. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Kohn, M. S.; Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary. Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  12. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
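
    A toy expected-value-of-perfect-information (EVPI) calculation of the kind such value-of-information assessments rest on; the payoffs and probabilities below are invented, not taken from the case studies.

    ```python
    # Invented two-action, two-state example: EVPI = expected payoff with perfect
    # information minus expected payoff of the best action chosen under uncertainty.
    import numpy as np

    payoff = np.array([[10.0, 2.0],   # action A under states s1, s2
                       [6.0, 7.0]])   # action B under states s1, s2
    p = np.array([0.4, 0.6])          # probability of each state

    best_under_uncertainty = (payoff @ p).max()        # commit to one action now
    best_with_information = payoff.max(axis=0) @ p     # choose after learning the state
    print("EVPI =", best_with_information - best_under_uncertainty)  # about 1.6
    ```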

  13. Laboratory, Field, and Analytical Procedures for Using ...

    EPA Pesticide Factsheets

    Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediments site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediments sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using pas

  14. Big Data: Are Biomedical and Health Informatics Training Programs Ready?

    PubMed Central

    Hersh, W.; Ganesh, A. U. Jai

    2014-01-01

    Summary. Objectives: The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? Methods: We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. Results: The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Conclusions: Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals. PMID:25123740

  15. Big Data: Are Biomedical and Health Informatics Training Programs Ready? Contribution of the IMIA Working Group for Health and Medical Informatics Education.

    PubMed

    Otero, P; Hersh, W; Jai Ganesh, A U

    2014-08-15

    The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals.

  16. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of relatively lipophilic organic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool that is as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  17. Optical Drug Monitoring: Photoacoustic Imaging of Nanosensors to Monitor Therapeutic Lithium In Vivo

    PubMed Central

    Cash, Kevin J.; Li, Chiye; Xia, Jun; Wang, Lihong V.; Clark, Heather A.

    2015-01-01

    Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes. PMID:25588028

  18. Optical drug monitoring: photoacoustic imaging of nanosensors to monitor therapeutic lithium in vivo.

    PubMed

    Cash, Kevin J; Li, Chiye; Xia, Jun; Wang, Lihong V; Clark, Heather A

    2015-02-24

    Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal, we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes.

  19. Big data and new knowledge in medicine: the thinking, training, and tools needed for a learning health system.

    PubMed

    Krumholz, Harlan M

    2014-07-01

    Big data in medicine--massive quantities of health care data accumulating from patients and populations and the advanced analytics that can give those data meaning--hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This article explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately utilized, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. Project HOPE—The People-to-People Health Foundation, Inc.

  20. Big Data And New Knowledge In Medicine: The Thinking, Training, And Tools Needed For A Learning Health System

    PubMed Central

    Krumholz, Harlan M.

    2017-01-01

    Big data in medicine--massive quantities of health care data accumulating from patients and populations and the advanced analytics that can give it meaning--hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This paper explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately used, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. PMID:25006142

  1. FDT 2.0: Improving scalability of the fuzzy decision tree induction tool - integrating database storage.

    PubMed

    Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W

    2014-12-01

    Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
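
    A hedged sketch of the integration idea (keeping training data in a database so the classifier does not have to be rebuilt from scratch as new records arrive), using SQLite and a crisp scikit-learn decision tree as a stand-in for FDT's fuzzy ID3 tree; this is not the FDT 2.0 code or schema.

    ```python
    # Stand-in illustration: database-backed training data plus a crisp decision
    # tree (scikit-learn) in place of the fuzzy decision tree described in the paper.
    import sqlite3
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE samples (f1 REAL, f2 REAL, label INTEGER)")
    conn.executemany("INSERT INTO samples VALUES (?, ?, ?)",
                     [(0.1, 1.2, 0), (0.9, 0.3, 1), (0.2, 1.1, 0), (0.8, 0.4, 1)])
    conn.commit()

    rows = conn.execute("SELECT f1, f2, label FROM samples").fetchall()
    X = np.array([r[:2] for r in rows])
    y = np.array([r[2] for r in rows])

    # New observations can be appended to the table and the model refit on demand.
    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(tree.predict([[0.15, 1.0]]))
    ```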

  2. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

    The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information, with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
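
    As a minimal example of the analytical end of that spectrum (the parameter values are assumptions, and this is not a model from the chapter), the sketch below evaluates the classical steady-state solution for first-order biodegradation along a 1D streamline with uniform flow and no dispersion, C(x) = C0 * exp(-k * x / v).

    ```python
    # Steady-state 1D advection with first-order decay (dispersion neglected):
    # v dC/dx = -k C  =>  C(x) = C0 * exp(-k x / v). Parameter values are illustrative.
    import numpy as np

    def concentration(x, c0=1.0, velocity=0.1, decay_rate=0.01):
        """c0 in mg/L, velocity in m/day, decay_rate in 1/day, x in metres."""
        return c0 * np.exp(-decay_rate * x / velocity)

    x = np.linspace(0.0, 100.0, 5)   # distances downgradient of the source
    print(dict(zip(x.tolist(), concentration(x).round(6).tolist())))
    ```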

  3. Coping with Volume and Variety in Temporal Event Sequences: Strategies for Sharpening Analytic Focus.

    PubMed

    Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam

    2017-06-01

    The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics it is the number of events in the data and variety of temporal sequence patterns that challenges users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from either our own work, the literature, or based on email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.

  4. Automation is key to managing a population's health.

    PubMed

    Matthews, Michael B; Hodach, Richard

    2012-04-01

    Online tools for automating population health management can help healthcare organizations meet their patients' needs both during and between encounters with the healthcare system. These tools can facilitate: The use of registries to track patients' health status and care gaps. Outbound messaging to notify patients when they need care. Care team management of more patients at different levels of risk. Automation of workflows related to case management and transitions of care. Online educational and mobile health interventions to engage patients in their care. Analytics programs to identify opportunities for improvement.

  5. Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.

    PubMed

    Monteiro, Tiago; Almeida, Maria Gabriela

    2018-05-14

    Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with consumers or nature. Biosensor technology is one of the most active R&D domains of the analytical sciences, focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their typically quick response, high selectivity and sensitivity, low cost and portable dimensions. This review paper aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices, and the fundamentals of enzyme kinetics and amperometry. The following sections focus on enzyme-based amperometric biosensors and the different stages of their development.
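
    As a small worked example of the enzyme kinetics underpinning amperometric biosensors, the sketch below evaluates the familiar Michaelis-Menten-type current response i = i_max*[S]/(Km_app + [S]). The parameter values are assumptions chosen only for illustration, not data from the review.

    ```python
    import numpy as np

    def amperometric_response(substrate_conc, i_max, km_app):
        """Steady-state current of an enzyme electrode with Michaelis-Menten-type kinetics."""
        s = np.asarray(substrate_conc, dtype=float)
        return i_max * s / (km_app + s)

    # Illustrative glucose-biosensor numbers: i_max in microamps, Km_app and [S] in mM.
    print(amperometric_response([1.0, 5.0, 20.0], i_max=10.0, km_app=8.0))
    ```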

  6. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.

  7. Relationships between models used to analyze fire and fuel management alternatives

    Treesearch

    Nicholas L. Crookston; Werner A. Kurz; Sarah J. Beukema; Elizabeth D. Reinhardt

    2000-01-01

    Needs for analytical tools, the roles existing tools play, the processes they represent, and how they might interact are elements of key findings generated during a workshop held in Seattle February 17-18, 1999. The workshop was attended by 26 Joint Fire Science Program (JFSP) stakeholders and researchers. A focus of the workshop was the Fire and Fuels Extension to the...

  8. Information and Communication Technology to Support Self-Management of Patients with Mild Acquired Cognitive Impairments: Systematic Review

    PubMed Central

    Scholl, Jeremiah; Bartfai, Aniko; Koch, Sabine

    2012-01-01

    Background Mild acquired cognitive impairment (MACI) is a new term used to describe a subgroup of patients with mild cognitive impairment (MCI) who are expected to reach a stable cognitive level over time. This patient group is generally young and has acquired MCI from a head injury or mild stroke. Although the past decade has seen a large amount of research on how to use information and communication technology (ICT) to support self-management of patients with chronic diseases, MACI has not received much attention. Therefore, there is a lack of information about what tools have been created and evaluated that are suitable for self-management of MACI patients, and a lack of clear direction on how best to proceed with ICT tools to support self-management of MACI patients. Objective This paper aims to provide direction for further research and development of tools that can support health care professionals in assisting MACI patients with self-management. An overview of studies reporting on the design and/or evaluation of ICT tools for assisting MACI patients in self-management is presented. We also analyze the evidence of benefit provided by these tools, and how their functionality matches MACI patients’ needs to determine areas of interest for further research and development. Methods A review of the existing literature about available assistive ICT tools for MACI patients was conducted using 8 different medical, scientific, engineering, and physiotherapy library databases. The functionality of tools was analyzed using an analytical framework based on the International Classification of Functioning, Disability and Health (ICF) and a subset of common and important problems for patients with MACI created by MACI experts in Sweden. Results A total of 55 search phrases applied in the 8 databases returned 5969 articles. After review, 7 articles met the inclusion criteria. Most articles were case reports or exploratory research. Out of the 7 articles, 4 (57%) studies had fewer than 10 participants, 5 (71%) of the technologies were memory aids, and 6 of the studies involved mobile technologies. All 7 studies fit the profile for patients with MACI as described by our analytical framework. However, several areas in the framework important for meeting patient needs were not covered by the functionality in any of the ICT tools. Conclusions This study shows a lack of ICT tools developed and evaluated for supporting self-management of MACI patients. Our analytical framework was a valuable tool for providing an overview of how the functionality of these tools matched patient needs. There are a number of important areas for MACI patients that are not covered by the functionality of existing tools, such as support for interpersonal interactions and relationships. Further research on ICT tools to support self-management for patients with MACI is needed. PMID:23165152

  9. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  10. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user’s familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  11. Shouting in the Jungle - The SETI Transmission Debate

    NASA Astrophysics Data System (ADS)

    Schuch, H. P.; Almar, I.

    The prudence of transmitting deliberate messages from Earth into interstellar space remains controversial. Reasoned risk-benefit analysis is needed to inform policy recommendations by such bodies as the International Academy of Astronautics SETI Permanent Study Group. As a first step, at the 2005 International Astronautical Congress in Fukuoka, we discussed the San Marino Scale, a new analytical tool for assessing transmission risk. That Scale was updated, and a revised version presented at the 2006 IAC in Valencia. We are now in a position to recommend specific improvements to the scale we proposed for quantifying terrestrial transmissions. Our intent is to make this tool better reflect the detectability and potential impact of recent and proposed messages beamed from Earth. We believe the changes proposed herein strengthen the San Marino Scale as an analytical tool, and bring us closer to its eventual adoption.

  12. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with database approaches designed to support new forms of unstructured or semi-structured data as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the pieces of information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offer different views of how correlation-based and causality-based approaches provide complementary methods for advancing science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
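
    For readers unfamiliar with the map-reduce pattern mentioned above, the sketch below shows a Hadoop Streaming-style mapper/reducer pair in plain Python. The word-count task is purely illustrative and is not part of the RDA-BDA recommendations.

    ```python
    #!/usr/bin/env python3
    # Minimal map-reduce illustration: map each word to a count of 1, then reduce by key.
    import sys
    from itertools import groupby

    def mapper(lines):
        for line in lines:
            for word in line.split():
                yield word.lower(), 1

    def reducer(pairs):
        for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
            yield key, sum(count for _, count in group)

    if __name__ == "__main__":
        for word, n in reducer(mapper(sys.stdin)):
            print(f"{word}\t{n}")
    ```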

  13. Analytical Chemistry in the Regulatory Science of Medical Devices.

    PubMed

    Wang, Yi; Guan, Allan; Wickramasekara, Samanthi; Phillips, K Scott

    2018-06-12

    In the United States, regulatory science is the science of developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of all Food and Drug Administration-regulated products. Good regulatory science facilitates consumer access to innovative medical devices that are safe and effective throughout the Total Product Life Cycle (TPLC). Because the need to measure things is fundamental to the regulatory science of medical devices, analytical chemistry plays an important role, contributing to medical device technology in two ways: It can be an integral part of an innovative medical device (e.g., diagnostic devices), and it can be used to support medical device development throughout the TPLC. In this review, we focus on analytical chemistry as a tool for the regulatory science of medical devices. We highlight recent progress in companion diagnostics, medical devices on chips for preclinical testing, mass spectrometry for postmarket monitoring, and detection/characterization of bacterial biofilm to prevent infections.

  14. BIRCH: a user-oriented, locally-customizable, bioinformatics system.

    PubMed

    Fristensky, Brian

    2007-02-09

    Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.

  15. BIRCH: A user-oriented, locally-customizable, bioinformatics system

    PubMed Central

    Fristensky, Brian

    2007-01-01

    Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351

  16. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
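
    The abstract does not describe PAUSE's schema or endpoint, so the sketch below only illustrates the general pattern of querying a SPARQL endpoint from Python with the SPARQLWrapper library; the endpoint URL, prefixes, and predicates are hypothetical.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Hypothetical endpoint and vocabulary; the actual PAUSE schema is not public in this abstract.
    endpoint = SPARQLWrapper("http://example.org/medical/sparql")
    endpoint.setQuery("""
        PREFIX ex: <http://example.org/schema/>
        SELECT ?patient ?value
        WHERE {
            ?patient ex:hasLabResult ?lab .
            ?lab ex:analyte "creatinine" ;
                 ex:value ?value .
            FILTER(?value > 1.5)
        }
    """)
    endpoint.setReturnFormat(JSON)
    results = endpoint.query().convert()
    for row in results["results"]["bindings"]:
        print(row["patient"]["value"], row["value"]["value"])
    ```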

  17. Incorporating Ecosystem Goods and Services in Environmental Planning - Definitions, Classification and Operational Approaches

    DTIC Science & Technology

    2013-07-01

    water resource project planning and management; the authors also seek to identify any research needs to accommodate that goal. This technical note ... reviews the state of the science of EGS and highlights the types of analytical tools, techniques, and considerations that would be needed within a ...

  18. Meeting report: Ocean 'omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013).

    PubMed

    Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F

    2014-06-15

    The National Science Foundation's EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on 'omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography.

  19. Meeting report: Ocean ‘omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013)

    PubMed Central

    Gilbert, Jack A; Dick, Gregory J.; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R. M.

    2014-01-01

    The National Science Foundation’s EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on ‘omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, “big-data capable” analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean ‘omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the ‘omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography. PMID:25197495

  20. New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Muratori, Matteo

    This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.

  1. Updates in metabolomics tools and resources: 2014-2015.

    PubMed

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platforms (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available and open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialized tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. The National energy modeling system

    NASA Astrophysics Data System (ADS)

    The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focused analytical tools to examine energy policy options. For the purposes of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is believed to be less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national, and regional energy modeling and energy analysis needs.

  3. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage (CV%) and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating imprecision as the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
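
    The sigma metric and quality goal index used in studies of this kind are commonly computed as shown in the sketch below; the total allowable error (TEa) and the example numbers are assumptions for illustration, not values from this study.

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (TEa - |Bias|) / CV, with all terms expressed as percentages."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    def quality_goal_index(bias_pct, cv_pct):
        """QGI = |Bias| / (1.5 * CV): <0.8 points to imprecision, >1.2 to inaccuracy."""
        return abs(bias_pct) / (1.5 * cv_pct)

    # Example: an analyte with an assumed TEa of 10%, bias of 2%, and CV of 1.5%.
    print(sigma_metric(10.0, 2.0, 1.5))        # ~5.3 sigma
    print(quality_goal_index(2.0, 1.5))        # ~0.89, suggesting imprecision dominates
    ```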

  4. PRE-QAPP AGREEMENT (PQA) AND ANALYTICAL METHOD CHECKLISTS (AMCS): TOOLS FOR PLANNING RESEARCH PROJECTS

    EPA Science Inventory

    The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...

  5. The Global War on Terrorism: Analytical Support, Tools and Metrics of Assessment. MORS Workshop

    DTIC Science & Technology

    2005-08-11

    is the matter of intelligence; as COL(P) Keller pointed out, we need to spend less time in the intelligence cycle on managing information and ... models, decision aids ("named things"); methodologies (potentially useful things); resources (databases, people, books); meta-data on tools; develop a ... experience. Only one member (Mr. Garry Greco) had served on the Joint Intelligence Task Force for Counter Terrorism. Although Gary heavily participated

  6. Facilitating adaptive management in the Chesapeake Bay Watershed through the use of online decision support tools

    USGS Publications Warehouse

    Mullinx, Cassandra; Phillips, Scott; Shenk, Kelly; Hearn, Paul; Devereux, Olivia

    2009-01-01

    The Chesapeake Bay Program (CBP) is attempting to more strategically implement management actions to improve the health of the Nation’s largest estuary. In 2007 the U.S. Geological Survey (USGS) and U.S. Environmental Protection Agency (USEPA) CBP office began a joint effort to develop a suite of Internet-accessible decision-support tools and to help meet the needs of CBP partners to improve water quality and habitat conditions in the Chesapeake Bay and its watersheds. An adaptive management framework is being used to provide a structured decision process for information and individual tools needed to implement and assess practices to improve the condition of the Chesapeake Bay ecosystem. The Chesapeake Online Adaptive Support Toolkit (COAST) is a collection of web-based analytical tools and information, organized in an adaptive management framework, intended to aid decisionmakers in protecting and restoring the integrity of the Bay ecosystem. The initial version of COAST is focused on water quality issues. During early and mid-2008, initial ideas for COAST were shared and discussed with various CBP partners and other potential user groups. At these meetings, test cases were selected to help improve understanding of the types of information and analytical functionality that would be most useful for specific partners’ needs. These discussions added considerable knowledge about the nature of decisionmaking for Federal, State, local and nongovernmental partners. Version 1.0 of COAST, released in early winter of 2008, will be further reviewed to determine improvements needed to address implementation and assessment of water quality practices. Future versions of COAST may address other aspects of ecosystem restoration, including restoration of habitat and living resources and maintaining watershed health.

  7. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  8. Big Data Analytics in Chemical Engineering.

    PubMed

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  9. Aviation Modeling and Simulation Needs and Requirements Workshop: January 27-28, 1999

    DOT National Transportation Integrated Search

    1999-01-01

    A two-day workshop was held at the Volpe Center on January 27-28, 1999. The purpose of the workshop was to: 1) identify and understand the requirements for analytical and planning tool initiatives that will give decision makers insight into the capac...

  10. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  11. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  12. SU-C-204-01: A Fast Analytical Approach for Prompt Gamma and PET Predictions in a TPS for Proton Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, K; Herzog, M; Landry, G

    2015-06-15

    Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3 % in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
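
    A minimal sketch of the filtering idea described above: the prompt-gamma depth profile is approximated as a weighted sum of convolutions of the depth dose profile with one filter function per chemical element. The array shapes, weights, and normalization are assumptions; this is not the published tool.

    ```python
    import numpy as np

    def prompt_gamma_profile(depth_dose, element_filters, weights):
        """Approximate a prompt-gamma depth profile as a weighted sum of convolutions
        of the depth dose profile with per-element filter functions."""
        profile = np.zeros_like(depth_dose)
        for w, f in zip(weights, element_filters):
            profile += w * np.convolve(depth_dose, f, mode="same")
        return profile

    # Toy inputs: a crude Bragg-like dose curve and two made-up elemental filters.
    dose = np.concatenate([np.linspace(0.3, 1.0, 80), np.zeros(20)])
    filters = [np.exp(-np.linspace(0, 3, 15)), np.ones(5) / 5.0]
    print(prompt_gamma_profile(dose, filters, weights=[0.7, 0.3])[:10])
    ```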

  13. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, using a scale from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Tiered Approach to Resilience Assessment.

    PubMed

    Linkov, Igor; Fox-Lent, Cate; Read, Laura; Allen, Craig R; Arnott, James C; Bellini, Emanuele; Coaffee, Jon; Florin, Marie-Valentine; Hatfield, Kirk; Hyde, Iain; Hynes, William; Jovanovic, Aleksandar; Kasperson, Roger; Katzenberger, John; Keys, Patrick W; Lambert, James H; Moss, Richard; Murdoch, Peter S; Palma-Oliveira, Jose; Pulwarty, Roger S; Sands, Dale; Thomas, Edward A; Tye, Mari R; Woods, David

    2018-04-25

    Regulatory agencies have long adopted a three-tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into the existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and resources available, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts of resilience quantification method development, but instead provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and with existing policies. Published 2018. This article is a U.S. government work and is in the public domain in the USA.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, S.; Katz, J.; Wurtenberger, L.

    Low emission development strategies (LEDS) articulate economy-wide policies and implementation plans designed to enable a country to meet its long-term development objectives while reducing greenhouse gas emissions. A development impact assessment tool was developed to inform an analytically robust and transparent prioritization of LEDS actions based on their economic, social, and environmental impacts. The graphical tool helps policymakers communicate the development impacts of LEDS options and identify actions that help meet both emissions reduction and development goals. This paper summarizes the adaptation and piloting of the tool in Kenya and Montenegro. The paper highlights strengths of the tool and discusses key needs for improving it.

  16. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    PubMed

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  17. Technology advancement for integrative stem cell analyses.

    PubMed

    Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi

    2014-12-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation to further development, often disregards the heterogeneity inherent among individual constituents within a given population. The population-based analysis and characterization of stem cells and the problems associated with such a blanket approach only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to a growth in the stem cell analytical field, underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments and there are many technical challenges that limit vertically integrated analytical tools. Therefore, we propose--by introducing a concept of vertical and horizontal approaches--that there is a need for adequate methods for the integration of information, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment.

  18. Technology Advancement for Integrative Stem Cell Analyses

    PubMed Central

    Jeong, Yoon

    2014-01-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation to further development, often disregards the heterogeneity inherent among individual constituents within a given population. The population-based analysis and characterization of stem cells and the problems associated with such a blanket approach only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to a growth in the stem cell analytical field, underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments and there are many technical challenges that limit vertically integrated analytical tools. Therefore, we propose—by introducing a concept of vertical and horizontal approaches—that there is a need for adequate methods for the integration of information, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment. PMID:24874188

  19. THREAT ANTICIPATION AND DECEPTIVE REASONING USING BAYESIAN BELIEF NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E

    Recent events highlight the need for tools to anticipate threats posed by terrorists. Assessing these threats requires combining information from disparate data sources such as analytic models, simulations, historical data, sensor networks, and user judgments. These disparate data can be combined in a coherent, analytically defensible, and understandable manner using a Bayesian belief network (BBN). In this paper, we develop a BBN threat anticipatory model based on a deceptive reasoning algorithm using a network engineering process that treats the probability distributions of the BBN nodes within the broader context of the system development process.
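
    A toy example of the BBN machinery referenced above, written with the pgmpy library (the model class is named BayesianNetwork in recent releases and BayesianModel in older ones). The node names and probabilities are invented for illustration and have no connection to the actual threat model.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Hypothetical two-node network: hidden intent influences an observable activity indicator.
    model = BayesianNetwork([("Intent", "ObservedActivity")])
    model.add_cpds(
        TabularCPD("Intent", 2, [[0.95], [0.05]]),           # P(benign), P(hostile) -- assumed priors
        TabularCPD("ObservedActivity", 2,
                   [[0.9, 0.3],    # P(no suspicious activity | benign), ... | hostile)
                    [0.1, 0.7]],   # P(suspicious activity | benign),    ... | hostile)
                   evidence=["Intent"], evidence_card=[2]),
    )
    model.check_model()

    # Posterior over intent after observing suspicious activity (state index 1).
    infer = VariableElimination(model)
    print(infer.query(["Intent"], evidence={"ObservedActivity": 1}))
    ```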

  20. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can be used to positively impact analytical reasoning and decision making in medical education through the realization of variables capable of enhancing human perception and cognition of complex curriculum data. The positive results derived from our evaluation of a medical curriculum, albeit on a small scale, signify the need to expand this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  1. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  2. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    PubMed

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics analysis may be performed using several public tools, many analytical pipelines offer too few options for the optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing the data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  3. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
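
    As a rough illustration of the trade-off described above (not the CEA/OpenMDAO implementation itself), the sketch below compares an analytic derivative of a toy combustion-temperature surrogate against a central finite difference; the surrogate T(phi) and all constants are hypothetical.

      # Minimal sketch, assuming a made-up surrogate T(phi): analytic derivatives
      # are exact and need no extra model evaluations, while central differencing
      # costs two additional evaluations per design variable and carries step error.
      import math

      def temperature(phi):                 # hypothetical combustion-temperature surrogate
          return 2200.0 * math.exp(-(phi - 1.05) ** 2)

      def dT_dphi_analytic(phi):            # exact derivative of the surrogate
          return -2.0 * (phi - 1.05) * temperature(phi)

      def dT_dphi_fd(phi, h=1e-6):          # central finite difference for comparison
          return (temperature(phi + h) - temperature(phi - h)) / (2.0 * h)

      phi = 0.9
      print(dT_dphi_analytic(phi), dT_dphi_fd(phi))   # the two values agree closely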

  4. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  5. Ontario Universities Statistical Compendium, 1970-90. Part A: Micro-Indicators.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto. Research Div.

    This publication provides macro-indicators and complementary analyses and supporting data for use by policy and decision makers concerned with Ontario universities. These analytical tools are meant to unambiguously measure what is taking place in Canadian postsecondary education, and therefore, assist in focusing on what decisions need to be made.…

  6. Experts workshop on the ecotoxicological risk assessment of ionizable organic chemicals: Towards a science-based framework for chemical assessment

    EPA Science Inventory

    There is a growing need to develop analytical methods and tools that can be applied to assess the environmental risks associated with charged, polar, and ionisable organic chemicals, such as those used as active pharmaceutical ingredients, biocides, and surface active chemicals. ...

  7. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  8. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    PubMed

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 ( www.comparativego.com ). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
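
    The statistics behind Comparative GO are not shown in the abstract; as a generic illustration of how GO-term over-representation is commonly tested, the hedged sketch below applies a one-sided hypergeometric test with made-up gene counts.

      # Generic GO enrichment sketch (illustrative only, not the Comparative GO code):
      # probability of observing at least k term-annotated genes in a gene list.
      from scipy.stats import hypergeom

      M = 26000   # all annotated zebrafish genes (hypothetical count)
      n = 350     # genes annotated with the GO term of interest
      N = 800     # genes in the user's differentially expressed list
      k = 30      # overlap between the two sets

      p_value = hypergeom.sf(k - 1, M, n, N)   # P(X >= k) under sampling without replacement
      print(f"enrichment p-value: {p_value:.3g}")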

  9. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
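
    As a hedged, minimal sketch of the kind of predictive modeling described above (not a model from the cited review), the example below fits a logistic regression to synthetic claims-style features to score patients for noncompliance risk; every feature name and value is invented.

      # Sketch with synthetic data: a simple medication-noncompliance risk model.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 2000
      X = np.column_stack([
          rng.integers(0, 15, n),       # number of active prescriptions (hypothetical feature)
          rng.integers(18, 90, n),      # age
          rng.random(n),                # prior-year proportion of days covered
      ])
      y = (X[:, 2] + 0.02 * X[:, 0] + rng.normal(0, 0.3, n) < 0.6).astype(int)  # synthetic label

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      risk = model.predict_proba(X_te)[:, 1]        # per-patient risk scores for targeting
      print("held-out accuracy:", model.score(X_te, y_te))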

  10. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes at <6 sigma level, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics are a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
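
    For readers unfamiliar with the calculation, the sketch below shows the commonly used sigma-metric and quality goal index formulas the abstract relies on; the total allowable error, bias and CV values are illustrative, not taken from the study.

      # Sigma = (TEa - Bias) / CV and QGI = Bias / (1.5 x CV), all expressed in percent.
      def sigma_metric(tea_pct, bias_pct, cv_pct):
          return (tea_pct - bias_pct) / cv_pct

      def quality_goal_index(bias_pct, cv_pct):
          # QGI < 0.8 points to imprecision, > 1.2 points to inaccuracy
          return bias_pct / (1.5 * cv_pct)

      # Example analyte: 10% total allowable error, 2% bias, 3% CV
      print(sigma_metric(10.0, 2.0, 3.0))       # about 2.7 sigma -> needs improvement
      print(quality_goal_index(2.0, 3.0))       # about 0.44 -> imprecision is the issue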

  11. Using 3D Printing for Rapid Prototyping of Characterization Tools for Investigating Powder Blend Behavior.

    PubMed

    Hirschberg, Cosima; Boetker, Johan P; Rantanen, Jukka; Pein-Hackelbusch, Miriam

    2018-02-01

    There is an increasing need to provide more detailed insight into the behavior of particulate systems. The current powder characterization tools are developed empirically and, in many cases, modification of existing equipment is difficult. More flexible tools are needed to provide understanding of complex powder behavior, such as the mixing process and segregation phenomena. An approach based on the fast prototyping of new powder handling geometries and interfacing solutions for process analytical tools is reported. This study utilized 3D printing for rapid prototyping of customized geometries; the overall goal was to assess the mixing process of powder blends at small scale with a combination of spectroscopic and mechanical monitoring. As part of the segregation evaluation studies, the flowability of three different paracetamol/filler blends at different ratios was investigated, inter alia to define the percolation thresholds. Blends with a paracetamol wt% above the percolation threshold were subsequently investigated in relation to their segregation behavior. Rapid prototyping using 3D printing allowed the design of two funnels with tailored flow behavior (funnel flow) of model formulations, which could be monitored with an in-line near-infrared (NIR) spectrometer. Calculating the root mean square (RMS) of the scores of the first two principal components of the NIR spectra visualized spectral variation as a function of process time. In the same setup, mechanical properties (basic flow energy) of the powder blend were monitored during blending. Rapid prototyping allowed for fast modification of powder testing geometries and easy interfacing with process analytical tools, opening new possibilities for more detailed powder characterization.
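
    A hedged sketch of the monitoring statistic mentioned above follows: the root mean square of the first two principal-component scores of each in-line NIR spectrum, tracked over process time. The spectra are simulated placeholders, not data from the study.

      # Simulated NIR-style spectra: 300 time points x 120 wavelength channels.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      spectra = rng.normal(size=(300, 120))
      spectra[150:] += 0.5                        # crude stand-in for a change in the blend

      scores = PCA(n_components=2).fit_transform(spectra)
      rms = np.sqrt(np.mean(scores ** 2, axis=1)) # one RMS value per time point
      print(rms[:3], rms[-3:])                    # the RMS trajectory reveals the shift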

  12. Big data analytics in hyperspectral imaging for detection of microbial colonies on agar plates (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yoon, Seung-Chul; Park, Bosoon; Lawrence, Kurt C.

    2017-05-01

    Various types of optical imaging techniques measuring light reflectivity and scattering can detect microbial colonies of foodborne pathogens on agar plates. Until recently, these techniques were developed to provide solutions for hypothesis-driven studies, which focused on developing tools and batch/offline machine learning methods with well defined sets of data. These have relatively high accuracy and rapid response time because the tools and methods are often optimized for the collected data. However, they often need to be retrained or recalibrated when new untrained data and/or features are added. A big-data driven technique is more suitable for online learning of new/ambiguous samples and for mining unknown or hidden features. Although big data research in hyperspectral imaging is emerging in remote sensing and many tools and methods have been developed so far in many other applications such as bioinformatics, the tools and methods still need to be evaluated and adjusted in applications where the conventional batch machine learning algorithms were dominant. The primary objective of this study is to evaluate appropriate big data analytic tools and methods for online learning and mining of foodborne pathogens on agar plates. After the tools and methods are successfully identified, they will be applied to rapidly search big color and hyperspectral image data of microbial colonies collected over the past 5 years in house and find the most probable colony or a group of colonies in the collected big data. The meta-data, such as collection time and any unstructured data (e.g. comments), will also be analyzed and presented with output results. The expected results will be novel, big data-driven technology to correctly detect and recognize microbial colonies of various foodborne pathogens on agar plates.
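
    As a hedged illustration of the online-learning idea (not the authors' system), the sketch below updates a linear classifier incrementally with partial_fit as new batches of simulated spectral pixels arrive, instead of retraining a batch model from scratch.

      # Streaming / online learning sketch on simulated spectral features.
      import numpy as np
      from sklearn.linear_model import SGDClassifier

      rng = np.random.default_rng(2)
      classes = np.array([0, 1])                  # e.g., background vs. colony pixel (hypothetical)
      clf = SGDClassifier(random_state=0)

      for batch in range(10):                     # batches arriving over time
          X = rng.normal(size=(200, 50))          # 200 pixels x 50 spectral bands
          y = (X[:, 0] > 0).astype(int)           # synthetic labels
          clf.partial_fit(X, y, classes=classes)  # incremental update, no full retraining

      X_test = rng.normal(size=(500, 50))
      y_test = (X_test[:, 0] > 0).astype(int)
      print("held-out accuracy:", clf.score(X_test, y_test))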

  13. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  14. Strategic Integration of Multiple Bioinformatics Resources for System Level Analysis of Biological Networks.

    PubMed

    D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia

    2017-01-01

    Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) ( http://lynx.cri.uchicago.edu ), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.

  15. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts of organic solvents (only a few microliters), or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need of using ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  16. Customisation of the exome data analysis pipeline using a combinatorial approach.

    PubMed

    Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay

    2012-01-01

    The advent of next-generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, the aforementioned approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the most suitable combination, determined by a simple framework of pre-existing metrics, to create significant datasets.
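
    The consensus idea described above can be illustrated with a small, hedged sketch: keep only the variants reported by a majority of callers. The variant records and caller names below are invented.

      # Majority-vote consensus over three hypothetical variant callers.
      from collections import Counter

      caller_a = {("chr1", 101, "A", "G"), ("chr1", 250, "C", "T"), ("chr2", 77, "G", "A")}
      caller_b = {("chr1", 101, "A", "G"), ("chr2", 77, "G", "A")}
      caller_c = {("chr1", 101, "A", "G"), ("chr3", 5, "T", "C")}

      votes = Counter(v for calls in (caller_a, caller_b, caller_c) for v in calls)
      consensus = {v for v, n in votes.items() if n >= 2}   # retained as likely true variants
      print(sorted(consensus))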

  17. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  18. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in programming languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
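
    The paper's exact stack is not spelled out in the abstract; as one common way to wrap an analysis function as an interactive notebook APP, the hedged sketch below uses ipywidgets in a Jupyter Notebook with a toy cohort table.

      # Turning a plain analysis function into an interactive control (runs in Jupyter).
      import pandas as pd
      from ipywidgets import interact

      df = pd.DataFrame({"age": [34, 61, 47, 72], "readmitted": [0, 1, 0, 1]})  # toy cohort

      def readmission_rate(min_age=40):
          cohort = df[df["age"] >= min_age]
          return cohort["readmitted"].mean() if len(cohort) else float("nan")

      interact(readmission_rate, min_age=(18, 90, 1))   # renders a slider; no code editing needed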

  19. Verification of the Icarus Material Response Tool

    NASA Technical Reports Server (NTRS)

    Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre

    2017-01-01

    Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool that is intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing is critical, it is of the utmost importance that the design tools be extensively verified and validated before their use. Verification tests aim at ensuring that the numerical schemes and equations are implemented correctly, by comparison to analytical solutions and through grid convergence tests.
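
    As a generic illustration of the grid-convergence part of such verification (not the Icarus test suite itself), the sketch below computes an observed order of accuracy from a quantity evaluated on three systematically refined grids; the values are made up.

      # Observed order of accuracy p from three grids with constant refinement ratio r.
      import math

      f_coarse, f_medium, f_fine = 512.8, 503.1, 500.7   # illustrative quantity of interest
      r = 2.0                                            # uniform refinement ratio

      p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
      print(f"observed order of accuracy: {p:.2f}")      # compare with the scheme's formal order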

  20. Machine Learning Technologies Translates Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    NASA Astrophysics Data System (ADS)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.

  1. A miniaturized optoelectronic system for rapid quantitative label-free detection of harmful species in food

    NASA Astrophysics Data System (ADS)

    Raptis, Ioannis; Misiakos, Konstantinos; Makarona, Eleni; Salapatas, Alexandros; Petrou, Panagiota; Kakabakos, Sotirios; Botsialas, Athanasios; Jobst, Gerhard; Haasnoot, Willem; Fernandez-Alba, Amadeo; Lees, Michelle; Valamontes, Evangelos

    2016-03-01

    Optical biosensors have emerged in the past decade as the most promising candidates for portable, highly sensitive bioanalytical systems that can be employed for in-situ measurements. In this work, a miniaturized optoelectronic system for rapid, quantitative, label-free detection of harmful species in food is presented. The proposed system has four distinctive features that can render it a powerful tool for the next generation of Point-of-Need applications: it accommodates the light sources and ten interferometric biosensors on a single silicon chip with a less-than-40 mm2 footprint; each sensor can be individually functionalized for a specific target analyte; the encapsulation can be performed at the wafer scale; and, finally, it exploits a new operation principle, Broad-band Mach-Zehnder Interferometry, to improve its analytical capabilities. Multi-analyte evaluation schemes for the simultaneous detection of harmful contaminants, such as mycotoxins, allergens and pesticides, proved that the proposed system is capable of detecting these substances within a short time at concentrations below the limits imposed by regulatory authorities, rendering it a novel tool for near-future food safety applications.

  2. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayden, D. W.

    This project will develop an analytical tool to calculate performance of HMX based PBXs in the skid test. The skid-test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate drop height of HMX based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of Hatler et al., and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spot).
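
    A minimal sketch of the numerical core proposed above is shown below: explicit finite differences for one-dimensional transient conduction with an Arrhenius self-heating source, a Frank-Kamenetskii-type problem. Every material constant, the surface-temperature rise, and the contact time are hypothetical placeholders, not values for any real explosive.

      # 1-D explicit finite-difference conduction with an Arrhenius source term.
      import numpy as np

      nx, dx, dt = 200, 1e-4, 1e-4        # grid points, spacing (m), time step (s)
      alpha = 1e-7                        # thermal diffusivity (m^2/s), hypothetical
      A, Ea, R = 1e9, 2.2e5, 8.314        # prefactor (K/s), activation energy (J/mol), gas constant

      T = np.full(nx, 300.0)              # initial billet temperature (K)
      T[0] = 600.0                        # assumed frictional surface-temperature rise at impact

      for step in range(2000):            # march through the assumed contact time
          lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
          source = A * np.exp(-Ea / (R * T))              # self-heating from decomposition
          T[1:-1] += dt * (alpha * lap[1:-1] + source[1:-1])

      print("peak interior temperature (K):", T[1:-1].max())  # runaway here would suggest a reaction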

  3. 2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions

    DTIC Science & Technology

    2017-12-21

    modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability ... to enable predictive analytics for expeditionary on-demand manufacturing • Discovery of design principles to enable programming advanced genetic ... goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data

  4. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with data contained in the datasets with which they are working. Key technologies that are critical components to the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly more challenging to accomplish. For example, NASA Earth Science Data and Information System (ESDIS) alone grew from having just over 4 PBs of data in 2009 to nearly 6 PBs of data in 2011. This amount then increased to roughly 10 PBs of data in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to be able to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. As a result, there are very few known analytics tools that interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.

  5. Usage Patterns of a Mobile Palliative Care Application.

    PubMed

    Zhang, Haipeng; Liu, David; Marks, Sean; Rickerson, Elizabeth M; Wright, Adam; Gordon, William J; Landman, Adam

    2018-06-01

    Fast Facts Mobile (FFM) was created to be a convenient way for clinicians to access the Fast Facts and Concepts database of palliative care articles on a smartphone or tablet device. We analyzed usage patterns of FFM through an integrated analytics platform on the mobile versions of the FFM application. The primary objective of this study was to evaluate the usage data from FFM as a way to better understand user behavior for FFM as a palliative care educational tool. This is an exploratory, retrospective analysis of de-identified analytics data collected through the iOS and Android versions of FFM captured from November 2015 to November 2016. FFM App download statistics from November 1, 2015, to November 1, 2016, were accessed from the Apple and Google development websites. Further FFM session data were obtained from the analytics platform built into FFM. FFM was downloaded 9409 times over the year with 201,383 articles accessed. The most searched-for terms in FFM include the following: nausea, methadone, and delirium. We compared frequent users of FFM to infrequent users of FFM and found that 13% of all users comprise 66% of all activity in the application. Demand for useful and scalable tools for both primary palliative care and specialty palliative care will likely continue to grow. Understanding the usage patterns for FFM has the potential to inform the development of future versions of Fast Facts. Further studies of mobile palliative care educational tools will be needed to further define the impact of these educational tools.

  6. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  7. Detection of nanomaterials in food and consumer products: bridging the gap from legislation to enforcement.

    PubMed

    Stamm, H; Gibson, N; Anklam, E

    2012-08-01

    This paper describes the requirements and resulting challenges for the implementation of current and upcoming European Union legislation referring to the use of nanomaterials in food, cosmetics and other consumer products. The European Commission has recently adopted a recommendation for the definition of nanomaterials. There is now an urgent need for appropriate and fit-for-purpose analytical methods in order to identify nanomaterials properly according to this definition and to assess whether or not a product contains nanomaterials. Considering the lack of such methods to date, this paper elaborates on the challenges of the legislative framework and the type of methods needed, not only to facilitate implementation of labelling requirements, but also to ensure the safety of products coming to the market. Considering the many challenges in the analytical process itself, such as interaction of nanoparticles with matrix constituents, potential agglomeration and aggregation due to matrix environment, broad variety of matrices, etc., there is a need for integrated analytical approaches, not only for sample preparation (e.g. separation from matrix), but also for the actual characterisation. Furthermore, there is an urgent need for quality assurance tools such as validated methods and (certified) reference materials, including materials containing nanoparticles in a realistic matrix (food products, cosmetics, etc.).

  8. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
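
    The core of the HDT is a partial order over procedures; as a hedged, minimal sketch of that idea (not the authors' software), the example below flags the procedures that are not dominated on any of a few made-up variables.

      # Dominance check: A dominates B if A is at least as good everywhere and strictly better once.
      procedures = {                 # higher score = greener / better performance (invented values)
          "P1": (3, 2, 5),
          "P2": (2, 2, 4),
          "P3": (4, 1, 5),
      }

      def dominates(a, b):
          return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

      maximal = [name for name, v in procedures.items()
                 if not any(dominates(w, v) for w in procedures.values())]
      print("non-dominated procedures:", maximal)   # candidates for 'most valuable'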

  9. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  10. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  11. Sampling and analysis for radon-222 dissolved in ground water and surface water

    USGS Publications Warehouse

    DeWayne, Cecil L.; Gesell, T.F.

    1992-01-01

    Radon-222 is a naturally occurring radioactive gas in the uranium-238 decay series that has traditionally been called, simply, radon. The lung cancer risks associated with the inhalation of radon decay products have been well documented by epidemiological studies on populations of uranium miners. The realization that radon is a public health hazard has raised the need for sampling and analytical guidelines for field personnel. Several sampling and analytical methods are being used to document radon concentrations in ground water and surface water worldwide but no convenient, single set of guidelines is available. Three different sampling and analytical methods - bubbler, liquid scintillation, and field screening - are discussed in this paper. The bubbler and liquid scintillation methods have high accuracy and precision, and small analytical method detection limits of 0.2 and 10 pCi/l (picocuries per liter), respectively. The field screening method generally is used as a qualitative reconnaissance tool.

  12. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  13. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, boeing, collaborative analysis.

  14. Human/autonomy collaboration for the automated generation of intelligence products

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert

    2017-05-01

    Intelligence Analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables the analysts to only provide a constrained English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. Therefore, the analyst can quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.

  15. Prologue: Toward Accurate Identification of Developmental Language Disorder Within Linguistically Diverse Schools.

    PubMed

    Oetting, Janna B

    2018-04-05

    Although the 5 studies presented within this clinical forum include children who differ widely in locality, language learning profile, and age, all were motivated by a desire to improve the accuracy at which developmental language disorder is identified within linguistically diverse schools. The purpose of this prologue is to introduce the readers to a conceptual framework that unites the studies while also highlighting the approaches and methods each research team is pursuing to improve assessment outcomes within their respective linguistically diverse community. A disorder within diversity framework is presented to replace previous difference vs. disorder approaches. Then, the 5 studies within the forum are reviewed by clinical question, type of tool(s), and analytical approach. Across studies of different linguistically diverse groups, research teams are seeking answers to similar questions about child language screening and diagnostic practices, using similar analytical approaches to answer their questions, and finding promising results with tools focused on morphosyntax. More studies that are modeled after or designed to extend those in this forum are needed to improve the accuracy at which developmental language disorder is identified.

  16. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data. PMID:20525257

  17. Growth and yield model application in tropical rain forest management

    Treesearch

    James Atta-Boateng; John W., Jr. Moser

    2000-01-01

    Analytical tools are needed to evaluate the impact of management policies on the sustainable use of rain forest. Optimal decisions concerning the level of management inputs require accurate predictions of output at all relevant input levels. Using growth data from 40 1-hectare permanent plots obtained from the semi-deciduous forest of Ghana, a system of 77 differential...

  18. Using analytical tools for decision-making and program planning in natural resources: breaking the fear barrier

    Treesearch

    David L. Peterson; Daniel L. Schmoldt

    1999-01-01

    The National Park Service and other public agencies are increasing their emphasis on inventory and monitoring (I&M) programs to obtain the information needed to infer changes in resource conditions and trigger management responses. A few individuals on a planning team can develop I&M programs, although a focused workshop is more effective. Workshops are...

  19. Making sense of human ecology mapping: an overview of approaches to integrating socio-spatial data into environmental planning

    Treesearch

    Rebecca McLain; Melissa R. Poe; Kelly Biedenweg; Lee K. Cerveny; Diane Besser; Dale J. Blahna

    2013-01-01

    Ecosystem-based planning and management have stimulated the need to gather sociocultural values and human uses of land in formats accessible to diverse planners and researchers. Human Ecology Mapping (HEM) approaches offer promising spatial data gathering and analytical tools, while also addressing important questions about human-landscape connections. This article...

  20. The Possibilities and Limits of the Structure-Agency Dialectic in Advancing Science for All

    ERIC Educational Resources Information Center

    Gutiérrez, Kris D.; Calabrese Barton, Angela

    2015-01-01

    In this special issue, the structure-agency dialectic is used to shift the analytic frame in science education from focusing on youth as in need of remediation to rethinking new arrangements, tools, and forms of assistance and participation in support of youth learning science. This shift from "fixing" the individual to re-mediating and…

  1. Resilient Therapy: Strategic Therapeutic Engagement with Children in Crisis

    ERIC Educational Resources Information Center

    Hart, Angie; Blincow, Derek

    2008-01-01

    This article offers an overview of Resilient Therapy (RT) and outlines a case study of how it can be used in practice. RT draws on the resilience research base, and has been designed to meet the needs of children in crisis by providing insights and analytical tools that help carers and practitioners build relationships of trust in the hardest of…

  2. A comparison of 1D analytical model and 3D finite element analysis with experiments for a rosen-type piezoelectric transformer.

    PubMed

    Boukazouha, F; Poulin-Vittrant, G; Tran-Huu-Hue, L P; Bavencoffe, M; Boubenider, F; Rguiti, M; Lethiecq, M

    2015-07-01

    This article is dedicated to the study of Piezoelectric Transformers (PTs), which offer promising solutions to the increasing need for integrated power electronics modules within autonomous systems. The advantages offered by such transformers include: immunity to electromagnetic disturbances; ease of miniaturisation, for example using conventional microfabrication processes; and enhanced performance in terms of voltage gain and power efficiency. Central to the adequate description of such transformers is the need for complex analytical modeling tools, especially if one is attempting to include combined contributions due to (i) mechanical phenomena owing to the propagation modes, which differ at the primary and secondary sides of the PT; and (ii) electrical phenomena such as the voltage gain and power efficiency, which depend on the electrical load. The present work demonstrates an original one-dimensional (1D) analytical model dedicated to a Rosen-type PT, and simulation results are successively compared against those of a three-dimensional (3D) Finite Element Analysis (COMSOL Multiphysics software) and against experimental results. The Rosen-type PT studied here is based on a single-layer soft PZT (P191) with dimensions 18 mm × 3 mm × 1.5 mm, operated at the second harmonic of 176 kHz. Detailed simulation and experimental results show that the presented 1D model predicts experimental measurements of the voltage gain to within less than 10% error at the second and third resonance frequency modes. Adjustment of the analytical model parameters is found to decrease errors relative to the experimental voltage gain to within 1%, whilst a 2.5% error on the output admittance magnitude at the second resonance mode was obtained. Relying on the single assumption of one-dimensionality, the present analytical model appears to be a useful tool for Rosen-type PT design and behavior understanding. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Supporting cognition in systems biology analysis: findings on users' processes and design implications.

    PubMed

    Mirel, Barbara

    2009-02-13

    Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and reveal that tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition during exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making a transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and now reduce current mismatches with several of scientists' higher-order analytical practices. The implications of the results for tool designs are discussed.

  4. Family history in public health practice: a genomic tool for disease prevention and health promotion.

    PubMed

    Valdez, Rodolfo; Yoon, Paula W; Qureshi, Nadeem; Green, Ridgely Fisk; Khoury, Muin J

    2010-01-01

    Family history is a risk factor for many chronic diseases, including cancer, cardiovascular disease, and diabetes. Professional guidelines usually include family history to assess health risk, initiate interventions, and motivate behavioral changes. The advantages of family history over other genomic tools include a lower cost, greater acceptability, and a reflection of shared genetic and environmental factors. However, the utility of family history in public health has been poorly explored. To establish family history as a public health tool, it needs to be evaluated within the ACCE framework (analytical validity; clinical validity; clinical utility; and ethical, legal, and social issues). Currently, private and public organizations are developing tools to collect standardized family histories of many diseases. Their goal is to create family history tools that have decision support capabilities and are compatible with electronic health records. These advances will help realize the potential of family history as a public health tool.

  5. Perspectives on making big data analytics work for oncology.

    PubMed

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. This data comprises a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications, where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, the Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter them, ranging from incorporating prior knowledge and using information-theoretic techniques to modern ensemble machine learning approaches, or combinations of these. We will particularly discuss the pros and cons of different approaches to improve mining of big data in oncology. Copyright © 2016 Elsevier Inc. All rights reserved.
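
    As a toy, hedged illustration of the p ≫ n setting discussed above, the sketch below fits an ensemble learner (one of the mitigation families the article names) to synthetic data with far more features than samples and reads off its out-of-bag accuracy; nothing here comes from real oncology data.

      # p >> n toy example: 80 samples, 5000 features, only two of them informative.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(3)
      n, p = 80, 5000
      X = rng.normal(size=(n, p))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)

      clf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0).fit(X, y)
      print("out-of-bag accuracy:", clf.oob_score_)   # honest estimate without a separate test set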

  6. [Management of pre-analytical nonconformities].

    PubMed

    Berkane, Z; Dhondt, J L; Drouillard, I; Flourié, F; Giannoli, J M; Houlbert, C; Surgat, P; Szymanowicz, A

    2010-12-01

    The main nonconformities are enumerated to facilitate consensual codification. In each case, an action is defined: refusal to perform the examination with a request for a new sample, a request for information or correction, cancellation of results, or notification of the nurse or physician. Traceability of the curative, corrective and preventive actions is needed. Then, a methodology and indicators are proposed to assess nonconformities and to follow quality improvements. The laboratory information system can be used instead of dedicated software. Tools for the follow-up of nonconformity scores are proposed. Finally, we propose an organization and some tools allowing the management and control of the nonconformities occurring during the pre-examination phase.

  7. Guidebook for analysis of tether applications

    NASA Technical Reports Server (NTRS)

    Carroll, J. A.

    1985-01-01

    This guidebook is intended as a tool to facilitate initial analyses of proposed tether applications in space. The guiding philosophy is that a brief analysis of all the common problem areas is far more useful than a detailed study in any one area. Such analyses can minimize the waste of resources on elegant but fatally flawed concepts, and can identify the areas where more effort is needed on concepts which do survive the initial analyses. The simplified formulas, approximations, and analytical tools included should be used only for preliminary analyses. For detailed analyses, the references with each topic and in the bibliography may be useful.

  8. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry – from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  9. A review of recent advances in risk analysis for wildfire management

    Treesearch

    Carol Miller; Alan A. Ager

    2012-01-01

    Risk analysis evolved out of the need to make decisions concerning highly stochastic events, and is well suited to analyze the timing, location and potential effects of wildfires. Over the past 10 years, the application of risk analysis to wildland fire management has seen steady growth with new risk-based analytical tools that support a wide range of fire and fuels...

  10. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  11. Scientists' sense making when hypothesizing about disease mechanisms from expression data and their needs for visualization support.

    PubMed

    Mirel, Barbara; Görg, Carsten

    2014-04-26

    A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists' analytical workflows and their implications for tool design.

  12. Scientists’ sense making when hypothesizing about disease mechanisms from expression data and their needs for visualization support

    PubMed Central

    2014-01-01

    A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists’ analytical workflows and their implications for tool design. PMID:24766796

  13. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization (MALDI) has proven to be an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
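
    The core TAD idea — spike the target at roughly its original LOD so that sub-LOD endogenous analyte shows up as excess signal — can be illustrated with a toy numerical sketch. Everything below (linear response, noise model, concentrations) is invented for illustration and is not the authors' assay.

    ```python
    # Illustrative numeric sketch (not from the paper) of the TAD idea.
    import numpy as np

    rng = np.random.default_rng(0)

    lod = 1.0                  # original LOD, arbitrary concentration units
    endogenous = 0.4 * lod     # true endogenous level, below the unspiked LOD
    spike = lod                # recommended exogenous spike ~ original LOD
    noise_sd = lod / 3.0       # detector noise (LOD assumed ~3 * noise_sd)

    def signal(conc):
        """Toy linear response with additive detector noise."""
        return conc + rng.normal(0.0, noise_sd)

    # Replicate measurements of the spiked sample and of a spiked blank.
    sample_plus_spike = np.array([signal(endogenous + spike) for _ in range(10)])
    blank_plus_spike = np.array([signal(spike) for _ in range(10)])

    # Excess signal over the spiked blank indicates endogenous analyte.
    excess = sample_plus_spike.mean() - blank_plus_spike.mean()
    print(f"estimated endogenous level: {excess:.2f} (true {endogenous:.2f})")
    ```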

  14. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven to be an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements of 3-fold on average in a simple sample and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.

  15. Electrochemical detection for microscale analytical systems: a review.

    PubMed

    Wang, Joseph

    2002-02-11

    As the field of chip-based microscale systems continues its rapid growth, there is an urgent need to develop compatible detection modes. Electrochemical detection offers considerable promise for such microfluidic systems, with features that include remarkable sensitivity, inherent miniaturization and portability, independence of optical path length or sample turbidity, low cost, low power requirements and high compatibility with advanced micromachining and microfabrication technologies. This paper highlights recent advances, directions and key strategies in controlled-potential electrochemical detectors for miniaturized analytical systems. Subjects covered include the design and integration of the electrochemical detection system, its requirements and operational principles, common electrode materials, derivatization reactions, electrical-field decouplers, typical applications and future prospects. It is expected that electrochemical detection will become a powerful tool for microscale analytical systems and will facilitate the creation of truly portable (and possibly disposable) devices.

  16. Evolution of microbiological analytical methods for dairy industry needs

    PubMed Central

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show promising analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675

  17. Evolution of microbiological analytical methods for dairy industry needs.

    PubMed

    Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence

    2014-01-01

    Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives for integrating microbial physiology monitoring into the improvement of industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show promising analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards.

  18. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
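
    The WPS-style access pattern described above can be illustrated with a short, hypothetical client call. The endpoint URL, operation identifier, and datainputs syntax below are placeholders chosen for illustration and are not taken from the EDAS documentation.

    ```python
    # Hypothetical sketch of calling a WPS-style analytics endpoint such as the
    # one EDAS exposes. The host URL, identifier, and datainputs syntax are
    # placeholders/assumptions, not the documented EDAS API.
    import requests

    WPS_URL = "https://example.nasa.gov/wps"    # placeholder endpoint

    params = {
        "service": "WPS",
        "request": "Execute",
        "identifier": "CDSpark.average",        # hypothetical operation name
        # Hypothetical datainputs: average a variable over a lat/lon box.
        "datainputs": '[domain=[{"name":"d0","lat":{"start":-10,"end":10},'
                      '"lon":{"start":0,"end":90}}],'
                      'variable=[{"uri":"collection://merra2","name":"tas","domain":"d0"}],'
                      'operation=[{"name":"CDSpark.average","input":"tas","axes":"xy"}]]',
    }

    response = requests.get(WPS_URL, params=params, timeout=300)
    response.raise_for_status()
    print(response.text)   # WPS responses are typically XML status/result documents
    ```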

  19. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
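
    Of the modeling approaches listed, Markov state-transition cohort models are perhaps the easiest to illustrate compactly. The sketch below is a generic teaching example with invented transition probabilities and utility weights; it is not drawn from any of the reviewed MM studies.

    ```python
    # Minimal illustrative Markov cohort model: three health states with
    # hypothetical annual transition probabilities, accumulating life-years
    # and quality-adjusted life-years (QALYs) over a fixed horizon.
    import numpy as np

    states = ["progression-free", "progressed", "dead"]

    # Hypothetical annual transition probability matrix (rows sum to 1).
    P = np.array([
        [0.75, 0.20, 0.05],   # from progression-free
        [0.00, 0.70, 0.30],   # from progressed
        [0.00, 0.00, 1.00],   # death is absorbing
    ])

    utilities = np.array([0.80, 0.60, 0.0])   # hypothetical utility weights
    cohort = np.array([1.0, 0.0, 0.0])        # everyone starts progression-free

    life_years = 0.0
    qalys = 0.0
    for year in range(20):                    # 20-year time horizon
        cohort = cohort @ P                   # advance the cohort one cycle
        life_years += cohort[:2].sum()
        qalys += float(cohort @ utilities)

    print(f"undiscounted life-years: {life_years:.2f}, QALYs: {qalys:.2f}")
    print(dict(zip(states, np.round(cohort, 3))))   # final state distribution
    ```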

  20. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.
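
    As a concrete instance of one technique named above (support vector machines for classification), the following sketch trains a kernel SVM on a standard toy dataset; it is illustrative only and unrelated to any specific RDA use case.

    ```python
    # Illustrative SVM classification sketch (not from the RDA recommendations).
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Scaling matters for kernel SVMs; chain it with the classifier.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    model.fit(X_train, y_train)
    print(f"test accuracy: {model.score(X_test, y_test):.2f}")
    ```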

  1. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest groups seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques reach from hardware deployment models up to various different algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. Given lessons learned and experiences are based on a survey of use cases and also providing insights in a few use cases in detail.

  2. Reproducibility studies for experimental epitope detection in macrophages (EDIM).

    PubMed

    Japink, Dennis; Nap, Marius; Sosef, Meindert N; Nelemans, Patty J; Coy, Johannes F; Beets, Geerard; von Meyenfeldt, Maarten F; Leers, Math P G

    2014-05-01

    We have recently described epitope detection in macrophages (EDIM) by flow cytometry. This is a promising tool for the diagnosis and follow-up of malignancies. However, biological and technical validation is warranted before clinical applicability can be explored. The pre-analytic and analytic phases were investigated. Five different aspects were assessed: blood sample stability, intra-individual variability in healthy persons, intra-assay variation, inter-assay variation and assay transferability. The post-analytic phase was already partly standardized and described in an earlier study. The outcomes in the pre-analytic phase showed that samples are stable for 24 h after venipuncture. Biological variation over time was similar to that of serum tumor marker assays; each patient has a baseline value. Intra-assay variation showed good reproducibility, while inter-assay variation showed reproducibility similar to that of established serum tumor marker assays. Furthermore, the assay showed excellent transferability between analyzers. Under optimal analytic conditions the EDIM method is technically stable, reproducible and transferable. Biological variation over time needs further assessment in future work. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. The Barcode of Life Data Portal: Bridging the Biodiversity Informatics Divide for DNA Barcoding

    PubMed Central

    Sarkar, Indra Neil; Trizna, Michael

    2011-01-01

    With the volume of molecular sequence data that is systematically being generated globally, there is a need for centralized resources for data exploration and analytics. DNA Barcode initiatives are on track to generate a compendium of molecular sequence–based signatures for identifying animals and plants. To date, the data exploration and analytic tools available for these data have existed only in a boutique form—often representing a frustrating hurdle for many researchers who may not necessarily have the resources to install or implement algorithms described by the analytic community. The Barcode of Life Data Portal (BDP) is a first step towards integrating the latest biodiversity informatics innovations with molecular sequence data from DNA barcoding. Through the establishment of community-driven standards, based on discussion with the Data Analysis Working Group (DAWG) of the Consortium for the Barcode of Life (CBOL), the BDP provides an infrastructure for the incorporation of existing and next-generation DNA barcode analytic applications in an open forum. PMID:21818249

  4. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

    Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when analytical signals are highly overlapped. A method based on partial least squares (PLS) regression is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently combined in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. To determine the optimal number of factors necessary to build the calibration matrix, different parameters were evaluated. Adequate selection of the spectral regions proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among excipients, the spectral region between 250 and 290 nm was selected. Recoveries for the vasoconstrictors were 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
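
    To illustrate the kind of calibration described above, the sketch below fits a PLS model to simulated two-component spectra with overlapping bands. The band shapes, noise level, and number of latent factors are invented for illustration and do not reproduce the authors' data.

    ```python
    # Illustrative PLS calibration sketch on simulated overlapped spectra
    # (not the authors' calibration).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(1)
    wavelengths = np.linspace(250, 290, 200)         # selected spectral window (nm)

    # Two overlapping Gaussian absorption bands standing in for the two analytes.
    band1 = np.exp(-((wavelengths - 268) / 8.0) ** 2)
    band2 = np.exp(-((wavelengths - 273) / 9.0) ** 2)

    # Calibration set: random concentration pairs and their mixture spectra + noise.
    C = rng.uniform(0.1, 1.0, size=(25, 2))
    X = C[:, [0]] * band1 + C[:, [1]] * band2 + rng.normal(0, 0.01, (25, 200))

    pls = PLSRegression(n_components=3)              # number of latent factors
    C_pred = cross_val_predict(pls, X, C, cv=5)
    recovery = 100 * C_pred.mean(axis=0) / C.mean(axis=0)
    print("mean recovery per component (%):", np.round(recovery, 1))
    ```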

  5. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.

    PubMed

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using those common definitions established for the Uniform Data System (UDS) by the Health Resources and Service Administration. 1 In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Data awareness, that is, an appreciation of the importance of data integrity, data hygiene 2 and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable.

  6. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned

    PubMed Central

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    Background As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation’s largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. Methods To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using those common definitions established for the Uniform Data System (UDS) by the Health Resources and Service Administration.1 In addition, interviews with health center leadership and staff were completed to understand the context for the findings. Results The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Conclusion Data awareness, that is, an appreciation of the importance of data integrity, data hygiene2 and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable. PMID:28210424

  7. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
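
    The partition/train/refine/compare workflow described above can be sketched in a few lines. The example uses a generic public tabular dataset as a stand-in for livestock operation records, so the target variable and accuracy values are illustrative only.

    ```python
    # Illustrative sketch of partitioning data and comparing classifier accuracy.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)       # stand-in target variable

    # Partition: build/refine on training data, hold out naive data for testing.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42, stratify=y)

    classifiers = {
        "logistic regression": LogisticRegression(max_iter=5000),
        "random forest": RandomForestClassifier(n_estimators=300, random_state=42),
        "gradient boosting": GradientBoostingClassifier(random_state=42),
    }

    # Compare the accuracy of the created classifiers on the held-out partition.
    for name, clf in classifiers.items():
        clf.fit(X_train, y_train)
        print(f"{name}: test accuracy = {clf.score(X_test, y_test):.3f}")
    ```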

  8. Capricorn-A Web-Based Automatic Case Log and Volume Analytics for Diagnostic Radiology Residents.

    PubMed

    Chen, Po-Hao; Chen, Yin Jie; Cook, Tessa S

    2015-10-01

    On-service clinical learning is a mainstay of radiology education. However, an accurate and timely case log is difficult to keep, especially in the absence of software tools tailored to resident education. Furthermore, volume-related feedback from the residency program sometimes occurs months after a rotation ends, limiting the opportunity for meaningful intervention. We surveyed the residents of a single academic institution to evaluate the current state of and the existing need for tracking interpretation volume. Using the results of the survey, we created an open-source automated case log software. Finally, we evaluated the effect of the software tool on the residency in a 1-month, postimplementation survey. Before implementation of the system, 89% of respondents stated that volume is an important component of training, but 71% stated that volume data was inconvenient to obtain. Although the residency program provides semiannual reviews, 90% preferred reviewing interpretation volumes at least once monthly. After implementation, 95% of the respondents stated that the software is convenient to access, 75% found it useful, and 88% stated they would use the software at least once a month. The included analytics module, which benchmarks the user using historical aggregate average volumes, is the most often used feature of the software. Server log demonstrates that, on average, residents use the system approximately twice a week. An automated case log software system may fulfill a previously unmet need in diagnostic radiology training, making accurate and timely review of volume-related performance analytics a convenient process. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
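
    A minimal sketch of the kind of volume analytics such a case-log tool might provide is shown below; the table layout, column names, and benchmark definition are assumptions for illustration, not Capricorn's actual schema.

    ```python
    # Hypothetical sketch: monthly interpretation counts per resident,
    # benchmarked against the aggregate average for the same month.
    import pandas as pd

    logs = pd.DataFrame({
        "resident": ["A", "A", "B", "B", "B", "A"],
        "study_date": pd.to_datetime(
            ["2015-07-03", "2015-07-15", "2015-07-20", "2015-08-02",
             "2015-08-11", "2015-08-19"]),
        "modality": ["CT", "MR", "CT", "CT", "XR", "MR"],
    })

    monthly = (logs
               .assign(month=logs["study_date"].dt.to_period("M"))
               .groupby(["resident", "month"])
               .size()
               .rename("volume")
               .reset_index())

    # Benchmark each resident against the average volume for the same month.
    benchmark = monthly.groupby("month")["volume"].transform("mean")
    monthly["vs_average"] = monthly["volume"] - benchmark
    print(monthly)
    ```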

  9. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    PubMed

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges according to associated data values. We demonstrated the advantages of these new capabilities through three biological network visualization case studies: human disease association network, drug-target interaction network and protein-peptide mapping network. The architectural design of ProteoLens makes it suitable for bioinformatics expert data analysts who are experienced with relational database management to perform large-scale integrated network visual explorations. ProteoLens is a promising visual analytic platform that will facilitate knowledge discoveries in future network and systems biology studies.
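
    The two-phase design described above (define data association rules, then apply them to annotate the network) can be mimicked in a few lines with a generic graph library. The sketch below is not ProteoLens code; the gene names, attribute name, and GML output are illustrative assumptions.

    ```python
    # Illustrative sketch of the two-phase idea: (1) an association rule mapping
    # node IDs to data attributes, (2) applying it to annotate a network, with
    # GML output for interoperation with layout tools.
    import networkx as nx

    # Phase 1: association rule -- node ID -> attribute value (e.g., expression).
    expression = {"TP53": 2.4, "MDM2": -1.1, "CDKN1A": 0.7}

    # A small protein-interaction network.
    G = nx.Graph()
    G.add_edges_from([("TP53", "MDM2"), ("TP53", "CDKN1A")])

    # Phase 2: apply the rule to annotate nodes for downstream visual encoding.
    nx.set_node_attributes(G, expression, name="log2_fold_change")

    nx.write_gml(G, "annotated_network.gml")   # GML interoperates with layout tools
    print(G.nodes(data=True))
    ```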

  10. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition is a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  11. Variational Trajectory Optimization Tool Set: Technical description and user's manual

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.; Queen, Eric M.; Cavanaugh, Michael D.; Wetzel, Todd A.; Moerder, Daniel D.

    1993-01-01

    The algorithms that comprise the Variational Trajectory Optimization Tool Set (VTOTS) package are briefly described. The VTOTS is a software package for solving nonlinear constrained optimal control problems from a wide range of engineering and scientific disciplines. The VTOTS package was specifically designed to minimize the amount of user programming; in fact, for problems that may be expressed in terms of analytical functions, the user needs only to define the problem in terms of symbolic variables. This version of the VTOTS does not support tabular data; thus, problems must be expressed in terms of analytical functions. The VTOTS package consists of two methods for solving nonlinear optimal control problems: a time-domain finite-element algorithm and a multiple shooting algorithm. These two algorithms, under the VTOTS package, may be run independently or jointly. The finite-element algorithm generates approximate solutions, whereas the shooting algorithm provides a more accurate solution to the optimization problem. A user's manual, some examples with results, and a brief description of the individual subroutines are included.
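
    To make the shooting idea concrete, here is a toy single-shooting example, far simpler than VTOTS (which implements multiple shooting and a finite-element method for general constrained problems): minimize the integral of x^2 + u^2 over [0, 1] subject to dx/dt = u with x(0) = 1. Pontryagin's conditions reduce it to a two-point boundary value problem solved by iterating on the unknown initial costate. The problem and solver choices are illustrative assumptions, not VTOTS internals.

    ```python
    # Single-shooting sketch for a toy optimal control problem. Pontryagin's
    # conditions give u = -lambda/2, dlambda/dt = -2x, with the free-endpoint
    # condition lambda(1) = 0.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import brentq

    def dynamics(t, z):
        x, lam = z
        return [-lam / 2.0, -2.0 * x]      # dx/dt = u = -lam/2, dlam/dt = -2x

    def terminal_costate(lam0):
        """Integrate from a guessed lambda(0) and return lambda(1)."""
        sol = solve_ivp(dynamics, (0.0, 1.0), [1.0, lam0], rtol=1e-9)
        return sol.y[1, -1]

    # Shooting: find lambda(0) so the terminal condition lambda(1) = 0 is met.
    lam0 = brentq(terminal_costate, 0.0, 5.0)
    print(f"lambda(0) = {lam0:.4f}")        # analytic value is 2*tanh(1) ~ 1.5232
    ```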

  12. Can we use high precision metal isotope analysis to improve our understanding of cancer?

    PubMed

    Larner, Fiona

    2016-01-01

    High precision natural isotope analyses are widely used in geosciences to trace elemental transport pathways. The use of this analytical tool is increasing in nutritional and disease-related research. In recent months, a number of groups have shown the potential this technique has in providing new observations for various cancers when applied to trace metal metabolism. The deconvolution of isotopic signatures, however, relies on mathematical models and geochemical data, which are not representative of the system under investigation. In addition to relevant biochemical studies of protein-metal isotopic interactions, technological development both in terms of sample throughput and detection sensitivity of these elements is now needed to translate this novel approach into a mainstream analytical tool. Following this, essential background healthy population studies must be performed, alongside observational, cross-sectional disease-based studies. Only then can the sensitivity and specificity of isotopic analyses be tested alongside currently employed methods, and important questions such as the influence of cancer heterogeneity and disease stage on isotopic signatures be addressed.

  13. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
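
    For context on the "statistical disproportionality data mining signal scores" mentioned above, the sketch below computes one widely used score of this class, the proportional reporting ratio (PRR), from an invented 2x2 table of drug-event co-occurrence counts; it illustrates the general metric family, not the tool's specific scoring method.

    ```python
    # Illustrative disproportionality calculation (PRR) on hypothetical counts.
    import math

    # 2x2 contingency table of citations (invented counts):
    a = 40     # citations mentioning the drug AND the adverse event
    b = 960    # citations mentioning the drug, without the event
    c = 200    # citations mentioning the event, without the drug
    d = 98800  # citations mentioning neither

    prr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lower95 = math.exp(math.log(prr) - 1.96 * se_log)

    print(f"PRR = {prr:.1f}, lower 95% CI = {lower95:.1f}")
    # A common screening heuristic flags a signal when PRR >= 2 with a >= 3 cases.
    ```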

  14. On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.

    2003-01-01

    A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.

  15. DIVE: A Graph-based Visual Analytics Framework for Big Data

    PubMed Central

    Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie

    2014-01-01

    The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197

  16. Nano-biosensors to detect beta-amyloid for Alzheimer's disease management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Tiwari, Sneham; Vashist, Arti; Nair, Madhavan

    2016-06-15

    Beta-amyloid (β-A) peptides are potential biomarkers for monitoring Alzheimer's disease (AD) for diagnostic purposes. Increased β-A levels are neurotoxic and induce oxidative stress in the brain, resulting in neurodegeneration and dementia. As of now, no sensitive and inexpensive method is available for β-A detection under physiological and pathological conditions. Available methods such as neuroimaging, enzyme-linked immunosorbent assay (ELISA), and polymerase chain reaction (PCR) detect β-A, but they have not yet been extended to point-of-care (POC) settings due to sophisticated equipment, the need for high expertise, complicated operation, and the challenge of achieving low detection limits. Recently, a β-A antibody-based electrochemical immuno-sensing approach has been explored to detect β-A at pM levels within 30-40 min, compared with the 6-8 h required for an ELISA test. The introduction of nano-enabled electrochemical sensing technology could enable rapid detection of β-A at the POC and may facilitate fast, personalized health care delivery. This review explores recent advancements in nano-enabled electrochemical β-A sensing technologies towards POC application in AD management. These tools can provide the bio-information needed to optimize therapeutics for the diagnosis and management of neurodegenerative diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  18. Explorative visual analytics on interval-based genomic data and their metadata.

    PubMed

    Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano

    2017-12-04

    With the rapid spread of public repositories of NGS processed data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format, containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE allows interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians in making sense of heterogeneous genomic datasets. By means of an explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward in the analysis steps and comparative visualizations of heatmaps. GeMSE's effective application and practical usefulness are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/ , and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.
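
    The input formats named above (BED regions with numerical features plus tab-delimited attribute-value metadata) are easy to sketch. The example below is independent of GeMSE: it writes tiny placeholder files so the snippet is self-contained, then loads and summarizes them.

    ```python
    # Illustrative sketch (independent of GeMSE) of loading BED-style region
    # data and attribute-value metadata. Example files are written first so the
    # snippet is self-contained.
    import pandas as pd

    with open("sample_peaks.bed", "w") as f:
        f.write("chr1\t100\t500\tpeak1\t7.2\n"
                "chr1\t900\t1300\tpeak2\t3.1\n"
                "chr2\t200\t650\tpeak3\t5.8\n")

    with open("sample_metadata.tsv", "w") as f:
        f.write("assay\tChIP-seq\ncell_line\tK562\n")

    # Standard BED columns: chrom, start, end, name, score (numeric feature).
    regions = pd.read_csv("sample_peaks.bed", sep="\t", header=None,
                          names=["chrom", "start", "end", "name", "score"])
    metadata = pd.read_csv("sample_metadata.tsv", sep="\t", header=None,
                           names=["attribute", "value"])

    # A first exploratory step: summarize the numeric feature per chromosome.
    print(metadata.set_index("attribute")["value"].to_dict())
    print(regions.groupby("chrom")["score"].agg(["count", "mean", "max"]))
    ```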

  19. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  20. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  1. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  2. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
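
    The mixed-integer programming approach mentioned above can be illustrated at toy scale. The sketch below is not CPAT: it is a generic single-period portfolio selection (knapsack-style) model with invented costs, scores, and a budget cap, using the PuLP modeling library.

    ```python
    # Minimal sketch of a portfolio-selection mixed-integer program: pick
    # programs to maximize a capability score under a budget cap (invented data).
    from pulp import LpBinary, LpMaximize, LpProblem, LpVariable, lpSum, value

    programs = {                       # name: (cost in $B, capability score)
        "vehicle_A_upgrade": (4.0, 7.0),
        "vehicle_B_upgrade": (6.0, 9.0),
        "new_platform":      (12.0, 14.0),
        "network_retrofit":  (3.0, 4.0),
    }
    budget = 15.0                      # $B available over the planning horizon

    model = LpProblem("portfolio", LpMaximize)
    pick = {p: LpVariable(p, cat=LpBinary) for p in programs}

    model += lpSum(programs[p][1] * pick[p] for p in programs)            # objective
    model += lpSum(programs[p][0] * pick[p] for p in programs) <= budget  # budget cap

    model.solve()
    chosen = [p for p in programs if value(pick[p]) > 0.5]
    print("selected programs:", chosen)
    ```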

  3. Heterogeneous postsurgical data analytics for predictive modeling of mortality risks in intensive care units.

    PubMed

    Yun Chen; Hui Yang

    2014-01-01

    The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from rich datasets is limited. There is a dire need to go beyond current medical practices and develop data-driven methods and tools that will enable and help (i) the handling of big data, (ii) the extraction of data-driven knowledge, and (iii) the exploitation of acquired knowledge for optimizing clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in the ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed a postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results based on the evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potential for the use of data-driven analytics to improve the quality of healthcare services.
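
    The chain of steps listed above (pre-processing, feature selection, predictive modeling) maps naturally onto a modeling pipeline. The sketch below uses synthetic, imbalanced data as a stand-in for ICU recordings and is illustrative only, not the authors' system.

    ```python
    # Illustrative sketch of an imputation -> scaling -> feature selection ->
    # prediction pipeline on synthetic data with missing values.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.impute import SimpleImputer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=400, n_features=60, n_informative=10,
                               weights=[0.85, 0.15], random_state=0)
    X[np.random.default_rng(0).random(X.shape) < 0.05] = np.nan   # missing values

    pipeline = Pipeline([
        ("impute", SimpleImputer(strategy="median")),   # handle missing recordings
        ("scale", StandardScaler()),
        ("select", SelectKBest(f_classif, k=15)),       # feature selection
        ("model", LogisticRegression(max_iter=2000)),   # mortality risk model
    ])

    auc = cross_val_score(pipeline, X, y, cv=5, scoring="roc_auc")
    print(f"cross-validated AUC: {auc.mean():.2f}")
    ```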

  4. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE PAGES

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...

    2016-02-01

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.

  5. Visualization and Analytics Tools for Infectious Disease Epidemiology: A Systematic Review

    PubMed Central

    Carroll, Lauren N.; Au, Alan P.; Detwiler, Landon Todd; Fu, Tsung-chieh; Painter, Ian S.; Abernethy, Neil F.

    2014-01-01

    Background A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) Identify public health user needs and preferences for infectious disease information visualization tools; (2) Identify existing infectious disease information visualization tools and characterize their architecture and features; (3) Identify commonalities among approaches applied to different data types; and (4) Describe tool usability evaluation efforts and barriers to the adoption of such tools. Methods We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. Results A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. Discussion and Conclusion As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. PMID:24747356

  6. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    PubMed

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  7. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

    Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using either direct web service calls, a python script, a unix-like shell client, or a javascript-based web application. Client packages in python, scala, or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
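
    Since the abstract notes that the CDAS WPS API can be called directly as a web service, a minimal sketch of such a call may help orient readers. The endpoint URL, operation identifier, and input parameters below are hypothetical placeholders following the general OGC WPS Execute request pattern, not the documented CDAS interface.

    ```python
    # Hedged sketch of calling a WPS-style analytics endpoint, in the spirit of the
    # CDAS API described above. The URL, operation name, and inputs are placeholders,
    # not the actual CDAS interface.
    import requests

    WPS_ENDPOINT = "https://example.nasa.gov/cdas/wps"  # hypothetical endpoint

    params = {
        "service": "WPS",
        "request": "Execute",
        "identifier": "average",                                          # hypothetical server-side operation
        "datainputs": "variable=tas;collection=reanalysis_demo;axes=xy",  # placeholder inputs
    }

    response = requests.get(WPS_ENDPOINT, params=params, timeout=60)
    response.raise_for_status()
    print(response.text)  # WPS services typically return an XML status/result document
    ```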

  8. Mining Mathematics in Textbook Lessons

    ERIC Educational Resources Information Center

    Ronda, Erlina; Adler, Jill

    2017-01-01

    In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…

  9. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  10. Detection of UV-treatment effects on plankton by rapid analytic tools for ballast water compliance monitoring immediately following treatment

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Gianoli, Claudio; He, Jianjun; Lo Curto, Alberto; Stehouwer, Peter; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Non-indigenous species seriously threaten native biodiversity. To reduce establishments, the International Maritime Organization established the Convention for the Control and Management of Ships' Ballast Water and Sediments which limits organism concentrations at discharge under regulation D-2. Most ships will comply by using on-board treatment systems to disinfect their ballast water. Port state control officers will need simple, rapid methods to detect compliance. Appropriate monitoring methods may be dependent on treatment type, since different treatments will affect organisms by a variety of mechanisms. Many indicative tools have been developed, but must be examined to ensure the measured variable is an appropriate signal for the response of the organisms to the applied treatment. We assessed the abilities of multiple analytic tools to rapidly detect the effects of a ballast water treatment system based on UV disinfection. All devices detected a large decrease in the concentrations of vital organisms ≥ 50 μm and organisms < 10 μm (mean 82.7-99.7% decrease across devices), but results were more variable for the ≥ 10 to < 50 μm size class (mean 9.0-99.9% decrease across devices). Results confirm the necessity to choose tools capable of detecting the damage inflicted on living organisms, as examined herein for UV-C treatment systems.

  11. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  12. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating on, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given this global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator “OECD eXplorer”, a customized tool for interactively analyzing and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  13. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  14. Physics-based and human-derived information fusion for analysts

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael

    2017-05-01

    Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities, there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.

  15. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
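
    The record above describes LipidQC as a visual comparison of experimental SRM 1950 results against NIST consensus mean estimates. As a hedged illustration of that kind of check (lipid names, concentrations, and uncertainties below are invented placeholders, not NIST consensus values), the core comparison could look like this:

    ```python
    # Hedged sketch of a LipidQC-style check: flag experimental SRM 1950 values that
    # fall outside a consensus mean +/- expanded uncertainty window.
    # All lipid names and numbers are placeholders, not NIST consensus data.

    consensus = {
        # lipid: (consensus mean, expanded uncertainty), illustrative units of nmol/mL
        "PC 16:0_18:1": (220.0, 30.0),
        "TG 16:0_18:1_18:2": (95.0, 20.0),
        "CE 18:2": (1500.0, 250.0),
    }

    experimental = {
        "PC 16:0_18:1": 205.0,
        "TG 16:0_18:1_18:2": 130.0,
        "CE 18:2": 1480.0,
    }

    for lipid, measured in experimental.items():
        mean, unc = consensus[lipid]
        status = "OK" if abs(measured - mean) <= unc else "OUT OF RANGE"
        print(f"{lipid:20s} measured={measured:8.1f}  consensus={mean:8.1f} +/- {unc:5.1f}  {status}")
    ```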

  16. Numerical Study of Tip Vortex Flows

    NASA Technical Reports Server (NTRS)

    Dacles-Mariani, Jennifer; Hafez, Mohamed

    1998-01-01

    This paper presents an overview and summary of the wide range of research related to tip vortex flows and wake/trailing vortices as applied to practical engineering problems. As a literature survey paper, it outlines relevant analytical, theoretical, experimental and computational studies found in the literature. It also discusses in brief some of the fundamental aspects of the physics and its complexities. An appendix is also included. The topics included in this paper are: 1) Analytical Vortices; 2) Experimental Studies; 3) Computational Studies; 4) Wake Vortex Control and Management; 5) Wake Modeling; 6) High-Lift Systems; 7) Issues in Numerical Studies; 8) Instabilities; 9) Related Topics; 10) Visualization Tools for Vortical Flows; 11) Further Work Needed; 12) Acknowledgements; 13) References; and 14) Appendix.

  17. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  18. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  19. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  20. Interaction Junk: User Interaction-Based Evaluation of Visual Analytic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; North, Chris

    2012-10-14

    With the growing need for visualization to aid users in understanding large, complex datasets, the ability for users to interact with and explore these datasets is critical. As visual analytic systems have advanced to leverage powerful computational models and data analytics capabilities, the modes by which users engage and interact with the information are limited. Often, users are taxed with directly manipulating parameters of these models through traditional GUIs (e.g., using sliders to directly manipulate the value of a parameter). However, the purpose of user interaction in visual analytic systems is to enable visual data exploration – where users can focus on their task, as opposed to the tool or system. As a result, users can engage freely in data exploration and decision-making, for the purpose of gaining insight. In this position paper, we discuss how evaluating visual analytic systems can be approached through user interaction analysis, where the goal is to minimize the cognitive translation between the visual metaphor and the mode of interaction (i.e., reducing the “interaction junk”). We motivate this concept through a discussion of traditional GUIs used in visual analytics for direct manipulation of model parameters, and the importance of designing interactions that support visual data exploration.

  1. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to describe data volumes. This growth has created a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, and so on. The challenge is to develop new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. It also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  2. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily using Fortran 90 code because of its number-crunching power and the capability to directly access real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) Database developed by NIST. A Microsoft Excel front end was implemented to make PFSAT conveniently portable among a wide variety of potential users and to provide a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
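
    PFSAT itself is Fortran 90 with an Excel/VBA front end, so the sketch below is not that code; it only illustrates the kind of steady-state heat-leak estimate such a tool automates, using a textbook radial-conduction model for an insulated line. All property values are invented placeholders.

    ```python
    import math

    # Hedged illustration of a steady-state heat-leak estimate for an insulated
    # cryogenic feed line: Q = 2*pi*k_eff*L*(T_hot - T_cold) / ln(r_out / r_in).
    # Textbook formula, not the PFSAT implementation; all values are placeholders.

    def heat_leak_watts(k_eff, length, r_inner, r_outer, t_hot, t_cold):
        """Conductive heat leak (W) through a cylindrical insulation layer."""
        return 2.0 * math.pi * k_eff * length * (t_hot - t_cold) / math.log(r_outer / r_inner)

    q = heat_leak_watts(
        k_eff=1.0e-4,   # W/(m*K), illustrative effective conductivity of MLI
        length=5.0,     # m of feed line
        r_inner=0.025,  # m, line outer radius
        r_outer=0.040,  # m, insulation outer radius
        t_hot=300.0,    # K, ambient environment
        t_cold=90.0,    # K, cryogenic propellant
    )
    print(f"Estimated heat leak: {q:.2f} W")
    ```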

  3. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
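
    The PAS tool described above is Excel-based; the fragment below is only a hedged sketch of the underlying comparison it performs, flagging satellite practices whose serum potassium results deviate from the on-site hospital laboratory reference. Practice names, values, and the simple z-score rule are invented for illustration and are not the authors' statistics.

    ```python
    # Hedged sketch of a PAS-style pre-analytical check: compare serum potassium from
    # each satellite practice against the hospital laboratory reference, and flag
    # practices with markedly deviating means (delayed centrifugation raises potassium).
    # Practice names, values, and the |z| > 2 rule are invented for illustration.
    from statistics import mean, stdev

    hospital_reference = [4.0, 4.2, 3.9, 4.1, 4.3, 4.0, 4.2, 4.1]  # mmol/L

    satellite_practices = {
        "Practice A": [4.1, 4.0, 4.3, 4.2],
        "Practice B": [5.2, 5.6, 5.1, 5.4],  # elevated -> possible pre-analytical problem
    }

    ref_mean, ref_sd = mean(hospital_reference), stdev(hospital_reference)

    for practice, values in satellite_practices.items():
        z = (mean(values) - ref_mean) / ref_sd
        flag = "investigate pre-analytical phase" if abs(z) > 2 else "acceptable"
        print(f"{practice}: mean K+ = {mean(values):.2f} mmol/L (z = {z:+.1f}) -> {flag}")
    ```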

  4. Shared decision-making – transferring research into practice: the Analytic Hierarchy Process (AHP)

    PubMed Central

    Dolan, James G.

    2008-01-01

    Objective: To illustrate how the Analytic Hierarchy Process (AHP) can be used to promote shared decision-making and enhance clinician-patient communication. Methods: Tutorial review. Results: The AHP promotes shared decision making by creating a framework that is used to define the decision, summarize the information available, prioritize information needs, elicit preferences and values, and foster meaningful communication among decision stakeholders. Conclusions: The AHP and related multi-criteria methods have the potential for improving the quality of clinical decisions and overcoming current barriers to implementing shared decision making in busy clinical settings. Further research is needed to determine the best way to implement these tools and to determine their effectiveness. Practice Implications: Many clinical decisions involve preference-based trade-offs between competing risks and benefits. The AHP is a well-developed method that provides a practical approach for improving patient-provider communication, clinical decision-making, and the quality of patient care in these situations. PMID:18760559
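
    Because the record describes the AHP only conceptually, a brief worked sketch of its core computation may help: a reciprocal pairwise-comparison matrix is reduced to priority weights via its principal eigenvector, with a consistency ratio to check the judgments. The criteria and 1-9 scale judgments below are invented for illustration and are not taken from the cited tutorial.

    ```python
    import numpy as np

    # Hedged sketch of the core AHP calculation: priority weights from a reciprocal
    # pairwise-comparison matrix plus a consistency check. Criteria and judgments
    # are invented for illustration.
    criteria = ["benefit", "risk", "treatment burden"]
    A = np.array([
        [1.0,   3.0,   5.0],   # benefit vs risk, benefit vs burden
        [1/3.0, 1.0,   2.0],
        [1/5.0, 1/2.0, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))            # index of principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                    # normalized priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's tabulated random index
    cr = ci / ri                                # consistency ratio; < 0.10 is conventionally acceptable

    for c, w in zip(criteria, weights):
        print(f"{c:18s} weight = {w:.3f}")
    print(f"consistency ratio = {cr:.3f}")
    ```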

  5. Augmenting Austrian flood management practices through geospatial predictive analytics: a study in Carinthia

    NASA Astrophysics Data System (ADS)

    Ward, S. M.; Paulus, G.

    2013-06-01

    The Danube River basin has long been the location of significant flooding problems across central Europe. The last decade has seen a sharp increase in the frequency, duration and intensity of these flood events, unveiling a dire need for enhanced flood management policy and tools in the region. Located in the southern portion of Austria, the state of Carinthia has experienced a significant volume of intense flood impacts over the last decade. Although the Austrian government has acknowledged these issues, their remedial actions have been primarily structural to date. Continued focus on controlling the natural environment through infrastructure while disregarding the need to consider alternative forms of assessing flood exposure will only act as a provisional solution to this inescapable risk. In an attempt to remedy this flaw, this paper highlights the application of geospatial predictive analytics and a spatial recovery index as a proxy for community resilience, as well as the cultural challenges associated with the application of foreign models within an Austrian environment.

  6. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  7. Inorganic Arsenic Determination in Food: A Review of Analytical Proposals and Quality Assessment Over the Last Six Years.

    PubMed

    Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín

    2017-01-01

    Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed among others is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for the assessment of quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.

  8. Technical aspects and inter-laboratory variability in native peptide profiling: the CE-MS experience.

    PubMed

    Mischak, Harald; Vlahou, Antonia; Ioannidis, John P A

    2013-04-01

    Mass spectrometry platforms have attracted a lot of interest in the last 2 decades as profiling tools for native peptides and proteins with clinical potential. However, limitations associated with reproducibility and analytical robustness, especially pronounced with the initial SELDI systems, hindered the application of such platforms in biomarker qualification and clinical implementation. The scope of this article is to give a short overview on data available on performance and on analytical robustness of the different platforms for peptide profiling. Using the CE-MS platform as a paradigm, data on analytical performance are described including reproducibility (short-term and intermediate repeatability), stability, interference, quantification capabilities (limits of detection), and inter-laboratory variability. We discuss these issues by using as an example our experience with the development of a 273-peptide marker for chronic kidney disease. Finally, we discuss pros and cons and means for improvement and emphasize the need to test in terms of comparative clinical performance and impact, different platforms that pass reasonably well analytical validation tests. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
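
    The record mentions quantification capabilities (limits of detection) among the analytical performance characteristics. For orientation only, the widely used ICH-style definitions based on calibration data are shown below; the cited work may estimate these figures differently.

    ```latex
    % Conventional ICH-style detection and quantification limits from a calibration
    % curve; shown for reference, not taken from the cited CE-MS study.
    \mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S},
    ```

    where σ is the standard deviation of the response (e.g., of the blank or of the regression residuals) and S is the slope of the calibration curve.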

  9. Translating statistical species-habitat models to interactive decision support tools

    USGS Publications Warehouse

    Wszola, Lyndsie S.; Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  10. Translating statistical species-habitat models to interactive decision support tools.

    PubMed

    Wszola, Lyndsie S; Simonsen, Victoria L; Stuber, Erica F; Gillespie, Caitlyn R; Messinger, Lindsey N; Decker, Karie L; Lusk, Jeffrey J; Jorgensen, Christopher F; Bishop, Andrew A; Fontaine, Joseph J

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  11. Translating statistical species-habitat models to interactive decision support tools

    PubMed Central

    Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences. PMID:29236707
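
    The three records above describe the same tool, built in R with a Shiny interface. The fragment below is only a hedged, schematic illustration of the idea behind such a decision-support tool: a fitted landcover model drives predictions for user-adjusted scenarios. The coefficients, covariates, and log-linear form are invented for illustration and are not the published pheasant model.

    ```python
    import math

    # Hedged sketch of the idea behind an interactive species-habitat tool: a fitted
    # log-linear abundance model (invented coefficients, not the published model)
    # predicts relative abundance for user-adjusted landcover proportions.
    COEF = {"intercept": 0.2, "grassland": 1.8, "cropland": 0.6, "developed": -2.5}

    def relative_abundance(grassland, cropland, developed):
        """Predicted relative pheasant abundance for landcover proportions in [0, 1]."""
        eta = (COEF["intercept"]
               + COEF["grassland"] * grassland
               + COEF["cropland"] * cropland
               + COEF["developed"] * developed)
        return math.exp(eta)

    baseline = relative_abundance(grassland=0.20, cropland=0.70, developed=0.10)
    scenario = relative_abundance(grassland=0.35, cropland=0.55, developed=0.10)  # convert some cropland
    print(f"baseline: {baseline:.2f}  scenario: {scenario:.2f}  "
          f"change: {100 * (scenario / baseline - 1):+.0f}%")
    ```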

  12. NeedATool: A Needlet Analysis Tool for Cosmological Data Processing

    NASA Astrophysics Data System (ADS)

    Pietrobon, Davide; Balbi, Amedeo; Cabella, Paolo; Gorski, Krzysztof M.

    2010-11-01

    We introduce NeedATool (Needlet Analysis Tool), a software for data analysis based on needlets, a wavelet rendition which is powerful for the analysis of fields defined on a sphere. Needlets have been applied successfully to the treatment of astrophysical and cosmological observations, and in particular to the analysis of cosmic microwave background (CMB) data. Usually, such analyses are performed in real space as well as in its dual domain, the harmonic one. Both spaces have advantages and disadvantages: for example, in pixel space it is easier to deal with partial sky coverage and experimental noise; in the harmonic domain, beam treatment and comparison with theoretical predictions are more effective. During the last decade, however, wavelets have emerged as a useful tool for CMB data analysis, since they allow us to combine most of the advantages of the two spaces, one of the main reasons being their sharp localization. In this paper, we outline the analytical properties of needlets and discuss the main features of the numerical code, which should be a valuable addition to the CMB analyst's toolbox.
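
    For readers unfamiliar with the needlet construction the record refers to, the standard spherical needlet and its coefficients are reproduced below from the general needlet literature; this is background notation, not an excerpt from the NeedATool paper.

    ```latex
    % Standard spherical needlet definition (general needlet literature), for orientation.
    \psi_{jk}(\hat{x}) = \sqrt{\lambda_{jk}} \sum_{\ell} b\!\left(\frac{\ell}{B^{j}}\right)
        \sum_{m=-\ell}^{\ell} Y_{\ell m}(\hat{x})\, \overline{Y}_{\ell m}(\xi_{jk}),
    \qquad
    \beta_{jk} = \int_{S^{2}} f(\hat{x})\, \psi_{jk}(\hat{x})\, \mathrm{d}\Omega(\hat{x}),
    ```

    where B > 1 sets the bandwidth of scale j, b is a smooth window function, and ξ_{jk} and λ_{jk} are cubature points and weights on the sphere; the sharp localization mentioned in the abstract follows from the compact support of b in multipole space.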

  13. Understanding the behavior of Giardia and Cryptosporidium in an urban watershed: Explanation and application of techniques to collect and evaluate monitoring data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crockett, C.S.; Haas, C.N.

    1996-11-01

    Due to current proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data and develop tools for evaluating such data. The following case study describes the process in which a utility learned how to collect and interpret monitoring data for their specific needs and provides concepts and tools which other utilities can use to aid in their own macro and microwatershed management efforts.

  14. On Governance, Embedding and Marketing: Reflections on the Construction of Alternative Sustainable Food Networks.

    PubMed

    Roep, Dirk; Wiskerke, Johannes S C

    Based on the reconstruction of the development of 14 food supply chain initiatives in 7 European countries, we developed a conceptual framework that demonstrates that the process of increasing the sustainability of food supply chains is rooted in strategic choices regarding governance, embedding, and marketing and in the coordination of these three dimensions that are inextricably interrelated. The framework also shows that when seeking to further develop an initiative (e.g., through scaling up or product diversification) these interrelations need continuous rebalancing. We argue that the framework can serve different purposes: it can be used as an analytical tool by researchers studying food supply chain dynamics, as a policy tool by policymakers that want to support the development of sustainable food supply chains, and as a reflexive tool by practitioners and their advisors to help them to position themselves, develop a clear strategy, find the right allies, develop their skills, and build the capacities that they need. In this paper, we elaborate upon the latter function of the framework and illustrate this briefly with empirical evidence from three of the initiatives that we studied.

  15. Just love in live organ donation.

    PubMed

    Zeiler, Kristin

    2009-08-01

    Emotionally-related live organ donation is different from almost all other medical treatments in that a family member or, in some countries, a friend contributes with an organ or parts of an organ to the recipient. Furthermore, there is a long-acknowledged but not well-understood gender-imbalance in emotionally-related live kidney donation. This article argues for the benefit of the concept of just love as an analytic tool in the analysis of emotionally-related live organ donation where the potential donor(s) and the recipient are engaged in a love relation. The concept of just love is helpful in the analysis of these live organ donations even if no statistical gender-imbalance prevails. It is particularly helpful, however, in the analysis of the gender-imbalance in live kidney donations if these donations are seen as a specific kind of care-work, if care-work is experienced as a labour one should perform out of love and if women still experience stronger pressures to engage in care-work than do men. The aim of the article is to present arguments for the need of just love as an analytic tool in the analysis of emotionally-related live organ donation where the potential donor(s) and the recipient are engaged in a love relation. The aim is also to elaborate two criteria that need to be met in order for love to qualify as just and to highlight certain clinical implications.

  16. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge
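
    As a hedged aside on the "basic analytical tools" mentioned in the report, statistical process control typically centres on control charts. The sketch below shows a textbook individuals-chart limit calculation with invented measurements; it is a generic illustration, not material from the cited document.

    ```python
    # Hedged sketch of a basic SPC tool: individuals (X) chart limits from the average
    # moving range. Textbook 3-sigma limits (3 / d2 with d2 = 1.128 gives ~2.66);
    # the measurements are invented for illustration.
    data = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.2, 10.1]

    center = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)

    ucl = center + 2.66 * mr_bar   # upper control limit
    lcl = center - 2.66 * mr_bar   # lower control limit

    print(f"center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
    out_of_control = [x for x in data if x > ucl or x < lcl]
    print("out-of-control points:", out_of_control or "none")
    ```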

  17. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools are needed specifically for elastic-plastic fracture analysis, a regime that is currently handled empirically for the Space Shuttle External Tank (ET) through simulated service testing of pre-cracked panels.

  18. e-Healthcare in India: critical success factors for sustainable health systems.

    PubMed

    Taneja, Udita; Sushil

    2007-01-01

    As healthcare enterprises seek to move towards an integrated, sustainable healthcare delivery model an IT-enabled or e-Healthcare strategy is being increasingly adopted. In this study we identified the critical success factors influencing the effectiveness of an e-Healthcare strategy in India. The performance assessment criteria used to measure effectiveness were increasing reach and reducing cost of healthcare delivery. A survey of healthcare providers was conducted. Analytic Hierarchy Process (AHP) and Interpretive Structural Modeling (ISM) were the analytical tools used to determine the relative importance of the critical success factors in influencing effectiveness of e-Healthcare and their interplay with each other. To succeed in e-Healthcare initiatives the critical success factors that need to be in place are appropriate government policies, literacy levels, and telecommunications and power infrastructure in the country. The focus should not be on the IT tools and biomedical engineering technologies as is most often the case. Instead the nontechnology factors such as healthcare provider and consumer mindsets should be addressed to increase acceptance of, and enhance the effectiveness of, sustainable e-Healthcare services.

  19. MRI of human hair.

    PubMed

    Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael

    2009-06-01

    Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
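
    The T2* value quoted above comes from fitting signal decay across a series of echo times. For reference, the underlying mono-exponential model is the standard MR relation shown below; it is included for orientation and is not quoted from the cited paper.

    ```latex
    % Mono-exponential transverse decay used when estimating T2* from multi-echo data;
    % standard MR relation, shown for reference only.
    S(\mathrm{TE}) = S_{0}\, e^{-\mathrm{TE}/T_{2}^{*}}
    \quad\Longrightarrow\quad
    \ln S(\mathrm{TE}) = \ln S_{0} - \frac{\mathrm{TE}}{T_{2}^{*}},
    ```

    so a linear fit of ln S against TE over the scan series yields 1/T2* as the magnitude of the slope (about 2.6 ms for the cortex in this study).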

  20. Cyberhubs: Virtual Research Environments for Astronomy

    NASA Astrophysics Data System (ADS)

    Herwig, Falk; Andrassy, Robert; Annau, Nic; Clarkson, Ondrea; Côté, Benoit; D’Sa, Aaron; Jones, Sam; Moa, Belaid; O’Connell, Jericho; Porter, David; Ritter, Christian; Woodward, Paul

    2018-05-01

    Collaborations in astronomy and astrophysics are faced with numerous cyber-infrastructure challenges, such as large data sets, the need to combine heterogeneous data sets, and the challenge to effectively collaborate on those large, heterogeneous data sets with significant processing requirements and complex science software tools. The cyberhubs system is an easy-to-deploy package for small- to medium-sized collaborations based on the Jupyter and Docker technology, which allows web-browser-enabled, remote, interactive analytic access to shared data. It offers an initial step to address these challenges. The features and deployment steps of the system are described, as well as the requirements collection through an account of the different approaches to data structuring, handling, and available analytic tools for the NuGrid and PPMstar collaborations. NuGrid is an international collaboration that creates stellar evolution and explosion physics and nucleosynthesis simulation data. The PPMstar collaboration performs large-scale 3D stellar hydrodynamics simulations of interior convection in the late phases of stellar evolution. Examples of science that is currently performed on cyberhubs, in the areas of 3D stellar hydrodynamic simulations, stellar evolution and nucleosynthesis, and Galactic chemical evolution, are presented.

  1. Supporting Communication and Coordination in Collaborative Sensemaking.

    PubMed

    Mahyar, Narges; Tory, Melanie

    2014-12-01

    When people work together to analyze a data set, they need to organize their findings, hypotheses, and evidence, share that information with their collaborators, and coordinate activities amongst team members. Sharing externalizations (recorded information such as notes) could increase awareness and assist with team communication and coordination. However, we currently know little about how to provide tool support for this sort of sharing. We explore how linked common work (LCW) can be employed within a `collaborative thinking space', to facilitate synchronous collaborative sensemaking activities in Visual Analytics (VA). Collaborative thinking spaces provide an environment for analysts to record, organize, share and connect externalizations. Our tool, CLIP, extends earlier thinking spaces by integrating LCW features that reveal relationships between collaborators' findings. We conducted a user study comparing CLIP to a baseline version without LCW. Results demonstrated that LCW significantly improved analytic outcomes at a collaborative intelligence task. Groups using CLIP were also able to more effectively coordinate their work, and held more discussion of their findings and hypotheses. LCW enabled them to maintain awareness of each other's activities and findings and link those findings to their own work, preventing disruptive oral awareness notifications.

  2. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  3. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data, and complements data mining technologies where known patterns can be mined for. Also, with a human in the loop, analysts can bring in domain knowledge and subject matter expertise. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data: structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  4. Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.

    DOT National Transportation Integrated Search

    2015-01-01

    The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...

  5. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation

  6. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  7. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  8. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    PubMed

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  9. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options☆

    PubMed Central

    Reed, M.S.; Podesta, G.; Fazey, I.; Geeson, N.; Hessel, R.; Hubacek, K.; Letson, D.; Nainggolan, D.; Prell, C.; Rickenbach, M.G.; Ritsema, C.; Schwilch, G.; Stringer, L.C.; Thomas, A.D.

    2013-01-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change. PMID:25844020

  10. Helicase-dependent isothermal amplification: a novel tool in the development of molecular-based analytical systems for rapid pathogen detection.

    PubMed

    Barreda-García, Susana; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2018-01-01

    Highly sensitive testing of nucleic acids is essential to improve the detection of pathogens, which pose a major threat to public health worldwide. Currently available molecular assays, mainly based on PCR, have a limited utility in point-of-need control or resource-limited settings. Consequently, there is a strong interest in developing cost-effective, robust, and portable platforms for early detection of these harmful microorganisms. Since its description in 2004, isothermal helicase-dependent amplification (HDA) has been successfully applied in the development of novel molecular-based technologies for rapid, sensitive, and selective detection of viruses and bacteria. In this review, we highlight relevant analytical systems using this simple nucleic acid amplification methodology that takes place at a constant temperature and that is readily compatible with microfluidic technologies. Different strategies for monitoring HDA amplification products are described. In addition, we present technological advances for integrating sample preparation, HDA amplification, and detection. Future perspectives and challenges toward point-of-need use, not only for clinical diagnosis but also in food safety testing and environmental monitoring, are also discussed. Graphical Abstract: Expanding the analytical toolbox for the detection of DNA sequences specific to pathogens with isothermal helicase-dependent amplification (HDA).

  11. Bio-analytical applications of mid-infrared spectroscopy using silver halide fiber-optic probes1

    NASA Astrophysics Data System (ADS)

    Heise, H. M.; Küpper, L.; Butvina, L. N.

    2002-10-01

    Infrared spectroscopy has proved to be a powerful method for the study of various biomedical samples, in particular for in-vitro analysis in the clinical laboratory and for non-invasive diagnostics. In general, the analysis of biofluids such as whole blood, urine, microdialysates and bioreactor broth media takes advantage of the fact that a multitude of analytes can be quantified simultaneously and rapidly without the need for reagents. Progress in the quality of infrared silver halide fibers enabled us to construct several flexible fiber-optic probes of different geometries, which are particularly suitable for the measurement of small biosamples. Recent trends show that dry-film measurements by mid-infrared spectroscopy could revolutionize analytical tools in the clinical chemistry laboratory, and an example is given. Infrared diagnostic tools show promising potential for patients, and minimally invasive blood glucose assays and skin tissue pathology, in particular, cannot be left out when considering mid-infrared fiber-based probes. Other applications include the measurement of skin samples, including penetration studies of vitamins and constituents of cosmetic cream formulations. A further field is the micro-domain analysis of biopsy samples from bog-mummified corpses, and recent results on the chemistry of dermis and hair samples are reported. Another field of application, for which results are reported, is food analysis and bioreactor monitoring.

  12. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  13. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  14. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

    Due to the increased threat of chemical and biological agents being used by terrorist organizations to cause injury, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in prevention, early detection and an efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs of first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using United States Environmental Protection Agency (US-EPA) methodologies.

  15. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  16. Big Data Management in US Hospitals: Benefits and Barriers.

    PubMed

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered as an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there have been challenges, including lack of experience and cost of developing the analytics. Many hospitals will need to invest in the acquiring of adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  17. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  18. Computer-aided diagnostic strategy selection.

    PubMed

    Greenes, R A

    1986-03-01

    Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.
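
    As an illustration of the kind of formal decision-analytic comparison the abstract refers to, the sketch below ranks two hypothetical diagnostic strategies by expected utility, combining test performance (sensitivity, specificity), disease prevalence, and a test-harm penalty. All names and numbers are invented for illustration and are not from the article.

```python
# Minimal sketch (not from the article): expected-utility comparison of two
# hypothetical diagnostic strategies, using the standard decision-analytic
# quantities the abstract refers to (test performance, risk, cost).

def expected_utility(prevalence, sensitivity, specificity,
                     u_tp, u_fn, u_fp, u_tn, test_harm=0.0):
    """Expected utility of 'test and treat if positive' for one test."""
    p_tp = prevalence * sensitivity
    p_fn = prevalence * (1 - sensitivity)
    p_fp = (1 - prevalence) * (1 - specificity)
    p_tn = (1 - prevalence) * specificity
    return (p_tp * u_tp + p_fn * u_fn + p_fp * u_fp + p_tn * u_tn) - test_harm

# Hypothetical numbers for illustration only.
tests = {
    "non-invasive imaging": dict(sensitivity=0.85, specificity=0.90, test_harm=0.00),
    "invasive procedure":   dict(sensitivity=0.97, specificity=0.98, test_harm=0.02),
}
for name, t in tests.items():
    eu = expected_utility(prevalence=0.20, u_tp=1.0, u_fn=0.0, u_fp=0.7, u_tn=1.0, **t)
    print(f"{name}: expected utility = {eu:.3f}")
```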

  19. DNA Electrochemistry and Electrochemical Sensors for Nucleic Acids.

    PubMed

    Ferapontova, Elena E

    2018-06-12

    Sensitive, specific, and fast analysis of nucleic acids (NAs) is strongly needed in medicine, environmental science, biodefence, and agriculture for the study of bacterial contamination of food and beverages and genetically modified organisms. Electrochemistry offers accurate, simple, inexpensive, and robust tools for the development of such analytical platforms that can successfully compete with other approaches for NA detection. Here, electrode reactions of DNA, basic principles of electrochemical NA analysis, and their relevance for practical applications are reviewed and critically discussed.

  20. Modal method for Second Harmonic Generation in nanostructures

    NASA Astrophysics Data System (ADS)

    Héron, S.; Pardo, F.; Bouchon, P.; Pelouard, J.-L.; Haïdar, R.

    2015-05-01

    Nanophotonic devices show interesting features for nonlinear response enhancement, but numerical tools are mandatory to fully determine their behaviour. To address this need, we present a numerical modal method dedicated to nonlinear optics calculations under the undepleted pump approximation. It is briefly explained in the framework of Second Harmonic Generation for both plane waves and focused beams. The nonlinear behaviour of selected nanostructures is then investigated, comparing with existing analytical results and studying the convergence of the code.
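
    For orientation, the standard plane-wave result under the undepleted pump approximation is recalled below; this textbook scaling is added here only for context and is not the modal formalism developed in the paper.

```latex
% Standard plane-wave, undepleted-pump SHG scaling (textbook result, not the
% paper's modal formalism): the second-harmonic intensity grows quadratically
% with pump intensity and is governed by the phase mismatch \Delta k.
\[
  I_{2\omega}(L) \;\propto\; d_{\mathrm{eff}}^{2}\, I_{\omega}^{2}\, L^{2}\,
  \mathrm{sinc}^{2}\!\left(\frac{\Delta k\, L}{2}\right),
  \qquad \Delta k = k_{2\omega} - 2k_{\omega}.
\]
```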

  1. MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*

    PubMed Central

    CHAHINE, Georges L.; HSIAO, Chao-Tsung

    2012-01-01

    Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, and to avoid deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to study the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696
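
    As an example of the kind of "simpler model" the abstract alludes to, the sketch below integrates the classical Rayleigh-Plesset equation for a single spherical bubble; this is an illustrative assumption, not the authors' multi-physics code, and all parameter values are hypothetical.

```python
# Illustrative sketch only: the Rayleigh-Plesset equation is a common "simpler
# model" of spherical bubble dynamics; it is not the multi-physics code the
# abstract describes. All parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

rho, mu, sigma = 1000.0, 1.0e-3, 0.072      # water: density, viscosity, surface tension
p_inf, p_v, kappa = 101325.0, 2300.0, 1.4   # ambient pressure, vapour pressure, polytropic exponent
R0 = 2e-6                                    # initial bubble radius (m)
p_g0 = p_inf - p_v + 2 * sigma / R0          # gas pressure that balances the bubble at R0

def rayleigh_plesset(t, y):
    R, Rdot = y
    p_gas = p_g0 * (R0 / R) ** (3 * kappa)
    p_wall = p_gas + p_v - 2 * sigma / R - 4 * mu * Rdot / R
    Rddot = (p_wall - p_inf) / (rho * R) - 1.5 * Rdot ** 2 / R
    return [Rdot, Rddot]

# Start the bubble slightly expanded and let it oscillate for a few microseconds.
sol = solve_ivp(rayleigh_plesset, (0, 5e-6), [1.5 * R0, 0.0], max_step=1e-9)
print(f"max radius: {sol.y[0].max():.2e} m, min radius: {sol.y[0].min():.2e} m")
```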

  2. Modeling energy/economy interactions for conservation and renewable energy-policy analysis

    NASA Astrophysics Data System (ADS)

    Groncki, P. J.

    Energy policy and the implications for policy analysis and the methodological tools are discussed. The evolution of one methodological approach and the combined modeling system of the component models, their evolution in response to changing analytic needs, and the development of the integrated framework are reported. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.

  3. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum and the organic matter from which it is derived are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main means of acquiring these geochemical data is analytical techniques. Owing to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods, as applicable to petroleum exploration and particularly to the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim (E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok 10330, Thailand); Pengvanich, Phongphaeth (E-mail: ppengvan@gmail.com)

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  5. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  6. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
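
    The sketch below illustrates the general idea of path-based effectiveness scoring (per-element detection probability and delay combined with a response time into a probability of interruption, in the spirit of EASI-style calculations mentioned above); it is not the authors' tool, and the facility layout and numbers are hypothetical.

```python
# Illustrative sketch (not the authors' tool): score adversary paths through a
# facility by probability of interruption, combining per-element detection
# probability and delay time against a guard response time (EASI-style logic).
# All layout data below are hypothetical.

# Each path is an ordered list of (element, P_detection, delay_seconds).
paths = {
    "fence -> door -> vault":  [("fence", 0.5, 10), ("door", 0.8, 60), ("vault", 0.9, 300)],
    "gate -> window -> vault": [("gate", 0.3, 5), ("window", 0.6, 30), ("vault", 0.9, 300)],
}
RESPONSE_TIME = 240  # seconds for the response force to arrive (hypothetical)

def probability_of_interruption(path, response_time):
    """P(adversary is first detected at an element with enough delay left to respond)."""
    p_not_detected_yet = 1.0
    p_interrupt = 0.0
    for i, (_, p_det, _) in enumerate(path):
        remaining_delay = sum(delay for _, _, delay in path[i:])
        if remaining_delay >= response_time:           # detection here is timely
            p_interrupt += p_not_detected_yet * p_det
        p_not_detected_yet *= (1 - p_det)
    return p_interrupt

for name, path in paths.items():
    print(f"{name}: P(interruption) = {probability_of_interruption(path, RESPONSE_TIME):.2f}")
critical = min(paths, key=lambda k: probability_of_interruption(paths[k], RESPONSE_TIME))
print("most critical (weakest) path:", critical)
```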

  7. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  8. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery

  9. Using Learning Analytics to Support Engagement in Collaborative Writing

    ERIC Educational Resources Information Center

    Liu, Ming; Pardo, Abelardo; Liu, Li

    2017-01-01

    Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…

  10. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  11. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  12. Workshop on Two-Phase Fluid Behavior in a Space Environment

    NASA Technical Reports Server (NTRS)

    Swanson, Theodore D. (Editor); Juhasz, AL (Editor); Long, W. Russ (Editor); Ottenstein, Laura (Editor)

    1989-01-01

    The Workshop was successful in achieving its main objective of identifying a large number of technical issues relating to the design of two-phase systems for space applications. The principal concern expressed was the need for verified analytical tools that will allow an engineer to confidently design a system to a known degree of accuracy. New and improved materials, for such applications as thermal storage and as heat transfer fluids, were also identified as major needs. In addition to these research efforts, a number of specific hardware needs were identified which will require development. These include heat pumps, low weight radiators, advanced heat pipes, stability enhancement devices, high heat flux evaporators, and liquid/vapor separators. Also identified was the need for a centralized source of reliable, up-to-date information on two-phase flow in a space environment.

  13. The early days of optical coatings

    NASA Astrophysics Data System (ADS)

    Macleod, Angus

    1999-12-01

    The history of optical coatings is a long one with many observations and publications but little industrial activity until a period of explosive growth began in the 1940s. The trigger was World War II and the need on all sides for improved optical instruments ranging from binoculars to periscopes and bombsights, but it is certain that the growth would have occurred, perhaps a year or two later, even without the war. After the war the growth continued, partly because of military needs, but much more because of other factors such as the continuing growth in general optics, the enormous expansion of the chemical industry and its need for infrared and visible analytical tools, the need for narrowband contrast enhancing filters in astronomy, and then, very significantly, the laser. Nowadays, optical coatings are indispensable features of virtually all optical systems.

  14. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy dispersive spectroscopy), EPMA (electron probe microanalysis), AES (Auger electron spectroscopy), etc., its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool in the future, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).

  15. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  16. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  17. High precision analytical description of the allowed β spectrum shape

    NASA Astrophysics Data System (ADS)

    Hayen, Leendert; Severijns, Nathal; Bodek, Kazimierz; Rozpedzik, Dagmara; Mougeot, Xavier

    2018-01-01

    A fully analytical description of the allowed β spectrum shape is given in view of ongoing and planned measurements. Its study forms an invaluable tool in the search for physics beyond the standard electroweak model and the weak magnetism recoil term. Contributions stemming from finite size corrections, mass effects, and radiative corrections are reviewed. Particular focus is placed on atomic and chemical effects, where the existing description is extended and analytically provided. The effects of QCD-induced recoil terms are discussed, and cross-checks were performed for different theoretical formalisms. Special attention was given to a comparison of the treatment of nuclear structure effects in different formalisms. Corrections were derived for both Fermi and Gamow-Teller transitions, and methods of analytical evaluation thoroughly discussed. In its integrated form, calculated f values were in agreement with the most precise numerical results within the aimed-for precision. The need for an accurate evaluation of weak magnetism contributions was stressed, and the possible significance of the oft-neglected induced pseudoscalar interaction was noted. Together with improved atomic corrections, an analytical description was presented of the allowed β spectrum shape accurate to a few parts in 10⁻⁴ down to 1 keV for low to medium Z nuclei, thereby extending the work by previous authors by nearly an order of magnitude.
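
    For context only, the sketch below evaluates the leading-order allowed spectrum shape (phase space times a simple non-relativistic Fermi function); the finite-size, radiative, atomic, and weak-magnetism corrections that are the subject of the paper are deliberately omitted, and the endpoint energy and daughter Z are hypothetical.

```python
# Minimal sketch only: leading-order allowed beta spectrum, i.e. phase space
# times a simple non-relativistic Coulomb (Fermi) correction. The corrections
# discussed in the paper are deliberately omitted.
import numpy as np

ME = 511.0  # electron rest mass in keV

def allowed_spectrum(T_keV, Q_keV, Z_daughter, alpha=1/137.036):
    """Unnormalised dN/dT for an allowed beta-minus transition."""
    T = np.asarray(T_keV, dtype=float)
    W = T / ME + 1.0                      # total energy in units of m_e c^2
    p = np.sqrt(np.maximum(W**2 - 1.0, 0.0))
    beta = np.divide(p, W, out=np.zeros_like(W), where=W > 0)
    eta = np.divide(alpha * Z_daughter, beta, out=np.zeros_like(beta), where=beta > 0)
    fermi = 2 * np.pi * eta / (1.0 - np.exp(-2 * np.pi * eta))   # simple Fermi function
    phase_space = p * W * (Q_keV - T) ** 2
    return np.where((T > 0) & (T < Q_keV), phase_space * fermi, 0.0)

T = np.linspace(1.0, 1160.0, 200)         # hypothetical Q ~ 1161 keV transition
spectrum = allowed_spectrum(T, Q_keV=1161.0, Z_daughter=21)
print("spectrum peaks near T =", T[np.argmax(spectrum)], "keV")
```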

  18. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  19. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
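
    Since the abstract notes that sensitivity, specificity, odds ratios, confidence intervals and a confusion matrix are not part of IBMWA's predictive output, the sketch below shows how those quantities can be computed from a 2x2 confusion matrix with generic tooling; the counts are invented for illustration.

```python
# Sketch: the metrics the abstract notes are missing from IBMWA's output
# (sensitivity, specificity, odds ratio with a confidence interval) computed
# from a 2x2 confusion matrix. The counts below are made up for illustration.
import math

tp, fn, fp, tn = 80, 20, 30, 170   # hypothetical classifier results

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
odds_ratio = (tp * tn) / (fp * fn)

# 95% confidence interval for the odds ratio (Woolf's log method).
se_log_or = math.sqrt(1/tp + 1/fn + 1/fp + 1/tn)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
print(f"odds ratio = {odds_ratio:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```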

  20. mcaGUI: microbial community analysis R-Graphical User Interface (GUI).

    PubMed

    Copeland, Wade K; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M E; Zhou, Xia; Williams, Christopher J; Forney, Larry J; Abdo, Zaid

    2012-08-15

    Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R-programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance tables and perform analyses specific to their needs. This GUI provides a flexible modular platform, expandable to include other statistical tools for microbial community analysis in the future. The mcaGUI package and source are freely available as part of Bioconductor at http://www.bioconductor.org/packages/release/bioc/html/mcaGUI.html

  1. A comparative assessment of tools for ecosystem services quantification and valuation

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert

    2013-01-01

    To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each of the tools' intended uses, services modeled, analytical approaches, data requirements, and outputs, as well as the time required to run seven of the tools in a first comparative, concurrent application of multiple tools to a common location, the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools' current 'readiness' for widespread application within both public- and private-sector decision-making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which are essential to facilitate their more widespread use in environmental decision making.
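
    A minimal sketch of the kind of criteria-based rating described above: each tool receives a score per evaluative criterion and a weighted total used for ranking. The tool names, criteria weights, and scores below are hypothetical and do not reproduce the study's ratings.

```python
# Sketch only: scoring candidate tools against evaluative criteria and ranking
# them, in the spirit of the comparative assessment described above. All names,
# weights and scores are invented for illustration.
criteria_weights = {"quantifiable": 2, "replicable": 2, "credible": 2,
                    "flexible": 1, "affordable": 1, "time_to_run": 2}

scores = {   # 0 (poor) to 3 (good) per criterion, invented for illustration
    "Tool A": {"quantifiable": 3, "replicable": 2, "credible": 3,
               "flexible": 1, "affordable": 1, "time_to_run": 1},
    "Tool B": {"quantifiable": 2, "replicable": 3, "credible": 2,
               "flexible": 3, "affordable": 2, "time_to_run": 3},
}

def readiness(tool_scores):
    return sum(criteria_weights[c] * s for c, s in tool_scores.items())

for tool, s in sorted(scores.items(), key=lambda kv: -readiness(kv[1])):
    print(f"{tool}: weighted readiness score = {readiness(s)}")
```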

  2. Progress in the development and integration of fluid flow control tools in paper microfluidics.

    PubMed

    Fu, Elain; Downs, Corey

    2017-02-14

    Paper microfluidics is a rapidly growing subfield of microfluidics in which paper-like porous materials are used to create analytical devices. There is a need for higher performance field-use tests for many application domains including human disease diagnosis, environmental monitoring, and veterinary medicine. A key factor in creating high performance paper-based devices is the ability to manipulate fluid flow within the devices. This critical review is focused on the progress that has been made in (i) the development of fluid flow control tools and (ii) the integration of those tools into paper microfluidic devices. Further, we strive to be comprehensive in our presentation and provide historical context through discussion and performance comparisons, when possible, of both relevant earlier work and recent work. Finally, we discuss the major areas of focus for fluid flow methods development to advance the potential of paper microfluidics for high-performance field applications.
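
    As a first-order illustration of the fluid transport being controlled in such devices (not a result from the review), the Lucas-Washburn relation below estimates how far fluid wicks along an idealized paper strip over time; the material properties are rough, hypothetical values.

```python
# Illustration only (not from the review): the Lucas-Washburn relation is a
# common first-order model for capillary wicking in a paper strip,
# L(t) = sqrt(gamma * d * cos(theta) * t / (4 * mu)).
# Property values below are rough, hypothetical numbers.
import math

gamma = 0.072             # surface tension of water, N/m
mu = 1.0e-3               # viscosity, Pa*s
d = 10e-6                 # effective pore diameter, m
theta = math.radians(30)  # contact angle

def wicking_length(t_seconds):
    return math.sqrt(gamma * d * math.cos(theta) * t_seconds / (4 * mu))

for t in (10, 60, 300):
    print(f"t = {t:4d} s -> wicked length ~ {wicking_length(t) * 1000:.1f} mm")
```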

  3. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, TED analytical implementation is described and discussed, and their analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point of care framework.

  4. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV/Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could be used as a Process Analytical Tool (PAT) in biorefineries employing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
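
    A minimal sketch of the calibration logic behind this kind of UV-Vis quantification, assuming linear (Beer-Lambert) behaviour at a single wavelength such as 280 nm; the standard concentrations and absorbance readings below are invented, not the paper's data.

```python
# Sketch of the calibration idea behind the approach above: fit a linear
# absorbance-vs-concentration relation at a single wavelength, then predict an
# unknown sample. All values are invented for illustration.
import numpy as np

conc_g_per_L = np.array([0.05, 0.10, 0.20, 0.40, 0.80])    # lignin standards (hypothetical)
absorbance_280 = np.array([0.11, 0.21, 0.43, 0.84, 1.69])  # measured A at 280 nm (hypothetical)

slope, intercept = np.polyfit(conc_g_per_L, absorbance_280, 1)
r2 = np.corrcoef(conc_g_per_L, absorbance_280)[0, 1] ** 2

unknown_A = 0.55
estimated_conc = (unknown_A - intercept) / slope
print(f"calibration: A = {slope:.2f} * c + {intercept:.3f} (R^2 = {r2:.4f})")
print(f"sample with A(280 nm) = {unknown_A} -> ~{estimated_conc:.2f} g/L lignin")
```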

  6. Portable Diagnostics Technology Assessment for Space Missions. Part 2; Market Survey

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.; Chait, Arnon

    2010-01-01

    A mission to Mars of several years duration requires more demanding standards for all onboard instruments than a 6-month mission to the Moon or the International Space Station. In Part 1, we evaluated generic technologies and suitability to NASA needs. This prior work considered crew safety, device maturity and flightworthiness, resource consumption, and medical value. In Part 2, we continue the study by assessing the current marketplace for reliable Point-of-Care diagnostics. The ultimate goal of this project is to provide a set of objective analytical tools to suggest efficient strategies for reaching specific medical targets for any given space mission as program needs, technological development, and scientific understanding evolve.

  7. The National Institutes of Health's Big Data to Knowledge (BD2K) initiative: capitalizing on biomedical big data.

    PubMed

    Margolis, Ronald; Derr, Leslie; Dunn, Michelle; Huerta, Michael; Larkin, Jennie; Sheehan, Jerry; Guyer, Mark; Green, Eric D

    2014-01-01

    Biomedical research has and will continue to generate large amounts of data (termed 'big data') in many formats and at all levels. Consequently, there is an increasing need to better understand and mine the data to further knowledge and foster new discovery. The National Institutes of Health (NIH) has initiated a Big Data to Knowledge (BD2K) initiative to maximize the use of biomedical big data. BD2K seeks to better define how to extract value from the data, both for the individual investigator and the overall research community, create the analytic tools needed to enhance utility of the data, provide the next generation of trained personnel, and develop data science concepts and tools that can be made available to all stakeholders. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  8. Continuous whole-system monitoring toward rapid understanding of production HPC applications and systems

    DOE PAGES

    Agelastos, Anthony; Allan, Benjamin; Brandt, Jim; ...

    2016-05-18

    A detailed understanding of HPC applications’ resource needs and their complex interactions with each other and with HPC platform resources is critical to achieving scalability and performance. Such understanding has been difficult to achieve because typical application profiling tools do not capture the behaviors of codes under the potentially wide spectrum of actual production conditions, and because typical monitoring tools do not capture system resource usage information with high enough fidelity to gain sufficient insight into application performance and demands. In this paper we present both system and application profiling results based on data obtained through synchronized system-wide monitoring on a production HPC cluster at Sandia National Laboratories (SNL). We demonstrate analytic and visualization techniques that we are using to characterize application and system resource usage under production conditions for a better understanding of application resource needs. Furthermore, our goals are to improve application performance (through understanding application-to-resource mapping and system throughput) and to ensure that future system capabilities match their intended workloads.

  9. Continuous whole-system monitoring toward rapid understanding of production HPC applications and systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agelastos, Anthony; Allan, Benjamin; Brandt, Jim

    A detailed understanding of HPC applications’ resource needs and their complex interactions with each other and with HPC platform resources is critical to achieving scalability and performance. Such understanding has been difficult to achieve because typical application profiling tools do not capture the behaviors of codes under the potentially wide spectrum of actual production conditions, and because typical monitoring tools do not capture system resource usage information with high enough fidelity to gain sufficient insight into application performance and demands. In this paper we present both system and application profiling results based on data obtained through synchronized system-wide monitoring on a production HPC cluster at Sandia National Laboratories (SNL). We demonstrate analytic and visualization techniques that we are using to characterize application and system resource usage under production conditions for a better understanding of application resource needs. Furthermore, our goals are to improve application performance (through understanding application-to-resource mapping and system throughput) and to ensure that future system capabilities match their intended workloads.

  10. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of requirements for data analytics tools and techniques that would support specific ESDA type goals. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  11. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that explain their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  12. Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.

    PubMed

    Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M

    2012-04-01

    We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data. © 2012 Society for Applied Microbiology and Blackwell Publishing Ltd.
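
    Below is a minimal NumPy sketch (not Look@NanoSIMS itself) of two of the operations described above: accumulating scanned planes per ion species and computing an isotope ratio within a region of interest (ROI). The count data and the ROI mask are synthetic.

```python
# Minimal sketch (not Look@NanoSIMS itself) of two operations the abstract
# describes: accumulating scanned planes per ion species and computing an
# isotope ratio over a region of interest (ROI). Arrays are synthetic.
import numpy as np

rng = np.random.default_rng(0)
planes_13C = rng.poisson(5, size=(20, 64, 64))    # 20 scanned planes, 64x64 pixels
planes_12C = rng.poisson(400, size=(20, 64, 64))

# Accumulate planes (drift correction, if needed, would shift planes before summing).
acc_13C = planes_13C.sum(axis=0)
acc_12C = planes_12C.sum(axis=0)

# ROI defined as a boolean mask, e.g. a hand-drawn cell outline in the real tool.
roi = np.zeros((64, 64), dtype=bool)
roi[20:40, 25:45] = True

ratio = acc_13C[roi].sum() / acc_12C[roi].sum()
print(f"ROI 13C/12C ratio = {ratio:.4f}  (natural abundance is ~0.011)")
```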

  13. TRAC, a collaborative computer tool for tracer-test interpretation

    NASA Astrophysics Data System (ADS)

    Gutierrez, A.; Klinka, T.; Thiéry, D.; Buscarlet, E.; Binet, S.; Jozja, N.; Défarge, C.; Leclerc, B.; Fécamp, C.; Ahumada, Y.; Elsass, J.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results through hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
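
    To illustrate the kind of analytical solution such a tool evaluates, the sketch below implements the textbook one-dimensional advection-dispersion breakthrough curve for continuous injection (the Ogata-Banks solution); this is an assumed example, not necessarily one of TRAC's built-in formulas, and the parameters are hypothetical.

```python
# Illustration of the kind of analytical solution a tracer-test tool evaluates:
# 1D advection-dispersion breakthrough for continuous injection (Ogata-Banks).
# This is a textbook solution, not necessarily one of TRAC's built-in formulas;
# parameter values are hypothetical.
import math

def breakthrough(C0, x, t, v, D):
    """Concentration at distance x and time t for continuous injection at x=0."""
    term1 = math.erfc((x - v * t) / (2 * math.sqrt(D * t)))
    term2 = math.exp(v * x / D) * math.erfc((x + v * t) / (2 * math.sqrt(D * t)))
    return 0.5 * C0 * (term1 + term2)

C0, x = 100.0, 50.0          # injected concentration (mg/L), observation distance (m)
v, D = 2.0, 5.0              # groundwater velocity (m/day), dispersion coeff. (m^2/day)
for t in (5, 15, 25, 50):    # days
    print(f"t = {t:3d} d -> C = {breakthrough(C0, x, t, v, D):6.2f} mg/L")
```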

  14. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  15. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  16. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  17. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  18. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  19. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  20. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  1. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Beyond their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs are lacking. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploring disease associations of ARS/AIMPs, identifying disease-associated ARS/AIMP interactors and reconstructing ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  2. Falls screening and assessment tools used in acute mental health settings: a review of policies in England and Wales

    PubMed Central

    Narayanan, V.; Dickinson, A.; Victor, C.; Griffiths, C.; Humphrey, D.

    2016-01-01

    Objectives There is an urgent need to improve the care of older people at risk of falls or who experience falls in mental health settings. The aims of this study were to evaluate the individual falls risk assessment tools adopted by National Health Service (NHS) mental health trusts in England and healthcare boards in Wales, to evaluate the comprehensiveness of these tools and to review their predictive validity. Methods All NHS mental health trusts in England (n = 56) and healthcare boards in Wales (n = 6) were invited to supply their falls policies and other relevant documentation (e.g. local falls audits). In order to check the comprehensiveness of tools listed in policy documents, the risk variables of the tools adopted by the mental health trusts’ policies were compared with the 2004 National Institute for Health and Care Excellence (NICE) falls prevention guidelines. A comprehensive analytical literature review was undertaken to evaluate the predictive validity of the tools used in these settings. Results Falls policies were obtained from 46 mental health trusts. Thirty-five policies met the study inclusion criteria and were included in the analysis. The main falls assessment tools used were the St. Thomas’ Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY), Falls Risk Assessment Scale for the Elderly, Morse Falls Scale (MFS) and Falls Risk Assessment Tool (FRAT). On detailed examination, a number of different versions of the FRAT were evident; validated tools had inconsistent predictive validity and none of them had been validated in mental health settings. Conclusions Falls risk assessment is the most commonly used component of risk prevention strategies, but most policies included unvalidated tools, and even well-validated tools such as the STRATIFY and the MFS are reported to have inconsistent predictive accuracy. This raises questions about operational usefulness, as none of these tools have been tested in acute mental health settings. The falls risk assessment tools from only four mental health trusts met all the recommendations of the NICE falls guidelines on multifactorial assessment for prevention of falls. The recent NICE (2013) guidance states that tools predicting risk using numeric scales should no longer be used; however, multifactorial risk assessment and interventions tailored to patient needs are recommended. Trusts will need to update their policies in response to this guidance. PMID:26395210

  3. Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.

    PubMed

    Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin

    2016-02-15

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone; tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and is attracting increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target, suspect and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Systems Analysis - a new paradigm and decision support tools for the water framework directive

    NASA Astrophysics Data System (ADS)

    Bruen, M.

    2008-05-01

    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.
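    As a small illustration of one of the multi-criteria methods named above, the sketch below derives criterion weights from an AHP pairwise comparison matrix and checks its consistency. The 3x3 matrix and its criteria are hypothetical; this is the generic AHP calculation, not a specific WFD decision-support implementation.

```python
# Analytic Hierarchy Process sketch: weights from a pairwise comparison matrix
# via the row geometric-mean approximation, plus Saaty's consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # hypothetical criteria: cost vs. ecology vs. acceptance
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

w = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
w /= w.sum()                                   # normalised priority weights

lam_max = float(np.mean(A @ w / w))            # estimate of the principal eigenvalue
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58                                 # random index for n = 3 is 0.58

print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```

A consistency ratio below about 0.1 indicates the pairwise judgements are acceptably consistent.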

  5. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  6. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance

    PubMed Central

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405

  7. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance.

    PubMed

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y; Fairchild, Geoffrey; Hyman, James M; Kiang, Richard; Morse, Andrew P; Pancerella, Carmen M; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  8. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE PAGES

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; ...

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  9. Validation of Rapid Radiochemical Method for Californium ...

    EPA Pesticide Factsheets

    Technical Brief: In the event of radiological/nuclear contamination, the response community would need tools and methodologies to rapidly assess the nature and the extent of the contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.

  10. A high-throughput label-free nanoparticle analyser.

    PubMed

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.
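    A generic way to characterise a power-law size distribution like the one reported for the native plasma background is the maximum-likelihood estimate of the exponent for a continuous power law above a chosen minimum size (the Clauset-style estimator). The sketch below uses synthetic diameters as stand-ins for measured particle sizes; it is not the authors' analysis pipeline.

```python
# MLE of the power-law exponent for continuous data x >= x_min; sizes are synthetic.
import numpy as np

def powerlaw_exponent(sizes, x_min):
    """alpha for p(x) ~ x**(-alpha), estimated as 1 + n / sum(ln(x/x_min))."""
    x = np.asarray(sizes, dtype=float)
    x = x[x >= x_min]
    return 1.0 + x.size / np.sum(np.log(x / x_min))

# Synthetic sample drawn from a power law with alpha = 2.5 via inverse transform.
rng = np.random.default_rng(1)
u = rng.random(50_000)
diameters = 30.0 * (1.0 - u) ** (-1.0 / (2.5 - 1.0))   # x_min = 30 nm

print(f"estimated exponent: {powerlaw_exponent(diameters, x_min=30.0):.3f}")
```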

  11. Versatile electrophoresis-based self-test platform.

    PubMed

    Guijt, Rosanne M

    2015-03-01

    Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721.]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus of lithium monitoring, new applications are included, such as the use of the platform for veterinary purposes and for sodium and creatinine measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  14. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  15. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Quanlin; Oldenburg, Curtis M.; Spangler, Lee H.

    Analytical solutions with infinite exponential series are available to calculate the rate of diffusive transfer between low-permeability blocks and high-permeability zones in the subsurface. Truncation of these series is often employed by neglecting the early-time regime. Here we present unified-form approximate solutions in which the early-time and the late-time solutions are continuous at a switchover time. The early-time solutions are based on three-term polynomial functions in terms of the square root of dimensionless time, with the first coefficient dependent only on the dimensionless area-to-volume ratio. The last two coefficients are either determined analytically for isotropic blocks (e.g., spheres and slabs) or obtained by fitting the exact solutions, and they solely depend on the aspect ratios for rectangular columns and parallelepipeds. For the late-time solutions, only the leading exponential term is needed for isotropic blocks, while a few additional exponential terms are needed for highly anisotropic rectangular blocks. The optimal switchover time is between 0.157 and 0.229, with the highest relative approximation error less than 0.2%. The solutions are used to demonstrate the storage of dissolved CO2 in fractured reservoirs with low-permeability matrix blocks of single and multiple shapes and sizes. These approximate solutions are building blocks for development of analytical and numerical tools for hydraulic, solute, and thermal diffusion processes in low-permeability matrix blocks.
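    The early-/late-time matching idea can be illustrated with the classical Crank expressions for fractional diffusive uptake of a sphere (dimensionless time tau = D*t/a^2). The paper's unified solutions use fitted polynomial coefficients and its own dimensionless-time normalisation, so the switchover value below (0.08) is illustrative only and is not the paper's fitted value.

```python
# Two-regime approximation vs. truncated exact series for a sphere (Crank).
import numpy as np

def uptake_sphere(tau, switchover=0.08):
    tau = np.asarray(tau, dtype=float)
    early = 6.0 * np.sqrt(tau / np.pi) - 3.0 * tau            # short-time expansion
    late = 1.0 - (6.0 / np.pi**2) * np.exp(-np.pi**2 * tau)   # leading exponential term
    return np.where(tau < switchover, early, late)

def uptake_sphere_exact(tau, terms=50):
    tau = np.asarray(tau, dtype=float)[..., None]
    n = np.arange(1, terms + 1)
    return 1.0 - (6.0 / np.pi**2) * np.sum(np.exp(-(n * np.pi)**2 * tau) / n**2, axis=-1)

tau = np.linspace(0.01, 1.0, 100)
err = np.abs(uptake_sphere(tau) - uptake_sphere_exact(tau))
print(f"max abs. error of the two-regime approximation: {err.max():.4f}")
```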

  17. FEMA's Earthquake Incident Journal: A Web-Based Data Integration and Decision Support Tool for Emergency Management

    NASA Astrophysics Data System (ADS)

    Jones, M.; Pitts, R.

    2017-12-01

    For emergency managers, government officials, and others who must respond to rapidly changing natural disasters, timely access to detailed information related to affected terrain, population and infrastructure is critical for planning, response and recovery operations. Accessing, analyzing and disseminating such disparate information in near real-time are critical decision support components. However, finding a way to handle a variety of informative yet complex datasets poses a challenge when preparing for and responding to disasters. Here, we discuss the implementation of a web-based data integration and decision support tool for earthquakes developed by the Federal Emergency Management Agency (FEMA) as a solution to some of these challenges. While earthquakes are among the most well-monitored and measured of natural hazards, the spatially broad impacts of shaking, ground deformation, landslides, liquefaction, and even tsunamis are extremely difficult to quantify without accelerated access to data, modeling, and analytics. This web-based application, dubbed the "Earthquake Incident Journal", provides real-time access to authoritative and event-specific data from external (e.g. US Geological Survey, NASA, state and local governments, etc.) and internal (FEMA) data sources. The journal includes a GIS-based model for exposure analytics, allowing FEMA to assess the severity of an event, estimate impacts to structures and population in near real-time, and then apply planning factors to exposure estimates to answer questions such as: What geographic areas are impacted? Will federal support be needed? What resources are needed to support survivors? And which infrastructure elements or essential facilities are threatened? This presentation reviews the development of the Earthquake Incident Journal, detailing the data integration solutions, the methodology behind the GIS-based automated exposure model, and the planning factors as well as other analytical advances that provide near real-time decision support to the federal government.
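    The "apply planning factors to exposure estimates" step can be pictured with a toy calculation like the one below. The exposed-population counts per shaking-intensity band and the planning factors are entirely hypothetical; FEMA's actual factors and exposure model are not reproduced here.

```python
# Toy illustration: planning factors applied to hypothetical exposure estimates.
exposed_population = {"MMI VI": 120_000, "MMI VII": 45_000, "MMI VIII+": 8_000}
shelter_factor     = {"MMI VI": 0.01,    "MMI VII": 0.05,   "MMI VIII+": 0.15}

shelter_need = {band: round(pop * shelter_factor[band])
                for band, pop in exposed_population.items()}
print("estimated shelter seekers:", shelter_need,
      "total:", sum(shelter_need.values()))
```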

  18. Integrated Data & Analysis in Support of Informed and Transparent Decision Making

    NASA Astrophysics Data System (ADS)

    Guivetchi, K.

    2012-12-01

    The California Water Plan includes a framework for improving water reliability, environmental stewardship, and economic stability through two initiatives - integrated regional water management to make better use of local water sources by integrating multiple aspects of managing water and related resources; and maintaining and improving statewide water management systems. The Water Plan promotes ways to develop a common approach for data standards and for understanding, evaluating, and improving regional and statewide water management systems, and for common ways to evaluate and select from alternative management strategies and projects. The California Water Plan acknowledges that planning for the future is uncertain and that change will continue to occur. It is not possible to know for certain how population growth, land use decisions, water demand patterns, environmental conditions, the climate, and many other factors that affect water use and supply may change by 2050. To anticipate change, our approach to water management and planning for the future needs to consider and quantify uncertainty, risk, and sustainability. There is a critical need for information sharing and information management to support over-arching and long-term water policy decisions that cross-cut multiple programs across many organizations and provide a common and transparent understanding of water problems and solutions. Achieving integrated water management with multiple benefits requires a transparent description of dynamic linkages between water supply, flood management, water quality, land use, environmental water, and many other factors. Water Plan Update 2013 will include an analytical roadmap for improving data, analytical tools, and decision-support to advance integrated water management at statewide and regional scales. It will include recommendations for linking collaborative processes with technical enhancements, providing effective analytical tools, and improving and sharing data and information. Specifically, this includes achieving better integration and consistency with other planning activities; obtaining consensus on quantitative deliverables; building a common conceptual understanding of the water management system; developing common schematics of the water management system; establishing modeling protocols and standards; and improving transparency and exchange of Water Plan information.

  19. Magic Angle Spinning NMR Metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi Hu, Jian

    Nuclear Magnetic Resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted and unbiased method that requires no or minimal sample preparation, and is one of the leading analytical tools for metabonomics research [1-3]. The ease of quantification and the fact that no prior knowledge of the compounds present in a sample is required make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and their NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low-molecular-weight metabolites.

  20. Controlling Wafer Contamination Using Automated On-Line Metrology during Wet Chemical Cleaning

    NASA Astrophysics Data System (ADS)

    Wang, Jason; Kingston, Skip; Han, Ye; Saini, Harmesh; McDonald, Robert; Mui, Rudy

    2003-09-01

    The capabilities of a trace contamination analyzer are discussed and demonstrated. This analytical tool utilizes an electrospray time-of-flight mass spectrometer (ES-TOF-MS) for fully automated on-line monitoring of wafer cleaning solutions. The analyzer provides rich information on metallic, anionic, cationic, elemental, and organic species through its ability to provide harsh (elemental) and soft (molecular) ionization under both positive and negative modes. It is designed to meet semiconductor process control and yield management needs for the increasingly complex new chemistries present in wafer fabrication.

  1. Optimization of on-line hydrogen stable isotope ratio measurements of halogen- and sulfur-bearing organic compounds using elemental analyzer–chromium/high-temperature conversion isotope ratio mass spectrometry (EA-Cr/HTC-IRMS)

    USGS Publications Warehouse

    Gehre, Matthias; Renpenning, Julian; Geilmann, Heike; Qi, Haiping; Coplen, Tyler B.; Kümmel, Steffen; Ivdra, Natalija; Brand, Willi A.; Schimmelmann, Arndt

    2017-01-01

    Conclusions: The optimized EA-Cr/HTC reactor design can be implemented in existing analytical equipment using commercially available material and is universally applicable for both heteroelement-bearing and heteroelement-free organic-compound classes. The sensitivity and simplicity of the on-line EA-Cr/HTC-IRMS technique provide a much needed tool for routine hydrogen-isotope source tracing of organic contaminants in the environment. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Optical Micromachining

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovation Research) award with Marshall Space Flight Center, Potomac Photonics, Inc., constructed and demonstrated a unique tool that fills a need in the area of diffractive and refractive micro-optics. It is an integrated computer-aided design and computer-aided micro-machining workstation that will extend the benefits of diffractive and micro-optic technology to optical designers. Applications of diffractive optics include sensors and monitoring equipment, analytical instruments, and fiber optic distribution and communication. The company has been making diffractive elements with the system as a commercial service for the last year.

  3. Chemical clocks, oscillations, and other temporal effects in analytical chemistry: oddity or viable approach?

    PubMed

    Prabhu, Gurpur Rakesh D; Witek, Henryk A; Urban, Pawel L

    2018-05-31

    Most analytical methods are based on "analogue" inputs from sensors of light, electric potentials, or currents. The signals obtained by such sensors are processed using certain calibration functions to determine concentrations of the target analytes. The signal readouts are normally done after an optimised and fixed time period, during which an assay mixture is incubated. This minireview covers another, somewhat unusual, analytical strategy, which relies on the measurement of the time interval between the occurrence of two distinguishable states in the assay reaction. These states manifest themselves via abrupt changes in the properties of the assay mixture (e.g. change of colour, appearance or disappearance of luminescence, change in pH, variations in optical activity or mechanical properties). In some cases, a correlation between the time of appearance/disappearance of a given property and the analyte concentration can also be observed. An example of an assay based on time measurement is an oscillating reaction, in which the period of oscillations is linked to the concentration of the target analyte. A number of chemo-chronometric assays, relying on existing (bio)transformations or artificially designed reactions, were disclosed in the past few years. They are very attractive from the fundamental point of view but, so far, only a few of them have been validated and used to address real-world problems. Can chemo-chronometric assays, then, become a practical tool for chemical analysis? Is there a need for further development of such assays? We aim to answer these questions.
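    The chemo-chronometric idea can be sketched as a simple calibration: measure the time until the abrupt state change for known standards, fit a calibration curve, and invert it for an unknown. The calibration points and the log-log linear model below are hypothetical.

```python
# Hypothetical time-to-event calibration for a chemo-chronometric assay.
import numpy as np

conc = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])               # mmol/L standards
t_switch = np.array([410.0, 230.0, 105.0, 58.0, 33.0, 15.0])  # seconds to colour change

slope, intercept = np.polyfit(np.log(conc), np.log(t_switch), 1)

def conc_from_time(t):
    """Invert the fitted log-log calibration: t = exp(intercept) * c**slope."""
    return np.exp((np.log(t) - intercept) / slope)

print(f"unknown sample switching at 80 s -> ~{conc_from_time(80.0):.2f} mmol/L")
```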

  4. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been gathered and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and preliminary use case analysis (a work in progress) will be presented.

  5. The Benefits and Complexities of Operating Geographic Information Systems (GIS) in a High Performance Computing (HPC) Environment

    NASA Astrophysics Data System (ADS)

    Shute, J.; Carriere, L.; Duffy, D.; Hoy, E.; Peters, J.; Shen, Y.; Kirschbaum, D.

    2017-12-01

    The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center is building and maintaining an Enterprise GIS capability for its stakeholders, to include NASA scientists, industry partners, and the public. This platform is powered by three GIS subsystems operating in a highly-available, virtualized environment: 1) the Spatial Analytics Platform is the primary NCCS GIS and provides users discoverability of the vast DigitalGlobe/NGA raster assets within the NCCS environment; 2) the Disaster Mapping Platform provides mapping and analytics services to NASA's Disaster Response Group; and 3) the internal (Advanced Data Analytics Platform/ADAPT) enterprise GIS provides users with the full suite of Esri and open source GIS software applications and services. All systems benefit from NCCS's cutting edge infrastructure, to include an InfiniBand network for high speed data transfers; a mixed/heterogeneous environment featuring seamless sharing of information between Linux and Windows subsystems; and in-depth system monitoring and warning systems. Due to its co-location with the NCCS Discover High Performance Computing (HPC) environment and the Advanced Data Analytics Platform (ADAPT), the GIS platform has direct access to several large NCCS datasets including DigitalGlobe/NGA, Landsat, MERRA, and MERRA2. Additionally, the NCCS ArcGIS Desktop Windows virtual machines utilize existing NetCDF and OPeNDAP assets for visualization, modelling, and analysis - thus eliminating the need for data duplication. With the advent of this platform, Earth scientists have full access to vast data repositories and the industry-leading tools required for successful management and analysis of these multi-petabyte, global datasets. The full system architecture and integration with scientific datasets will be presented. Additionally, key applications and scientific analyses will be explained, to include the NASA Global Landslide Catalog (GLC) Reporter crowdsourcing application, the NASA GLC Viewer discovery and analysis tool, the DigitalGlobe/NGA Data Discovery Tool, the NASA Disaster Response Group Mapping Platform (https://maps.disasters.nasa.gov), and support for NASA's Arctic - Boreal Vulnerability Experiment (ABoVE).

  6. A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.

    2015-12-01

    Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and study variability.
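    A direct web service call of the kind mentioned above might look like the hedged sketch below, which issues a generic OGC WPS 1.0.0 key-value Execute request over HTTP. The endpoint URL, process identifier and input names are placeholders, not CDAS's documented API.

```python
# Generic WPS 1.0.0 key-value Execute request; endpoint and identifiers are hypothetical.
import requests

endpoint = "https://example.nasa.gov/wps"          # hypothetical endpoint
params = {
    "service": "WPS",
    "request": "Execute",
    "version": "1.0.0",
    "identifier": "timeAverage",                   # hypothetical kernel name
    "datainputs": "variable=tas;domain=global;period=1980-01/2010-12",
}
response = requests.get(endpoint, params=params, timeout=60)
print(response.status_code)
print(response.text[:500])   # WPS responses are XML status/result documents
```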

  7. QSPR studies on the photoinduced-fluorescence behaviour of pharmaceuticals and pesticides.

    PubMed

    López-Malo, D; Bueso-Bordils, J I; Duart, M J; Alemán-López, P A; Martín-Algarra, R V; Antón-Fos, G M; Lahuerta-Zamora, L; Martínez-Calatayud, J

    2017-07-01

    Fluorimetric analysis is still a growing line of research in the determination of a wide range of organic compounds, including pharmaceuticals and pesticides, which makes it necessary to develop new strategies aimed at improving the performance of fluorescence determinations as well as the sensitivity and, especially, the selectivity of the newly developed analytical methods. This paper presents applications of a useful and growing tool for fostering and improving research in the analytical field. Experimental screening, molecular connectivity and discriminant analysis are applied to organic compounds to predict their fluorescent behaviour after their photodegradation by UV irradiation in a continuous flow manifold (multicommutation flow assembly). The screening was based on online fluorimetric measurement and comprised pre-selected compounds with different molecular structures (pharmaceuticals and some pesticides with known 'native' fluorescent behaviour) to study their changes in fluorescent behaviour after UV irradiation. Theoretical predictions agree with the results from the experimental screening and could be used to develop selective analytical methods, as well as helping to reduce the need for expensive, time-consuming, trial-and-error screening procedures.
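    The discriminant-analysis step can be sketched as a classifier that predicts photoinduced-fluorescence behaviour from molecular descriptors. The descriptor matrix and labels below are synthetic stand-ins; the study's actual connectivity indices and discriminant functions are not reproduced here.

```python
# Linear discriminant analysis on synthetic "connectivity index" descriptors.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 120
X = rng.normal(size=(n, 4))                 # four hypothetical connectivity indices
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.7, size=n) > 0).astype(int)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # cross-validated classification accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```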

  8. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

    The use of thermoluminescent dosimeters (TLDs) in heavy-charged-particle dosimetry is limited by their non-linear dose response curve and by their response dependence on the radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple and completely analytical algorithm for the calculation of the relative TL-efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. Finally, the calculated efficiency values were implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field were performed and verified against experimental data.

  9. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to and communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).
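    The core idea of pairing a gridded estimate with a per-cell uncertainty measure can be sketched as follows: aggregate scattered sample points onto a grid and report, for every cell, both the mean value and simple uncertainty proxies (sample count and spread). This is a generic illustration with synthetic points, not the VGM's variable-resolution topology or its Hadoop implementation.

```python
# Grid aggregation with per-cell mean, count and standard deviation (synthetic data).
import numpy as np
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(7)
x, y = rng.uniform(0, 100, 2_000), rng.uniform(0, 100, 2_000)
z = np.sin(x / 15.0) + 0.1 * y + rng.normal(scale=0.3, size=x.size)

bins = 10
mean  = binned_statistic_2d(x, y, z, statistic="mean",  bins=bins).statistic
count = binned_statistic_2d(x, y, z, statistic="count", bins=bins).statistic
std   = binned_statistic_2d(x, y, z, statistic="std",   bins=bins).statistic

# Cells with few samples or high spread would be flagged as more uncertain.
print("least-sampled cell has", int(count.min()), "points;",
      "highest within-cell std:", round(float(np.nanmax(std)), 2))
```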

  10. How do gut feelings feature in tutorial dialogues on diagnostic reasoning in GP traineeship?

    PubMed

    Stolper, C F; Van de Wiel, M W J; Hendriks, R H M; Van Royen, P; Van Bokhoven, M A; Van der Weijden, T; Dinant, G J

    2015-05-01

    Diagnostic reasoning is considered to be based on the interaction between analytical and non-analytical cognitive processes. Gut feelings, a specific form of non-analytical reasoning, play a substantial role in diagnostic reasoning by general practitioners (GPs) and may activate analytical reasoning. In GP traineeships in the Netherlands, trainees mostly see patients alone but regularly consult with their supervisors to discuss patients and problems, receive feedback, and improve their competencies. In the present study, we examined the discussions of supervisors and their trainees about diagnostic reasoning in these so-called tutorial dialogues and how gut feelings feature in these discussions. 17 tutorial dialogues focussing on diagnostic reasoning were video-recorded and transcribed and the protocols were analysed using a detailed bottom-up and iterative content analysis and coding procedure. The dialogues were segmented into quotes. Each quote received a content code and a participant code. The number of words per code was used as a unit of analysis to quantitatively compare the contributions to the dialogues made by supervisors and trainees, and the attention given to different topics. The dialogues were usually analytical reflections on a trainee's diagnostic reasoning. A hypothetico-deductive strategy was often used, by listing differential diagnoses and discussing what information guided the reasoning process and might confirm or exclude provisional hypotheses. Gut feelings were discussed in seven dialogues. They were used as a tool in diagnostic reasoning, inducing analytical reflection, sometimes on the entire diagnostic reasoning process. The emphasis in these tutorial dialogues was on analytical components of diagnostic reasoning. Discussing gut feelings in tutorial dialogues seems to be a good educational method to familiarize trainees with non-analytical reasoning. Supervisors need specialised knowledge about these aspects of diagnostic reasoning and how to deal with them in medical education.

  11. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Summary Objectives We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
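    A standard disproportionality score of the kind such a tool computes from drug-adverse-event co-mention counts is the proportional reporting ratio (PRR) or reporting odds ratio (ROR) derived from a 2x2 contingency table. The counts below are hypothetical; the tool derives its counts from MeSH-indexed PubMed/MEDLINE citations.

```python
# PRR and ROR from a hypothetical 2x2 table of citation counts.
import math

a = 40      # citations mentioning both the drug and the adverse event
b = 960     # citations mentioning the drug but not the event
c = 200     # citations mentioning the event but not the drug
d = 98_800  # citations mentioning neither

prr = (a / (a + b)) / (c / (c + d))
ror = (a * d) / (b * c)
se_log_ror = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low, ci_high = (math.exp(math.log(ror) - 1.96 * se_log_ror),
                   math.exp(math.log(ror) + 1.96 * se_log_ror))

print(f"PRR = {prr:.1f}, ROR = {ror:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```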

  12. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  13. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods such as optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.

  14. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  15. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Prepared by David Tate, Cost Analysis and Research Division, Institute for Defense Analyses (Alexandria, VA). Topics include unit cost as a function of learning and rate (Womer) and learning with forgetting, in which learning depreciates over time (Benkard).

  16. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    Operational analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI's Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.

  17. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  18. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  19. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    Report sections include NASIC's investment in analytical capabilities and study limitations. This project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study. Though the study took all three categories into account, most (90%) of the focus for the SRA team's effort was on identifying and analyzing…

  20. Status quo and future research challenges on organic food quality determination with focus on laboratory methods.

    PubMed

    Kahl, Johannes; Bodroza-Solarov, Marija; Busscher, Nicolaas; Hajslova, Jana; Kneifel, Wolfgang; Kokornaczyk, Maria Olga; van Ruth, Saskia; Schulzova, Vera; Stolz, Peter

    2014-10-01

    Organic food quality determination needs multi-dimensional evaluation tools. The main focus is on authentication as an analytical verification of the certification process. New fingerprinting approaches such as ultra-performance liquid chromatography-mass spectrometry, gas chromatography-mass spectrometry, and direct analysis in real time-high-resolution mass spectrometry, as well as crystallization with and without the presence of additives, seem to be promising methods in terms of time of analysis and detection of organic system-related parameters. For further methodological development, a systems approach is recommended that also takes food structure aspects into account. Furthermore, the authentication of processed organic samples deserves more attention, since most organic food is complex and processed. © 2013 Society of Chemical Industry.

  1. Molecular signature of complex regional pain syndrome (CRPS) and its analysis.

    PubMed

    König, Simone; Schlereth, Tanja; Birklein, Frank

    2017-10-01

    Complex Regional Pain Syndrome (CRPS) is a rare, but often disabling pain disease. Biomarkers are lacking, but several inflammatory substances have been associated with the pathophysiology. This review outlines the current knowledge with respect to target biomolecules and the analytical tools available to measure them. Areas covered: Targets include cytokines, neuropeptides and resolvins; analysis strategies are thus needed for different classes of substances such as proteins, peptides, lipids and small molecules. Traditional methods like immunoassays are of importance next to state-of-the art high-resolution mass spectrometry techniques and 'omics' approaches. Expert commentary: Future biomarker studies need larger cohorts, which improve subgrouping of patients due to their presumed pathophysiology, and highly standardized workflows from sampling to analysis.

  2. Why infectious disease research needs community ecology

    PubMed Central

    Johnson, Pieter T. J.; de Roode, Jacobus C.; Fenton, Andy

    2016-01-01

    Infectious diseases often emerge from interactions among multiple species and across nested levels of biological organization. Threats as diverse as Ebola virus, human malaria, and bat white-nose syndrome illustrate the need for a mechanistic understanding of the ecological interactions underlying emerging infections. We describe how recent advances in community ecology can be adopted to address contemporary challenges in disease research. These analytical tools can identify the factors governing complex assemblages of multiple hosts, parasites, and vectors, and reveal how processes link across scales from individual hosts to regions. They can also determine the drivers of heterogeneities among individuals, species, and regions to aid targeting of control strategies. We provide examples where these principles have enhanced disease management and illustrate how they can be further extended. PMID:26339035

  3. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  4. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  5. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a_w during processing is of special interest. In this paper, predictive models for salt content (R² = 0.960 and RMSECV = 0.393), water content (R² = 0.912 and RMSECV = 1.751), and a_w (R² = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a_w in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a_w predictions.
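
    As an illustration of the kind of cross-validated calibration statistics reported above (R² and RMSECV), the following minimal Python sketch computes a leave-one-out RMSECV for a simple calibration; the data are synthetic and ordinary least squares is used as a stand-in, since the abstract does not state the regression method actually used.

      # Leave-one-out cross-validation RMSECV for a linear calibration (illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(30, 3))            # e.g. CT attenuation features per sample
      y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.3, size=30)  # e.g. salt content

      residuals = []
      for i in range(len(y)):
          mask = np.arange(len(y)) != i
          coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
          residuals.append(y[i] - X[i] @ coef)

      rmsecv = np.sqrt(np.mean(np.square(residuals)))
      r2_cv = 1.0 - np.sum(np.square(residuals)) / np.sum(np.square(y - y.mean()))
      print(f"RMSECV = {rmsecv:.3f}, cross-validated R^2 = {r2_cv:.3f}")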

  6. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; therefore, the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice becomes important, so that the different characteristics in each part of space could inform the design process. The analytical tool is developed for use as a stand-alone application that utilises the data exported from a generic BIM modelling tool. The tool would be useful to assist the design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool in a BIM-based design process could thereby assist architects to generate better designs and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.

  7. INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING

    PubMed Central

    Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong

    2017-01-01

    Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
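
    The methodology summarized above (extracting the parameters that most affect system performance from collected data, then using them to build simulation scenarios) can be sketched as follows; the parameter names, data, and the simple correlation-based ranking are illustrative assumptions, not the paper's actual workflow.

      # Rank candidate parameters by absolute correlation with a performance metric,
      # then keep the top ones as inputs for simulation scenarios (illustrative sketch).
      import numpy as np

      rng = np.random.default_rng(1)
      n = 500
      data = {                                  # hypothetical shop-floor sensor data
          "spindle_speed": rng.normal(1000, 50, n),
          "feed_rate": rng.normal(0.2, 0.02, n),
          "coolant_temp": rng.normal(25, 2, n),
      }
      throughput = (0.01 * data["spindle_speed"]
                    - 20.0 * data["feed_rate"]
                    + rng.normal(0, 1, n))      # hypothetical performance metric

      ranking = sorted(
          ((name, abs(np.corrcoef(values, throughput)[0, 1]))
           for name, values in data.items()),
          key=lambda item: item[1], reverse=True)

      top_parameters = [name for name, _ in ranking[:2]]
      print("Parameters to vary in simulation scenarios:", top_parameters)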

  8. Applying Feedback Analysis on Citizen’s Participation System of CSFLU Barangays on Disaster Preparedness

    NASA Astrophysics Data System (ADS)

    Ocampo, A. J.; Baro, R.; Palaoag, T.

    2018-03-01

    Various initiatives using ICT have paved the way to improve government services during disaster situations. ICT has helped in the preparation and mitigation process through mediums such as social networking sites and SMS used to disseminate information. However, the data gathered from these mediums are not sufficient to address the problems experienced by citizens, so the concept of a citizen's participation system was developed. The objective of the study is to provide a mechanism or tool for barangay officials and the city government to strategically plan preventive measures during disasters based on the citizens' perspective. Sentiments, suggestions, and feedback from citizens were analysed using feedback analysis in order to provide the accurate data needed by the disaster response team.
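
    The abstract does not spell out the feedback-analysis procedure; a minimal sketch of the general idea (classifying citizen feedback with a keyword lexicon and aggregating it per barangay so officials can prioritise measures) might look like the following, with the lexicon and sample messages invented for illustration.

      # Toy lexicon-based classification and aggregation of citizen feedback (illustrative only).
      from collections import Counter

      LEXICON = {
          "flood": "flooding", "baha": "flooding",
          "evacuate": "evacuation", "rescue": "rescue",
          "food": "relief", "water": "relief",
      }

      def classify(message):
          labels = {label for word, label in LEXICON.items() if word in message.lower()}
          return labels or {"other"}

      feedback = [
          ("Barangay A", "Need rescue, the street is under water"),
          ("Barangay A", "Baha na po dito, please send food"),
          ("Barangay B", "Where do we evacuate?"),
      ]

      counts = Counter((barangay, label)
                       for barangay, message in feedback
                       for label in classify(message))
      for (barangay, label), n in counts.most_common():
          print(f"{barangay}: {label} x{n}")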

  9. BIOCHEMISTRY OF MOBILE ZINC AND NITRIC OXIDE REVEALED BY FLUORESCENT SENSORS

    PubMed Central

    Pluth, Michael D.; Tomat, Elisa; Lippard, Stephen J.

    2010-01-01

    Biologically mobile zinc and nitric oxide (NO) are two prominent examples of inorganic compounds involved in numerous signaling pathways in living systems. In the past decade, a synergy of regulation, signaling, and translocation of these two species has emerged in several areas of human physiology, providing additional incentive for developing adequate detection systems for Zn(II) ions and NO in biological specimens. Fluorescent probes for both of these bioinorganic analytes provide excellent tools for their detection, with high spatial and temporal resolution. We review the most widely used fluorescent sensors for biological zinc and nitric oxide, together with promising new developments and unmet needs of contemporary Zn(II) and NO biological imaging. The interplay between zinc and nitric oxide in the nervous, cardiovascular, and immune systems is highlighted to illustrate the contributions of selective fluorescent probes to the study of these two important bioinorganic analytes. PMID:21675918

  10. Neutron Activation Analysis of the Rare Earth Elements (REE) - With Emphasis on Geological Materials

    NASA Astrophysics Data System (ADS)

    Stosch, Heinz-Günter

    2016-08-01

    Neutron activation analysis (NAA) has been the analytical method of choice for rare earth element (REE) analysis from the early 1960s through the 1980s. At that time, irradiation facilities were widely available and fairly easily accessible. The development of high-resolution gamma-ray detectors in the mid-1960s eliminated, for many applications, the need for chemical separation of the REE from the matrix material, making NAA a reliable and effective analytical tool. While not as precise as isotope dilution mass spectrometry, NAA was competitive by being sensitive for the analysis of about half of the rare earths (La, Ce, Nd, Sm, Eu, Tb, Yb, Lu). The development of inductively coupled plasma mass spectrometry since the 1980s, together with decommissioning of research reactors and the lack of installation of new ones in Europe and North America has led to the rapid decline of NAA.

  11. Economic benefit evaluation for renewable energy transmitted by HVDC based on production simulation (PS) and analytic hierarchy process(AHP)

    NASA Astrophysics Data System (ADS)

    Zhang, Jinfang; Zheng, Kuan; Liu, Jun; Huang, Xinting

    2018-02-01

    In order to support North and West China’s renewable energy (RE) development and keep its accommodation at a reasonably high level, HVDC’s traditional operation curves need to change to follow the output characteristics of RE, which helps to reduce the curtailed electricity and the curtailment ratio of RE. In this paper, an economic benefit analysis method based on production simulation (PS) and the Analytic Hierarchy Process (AHP) is proposed. PS is the basic tool used to analyze the chosen power system operation situation, and the AHP method gives a suitable comparison result among many candidate schemes. Based on four different transmission curve combinations, the related economic benefit has been evaluated by PS and AHP. The results and related indices show the efficiency of the suggested method, and it is finally validated that an HVDC operation curve that follows the RE output can help decrease the RE curtailment level and improve economic operation.
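
    For readers unfamiliar with the AHP step used above, the sketch below derives priority weights for candidate transmission-curve schemes from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio; the comparison values are invented for illustration and are not taken from the paper.

      # Analytic Hierarchy Process: priority weights from a pairwise comparison matrix.
      import numpy as np

      # A[i, j] = how strongly scheme i is preferred over scheme j (Saaty's 1-9 scale).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                      # priority vector

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)          # consistency index
      ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # random index (Saaty)
      print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / ri, 3))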

  12. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.

  13. Forensic collection of trace chemicals from diverse surfaces with strippable coatings.

    PubMed

    Jakubowski, Michael J; Beltis, Kevin J; Drennan, Paul M; Pindzola, Bradford A

    2013-11-07

    Surface sampling for chemical analysis plays a vital role in environmental monitoring, industrial hygiene, homeland security and forensics. The standard surface sampling tool, a simple cotton gauze pad, is failing to meet the needs of the community as analytical techniques become more sensitive and the variety of analytes increases. In previous work, we demonstrated the efficacy of non-destructive, conformal, spray-on strippable coatings for chemical collection from simple glass surfaces. Here we expand that work by presenting chemical collection at a low spiking level (0.1 g m⁻²) from a diverse array of common surfaces - painted metal, engineering plastics, painted wallboard and concrete - using strippable coatings. The collection efficiency of the strippable coatings is compared to and far exceeds that of gauze pads. Collection from concrete, a particular challenge for wipes like gauze, averaged 73% over eight chemically diverse compounds for the strippable coatings whereas gauze averaged 10%.

  14. Heave-pitch-roll analysis and testing of air cushion landing systems

    NASA Technical Reports Server (NTRS)

    Boghani, A. B.; Captain, K. M.; Wormley, D. N.

    1978-01-01

    The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: the development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general purpose computer simulation to evaluate landing and taxi performance of an ACLS equipped aircraft; and the verification and refinement of the analysis by comparison with test data obtained through lab testing of a prototype cushion. Demonstration of simulation capabilities through typical landing and taxi simulation of an ACLS aircraft is given. Initial results show that fan dynamics have a major effect on system performance. Comparison with lab test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.

  15. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  16. Making advanced analytics work for you.

    PubMed

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data-but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  17. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
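
    As a flavour of the kind of analytic verification such one-group test problems support (this is not the simple_ace tools or MCNP itself), the sketch below compares a tiny one-group Monte Carlo estimate of uncollided transmission through a slab with the analytic exp(-Σt·x) solution; the cross section and thickness are assumed values.

      # One-group Monte Carlo vs. analytic uncollided transmission (illustrative only).
      import math, random

      sigma_t = 0.5      # total macroscopic cross section, 1/cm (assumed value)
      thickness = 3.0    # slab thickness, cm
      histories = 100_000

      random.seed(42)
      transmitted = sum(
          1 for _ in range(histories)
          if -math.log(random.random()) / sigma_t > thickness)  # sampled distance to first collision

      mc = transmitted / histories
      analytic = math.exp(-sigma_t * thickness)
      print(f"Monte Carlo: {mc:.4f}  analytic: {analytic:.4f}")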

  18. Learner Dashboards a Double-Edged Sword? Students' Sense-Making of a Collaborative Critical Reading and Learning Analytics Environment for Fostering 21st-Century Literacies

    ERIC Educational Resources Information Center

    Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon

    2017-01-01

    The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…

  19. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  20. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  1. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.

  2. Quantification and propagation of disciplinary uncertainty via Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Mantis, George Constantine

    2002-08-01

    Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty, that which is inherent in the analytical tools utilized during the design process. Yet, few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state-of-the-art in design, the effects of this unquantifiable uncertainty significantly increase the risks associated with new design efforts. Typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, the research encompasses the formulation and validation of a new design method, a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian Statistics theory to quantify this source of uncertainty, and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in the assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. Application to a single-stage-to-orbit (SSTO) reusable launch vehicle concept, developed by the NASA Langley Research Center under the Space Launch Initiative, provides the validation case for this work, with the focus placed on economics, aerothermodynamics, propulsion, and structures metrics. (Abstract shortened by UMI.)
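
    A minimal numerical illustration of the quantification step described above (comparing a disciplinary tool's predictions against physical data to infer its bias, then propagating that uncertainty to a system-level metric) is sketched below using a conjugate normal-normal update; the prior, noise level, and discrepancy data are assumptions, and the dissertation's full method is considerably richer.

      # Normal-normal Bayesian update of a tool's bias from prediction-vs-test discrepancies,
      # then simple Monte Carlo propagation to a corrected prediction (illustrative sketch).
      import numpy as np

      discrepancies = np.array([0.8, 1.1, 0.5, 1.4, 0.9])   # (test - analysis), assumed data
      sigma_obs = 0.5                                        # assumed observation/model noise
      mu0, tau0 = 0.0, 2.0                                   # vague prior on the tool bias

      n = len(discrepancies)
      tau_n = 1.0 / np.sqrt(1.0 / tau0**2 + n / sigma_obs**2)
      mu_n = tau_n**2 * (mu0 / tau0**2 + discrepancies.sum() / sigma_obs**2)

      rng = np.random.default_rng(0)
      bias_samples = rng.normal(mu_n, tau_n, 10_000)
      nominal_prediction = 10.0                              # assumed discipline-level metric
      corrected = nominal_prediction + bias_samples
      print(f"posterior bias: {mu_n:.2f} +/- {tau_n:.2f};"
            f" corrected metric 95% interval: [{np.percentile(corrected, 2.5):.2f},"
            f" {np.percentile(corrected, 97.5):.2f}]")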

  3. Do knowledge translation (KT) plans help to structure KT practices?

    PubMed

    Tchameni Ngamo, Salomon; Souffez, Karine; Lord, Catherine; Dagenais, Christian

    2016-06-17

    A knowledge translation (KT) planning template is a roadmap laying out the core elements to be considered when structuring the implementation of KT activities by researchers and practitioners. Since 2010, the Institut national de santé publique du Québec (INSPQ; Québec Public Health Institute) has provided tools and guidance to in-house project teams to help them develop KT plans. This study sought to identify the dimensions included in those plans and which ones were integrated and how. The results will be of interest to funding agencies and scientific organizations that provide frameworks for KT planning. The operationalization of KT planning dimensions was assessed in a mixed methods case study of 14 projects developed at the INSPQ between 2010 and 2013. All plans were assessed (rated) using an analytical tool developed for this study and data from interviews with the planning coordinators. The analytical tool and interview guide were based on eight core KT dimensions identified in the literature. Analysis of the plans and interviews revealed that the dimensions best integrated into the KT plans were 'analysis of the context (barriers and facilitators) and of users' needs', 'knowledge to be translated', 'KT partners', 'KT strategies' and, to a lesser extent, 'overall KT approach'. The least well integrated dimensions were 'knowledge about knowledge users', 'KT process evaluation' and 'resources'. While the planning coordinators asserted that a plan did not need to include all the dimensions to ensure its quality and success, nevertheless the dimensions that received less attention might have been better incorporated if they had been supported with more instruments related to those dimensions and sustained methodological guidance. Overall, KT planning templates appear to be an appreciated mechanism for supporting KT reflexive practices. Based on this study and our experience, we recommend using KT plans cautiously when assessing project efficacy and funding.

  4. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of the tools currently available will be assessed to identify the gaps between their capabilities and the LTA industry’s needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  5. GeneTools--application for functional annotation and statistical hypothesis testing.

    PubMed

    Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid

    2006-10-24

    Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web-service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get data most recently available. Data submitted by the user are stored in the database, where it can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and GeneOntology, in both single- and batch search mode. ii) GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest. These user defined GO annotations can be used in further analysis or exported for public distribution. iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool, eGOn supports hypothesis testing for three different situations (master-target situation, mutually exclusive target-target situation and intersecting target-target situation). An important additional function is an evidence-code filter that allows users, to select the GO annotations for the analysis. GeneTools is the first "all in one" annotation tool, providing users with a rapid extraction of highly relevant gene annotation data for e.g. thousands of genes or clones at once. It allows a user to define and archive new GO annotations and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no

  6. The second iteration of the Systems Prioritization Method: A systems prioritization and decision-aiding tool for the Waste Isolation Pilot Plant: Volume 2, Summary of technical input and model implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prindle, N.H.; Mendenhall, F.T.; Trauth, K.

    1996-05-01

    The Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories (SNL). SPM provides an analytical basis for supporting programmatic decisions for the Waste Isolation Pilot Plant (WIPP) to meet selected portions of the applicable US EPA long-term performance regulations. The first iteration of SPM (SPM-1), the prototype for SPM, was completed in 1994. It served as a benchmark and a test bed for developing the tools needed for the second iteration of SPM (SPM-2). SPM-2, completed in 1995, is intended for programmatic decision making. This is Volume II of the three-volume final report of the second iteration of the SPM. It describes the technical input and model implementation for SPM-2, and presents the SPM-2 technical baseline and the activities, activity outcomes, outcome probabilities, and the input parameters for SPM-2 analysis.

  7. Application of the predicted heat strain model in development of localized, threshold-based heat stress management guidelines for the construction industry.

    PubMed

    Rowlinson, Steve; Jia, Yunyan Andrea

    2014-04-01

    Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Standard Organisation] to predict maximum allowable exposure time (D lim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.
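
    To illustrate how a predicted maximum allowable exposure time can be turned into a simple work-rest regimen of the kind described above (the actual tools and the ISO 7933 calculation are not reproduced here), consider this hedged sketch with assumed numbers:

      # Split a task into work bouts no longer than the predicted allowable exposure time,
      # each followed by a fixed recovery break (illustrative arithmetic, not ISO 7933).
      import math

      d_lim_min = 45          # assumed PHS-predicted max allowable continuous exposure, minutes
      task_min = 180          # total work required, minutes
      rest_min = 15           # assumed recovery break per bout, minutes

      bouts = math.ceil(task_min / d_lim_min)
      total_time = task_min + (bouts - 1) * rest_min
      print(f"{bouts} work bouts of <= {d_lim_min} min with {rest_min} min breaks;"
            f" total elapsed time about {total_time} min")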

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R’s flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper’s contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
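
    The schema-on-read idea itself is language-agnostic; a stripped-down Python analogue of the recursive, extension-dispatched reading described above (not the SchemaOnRead R package) could look like this, handling just a few file types:

      # Minimal schema-on-read style reader: recursively load files in native form and
      # defer any schema decisions until query time (illustrative Python analogue).
      import csv, json
      from pathlib import Path

      def read_any(path: Path):
          suffix = path.suffix.lower()
          if suffix == ".csv":
              with path.open(newline="") as f:
                  return list(csv.DictReader(f))
          if suffix == ".json":
              return json.loads(path.read_text())
          if suffix in {".txt", ".log"}:
              return path.read_text()
          return path.read_bytes()          # fall back to raw bytes

      def read_folder(root):
          return {str(p): read_any(p) for p in Path(root).rglob("*") if p.is_file()}

      # Example: data = read_folder("field_data"); later queries shape the data as needed.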

  9. Analytical considerations for mass spectrometry profiling in serum biomarker discovery.

    PubMed

    Whiteley, Gordon R; Colantonio, Simona; Sacconi, Andrea; Saul, Richard G

    2009-03-01

    The potential of using mass spectrometry profiling as a diagnostic tool has been demonstrated for a wide variety of diseases. Various cancers and cancer-related diseases have been the focus of much of this work because of both the paucity of good diagnostic markers and the knowledge that early diagnosis is the most powerful weapon in treating cancer. The implementation of mass spectrometry as a routine diagnostic tool has proved to be difficult, however, primarily because of the stringent controls that are required for the method to be reproducible. The method is evolving as a powerful guide to the discovery of biomarkers that could, in turn, be used either individually or in an array or panel of tests for early disease detection. Using proteomic patterns to guide biomarker discovery and the possibility of deployment in the clinical laboratory environment on current instrumentation or in a hybrid technology has the possibility of being the early diagnosis tool that is needed.

  10. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  11. Greenhouse Gas Mitigation Options Database and Tool - Data ...

    EPA Pesticide Factsheets

    Industry and electricity production facilities generate over 50 percent of greenhouse gas (GHG) emissions in the United States. There is a growing consensus among scientists that the primary cause of climate change is anthropogenic greenhouse gas (GHG) emissions. Reducing GHG emissions from these sources is a key part of the United States’ strategy to reduce the impacts of these global-warming emissions. As a result of the recent focus on GHG emissions, the U.S. Environmental Protection Agency (EPA) and state agencies are implementing policies and programs to quantify and regulate GHG emissions from key emitting sources in the United States. These policies and programs have generated a need for a reliable source of information regarding GHG mitigation options for both industry and regulators. In response to this need, EPA developed a comprehensive GHG mitigation options database (GMOD) that was compiled based on information from industry, government research agencies, and academia. The GMOD and Tool (GMODT) is a comprehensive data repository and analytical tool being developed by EPA to evaluate alternative GHG mitigation options for several high-emitting industry sectors, including electric power plants, cement plants, refineries, landfills and other industrial sources of GHGs. The data is collected from credible sources including peer-reviewed journals, reports, and other government and academic data sources, which include performance, applicability, develop

  12. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  13. HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals

    NASA Astrophysics Data System (ADS)

    Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar

    Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize the TIM and TCHM. The complexities of TIM and TCHM challenge the current official quality control mode, for which only a few biochemical markers were selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. The chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in the herbal drugs and to preserve such a "database" for further multifaceted sustainable studies. Analytical separation techniques, for example, high-performance liquid chromatography (HPLC), gas chromatography (GC) and mass spectrometry (MS), were among the most popular methods of choice used for quality control of raw material and finished herbal product. The fingerprint analysis approach using high-performance thin-layer chromatography (HPTLC) has become the most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are being made to expand the use of HPTLC and at the same time create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication from a wide variety of herbal samples. Some examples demonstrate the role of fingerprinting in quality control and assessment.

  14. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
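
    GeneAnalytics' scoring algorithms are proprietary, but the basic gene-set over-representation idea underlying tools of this kind can be sketched with a hypergeometric test; the gene counts below are invented for illustration and do not come from the paper.

      # Hypergeometric over-representation test for one gene set (illustrative only).
      from scipy.stats import hypergeom

      background = 20000     # genes in the background/universe (assumed)
      in_set = 150           # background genes annotated to the pathway of interest
      query = 300            # genes in the user's differentially expressed list
      overlap = 12           # query genes that fall in the pathway

      # P(X >= overlap) when drawing `query` genes without replacement from the background.
      p_value = hypergeom(background, in_set, query).sf(overlap - 1)
      print(f"enrichment p-value: {p_value:.3g}")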

  15. Amperometric Enzyme-Based Biosensors for Application in Food and Beverage Industry

    NASA Astrophysics Data System (ADS)

    Csöregi, Elisabeth; Gáspár, Szilveszter; Niculescu, Mihaela; Mattiasson, Bo; Schuhmann, Wolfgang

    Continuous, sensitive, selective, and reliable monitoring of a large variety of different compounds in various food and beverage samples is of increasing importance to assure high quality and to trace any possible source of contamination of food and beverages. Most of the presently used classical analytical methods require expensive instrumentation, long analysis times and well-trained staff. Amperometric enzyme-based biosensors, on the other hand, have emerged in the last decade from basic science to useful tools with very promising application possibilities in the food and beverage industry. Amperometric biosensors are in general highly selective, sensitive, relatively cheap, and easy to integrate into continuous analysis systems. A successful application of such sensors for industrial purposes, however, requires a sensor design that satisfies the specific needs of monitoring the targeted analyte in the particular application. Since each individual application requires different operational conditions and sensor characteristics, biosensors have to be tailored for the particular case. The characteristics of a biosensor depend on the biorecognition element used (enzyme), the nature of the signal transducer (electrode material) and the communication between these two elements (electron-transfer pathway).

  16. Psychiatric epidemiology: selected recent advances and future directions.

    PubMed Central

    Kessler, R. C.

    2000-01-01

    Reviewed in this article are selected recent advances and future challenges for psychiatric epidemiology. Major advances in descriptive psychiatric epidemiology in recent years include the development of reliable and valid fully structured diagnostic interviews, the implementation of parallel cross-national surveys of the prevalences and correlates of mental disorders, and the initiation of research in clinical epidemiology. Remaining challenges include the refinement of diagnostic categories and criteria, recognition and evaluation of systematic underreporting bias in surveys of mental disorders, creation and use of accurate assessment tools for studying disorders of children, adolescents, the elderly, and people in less developed countries, and setting up systems to carry out small area estimations for needs assessment and programme planning. Advances in analytical and experimental epidemiology have been more modest. A major challenge is for psychiatric epidemiologists to increase the relevance of their analytical research to their colleagues in preventative psychiatry as well as to social policy analysts. Another challenge is to develop interventions aimed at increasing the proportion of people with mental disorders who receive treatment. Despite encouraging advances, much work still needs to be conducted before psychiatric epidemiology can realize its potential to improve the mental health of populations. PMID:10885165

  17. Analytical and experimental study of the acoustics and the flow field characteristics of cavitating self-resonating water jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chahine, G.L.; Genoux, P.F.; Johnson, V.E. Jr.

    1984-09-01

    Waterjet nozzles (STRATOJETS) have been developed which achieve passive structuring of cavitating submerged jets into discrete ring vortices, and which possess cavitation incipient numbers six times higher than obtained with conventional cavitating jet nozzles. In this study we developed analytical and numerical techniques and conducted experimental work to gain an understanding of the basic phenomena involved. The achievements are: (1) a thorough analysis of the acoustic dynamics of the feed pipe to the nozzle; (2) a theory for bubble ring growth and collapse; (3) a numerical model for jet simulation; (4) an experimental observation and analysis of candidate second-generation low-sigma STRATOJETS. From this study we can conclude that intensification of bubble ring collapse and design of highly resonant feed tubes can lead to improved drilling rates. The models here described are excellent tools to analyze the various parameters needed for STRATOJET optimizations. Further analysis is needed to introduce such important factors as viscosity, nozzle-jet interaction, and ring-target interaction, and to develop the jet simulation model to describe the important fine details of the flow field at the nozzle exit.

  18. Effect-directed analysis of ginger (Zingiber officinale) and its food products, and quantification of bioactive compounds via high-performance thin-layer chromatography and mass spectrometry.

    PubMed

    Krüger, S; Bergin, A; Morlock, G E

    2018-03-15

    Decision makers responsible for quality management along the food chain need to reflect on their analytical tools that should ensure quality of food and especially superfood. The "4ables" in target analysis (stable, extractable, separable, detectable) focusing on marker compounds do not cover all relevant information about the sample. On the example of ginger, a streamlined quantitative bioprofiling was developed for effect-directed analysis of 17 commercially available ginger and ginger-containing products via high-performance thin-layer chromatography (HPTLC-UV/Vis/FLD-bioassay). The samples were investigated concerning their active profile as radical scavengers, antimicrobials, estrogen-like activators and acetylcholinesterase/tyrosinase inhibitors. The [6]-gingerol and [6]-shogaol content of the different products ranged 0.2-7.4mg/g and 0.2-3.0mg/g, respectively. Further, multipotent compounds were discovered, characterized, and for example, assigned as [8]- and [10]-gingerol via HPTLC-ESI-HRMS. The developed bioprofiling is a step forward to new analytical methods needed to inform on the true product quality influenced by cultivation, processing, and storage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between distributed data and analytical tools. This work included construction of interfaces to various commercial database products and to one of the data analysis algorithms developed through this LDRD.
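
    The project's pattern-recognition and GANN algorithms are not reproduced here; as a hedged illustration of the simpler end of outbreak recognition on a health-indicator time series, the sketch below flags days whose case counts exceed a moving baseline by a fixed number of standard deviations, using synthetic data.

      # Flag possible outbreak days: counts exceeding moving baseline + 3 sigma (toy detector).
      import numpy as np

      rng = np.random.default_rng(7)
      counts = rng.poisson(20, 60).astype(float)   # synthetic daily syndromic counts
      counts[45:50] += 25                          # injected "outbreak"

      window = 14
      alerts = []
      for day in range(window, len(counts)):
          baseline = counts[day - window:day]
          threshold = baseline.mean() + 3 * baseline.std(ddof=1)
          if counts[day] > threshold:
              alerts.append(day)
      print("alert days:", alerts)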

  20. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.

  1. The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods

    NASA Astrophysics Data System (ADS)

    Crootof, A.; Albrecht, T.; Scott, C. A.

    2017-12-01

    The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence - or gaps and inconsistent use - of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed (a) use of specific and reproducible methods for nexus assessment is uncommon - less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food - the very concept they purport to address; (c) assessments strongly favor quantitative approaches - 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos - only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper finds that, to make the WEF nexus effective as a policy-relevant analytical tool, methods are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and policy-makers.

  2. A multi-purpose tool for food inspection: Simultaneous determination of various classes of preservatives and biogenic amines in meat and fish products by LC-MS.

    PubMed

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; De Dea Lindner, Juliano

    2018-02-01

    This paper describes an innovative, fast, multipurpose method for the chemical inspection of meat and fish products by liquid chromatography-tandem mass spectrometry. Solid-liquid extraction and low-temperature partitioning were applied to 17 analytes, which included large bacteriocins (3.5 kDa) and small molecules (organic acids, heterocyclic compounds, polyene macrolides, alkyl esters of p-hydroxybenzoic acid, and aromatic and aliphatic biogenic amines and polyamines). Chromatographic separation was achieved in 10 min, using a stationary phase of di-isopropyl-3-aminopropyl silane bound to hydroxylated silica. Method validation was in accordance with Commission Decision 2002/657/EC. Linear ranges were 1.25-10.0 mg kg⁻¹ (natamycin and parabens), 2.50-10.0 mg kg⁻¹ (sorbate and nisin), 25.0-200 mg kg⁻¹ (biogenic amines, hexamethylenetetramine, benzoic and lactic acids), and 50.0-400 mg kg⁻¹ (citric acid). Expanded measurement uncertainty (U) was estimated by single-laboratory validation combined with modeling in two calculation approaches: internal (U = 5%) and external standardization (U = 24%). Method applicability was checked on 89 real samples among raw, cooked, dry-fermented, and cured products, yielding acceptable recoveries. Many regulatory issues were revealed, corroborating the need to enhance current analytical methods. This simple method dispenses with additional extraction procedures and therefore reduces costs over time. It is suitable for routine analysis as a screening or confirmatory tool for both qualitative and quantitative results, replacing many time-consuming analytical procedures. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
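
    As a rough illustration of the coupled, multi-domain idea described above, the sketch below samples a maturity score, feeds it into a performance distribution, and carries both into a profitability estimate via Monte Carlo simulation. All distributions, parameter names, and numbers are invented for illustration; they are not the model set or data from this work.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
N = 10_000  # Monte Carlo draws

def sample_maturity():
    # Illustrative: technology readiness expressed as a 0-1 maturity score.
    return rng.beta(a=4, b=2, size=N)

def sample_performance(maturity):
    # Capture efficiency (fraction of CO2 removed); higher maturity shifts the mean up.
    mean = 0.80 + 0.10 * maturity
    return np.clip(rng.normal(loc=mean, scale=0.03), 0.0, 1.0)

def sample_profitability(performance):
    # Net present value in arbitrary units: revenue scales with capture performance,
    # while cost is uncertain and sampled independently.
    revenue = 100.0 * performance
    cost = rng.normal(loc=70.0, scale=10.0, size=N)
    return revenue - cost

maturity = sample_maturity()
performance = sample_performance(maturity)
npv = sample_profitability(performance)

# Risk summaries a comparison could report for each candidate technology.
print(f"P(NPV < 0)       = {np.mean(npv < 0):.3f}")
print(f"median NPV       = {np.median(npv):.1f}")
print(f"5th pct capture  = {np.percentile(performance, 5):.3f}")
```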

  4. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as "big data" tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, and Solr, with more emerging by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster ("Cluster A") was set up as a collaboration between GMP and Livermore Computing (LC).
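
    The platforms named above are general-purpose; as a minimal, hypothetical illustration of the kind of archive-scale summarization they enable, the PySpark sketch below aggregates waveform metadata by station. The file path and column names are assumptions, not part of the GMP archive.

```python
# Minimal PySpark sketch: aggregate waveform metadata stored as CSV across
# a large archive. Path and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("seismic-archive-summary").getOrCreate()

events = spark.read.csv("hdfs:///archive/waveform_metadata/*.csv",
                        header=True, inferSchema=True)

summary = (events
           .groupBy("station")
           .agg(F.count("*").alias("n_records"),
                F.avg("sample_rate").alias("mean_sample_rate")))

summary.show(20)
spark.stop()
```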

  5. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    PubMed

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. This test was first introduced in 2009 by Monajemi et al in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  6. HPTLC in Herbal Drug Quantification

    NASA Astrophysics Data System (ADS)

    Shinde, Devanand B.; Chavan, Machindra J.; Wakte, Pravin S.

    For the past few decades, compounds from natural sources have been gaining importance because of the vast chemical diversity they offer. This has led to a phenomenal increase in the demand for herbal medicines in the last two decades, and a need has been felt for ensuring the quality, safety, and efficacy of herbal drugs. Phytochemical evaluation is one of the tools for quality assessment, which includes preliminary phytochemical screening, chemoprofiling, and marker compound analysis using modern analytical techniques. High-performance thin-layer chromatography (HPTLC) has emerged as an important tool for the qualitative, semiquantitative, and quantitative phytochemical analysis of herbal drugs and formulations. This includes developing TLC fingerprinting profiles and estimation of biomarkers. This review attempts to focus on the theoretical considerations of HPTLC and some examples of herbal drugs and formulations analyzed by HPTLC.

  7. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
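
    A minimal sketch of the kind of statistical simulation such an analysis technique implies is shown below: daily insolation and load are drawn from assumed distributions, device ratings and efficiencies are fixed, and two dispatch strategies are compared by loss-of-load frequency. The distributions, ratings, and the simple load-deferral strategy are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
DAYS = 10_000

# Statistical inputs (illustrative distributions, not measured data).
insolation = rng.gamma(shape=5.0, scale=1.0, size=DAYS)        # kWh/m^2/day
load = rng.normal(loc=12.0, scale=2.0, size=DAYS).clip(min=0)  # kWh/day

# Device characteristics (assumed ratings and efficiencies).
ARRAY_AREA = 10.0    # m^2
ARRAY_EFF = 0.15
BATTERY_CAP = 20.0   # kWh
BATTERY_EFF = 0.85

def simulate(deferrable_fraction):
    """Return loss-of-load probability for a simple dispatch strategy
    that defers a fraction of the daily load when the battery is low."""
    soc = BATTERY_CAP / 2
    shortfalls = 0
    for pv, demand in zip(insolation * ARRAY_AREA * ARRAY_EFF, load):
        if soc < 0.3 * BATTERY_CAP:
            demand *= (1.0 - deferrable_fraction)  # energy-management action
        net = pv - demand
        if net >= 0:
            soc = min(BATTERY_CAP, soc + net * BATTERY_EFF)
        else:
            draw = -net / BATTERY_EFF
            if draw > soc:
                shortfalls += 1
            soc = max(0.0, soc - draw)
    return shortfalls / DAYS

print("no management :", simulate(0.0))
print("20% deferrable:", simulate(0.2))
```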

  8. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  9. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    support of the fleet. The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd...inventory level of 1K Polysiloxane in support of the fleet. The research encompasses both qualitative and quantitative analytical tools utilizing...Chapter I. C. CONCLUSIONS As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane

  10. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14...by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51... SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology. Any opinions, findings and

  11. Visualization and Analytics Software Tools for Peregrine System |

    Science.gov Websites

    Learn about the visualization and analytics software tools available for the Peregrine system. R is a language and environment for statistical computing and graphics; go to the R web site for more information. FastX supports visualization for OpenGL-based applications; for more information, please go to the FastX page. ParaView is an open-source data analysis and visualization application.

  12. Dynamic Vision for Control

    DTIC Science & Technology

    2006-07-27

    The goal of this project was to develop analytical and computational tools to make vision a viable sensor for ... We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry

  13. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Study Team Working Paper 3: Research Methods Discussion for the Study Team. Methods: Generating Empirical Materials. In grounded theory ... research I have conducted using these methods. UNCLASSIFIED. Analytical Tools for the Application of Operational Culture: A Case Study in the...Survey and a Case Study," Kjeller, Norway: FFI. Glaser, B. G. & Strauss, A. L. (1967). "The discovery of grounded theory

  14. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  15. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  16. Turbomachinery

    NASA Technical Reports Server (NTRS)

    Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.

    1987-01-01

    The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how the time and time-averages are treated, as well as on how the space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful and their numbers are growing at a staggering pace. As a result of this power, a case can be made that a situation exists where information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working on a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enables one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.

  17. Development of a robust space power system decision model

    NASA Astrophysics Data System (ADS)

    Chew, Gilbert; Pelaccio, Dennis G.; Jacobs, Mark; Stancati, Michael; Cataldo, Robert

    2001-02-01

    NASA continues to evaluate power systems to support human exploration of the Moon and Mars. The system(s) would address all power needs of surface bases and on-board power for space transfer vehicles. Prior studies have examined both solar and nuclear-based alternatives with respect to individual issues such as sizing or cost. What has not been addressed is a comprehensive look at the risks and benefits of the options that could serve as the analytical framework to support a system choice that best serves the needs of the exploration program. This paper describes the SAIC-developed Space Power System Decision Model, which applies a formal Two-step Analytical Hierarchy Process (TAHP) methodology to clearly distinguish candidate power systems in terms of benefits, safety, and risk. TAHP is a decision-making process based on the Analytical Hierarchy Process, which employs a hierarchic approach of structuring decision factors by weights and relatively ranks system design options on a consistent basis. This decision process also includes a level of data gathering and organization that produces a consistent, well-documented assessment, from which the capability of each power system option to meet top-level goals can be prioritized. The model defined in this effort focuses on the comparative assessment of candidate power system options for Mars surface application(s). This paper describes the principles of this approach, the assessment criteria and weighting procedures, and the tools to capture and assess the expert knowledge associated with space power system evaluation.
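
    The core AHP step that TAHP builds on, deriving criterion weights from a pairwise comparison matrix via its principal eigenvector and checking consistency, can be sketched as follows. The comparison values are placeholders, not those used in the SAIC model.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three criteria
# (benefits, safety, risk); entries use Saaty's 1-9 scale and are
# assumptions for this sketch, not values from the cited model.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check (Saaty): CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # random consistency index
CR = CI / RI

print("criterion weights:", np.round(weights, 3))
print("consistency ratio:", round(CR, 3))  # < 0.10 is conventionally acceptable
```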

  18. Approximate solutions for diffusive fracture-matrix transfer: Application to storage of dissolved CO 2 in fractured rocks

    DOE PAGES

    Zhou, Quanlin; Oldenburg, Curtis M.; Spangler, Lee H.; ...

    2017-01-05

    Analytical solutions with infinite exponential series are available to calculate the rate of diffusive transfer between low-permeability blocks and high-permeability zones in the subsurface. Truncation of these series is often employed by neglecting the early-time regime. Here we present unified-form approximate solutions in which the early-time and the late-time solutions are continuous at a switchover time. The early-time solutions are based on three-term polynomial functions in terms of the square root of dimensionless time, with the first coefficient dependent only on the dimensionless area-to-volume ratio. The last two coefficients are either determined analytically for isotropic blocks (e.g., spheres and slabs) or obtained by fitting the exact solutions, and they depend solely on the aspect ratios for rectangular columns and parallelepipeds. For the late-time solutions, only the leading exponential term is needed for isotropic blocks, while a few additional exponential terms are needed for highly anisotropic rectangular blocks. The optimal switchover time is between 0.157 and 0.229, with the highest relative approximation error less than 0.2%. The solutions are used to demonstrate the storage of dissolved CO2 in fractured reservoirs with low-permeability matrix blocks of single and multiple shapes and sizes. These approximate solutions are building blocks for the development of analytical and numerical tools for hydraulic, solute, and thermal diffusion processes in low-permeability matrix blocks.
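
    For a single isotropic block (a sphere), the classic series solution and its standard early-time and late-time approximations (as given, for example, in Crank's treatment of diffusion) illustrate the unified-form idea; the sketch below switches branches at a nominal dimensionless time of 0.2, within the optimal range reported above. The coefficients here are the textbook sphere values, not the fitted coefficients of this paper.

```python
import numpy as np

def uptake_exact(tau, n_terms=200):
    """Fractional diffusive uptake of a sphere, exact series (Crank)."""
    n = np.arange(1, n_terms + 1)
    return 1.0 - (6.0 / np.pi**2) * np.sum(np.exp(-n**2 * np.pi**2 * tau) / n**2)

def uptake_early(tau):
    """Short-time approximation: polynomial in sqrt(tau)."""
    return 6.0 * np.sqrt(tau / np.pi) - 3.0 * tau

def uptake_late(tau):
    """Long-time approximation: leading exponential term only."""
    return 1.0 - (6.0 / np.pi**2) * np.exp(-np.pi**2 * tau)

def uptake_unified(tau, tau_switch=0.2):
    """Unified form: early-time branch below the switchover time, late-time above."""
    return uptake_early(tau) if tau < tau_switch else uptake_late(tau)

for tau in (0.01, 0.1, 0.2, 0.5):
    exact = uptake_exact(tau)
    approx = uptake_unified(tau)
    print(f"tau={tau:4}: exact={exact:.4f}  unified={approx:.4f}  "
          f"rel.err={abs(approx - exact) / exact:.2%}")
```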

  19. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs had. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  20. Toxicological importance of human biomonitoring of metallic and metalloid elements in different biological samples.

    PubMed

    Gil, F; Hernández, A F

    2015-06-01

    Human biomonitoring has become an important tool for the assessment of internal doses of metallic and metalloid elements. These elements are of great significance because of their toxic properties and wide distribution in environmental compartments. Although blood and urine are the most used and accepted matrices for human biomonitoring, other non-conventional samples (saliva, placenta, meconium, hair, nails, teeth, breast milk) may have practical advantages and would provide additional information on health risk. Nevertheless, the analysis of these compounds in biological matrices other than blood and urine has not yet been accepted as a useful tool for biomonitoring. The validation of analytical procedures is absolutely necessary for a proper implementation of non-conventional samples in biomonitoring programs. However, the lack of reliable and useful analytical methodologies to assess exposure to metallic elements, and the potential interference of external contamination and variation in biological features of non-conventional samples are important limitations for setting health-based reference values. The influence of potential confounding factors on metallic concentration should always be considered. More research is needed to ascertain whether or not non-conventional matrices offer definitive advantages over the traditional samples and to broaden the available database for establishing worldwide accepted reference values in non-exposed populations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  2. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors necessitates efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods: isarithm, variogram, and triangular prism, along with the spatial autocorrelation measurement methods Moran's I and Geary's C, that have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Similar to the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
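
    As an illustration of one of the spatial autocorrelation measures evaluated here, the sketch below computes Moran's I for a raster using rook contiguity weights; the grid construction and test surfaces are invented for illustration and are not ICAMS code.

```python
import numpy as np

def morans_i(raster):
    """Moran's I for a 2-D array using rook (4-neighbour) contiguity weights."""
    z = raster - raster.mean()
    denom = np.sum(z**2)

    # Cross-products with right and down neighbours; each undirected pair is
    # counted once here, so both the numerator and S0 are doubled below.
    cross = np.sum(z[:, :-1] * z[:, 1:]) + np.sum(z[:-1, :] * z[1:, :])
    n_pairs = z[:, :-1].size + z[:-1, :].size

    n = raster.size
    s0 = 2 * n_pairs
    return (n / s0) * (2 * cross) / denom

rng = np.random.default_rng(1)
noise = rng.normal(size=(64, 64))                                  # spatially random surface
trend = np.add.outer(np.arange(64), np.arange(64)).astype(float)   # smooth surface

print("Moran's I, random surface:", round(morans_i(noise), 3))   # near 0
print("Moran's I, smooth surface:", round(morans_i(trend), 3))   # near +1
```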

  3. Characterising infant inter-breath interval patterns during active and quiet sleep using recurrence plot analysis.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M

    2009-01-01

    Breathing patterns are characteristically different between active and quiet sleep states in infants. It has previously been identified that breathing dynamics are governed by a nonlinear controller, which implies the need for a nonlinear analytical tool. Further, it has been shown that quantified nonlinear variables differ between adult sleep states. This study aims to determine whether a nonlinear analytical tool known as recurrence plot analysis can characterize breath intervals of active and quiet sleep states in infants. Overnight polysomnograms were obtained from 32 healthy infants. The 6 longest periods each of active and quiet sleep were identified and a software routine extracted inter-breath interval data for recurrence plot analysis. Determinism (DET), laminarity (LAM) and radius (RAD) values were calculated for embedding dimensions of 4, 6, 8 and 16, and fixed recurrence rates of 0.5, 1, 2, 3.5 and 5%. Recurrence plots exhibited characteristically different patterns for active and quiet sleep. Active sleep periods typically had higher values of RAD, DET and LAM than quiet sleep, and this trend was invariant to the specific choice of embedding dimension or fixed recurrence. These differences may provide a basis for automated sleep state classification and the quantitative investigation of pathological breathing patterns.
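
    A minimal sketch of the recurrence-plot quantities mentioned above (a radius chosen to give a fixed recurrence rate, and determinism computed from diagonal-line structures) is given below; the embedding choices and the synthetic inter-breath series are assumptions for illustration, not the study's data or software.

```python
import numpy as np

def recurrence_matrix(x, dim=4, delay=1, rec_rate=0.02):
    """Embed a 1-D series and threshold pairwise distances so that the
    recurrence rate (fraction of recurrent points) is about rec_rate."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    radius = np.quantile(d, rec_rate)     # RAD for the requested fixed recurrence
    return d <= radius, radius

def determinism(R, l_min=2):
    """DET: fraction of recurrent points that form diagonal lines >= l_min."""
    n = R.shape[0]
    in_lines = 0
    recurrent = R.sum() - n               # exclude the trivial main diagonal
    for k in range(1, n):                 # upper off-diagonals (matrix is symmetric)
        run = 0
        for v in list(np.diag(R, k)) + [False]:
            if v:
                run += 1
            else:
                if run >= l_min:
                    in_lines += run       # points in diagonal lines of length >= l_min
                run = 0
    return 2 * in_lines / recurrent if recurrent else 0.0

# Illustrative inter-breath interval series (synthetic, not infant data).
rng = np.random.default_rng(3)
ibi = 0.5 + 0.05 * np.sin(np.arange(300) * 0.3) + 0.02 * rng.normal(size=300)

R, rad = recurrence_matrix(ibi, dim=4, rec_rate=0.02)
print(f"radius (RAD) = {rad:.4f},  DET = {determinism(R):.3f}")
```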

  4. Advances on a Decision Analytic Approach to Exposure-Based Chemical Prioritization.

    PubMed

    Wood, Matthew D; Plourde, Kenton; Larkin, Sabrina; Egeghy, Peter P; Williams, Antony J; Zemba, Valerie; Linkov, Igor; Vallero, Daniel A

    2018-05-11

    The volume and variety of manufactured chemicals is increasing, although little is known about the risks associated with the frequency and extent of human exposure to most chemicals. The EPA and the recent signing of the Lautenberg Act have both signaled the need for high-throughput methods to characterize and screen chemicals based on exposure potential, such that more comprehensive toxicity research can be informed. Prior work of Mitchell et al. using multicriteria decision analysis tools to prioritize chemicals for further research is enhanced here, resulting in a high-level chemical prioritization tool for risk-based screening. Reliable exposure information is a key gap in currently available engineering analytics to support predictive environmental and health risk assessments. An elicitation with 32 experts informed relative prioritization of risks from chemical properties and human use factors, and the values for each chemical associated with each metric were approximated with data from EPA's CP_CAT database. Three different versions of the model were evaluated using distinct weight profiles, resulting in three different ranked chemical prioritizations with only a small degree of variation across weight profiles. Future work will aim to include greater input from human factors experts and better define qualitative metrics. © 2018 Society for Risk Analysis.
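
    The sensitivity of a ranking to different weight profiles can be illustrated with a simple weighted-sum sketch; the chemicals, metrics, scores, and weight profiles below are placeholders, not values elicited from the experts or drawn from CP_CAT.

```python
import numpy as np

# Illustrative decision matrix: rows are chemicals, columns are metrics
# (already normalised to 0-1; larger values mean higher exposure concern).
chemicals = ["chem_A", "chem_B", "chem_C", "chem_D"]
scores = np.array([[0.8, 0.2, 0.5],
                   [0.4, 0.9, 0.3],
                   [0.6, 0.6, 0.7],
                   [0.1, 0.3, 0.9]])

# Three hypothetical weight profiles over the metrics.
weight_profiles = {
    "property-weighted": np.array([0.6, 0.2, 0.2]),
    "use-weighted":      np.array([0.2, 0.6, 0.2]),
    "balanced":          np.array([1/3, 1/3, 1/3]),
}

for name, w in weight_profiles.items():
    ranking = sorted(zip(chemicals, scores @ w), key=lambda p: -p[1])
    print(name, [f"{c}:{s:.2f}" for c, s in ranking])
```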

  5. "EMERGING" POLLUTANTS, MASS SPECTROMETRY, AND ...

    EPA Pesticide Factsheets

    A foundation for Environmental Science - Mass Spectrometry: Historically fundamental to amassing our understanding of environmental processes and chemical pollution is the realm of mass spectrometry - the mainstay of analytical chemistry - the workhorse that supplies much of the definitive data that environmental scientists rely upon for identifying the molecular compositions (and ultimately the structures) of chemicals. This is not to ignore the complementary, critical roles played by the adjunct practices of sample enrichment (via any of various means of selective extraction) and analyte separation (via the myriad forms of chromatography and electrophoresis). While the power of mass spectrometry has long been highly visible to the practicing environmental chemist, it borders on continued obscurity to the lay public and most non-chemists. Even though mass spectrometry has played a long, historic (and largely invisible) role in establishing or undergirding our existing knowledge about environmental processes and pollution, what recognition it does enjoy is usually relegated to that of a tool. It is usually the relevance or significance of the knowledge acquired from the application of the tool that has ultimate meaning to the public and science at large - not how the knowledge was acquired. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in

  6. GWAS in a Box: Statistical and Visual Analytics of Structured Associations via GenAMap

    PubMed Central

    Xing, Eric P.; Curtis, Ross E.; Schoenherr, Georg; Lee, Seunghak; Yin, Junming; Puniyani, Kriti; Wu, Wei; Kinnaird, Peter

    2014-01-01

    With the continuous improvement in genotyping and molecular phenotyping technology and the decreasing typing cost, it is expected that in a few years, more and more clinical studies of complex diseases will recruit thousands of individuals for pan-omic genetic association analyses. Hence, there is a great need for algorithms and software tools that could scale up to the whole omic level, integrate different omic data, leverage rich structure information, and be easily accessible to non-technical users. We present GenAMap, an interactive analytics software platform that 1) automates the execution of principled machine learning methods that detect genome- and phenome-wide associations among genotypes, gene expression data, and clinical or other macroscopic traits, and 2) provides new visualization tools specifically designed to aid in the exploration of association mapping results. Algorithmically, GenAMap is based on a new paradigm for GWAS and PheWAS analysis, termed structured association mapping, which leverages various structures in the omic data. We demonstrate the function of GenAMap via a case study of the Brem and Kruglyak yeast dataset, and then apply it on a comprehensive eQTL analysis of the NIH heterogeneous stock mice dataset and report some interesting findings. GenAMap is available from http://sailing.cs.cmu.edu/genamap. PMID:24905018

  7. Current Research in Aircraft Tire Design and Performance

    NASA Technical Reports Server (NTRS)

    Tanner, J. A.; Mccarthy, J. L.; Clark, S. K.

    1981-01-01

    A review of the tire research programs which address the various needs identified by landing gear designers and airplane users is presented. The experimental programs are designed to increase tire tread lifetimes, relate static and dynamic tire properties, establish the tire hydroplaning spin up speed, study gear response to tire failures, and define tire temperature profiles during taxi, braking, and cornering operations. The analytical programs are aimed at providing insights into the mechanisms of heat generation in rolling tires and developing the tools necessary to streamline the tire design process and to aid in the analysis of landing gear problems.

  8. The USAID-NREL Partnership: Delivering Clean, Reliable, and Affordable Power in the Developing World

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Andrea C; Leisch, Jennifer E

    The U.S. Agency for International Development (USAID) and the National Renewable Energy Laboratory (NREL) are partnering to support clean, reliable, and affordable power in the developing world. The USAID-NREL Partnership helps countries with policy, planning, and deployment support for advanced energy technologies. Through this collaboration, USAID is accessing advanced energy expertise and analysis pioneered by the U.S. National Laboratory system. The Partnership addresses critical aspects of advanced energy systems including renewable energy deployment, grid modernization, distributed energy resources and storage, power sector resilience, and the data and analytical tools needed to support them.

  9. Measuring the costs and benefits of conservation programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Einhorn, M.A.

    1985-07-25

    A step-by-step analysis of the effects of utility-sponsored conservation-promoting programs begins by identifying several factors which will reduce a program's effectiveness. The framework for measuring cost savings and designing a conservation program needs to consider the size of appliance subsidies, what form incentives should take, and how customer behavior will change as a result of incentives. Continual reevaluation is necessary to determine whether to change the size of rebates or whether to continue the program. Analytical tools for making these determinations are improving as conceptual breakthroughs in econometrics permit more rigorous analysis. 5 figures.

  10. Topochemical Analysis of Cell Wall Components by TOF-SIMS.

    PubMed

    Aoki, Dan; Fukushima, Kazuhiko

    2017-01-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is a developing analytical tool and a type of imaging mass spectrometry. TOF-SIMS provides mass spectral information with a lateral resolution on the order of submicrons and has widespread applicability. It is sometimes described as a surface analysis method that requires no sample pretreatment; however, several points need to be taken into account to make full use of the capabilities of TOF-SIMS. In this chapter, we introduce methods for TOF-SIMS sample treatment, as well as basic knowledge for the analysis of TOF-SIMS spectral and image data from wood samples.

  11. Evaluation Criteria for Micro-CAI: A Psychometric Approach

    PubMed Central

    Wallace, Douglas; Slichter, Mark; Bolwell, Christine

    1985-01-01

    The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
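
    As an example of the standard item reliability analysis mentioned above, the sketch below computes Cronbach's alpha for a small, invented block of Likert-scale ratings; the data and subscale layout are illustrative only.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative ratings: 8 students x 4 subscale items (1-5 Likert scores).
ratings = np.array([[4, 5, 4, 4],
                    [3, 3, 4, 3],
                    [5, 5, 5, 4],
                    [2, 2, 3, 2],
                    [4, 4, 4, 5],
                    [3, 4, 3, 3],
                    [5, 4, 5, 5],
                    [2, 3, 2, 2]])

print("alpha =", round(cronbach_alpha(ratings), 3))
```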

  12. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    ... reduce paperwork, and improve customer service. Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability

  13. A smarter way to search, share and utilize open-spatial online data for energy R&D - Custom machine learning and GIS tools in U.S. DOE's virtual data library & laboratory, EDX

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J.; Baker, D.; Barkhurst, A.; Bean, A.; DiGiulio, J.; Jones, K.; Jones, T.; Justman, D.; Miller, R., III; Romeo, L.; Sabbatino, M.; Tong, A.

    2017-12-01

    As spatial datasets are increasingly accessible through open, online systems, the opportunity to use these resources to address a range of Earth system questions grows. Simultaneously, there is a need for better infrastructure and tools to find and utilize these resources. We will present examples of advanced online computing capabilities, hosted in the U.S. DOE's Energy Data eXchange (EDX), that address these needs for earth-energy research and development. In one study the computing team developed a custom, machine learning, big data computing tool designed to parse the web and return priority datasets to appropriate servers to develop an open-source global oil and gas infrastructure database. The results of this spatial smart search approach were validated against expert-driven, manual search results, which required a team of seven spatial scientists three months to produce. The custom machine learning tool parsed online, open systems, including zip files, ftp sites and other web-hosted resources, in a matter of days. The resulting resources were integrated into a geodatabase now hosted for open access via EDX. Beyond identifying and accessing authoritative, open spatial data resources, there is also a need for more efficient tools to ingest, perform, and visualize multi-variate, spatial data analyses. Within the EDX framework, there is a growing suite of processing, analytical and visualization capabilities that allow multi-user teams to work more efficiently in private, virtual workspaces. An example of these capabilities is a set of 5 custom spatio-temporal models and data tools that form NETL's Offshore Risk Modeling suite, which can be used to quantify oil spill risks and impacts. Coupling the data and advanced functions from EDX with these advanced spatio-temporal models has culminated in an integrated web-based decision-support tool. This platform has capabilities to identify and combine data across scales and disciplines, evaluate potential environmental, social, and economic impacts, highlight knowledge or technology gaps, and reduce uncertainty for a range of 'what if' scenarios relevant to oil spill prevention efforts. These examples illustrate EDX's growing capabilities for advanced spatial data search and analysis to support geo-data science needs.

  14. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software packages fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, climate change and biodiversity will be discussed in detail.
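
    A minimal sketch of the cube-style operators listed above (sub-setting, array aggregation, concatenation, NetCDF export) using generic Python tooling is shown below; the dimension names, sizes, and file name are assumptions, and the code does not represent Ophidia's storage model or operator API.

```python
import numpy as np
import xarray as xr

rng = np.random.default_rng(0)
cube = xr.DataArray(
    rng.normal(15.0, 5.0, size=(24, 18, 36)),
    dims=("time", "lat", "lon"),
    coords={"time": np.arange(24),
            "lat": np.linspace(-85, 85, 18),
            "lon": np.linspace(-175, 175, 36)},
    name="temperature",
)

# (i) sub-setting: slice a latitude band and the first twelve time steps
subset = cube.sel(lat=slice(-30, 30)).isel(time=slice(0, 12))

# (ii) array-based aggregation: avg / max / min along the time axis
climatology = subset.mean(dim="time")
extremes = xr.concat([subset.max(dim="time"), subset.min(dim="time")],
                     dim="stat")

# (iii) export: write the reduced cube out as NetCDF
climatology.to_netcdf("climatology_subset.nc")
print(climatology.shape, extremes.shape)
```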

  15. Use of application containers and workflows for genomic data analysis.

    PubMed

    Schulz, Wade L; Durant, Thomas J S; Siddon, Alexa J; Torres, Richard

    2016-01-01

    The rapid acquisition of biological data and development of computationally intensive analyses has led to a need for novel approaches to software deployment. In particular, the complexity of common analytic tools for genomics makes them difficult to deploy and decreases the reproducibility of computational experiments. Recent technologies that allow for application virtualization, such as Docker, allow developers and bioinformaticians to isolate these applications and deploy secure, scalable platforms that have the potential to dramatically increase the efficiency of big data processing. While limitations exist, this study demonstrates a successful implementation of a pipeline with several discrete software applications for the analysis of next-generation sequencing (NGS) data. With this approach, we significantly reduced the amount of time needed to perform clonal analysis from NGS data in acute myeloid leukemia.
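
    A minimal sketch of the container-per-step pattern described above, chaining isolated tools through a shared mounted volume, is shown below; the image names, tags, and command lines are placeholders rather than the pipeline used in the study.

```python
import subprocess
from pathlib import Path

DATA = Path("/data/run01").resolve()

# Each step runs in its own container image; tools never need to be
# installed on the host, and files are exchanged via the mounted volume.
STEPS = [
    ("alignment",      "biocontainers/bwa:v0.7.17",
     "bwa mem ref.fa sample.fastq > sample.sam"),
    ("sort-and-index", "biocontainers/samtools:v1.9",
     "samtools sort sample.sam -o sample.bam && samtools index sample.bam"),
    ("variant-calls",  "example/variant-caller:latest",
     "call-variants sample.bam > sample.vcf"),
]

for name, image, cmd in STEPS:
    print(f"[{name}] running in {image}")
    subprocess.run(
        ["docker", "run", "--rm",
         "-v", f"{DATA}:/work", "-w", "/work",
         image, "sh", "-c", cmd],
        check=True,
    )
```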

  16. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups - metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires that were sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid phase microextraction- and liquid phase microextraction-based procedures high, while liquid-liquid extraction-, solid phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools - NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. By contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
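
    The PROMETHEE II outranking computation itself (pairwise preference degrees aggregated by criterion weights into positive, negative, and net flows) can be sketched as follows; the procedures, criteria, scores, and weights are invented placeholders, not the twenty-five aldrin procedures or the experts' weightings.

```python
import numpy as np

# Illustrative evaluation table: rows = candidate analytical procedures,
# columns = criteria. Signs indicate direction: +1 maximize, -1 minimize.
procedures = ["SPME-GC", "LLE-GC", "SPE-LC", "LPME-GC"]
criteria_sign = np.array([+1, -1, -1, -1])     # recovery, LOD, solvent use, waste
X = np.array([[95.0, 0.02, 0.1, 0.5],
              [88.0, 0.01, 50.0, 60.0],
              [90.0, 0.05, 10.0, 15.0],
              [93.0, 0.03, 0.5, 1.0]])
w = np.array([0.3, 0.2, 0.3, 0.2])             # one hypothetical weighting scenario

def promethee_net_flows(X, w, sign):
    n = X.shape[0]
    pi = np.zeros((n, n))                      # aggregated preference degrees
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = sign * (X[a] - X[b])           # positive d means "a preferred over b"
            pref = (d > 0).astype(float)       # "usual" preference function
            pi[a, b] = np.dot(w, pref)
    phi_plus = pi.sum(axis=1) / (n - 1)        # positive (leaving) flow
    phi_minus = pi.sum(axis=0) / (n - 1)       # negative (entering) flow
    return phi_plus - phi_minus                # PROMETHEE II net flow

for name, phi in sorted(zip(procedures, promethee_net_flows(X, w, criteria_sign)),
                        key=lambda p: -p[1]):
    print(f"{name:8s}  net flow = {phi:+.3f}")
```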

  17. Fractals and Spatial Methods for Mining Remote Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina; Emerson, Charles; Quattrochi, Dale

    2003-01-01

    The rapid increase in digital remote sensing and GIS data raises a critical problem -- how can such an enormous amount of data be handled and analyzed so that useful information can be derived quickly? Efficient handling and analysis of large spatial data sets is central to environmental research, particularly in global change studies that employ time series. Advances in large-scale environmental monitoring and modeling require not only high-quality data, but also reliable tools to analyze the various types of data. A major difficulty facing geographers and environmental scientists in environmental assessment and monitoring is that spatial analytical tools are not easily accessible. Although many spatial techniques have been described recently in the literature, they are typically presented in an analytical form and are difficult to transform to a numerical algorithm. Moreover, these spatial techniques are not necessarily designed for remote sensing and GIS applications, and research must be conducted to examine their applicability and effectiveness in different types of environmental applications. This poses a chicken-and-egg problem: on one hand we need more research to examine the usability of the newer techniques and tools, yet on the other hand, this type of research is difficult to conduct if the tools to be explored are not accessible. Another problem that is fundamental to environmental research is the set of issues related to spatial scale. The scale issue is especially acute in the context of global change studies because of the need to integrate remote-sensing and other spatial data that are collected at different scales and resolutions. Extrapolation of results across broad spatial scales remains the most difficult problem in global environmental research. There is a need for basic characterization of the effects of scale on image data, and the techniques used to measure these effects must be developed and implemented to allow for a multiple scale assessment of the data before any useful process-oriented modeling involving scale-dependent data can be conducted. Through the support of research grants from NASA, we have developed a software module called ICAMS (Image Characterization And Modeling System) to address the need to develop innovative spatial techniques and make them available to the broader scientific communities. ICAMS provides new spatial techniques, such as fractal analysis, geostatistical functions, and multiscale analysis, that are not easily available in commercial GIS/image processing software. By bundling newer spatial methods in a user-friendly software module, researchers can begin to test and experiment with the new spatial analysis methods, and they can gauge scale effects using a variety of remote sensing imagery. In the following, we briefly describe the development of ICAMS and present application examples.

  18. Rotary Motors Actuated by Traveling Ultrasonic Flexural Waves

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Bao, Xiaoqi; Grandia, Willem

    1999-01-01

    Efficient miniature actuators that are compact and consume low power are needed to drive space and planetary mechanisms in future NASA missions. Ultrasonic rotary motors have the potential to meet this NASA need and are being developed as actuators for miniature telerobotic applications. These motors have emerged in commercial products but need to be adapted for operation in the harsh space environments that include cryogenic temperatures and vacuum, and their design also requires effective analytical tools. A finite element analytical model was developed to examine the excitation of flexural plate waves traveling in a piezoelectrically actuated rotary motor. The model uses 3D finite element and equivalent circuit models that are applied to predict the excitation frequency and modal response of the stator. This model incorporates the details of the stator including the teeth, piezoelectric ceramic, geometry, bonding layer, etc. The theoretical predictions were corroborated experimentally for the stator. In parallel, efforts have been made to determine the thermal and vacuum performance of these motors. Experiments have shown that the motor can sustain at least 230 temperature cycles from 0 C to -90 C at 7 Torr pressure without significant performance change. Also, in an earlier study the motor lasted over 334 hours at -150 C and vacuum. To explore telerobotic applications for USMs, a robotic arm was constructed with such motors.

  19. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    three possibilities: AKR, B6, and BALB_B) and MUP Protein (containing two possibilities: Intact and Denatured), then you can view a plot of the Strain...the tags for the last two labels. Again, if the attribute Strain has three tags: AKR, B6, ... Distribution A. Approved for public release...AFRL-RH-WP-TR-2014-0131 A COMPREHENSIVE TOOL AND ANALYTICAL PATHWAY FOR DIFFERENTIAL MOLECULAR PROFILING AND BIOMARKER DISCOVERY

  20. GPs’ views concerning spirituality and the use of the FICA tool in palliative care in Flanders: a qualitative study

    PubMed Central

    Vermandere, Mieke; Choi, Yoo-Na; De Brabandere, Heleen; Decouttere, Ruth; De Meyere, Evelien; Gheysens, Elien; Nickmans, Brecht; Schoutteten, Melanie; Seghers, Lynn; Truijens, Joachim; Vandenberghe, Stien; Van de Wiele, Sofie; Van Oevelen, Laure-Anne; Aertgeerts, Bert

    2012-01-01

    Background According to recent recommendations, healthcare professionals in palliative care should be able to perform a spiritual history-taking. Previous findings suggest that the FICA tool is feasible for the clinical assessment of spirituality. However, little is known about the views of GPs on the use of this tool. Aim To provide a solid overview of the views of Flemish GPs concerning spirituality and the use of the FICA tool for spiritual history-taking in palliative care. Design and setting Qualitative interview study in Flanders, Belgium. Method Twenty-three GPs participated in a semi-structured interview. The interviews were analysed by thematic analysis, which includes line-by-line coding and the generation of descriptive and analytical themes. Results The interviewees stated that they would keep in mind the questions of the FICA tool while having a spiritual conversation, but not use them as a checklist. The content of the tool was generally appreciated as relevant, however, many GPs found the tool too structured and prescribed, and that it limited their spontaneity. They suggested rephrasing the questions into spoken language. The perceived barriers during spiritual conversations included feelings of discomfort and fear, and the lack of time and specific training. Factors that facilitated spiritual conversations included the patients’ acceptance of their diagnosis, a trusting relationship, and respect for the patients’ beliefs. Conclusion A palliative care process with attention focused on the patient’s spirituality was generally perceived as a tough but rewarding experience. The study concludes that the FICA tool could be a feasible instrument for the clinical assessment of spirituality, provided that certain substantive and linguistic adjustments are made. Additional research is needed to find the most suitable model for spiritual history-taking, in response to the specific needs of GPs. PMID:23265232

  1. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  2. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  3. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  4. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  5. Analysis of Nursing Clinical Decision Support Requests and Strategic Plan in a Large Academic Health System.

    PubMed

    Whalen, Kimberly; Bavuso, Karen; Bouyer-Ferullo, Sharon; Goldsmith, Denise; Fairbanks, Amanda; Gesner, Emily; Lagor, Charles; Collins, Sarah

    2016-01-01

    To understand requests for nursing Clinical Decision Support (CDS) interventions at a large integrated health system undergoing vendor-based EHR implementation. In addition, to establish a process to guide both short-term implementation and long-term strategic goals to meet nursing CDS needs. We conducted an environmental scan to understand current state of nursing CDS over three months. The environmental scan consisted of a literature review and an analysis of CDS requests received from across our health system. We identified existing high priority CDS and paper-based tools used in nursing practice at our health system that guide decision-making. A total of 46 nursing CDS requests were received. Fifty-six percent (n=26) were specific to a clinical specialty; 22 percent (n=10) were focused on facilitating clinical consults in the inpatient setting. "Risk Assessments/Risk Reduction/Promotion of Healthy Habits" (n=23) was the most requested High Priority Category received for nursing CDS. A continuum of types of nursing CDS needs emerged using the Data-Information-Knowledge-Wisdom Conceptual Framework: 1) facilitating data capture, 2) meeting information needs, 3) guiding knowledge-based decision making, and 4) exposing analytics for wisdom-based clinical interpretation by the nurse. Identifying and prioritizing paper-based tools that can be modified into electronic CDS is a challenge. CDS strategy is an evolving process that relies on close collaboration and engagement with clinical sites for short-term implementation and should be incorporated into a long-term strategic plan that can be optimized and achieved overtime. The Data-Information-Knowledge-Wisdom Conceptual Framework in conjunction with the High Priority Categories established may be a useful tool to guide a strategic approach for meeting short-term nursing CDS needs and aligning with the organizational strategic plan.

  6. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  7. Advances in aptamer screening and small molecule aptasensors.

    PubMed

    Kim, Yeon Seok; Gu, Man Bock

    2014-01-01

    It has been 20 years since aptamers and SELEX (systematic evolution of ligands by exponential enrichment) were described independently by Andrew Ellington and Larry Gold. Owing to their great advantages, numerous aptamers have since been isolated for a wide range of targets and actively applied as therapeutic and analytical tools. Over 2,000 papers related to aptamers or SELEX have been published, attesting to their wide usefulness and applicability. SELEX methods have been modified or re-created over the years to enable aptamer isolation with higher affinity and selectivity in more labor- and time-efficient ways, including through automation. Initially, most aptamer studies focused on protein targets with physiological functions in the body and on their applications as therapeutic agents or receptors for diagnostics. However, aptamers for small molecules such as organic or inorganic compounds, drugs, antibiotics, or metabolites have not been studied sufficiently, despite the ever-increasing need for rapid and simple analytical methods for such chemical targets in medical diagnostics, environmental monitoring, food safety, and national defense, including defense against chemical warfare agents. This review focuses not only on recent advances in aptamer screening methods but also on their analytical application to small molecules.

  8. Modeling and Predicting the Stress Relaxation of Composites with Short and Randomly Oriented Fibers

    PubMed Central

    Obaid, Numaira; Sain, Mohini

    2017-01-01

    The addition of short fibers has been experimentally observed to slow the stress relaxation of viscoelastic polymers, producing a change in the relaxation time constant. Our recent study attributed this effect of fibers on stress relaxation behavior to the interfacial shear stress transfer at the fiber-matrix interface. This model explained the effect of fiber addition on stress relaxation without the need to postulate structural changes at the interface. In our previous study, we developed an analytical model for the effect of fully aligned short fibers, and the model predictions were successfully compared to finite element simulations. However, in most industrial applications of short-fiber composites, fibers are not aligned, and hence it is necessary to examine the time dependence of viscoelastic polymers containing randomly oriented short fibers. In this study, we propose an analytical model to predict the stress relaxation behavior of short-fiber composites where the fibers are randomly oriented. The model predictions were compared to results obtained from Monte Carlo finite element simulations, and good agreement between the two was observed. The analytical model provides an excellent tool to accurately predict the stress relaxation behavior of randomly oriented short-fiber composites. PMID:29053601

  9. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analysis of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data, we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy, and its hallmark will be 'team science'.

  10. Long story short: an introduction to the short-term and long-term Six Sigma quality and its importance in the laboratory medicine for the management of extra-analytical processes.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2018-06-18

    There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma seems to offer a valid methodological and conceptual opportunity, and in recent times, the International Federation of Clinical Chemistry and Laboratory Medicine has adopted it for indicating the performance requirements for non-analytical laboratory processes. However, Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widely known or applied in the field of laboratory medicine, although they are of fundamental importance for exploiting the full potential of the methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, to which they are not an alternative but a complement. It also discusses the dynamic nature of processes and how it arises, particularly with regard to long-term dynamic mean variation, and explains why this leads to the fundamental distinction in quality mentioned above.
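
    The record above distinguishes short-term from long-term Six Sigma quality. As a minimal, hedged sketch of the arithmetic usually associated with that distinction (the laboratory sigma metric plus the customary 1.5-sigma mean-shift convention), the snippet below uses illustrative TEa, bias, and CV values that are assumptions, not figures from the paper.

```python
# Minimal sketch: conventional laboratory sigma metric and the customary
# 1.5-sigma shift that separates short-term from long-term Six Sigma quality.
# TEa, bias and CV values are illustrative assumptions only.

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Short-term sigma metric: (allowable total error - |bias|) / CV."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def long_term_sigma(short_term: float, shift: float = 1.5) -> float:
    """Long-term sigma under the conventional 1.5-sigma mean-shift assumption."""
    return short_term - shift

if __name__ == "__main__":
    tea, bias, cv = 10.0, 1.0, 2.0   # percent; hypothetical process indicator
    st = sigma_metric(tea, bias, cv)
    print(f"short-term sigma = {st:.2f}, long-term sigma = {long_term_sigma(st):.2f}")
```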

  11. A three-dimensional analytical model to simulate groundwater flow during operation of recirculating wells

    NASA Astrophysics Data System (ADS)

    Huang, Junqi; Goltz, Mark N.

    2005-11-01

    The potential for using pairs of so-called horizontal flow treatment wells (HFTWs) to effect in situ capture and treatment of contaminated groundwater has recently been demonstrated. To apply this new technology, design engineers need to be able to simulate the relatively complex groundwater flow patterns that result from HFTW operation. In this work, a three-dimensional analytical solution for steady flow in a homogeneous, anisotropic, contaminated aquifer is developed to efficiently calculate the interflow of water circulating between a pair of HFTWs and map the spatial extent of contaminated groundwater flowing from upgradient that is captured. The solution is constructed by superposing the solutions for the flow fields resulting from operation of partially penetrating wells. The solution is used to investigate the flow resulting from operation of an HFTW well pair and to quantify how aquifer anisotropy, well placement, and pumping rate impact capture zone width and interflow. The analytical modeling method presented here provides a fast and accurate technique for representing the flow field resulting from operation of HFTW systems, and represents a tool that can be useful in designing in situ groundwater contamination treatment systems.
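
    The solution described above superposes partially penetrating well solutions in a 3-D anisotropic aquifer. As a much simpler illustration of the superposition idea only, the sketch below computes a 2-D, fully penetrating head field for a uniform regional gradient plus an injection/extraction pair; all parameter values are hypothetical and the paper's actual 3-D formulation is considerably more involved.

```python
# 2-D, fully penetrating simplification of superposition for a recirculating
# well pair: head = uniform regional gradient + Thiem-type well terms.
# All parameter values are hypothetical.
import numpy as np

T = 1.0e-3          # transmissivity [m^2/s]
grad = 1.0e-3       # regional hydraulic gradient toward +x [-]
Q = 5.0e-4          # circulation rate [m^3/s]
rw = 0.1            # nominal well radius [m], used to avoid the log singularity
wells = [(-10.0, 0.0, +Q),   # injection well
         (+10.0, 0.0, -Q)]   # extraction well

def head(x, y):
    """Superposed head (relative to an arbitrary datum and reference radius)."""
    h = -grad * x                              # uniform regional flow in +x
    for xw, yw, q in wells:
        r = np.maximum(np.hypot(x - xw, y - yw), rw)
        h += -q / (2.0 * np.pi * T) * np.log(r)   # injection raises, extraction lowers head
    return h

# sample the head field on a coarse grid to trace interflow between the wells
xs, ys = np.meshgrid(np.linspace(-50, 50, 101), np.linspace(-30, 30, 61))
h = head(xs, ys)
print("head range [m]:", float(h.min()), "to", float(h.max()))
```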

  12. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to achieve optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation and progressive flank wear are covered in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries so that an appropriate model can be selected according to user requirements in hard turning.
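
    Among the models named above, Usui's wear-rate equation has a compact, commonly cited form, dW/dt = A·σn·Vs·exp(−B/θ). The snippet below simply evaluates that form for two cutting conditions; the constants and operating values are placeholders for illustration, not values taken from the review.

```python
# Commonly cited form of Usui's tool wear-rate model; constants are placeholders.
import math

def usui_wear_rate(sigma_n, v_s, theta_K, A=1.0e-8, B=2.5e3):
    """dW/dt = A * sigma_n * V_s * exp(-B / theta):
    sigma_n  normal stress at the tool-chip interface [MPa]
    v_s      sliding velocity [m/min]
    theta_K  absolute interface temperature [K]
    A, B     material/tool-pair constants (hypothetical here)
    """
    return A * sigma_n * v_s * math.exp(-B / theta_K)

# illustrative comparison of two cutting conditions (all numbers hypothetical)
for sigma_n, v_s, theta in [(1200.0, 100.0, 1100.0), (1200.0, 180.0, 1250.0)]:
    print(f"sigma_n={sigma_n}, Vs={v_s}, T={theta} K -> "
          f"wear rate {usui_wear_rate(sigma_n, v_s, theta):.3e}")
```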

  13. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suited to supporting collaborative design of such workflows: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge about workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  14. Visual analysis of online social media to open up the investigation of stance phenomena

    PubMed Central

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2015-01-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool. PMID:29249903

  15. Visual analysis of online social media to open up the investigation of stance phenomena.

    PubMed

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2016-04-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool.

  16. Visual programming for next-generation sequencing data analytics.

    PubMed

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  17. Modelling turbulent boundary layer flow over fractal-like multiscale terrain using large-eddy simulations and analytical tools.

    PubMed

    Yang, X I A; Meneveau, C

    2017-04-13

    In recent years, there has been growing interest in large-eddy simulation (LES) modelling of atmospheric boundary layers interacting with arrays of wind turbines on complex terrain. However, such terrain typically contains geometric features and roughness elements reaching down to small scales that typically cannot be resolved numerically. Thus subgrid-scale models for the unresolved features of the bottom roughness are needed for LES. Such knowledge is also required to model the effects of the ground surface 'underneath' a wind farm. Here we adapt a dynamic approach to determine subgrid-scale roughness parametrizations and apply it for the case of rough surfaces composed of cuboidal elements with broad size distributions, containing many scales. We first investigate the flow response to ground roughness of a few scales. LES with the dynamic roughness model which accounts for the drag of unresolved roughness is shown to provide resolution-independent results for the mean velocity distribution. Moreover, we develop an analytical roughness model that accounts for the sheltering effects of large-scale on small-scale roughness elements. Taking into account the shading effect, constraints from fundamental conservation laws, and assumptions of geometric self-similarity, the analytical roughness model is shown to provide analytical predictions that agree well with roughness parameters determined from LES.This article is part of the themed issue 'Wind energy in complex terrains'. © 2017 The Author(s).

  18. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  19. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  20. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    NASA Astrophysics Data System (ADS)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on regional and continental scales. The identification of major aerosol constituents for basic source apportionment and air quality issues requires considerable analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on a time scale of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time and the related sample amount necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single-particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complementary technique to bulk analytical methods, chemical imaging offers a new way to study air pollution events by obtaining major aerosol constituents with single-particle features at high temporal resolution and from small sample volumes. The analysis of chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy-dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry) with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles will be demonstrated with the following examples of air pollution events in Vienna: exceptional episodic events such as the transformation of Saharan dust by the impact of the city of Vienna will be discussed and compared to samples obtained at a high alpine background site (Sonnblick Observatory, Saharan dust event of April 2016). Further, chemical imaging of biological aerosol constituents of an autumnal pollen outbreak in Vienna, together with background samples from nearby locations from November 2016, will demonstrate the advantages of the chemical imaging approach. Additionally, the chemical fingerprint of an exceptional air pollution event from a local emission source, caused by the demolition of a building in Vienna, will illustrate the need for multisensor imaging and, especially, for the combined approach. The chemical images obtained will be correlated with bulk analytical results. The benefits of the overall methodological approach, combining bulk analytics and combined chemical imaging of exceptional episodic air pollution events, will be discussed.

  1. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  2. Value of social media in reaching and engaging employers in Total Worker Health.

    PubMed

    Hudson, Heidi; Hall, Jennifer

    2013-12-01

    To describe the initial use of social media by the National Institute for Occupational Safety and Health (NIOSH) Total Worker Health™ (TWH) Program and the University of Iowa Healthier Workforce Center for Excellence (HWCE) Outreach Program. Social media analytics tools and process evaluation methods were used to derive initial insights into the social media strategies used by NIOSH and the HWCE. The online community size for the NIOSH TWH Program indicated 100% growth in 6 months; however, social media platforms have been slow to gain participation among employers. The NIOSH TWH Program and the HWCE Outreach Program have found social media tools to be an effective way to expand reach, foster engagement, and gain understanding of audience interests around TWH concepts. More needs to be known about how best to use social media to reach and engage target audiences on issues relevant to TWH.

  3. A spectral Poisson solver for kinetic plasma simulation

    NASA Astrophysics Data System (ADS)

    Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf

    2011-10-01

    Plasma resonance spectroscopy is a well established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized - geometrically simplified - version it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development which is capable of simulating the dynamics of the plasma surrounding the MRP in electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need of introducing a spatial discretization.
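
    The solver above expands both the charge density and the potential in spherical harmonics and truncates the expansion. As a generic illustration of that idea (not the actual MRP tool), the sketch below builds the textbook exterior multipole expansion for an ensemble of point charges in Gaussian units; the truncation order and the toy charge configuration are assumptions.

```python
# Exterior multipole expansion for point charges (Gaussian units), truncated
# at l_max, as a generic stand-in for a spherical-harmonics Poisson solver.
import numpy as np
from scipy.special import sph_harm   # scipy convention: sph_harm(m, l, azimuth, polar)

def multipole_coeffs(q, x, y, z, l_max):
    """q_lm = sum_i q_i * r_i**l * conj(Y_lm(theta_i, phi_i))."""
    r = np.sqrt(x**2 + y**2 + z**2)
    polar = np.arccos(np.clip(z / np.where(r > 0, r, 1.0), -1.0, 1.0))
    azim = np.arctan2(y, x)
    coeffs = {}
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, azim, polar)
            coeffs[(l, m)] = np.sum(q * r**l * np.conj(Y))
    return coeffs

def potential(coeffs, x, y, z, l_max):
    """phi(r) = sum_lm 4*pi/(2l+1) * q_lm * Y_lm(theta, phi) / r**(l+1), valid outside the charges."""
    r = np.sqrt(x**2 + y**2 + z**2)
    polar, azim = np.arccos(z / r), np.arctan2(y, x)
    phi = 0.0 + 0.0j
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            phi += (4 * np.pi / (2 * l + 1)) * coeffs[(l, m)] * sph_harm(m, l, azim, polar) / r**(l + 1)
    return phi.real

# toy ensemble: a dipole-like pair of point charges (hypothetical configuration)
q = np.array([+1.0, -1.0])
x, y, z = np.array([0.0, 0.0]), np.array([0.0, 0.0]), np.array([+0.1, -0.1])
c = multipole_coeffs(q, x, y, z, l_max=4)
print(potential(c, 1.0, 0.0, 0.5, l_max=4))   # field point outside the charge ensemble
```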

  4. Intelligent platforms for disease assessment: novel approaches in functional echocardiography.

    PubMed

    Sengupta, Partho P

    2013-11-01

    Accelerating trends in the dynamic digital era (from 2004 onward) have resulted in the emergence of novel parametric imaging tools that allow easy and accurate extraction of quantitative information from cardiac images. This review principally attempts to heighten the awareness of newer emerging paradigms that may advance acquisition, visualization and interpretation of the large functional data sets obtained during cardiac ultrasound imaging. Incorporation of innovative cognitive software that allows advanced pattern recognition and disease forecasting will likely transform the human-machine interface and interpretation process to achieve a more efficient and effective work environment. Novel technologies for automation and big data analytics that are already active in other fields need to be rapidly adapted to the health care environment, with new academic-industry collaborations to enrich and accelerate the delivery of newer decision-making tools for enhancing patient care. Copyright © 2013. Published by Elsevier Inc.

  5. A digital future for the history of psychology?

    PubMed

    Green, Christopher D

    2016-08-01

    This article discusses the role that digital approaches to the history of psychology are likely to play in the near future. A tentative hierarchy of digital methods is proposed. A few examples are briefly described: a digital repository, a simple visualization using ready-made online database and tools, and more complex visualizations requiring the assembly of the database and, possibly, the analytic tools by the researcher. The relationship of digital history to the old "New Economic History" (Cliometrics) is considered. The question of whether digital history and traditional history need be at odds or, instead, might complement each other is woven throughout. The rapidly expanding territory of digital humanistic research outside of psychology is briefly discussed. Finally, the challenging current employment trends in history and the humanities more broadly are considered, along with the role that digital skills might play in mitigating those factors for prospective academic workers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  7. Semi-Analytical Models of CO2 Injection into Deep Saline Aquifers: Evaluation of the Area of Review and Leakage through Abandoned Wells

    EPA Science Inventory

    This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...

  8. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  9. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology.

    PubMed

    Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S

    2015-10-01

    The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
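
    The record above describes aggregating thousands of assessment data points into resident-specific milestone profiles with free analytic tools. A minimal pandas sketch of that aggregation step is shown below; the column names, milestone codes, and reporting periods are hypothetical and do not reflect the authors' actual implementation.

```python
# Aggregating flat assessment records into resident-specific milestone profiles.
# Column names, milestone codes and values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "resident":  ["A", "A", "A", "B", "B", "B"],
    "milestone": ["SBP1", "PC2", "SBP1", "SBP1", "PC2", "PC2"],
    "score":     [3, 2, 4, 2, 3, 3],                  # 1-5 milestone levels
    "date": pd.to_datetime(["2014-07-01", "2014-07-15", "2015-01-10",
                            "2014-08-01", "2014-09-01", "2015-02-01"]),
})

# Per-resident profile: mean level per milestone plus the number of data points
# behind each cell, so granular detail stays available for drill-down.
profile = df.pivot_table(index="resident", columns="milestone",
                         values="score", aggfunc=["mean", "count"])

# Longitudinal view of each resident's progress by quarter.
trend = df.groupby(["resident", df["date"].dt.to_period("Q")])["score"].mean()

print(profile)
print(trend)
```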

  10. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn,, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  11. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.

  12. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
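
    The tool described in the two records above compares Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression models and ranks them. A minimal sketch of that comparison on synthetic monthly test volumes is shown below, using statsmodels for the Holt-Winters fits; it is not the authors' web tool, and the data, hold-out length, and ranking metric (hold-out MAPE) are assumptions.

```python
# Compare Holt-Winters additive, Holt-Winters multiplicative-seasonal, and
# simple linear regression on synthetic monthly test volumes; rank by hold-out MAPE.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
n = 60
t = np.arange(n)
y = 1000 + 8 * t + 120 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 40, n)  # monthly volumes

train, test = y[:-12], y[-12:]

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

candidates = {
    "Holt-Winters additive": ExponentialSmoothing(
        train, trend="add", seasonal="add", seasonal_periods=12).fit(),
    "Holt-Winters multiplicative": ExponentialSmoothing(
        train, trend="add", seasonal="mul", seasonal_periods=12).fit(),
}
scores = {name: mape(test, fit.forecast(12)) for name, fit in candidates.items()}

# Simple linear regression on the time index as the third candidate.
coef = np.polyfit(t[:-12], train, 1)
scores["Linear regression"] = mape(test, np.polyval(coef, t[-12:]))

# Rank the models; best (lowest hold-out error) first.
for name, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: MAPE = {s:.1f}%")
```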

  13. The value of online learning and MRI: finding a niche for expensive technologies.

    PubMed

    Cook, David A

    2014-11-01

    The benefits of online learning come at a price. How can we optimize the overall value? Critically appraise the value of online learning. Narrative review. Several prevalent myths overinflate the value of online learning. These include that online learning is cheap and easy (it is usually more expensive), that it is more efficient (efficiency depends on the instructional design, not the modality), that it will transform education (fundamental learning principles have not changed), and that the Net Generation expects it (there is no evidence of pent-up demand). However, online learning does add real value by enhancing flexibility, control and analytics. Costs may also go down if disruptive innovations (e.g. low-cost, low-tech, but instructionally sound "good enough" online learning) supplant technically superior but more expensive online learning products. Cost-lowering strategies include focusing on core principles of learning rather than technologies, using easy-to-learn authoring tools, repurposing content (organizing and sequencing existing resources rather than creating new content) and using course templates. Online learning represents just one tool in an educator's toolbox, as does the MRI for clinicians. We need to use the right tool(s) for the right learner at the right dose, time and route.

  14. Evaluation of Visual Analytics Environments: The Road to the Visual Analytics Science and Technology Challenge Evaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.

    The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation not only involves assessing the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user’s infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005] it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult as now one aspect of the test methodology is access to representative end-users to participate in the evaluation. In many cases the sensitive nature of data and tasks and difficult access to busy analysts puts even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to go to work on the type of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed.
Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology that would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.

  15. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
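
    The coupling described above exchanges zone temperatures (energy model to airflow model) and interzone/infiltration airflows (airflow model to energy model) at each timestep. The sketch below shows that ping-pong exchange in the abstract; the ThermalModel and AirflowModel classes and their toy update rules are hypothetical stand-ins for FMI co-simulation slaves, not the real EnergyPlus or CONTAM interfaces.

```python
# Conceptual co-simulation loop: the thermal model consumes airflows and
# produces zone temperatures; the airflow model consumes temperatures and
# produces airflows for the next step. Both classes are hypothetical.

class ThermalModel:
    def __init__(self, n_zones):
        self.temps = [20.0] * n_zones              # deg C initial guess
    def step(self, airflows, dt):
        # toy update: infiltration of 10 degC outdoor air cools each zone
        self.temps = [T - 0.001 * dt * q * (T - 10.0)
                      for T, q in zip(self.temps, airflows)]
        return self.temps

class AirflowModel:
    def __init__(self, n_zones):
        self.n = n_zones
    def step(self, temps, dt):
        # toy update: stack-driven infiltration grows with indoor-outdoor dT
        return [0.05 + 0.002 * abs(T - 10.0) for T in temps]

def cosimulate(hours=24, dt=3600.0, n_zones=3):
    thermal, airflow = ThermalModel(n_zones), AirflowModel(n_zones)
    flows = [0.05] * n_zones                       # m^3/s initial guess
    for _ in range(hours):
        temps = thermal.step(flows, dt)            # energy side uses last airflows
        flows = airflow.step(temps, dt)            # airflow side uses new temperatures
    return temps, flows

print(cosimulate())
```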

  16. New trends in the analytical determination of emerging contaminants and their transformation products in environmental waters.

    PubMed

    Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R

    2013-06-01

    Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to understanding their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large information gap regarding the growing number of new potential contaminants that are appearing, and especially their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, analytical chemistry clearly plays an important role as the first step of the study, as it allows the presence of chemicals in the environment to be established, their concentration levels to be estimated, their sources to be identified and their degradation pathways to be determined. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demands wide-scope methods; the low concentrations at which these contaminants are present require high detection sensitivity; and extensive confirmation and structural information are needed for the characterisation of unknowns. New developments in analytical instrumentation have been applied to overcome these difficulties. No less important has been the development of new, specific software packages intended for data acquisition and, in particular, for post-run analysis. The use of such sophisticated software tools has enabled successful screening analyses covering several hundred analytes and has assisted in the timely structural elucidation of unknown compounds.

  17. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    PubMed

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) to the patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and the recently described moving sum of outlier (movSO) patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and the average of normals (AoN) approaches. The power of detecting an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it had generally the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CVa. Conclusion: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as an increased analytical imprecision will only add marginally to the total variation and is less likely to impact on clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
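
    As a conceptual sketch of the movSD idea only, the snippet below computes a rolling standard deviation over simulated patient results and flags when it exceeds a limit derived from the baseline CVa. The window size, control-limit multiplier, and simulated shift are illustrative choices, not the paper's validated settings.

```python
# Moving standard deviation (movSD) monitor on simulated patient results.
# Window size and control limit are illustrative, not validated settings.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Simulated results: baseline CVa ~3% around a mean of 5.0, with the
# analytical imprecision doubling after result 1500.
mean, cv_base = 5.0, 0.03
results = np.concatenate([
    rng.normal(mean, mean * cv_base, 1500),
    rng.normal(mean, mean * cv_base * 2.0, 500),
])
s = pd.Series(results)

window = 200                                   # illustrative block size
mov_sd = s.rolling(window).std()

# Control limit: baseline SD inflated by an arbitrary multiplier; in practice
# the limit would be tuned (e.g. by simulation) to trade false alarms vs. ANPed.
limit = 1.3 * (mean * cv_base)

alarm_at = mov_sd[mov_sd > limit].index.min()
print("movSD alarm first triggered at result index:", alarm_at)
```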

  18. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    DOT National Transportation Integrated Search

    2011-12-01

    In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
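
    The decision-making tool set above is based on the Analytic Hierarchy Process. As a minimal sketch of the core AHP computation (priority weights from the principal eigenvector of a pairwise comparison matrix, plus Saaty's consistency ratio), the snippet below uses an invented four-criterion comparison matrix; it is not the FHWA tool itself.

```python
# Core AHP step: priority vector and consistency ratio for one comparison matrix.
import numpy as np

# Hypothetical pairwise comparisons of four criteria (e.g. schedule, cost,
# traffic impact, site constraints) on Saaty's 1-9 scale; values are invented.
A = np.array([
    [1,   3,   5,   2  ],
    [1/3, 1,   3,   1/2],
    [1/5, 1/3, 1,   1/4],
    [1/2, 2,   4,   1  ],
], dtype=float)

# Priority vector: normalized principal (right) eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1), RI from Saaty's table.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
CR = CI / RI

print("criteria weights:", np.round(w, 3))
print("consistency ratio:", round(CR, 3), "(commonly accepted if < 0.1)")
```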

  19. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  20. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  1. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 sustainability assessment tools for universities and to make the structure and contents of these tools more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  2. MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.

    PubMed

    Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui

    2015-12-12

    Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
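
    MRMPlus reports figures of merit such as limit of detection, lower limit of quantification, and linearity. The snippet below computes these in a generic, calibration-curve-based way (LOD = 3.3·σ/slope, LLOQ = 10·σ/slope, linearity as R² of the regression); this is a standard textbook approach under assumed dilution-series data, not MRMPlus's actual algorithm.

```python
# Generic calibration-curve figures of merit: LOD, LLOQ and linearity (R^2).
import numpy as np

# Hypothetical dilution-series data for one transition:
# spiked concentration (fmol on column) vs. measured peak-area ratio.
conc = np.array([0.5, 1, 2, 5, 10, 20, 50], dtype=float)
area = np.array([0.012, 0.021, 0.043, 0.11, 0.21, 0.40, 1.02])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))

lod = 3.3 * residual_sd / slope      # ICH-style limit of detection
lloq = 10.0 * residual_sd / slope    # lower limit of quantification

ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot      # linearity of the calibration curve

print(f"slope={slope:.4f}, LOD={lod:.2f}, LLOQ={lloq:.2f}, R^2={r_squared:.4f}")
```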

  3. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  4. Biomanufacturing process analytical technology (PAT) application for downstream processing: Using dissolved oxygen as an indicator of product quality for a protein refolding reaction.

    PubMed

    Pizarro, Shelly A; Dinges, Rachel; Adams, Rachel; Sanchez, Ailen; Winter, Charles

    2009-10-01

    Process analytical technology (PAT) is an initiative from the US FDA combining analytical and statistical tools to improve manufacturing operations and ensure regulatory compliance. This work describes the use of a continuous monitoring system for a protein refolding reaction to provide consistency in product quality and process performance across batches. A small-scale bioreactor (3 L) is used to understand the impact of aeration on refolding recombinant human vascular endothelial growth factor (rhVEGF) in a reducing environment. A reverse-phase HPLC (rpHPLC) assay is used to assess product quality. The goal in understanding the oxygen needs of the reaction and their impact on quality is to make a product that is efficiently refolded to its native and active form with minimum oxidative degradation from batch to batch. Because this refolding process is heavily dependent on oxygen, the % dissolved oxygen (DO) profile is explored as a PAT tool to regulate process performance at commercial manufacturing scale. A dynamic gassing-out approach using a constant mass transfer coefficient (kLa) is used for scale-up of the aeration parameters to manufacturing-scale tanks (2,000 L, 15,000 L). The resulting DO profiles of the refolding reaction show similar trends across scales, and these are analyzed using rpHPLC. The desired product quality attributes are then achieved through alternating air and nitrogen sparging triggered by changes in the monitored DO profile. This approach mitigates the impact of differences in equipment or feedstock components between runs, and is directly in line with the key goal of PAT to "actively manage process variability using a knowledge-based approach." © 2009 Wiley Periodicals, Inc.
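
    The abstract describes a dynamic gassing-out approach with a constant mass transfer coefficient (kLa) for scale-up but does not give the calculation. As a minimal sketch of the standard dynamic method (not the authors' procedure), the snippet below fits the first-order re-oxygenation model dC/dt = kLa(C* - C) to a synthetic %DO transient; the saturation value, time grid, and noise level are assumptions.

```python
# Hedged sketch: estimating kLa from a dynamic gassing-out %DO transient.
# Model: dC/dt = kLa * (C* - C)  =>  ln(C* - C) decays linearly with slope -kLa.
import numpy as np

c_star = 100.0                          # saturation %DO (assumed)
t = np.arange(0.0, 10.5, 0.5)           # time, minutes
k_true = 0.35                           # 1/min, used only to synthesize example data
rng = np.random.default_rng(0)
do_meas = c_star * (1 - np.exp(-k_true * t)) + rng.normal(0, 0.5, t.size)

# Linearize: ln(C* - C) = ln(C*) - kLa * t ; keep points safely below saturation.
mask = do_meas < 0.95 * c_star
y = np.log(c_star - do_meas[mask])
slope, intercept = np.polyfit(t[mask], y, 1)
kla = -slope
print(f"estimated kLa ~ {kla:.2f} 1/min (value used for synthesis: {k_true})")
```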

  5. Systems analysis - a new paradigm and decision support tools for the water framework directive

    NASA Astrophysics Data System (ADS)

    Bruen, M.

    2007-06-01

    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness. This is best done by trained sociologists fully integrated into the processes. The WINCOMS research project is an example applied to the implementation of the WFD in Ireland.
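
    The abstract names the Analytical Hierarchy Process as one of the multi-criteria methods suitable for WFD decision support. As a generic illustration of how AHP derives criterion weights (example judgments assumed, not taken from the paper), the sketch below computes the principal-eigenvector priorities and a consistency ratio for a small pairwise-comparison matrix.

```python
# Minimal AHP sketch: priority weights and consistency ratio
# for a 3-criterion pairwise-comparison matrix (example values assumed).
import numpy as np

# a[i, j] = how much more important criterion i is than criterion j (Saaty scale).
a = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priorities = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(a)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.58 for n = 3, from Saaty's table).
n = a.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```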

  6. 76 FR 70517 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...

  7. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. To set the context for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantify the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  8. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known from direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error, which forms the basis of the total error. Modelling the generation process allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, together with some applications to known models of rack-gear type tools used on Maag gear-cutting machines.

  9. Concurrence of big data analytics and healthcare: A systematic review.

    PubMed

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations and reduction of the cost of care; (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils a paucity of information on evidence of real-world use of Big Data analytics in healthcare. This is because the usability studies have taken only a qualitative approach, which describes potential benefits but does not quantify them. Also, the majority of the studies were from developed countries, which highlights the need to promote research on healthcare Big Data analytics in developing countries. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using microparticles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly are being developed. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in basic research and clinical laboratories, as well as in drug development. PMID:16604537

  11. Application of wavelet packet transform to compressing Raman spectra data

    NASA Astrophysics Data System (ADS)

    Chen, Chen; Peng, Fei; Cheng, Qinghua; Xu, Dahai

    2008-12-01

    The wavelet transform has become established, alongside the Fourier transform, as a data-processing method in analytical fields. Its main applications relate to de-noising, compression, variable reduction, and signal suppression. Raman spectroscopy (RS) is characterized by frequency shifts that carry information about the molecule. Every substance has its own characteristic Raman spectrum, from which the structure, components, concentrations and other properties of samples can be analyzed easily. RS is a powerful analytical tool for detection and identification, and many RS databases exist. However, Raman spectra require large storage space and long search times. In this paper, the wavelet packet transform is chosen to compress Raman spectra of several benzene-series compounds. The obtained results show that the energy retained is as high as 99.9% after compression, while the percentage of zero-valued coefficients is 87.50%. It is concluded that the wavelet packet transform is valuable for compressing RS data.
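
    The abstract reports retained energy and the percentage of zeroed coefficients after wavelet packet compression but not the wavelet or thresholding rule. The sketch below is a conceptual stand-in: a full-depth Haar wavelet-packet decomposition of a synthetic spectrum with hard thresholding, reporting the same two figures; the signal, wavelet choice, and threshold are assumptions.

```python
# Conceptual compression sketch (Haar wavelet packets, hard thresholding);
# the paper's exact wavelet and thresholding rule are not specified here.
import numpy as np

def haar_packet(x):
    """Full-depth orthonormal Haar wavelet-packet transform of a length-2^k signal."""
    x = np.asarray(x, dtype=float).copy()
    n, block = x.size, x.size
    while block > 1:
        for start in range(0, n, block):
            seg = x[start:start + block]
            a = (seg[0::2] + seg[1::2]) / np.sqrt(2.0)   # block averages
            d = (seg[0::2] - seg[1::2]) / np.sqrt(2.0)   # block differences
            x[start:start + block // 2] = a
            x[start + block // 2:start + block] = d
        block //= 2
    return x

# Synthetic "Raman-like" spectrum: a few Gaussian bands on a noisy baseline.
wn = np.linspace(200, 1800, 1024)                        # wavenumber axis (assumed)
spectrum = sum(h * np.exp(-((wn - c) / w) ** 2)
               for h, c, w in [(5, 620, 8), (9, 1001, 6), (4, 1585, 10)])
spectrum += np.random.default_rng(1).normal(0, 0.05, wn.size)

coeffs = haar_packet(spectrum)
thr = 0.02 * np.abs(coeffs).max()                        # hard threshold (assumed rule)
kept = np.where(np.abs(coeffs) >= thr, coeffs, 0.0)

energy_retained = 100 * np.sum(kept ** 2) / np.sum(coeffs ** 2)
zero_fraction = 100 * np.mean(kept == 0.0)
print(f"energy retained: {energy_retained:.2f}%  zeroed coefficients: {zero_fraction:.1f}%")
```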

  12. Application of Laser Mass Spectrometry to Art and Archaeology

    NASA Technical Reports Server (NTRS)

    Gulian, Lase Lisa E.; Callahan, Michael P.; Muliadi, Sarah; Owens, Shawn; McGovern, Patrick E.; Schmidt, Catherine M.; Trentelman, Karen A.; deVries, Mattanjah S.

    2011-01-01

    REMPI laser mass spectrometry is a combination of resonance-enhanced multiphoton ionization spectroscopy and time-of-flight mass spectrometry. This technique enables the collection of mass-specific optical spectra as well as optically selected mass spectra. Analytes are jet-cooled by entrainment in a molecular beam, and this low-temperature gas-phase analysis has the benefit of excellent vibronic resolution. Utilizing this method, mass spectrometric analysis of historically relevant samples can be simplified and improved: optical selection of targets eliminates the need for chromatography, while knowledge of a target's gas-phase spectroscopy allows facile differentiation of molecules that are considered spectroscopically indistinguishable in the aqueous phase. These two factors allow smaller sample sizes than commercial MS instruments, which in turn causes less damage to objects of antiquity. We have explored methods to optimize REMPI laser mass spectrometry as an analytical tool for archaeology, using theobromine and caffeine as molecular markers in Mesoamerican pottery, and are expanding this approach to the field of art to examine laccaic acid in shellacs.

  13. Creating value in health care through big data: opportunities and policy implications.

    PubMed

    Roski, Joachim; Bo-Linn, George W; Andrews, Timothy A

    2014-07-01

    Big data has the potential to create significant value in health care by improving outcomes while lowering costs. Big data's defining features include the ability to handle massive data volume and variety at high velocity. New, flexible, and easily expandable information technology (IT) infrastructure, including so-called data lakes and cloud data storage and management solutions, makes big-data analytics possible. However, most health IT systems still rely on data warehouse structures. Without the right IT infrastructure, analytic tools, visualization approaches, work flows, and interfaces, the insights provided by big data are likely to be limited. Big data's success in creating value in the health care sector may require changes in current policies to balance the potential societal benefits of big-data approaches and the protection of patients' confidentiality. Other policy implications of using big data are that many current practices and policies related to data use, access, sharing, privacy, and stewardship need to be revised. Project HOPE—The People-to-People Health Foundation, Inc.

  14. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art wherein a new product idea is created rather than a visual representation that would be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of the CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps to establish the importance ratings of the DRs.

  15. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  16. Feasibility of UV-VIS-Fluorescence spectroscopy combined with pattern recognition techniques to authenticate a new category of plant food supplements.

    PubMed

    Boggia, Raffaella; Turrini, Federica; Anselmo, Marco; Zunin, Paola; Donno, Dario; Beccaro, Gabriele L

    2017-07-01

    Bud extracts, also named "gemmoderivatives", are a new category of natural products obtained by macerating fresh meristematic tissues of trees and plants. In the European Community these botanical remedies are classified as plant food supplements. Nowadays these products are still poorly studied, even though they are widely used and commercialized. Several analytical tools for the quality control of these very expensive supplements are urgently needed in order to avoid mislabelling and fraud. In fact, besides the usual quality controls common to other botanical dietary supplements, these extracts should be checked to quickly detect whether the cheaper adult parts of the plants have been deceptively used in place of the corresponding buds, whose harvest period and production are extremely limited. This study aims to provide a screening analytical method based on UV-VIS-Fluorescence spectroscopy coupled to multivariate analysis for rapid, inexpensive and non-destructive quality control of these products.
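
    The abstract couples UV-VIS-Fluorescence spectra with multivariate pattern recognition but does not specify the model. As a hedged sketch of that kind of screening workflow (all spectra simulated, band positions and classifier choice assumed), the snippet below projects synthetic spectra with PCA and cross-validates a simple classifier separating "bud" from "adult tissue" extracts.

```python
# Hedged sketch: PCA + classifier screening of synthetic UV-VIS spectra
# to distinguish "bud" from "adult tissue" extracts (all data simulated).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
wavelengths = np.linspace(250, 700, 200)

def simulate(center, n):
    """Simulate n spectra as one broad absorption band plus noise."""
    band = np.exp(-((wavelengths - center) / 60.0) ** 2)
    return band + rng.normal(0, 0.03, (n, wavelengths.size))

X = np.vstack([simulate(330, 30),    # "bud" extracts (hypothetical band position)
               simulate(360, 30)])   # "adult tissue" extracts
y = np.array([0] * 30 + [1] * 30)

model = make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```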

  17. Toward best practice: leveraging the electronic patient record as a clinical data warehouse.

    PubMed

    Ledbetter, C S; Morgan, M W

    2001-01-01

    Automating clinical and administrative processes via an electronic patient record (EPR) gives clinicians the point-of-care tools they need to deliver better patient care. However, to improve clinical practice as a whole and then evaluate it, healthcare must go beyond basic automation and convert EPR data into aggregated, multidimensional information. Unfortunately, few EPR systems have the established, powerful analytical clinical data warehouses (CDWs) required for this conversion. This article describes how an organization can support best practice by leveraging a CDW that is fully integrated into its EPR and clinical decision support (CDS) system. The article (1) discusses the requirements for comprehensive CDS, including on-line analytical processing (OLAP) of data at both transactional and aggregate levels, (2) suggests that the transactional data acquired by an online transaction processing (OLTP) EPR system must be remodeled to support retrospective, population-based, aggregate analysis of those data, and (3) concludes that this aggregate analysis is best provided by a separate CDW system.
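
    The article's second point, remodeling transactional (OLTP) records into aggregate, population-level views, can be illustrated with a small roll-up. The pandas sketch below pivots event-level data into an OLAP-style summary; the table, column names, and values are invented for illustration and do not come from the article.

```python
# Hedged sketch: rolling up transactional EPR events into an aggregate,
# OLAP-style view with pandas (all names and data invented).
import pandas as pd

events = pd.DataFrame({
    "patient_id":   [1, 1, 2, 2, 3, 3, 3],
    "unit":         ["ICU", "ICU", "Med", "Med", "ICU", "Med", "Med"],
    "quarter":      ["2024Q1", "2024Q1", "2024Q1", "2024Q2", "2024Q2", "2024Q2", "2024Q2"],
    "order_type":   ["lab", "med", "lab", "lab", "med", "lab", "med"],
    "turnaround_h": [2.5, 1.0, 4.0, 3.5, 0.8, 5.0, 1.2],
})

# Aggregate (cube-like) view: mean turnaround and order counts by unit x quarter.
cube = events.pivot_table(index="unit", columns="quarter",
                          values="turnaround_h",
                          aggfunc=["mean", "count"])
print(cube)
```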

  18. Using business intelligence for efficient inter-facility patient transfer.

    PubMed

    Haque, Waqar; Derksen, Beth Ann; Calado, Devin; Foster, Lee

    2015-01-01

    In the context of inter-facility patient transfer, a transfer operator must be able to objectively identify a destination which meets the needs of a patient, while keeping in mind each facility's limitations. We propose a solution which uses Business Intelligence (BI) techniques to analyze data related to healthcare infrastructure and services, and provides a web based system to identify optimal destination(s). The proposed inter-facility transfer system uses a single data warehouse with an Online Analytical Processing (OLAP) cube built on top that supplies analytical data to multiple reports embedded in web pages. The data visualization tool includes map based navigation of the health authority as well as an interactive filtering mechanism which finds facilities meeting the selected criteria. The data visualization is backed by an intuitive data entry web form which safely constrains the data, ensuring consistency and a single version of truth. The overall time required to identify the destination for inter-facility transfers is reduced from hours to a few minutes with this interactive solution.

  19. Use of application containers and workflows for genomic data analysis

    PubMed Central

    Schulz, Wade L.; Durant, Thomas J. S.; Siddon, Alexa J.; Torres, Richard

    2016-01-01

    Background: The rapid acquisition of biological data and the development of computationally intensive analyses have led to a need for novel approaches to software deployment. In particular, the complexity of common analytic tools for genomics makes them difficult to deploy and decreases the reproducibility of computational experiments. Methods: Recent technologies that allow for application virtualization, such as Docker, allow developers and bioinformaticians to isolate these applications and deploy secure, scalable platforms that have the potential to dramatically increase the efficiency of big data processing. Results: While limitations exist, this study demonstrates a successful implementation of a pipeline with several discrete software applications for the analysis of next-generation sequencing (NGS) data. Conclusions: With this approach, we significantly reduced the amount of time needed to perform clonal analysis from NGS data in acute myeloid leukemia. PMID:28163975
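
    The paper's own pipeline and container images are not reproduced here. As a generic sketch of the containerized-step idea using the Docker SDK for Python, the snippet below runs one analysis stage in an isolated container; the image tag, command, and host paths are placeholders.

```python
# Hedged sketch: running one pipeline stage in an isolated container via the
# Docker SDK for Python. The image tag, command, and paths are placeholders,
# not the pipeline described in the paper.
import docker

client = docker.from_env()

logs = client.containers.run(
    image="biocontainers/samtools:v1.9-4-deb_cv1",     # example public image (assumed tag)
    command="samtools flagstat /data/sample.bam",      # hypothetical analysis step
    volumes={"/home/user/ngs_run": {"bind": "/data", "mode": "ro"}},
    remove=True,                                       # clean up the container afterwards
)
print(logs.decode())
```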

  20. Pattern search in multi-structure data: a framework for the next-generation evidence-based medicine

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Ainsworth, Keela C.

    2014-03-01

    With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze and interpret quantitative measurements (blood work, toxicology, etc.) in context with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebases, etc.) to predict diagnostic risks is fast emerging. Addressing this need, we pose and answer the following questions: (i) How can we jointly analyze and explore measurement data in context with qualitative domain knowledge? (ii) How can we search for and hypothesize patterns (not known a priori) from such multi-structure data? (iii) How can we build predictive models by integrating weakly-associated multi-relational multi-structure data? We propose a framework towards answering these questions. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools to medical data.

  1. Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques

    NASA Astrophysics Data System (ADS)

    Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman

    2011-06-01

    This research deals with the synthesis of control laws for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression, analytic programming, is used for this kind of difficult problem. The paper describes analytic programming as well as the chaotic systems and the cost function used. For the experiments, the Self-Organizing Migrating Algorithm (SOMA) was used together with analytic programming.
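
    The paper's chaotic systems and cost function are not given in the abstract. As a toy sketch of the underlying idea, scoring a candidate feedback law by a cost accumulated over the trajectory, the snippet below evaluates proportional feedback gains that try to hold the chaotic logistic map at its fixed point; the map, the control window, and the cost definition are assumptions.

```python
# Toy sketch of the idea (scoring a candidate feedback law by a trajectory cost);
# the systems and cost function used here are not the paper's.
import numpy as np

r = 4.0
x_star = 1.0 - 1.0 / r          # unstable fixed point of the logistic map x -> r*x*(1-x)

def cost(gain, x0=0.3, n=300, settle=50, window=0.1):
    """Sum |x - x*| after a settling period; feedback u = gain*(x - x*) is applied
    only when the orbit is already within `window` of the target (OGY-style)."""
    x, total = x0, 0.0
    for i in range(n):
        u = gain * (x - x_star) if abs(x - x_star) < window else 0.0
        x = r * x * (1.0 - x) + u
        if i >= settle:
            total += abs(x - x_star)
    return total

gains = np.linspace(0.0, 3.0, 61)
best = min(gains, key=cost)
print(f"best gain ~ {best:.2f}: cost {cost(best):.3f} vs. uncontrolled {cost(0.0):.1f}")
```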

  2. Poor quality drugs: grand challenges in high throughput detection, countrywide sampling, and forensics in developing countries†

    PubMed Central

    Fernandez, Facundo M.; Hostetler, Dana; Powell, Kristen; Kaur, Harparkash; Green, Michael D.; Mildenhall, Dallas C.; Newton, Paul N.

    2012-01-01

    Throughout history, poor quality medicines have been a persistent problem, with periodical crises in the supply of antimicrobials, such as fake cinchona bark in the 1600s and fake quinine in the 1800s. Regrettably, this problem seems to have grown in the last decade, especially afflicting unsuspecting patients and those seeking medicines via on-line pharmacies. Here we discuss some of the challenges related to the fight against poor quality drugs, and counterfeits in particular, with an emphasis on the analytical tools available, their relative performance, and the necessary workflows needed for distinguishing between genuine, substandard, degraded and counterfeit medicines. PMID:21107455

  3. Analysis of High-Throughput ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).

  4. Identification of novel peptides for horse meat speciation in highly processed foodstuffs.

    PubMed

    Claydon, Amy J; Grundy, Helen H; Charlton, Adrian J; Romero, M Rosario

    2015-01-01

    There is a need for robust analytical methods to support enforcement of food labelling legislation. Proteomics is emerging as a complementary methodology to existing tools such as DNA and antibody-based techniques. Here we describe the development of a proteomics strategy for the determination of meat species in highly processed foods. A database of specific peptides for nine relevant animal species was used to enable semi-targeted species determination. This principle was tested for horse meat speciation, and a range of horse-specific peptides were identified as heat stable marker peptides for the detection of low levels of horse meat in mixtures with other species.

  5. Shuttle cryogenics supply system. Optimization study. Volume 5 B-4: Programmers manual for space shuttle orbit injection analysis (SOPSA)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A computer program for space shuttle orbit injection propulsion system analysis (SOPSA) is described to show the operational characteristics and the computer system requirements. The program was developed as an analytical tool to aid in the preliminary design of propellant feed systems for the space shuttle orbiter main engines. The primary purpose of the program is to evaluate the propellant tank ullage pressure requirements imposed by the need to accelerate propellants rapidly during the engine start sequence. The SOPSA program will generate parametric feed system pressure histories and weight data for a range of nominal feedline sizes.

  6. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  7. SchemaOnRead: A Package for Schema-on-Read in R

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R's flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper's contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
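
    SchemaOnRead itself is an R package, and its API is not reproduced here. Purely as a language-neutral illustration of the schema-on-read pattern it implements (recursive folder reading with per-format handlers), here is a hedged Python sketch; the handler table and example folder are invented.

```python
# Conceptual schema-on-read sketch (not the SchemaOnRead R API): walk a folder
# recursively and parse each file with a handler chosen from its extension.
import csv
import json
from pathlib import Path

def read_text(path):  return path.read_text()
def read_csv(path):   return list(csv.DictReader(path.open()))
def read_json(path):  return json.loads(path.read_text())

HANDLERS = {".txt": read_text, ".csv": read_csv, ".json": read_json}

def schema_on_read(root):
    """Return {relative_path: parsed_content} for every recognized file under root."""
    root = Path(root)
    out = {}
    for path in root.rglob("*"):
        handler = HANDLERS.get(path.suffix.lower())
        if path.is_file() and handler:
            out[str(path.relative_to(root))] = handler(path)
    return out

# Example call on a hypothetical folder:
# data = schema_on_read("study_data/")
```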

  8. Big (Bio)Chemical Data Mining Using Chemometric Methods: A Need for Chemists.

    PubMed

    Tauler, Roma; Parastar, Hadi

    2018-03-23

    This review aims to demonstrate the ability of multivariate chemometric methods to analyze Big (Bio)Chemical Data (BBCD) and to show some of the more important challenges of modern analytical research. In this review, the capabilities and versatility of chemometric methods are discussed in light of the BBCD challenges encountered in chromatographic, spectroscopic and hyperspectral imaging measurements, with an emphasis on their application to omics sciences. In addition, insights and perspectives on how to address the analysis of BBCD are provided, along with a discussion of the procedures necessary to obtain more reliable qualitative and quantitative results. The importance of Big Data and its relevance to (bio)chemistry are first discussed. Then, analytical tools that can produce BBCD are presented, along with the basics needed to understand the prospects and limitations of chemometric techniques applied to BBCD. Finally, the significance of combining chemometric approaches with BBCD analysis in different chemical disciplines is highlighted with some examples. In this paper, we have tried to cover some of the applications of big data analysis in the (bio)chemistry field; however, this coverage is not exhaustive and does not include everything done in the field. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Redox-capacitor to connect electrochemistry to redox-biology.

    PubMed

    Kim, Eunkyoung; Leverage, W Taylor; Liu, Yi; White, Ian M; Bentley, William E; Payne, Gregory F

    2014-01-07

    It is well-established that redox-reactions are integral to biology for energy harvesting (oxidative phosphorylation), immune defense (oxidative burst) and drug metabolism (phase I reactions), yet there is emerging evidence that redox may play broader roles in biology (e.g., redox signaling). A critical challenge is the need for tools that can probe biologically-relevant redox interactions simply, rapidly and without the need for a comprehensive suite of analytical methods. We propose that electrochemistry may provide such a tool. In this tutorial review, we describe recent studies with a redox-capacitor film that can serve as a bio-electrode interface that can accept, store and donate electrons from mediators commonly used in electrochemistry and also in biology. Specifically, we (i) describe the fabrication of this redox-capacitor from catechols and the polysaccharide chitosan, (ii) discuss the mechanistic basis for electron exchange, (iii) illustrate the properties of this redox-capacitor and its capabilities for promoting redox-communication between biology and electrodes, and (iv) suggest the potential for enlisting signal processing strategies to "extract" redox information. We believe these initial studies indicate broad possibilities for enlisting electrochemistry and signal processing to acquire "systems level" redox information from biology.

  10. t4 Workshop Report*

    PubMed Central

    Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas

    2017-01-01

    Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however – from experimental design, sample preparation, and metabolite identification to bioinformatics data-mining – is urgently needed to assure both the quality of metabolomics data and the reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in safety sciences, and even proper scientific use of these technologies, demand quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals for this workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools. PMID:26536290

  11. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  12. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across its pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  13. Perspectives and reflections on the practice of behaviour change communication for infant and young child feeding.

    PubMed

    Pelto, Gretel H; Martin, Stephanie L; van Liere, Marti J; Fabrizio, Cecilia S

    2016-04-01

    Behaviour change communication (BCC) is a critical component of infant and young child feeding (IYCF) interventions. In this study we asked BCC practitioners working in low- and middle-income countries to participate in an examination of BCC practice. We focus here on results of their personal reflections related to larger issues of practice. We used a combination of iterative triangulation and snowball sampling procedures to obtain a sample of 29 BCC professionals. Major themes include: (1) participants use tools and guidelines to structure their work, and many consider their organisation's tools to be their most important contribution to the field; (2) they value research to facilitate programme design and implementation; (3) half felt research needed to increase; (4) they have a strong commitment to respecting cultural beliefs and culturally appropriate programming; (5) they are concerned about the lack of a strong theoretical foundation for their work. Based on participants' perspectives and the authors' reflections, we identified the following needs: (1) conducting a systematic examination of the alternative theoretical structures that are available for nutrition BCC, followed by a review of the evidence base and suggestions for future programmatic research to fill the gaps in knowledge; (2) developing a checklist of common patterns to facilitate efficiency in formative research; (3) developing an analytic compendium of current IYCF BCC guidelines and tools; (4) developing tools and guidelines that cover the full programme process, including use of innovative channels to support 'scaling up nutrition'; and (5) continuing to support programmes of proven effectiveness. © 2015 John Wiley & Sons Ltd.

  14. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL has a variety of software tools on the Peregrine system, including debugger and performance-analysis tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.

  15. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  16. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  17. Modeling human comprehension of data visualizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzen, Laura E.; Haass, Michael Joseph; Divis, Kristin Marie

    This project was inspired by two needs. The first is a need for tools to help scientists and engineers to design effective data visualizations for communicating information, whether to the user of a system, an analyst who must make decisions based on complex data, or in the context of a technical report or publication. Most scientists and engineers are not trained in visualization design, and they could benefit from simple metrics to assess how well their visualization's design conveys the intended message. In other words, will the most important information draw the viewer's attention? The second is the need for cognition-based metrics for evaluating new types of visualizations created by researchers in the information visualization and visual analytics communities. Evaluating visualizations is difficult even for experts. However, all visualization methods and techniques are intended to exploit the properties of the human visual system to convey information efficiently to a viewer. Thus, developing evaluation methods that are rooted in the scientific knowledge of the human visual system could be a useful approach. In this project, we conducted fundamental research on how humans make sense of abstract data visualizations, and how this process is influenced by their goals and prior experience. We then used that research to develop a new model, the Data Visualization Saliency Model, that can make accurate predictions about which features in an abstract visualization will draw a viewer's attention. The model is an evaluation tool that can address both of the needs described above, supporting both visualization research and Sandia mission needs.

  18. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand and perform, and it is thus hard to make efficient, effective, and informed decisions. The management involves a multi-faceted operation that requires the most robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate essential needs in the following aspects: 1) Better understanding and enforcement of the complex inspection process that can bridge the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (e.g., infrared imaging, aerial photos and ground-mounted LIDAR) with domain application knowledge to support a machine-understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through a flexible Service-oriented Architecture (SOA) framework to compose and provide services on demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring, both periodically (annually, monthly, or even daily if needed) and after extreme events.

  19. An integrative framework for sensor-based measurement of teamwork in healthcare

    PubMed Central

    Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J

    2015-01-01

    There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. PMID:25053579

  20. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost-effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and the collection of a substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data, the main bottleneck that slows throughput is the time-consuming analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions, but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
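
    The authors' R tool is not reproduced here. As a hedged sketch of the kind of summary such a tool produces from multi-fish tracking data, the snippet below reduces synthetic trajectories to one simple shoaling measure, the mean inter-individual distance per frame; the arena size, number of fish, and random-walk trajectories are assumptions.

```python
# Hedged sketch: summarizing multi-fish tracking data with a simple shoaling
# metric (mean inter-individual distance per frame); synthetic trajectories,
# not the authors' R tool.
import numpy as np

rng = np.random.default_rng(7)
n_frames, n_fish = 1000, 6
# Random-walk trajectories in a 30 x 30 cm arena (toy data).
xy = np.cumsum(rng.normal(0, 0.2, (n_frames, n_fish, 2)), axis=0) + 15.0
xy = np.clip(xy, 0.0, 30.0)

# Pairwise distances between fish in every frame.
diff = xy[:, :, None, :] - xy[:, None, :, :]           # (frames, fish, fish, 2)
dist = np.linalg.norm(diff, axis=-1)
iu = np.triu_indices(n_fish, k=1)
mean_iid = dist[:, iu[0], iu[1]].mean(axis=1)          # one value per frame

print(f"mean inter-individual distance: {mean_iid.mean():.2f} cm "
      f"(s.d. across frames {mean_iid.std():.2f} cm)")
```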

  1. Towards good dementia care: Awareness and uptake of an online Dementia Pathways tool for rural and regional primary health practitioners.

    PubMed

    Ollerenshaw, Alison; Wong Shee, Anna; Yates, Mark

    2018-04-01

    To explore the awareness and usage of an online dementia pathways tool (including a decision tree and region-specific dementia services) for primary health practitioners (GPs and nurses) in regional Victoria. Quantitative pilot study using surveys and Google Analytics. A large regional area (48 000 square kilometres, population 220 000) in Victoria. Two hundred and sixty-three GPs and 160 practice nurses were invited to participate, with 42 respondents (GPs, n = 21; practice nurses, n = 21). Primary care practitioners' awareness and usage of the dementia pathways tool. Survey respondents who had used the tool (n = 14) reported accessing information about diagnosis, management and referral. Practitioners reported improvements in knowledge, skills and confidence about core dementia topics. There were 9683 page views between August 2013 and February 2015 (monthly average: 509 page views). The average time spent on a page was 2.03 min, with many visitors (68%) spending more than 4 min at the site. This research demonstrates that the tool has been well received by practitioners and has been consistently used since its launch. Health practitioners valued the content and the availability of local resources. Primary health practitioners reported that the dementia pathways tool provided access to region-specific referral and management resources for all stages of dementia. Such tools have broad transferability to other health areas, with further research needed to determine their contribution to learning in the practice setting and over time. © 2017 National Rural Health Alliance Inc.

  2. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  3. Structuring modeling and simulation analysis for evacuation planning and operations.

    DOT National Transportation Integrated Search

    2009-06-01

    This document is intended to provide guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in evacuation planning and operations. It is often unclear what kind of analytical approach may be of most value, ...

  4. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  5. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  6. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core for more detailed simulations; the ray-tracing core uses the SolTrace simulation engine.

  7. Where Big Data and Prediction Meet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, James; Brase, Jim M.; Hart, Bill

    Our ability to assemble and analyze massive data sets, often referred to under the title of "big data", is an increasingly important tool for shaping national policy. This in turn has introduced issues from privacy concerns to cyber security. But as IBM's John Kelly emphasized in the last Innovation, making sense of the vast arrays of data will require radically new computing tools. In the past, technologies and tools for analysis of big data were viewed as quite different from the traditional realm of high performance computing (HPC) with its huge models of phenomena such as global climate or supporting the nuclear test moratorium. Looking ahead, this will change with very positive benefits for both worlds. Societal issues such as global security, economic planning and genetic analysis demand increased understanding that goes beyond existing data analysis and reduction. The modeling world often produces simulations that are complex compositions of mathematical models and experimental data. This has resulted in outstanding successes such as the annual assessment of the state of the US nuclear weapons stockpile without underground nuclear testing. Ironically, while there were historically many tests conducted, this body of data provides only modest insight into the underlying physics of the system. A great deal of emphasis was thus placed on the level of confidence we can develop for the predictions. As data analytics and simulation come together, there is a growing need to assess the confidence levels in both the data being gathered and the complex models used to make predictions. An example of this is assuring the security or optimizing the performance of critical infrastructure systems such as the power grid. If one wants to understand the vulnerabilities of the system or the impacts of predicted threats, full-scale tests of the grid against threat scenarios are unlikely. Preventive measures would need to be predicated on well-defined margins of confidence in order to take mitigating actions that could have wide-ranging impacts. There is a rich opportunity for interaction and exchange between the HPC simulation and data analytics communities.

  8. Is food allergen analysis flawed? Health and supply chain risks and a proposed framework to address urgent analytical needs.

    PubMed

    Walker, M J; Burns, D T; Elliott, C T; Gowland, M H; Mills, E N Clare

    2016-01-07

    Food allergy is an increasing problem for those affected, their families or carers, the food industry and for regulators. The food supply chain is highly vulnerable to fraud involving food allergens, risking fatalities and severe reputational damage to the food industry. Many facets are being pursued to ameliorate the difficulties, including better food labelling and the concept of thresholds of elicitation of allergy symptoms as risk management tools. These efforts depend to a high degree on the ability to reliably detect and quantify food allergens; yet all current analytical approaches exhibit severe deficiencies that jeopardise the production of accurate results, particularly in terms of the risks of false-positive and false-negative reporting. If we fail to realise the promise of current risk assessment and risk management of food allergens through lack of the ability to measure food allergens reproducibly and with traceability to an international unit of measurement, the analytical community will have failed a significant societal challenge. Three distinct but interrelated areas of analytical work are urgently needed to address the substantial gaps identified: (a) a coordinated international programme for the production of properly characterised clinically relevant reference materials and calibrants for food allergen analysis; (b) an international programme to widen the scope of proteomics and genomics bioinformatics for the genera containing the major allergens to address problems in ELISA, MS and DNA methods; (c) the initiation of a coordinated international programme leading to reference methods for allergen proteins that provide results traceable to the SI. This article describes in more detail food allergy, the risks of inapplicable or flawed allergen analyses with examples, and a proposed framework, including clinically relevant incurred allergen concentrations, to address the analytical requirements that are currently unmet and urgently needed. Support for the above recommendations from food authorities, business organisations and National Measurement Institutes is important; however, transparent international coordination is essential. Thus our recommendations are primarily addressed to the European Commission, the Health and Food Safety Directorate, DG Santé. A global multidisciplinary consortium is required to provide a curated suite of data including genomic and proteomic data on key allergenic food sources, made publicly available online.

  9. Langley Ground Facilities and Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Kegelman, Jerome T.; Kilgore, William A.

    2010-01-01

    A strategic approach for retaining and more efficiently operating the essential Langley Ground Testing Facilities in the 21st Century is presented. This effort takes advantage of the previously completed and ongoing studies at the Agency and National levels. This integrated approach takes into consideration the overall decline in test business base within the nation and reduced utilization in each of the Langley facilities with capabilities to test in the subsonic, transonic, supersonic, and hypersonic speed regimes. The strategy accounts for capability needs to meet the Agency programmatic requirements and strategic goals and to execute test activities in the most efficient and flexible facility operating structure. The structure currently being implemented at Langley offers agility to right-size our capability and capacity from a national perspective, to accommodate the dynamic nature of the testing needs, and will address the influence of existing and emerging analytical tools for design. The paradigm for testing in the retained facilities is to efficiently and reliably provide more accurate and high-quality test results at an affordable cost to support design information needs for flight regimes where the computational capability is not adequate and to verify and validate the existing and emerging computational tools. Each of the above goals is planned to be achieved while keeping in mind the increasing small-industry customer base engaged in developing unpiloted aerial vehicles and commercial space transportation systems.

  10. Immunoaffinity capillary electrophoresis as a powerful strategy for the quantification of low-abundance biomarkers, drugs, and metabolites in biological matrices

    PubMed Central

    Guzman, Norberto A.; Blanc, Timothy; Phillips, Terry M.

    2009-01-01

    In the last few years, there has been a greater appreciation by the scientific community of how separation science has contributed to the advancement of biomedical research. Despite past contributions in facilitating several biomedical breakthroughs, separation sciences still urgently need the development of improved methods for the separation and detection of biological and chemical substances. In particular, the challenging task of quantifying small molecules and biomolecules, found in low abundance in complex matrices (e.g., serum), is an area especially in need of new high-efficiency techniques. The tandem or on-line coupling of highly selective antibody capture agents with the high-resolving power of CE is being recognized as a powerful analytical tool for the enrichment and quantification of ultra-low abundance analytes in complex matrices. This development will have a significant impact on the identification and characterization of many putative biomarkers and on biomedical research in general. Immunoaffinity CE (IACE) technology is rapidly emerging as the most promising method for the analysis of low-abundance biomarkers; its power comes from a three-step procedure: (i) bioselective adsorption and (ii) subsequent recovery of compounds from an immobilized affinity ligand followed by (iii) separation of the enriched compounds. This technology is highly suited to automation and can be engineered as a multiplex instrument capable of routinely performing hundreds of assays per day. Furthermore, a significant enhancement in sensitivity can be achieved for the purified and enriched affinity-targeted analytes. Thus, a compound that exists in a complex biological matrix at a concentration far below its LOD is easily brought to well within its range of quantification. The present review summarizes several applications of IACE, as well as a chronological description of the improvements made in the fabrication of the analyte concentrator-microreactor device leading to the development of a multidimensional biomarker analyzer. PMID:18646282

  11. Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory

    NASA Astrophysics Data System (ADS)

    Bozkaya, Uǧur

    2013-09-01

    Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011); doi: 10.1063/1.3665134] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in the case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD, as in the case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations; however, for OMP3 one does not, since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy, we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.
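
    As a hedged aside (a generic argument sketched here for context, not a derivation taken from the paper): the reason stationary parameters require no response equations is the chain rule. For an energy functional E(x; p) whose parameters p satisfy the stationarity condition at the solution, the response term vanishes,

        \[
        \frac{dE}{dx}
          \;=\; \frac{\partial E}{\partial x}
          \;+\; \frac{\partial E}{\partial \mathbf{p}}\cdot\frac{d\mathbf{p}}{dx}
          \;=\; \frac{\partial E}{\partial x}
        \qquad\text{whenever}\qquad
        \frac{\partial E}{\partial \mathbf{p}} \;=\; 0 ,
        \]

    so no coupled-perturbed (Z-vector) equations are needed for those parameters. Non-stationary parameters, such as the CCSD amplitudes, do not enjoy this simplification, which is consistent with the λ-amplitude and orbital Z-vector equations appearing in the CCSD gradient but not in the OMP3 gradient.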

  12. Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory.

    PubMed

    Bozkaya, Uğur

    2013-09-14

    Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011)] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in the case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD, as in the case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations; however, for OMP3 one does not, since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy, we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.

  13. Benefits of an automated GLP final report preparation software solution.

    PubMed

    Elvebak, Larry E

    2011-07-01

    The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.

  14. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
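
    The core idea above, computing an aggregate without any party revealing its inputs, can be illustrated with additive secret sharing (a minimal Python sketch, not the protocols used in the pilots; the party names and counts are hypothetical):

        import secrets

        MODULUS = 2**61 - 1  # large prime; all share arithmetic is done modulo this value

        def share(value, n_parties):
            """Split `value` into n additive shares that sum to value mod MODULUS."""
            shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % MODULUS)
            return shares

        def reconstruct(shares):
            return sum(shares) % MODULUS

        # Each data holder secret-shares its private value among all parties.
        private_values = {"hospital_A": 120, "hospital_B": 340, "hospital_C": 95}
        n = len(private_values)
        all_shares = {name: share(v, n) for name, v in private_values.items()}

        # Party i locally sums the i-th share it received from every data holder;
        # only these partial sums are exchanged, never the raw inputs.
        partial_sums = [sum(all_shares[name][i] for name in all_shares) % MODULUS
                        for i in range(n)]

        print(reconstruct(partial_sums))  # 555, the total, with no input revealed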

  15. Weathering the Storm: Developing a Spatial Data Infrastructure and Online Research Platform for Oil Spill Preparedness

    NASA Astrophysics Data System (ADS)

    Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.

    2016-12-01

    Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, which threaten our oceans and coasts require an understanding of the dynamics and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continues to push into harsher, more extreme environments where risks and uncertainty increase. However, working with these large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to research. In order to address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines a spatial data infrastructure and an online research platform to manage, process, analyze, and share these large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks to the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability of and access to data, as well as allow for the rapid analysis and effective communication of analytical results to aid a range of decision-making needs.

  16. Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation

    PubMed Central

    Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott

    2016-01-01

    Objective To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients’ risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217
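
    The modelling pattern described above can be sketched as follows: a multivariable logistic regression yields each patient's admission probability, and the expected daily bed need is the sum of those probabilities. The feature names and data are invented for illustration and are not the study's variables or results (Python, with scikit-learn and pandas assumed available):

        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        # Hypothetical training data: one row per catheterization patient.
        train = pd.DataFrame({
            "age":           [72, 55, 64, 81, 49, 68],
            "male":          [1, 0, 1, 1, 0, 1],
            "invasive_proc": [1, 0, 1, 1, 0, 0],
            "history_chf":   [1, 0, 0, 1, 0, 1],
            "admitted":      [1, 0, 1, 1, 0, 0],
        })
        features = ["age", "male", "invasive_proc", "history_chf"]
        model = LogisticRegression().fit(train[features], train["admitted"])

        # Patients scheduled for one day: the forecast is the sum of their
        # individual admission probabilities.
        today = pd.DataFrame({
            "age": [70, 58, 77], "male": [1, 0, 1],
            "invasive_proc": [1, 0, 1], "history_chf": [0, 0, 1],
        })
        p_admit = model.predict_proba(today[features])[:, 1]
        print(f"expected beds needed: {p_admit.sum():.1f}")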

  17. Analytical technique to address terrorist threats by chemical weapons of mass destruction

    NASA Astrophysics Data System (ADS)

    Dempsey, Patrick M.

    1997-01-01

    Terrorism is no longer an issue without effect on the American mind. We now live with the same concerns and fears that have been commonplace in other developed and third world countries for a long time. Citizens of other countries have long lived with the specter of terrorism, and now the U.S. needs to be concerned about and prepared for terrorist activities. The terrorist has the ability to cause great destructive effects by focusing their effort on unaware and unprepared civilian populations. Attacks can range from simple explosives to sophisticated nuclear, chemical and biological weapons. Intentional releases of hazardous chemicals or chemical warfare agents pose a great threat because of their ready availability and/or ease of production, and their ability to cause widespread damage. As this battlefront changes from defined conflicts and enemies to unnamed terrorists, we must implement the proper analytical tools to provide a fast and efficient response. Each chemical used in a terrorist weapon leaves behind a chemical signature that can be used to identify the materials involved and possibly lead investigators to the source and to those responsible. New tools to provide fast and accurate detection of battlefield chemical and biological agent attacks are emerging. Gas chromatography/mass spectrometry (GC/MS) is one of these tools that has found increasing use by the military to respond to chemical agent attacks. As the technology becomes smaller and more portable, it can be used by law enforcement personnel to identify suspected terrorist releases and to help prepare the response: define contaminated areas for evacuation and safety concerns, identify the proper treatment of exposed or affected civilians, and suggest decontamination and cleanup procedures.

  18. The Stanford Data Miner: a novel approach for integrating and exploring heterogeneous immunological data.

    PubMed

    Siebert, Janet C; Munsil, Wes; Rosenberg-Hasson, Yael; Davis, Mark M; Maecker, Holden T

    2012-03-28

    Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data.
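
    The OLAP-style aggregation and segregation described above can be approximated with a small pivot-table sketch (a simplification in Python/pandas; the production system uses data-warehousing and business-intelligence software, and the analyte, grouping variables and values below are hypothetical):

        import pandas as pd

        # Hypothetical integrated table: assay results joined to clinical and
        # demographic metadata via sample ID.
        df = pd.DataFrame({
            "sample_id": [1, 2, 3, 4, 5, 6],
            "gender":    ["F", "M", "F", "M", "F", "M"],
            "treatment": ["drug", "drug", "placebo", "placebo", "drug", "placebo"],
            "analyte":   ["IL6"] * 6,
            "value":     [3.2, 5.1, 2.8, 4.9, 3.6, 5.4],
        })

        # Analyte in rows, treatment group in columns, segregated by gender,
        # aggregated as mean / median / N.
        cube = pd.pivot_table(
            df, values="value",
            index=["analyte", "gender"], columns="treatment",
            aggfunc=["mean", "median", "count"],
        )
        print(cube)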

  19. The Stanford Data Miner: a novel approach for integrating and exploring heterogeneous immunological data

    PubMed Central

    2012-01-01

    Background Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Methods Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Results Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. Conclusions We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data. PMID:22452993

  20. Bio-TDS: bioscience query tool discovery system.

    PubMed

    Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M

    2017-01-04

    Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12,000 analytic tool descriptions integrated from well-established, community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
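
    As a rough sketch of free-text tool retrieval (a bag-of-words similarity ranking, far simpler than the Bio-TDS ontology and natural language processing workflow; the tool names and descriptions below are placeholders):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy catalogue of tool descriptions (the real system integrates >12,000).
        tools = {
            "samtools":     "manipulate alignments in SAM/BAM format, sort and index mapped reads",
            "bwa":          "map low-divergent sequencing reads against a large reference genome",
            "limma":        "differential expression analysis of microarray and RNA-seq data",
            "cellprofiler": "measure and analyze cell images for bio-imaging experiments",
        }
        query = "which tool aligns sequencing reads to a reference genome?"

        vectorizer = TfidfVectorizer(stop_words="english")
        matrix = vectorizer.fit_transform(list(tools.values()) + [query])
        query_vec, tool_vecs = matrix[len(tools)], matrix[:len(tools)]
        scores = cosine_similarity(query_vec, tool_vecs).ravel()

        # Rank tool descriptions by similarity to the free-text question.
        for name, score in sorted(zip(tools, scores), key=lambda x: -x[1]):
            print(f"{name}: {score:.2f}")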

  1. Assessment of catchments' flooding potential: a physically-based analytical tool

    NASA Astrophysics Data System (ADS)

    Botter, G.; Basso, S.; Schirmer, M.

    2016-12-01

    The assessment of the flooding potential of river catchments is critical in many research and applied fields, ranging from river science and geomorphology to urban planning and the insurance industry. Predicting the magnitude and frequency of floods is key to preventing and mitigating the negative effects of high flows, and has therefore long been the focus of hydrologic research. Here, the recurrence intervals of seasonal flow maxima are estimated through a novel physically-based analytic approach, which links the extremal distribution of streamflows to the stochastic dynamics of daily discharge. An analytical expression of the seasonal flood-frequency curve is provided, whose parameters embody climate and landscape attributes of the contributing catchment and can be estimated from daily rainfall and streamflow data. Only one parameter, which expresses catchment saturation prior to rainfall events, needs to be calibrated on the observed maxima. The method has been tested in a set of catchments featuring heterogeneous daily flow regimes. The model is able to reproduce the characteristic shapes of flood-frequency curves emerging in erratic and persistent flow regimes and provides good estimates of seasonal flow maxima in different climatic regions. Performance remains stable when estimating the magnitude of events with return times longer than the available sample size. This makes the approach especially valuable for regions affected by data scarcity.
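
    For context, the empirical counterpart of such a flood-frequency curve, i.e. the recurrence intervals of observed seasonal maxima against which an analytical model can be checked or calibrated, can be estimated with standard plotting positions (a Python sketch; the discharge values below are invented):

        import numpy as np

        # Hypothetical seasonal maxima of daily discharge (m^3/s), one per year.
        seasonal_maxima = np.array([112., 87., 230., 145., 98., 301., 176., 123., 210., 95.])

        # Empirical recurrence intervals via Weibull plotting positions:
        # T = (n + 1) / rank, with rank 1 for the largest observed maximum.
        sorted_q = np.sort(seasonal_maxima)[::-1]
        ranks = np.arange(1, len(sorted_q) + 1)
        return_period = (len(sorted_q) + 1) / ranks

        for q, T in zip(sorted_q, return_period):
            print(f"{q:6.1f} m^3/s  ~ {T:4.1f}-season return period")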

  2. Analytical Protein Microarrays: Advancements Towards Clinical Applications

    PubMed Central

    Sauer, Ursula

    2017-01-01

    Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence rapid measurements, and the possibility of functional integration. So far, protein microarrays have mainly been used for fundamental studies in molecular and cell biology, while their potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises as to what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis times, a lack of high-quality antibodies and validated reagents, a lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solving these problems. PMID:28146048

  3. Manipulability, force, and compliance analysis for planar continuum manipulators

    NASA Technical Reports Server (NTRS)

    Gravagne, Ian A.; Walker, Ian D.

    2002-01-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
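
    For readers unfamiliar with these constructions, the classical rigid-link versions of the velocity and force ellipsoids follow directly from the singular value decomposition of the manipulator Jacobian; the sketch below uses an invented Jacobian and shows only the standard formulation that the paper extends to continuum robots:

        import numpy as np

        # Hypothetical 2x3 Jacobian of a planar arm (3 inputs, 2 task-space rates).
        J = np.array([[0.8, 0.3, 0.1],
                      [0.1, 0.6, 0.4]])

        # Manipulability ellipsoid: unit-norm joint rates map to a task-space
        # ellipsoid whose semi-axes are the singular values of J, oriented
        # along the left singular vectors.
        U, sigma, _ = np.linalg.svd(J)
        print("velocity ellipsoid semi-axes:", sigma)
        print("axis directions (columns):\n", U)

        # Force ellipsoid is the dual: semi-axes 1/sigma along the same directions.
        print("force ellipsoid semi-axes:", 1.0 / sigma)

        # Yoshikawa manipulability measure (proportional to ellipsoid volume).
        print("manipulability:", np.prod(sigma))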

  4. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
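
    The practical benefit of analytic derivatives over finite differencing can be seen on a toy problem (plain SciPy, not PyCycle or OpenMDAO; the objective below is an arbitrary smooth test function, not an engine model):

        import numpy as np
        from scipy.optimize import minimize

        def f(x):  # stand-in for a cycle-analysis figure of merit
            return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - x[0] ** 2) ** 2

        def grad_f(x):  # analytic gradient: exact, no step-size tuning
            return np.array([
                2.0 * (x[0] - 1.0) - 40.0 * x[0] * (x[1] - x[0] ** 2),
                20.0 * (x[1] - x[0] ** 2),
            ])

        x0 = np.array([-1.2, 1.0])
        res_analytic = minimize(f, x0, jac=grad_f, method="BFGS")
        res_fd = minimize(f, x0, method="BFGS")  # gradient by finite differences

        # Same optimum, but the analytic-gradient run needs far fewer
        # objective evaluations and is free of finite-difference noise.
        print(res_analytic.x, res_analytic.nfev)
        print(res_fd.x, res_fd.nfev)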

  5. Manipulability, force, and compliance analysis for planar continuum manipulators.

    PubMed

    Gravagne, Ian A; Walker, Ian D

    2002-06-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  6. Health needs: the interface between the discourse of health professionals and victimized women

    PubMed Central

    de Oliveira, Rebeca Nunes Guedes; da Fonseca, Rosa Maria Godoy Serpa

    2015-01-01

    Objective: to understand the limits and the evaluative possibilities of the Family Health Strategy regarding the recognition of the health needs of women who experience violence. Method: a qualitative study, grounded in a gender perspective, which adopted health needs as its analytical category. The data were collected through interviews with health professionals and with women who used a health service, and were analyzed using discourse analysis. Results: the meeting between the discourses of the women who use the services and those of the health professionals revealed human needs, such as autonomy and bonding, as their interface. The understanding of needs was limited to the recognition of physical and psychological health problems, and the professionals' discourses revealed, as an important limitation of practice, a predominant recognition of needs for maintaining life over essentially human needs. Conclusion: emphasis is placed on the gender perspective as a tool that must be incorporated into routine professional health practices in order to confirm or deny the transformative character of the care provided with respect to recognizing and confronting women's health needs. PMID:26039301

  7. Revisitation of the dipole tracer test for heterogeneous porous formations

    NASA Astrophysics Data System (ADS)

    Zech, Alraune; D'Angelo, Claudia; Attinger, Sabine; Fiori, Aldo

    2018-05-01

    In this paper, a new analytical solution for interpreting dipole tests in heterogeneous media is derived by associating the shape of the tracer breakthrough curve with the log-conductivity variance. Three illustrative examples show how the solution can be used to interpret dipole field tests for geostatistical aquifer characterization. The analytical solution for the tracer breakthrough curve at the pumping well in a dipole tracer test is developed by considering a perfectly stratified formation. The analysis is carried out using the travel time of a generic solute particle from the injection to the pumping well. Injection conditions are adapted to different possible field settings. Solutions are presented for resident and flux-proportional injection modes as well as for an instantaneous pulse of solute and continuous solute injection. The analytical form of the solution allows a detailed investigation of the impact of heterogeneity, the tracer input conditions, and ergodicity conditions at the well. The impact of heterogeneity manifests itself in a significant spreading of solute particles that increases the natural tendency to spreading induced by the dipole setup. Furthermore, with increasing heterogeneity the number of layers needed to reach ergodic conditions becomes larger. Thus, dipole tests in highly heterogeneous aquifers might take place under non-ergodic conditions, with the result that the log-conductivity variance is underestimated. The method is a promising geostatistical analysis tool, being the first analytical solution for dipole tracer test analysis that takes the heterogeneity of hydraulic conductivity into account.

  8. Experimental study and analytical model of deformation of magnetostrictive films as applied to mirrors for x-ray space telescopes.

    PubMed

    Wang, Xiaoli; Knapp, Peter; Vaynman, S; Graham, M E; Cao, Jian; Ulmer, M P

    2014-09-20

    The desire for continuously gaining new knowledge in astronomy has pushed the frontier of engineering methods to deliver lighter, thinner, higher quality mirrors at an affordable cost for use in an x-ray observatory. To address these needs, we have been investigating the application of magnetic smart materials (MSMs) deposited as a thin film on mirror substrates. MSMs have some interesting properties that make the application of MSMs to mirror substrates a promising solution for making the next generation of x-ray telescopes. Due to the ability to hold a shape with an impressed permanent magnetic field, MSMs have the potential to be the method used to make light weight, affordable x-ray telescope mirrors. This paper presents the experimental setup for measuring the deformation of the magnetostrictive bimorph specimens under an applied magnetic field, and the analytical and numerical analysis of the deformation. As a first step in the development of tools to predict deflections, we deposited Terfenol-D on the glass substrates. We then made measurements that were compared with the results from the analytical and numerical analysis. The surface profiles of thin-film specimens were measured under an external magnetic field with white light interferometry (WLI). The analytical model provides good predictions of film deformation behavior under various magnetic field strengths. This work establishes a solid foundation for further research to analyze the full three-dimensional deformation behavior of magnetostrictive thin films.

  9. Data and Tools | Concentrating Solar Power | NREL

    Science.gov Websites

    Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT(tm)), available for download. SolarPILOT combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy and ...

  10. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.

  11. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  12. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Astrophysics Data System (ADS)

    Jaggi, S.

    1993-02-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analysis, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as the Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics used to derive the performance parameters.
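
    One of the quoted figures of merit can be sketched from first principles: NETD is commonly taken as the noise-equivalent radiance divided by the derivative of band radiance with respect to scene temperature. The band limits, the NER value and the 300 K operating point below are illustrative assumptions, not ATTIRE parameters (Python sketch):

        import numpy as np

        h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

        def band_radiance(T, lam_lo=8e-6, lam_hi=12e-6, n=2000):
            """Planck radiance integrated over a spectral band [W m^-2 sr^-1]."""
            lam = np.linspace(lam_lo, lam_hi, n)
            spectral = (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))
            return np.trapz(spectral, lam)

        NER = 2e-2  # hypothetical noise-equivalent radiance, W m^-2 sr^-1

        # NETD = NER / (dL/dT), with dL/dT evaluated numerically at 300 K.
        T, dT = 300.0, 0.1
        dL_dT = (band_radiance(T + dT) - band_radiance(T - dT)) / (2 * dT)
        print(f"NETD ~ {NER / dL_dT * 1e3:.0f} mK")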

  13. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neurohuman immunodeficiency virus (HIV), strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable to recognize and eradicate latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance to eradicate and monitor neuroacquired immunodeficiency syndrome. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and to develop smart HIV-monitoring analytical systems for disease management.

  14. Physics of cosmological cascades and observable properties

    NASA Astrophysics Data System (ADS)

    Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.

    2017-04-01

    TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.

  15. Burden Calculator: a simple and open analytical tool for estimating the population burden of injuries.

    PubMed

    Bhalla, Kavi; Harrison, James E

    2016-04-01

    Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex. However, as commonly implemented, the methods include complex modelling and estimation. The aim was to provide a simple, open-source software tool that allows estimation of incidence-based DALYs due to injury, given data on the incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel. All calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALY estimates calculated in this way and in other ways, and by shared experience of its use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
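
    The calculation such a tool automates can be sketched for a single age/sex/cause cell: the incidence-based DALY count is the sum of years of life lost and years lived with disability. All numbers below are placeholder values, not the tool's default parameter set (Python sketch):

        # Hypothetical inputs for one age/sex/external-cause cell.
        deaths            = 12     # fatal injuries
        nonfatal_cases    = 480    # incident non-fatal injuries
        life_expectancy   = 38.5   # remaining standard life expectancy at death (years)
        disability_weight = 0.07   # severity weight (0 = full health, 1 = death)
        avg_duration      = 1.2    # average duration of disability (years)

        yll = deaths * life_expectancy                            # years of life lost
        yld = nonfatal_cases * disability_weight * avg_duration   # years lived with disability
        print(f"YLL = {yll:.0f}, YLD = {yld:.1f}, DALYs = {yll + yld:.1f}")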

  16. Analytical Plan for Roman Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  17. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    EPA Science Inventory

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  18. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
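
    The optimization layer can be sketched independently of the machine physics: a basic global-best particle swarm searches the three geometric variables within bounds, with a smooth placeholder standing in for the MEC torque-density evaluation (all bounds, coefficients and the surrogate objective below are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(0)

        # Design variables (mm): stator pole length, magnet length, rotor thickness.
        lb = np.array([10.0, 2.0, 5.0])
        ub = np.array([40.0, 8.0, 20.0])

        def torque_density(x):
            """Placeholder for the MEC evaluation; a smooth surrogate objective."""
            pole, magnet, rotor = x
            return -((pole - 28.0) ** 2 / 50 + (magnet - 5.5) ** 2 + (rotor - 12.0) ** 2 / 10)

        n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
        pos = rng.uniform(lb, ub, size=(n_particles, 3))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_val = np.array([torque_density(p) for p in pos])
        gbest = pbest[np.argmax(pbest_val)]

        for _ in range(n_iter):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lb, ub)  # enforce geometric bounds
            vals = np.array([torque_density(p) for p in pos])
            improved = vals > pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[np.argmax(pbest_val)]

        print("optimized [pole, magnet, rotor] (mm):", np.round(gbest, 2))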

  19. Reducing the barriers against analytical epidemiological studies in investigations of local foodborne disease outbreaks in Germany - a starter kit for local health authorities.

    PubMed

    Werber, D; Bernard, H

    2014-02-27

    Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point-source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent use is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point-source FBDO in Germany.
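
    The core computation such a data-entry-and-analysis tool performs can be sketched from a 2x2 exposure table: attack rates and the relative risk for a cohort design, or the odds ratio for a case-control design. The counts below are invented for illustration (Python sketch):

        # Hypothetical 2x2 table for one suspect food item at a point-source outbreak.
        ill_exposed,   well_exposed   = 30, 20   # ate the item
        ill_unexposed, well_unexposed = 5, 45    # did not eat the item

        # Cohort design: attack rates and relative risk.
        ar_exposed   = ill_exposed / (ill_exposed + well_exposed)
        ar_unexposed = ill_unexposed / (ill_unexposed + well_unexposed)
        relative_risk = ar_exposed / ar_unexposed

        # Case-control design on the same counts: odds ratio.
        odds_ratio = (ill_exposed * well_unexposed) / (ill_unexposed * well_exposed)

        print(f"attack rate exposed {ar_exposed:.0%}, unexposed {ar_unexposed:.0%}")
        print(f"relative risk = {relative_risk:.1f}, odds ratio = {odds_ratio:.1f}")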

  20. Multivariable Hermite polynomials and phase-space dynamics

    NASA Technical Reports Server (NTRS)

    Dattoli, G.; Torre, Amalia; Lorenzutta, S.; Maino, G.; Chiccoli, C.

    1994-01-01

    The phase-space approach to classical and quantum systems demands advanced analytical tools. Such an approach characterizes the evolution of a physical system through a set of variables that reduce to the canonically conjugate variables in the classical limit. It often happens that phase-space distributions can be written in terms of quadratic forms involving these variables. A significant analytical tool for treating such problems may come from the generalized many-variable Hermite polynomials, defined on quadratic forms in R^n. They form an orthonormal system in many dimensions and seem to be the natural tool for treating harmonic oscillator dynamics in phase space. In this contribution we discuss the properties of these polynomials and present some applications to physical problems.
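
    As a minimal illustration (restricted here, as an assumption for brevity, to the two-variable case; the paper works with the n-variable generalization defined on quadratic forms in R^n), Hermite polynomials of the Kampé de Fériet type are commonly defined by

        \[
        H_n(x,y) \;=\; n!\sum_{k=0}^{\lfloor n/2\rfloor}
        \frac{x^{\,n-2k}\,y^{\,k}}{(n-2k)!\,k!},
        \qquad
        \sum_{n=0}^{\infty}\frac{t^{n}}{n!}\,H_n(x,y)\;=\;e^{\,x t \,+\, y t^{2}},
        \]

    which reduce to the ordinary (physicists') Hermite polynomials through H_n(2x, -1) = H_n(x).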
