Sample records for analytical tools required

  1. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  2. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  3. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  4. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in programming languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.

  5. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.

  6. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  7. IBM's Health Analytics and Clinical Decision Support.

    PubMed

    Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W

    2014-08-15

    This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation.

  8. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  9. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  10. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  11. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  12. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  13. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  14. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  15. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.

  16. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable, optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values deviate significantly from those of the control situation in the hospital laboratory due to possible problems in the pre-analytical phase; (b) to aid these laboratories in rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above-mentioned problems. It has been tested on serum potassium, which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.

  17. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  18. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  19. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than did development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  20. Patent databases and analytical tools for space technology commercialization (Part 2)

    NASA Astrophysics Data System (ADS)

    Hulsey, William N., III

    2002-07-01

    A shift has occurred in the space industry that requires technology developers to understand the basics of intellectual property laws; global harmonization facilitates this understanding, and internet-based tools enable knowledge of these rights and the facts affecting them.

  1. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  2. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
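
    As a minimal illustration of the calculation this abstract describes, the sketch below applies the standard sigma-metric and quality goal index formulas (sigma = (TEa - |bias|) / CV, QGI = |bias| / (1.5 * CV)); the allowable-total-error, bias and CV values are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch (not the paper's code): standard sigma-metric and
# quality goal index (QGI) formulas for clinical chemistry IQC data.
# All TEa/bias/CV values below are hypothetical placeholders.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma level of an assay from allowable total error, bias and imprecision."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI < 0.8 points to imprecision; QGI > 1.2 points to inaccuracy."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Hypothetical level-1 IQC figures for two analytes
analytes = {
    "cholesterol": {"tea": 9.0, "bias": 4.5, "cv": 2.5},
    "potassium":   {"tea": 5.8, "bias": 1.2, "cv": 2.0},
}

for name, d in analytes.items():
    s = sigma_metric(d["tea"], d["bias"], d["cv"])
    q = quality_goal_index(d["bias"], d["cv"])
    print(f"{name}: sigma = {s:.1f}, QGI = {q:.2f}")
```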

  3. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  4. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
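
    The abstract's central point is that analytic derivatives make gradient-based optimization cheaper and more stable than finite-difference approximation. The sketch below illustrates that idea generically with SciPy; it is not Pycycle or OpenMDAO code, and the Rosenbrock objective is only a stand-in for an engine-cycle model.

```python
# Generic sketch (not Pycycle/OpenMDAO): supplying an analytic gradient to a
# gradient-based optimizer instead of letting it approximate derivatives by
# finite differences.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Rosenbrock function as a stand-in for an engine-cycle objective
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def analytic_grad(x):
    # Exact partial derivatives of the objective above
    dfdx0 = -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2)
    dfdx1 = 200 * (x[1] - x[0]**2)
    return np.array([dfdx0, dfdx1])

x0 = np.array([-1.2, 1.0])

# Derivatives approximated by finite differences (no jac supplied)
fd = minimize(objective, x0, method="BFGS")

# Analytic derivatives supplied explicitly
an = minimize(objective, x0, jac=analytic_grad, method="BFGS")

print("finite-difference: function evaluations =", fd.nfev)
print("analytic gradient: function evaluations =", an.nfev,
      ", gradient evaluations =", an.njev)
```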

  5. Rapid deletion plasmid construction methods for protoplast and Agrobacterium based fungal transformation systems

    USDA-ARS?s Scientific Manuscript database

    Increasing availability of genomic data and sophistication of analytical methodology in fungi has elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...

  6. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning by a cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries so that an appropriate model can be used according to user requirement in hard turning.

  7. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  8. Required, Practical, or Unnecessary? An Examination and Demonstration of Propensity Score Matching Using Longitudinal Secondary Data

    ERIC Educational Resources Information Center

    Padgett, Ryan D.; Salisbury, Mark H.; An, Brian P.; Pascarella, Ernest T.

    2010-01-01

    The sophisticated analytical techniques available to institutional researchers give them an array of procedures to estimate a causal effect using observational data. But as many quantitative researchers have discovered, access to a wider selection of statistical tools does not necessarily ensure construction of a better analytical model. Moreover,…

  9. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  10. Aviation Modeling and Simulation Needs and Requirements Workshop: January 27-28, 1999

    DOT National Transportation Integrated Search

    1999-01-01

    A two-day workshop was held at the Volpe Center on January 27-28, 1999. The purpose of the workshop was to: 1) identify and understand the requirements for analytical and planning tool initiatives that will give decision makers insight into the capac...

  11. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587

  12. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: EBC constituents occur at low concentrations, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better outcomes of disease. PMID:24266297

  13. Quality of Big Data in Healthcare

    DOE PAGES

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  14. Quality of Big Data in Healthcare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  15. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  16. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data, and complements data mining technologies, in which known patterns are mined for. With a human in the loop, visual analytics tools can also bring in domain knowledge and subject matter expertise. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data: structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  17. From Streaming Data to Streaming Insights: The Impact of Data Velocities on Mental Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Pike, William A.; Cook, Kristin A.

    The rise of Big Data has influenced the design and technical implementation of visual analytic tools required to handle the increased volumes, velocities, and varieties of data. This has required a set of data management and computational advancements to allow us to store and compute on such datasets. However, as the ultimate goal of visual analytic technology is to enable the discovery and creation of insights from the users, an under-explored area is understanding how these datasets impact their mental models. That is, how have the analytic processes and strategies of users changed? How have users changed their perception of how to leverage, and ask questions of, these datasets?

  18. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  19. MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.

    PubMed

    Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui

    2015-12-12

    Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptide and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include: limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source codes and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
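
    MRMPlus's own algorithms are not reproduced here; as a generic, hedged illustration of two of the figures of merit listed above, the sketch below estimates the limit of detection and lower limit of quantification from a calibration curve using the common residual-standard-deviation rule (LOD = 3.3 s / slope, LOQ = 10 s / slope) and reports R-squared as a linearity check. The concentrations and transition areas are made up.

```python
# Generic illustration (not MRMPlus's actual algorithm): calibration-curve
# based LOD/LOQ estimates and a simple linearity check.
import numpy as np

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])    # spiked peptide amounts (made up)
resp = np.array([2.1, 4.3, 10.2, 19.8, 41.0, 99.5])   # measured transition areas (made up)

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
s = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))   # residual standard deviation

lod = 3.3 * s / slope                                  # limit of detection
loq = 10.0 * s / slope                                 # lower limit of quantification
r2 = 1 - np.sum(residuals**2) / np.sum((resp - resp.mean())**2)  # linearity

print(f"slope = {slope:.3f}, R^2 = {r2:.4f}, LOD = {lod:.2f}, LOQ = {loq:.2f}")
```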

  20. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for a majority of organic, relatively lipophilic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions and strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  2. Bioinformatics Symposium of the Analytical Division of the American Chemical Society Meeting. Final Technical Report from 03/15/2000 to 03/14/2001 [sample pages of agenda, abstracts, index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Robert T.

    Sparked by the Human Genome Project, biological and biomedical research has become an information science. Information tools are now being generated for proteins, cell modeling, and genomics. The opportunity for analytical chemistry in this new environment is profound. New analytical techniques that can provide the information on genes, SNPs, proteins, protein modifications, cells, and cell chemistry are required. In this symposium, we brought together both informatics experts and leading analytical chemists to discuss this interface. Over 200 people attended this highly successful symposium.

  3. FDT 2.0: Improving scalability of the fuzzy decision tree induction tool - integrating database storage.

    PubMed

    Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W

    2014-12-01

    Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.

  4. A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  5. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires that were sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid phase microextraction and liquid phase microextraction-based procedures high, while liquid-liquid extraction, solid phase extraction and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
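
    For readers unfamiliar with the method named in this abstract, the sketch below is a minimal PROMETHEE II (net outranking flow) ranking with a linear preference function. The three alternatives, three criteria, weights and preference thresholds are illustrative placeholders, not the paper's 25 procedures and 9 criteria.

```python
# Minimal PROMETHEE II sketch with a linear preference function.
# Alternatives, criteria values, weights and thresholds are placeholders.
import numpy as np

# rows = alternatives (analytical procedures), cols = criteria (lower is better here)
X = np.array([
    [0.2,  50.0, 3.0],   # e.g. an SPME-based procedure
    [0.8, 200.0, 9.0],   # e.g. an LLE-based procedure
    [0.5, 120.0, 5.0],   # e.g. an SPE-based procedure
])
weights  = np.array([0.5, 0.3, 0.2])          # criterion weights, sum to 1
maximize = np.array([False, False, False])    # all criteria minimized in this toy case
p        = np.array([0.5, 100.0, 5.0])        # linear preference thresholds

n, _ = X.shape
pi = np.zeros((n, n))                         # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        d = X[a] - X[b]
        d = np.where(maximize, d, -d)         # orient so positive d favours a
        pref = np.clip(d / p, 0.0, 1.0)       # linear preference function
        pi[a, b] = np.dot(weights, pref)

phi_plus  = pi.sum(axis=1) / (n - 1)          # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)          # negative outranking flow
net_flow  = phi_plus - phi_minus              # PROMETHEE II net flow (higher = better)

for i in np.argsort(-net_flow):
    print(f"alternative {i}: net flow = {net_flow[i]:+.3f}")
```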

  6. Implementing bioinformatic workflows within the bioextract server

    USDA-ARS?s Scientific Manuscript database

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  7. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in Forensic Toxicology (alcohol consumption marker, congener alcohols, gamma hydroxy butyric acid, human insulin and C-peptide, creatinine, postmortal clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances will be discussed which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. Especially the validation parameters calibration model, analytical limits, accuracy (bias and precision) and matrix effects and recovery have to be approached differently. Highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
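
    As a hedged illustration of the conventional parameters that, as the abstract notes, must be approached differently for endogenous analytes, the sketch below computes bias, precision (CV) and a post-extraction-spike matrix effect from made-up replicate data; it is not the validation scheme proposed in the paper.

```python
# Illustrative sketch (not the paper's scheme): conventional formulas behind
# bias, precision (CV) and matrix effect. All replicate values are made up.
import statistics as stats

def bias_percent(measured, nominal):
    """Relative bias of the mean measured value against the nominal concentration."""
    return (stats.mean(measured) - nominal) / nominal * 100.0

def cv_percent(measured):
    """Coefficient of variation (imprecision) of replicate measurements."""
    return stats.stdev(measured) / stats.mean(measured) * 100.0

def matrix_effect_percent(area_matrix_spike, area_neat_standard):
    """Post-extraction spike area relative to neat standard area (ME%)."""
    return stats.mean(area_matrix_spike) / stats.mean(area_neat_standard) * 100.0

nominal = 10.0                              # spiked concentration (arbitrary units)
replicates = [9.4, 10.2, 9.8, 10.5, 9.9]    # hypothetical replicate results

print(f"bias = {bias_percent(replicates, nominal):+.1f} %")
print(f"CV   = {cv_percent(replicates):.1f} %")
print(f"ME   = {matrix_effect_percent([9500, 9700], [10000, 10200]):.1f} %")
```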

  8. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
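
    To make the "arrays of values associated with genomic or transcriptomic positions" idea concrete, the sketch below projects per-nucleotide counts from genomic onto transcript coordinates for a spliced, stranded feature. It is an illustrative toy, not Plastid's actual API; the function and data are invented.

```python
import numpy as np

def transcript_counts(chrom_counts, exons, strand):
    """Project per-nucleotide counts from genomic onto transcript coordinates.

    chrom_counts : 1-D array of counts, one entry per genomic position
    exons        : list of (start, end) half-open genomic intervals
    strand       : "+" or "-"
    """
    segments = [chrom_counts[start:end] for start, end in sorted(exons)]
    counts = np.concatenate(segments)
    if strand == "-":
        counts = counts[::-1]  # minus strand: transcript 5'->3' runs toward decreasing genomic coords
    return counts

# Toy data: coverage on a 100-nt chromosome and a two-exon transcript.
coverage = np.zeros(100, dtype=int)
coverage[10:20] = 3
coverage[40:50] = 7
tx = transcript_counts(coverage, exons=[(10, 20), (40, 50)], strand="+")
print(tx.shape, int(tx.sum()))     # (20,) 100
```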

  9. Concept design theory and model for multi-use space facilities: Analysis of key system design parameters through variance of mission requirements

    NASA Astrophysics Data System (ADS)

    Reynerson, Charles Martin

    This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.

  10. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase in sequencing data, which, in turn, requires efficient management of computational resources such as computing time and memory, as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
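
    The "mathematical operations between sets of genomic regions" mentioned above can be illustrated with a toy interval intersection. This sketch is not GenomicTools code (which is a C++ toolkit); it only shows the kind of region algebra involved.

```python
def intersect_regions(a, b):
    """Intersect two sorted, non-overlapping lists of half-open intervals."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        start = max(a[i][0], b[j][0])
        end = min(a[i][1], b[j][1])
        if start < end:
            out.append((start, end))
        # Advance whichever interval ends first.
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

print(intersect_regions([(0, 100), (200, 300)], [(50, 250)]))
# [(50, 100), (200, 250)]
```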

  11. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
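
    The advantage of analytic derivatives over finite differencing that the abstract reports can be illustrated on a toy problem. The code below is not PyCycle or OpenMDAO code; it is a generic comparison of an exact gradient with a central-difference approximation inside a simple gradient-descent loop, using an invented objective function.

```python
import numpy as np

def f(x):
    # Toy objective with a known minimum at (3, -1).
    return (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2

def grad_analytic(x):
    # Exact gradient, available in closed form for this toy problem.
    return np.array([2.0 * (x[0] - 3.0), 20.0 * (x[1] + 1.0)])

def grad_fd(x, h=1e-6):
    # Central finite differences: two extra function evaluations per variable.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

x = np.array([0.0, 0.0])
for _ in range(200):              # plain gradient descent with the analytic gradient
    x -= 0.04 * grad_analytic(x)

print(np.round(x, 4))                                   # approximately [3., -1.]
print(np.max(np.abs(grad_analytic(x) - grad_fd(x))))    # agreement check with FD
```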

  12. Exascale computing and big data

    DOE PAGES

    Reed, Daniel A.; Dongarra, Jack

    2015-06-25

    Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.

  13. Structural analyses for the modification and verification of the Viking aeroshell

    NASA Technical Reports Server (NTRS)

    Stephens, W. B.; Anderson, M. S.

    1976-01-01

    The Viking aeroshell is an extremely lightweight flexible shell structure that has undergone thorough buckling analyses in the course of its development. The analytical tools and modeling technique required to reveal the structural behavior are presented. Significant results are given which illustrate the complex failure modes not usually observed in simple models and analyses. Both shell-of-revolution analysis for the pressure loads and thermal loads during entry and a general shell analysis for concentrated tank loads during launch were used. In many cases fixes or alterations to the structure were required, and the role of the analytical results in determining these modifications is indicated.

  14. Exascale computing and big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, Daniel A.; Dongarra, Jack

    Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.

  15. Meeting report: Ocean 'omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013).

    PubMed

    Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F

    2014-06-15

    The National Science Foundation's EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on 'omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography.

  16. Meeting report: Ocean ‘omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013)

    PubMed Central

    Gilbert, Jack A; Dick, Gregory J.; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R. M.

    2014-01-01

    The National Science Foundation's EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on 'omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography. PMID:25197495

  17. The National energy modeling system

    NASA Astrophysics Data System (ADS)

    The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focused analytical tools to examine energy policy options. For the purposes of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is believed to be less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national, and regional energy modeling and analysis needs.

  18. Bio-TDS: bioscience query tool discovery system.

    PubMed

    Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M

    2017-01-04

    Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12 000 analytic tool descriptions integrated from well-established, community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Marshall Space Flight Center's Virtual Reality Applications Program 1993

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1993-01-01

    A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.

  20. Population genomics of fungal and oomycete pathogens

    USDA-ARS?s Scientific Manuscript database

    We are entering a new era in plant pathology where whole-genome sequences of many individuals of a pathogen species are becoming readily available. This era of pathogen population genomics will provide new opportunities and challenges, requiring new computational and analytical tools. Population gen...

  1. Electrochemical detection for microscale analytical systems: a review.

    PubMed

    Wang, Joseph

    2002-02-11

    As the field of chip-based microscale systems continues its rapid growth, there is an urgent need to develop compatible detection modes. Electrochemical detection offers considerable promise for such microfluidic systems, with features that include remarkable sensitivity, inherent miniaturization and portability, independence of optical path length or sample turbidity, low cost, low-power requirements and high compatibility with advanced micromachining and microfabrication technologies. This paper highlights recent advances, directions and key strategies in controlled-potential electrochemical detectors for miniaturized analytical systems. Subjects covered include the design and integration of the electrochemical detection system, its requirements and operational principles, common electrode materials, derivatization reactions, electrical-field decouplers, typical applications and future prospects. It is expected that electrochemical detection will become a powerful tool for microscale analytical systems and will facilitate the creation of truly portable (and possibly disposable) devices.

  2. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk using such tools. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.

  3. Single Cell Proteomics in Biomedicine: High-dimensional Data Acquisition, Visualization and Analysis

    PubMed Central

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-01-01

    New insights into cellular heterogeneity over the last decade have provoked the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features and limitations of the analytical methods, along with the biological questions they seek to answer, will be discussed. Particular attention will be given to those information theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. PMID:28128880

  4. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis-capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
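
    The script-command approach described above can be sketched generically: a dispatcher maps text commands to device actions, so a new analytical procedure is a new script rather than new program code. The sketch below is in Python (the original software is LabVIEW), and the device names and commands are hypothetical, not the paper's actual command set.

```python
def make_dispatcher():
    # Minimal in-memory stand-in for instrument state.
    state = {"valve": "closed", "pump_rate": 0.0}

    def set_valve(position):
        state["valve"] = position

    def set_pump(rate_ul_min):
        state["pump_rate"] = float(rate_ul_min)

    def acquire(seconds):
        print(f"acquiring for {seconds}s with {state}")

    # One table maps script keywords to device actions.
    commands = {"VALVE": set_valve, "PUMP": set_pump, "ACQUIRE": acquire}

    def run(script):
        for line in script.strip().splitlines():
            name, *args = line.split()
            commands[name](*args)   # unknown commands raise KeyError by design

    return run

run = make_dispatcher()
run("""
VALVE open
PUMP 25
ACQUIRE 10
""")
```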

  6. Structural Model Tuning Capability in an Object-Oriented Multidisciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Lung, Shun-fat; Pak, Chan-gi

    2008-01-01

    Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of dynamic properties of structures. Accurate rigid body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize the objective function and constraints such that the mass properties, the natural frequencies, and the mode shapes are matched to the target data as well as the mass matrix being orthogonalized.

  7. Structural Model Tuning Capability in an Object-Oriented Multidisciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Lung, Shun-fat; Pak, Chan-gi

    2008-01-01

    Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of dynamic properties of structures. Accurate rigid body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization [MDAO] tool is introduced to optimize the objective function and constraints such that the mass properties, the natural frequencies, and the mode shapes are matched to the target data as well as the mass matrix being orthogonalized.

  8. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach to landing task incorporating appropriate models for UH-1H aircraft, the environmental disturbances and the human pilot was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well known performance-work-load relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  9. Integrated Requirements Analysis and Technology Roadmaps

    NASA Technical Reports Server (NTRS)

    1997-01-01

    In fiscal year 1997, Strategic Insight performed analytical studies for NASA's Highly Reusable Space Transportation (HRST) program, creating program documents which illuminated technical requirements and critical research opportunities. Studies were performed to structure and confirm HRST's evolving technical requirements, building on Marshall's Phase 1 work, which defined HRST system concepts, analytical tools and high-level issues for assessment in Phase 2. Specifically, Strategic Insight: (1) Performed a requirements analysis to update HRST: An Advanced Concepts Study, Study Guidelines, Version 2.0 of January 22, 1996; only minor changes were recommended for the given parameters of interest to concept designers; (2) Conducted mini-workshops during HRST Working Group meetings on April 14-15, 1997 and July 22-24, 1997; and (3) Created structures for technology road maps of candidate HRST concepts, both subsystem and end-to-end concepts, emerging from the 13 cooperative agreement projects.

  10. Authors’ response: what are emotions and how are they created in the brain?

    PubMed

    Lindquist, Kristen A; Wager, Tor D; Bliss-Moreau, Eliza; Kober, Hedy; Barret, Lisa Feldman

    2012-06-01

    In our response, we clarify important theoretical differences between basic emotion and psychological construction approaches. We evaluate the empirical status of the basic emotion approach, addressing whether it requires brain localization, whether localization can be observed with better analytic tools, and whether evidence for basic emotions exists in other types of measures. We then revisit the issue of whether the key hypotheses of psychological construction are supported by our meta-analytic findings. We close by elaborating on commentator suggestions for future research.

  11. Mass spectrometry in systems biology an introduction.

    PubMed

    Dunn, Warwick B

    2011-01-01

    The qualitative detection, quantification, and structural characterization of analytes in biological systems are important requirements for objectives to be fulfilled in systems biology research. One analytical tool applied to a multitude of systems biology studies is mass spectrometry, particularly for the study of proteins and metabolites. Here, the role of mass spectrometry in systems biology will be assessed, the advantages and disadvantages discussed, and the instrument configurations available described. Finally, general applications will be briefly reviewed. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) USER MANUAL

    EPA Science Inventory

    ATtILA is an ArcView extension that allows users to easily calculate many common landscape metrics. GIS expertise is not required, but some experience with ArcView is recommended. Four metric groups are currently included in ATtILA: landscape characteristics, riparian characteris...

  13. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.

  14. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  15. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
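
    As a concrete illustration of the control-referenced quantitation described above, deuterium uptake is often reported relative to undeuterated and fully deuterated controls; a common back-exchange-corrected form (a standard convention in the field, not a formula quoted from this abstract) is:

```latex
D(t) \;=\; N \,\frac{m(t) - m_{0\%}}{m_{100\%} - m_{0\%}}
```

    Here m(t) is the measured centroid mass after exchange time t, m_{0\%} and m_{100\%} are the undeuterated and fully deuterated controls, and N is the number of exchangeable backbone amide hydrogens, so D(t) ranges from zero to N.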

  16. A comparative assessment of tools for ecosystem services quantification and valuation

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert

    2013-01-01

    To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each of the tools′ intended uses, services modeled, analytical approaches, data requirements, and outputs, as well time requirements to run seven tools in a first comparative concurrent application of multiple tools to a common location – the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools′ current ‘readiness’ for widespread application within both public- and private-sector decision making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which are essential to facilitate their more widespread use in environmental decision making.

  17. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.

  18. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    PubMed

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  19. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  20. Reinventing Continuing Higher Education

    ERIC Educational Resources Information Center

    Walshok, Mary Lindenstein

    2012-01-01

    Re-inventing continuing higher education is about finding ways to be a more central player in a region's civic, cultural, and economic life as well as in the education of individuals for work and citizenship. Continuing higher education will require data gathering, analytical tools, convening authority, interpretive skills, new models of delivery,…

  1. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  2. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  3. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    PubMed

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  4. Carbon dioxide gas purification and analytical measurement for leading edge 193nm lithography

    NASA Astrophysics Data System (ADS)

    Riddle Vogt, Sarah; Landoni, Cristian; Applegarth, Chuck; Browning, Matt; Succi, Marco; Pirola, Simona; Macchi, Giorgio

    2015-03-01

    The use of purified carbon dioxide (CO2) has become a reality for leading edge 193 nm immersion lithography scanners. Traditionally, both dry and immersion 193 nm lithographic processes have constantly purged the optics stack with ultrahigh purity compressed dry air (UHPCDA). CO2 has been utilized for a similar purpose as UHPCDA. Airborne molecular contamination (AMC) purification technologies and analytical measurement methods have been extensively developed to support the Lithography Tool Manufacturers' purity requirements. This paper covers the analytical tests and characterizations carried out to assess impurity removal from 3.0 N CO2 (beverage grade) for its final utilization in 193 nm and EUV scanners.

  5. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  7. THREAT ANTICIPATION AND DECEPTIVE REASONING USING BAYESIAN BELIEF NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, Glenn O; Olama, Mohammed M; Lake, Joe E

    Recent events highlight the need for tools to anticipate threats posed by terrorists. Assessing these threats requires combining information from disparate data sources such as analytic models, simulations, historical data, sensor networks, and user judgments. These disparate data can be combined in a coherent, analytically defensible, and understandable manner using a Bayesian belief network (BBN). In this paper, we develop a BBN threat anticipatory model based on a deceptive reasoning algorithm using a network engineering process that treats the probability distributions of the BBN nodes within the broader context of the system development process.
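
    The abstract above describes fusing disparate evidence sources in a BBN. As a rough illustration of the underlying mechanics (the structure, probabilities, and variable names below are invented for illustration and are unrelated to the authors' model), a small two-observation network can be scored by direct enumeration:

```python
# Network: Threat -> IntelReport, Threat -> SensorAlert
# (both observations treated as conditionally independent given Threat).
p_threat = 0.02                          # prior P(T=1)
p_intel_given = {1: 0.70, 0: 0.10}       # P(I=1 | T)
p_sensor_given = {1: 0.60, 0: 0.05}      # P(S=1 | T)

def posterior_threat(intel, sensor):
    """P(T=1 | I=intel, S=sensor) by direct enumeration over T."""
    def joint(t):
        pi = p_intel_given[t] if intel else 1 - p_intel_given[t]
        ps = p_sensor_given[t] if sensor else 1 - p_sensor_given[t]
        prior = p_threat if t else 1 - p_threat
        return prior * pi * ps
    num = joint(1)
    return num / (num + joint(0))

print(round(posterior_threat(intel=1, sensor=1), 3))  # agreeing evidence: large lift over the prior
print(round(posterior_threat(intel=0, sensor=1), 3))  # conflicting evidence: weaker update
```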

  8. Analytical aspects of plant metabolite profiling platforms: current standings and future aims.

    PubMed

    Seger, Christoph; Sturm, Sonja

    2007-02-01

    Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.

  9. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  10. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  11. Single cell proteomics in biomedicine: High-dimensional data acquisition, visualization, and analysis.

    PubMed

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-02-01

    New insights into cellular heterogeneity over the last decade have provoked the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features, and limitations of the analytical methods, along with the biological questions they seek to answer, will be discussed. Particular attention will be given to those information theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.
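
    As a rough illustration of the kind of unsupervised analytics such a tool exposes (the clustering that drives linked heatmap and dendrogram views), the sketch below clusters a synthetic gene-expression matrix with SciPy. It is not SEURAT code; the data, cluster count, and linkage choice are all hypothetical.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Hierarchical clustering of a synthetic expression matrix (genes x samples);
        # the resulting tree is what a heatmap/dendrogram view would display.
        rng = np.random.default_rng(1)
        expr = np.vstack([rng.normal(0, 1, (20, 30)),   # 20 'baseline' genes
                          rng.normal(2, 1, (20, 30))])  # 20 up-regulated genes

        Z = linkage(expr, method="ward")                 # cluster the genes (rows)
        labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 clusters
        print("cluster sizes:", np.bincount(labels)[1:])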

  13. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  14. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed-the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
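
    For readers unfamiliar with the technique, the sketch below shows the pairwise dominance test from which a Hasse diagram is built: one procedure sits above another only if it is at least as good on every criterion and strictly better on at least one. The procedures, criteria, and scores are invented for illustration and are not the published dataset (lower scores are assumed to be better).

        # Pairwise dominance relation underlying the Hasse diagram technique (HDT).
        procedures = {
            "Procedure A": [3, 2, 4],   # hypothetical scores on three criteria
            "Procedure B": [2, 2, 3],
            "Procedure C": [4, 1, 5],
        }

        def dominates(a, b):
            """True if a is <= b on every criterion and < b on at least one."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        # Incomparable pairs get no edge, which is what creates separate branches
        # (and hence different rankings for "green" versus metrological criteria).
        edges = [(p, q) for p in procedures for q in procedures
                 if p != q and dominates(procedures[p], procedures[q])]
        maximal = [p for p in procedures
                   if not any(dominates(procedures[q], procedures[p])
                              for q in procedures if q != p)]
        print("dominance edges:", edges)
        print("top-ranked procedures:", maximal)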

  15. Structure identification for Non-Targeted Analytical Chemistry using the US EPA’s CompTox Chemistry Dashboard (ACS 2017 Fall meeting 3 of 3)

    EPA Science Inventory

    Identification of unknowns in non-targeted analyses (NTA) requires the integration of complementary data types to generate a confident consensus structure. Researchers use a variety of data and tools (e.g., chemical reference databases, spectral matching, fragment prediction too...

  16. Cyberhubs: Virtual Research Environments for Astronomy

    NASA Astrophysics Data System (ADS)

    Herwig, Falk; Andrassy, Robert; Annau, Nic; Clarkson, Ondrea; Côté, Benoit; D’Sa, Aaron; Jones, Sam; Moa, Belaid; O’Connell, Jericho; Porter, David; Ritter, Christian; Woodward, Paul

    2018-05-01

    Collaborations in astronomy and astrophysics are faced with numerous cyber-infrastructure challenges, such as large data sets, the need to combine heterogeneous data sets, and the challenge to effectively collaborate on those large, heterogeneous data sets with significant processing requirements and complex science software tools. The cyberhubs system is an easy-to-deploy package for small- to medium-sized collaborations based on the Jupyter and Docker technology, which allows web-browser-enabled, remote, interactive analytic access to shared data. It offers an initial step to address these challenges. The features and deployment steps of the system are described, as well as the requirements collection through an account of the different approaches to data structuring, handling, and available analytic tools for the NuGrid and PPMstar collaborations. NuGrid is an international collaboration that creates stellar evolution and explosion physics and nucleosynthesis simulation data. The PPMstar collaboration performs large-scale 3D stellar hydrodynamics simulations of interior convection in the late phases of stellar evolution. Examples of science that is currently performed on cyberhubs, in the areas of 3D stellar hydrodynamic simulations, stellar evolution and nucleosynthesis, and Galactic chemical evolution, are presented.

  17. National Facilities Study. Volume 1: Facilities Inventory

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The inventory activity was initiated to solve the critical need for a single source of site-specific descriptive and parametric data on major public and privately held aeronautics and aerospace related facilities. This is a challenging undertaking due to the scope of the effort and the short lead time in which to assemble the inventory and have it available to support the task group study needs. The inventory remains dynamic as sites are being added and the data is accessed and refined as the study progresses. The inventory activity also included the design and implementation of a computer database and analytical tools to simplify access to the data. This volume describes the steps which were taken to define the data requirements, select sites, and solicit and acquire data from them. A discussion of the inventory structure and analytical tools is also provided.

  18. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  19. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics applied to raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  20. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  1. Structural considerations for fabrication and mounting of the AXAF HRMA optics

    NASA Technical Reports Server (NTRS)

    Cohen, Lester M.; Cernoch, Larry; Mathews, Gary; Stallcup, Michael

    1990-01-01

    A methodology is described which minimizes optics distortion in the fabrication, metrology, and launch configuration phases. The significance of finite element modeling and breadboard testing is described with respect to performance analyses of support structures and material effects in NASA's AXAF X-ray optics. The paper outlines the requirements for AXAF performance, optical fabrication, metrology, and glass support fixtures, as well as the specifications for mirror sensitivity and the high-resolution mirror assembly. Analytical modeling of the tools is shown to coincide with grinding and polishing experiments, and is useful for designing large-area polishing and grinding tools. Metrological subcomponents that have undergone initial testing show evidence of meeting force requirements.

  2. Visual Analytics Tools for Sustainable Lifecycle Design: Current Status, Challenges, and Future Opportunities.

    PubMed

    Ramanujan, Devarajan; Bernstein, William Z; Chandrasegaran, Senthil K; Ramani, Karthik

    2017-01-01

    The rapid rise in technologies for data collection has created an unmatched opportunity to advance the use of data-rich tools for lifecycle decision-making. However, the usefulness of these technologies is limited by the ability to translate lifecycle data into actionable insights for human decision-makers. This is especially true in the case of sustainable lifecycle design (SLD), as the assessment of environmental impacts, and the feasibility of making corresponding design changes, often relies on human expertise and intuition. Supporting human sense-making in SLD requires the use of both data-driven and user-driven methods while exploring lifecycle data. A promising approach for combining the two is through the use of visual analytics (VA) tools. Such tools can leverage the ability of computer-based tools to gather, process, and summarize data along with the ability of human experts to guide analyses through domain knowledge or data-driven insight. In this paper, we review previous research that has created VA tools in SLD. We also highlight existing challenges and future opportunities for such tools in different lifecycle stages: design, manufacturing, distribution and supply chain, use phase, and end-of-life, as well as life cycle assessment. Our review shows that while the number of VA tools in SLD is relatively small, researchers are increasingly focusing on the subject matter. Our review also suggests that VA tools can address existing challenges in SLD and that significant future opportunities exist.

  3. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayden, D. W.

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created by friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact; the time this temperature is maintained (contact time) will be obtained from the work of Hatler et al.; and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spot).
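
    The numerical core described above (one-dimensional heat conduction with an Arrhenius self-heating term and a friction-heated surface held for the contact time) can be sketched schematically as below. This is only an explicit finite-difference illustration; the material constants, surface temperature, contact time, and runaway criterion are placeholders and do not reproduce the PBX data or the coupled models of Anderson, Hatler, and Mader cited in the report.

        import numpy as np

        # 1-D heat equation with an Arrhenius source term (Frank-Kamenetskii form),
        # driven by a friction-heated surface for a short contact time.
        L, nx = 0.01, 101                      # 1 cm slab, number of grid points
        dx = L / (nx - 1)
        k, rho, cp = 0.4, 1800.0, 1000.0       # conductivity, density, heat capacity (placeholders)
        alpha = k / (rho * cp)
        Q, A, Ea, R = 5.0e6, 1.0e12, 2.2e5, 8.314  # reaction heat, prefactor, activation energy, gas constant

        T = np.full(nx, 300.0)                 # initial temperature, K
        T_surface, t_contact = 700.0, 1.0e-3   # assumed frictional hot-spot temperature and contact time
        dt = min(0.4 * dx**2 / alpha, 2.0e-5)  # respect the explicit stability limit

        t, t_end = 0.0, 5.0e-3
        while t < t_end:
            T[0] = T_surface if t < t_contact else 300.0   # driven surface, then cooled
            T[-1] = 300.0
            lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
            source = (Q / (rho * cp)) * A * np.exp(-Ea / (R * T[1:-1]))
            T[1:-1] += dt * (alpha * lap + source)
            t += dt
            if T.max() > 1500.0:               # crude runaway (ignition) criterion
                print(f"thermal runaway at t = {t:.2e} s")
                break
        else:
            print(f"no runaway; peak temperature {T.max():.0f} K")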

  4. Machine Learning Technologies Translates Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    NASA Astrophysics Data System (ADS)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.

  5. Analytical performance evaluation of a high-volume hematology laboratory utilizing sigma metrics as standard of excellence.

    PubMed

    Shaikh, M S; Moiz, B

    2016-04-01

    Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure the provision of accurate and precise results. Six sigma is a statistical tool that provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps and hence areas of improvement in patient care. The twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. An acceptable sigma value of ≥3 was obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg achieved values of <3 for the level 1 (low abnormal) control. PT performed poorly on both level 1 and level 2 controls, with sigma values of <3. Even where conventional QC tools appear acceptable, application of sigma metrics can identify analytical deficits and hence prospects for improvement in clinical laboratories. © 2016 John Wiley & Sons Ltd.
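
    As a concrete reminder of the calculation involved, the sketch below applies the usual sigma-metric relation, sigma = (TEa - |bias|) / CV, with bias taken from external quality assurance results and CV from internal QC. The allowable-total-error targets and the bias and CV values are illustrative placeholders, not this laboratory's data.

        # Sigma metrics for a few hematology analytes (all numbers are placeholders).
        analytes = {
            # analyte: (TEa %, bias %, CV %)
            "Hb":  (7.0, 1.2, 1.1),
            "MCV": (7.0, 2.5, 1.8),
            "Plt": (25.0, 4.0, 8.0),
            "PT":  (15.0, 6.0, 4.5),
        }

        def sigma_metric(tea, bias, cv):
            """sigma = (allowable total error - |bias|) / CV, all in percent."""
            return (tea - abs(bias)) / cv

        for name, (tea, bias, cv) in analytes.items():
            s = sigma_metric(tea, bias, cv)
            verdict = "acceptable (>=3)" if s >= 3 else "needs improvement (<3)"
            print(f"{name:4s} sigma = {s:4.1f} -> {verdict}")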

  6. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to data sources from providers such as USGS, NOAA, the World Bank, and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at local and global levels. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.

  7. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, boeing, collaborative analysis.

  8. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate operation with external calibration, the lack of any requirement for identical reference materials, its high precision and accuracy when properly validated, and its ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
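
    A minimal sketch of the quantitation step is given below, using the common internal-calibrant relation in which the analyte amount follows from the ratio of proton-normalized integrals; the integrals, masses, and molar masses are invented for illustration, and external-calibration variants are not shown.

        def qhnmr_purity(I_a, N_a, M_a, m_a, I_std, N_std, M_std, m_std, P_std):
            """Analyte purity from 1H integrals against an internal calibrant.

            I: integral, N: protons behind the signal, M: molar mass (g/mol),
            m: weighed mass (mg), P_std: purity of the calibrant.
            """
            return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

        purity = qhnmr_purity(I_a=0.71, N_a=1, M_a=286.2, m_a=10.0,
                              I_std=1.00, N_std=1, M_std=172.1, m_std=8.0, P_std=0.999)
        print(f"estimated analyte purity: {purity:.3f}")   # about 0.94 with these numbers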

  9. Analytical and multibody modeling for the power analysis of standing jumps.

    PubMed

    Palmieri, G; Callegari, M; Fioretti, S

    2015-01-01

    Two methods for the power analysis of standing jumps are proposed and compared in this article. The first method is based on a simple analytical formulation which requires as input the coordinates of the center of gravity at three specified instants of the jump. The second method is based on a multibody model that simulates the jumps by processing the data obtained from a three-dimensional (3D) motion capture system and the dynamometric measurements obtained from the force platforms. The multibody model is developed with OpenSim, an open-source software package that provides tools for the kinematic and dynamic analyses of 3D human body models. The study is focused on two of the typical tests used to evaluate the muscular activity of the lower limbs: the counter-movement jump and the standing long jump. The comparison between the results obtained by the two methods confirms that the proposed analytical formulation is correct and represents a simple tool suitable for a preliminary analysis of the total mechanical work and the mean power exerted in standing jumps.
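
    A hedged sketch in the spirit of the first (analytical) method is shown below: it estimates mechanical work and mean power from the centre-of-gravity height at three instants of a counter-movement jump using elementary mechanics. It is not the authors' exact formulation, and all numbers are hypothetical.

        import math

        # Mean push-off power from CoG heights at three instants (illustrative values).
        m = 75.0            # body mass, kg
        g = 9.81
        h_low, h_takeoff, h_apex = 0.55, 0.95, 1.40   # CoG heights, m (assumed)
        t_pushoff = 0.30    # push-off duration, s (assumed)

        v_takeoff = math.sqrt(2.0 * g * (h_apex - h_takeoff))        # from the flight phase rise
        work = m * g * (h_takeoff - h_low) + 0.5 * m * v_takeoff**2  # potential + kinetic energy gained
        mean_power = work / t_pushoff

        print(f"take-off velocity: {v_takeoff:.2f} m/s")
        print(f"mechanical work:   {work:.0f} J")
        print(f"mean power:        {mean_power:.0f} W")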

  10. Mathematical support for automated geometry analysis of lathe machining of oblique peakless round-nose tools

    NASA Astrophysics Data System (ADS)

    Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.

    2017-01-01

    Automation of engineering processes requires the development of appropriate mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper focuses on developing a procedure for determining the geometry of lathe machining with oblique peakless round-nose tools using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in contrast to the traditional analytic description, and is therefore promising for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
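
    The flavor of the vector/matrix approach can be illustrated as follows: a nominal rake-face normal is rotated by assumed inclination and rake angles, and the resulting orientation is inspected. The angle conventions are deliberately simplified and do not reproduce the authors' model of the round-nose edge.

        import numpy as np

        def rot_x(a):
            c, s = np.cos(a), np.sin(a)
            return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

        def rot_y(a):
            c, s = np.cos(a), np.sin(a)
            return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

        lam = np.radians(8.0)     # edge inclination angle (assumed)
        gamma = np.radians(12.0)  # rake angle (assumed)

        n0 = np.array([0.0, 0.0, 1.0])          # nominal rake-face normal, tool-in-hand system
        n = rot_x(lam) @ rot_y(gamma) @ n0      # orientation after both rotations

        # Read back the angle between the nominal and the transformed normal.
        angle = np.degrees(np.arccos(np.clip(n @ n0, -1.0, 1.0)))
        print("transformed normal:", np.round(n, 3))
        print(f"angle between nominal and transformed normal: {angle:.1f} deg")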

  11. Linking linear programming and spatial simulation models to predict landscape effects of forest management alternatives

    Treesearch

    Eric J. Gustafson; L. Jay Roberts; Larry A. Leefers

    2006-01-01

    Forest management planners require analytical tools to assess the effects of alternative strategies on the sometimes disparate benefits from forests such as timber production and wildlife habitat. We assessed the spatial patterns of alternative management strategies by linking two models that were developed for different purposes. We used a linear programming model (...

  12. Formulation of advanced consumables management models: Executive summary. [modeling spacecraft environmental control, life support, and electric power supply systems

    NASA Technical Reports Server (NTRS)

    Daly, J. K.; Torian, J. G.

    1979-01-01

    An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.

  13. Factors Influencing the Internet Resource Users' Satisfaction: An Analytical Study on Omani Undergraduate Learners

    ERIC Educational Resources Information Center

    Sriram, B.

    2016-01-01

    Internet resources are among the important knowledge-sharing tools in day-to-day business processes, and they have a strong impact on the education field as well. Learning processes have become comparatively easy thanks to these electronic resources. The online resources help students to acquire the required knowledge through…

  14. Using Log Variables in a Learning Management System to Evaluate Learning Activity Using the Lens of Activity Theory

    ERIC Educational Resources Information Center

    Park, Yeonjeong; Jo, Il-Hyun

    2017-01-01

    As the advance of learning technologies and analytics tools continues, learning management systems (LMSs) have been required to fulfil the growing expectations for smart learning. However, the reality regarding the level of technology integration in higher education differs considerably from such expectations or the speed of advances in…

  15. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  17. 21st century toolkit for optimizing population health through precision nutrition.

    PubMed

    O'Sullivan, Aifric; Henrick, Bethany; Dixon, Bonnie; Barile, Daniela; Zivkovic, Angela; Smilowitz, Jennifer; Lemay, Danielle; Martin, William; German, J Bruce; Schaefer, Sara Elizabeth

    2017-07-05

    Scientific, technological, and economic progress over the last 100 years all but eradicated problems of widespread food shortage and nutrient deficiency in developed nations. But now society is faced with a new set of nutrition problems related to energy imbalance and metabolic disease, which require new kinds of solutions. Recent developments in the area of new analytical tools enable us to systematically study large quantities of detailed and multidimensional metabolic and health data, providing the opportunity to address current nutrition problems through an approach called Precision Nutrition. This approach integrates different kinds of "big data" to expand our understanding of the complexity and diversity of human metabolism in response to diet. With these tools, we can more fully elucidate each individual's unique phenotype, or the current state of health, as determined by the interactions among biology, environment, and behavior. The tools of precision nutrition include genomics, metabolomics, microbiomics, phenotyping, high-throughput analytical chemistry techniques, longitudinal tracking with body sensors, informatics, data science, and sophisticated educational and behavioral interventions. These tools are enabling the development of more personalized and predictive dietary guidance and interventions that have the potential to transform how the public makes food choices and greatly improve population health.

  18. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  19. Computerized power supply analysis: State equation generation and terminal models

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.

    1978-01-01

    To aid engineers who design power supply systems, two analysis tools were developed that can be used with the state equation analysis package. These tools include integration routines that start with the description of a power supply in state equation form and yield analytical results. The first tool uses a computer program that works with the SUPER SCEPTRE circuit analysis program and prints the state equations for an electrical network. The state equations developed automatically by the computer program are used to develop an algorithm for reducing the number of state variables required to describe an electrical network. In this way a second tool is obtained that reduces the order of the network and yields a simpler terminal model.
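
    To make the idea of a state-equation description concrete, the toy example below writes an L-C output filter with a resistive load in the form x' = Ax + Bu and integrates it directly. Component values are arbitrary, and the code is unrelated to the SUPER SCEPTRE program or the reduction algorithm described above.

        import numpy as np

        # State equations of an L-C filter feeding a resistive load:
        #   L di/dt = u - v_C
        #   C dv/dt = i - v_C / R
        L_f, C_f, R = 1e-3, 100e-6, 10.0       # inductor, capacitor, load resistor (placeholders)
        A = np.array([[0.0, -1.0 / L_f],
                      [1.0 / C_f, -1.0 / (R * C_f)]])
        B = np.array([1.0 / L_f, 0.0])

        x = np.zeros(2)                        # state vector: [inductor current, capacitor voltage]
        u, dt = 12.0, 1e-6                     # 12 V input step, 1 microsecond time step
        for _ in range(20000):                 # simulate 20 ms with forward Euler
            x = x + dt * (A @ x + B * u)

        print(f"steady state: i_L = {x[0]:.3f} A, v_C = {x[1]:.3f} V")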

  20. Automatic differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
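
    For readers new to the topic, a minimal forward-mode example built on dual numbers is sketched below; it only illustrates the chain-rule mechanics the abstract refers to and is unrelated to the specific AD tools that were assessed.

        # Forward-mode automatic differentiation with dual numbers: each value
        # carries its derivative, and arithmetic propagates both via the chain rule.
        class Dual:
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.val + other.val, self.der + other.der)
            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # product rule: (uv)' = u'v + uv'
                return Dual(self.val * other.val,
                            self.der * other.val + self.val * other.der)
            __rmul__ = __mul__

        def f(x):
            return 3 * x * x + 2 * x + 1       # analytical derivative is 6x + 2

        x = Dual(2.0, 1.0)                     # seed dx/dx = 1
        y = f(x)
        print(y.val, y.der)                    # prints 17.0 14.0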

  1. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among CRP systems, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) polymerization are widely used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, MIP micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures, including homopolymers, block copolymers, molecularly imprinted copolymers, and grafted copolymers, have been synthesized by CRP methods for molecular separation, retention, or sensing. We expect that CRP methods will become the most popular techniques for preparing functional polymers that can be broadly applied in analytical chemistry.

  2. Big data and new knowledge in medicine: the thinking, training, and tools needed for a learning health system.

    PubMed

    Krumholz, Harlan M

    2014-07-01

    Big data in medicine (massive quantities of health care data accumulating from patients and populations, and the advanced analytics that can give those data meaning) hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This article explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately utilized, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. Project HOPE—The People-to-People Health Foundation, Inc.

  3. Big Data And New Knowledge In Medicine: The Thinking, Training, And Tools Needed For A Learning Health System

    PubMed Central

    Krumholz, Harlan M.

    2017-01-01

    Big data in medicine (massive quantities of health care data accumulating from patients and populations, and the advanced analytics that can give it meaning) hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This paper explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately used, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. PMID:25006142

  4. Predicting Operator Execution Times Using CogTool

    NASA Technical Reports Server (NTRS)

    Santiago-Espada, Yamira; Latorella, Kara A.

    2013-01-01

    Researchers and developers of NextGen systems can use predictive human performance modeling tools as an initial approach to obtain skilled user performance times analytically, before system testing with users. This paper describes the CogTool models for a two-pilot crew executing two different types of datalink clearance acceptance tasks on two different simulation platforms. The CogTool time estimates for accepting and executing Required Time of Arrival and Interval Management clearances were compared to empirical data observed in video tapes and recorded in simulation files. Results indicate no statistically significant difference between the empirical data and the CogTool predictions. A population comparison test found no significant differences between the CogTool estimates and the empirical execution times for any of the four test conditions. We discuss modeling caveats and considerations for applying CogTool to crew performance modeling in advanced cockpit environments.

  5. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini,Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  6. Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.

    PubMed

    Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin

    2016-02-15

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target-, suspect and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012

  8. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.
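
    As a toy illustration of the map/reduce pattern such a framework parallelizes (standing in for the HBase-backed, cloud-deployed pipeline, which is not reproduced here), the sketch below maps chunks of a synthetic gridded variable to partial sums and reduces them to a global mean.

        from multiprocessing import Pool
        import numpy as np

        def map_chunk(chunk):
            """Map step: each worker returns (sum, count) for its slab of the grid."""
            return chunk.sum(), chunk.size

        def reduce_partials(partials):
            """Reduce step: combine partial sums into a single global mean."""
            total = sum(s for s, _ in partials)
            count = sum(n for _, n in partials)
            return total / count

        if __name__ == "__main__":
            field = np.random.rand(365, 180, 360)        # e.g. a year of daily global grids (synthetic)
            chunks = np.array_split(field, 8, axis=0)    # partition along the time axis
            with Pool(4) as pool:
                partials = pool.map(map_chunk, chunks)
            print("global mean:", reduce_partials(partials))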

  9. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data. PMID:20525257

  10. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools, and the public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based 501(c)(3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Its roadmap includes a portal (www.wetdata.org) providing stakeholders with tools and resources to understand related water risks; initial activities offering education, awareness, and tools to support the implementation of the Colorado State Water Plan; leveraging the Western States Water Council Water Data Exchange database; and development of visualization, predictive analytics, and AI tools to engage stakeholders and provide actionable data and information. Planned tools include education (information on water issues and risks at the local, state, national, and global scales), visualizations (data analytics and visualization tools based on the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan), and predictive analytics (accessing publicly available water databases and using machine learning to develop water availability forecasting tools and time-lapse imagery to support city and urban planning).

  11. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  12. Analytical tools for characterizing biopharmaceuticals and the implications for biosimilars

    PubMed Central

    Berkowitz, Steven A.; Engen, John R.; Mazzeo, Jeffrey R.; Jones, Graham B.

    2013-01-01

    Biologics such as monoclonal antibodies are much more complex than small-molecule drugs, which raises challenging questions for the development and regulatory evaluation of follow-on versions of such biopharmaceutical products (also known as biosimilars) and their clinical use once patent protection for the pioneering biologic has expired. With the recent introduction of regulatory pathways for follow-on versions of complex biologics, the role of analytical technologies in comparing biosimilars with the corresponding reference product is attracting substantial interest in establishing the development requirements for biosimilars. Here, we discuss the current state of the art in analytical technologies to assess three characteristics of protein biopharmaceuticals that regulatory authorities have identified as being important in development strategies for biosimilars: post-translational modifications, three-dimensional structures and protein aggregation. PMID:22743980

  13. Development of an integrated BEM for hot fluid-structure interaction

    NASA Technical Reports Server (NTRS)

    Banerjee, P. K.; Dargush, G. F.

    1989-01-01

    The Boundary Element Method (BEM) is chosen as the basic analysis tool principally because the definition of quantities like fluxes, temperature, displacements, and velocities is very precise in a boundary-based discretization scheme. One fundamental difficulty is, of course, that the entire analysis requires a very considerable amount of analytical work which is not present in other numerical methods. During the last 18 months all of this analytical work was completed and a two-dimensional, general-purpose code was written. Some of the early results are described. It is anticipated that within the next two to three months almost all two-dimensional idealizations will be examined. It should be noted that the analytical work for the three-dimensional case has also been done and numerical implementation will begin next year.

  14. Epilepsy analytic system with cloud computing.

    PubMed

    Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei

    2013-01-01

    Biomedical data analytic systems have played an important role in clinical diagnosis for several decades. Today, analyzing these big data to provide decision support for physicians is an emerging research area. This paper presents a parallelized web-based tool with a cloud computing service architecture to analyze epilepsy. Several modern analytic functions, namely wavelet transform, genetic algorithm (GA), and support vector machine (SVM), are cascaded in the system. To demonstrate the effectiveness of the system, it was verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that the approach achieves a total classification accuracy higher than 90%. In addition, the entire training time is accelerated by about 4.66 times and the prediction time also meets real-time requirements.
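
    The cascade described above (wavelet features feeding a classifier) can be sketched in a few lines; the snippet below is a hedged illustration using synthetic signals with PyWavelets and scikit-learn, not the authors' parallelized cloud system, and it omits the GA stage they use for parameter search.

        import numpy as np
        import pywt
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        def wavelet_features(window, wavelet="db4", level=4):
            # Energy of each wavelet sub-band as a simple feature vector.
            coeffs = pywt.wavedec(window, wavelet, level=level)
            return np.array([np.sum(c ** 2) for c in coeffs])

        # Synthetic stand-in for labelled EEG windows (rows = windows, columns = samples).
        rng = np.random.default_rng(0)
        X_raw = rng.standard_normal((200, 512))
        y = rng.integers(0, 2, size=200)  # 0 = normal, 1 = seizure (toy labels)

        X = np.vstack([wavelet_features(w) for w in X_raw])
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf").fit(X_train, y_train)
        print("toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))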

  15. [The Scope, Quality and Safety Requirements of Drug Abuse Testing].

    PubMed

    Küme, Tuncay; Karakükcü, Çiğdem; Pınar, Aslı; Coşkunol, Hakan

    2017-01-01

    The aim of this review is to inform about the scope and requirements of drug abuse testing. Drug abuse testing is one of the tools for determining drug use. It must fulfill quality and safety requirements when used in judgmental legal and administrative decisions. Drug abuse testing must fulfill requirements such as selection of the appropriate test matrix, an appropriate screening test panel, sampling within the detection window, patient consent, identification of the donor, an appropriate collection site, sample collection under observation, identification and control of the sample, and a specimen custody chain in the preanalytical phase; analysis in authorized laboratories, specimen validity tests, reliable testing methods, strict quality control, and two-step analysis in the analytical phase; and storage of the split specimen, confirmation of the split specimen in case of an objection, a result custody chain, appropriate cut-off concentrations, and appropriate interpretation of the result in the postanalytical phase. The workflow and analytical processes of drug abuse testing are explained in the latest regulation of the Department of Medical Laboratory Services, Ministry of Health in Turkey. Clinical physicians have to know and apply the quality and safety requirements in drug abuse testing according to the latest regulations in Turkey.

  16. Individual human cell responses to low doses of chemicals studied by synchrotron infrared spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.

    2000-05-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of all of the molecular responses and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, biocompatibility of implant materials, cellular repair mechanisms, self-assembly of cellular apparatus, cell differentiation and fetal development.

  17. New trends in the analytical determination of emerging contaminants and their transformation products in environmental waters.

    PubMed

    Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R

    2013-06-01

    Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to the knowledge of their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large information gap concerning the growing number of new potential contaminants that are appearing and especially their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows establishing the presence of chemicals in the environment, estimating their concentration levels, identifying sources and determining their degradation pathways. These tasks involve serious difficulties requiring different analytical solutions adapted to the purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demands the application of wide-scope methods; the low concentrations at which these contaminants are present in the samples require a high detection sensitivity; and a high level of confirmation and structural information is needed for the characterisation of unknowns. New developments in analytical instrumentation have been applied to solve these difficulties. Furthermore, and no less important, has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. Thus, the use of sophisticated software tools has allowed successful screening analysis, determining several hundreds of analytes, and assisted in the structural elucidation of unknown compounds in a timely manner.

  18. Wearable physiological systems and technologies for metabolic monitoring.

    PubMed

    Gao, Wei; Brooks, George A; Klonoff, David C

    2018-03-01

    Wearable sensors allow continuous monitoring of metabolites for diabetes, sports medicine, exercise science, and physiology research. These sensors can continuously detect target analytes in skin interstitial fluid (ISF), tears, saliva, and sweat. In this review, we will summarize developments on wearable devices and their potential applications in research, clinical practice, and recreational and sporting activities. Sampling skin ISF can require insertion of a needle into the skin, whereas sweat, tears, and saliva can be sampled by devices worn outside the body. The most widely sampled metabolite from a wearable device is glucose in skin ISF for monitoring diabetes patients. Continuous ISF glucose monitoring allows estimation of the glucose concentration in blood without the pain, inconvenience, and blood waste of fingerstick capillary blood glucose testing. This tool is currently used by diabetes patients to provide information for dosing insulin and determining a diet and exercise plan. Similar technologies for measuring concentrations of other analytes in skin ISF could be used to monitor athletes, emergency responders, warfighters, and others in states of extreme physiological stress. Sweat is a potentially useful substrate for sampling analytes for metabolic monitoring during exercise. Lactate, sodium, potassium, and hydrogen ions can be measured in sweat. Tools for converting the concentrations of these analytes sampled from sweat, tears, and saliva into blood concentrations are being developed. As an understanding of the relationships between the concentrations of analytes in blood and easily sampled body fluid increases, then the benefits of new wearable devices for metabolic monitoring will also increase.

  19. Development and in-line validation of a Process Analytical Technology to facilitate the scale up of coating processes.

    PubMed

    Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P

    2013-05-05

    Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which can be addressed by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and is usually not implemented in the production process. Concerning application in the production process, many scientific approaches stop at the level of feasibility studies and do not make the step to production-scale process applications. The present work puts the scale-up of an active coating process into focus, which is a step of highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing could be shown. Finally, this method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
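
    As a rough illustration of the chemometric core of such a PAT method (and not the published calibration), the sketch below fits a PLS model that maps synthetic spectra to a coated-API amount and reports its cross-validated fit; all data and dimensions are invented.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n_spectra, n_points = 60, 300
        spectral_profile = rng.standard_normal(n_points)        # stand-in for the API's Raman signature
        api_amount = np.linspace(0.0, 10.0, n_spectra)          # synthetic coated amounts (mg)
        spectra = (np.outer(api_amount, spectral_profile)
                   + 0.1 * rng.standard_normal((n_spectra, n_points)))

        pls = PLSRegression(n_components=3)
        r2 = cross_val_score(pls, spectra, api_amount, cv=5)    # PLSRegression scores with R^2
        print("cross-validated R^2:", r2.mean())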

  20. Growth and yield model application in tropical rain forest management

    Treesearch

    James Atta-Boateng; John W., Jr. Moser

    2000-01-01

    Analytical tools are needed to evaluate the impact of management policies on the sustainable use of rain forest. Optimal decisions concerning the level of management inputs require accurate predictions of output at all relevant input levels. Using growth data from 40 l-hectare permanent plots obtained from the semi-deciduous forest of Ghana, a system of 77 differential...

  1. Clustering Patterns of Engagement in Massive Open Online Courses (MOOCs): The Use of Learning Analytics to Reveal Student Categories

    ERIC Educational Resources Information Center

    Khalil, Mohammad; Ebner, Martin

    2017-01-01

    Massive Open Online Courses (MOOCs) are remote courses that excel in their students' heterogeneity and quantity. Because of this massive scale, the large datasets generated by MOOC platforms require advanced tools and techniques to reveal hidden patterns for purposes of enhancing learning and educational behaviors. This publication…

  2. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    PubMed

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and quality by design (QbD) approaches is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impacts and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as a function of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS) (method operable design region) where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
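
    The desirability step mentioned above has a simple numerical core; the following sketch combines three hypothetical CQA predictions into a Derringer-Suich-style overall desirability. The CQA values and acceptance ranges are illustrative assumptions, not the paper's model.

        import numpy as np

        def d_larger_is_better(y, low, high):
            # Desirability rising from 0 at `low` to 1 at `high` (e.g., a resolution CQA).
            return np.clip((y - low) / (high - low), 0.0, 1.0)

        def d_smaller_is_better(y, low, high):
            # Desirability falling from 1 at `low` to 0 at `high` (e.g., solvent consumption).
            return np.clip((high - y) / (high - low), 0.0, 1.0)

        # Hypothetical predicted CQAs at one candidate point of the design space.
        resolution, plate_count, solvent_ml = 2.1, 95000.0, 8.0
        D = (d_larger_is_better(resolution, 1.5, 2.5)
             * d_larger_is_better(plate_count, 80000.0, 120000.0)
             * d_smaller_is_better(solvent_ml, 5.0, 15.0)) ** (1.0 / 3.0)
        print("overall desirability:", round(float(D), 3))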

  3. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
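
    Conceptually, the co-simulation coordinates a loop in which each tool advances one step using the other's latest outputs as inputs. The sketch below illustrates only that exchange pattern; the stub classes and their numbers are hypothetical placeholders, not the CONTAM or EnergyPlus interfaces exposed through the FMI specification.

        class AirflowModelStub:
            def step(self, zone_temps_c, dt_s):
                # Would return interzone/infiltration airflows given zone temperatures.
                return {"zone1->zone2": 0.05, "outdoor->zone1": 0.02}   # kg/s, made up

        class EnergyModelStub:
            def step(self, airflows_kg_s, dt_s):
                # Would return zone temperatures given airflows.
                return {"zone1": 21.3, "zone2": 22.1}                    # degC, made up

        airflow, energy = AirflowModelStub(), EnergyModelStub()
        temps = {"zone1": 20.0, "zone2": 20.0}
        dt = 600.0                             # s, co-simulation exchange step
        for _ in range(6):                     # one hour of coupled time steps
            flows = airflow.step(temps, dt)    # airflow side: temperatures in, airflows out
            temps = energy.step(flows, dt)     # energy side: airflows in, temperatures out
        print(temps)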

  4. Framework for Deploying a Virtualized Computing Environment for Collaborative and Secure Data Analytics

    PubMed Central

    Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie

    2016-01-01

    Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP), and provide a framework to plan, build and deploy a virtual desktop infrastructure (VDI) to enable innovation and collaboration and to operate within academic funding structures. It outlines 6 core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP is not successful simply through technical excellence and performance. Its adoption depends on a collaborative environment where researchers and users plan and evaluate the requirements of all aspects. PMID:27683665

  5. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
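
    One widely used family of disproportionality statistics that such a tool can compute from drug/adverse-event co-occurrence counts is the proportional reporting ratio (PRR). The sketch below uses made-up counts and is not the prototype's exact scoring method.

        import math

        def prr(a, b, c, d):
            # a: reports with drug and event; b: drug, other events;
            # c: other drugs, event; d: other drugs, other events.
            return (a / (a + b)) / (c / (c + d))

        # Hypothetical counts mined from indexed drug/adverse-event pairs.
        a, b, c, d = 40, 960, 200, 98800
        score = prr(a, b, c, d)
        se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
        lower_95 = math.exp(math.log(score) - 1.96 * se_log)
        print(f"PRR = {score:.2f}, lower 95% CI bound = {lower_95:.2f}")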

  6. Considerations on the Use of Custom Accelerators for Big Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Minutoli, Marco

    Accelerators, including Graphic Processing Units (GPUs) for general-purpose computation and many-core designs with wide vector units (e.g., Intel Phi), have become a common component of many high performance clusters. The appearance of more stable and reliable tools that can automatically convert code written in high-level specifications with annotations (such as C or C++) to hardware description languages (High-Level Synthesis - HLS) is also setting the stage for a broader use of reconfigurable devices (e.g., Field Programmable Gate Arrays - FPGAs) in high performance systems for the implementation of custom accelerators, helped by the fact that new processors include advanced cache-coherent interconnects for these components. In this chapter, we briefly survey the status of the use of accelerators in high performance systems targeted at big data analytics applications. We argue that, although the progress in the use of accelerators for this class of applications has been significant, unlike scientific simulations there still are gaps to close. This is particularly true for the "irregular" behaviors exhibited by NoSQL graph databases. We focus our attention on the limits of HLS tools for data analytics and graph methods, and discuss a new architectural template that better fits the requirements of this class of applications. We validate the new architectural template by modifying the Graph Engine for Multithreaded Systems (GEMS) framework to support accelerators generated with such a methodology, and testing with queries coming from the Lehigh University Benchmark (LUBM). The architectural template better supports the task- and memory-level parallelism present in graph methods through a new control model and an enhanced memory interface. We show that our solution allows generating parallel accelerators, providing speed-ups with respect to conventional HLS flows. We finally draw conclusions and present a perspective on the use of reconfigurable devices and design automation tools for data analytics.

  7. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
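
    As a hedged sketch of the kind of gridded-data query this Python stack (NumPy, pandas, and xray, now known as xarray) enables, the snippet below subsets a hypothetical NetCDF temperature file by time and region and exports basin means for downstream tools; the file name, variable name, and coordinate ranges are assumptions, not the Ice2Ocean data model.

        import xarray as xr

        # Hypothetical NetCDF file and variable; time and space ranges are illustrative.
        ds = xr.open_dataset("gulf_of_alaska_temps.nc")
        summer = ds["air_temperature"].sel(time=slice("2014-06-01", "2014-08-31"))
        basin = summer.sel(lat=slice(58, 61), lon=slice(-150, -140))
        basin_mean = basin.mean(dim=["lat", "lon"])
        basin_mean.to_dataframe().to_csv("basin_summer_means.csv")   # hand-off to other tools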

  8. State-of-the-art of bone marrow analysis in forensic toxicology: a review.

    PubMed

    Cartiser, Nathalie; Bévalot, Fabien; Fanton, Laurent; Gaillard, Yvan; Guitton, Jérôme

    2011-03-01

    Although blood is the reference medium in the field of forensic toxicology, alternative matrices are required in case of limited, unavailable or unusable blood samples. The present review investigated the suitability of bone marrow (BM) as an alternative matrix to characterize xenobiotic consumption and its influence on the occurrence of death. Basic data on BM physiology are reported in order to highlight the specificities of this matrix and their analytical and toxicokinetic consequences. A review of case reports, animal and human studies involving BM sample analysis focuses on the various parameters of interpretation of toxicological results: analytic limits, sampling location, pharmacokinetics, blood/BM concentration correlation, stability and postmortem redistribution. Tables summarizing the analytical conditions and quantification of 45 compounds from BM samples provide a useful tool for toxicologists. A specific section devoted to ethanol shows that, despite successful quantification, interpretation is highly dependent on postmortem interval. In conclusion, BM is an interesting alternative matrix, and further experimental data and validated assays are required to confirm its great potential relevance in forensic toxicology.

  9. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  10. New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Eric W; Rames, Clement L; Muratori, Matteo

    This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.

  11. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
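
    A minimal sketch of the idea follows, using synthetic results for an illustrative sodium-like analyte and an example allowable-bias specification rather than the paper's data.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)
        # Synthetic daily patient results for one analyte (e.g., sodium in mmol/L).
        dates = pd.date_range("2024-01-01", periods=365, freq="D").repeat(20)
        results = pd.Series(rng.normal(loc=140.0, scale=4.0, size=dates.size), index=dates)

        baseline_median = results.median()
        monthly_medians = results.resample("M").median()
        allowable_bias_pct = 0.3            # illustrative desirable bias specification
        drift_pct = 100.0 * (monthly_medians - baseline_median) / baseline_median
        print(drift_pct[drift_pct.abs() > allowable_bias_pct])   # months flagged as unstable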

  12. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL with its geospatial extension PostGIS can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and they suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research, which is targeted towards custom analytical tools, prototyping and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools that are available especially in GDAL and Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools as extensions with specific features can be defined, used, and studied. Linking with external libraries is possible using the Perl foreign function interface tools or with generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.
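
    Geoinformatica drives GDAL from Perl; purely as an illustration of the kind of raster access such a stack builds on, the Python sketch below reads a hypothetical elevation raster through GDAL's bindings and derives a crude slope estimate. The file name and the square-pixel assumption are mine, and this is not Geoinformatica's terrain-analysis code.

        import numpy as np
        from osgeo import gdal

        gdal.UseExceptions()
        ds = gdal.Open("dem.tif")                        # hypothetical elevation raster
        elevation = ds.GetRasterBand(1).ReadAsArray().astype(float)
        pixel_size = ds.GetGeoTransform()[1]             # assumes square pixels
        dz_dy, dz_dx = np.gradient(elevation, pixel_size)
        slope = np.hypot(dz_dx, dz_dy)                   # rise over run
        print("mean slope:", float(slope.mean()))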

  13. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

    A large number of full-scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Research was also conducted on seat dynamic performance, load-limiting seats, load-limiting subfloor designs, and emergency locator transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full-scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact load attenuating concepts. Analytical simulation of metal and composite aircraft crash dynamics is addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  14. SMC Standard: Evaluation and Test Requirements for Liquid Rocket Engines

    DTIC Science & Technology

    2017-07-26

    Excerpts: the standard covers run-time trends and steady-state analytical requirements, and addresses allowed starts and run-time including ground acceptance testing, on-pad firings/aborts, and flight exposure.

  15. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  16. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    Abstract β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation. PMID:25867077

  17. Quantum Metrology Assisted by Abstention

    NASA Astrophysics Data System (ADS)

    Gendra, B.; Ronco-Bonvehi, E.; Calsamiglia, J.; Muñoz-Tapia, R.; Bagan, E.

    2013-03-01

    The main goal of quantum metrology is to obtain accurate values of physical parameters using quantum probes. In this context, we show that abstention, i.e., the possibility of getting an inconclusive answer at readout, can drastically improve the measurement precision and even lead to a change in its asymptotic behavior, from the shot-noise to the Heisenberg scaling. We focus on phase estimation and quantify the required amount of abstention for a given precision. We also develop analytical tools to obtain the asymptotic behavior of the precision and required rate of abstention for arbitrary pure states.
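
    For context, the two scaling regimes named above are the standard benchmarks of phase estimation, stated here from general knowledge rather than the paper's derivation: with N probes, the phase uncertainty scales as

        \Delta\phi_{\mathrm{SN}} \sim \frac{1}{\sqrt{N}} \quad \text{(shot-noise limit)}, \qquad
        \Delta\phi_{\mathrm{HL}} \sim \frac{1}{N} \quad \text{(Heisenberg limit)}.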

  18. Stochastic modelling of the hydrologic operation of rainwater harvesting systems

    NASA Astrophysics Data System (ADS)

    Guo, Rui; Guo, Yiping

    2018-07-01

    Rainwater harvesting (RWH) systems are an effective low impact development practice that provides both water supply and runoff reduction benefits. A stochastic modelling approach is proposed in this paper to quantify the water supply reliability and stormwater capture efficiency of RWH systems. The input rainfall series is represented as a marked Poisson process and two typical water use patterns are analytically described. The stochastic mass balance equation is solved analytically, and based on this, explicit expressions relating system performance to system characteristics are derived. The performances of a wide variety of RWH systems located in five representative climatic regions of the United States are examined using the newly derived analytical equations. Close agreements between analytical and continuous simulation results are shown for all the compared cases. In addition, an analytical equation is obtained expressing the required storage size as a function of the desired water supply reliability, average water use rate, as well as rainfall and catchment characteristics. The equations developed herein constitute a convenient and effective tool for sizing RWH systems and evaluating their performances.
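
    The authors solve the stochastic storage problem analytically; as a purely numerical illustration of the underlying daily mass balance, the yield-after-spill simulation below uses toy rainfall statistics, roof area, demand, and tank size chosen arbitrarily, not the paper's parameters.

        import numpy as np

        rng = np.random.default_rng(3)
        days = 365
        wet_day = rng.random(days) < 0.3                        # toy rainfall occurrence
        rainfall_mm = rng.exponential(scale=8.0, size=days) * wet_day
        roof_area_m2, runoff_coeff = 120.0, 0.9
        daily_demand_m3, tank_capacity_m3 = 0.25, 5.0

        storage = supplied = inflow_total = 0.0
        for rain in rainfall_mm:
            inflow = runoff_coeff * roof_area_m2 * rain / 1000.0   # mm over m^2 -> m^3
            storage = min(storage + inflow, tank_capacity_m3)      # excess spills
            release = min(daily_demand_m3, storage)                # yield after spill
            storage -= release
            supplied += release
            inflow_total += inflow

        spill = inflow_total - supplied - storage
        print("water-supply reliability:", supplied / (daily_demand_m3 * days))
        print("stormwater capture efficiency:", 1.0 - spill / inflow_total)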

  19. Aeroelastic Optimization Study Based on the X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley W.; Pak, Chan-Gi

    2014-01-01

    One way to increase the aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. Such an approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise with ply stacking sequence. A hybrid and discretization optimization approach improves accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study for the fabricated flexible wing of the X-56A model since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.

  20. Design/analysis of the JWST ISIM bonded joints for survivability at cryogenic temperatures

    NASA Astrophysics Data System (ADS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-08-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite adhesively bonded joints at the cryogenic temperature of 30K (-405°F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to hybrid composite tubes (75mm square) made with M55J/954-6 and T300/954-6 prepregs. Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  1. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  2. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  3. Scalable Earth-observation Analytics for Geoscientists: Spacetime Extensions to the Array Database SciDB

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon

    2016-04-01

    Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address the scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits naturally to earth-observation datasets, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series based analyses, which is usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB and file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows ingesting and exporting remote sensing imagery from and to a large number of file formats. Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery to existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed using a minimum of external dependencies (i.e. CURL). Source code for both tools is available from GitHub [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data to SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays and significantly reduce the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal

  4. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  5. Consistent approach to describing aircraft HIRF protection

    NASA Technical Reports Server (NTRS)

    Rimbey, P. R.; Walen, D. B.

    1995-01-01

    The high intensity radiated fields (HIRF) certification process as currently implemented is comprised of an inconsistent combination of factors that tend to emphasize worst case scenarios in assessing commercial airplane certification requirements. By examining these factors which include the process definition, the external HIRF environment, the aircraft coupling and corresponding internal fields, and methods of measuring equipment susceptibilities, activities leading to an approach to appraising airplane vulnerability to HIRF are proposed. This approach utilizes technically based criteria to evaluate the nature of the threat, including the probability of encountering the external HIRF environment. No single test or analytic method comprehensively addresses the full HIRF threat frequency spectrum. Additional tools such as statistical methods must be adopted to arrive at more realistic requirements to reflect commercial aircraft vulnerability to the HIRF threat. Test and analytic data are provided to support the conclusions of this report. This work was performed under NASA contract NAS1-19360, Task 52.

  6. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. The use of manufacturing simulation models is presented as data analytics applications themselves and for supporting other data analytics applications by serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. Virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities and thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  7. Predictive Analytics for Identification of Patients at Risk for QT Interval Prolongation - A Systematic Review.

    PubMed

    Tomaselli Muensterman, Elena; Tisdale, James E

    2018-06-08

    Prolongation of the heart rate-corrected QT (QTc) interval increases the risk for torsades de pointes (TdP), a potentially fatal arrhythmia. The likelihood of TdP is higher in patients with risk factors, which include female sex, older age, heart failure with reduced ejection fraction, hypokalemia, hypomagnesemia, concomitant administration of ≥ 2 QTc interval-prolonging medications, among others. Assessment and quantification of risk factors may facilitate prediction of patients at highest risk for developing QTc interval prolongation and TdP. Investigators have utilized the field of predictive analytics, which generates predictions using techniques including data mining, modeling, machine learning, and others, to develop methods of risk quantification and prediction of QTc interval prolongation. Predictive analytics have also been incorporated into clinical decision support (CDS) tools to alert clinicians regarding patients at increased risk of developing QTc interval prolongation. The objectives of this paper are to assess the effectiveness of predictive analytics for identification of patients at risk of drug-induced QTc interval prolongation, and to discuss the efficacy of incorporation of predictive analytics into CDS tools in clinical practice. A systematic review of English language articles (human subjects only) was performed, yielding 57 articles, with an additional 4 articles identified from other sources; a total of 10 articles were included in this review. Risk scores for QTc interval prolongation have been developed in various patient populations including those in cardiac intensive care units (ICUs) and in broader populations of hospitalized or health system patients. One group developed a risk score that includes information regarding genetic polymorphisms; this score significantly predicted TdP. Development of QTc interval prolongation risk prediction models and incorporation of these models into CDS tools reduces the risk of QTc interval prolongation in cardiac ICUs and identifies health-system patients at increased risk for mortality. The impact of these QTc interval prolongation predictive analytics on overall patient safety outcomes, such as TdP and sudden cardiac death relative to the cost of development and implementation, requires further study. This article is protected by copyright. All rights reserved.
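
    As a hedged illustration of the general approach, and not any published QTc risk score or CDS rule, the sketch below fits a logistic model to synthetic binary risk factors with made-up effect sizes and flags a high-risk profile.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n = 2000
        # Binary indicators: female sex, older age, >= 2 QTc-prolonging drugs, hypokalemia.
        X = np.column_stack([rng.integers(0, 2, n) for _ in range(4)])
        logit = -3.0 + X @ np.array([0.8, 0.6, 1.2, 0.9])        # made-up effect sizes
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))         # synthetic outcomes

        model = LogisticRegression().fit(X, y)
        new_patient = np.array([[1, 1, 1, 0]])
        print("predicted risk:", float(model.predict_proba(new_patient)[0, 1]))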

  8. Application of non-traditional stable isotopes in analytical ecogeochemistry assessed by MC ICP-MS--A critical review.

    PubMed

    Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    Analytical ecogeochemistry is an evolving scientific field dedicated to the development of analytical methods and tools and their application to ecological questions. Traditional stable isotopic systems have been widely explored and have undergone continuous development during the last century. The variations of the isotopic composition of light elements (H, O, N, C, and S) have provided the foundation of stable isotope analysis followed by the analysis of traditional geochemical isotope tracers (e.g., Pb, Sr, Nd, Hf). Questions in a considerable diversity of scientific fields have been addressed, many of which can be assigned to the field of ecogeochemistry. Over the past 15 years, other stable isotopes (e.g., Li, Zn, Cu, Cl) have emerged gradually as novel tools for the investigation of scientific topics that arise in ecosystem research and have enabled novel discoveries and explorations. These systems are often referred to as non-traditional isotopes. The small isotopic differences of interest that are increasingly being addressed for a growing number of isotopic systems represent a challenge to the analytical scientist and push the limits of today's instruments constantly. This underlines the importance of a metrologically sound concept of analytical protocols and procedures and a solid foundation of data processing strategies and uncertainty considerations before these small isotopic variations can be interpreted in the context of applied ecosystem research. This review focuses on the development of isotope research in ecogeochemistry, the requirements for successful detection of small isotopic shifts, and highlights the most recent and innovative applications in the field.

  9. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  10. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    PubMed Central

    2013-01-01

    Background: Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures. Methods and results: This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion: We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator. PMID:23415045
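
    The kind of relationship such a tool encodes can be pictured with a toy decay model of survival versus response time. The parameters below are illustrative assumptions chosen so that the curve roughly reproduces the 3.9 and 4.6 per cent figures above; they are not the Stockholm model's estimates.

        # Toy model: survival declines by a fixed fraction per minute of ambulance response time.
        base_survival = 0.20          # assumed survival with an immediate response
        decline_per_minute = 0.15     # assumed fractional decline per additional minute

        def predicted_survival(response_minutes):
            return base_survival * (1 - decline_per_minute) ** response_minutes

        for minutes in (10, 9):       # baseline response vs. a one-minute improvement
            print(f"{minutes} min response -> predicted survival {predicted_survival(minutes):.3f}")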

  11. Mining Mathematics in Textbook Lessons

    ERIC Educational Resources Information Center

    Ronda, Erlina; Adler, Jill

    2017-01-01

    In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…

  12. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  13. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  14. An Integrated Multivariable Visualization Tool for Marine Sanctuary Climate Assessments

    NASA Astrophysics Data System (ADS)

    Shein, K. A.; Johnston, S.; Stachniewicz, J.; Duncan, B.; Cecil, D.; Ansari, S.; Urzen, M.

    2012-12-01

    The comprehensive development and use of ecological climate impact assessments by ecosystem managers can be limited by data access and visualization methods that require a priori knowledge about the various large and complex climate data products necessary to those impact assessments. In addition, it can be difficult to geographically and temporally integrate climate and ecological data to fully characterize climate-driven ecological impacts. To address these considerations, we have enhanced and extended the functionality of the NOAA National Climatic Data Center's Weather and Climate Toolkit (WCT). The WCT is a freely available Java-based tool designed to access and display NCDC's georeferenced climate data products (e.g., satellite, radar, and reanalysis gridded data). However, the WCT requires that users already know how to obtain the data products, which products are preferred for a given variable, and which products are most relevant to their needs. Developed in cooperation with research and management customers at the Gulf of the Farallones National Marine Sanctuary, the Integrated Marine Protected Area Climate Tools (IMPACT) modification to the WCT simplifies or eliminates these requirements, while simultaneously adding core analytical functionality to the tool. Designed for use by marine ecosystem managers, WCT-IMPACT accesses a suite of data products that have been identified as relevant to marine ecosystem climate impact assessments, such as NOAA's Climate Data Records. WCT-IMPACT regularly crops these products to the geographic boundaries of each included marine protected area (MPA), and those clipped regions are processed to produce MPA-specific analytics. The tool retrieves the most appropriate data files based on the user selection of MPA, environmental variable(s), and time frame. Once the data are loaded, they may be visualized, explored, analyzed, and exported to other formats (e.g., Google KML). Multiple variables may be simultaneously visualized using a 4-panel display and compared via a variety of statistics such as difference, probability, or correlation maps.
    [Figure: NCDC's Weather and Climate Toolkit image of NARR-A non-convective cloud cover (%) over the Pacific Coast on June 17, 2012, at 09:00 GMT.]
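
    Of the analytics named above, difference and correlation maps are straightforward to express on gridded data. The sketch below is a minimal, hypothetical illustration of that kind of computation; the array names, shapes and rectangular MPA mask are invented, and this is not WCT-IMPACT code.

      # Minimal, hypothetical illustration of MPA-clipped difference and correlation
      # maps on gridded data (not WCT-IMPACT code).
      import numpy as np

      rng = np.random.default_rng(0)
      ntime, ny, nx = 120, 50, 80                 # e.g., 10 years of monthly fields
      var_a = rng.random((ntime, ny, nx))         # product A (e.g., an SST anomaly record)
      var_b = rng.random((ntime, ny, nx))         # product B (e.g., a reanalysis field)

      mpa_mask = np.zeros((ny, nx), dtype=bool)   # True inside the marine protected area
      mpa_mask[10:30, 20:50] = True

      # Difference map: time-mean of A minus time-mean of B, clipped to the MPA
      diff_map = np.where(mpa_mask, var_a.mean(axis=0) - var_b.mean(axis=0), np.nan)

      # Correlation map: per-gridpoint Pearson correlation of the two time series
      a = var_a - var_a.mean(axis=0)
      b = var_b - var_b.mean(axis=0)
      corr_map = (a * b).sum(axis=0) / np.sqrt((a**2).sum(axis=0) * (b**2).sum(axis=0))
      corr_map = np.where(mpa_mask, corr_map, np.nan)

      print(np.nanmean(diff_map), np.nanmean(corr_map))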

  15. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and an analytical package designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are being advanced to hardware-encode components of the EDIFIS in order to address real-time operational requirements for health monitoring and management. This paper addresses the OPAD with its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD towards detection of high-energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitoring the space vehicle's internal and external environment.

  16. Biokinetics of Nanomaterials: the Role of Biopersistence.

    PubMed

    Laux, Peter; Riebeling, Christian; Booth, Andy M; Brain, Joseph D; Brunner, Josephine; Cerrillo, Cristina; Creutzenberg, Otto; Estrela-Lopis, Irina; Gebel, Thomas; Johanson, Gunnar; Jungnickel, Harald; Kock, Heiko; Tentschert, Jutta; Tlili, Ahmed; Schäffer, Andreas; Sips, Adriënne J A M; Yokel, Robert A; Luch, Andreas

    2017-04-01

    Nanotechnology risk management strategies and environmental regulations continue to rely on hazard and exposure assessment protocols developed for bulk materials, including larger size particles, while commercial application of nanomaterials (NMs) increases. In order to support and corroborate risk assessment of NMs for workers, consumers, and the environment, it is crucial to establish the impact of biopersistence of NMs at realistic doses. Such data will allow a more refined future categorization of NMs. Despite many experiments on NM characterization and numerous in vitro and in vivo studies, several questions remain unanswered, including the influence of biopersistence on the toxicity of NMs. It is unclear which criteria to apply to characterize a NM as biopersistent. Detection and quantification of NMs, especially determination of their state, i.e., dissolution, aggregation, and agglomeration within biological matrices and other environments, are still challenging tasks; moreover, mechanisms of nanoparticle (NP) translocation and persistence remain critical gaps. This review summarizes the current understanding of NM biokinetics, focusing on determinants of biopersistence. Thorough particle characterization in different exposure scenarios and biological matrices requires the use of suitable analytical methods and is a prerequisite to understanding biopersistence and to developing appropriate dosimetry. Analytical tools that can potentially facilitate elucidation of key NM characteristics, such as ion beam microscopy (IBM) and time-of-flight secondary ion mass spectrometry (ToF-SIMS), are discussed in relation to their potential to advance the understanding of biopersistent NM kinetics. We conclude that a major requirement for future nanosafety research is the development and application of analytical tools to characterize NPs in different exposure scenarios and biological matrices.

  17. A shipboard comparison of analytic methods for ballast water compliance monitoring

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed that require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the trip, including open ocean, North Sea, and coastal water, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms according to two size categories (≥ 50 μm in minimum dimension: 7 techniques, ≥ 10 μm and < 50 μm: 9 techniques). Water samples were analyzed in parallel to determine whether results were similar between methods and whether rapid, indicative methods offer comparable results to standard, time- and labor-intensive detailed methods (e.g. microscopy) and high-end scientific approaches (e.g. flow cytometry). Several promising indicative methods were identified that showed high correlation with microscopy, but allow much quicker processing and require less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. Results are useful to identify the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.

  18. Development and implementation of a mobile device-based pediatric electronic decision support tool as part of a national practice standardization project.

    PubMed

    McCulloh, Russell J; Fouquet, Sarah D; Herigon, Joshua; Biondi, Eric A; Kennedy, Brandan; Kerns, Ellen; DePorre, Adrienne; Markham, Jessica L; Chan, Y Raymond; Nelson, Krista; Newland, Jason G

    2018-06-07

    Implementing evidence-based practices requires a multi-faceted approach. Electronic clinical decision support (ECDS) tools may encourage evidence-based practice adoption. However, data regarding the role of mobile ECDS tools in pediatrics is scant. Our objective is to describe the development, distribution, and usage patterns of a smartphone-based ECDS tool within a national practice standardization project. We developed a smartphone-based ECDS tool for use in the American Academy of Pediatrics, Value in Inpatient Pediatrics Network project entitled "Reducing Excessive Variation in the Infant Sepsis Evaluation (REVISE)." The mobile application (app), PedsGuide, was developed using evidence-based recommendations created by an interdisciplinary panel. App workflow and content were aligned with clinical benchmarks; app interface was adjusted after usability heuristic review. Usage patterns were measured using Google Analytics. Overall, 3805 users across the United States downloaded PedsGuide from December 1, 2016, to July 31, 2017, leading to 14 256 use sessions (average 3.75 sessions per user). Users engaged in 60 442 screen views, including 37 424 (61.8%) screen views that displayed content related to the REVISE clinical practice benchmarks, including hospital admission appropriateness (26.8%), length of hospitalization (14.6%), and diagnostic testing recommendations (17.0%). Median user touch depth was 5 [IQR 5]. We observed rapid dissemination and in-depth engagement with PedsGuide, demonstrating feasibility for using smartphone-based ECDS tools within national practice improvement projects. ECDS tools may prove valuable in future national practice standardization initiatives. Work should next focus on developing robust analytics to determine ECDS tools' impact on medical decision making, clinical practice, and health outcomes.
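
    The usage figures above can be sanity-checked with a few lines of arithmetic. The snippet below simply re-derives the per-user session average and approximate per-topic view counts from the numbers quoted in the abstract; it is not the authors' Google Analytics pipeline.

      # Back-of-the-envelope check of the usage figures quoted in the abstract
      # (not the authors' Google Analytics pipeline).
      users = 3805
      sessions = 14256
      screen_views = 60442

      print(f"sessions per user: {sessions / users:.2f}")   # ~3.75, as reported

      # Approximate screen-view counts implied by the reported category shares
      shares = {"admission appropriateness": 0.268,
                "length of hospitalization": 0.146,
                "diagnostic testing": 0.170}
      for name, share in shares.items():
          print(f"{name}: ~{share * screen_views:,.0f} views")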

  19. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  20. A history of development in rotordynamics: A manufacturer's perspective

    NASA Technical Reports Server (NTRS)

    Shemeld, David E.

    1987-01-01

    The subject of rotordynamics and instability problems in high performance turbomachinery has been a topic of considerable industry discussion and debate over the last 15 or so years. This paper reviews an original equipment manufacturer's history of development of concepts and equipment as applicable to multistage centrifugal compressors. The variety of industry user compression requirements and resultant problematical situations tends to confound many of the theories and analytical techniques set forth. The experiences and examples described herein support the conclusion that successfully addressing potential rotordynamics problems is best served by a fundamental knowledge of the specific equipment, in addition to having the appropriate analytical tools, and that the final proof is in the doing.

  1. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on the development of collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring, through statistical indicators, economic, social and environmental developments and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet: for example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and collaborating on gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  2. Horizon Missions Methodology - Using new paradigms to overcome conceptual blocks to innovation

    NASA Technical Reports Server (NTRS)

    Anderson, John L.

    1993-01-01

    The Horizon Mission Methodology was developed to provide a systematic analytical approach for evaluating and identifying technological requirements for breakthrough technology options (BTOs) and for assessing their potential to provide revolutionary capabilities for advanced space missions. Here, attention is given to the further use of the methodology as a new tool for a broader range of studies dealing with technology innovation and new technology paradigms.

  3. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  4. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  5. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  6. Laser desorption ionization mass spectrometry: Recent progress in matrix-free and label-assisted techniques.

    PubMed

    Mandal, Arundhoti; Singha, Monisha; Addy, Partha Sarathi; Basak, Amit

    2017-10-13

    MALDI-based mass spectrometry has, over the last three decades, become an important analytical tool. It is a gentle ionization technique, usually applicable to detect and characterize analytes with high molecular weights, such as proteins and other macromolecules. The earlier difficulty in detecting low-molecular-weight analytes, such as small organic molecules and metal ion complexes, with this technique arose from the cluster of matrix-generated peaks in the low-molecular-weight region. To detect such molecules and metal ion complexes, a four-pronged strategy has been developed. This includes the use of alternate matrix materials, employment of new surface materials that require no matrix, use of metabolites that directly absorb the laser light, and laser-absorbing label-assisted LDI-MS (popularly known as LALDI-MS). This review will highlight the developments with all these strategies, with a special emphasis on LALDI-MS. © 2017 Wiley Periodicals, Inc.

  7. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1991-01-01

    Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.

  8. Analytical expression for Risken-Nummedal-Graham-Haken instability threshold in quantum cascade lasers.

    PubMed

    Vukovic, N; Radovanovic, J; Milanovic, V; Boiko, D L

    2016-11-14

    We have obtained a closed-form expression for the threshold of Risken-Nummedal-Graham-Haken (RNGH) multimode instability in a Fabry-Pérot (FP) cavity quantum cascade laser (QCL). This simple analytical expression is a versatile tool that can easily be applied in practical situations which require analysis of QCL dynamic behavior and estimation of its RNGH multimode instability threshold. Our model for a FP cavity laser accounts for the carrier coherence grating and carrier population grating as well as their relaxation due to carrier diffusion. In the model, the RNGH instability threshold is analyzed using a second-order bi-orthogonal perturbation theory and we confirm our analytical solution by a comparison with the numerical simulations. In particular, the model predicts a low RNGH instability threshold in QCLs. This agrees very well with experimental data available in the literature.

  9. Structural Design Optimization of Doubly-Fed Induction Generators Using GeneratorSE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, Latha; Fingersh, Lee J; Dykes, Katherine L

    2017-11-13

    A wind turbine with a larger rotor swept area can generate more electricity; however, this increases costs disproportionately for manufacturing, transportation, and installation. This poster presents analytical models for optimizing doubly-fed induction generators (DFIGs), with the objective of reducing the costs and mass of wind turbine drivetrains. The structural design for the induction machine includes models for the casing, stator, rotor, and high-speed shaft developed within the DFIG module in the National Renewable Energy Laboratory's wind turbine sizing tool, GeneratorSE. The mechanical integrity of the machine is verified by examining stresses, structural deflections, and modal properties. The optimization results are then validated using finite element analysis (FEA). The results suggest that our analytical model correlates with the FEA in some areas, such as radial deflection, differing by less than 20 percent. But the analytical model requires further development for axial deflections, torsional deflections, and stress calculations.

  10. Monte Carlo simulations of neutron-scattering instruments using McStas

    NASA Astrophysics Data System (ADS)

    Nielsen, K.; Lefmann, K.

    2000-06-01

    Monte Carlo simulations have become an essential tool for improving the performance of neutron-scattering instruments, since the level of sophistication in the design of instruments is defeating purely analytical methods. The program McStas, being developed at Risø National Laboratory, includes an extension language that makes it easy to adapt it to the particular requirements of individual instruments, and thus provides a powerful and flexible tool for constructing such simulations. McStas has been successfully applied in such areas as neutron guide design, flux optimization, non-Gaussian resolution functions of triple-axis spectrometers, and time-focusing in time-of-flight instruments.

  11. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  12. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

    Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using either direct web service calls, a python script, a unix-like shell client, or a javascript-based web application. Client packages in python, scala, or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
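
    As an illustration of the "direct web service calls" access route mentioned above, the sketch below issues standard WPS-style GetCapabilities and Execute requests with the Python requests library. The endpoint URL, operation identifier and data inputs are placeholders, not the actual CDAS API.

      # Hedged sketch of calling a WPS-style analytics endpoint; the host, operation
      # identifier and inputs are placeholders, not the actual CDAS API.
      import requests

      WPS_URL = "https://example.nasa.gov/cdas/wps"   # placeholder endpoint

      # Discover available server-side operations (standard WPS GetCapabilities request)
      caps = requests.get(WPS_URL, params={"service": "WPS",
                                           "request": "GetCapabilities",
                                           "version": "1.0.0"},
                          timeout=60)
      print(caps.status_code)

      # Ask the server to run an analysis operation close to the data
      # (operation name and inputs are illustrative only).
      execute_params = {
          "service": "WPS",
          "request": "Execute",
          "version": "1.0.0",
          "identifier": "timeseries.average",   # hypothetical operation name
          "datainputs": "variable=tas;dataset=MERRA2;region=-90,-180,90,180",
      }
      resp = requests.get(WPS_URL, params=execute_params, timeout=600)
      print(resp.text[:500])   # WPS responses are XML status/result documents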

  13. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication will present the advancement of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations in finding, accessing or estimating them will be presented, alongside a reflection on the relation between analytical scales and data availability.

  14. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge

  15. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels, and for which dedicated analytical tools are needed.

  16. Quantitative evaluation of the matrix effect in bioanalytical methods based on LC-MS: A comparison of two approaches.

    PubMed

    Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna

    2018-06-05

    Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for quantitation of the matrix effect. The CVs(%) of internal-standard-normalized matrix factors recommended by the European Medicines Agency were evaluated against internal-standard-normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We have tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences in the results obtained with the two calculation methods. After normalization with the internal standard, the CV(%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative in the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
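
    For readers unfamiliar with the EMA-style calculation referenced above, the sketch below shows the arithmetic for internal-standard-normalised matrix factors and their CV(%). The peak areas are invented for illustration and are not data from the study.

      # Sketch of the EMA-style matrix-factor calculation: internal-standard-normalised
      # matrix factors across matrix lots and their CV(%). Peak areas are invented.
      import statistics

      # Peak areas in post-extraction spiked matrix (one value per matrix lot)
      analyte_matrix = [1020, 980, 1005, 995, 1010, 990]
      is_matrix      = [5030, 4980, 5050, 4990, 5020, 5000]

      # Peak areas in neat solution at the same nominal concentration
      analyte_neat, is_neat = 1000.0, 5000.0

      # Matrix factor = area in presence of matrix / area in neat solution;
      # IS-normalised MF = analyte MF / internal-standard MF (per lot)
      norm_mf = [(a / analyte_neat) / (i / is_neat)
                 for a, i in zip(analyte_matrix, is_matrix)]

      cv_percent = 100 * statistics.stdev(norm_mf) / statistics.mean(norm_mf)
      print("IS-normalised matrix factors:", [round(x, 3) for x in norm_mf])
      print(f"CV of IS-normalised MF: {cv_percent:.1f}%")   # EMA guideline expects <= 15%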

  17. Cell chips as new tools for cell biology--results, perspectives and opportunities.

    PubMed

    Primiceri, Elisabetta; Chiriacò, Maria Serena; Rinaldi, Ross; Maruccio, Giuseppe

    2013-10-07

    Cell culture technologies were initially developed as research tools for studying cell functions, but nowadays they are essential for the biotechnology industry, with rapidly expanding applications requiring more and more advancements with respect to traditional tools. Miniaturization and integration of sensors and microfluidic components with cell culture techniques open the way to the development of cellomics as a new field of research targeting innovative analytic platforms for high-throughput studies. This approach enables advanced cell studies under controllable conditions by providing inexpensive, easy-to-operate devices. Thanks to their numerous advantages, cell chips have become a hotspot in the biosensors and bioelectronics fields and have been applied in very different areas. In this review exemplary applications will be discussed, for cell counting and detection, cytotoxicity assays, migration assays and stem cell studies.

  18. Realizing the Potential of Mobile Mental Health: New Methods for New Data in Psychiatry

    PubMed Central

    Staples, Patrick; Onnela, Jukka-Pekka

    2015-01-01

    Smartphones are now ubiquitous and can be harnessed to offer psychiatry a wealth of real-time data regarding patient behavior, self-reported symptoms, and even physiology. The data collected from smartphones meet the three criteria of big data: velocity, volume, and variety. Although these data have tremendous potential, transforming them into clinically valid and useful information requires using new tools and methods as a part of assessment in psychiatry. In this paper, we introduce and explore numerous analytical methods and tools from the computational and statistical sciences that appear readily applicable to psychiatric data collected using smartphones. By matching smartphone data with appropriate statistical methods, psychiatry can better realize the potential of mobile mental health and empower both patients and providers with novel clinical tools. PMID:26073363

  19. Crossing disciplines and scales to understand the critical zone

    USGS Publications Warehouse

    Brantley, S.L.; Goldhaber, M.B.; Vala, Ragnarsdottir K.

    2007-01-01

    The Critical Zone (CZ) is the system of coupled chemical, biological, physical, and geological processes operating together to support life at the Earth's surface. While our understanding of this zone has increased over the last hundred years, further advance requires scientists to cross disciplines and scales to integrate understanding of processes in the CZ, ranging in scale from the mineral-water interface to the globe. Despite the extreme heterogeneities manifest in the CZ, patterns are observed at all scales. Explanations require the use of new computational and analytical tools, inventive interdisciplinary approaches, and growing networks of sites and people.

  20. I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.

    Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.

  1. Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.

    PubMed

    Wieczerzak, M; Namieśnik, J; Kudłak, B

    2016-09-01

    For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry but could also complement, and sometimes provide an alternative to, conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Planning effectiveness may grow on fault trees.

    PubMed

    Chow, C W; Haddad, K; Mannino, B

    1991-10-01

    The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
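
    As a reminder of the basic mechanics behind the tool discussed above, the sketch below evaluates a tiny, made-up fault tree under the usual assumption of independent basic events; it is not the case-study tree from the article.

      # Minimal fault-tree arithmetic sketch (independent basic events assumed);
      # the tree and probabilities are invented, not the article's case study.
      def and_gate(*probs):
          """Probability that ALL contributing events occur."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      def or_gate(*probs):
          """Probability that AT LEAST ONE contributing event occurs."""
          p = 1.0
          for q in probs:
              p *= (1.0 - q)
          return 1.0 - p

      # Hypothetical basic-event probabilities for a planning failure
      p_market_misread   = 0.10
      p_staffing_gap     = 0.05
      p_funding_shortage = 0.08
      p_regulatory_delay = 0.04

      # Top event: the strategic threat materializes if the market is misread AND
      # at least one operational weakness is present.
      p_top = and_gate(p_market_misread,
                       or_gate(p_staffing_gap, p_funding_shortage, p_regulatory_delay))
      print(f"probability of top event: {p_top:.4f}")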

  3. Analytical solutions to non-Fickian subsurface dispersion in uniform groundwater flow

    USGS Publications Warehouse

    Zou, S.; Xia, J.; Koussis, Antonis D.

    1996-01-01

    Analytical solutions are obtained by the Fourier transform technique for the one-, two-, and three-dimensional transport of a conservative solute injected instantaneously in a uniform groundwater flow. These solutions account for dispersive non-linearity caused by the heterogeneity of the hydraulic properties of aquifer systems and can be used as building blocks to construct solutions by convolution (principle of superposition) for source conditions other than slug injection. The dispersivity is assumed to vary parabolically with time and is thus constant for the entire system at any given time. Two approaches for estimating time-dependent dispersion parameters are developed for two-dimensional plumes. They both require minimal field tracer test data and, therefore, represent useful tools for assessing real-world aquifer contamination sites. The first approach requires mapped plume-area measurements at two specific times after the tracer injection. The second approach requires concentration-versus-time data from two sampling wells through which the plume passes. Detailed examples and comparisons with other procedures show that the methods presented herein are sufficiently accurate and easier to use than other available methods.
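
    For orientation, the constant-coefficient building block that such Fourier-transform solutions generalise is the familiar two-dimensional Gaussian plume for an instantaneous slug injection in uniform flow. The form below is the standard textbook expression, with an assumed time-integrated dispersion term standing in for the paper's time-dependent dispersivity; it is not an equation copied from the paper.

      % Standard 2-D solution for an instantaneous injection of mass M in uniform flow
      % of seepage velocity v along x; tau_L and tau_T generalise D_L t and D_T t when
      % the dispersion coefficients vary in time (assumed notation, not the paper's).
      C(x,y,t) = \frac{M}{4\pi\, n\, b\, \sqrt{\tau_L(t)\,\tau_T(t)}}
                 \exp\!\left[-\frac{(x-vt)^2}{4\,\tau_L(t)} - \frac{y^2}{4\,\tau_T(t)}\right],
      \qquad \tau_i(t) = \int_0^t D_i(t')\,dt',

    where n is the porosity, b the aquifer thickness, and D_L, D_T the longitudinal and transverse dispersion coefficients; for constant coefficients, tau_i(t) reduces to D_i t.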

  4. Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.

    DOT National Transportation Integrated Search

    2015-01-01

    The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...

  5. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives that are frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step, for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. Since, main inborn errors of metabolism (IEM) diagnosable through uracil analysis and the therapeutic monitoring of toxic 5-fluoruracil (an important anti-cancerous drug) in dihydropyrimidine dehydrogenase deficient patients, require an ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurements. Therefore, keeping in view, the diagnostic value of uracil and 5-fluoruracil measurements, this article refines several analytical techniques involved in selective recognition and quantification of uracil and 5-fluoruracil from biological and pharmaceutical samples. The prospective study revealed that implementation of molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluoruracil had proven to be effective as it could obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences, from the complex matrices in real samples such as blood plasma, serum samples.

  6. Streamflow variability and optimal capacity of run-of-river hydropower plants

    NASA Astrophysics Data System (ADS)

    Basso, S.; Botter, G.

    2012-10-01

    The identification of the capacity of a run-of-river plant which allows for the optimal utilization of the available water resources is a challenging task, mainly because of the inherent temporal variability of river flows. This paper proposes an analytical framework to describe the energy production and the economic profitability of small run-of-river power plants on the basis of the underlying streamflow regime. We provide analytical expressions for the capacity which maximizes the produced energy as a function of the underlying flow duration curve and minimum environmental flow requirements downstream of the plant intake. Similar analytical expressions are derived for the capacity which maximizes the economic return deriving from construction and operation of a new plant. The analytical approach is applied to a minihydro plant recently proposed in a small Alpine catchment in northeastern Italy, evidencing the potential of the method as a flexible and simple design tool for practical application. The analytical model provides useful insight on the major hydrologic and economic controls (e.g., streamflow variability, energy price, costs) on the optimal plant capacity and helps in identifying policy strategies to reduce the current gap between the economic and energy optimizations of run-of-river plants.
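
    A numerical counterpart of the energy-maximisation part of this framework is easy to sketch. Everything in the snippet below (the flow duration curve, head, efficiency, environmental flow, and an assumed minimum technical flow for the turbine) is an invented placeholder used only to illustrate the capacity trade-off; it is not the paper's analytical solution.

      # Numerical illustration of the energy-vs-capacity trade-off; all inputs are
      # invented placeholders, not the paper's analytical solution.
      import numpy as np

      rho, g = 1000.0, 9.81            # water density [kg/m^3], gravity [m/s^2]
      head, efficiency = 45.0, 0.85    # gross head [m], overall plant efficiency [-]
      q_env = 0.3                      # environmental flow left in the river [m^3/s]
      hours_per_year = 8760.0

      d = np.linspace(1e-3, 1.0, 1000)                        # exceedance probability
      fdc = np.clip(5.0 * d**-0.35 - 4.0, 0.0, None)          # hypothetical flow duration curve [m^3/s]

      def annual_energy_mwh(capacity, min_tech_frac=0.2):
          """Mean annual energy [MWh] for a plant of the given capacity [m^3/s].

          The turbine is assumed to shut down when the divertible flow drops below
          min_tech_frac * capacity (an assumed operational constraint)."""
          available = np.clip(fdc - q_env, 0.0, None)     # flow left after the environmental release
          diverted = np.minimum(available, capacity)      # cap at plant capacity
          diverted = np.where(available >= min_tech_frac * capacity, diverted, 0.0)
          mean_power_w = efficiency * rho * g * head * np.trapz(diverted, d)
          return mean_power_w * hours_per_year / 1e6

      capacities = np.linspace(0.1, 10.0, 200)
      energies = [annual_energy_mwh(c) for c in capacities]
      best = capacities[int(np.argmax(energies))]
      print(f"energy-optimal capacity ~ {best:.2f} m^3/s ({max(energies):.0f} MWh/yr)")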

  7. Individual Human Cell Responses to Low Doses of Chemicals and Radiation Studied by Synchrotron Infrared Spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.

    2000-03-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes combined with other analytical tools may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low-doses of radiation and chemicals. In this study we used high spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stresses: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  8. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    PubMed

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  9. SensePath: Understanding the Sensemaking Process Through Analytic Provenance.

    PubMed

    Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob

    2016-01-01

    Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows building effective visual analytics tools to make sense of large and complex datasets. Currently, it is often a manual and time-consuming undertaking to comprehend this: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures the user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive, and it considerably reduced analysis time, allowing a better understanding of the sensemaking process.

  10. Immunochemistry for high-throughput screening of human exhaled breath condensate (EBC) media: implementation of automated Quanterix SIMOA instrumentation.

    PubMed

    Pleil, Joachim D; Angrish, Michelle M; Madden, Michael C

    2015-12-11

    Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low-level concentrations. Here we report on emerging technology implementing fully-automated ELISA capable of molecular level detection and describe application to exhaled breath condensate (EBC) samples. The Quanterix SIMOA HD-1 analyzer was evaluated for analytical performance for inflammatory cytokines (IL-6, TNF-α, IL-1β and IL-8). The system was challenged with human EBC, representing the most dilute and analytically difficult of the biological media. Calibrations from synthetic samples and spiked EBC showed excellent linearity at trace levels (r² > 0.99). Sensitivities varied by analyte, but were robust from ~0.006 (IL-6) to ~0.01 (TNF-α) pg ml⁻¹. All analytes demonstrated response suppression when diluted with deionized water and so assay buffer diluent was found to be a better choice. Analytical runs required ~45 min setup time for loading samples, reagents, calibrants, etc., after which the instrument performs without further intervention for up to 288 separate samples. Currently, available kits are limited to single-plex analyses and so sample volumes require adjustments. Sample dilutions should be made with assay diluent to avoid response suppression. Automation performs seamlessly and data are automatically analyzed and reported in spreadsheet format. The internal 5-parameter logistic (5PL) calibration model should be supplemented with a linear regression spline at the very lowest analyte levels (<1.3 pg ml⁻¹). The implementation of the automated Quanterix platform was successfully demonstrated using EBC, which poses the greatest challenge to ELISA due to limited sample volumes and low protein levels.
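
    A hedged sketch of the five-parameter logistic (5PL) calibration model referred to above, together with the kind of low-end linear supplement the authors suggest, is shown below; the parameter values and the measured response are invented for illustration and are not Quanterix calibration constants.

      # Hedged 5PL calibration sketch with a low-end linear supplement; parameter
      # values and the measured response are invented, not Quanterix constants.
      import numpy as np

      def five_pl(x, a, d, c, b, g):
          """5PL response model: a = zero-dose response, d = infinite-dose asymptote."""
          return d + (a - d) / (1.0 + (x / c) ** b) ** g

      def five_pl_inverse(y, a, d, c, b, g):
          """Back-calculate concentration from a measured response."""
          return c * (((a - d) / (y - d)) ** (1.0 / g) - 1.0) ** (1.0 / b)

      params = dict(a=0.02, d=18.0, c=50.0, b=1.1, g=0.9)   # invented calibration constants

      # Low-end linear spline: straight line through calibrants below ~1.3 pg/mL
      low_conc = np.array([0.01, 0.1, 0.3, 1.0])
      low_resp = five_pl(low_conc, **params)
      slope, intercept = np.polyfit(low_conc, low_resp, 1)

      measured = 0.15                                        # hypothetical instrument response
      print("5PL estimate   :", five_pl_inverse(measured, **params))
      print("linear estimate:", (measured - intercept) / slope)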

  11. Glucose Biosensors: An Overview of Use in Clinical Practice

    PubMed Central

    Yoo, Eun-Hyung; Lee, Soo-Youn

    2010-01-01

    Blood glucose monitoring has been established as a valuable tool in the management of diabetes. Since maintaining normal blood glucose levels is recommended, a series of suitable glucose biosensors have been developed. During the last 50 years, glucose biosensor technology including point-of-care devices, continuous glucose monitoring systems and noninvasive glucose monitoring systems has been significantly improved. However, there continues to be several challenges related to the achievement of accurate and reliable glucose monitoring. Further technical improvements in glucose biosensors, standardization of the analytical goals for their performance, and continuously assessing and training lay users are required. This article reviews the brief history, basic principles, analytical performance, and the present status of glucose biosensors in the clinical practice. PMID:22399892

  12. Design of High Field Solenoids made of High Temperature Superconductors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartalesi, Antonio; /Pisa U.

    2010-12-01

    This thesis starts from the analytical mechanical analysis of a superconducting solenoid loaded by self-generated Lorentz forces. A finite element model is also proposed and verified against the analytical results. To study the anisotropic behavior of a coil made of layers of superconductor and insulation, a finite element meso-mechanic model is proposed and designed. The resulting material properties are then used in the main solenoid analysis. In parallel, design work is performed as well: an existing Insert Test Facility (ITF) is adapted and structurally verified to support a coil made of YBa₂Cu₃O₇, a High Temperature Superconductor (HTS). Finally, a technological winding process is proposed and the required tooling is designed.
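
    For context, the self-field Lorentz loading referred to above is often first gauged with the textbook thin-winding hoop-stress estimate and the equivalent magnetic pressure; these are generic relations, not results taken from the thesis:

      % Generic first-order magnet-design estimates (not results from the thesis):
      % hoop stress in a thin winding under its own Lorentz body force, and the
      % equivalent magnetic pressure on the bore.
      \sigma_\theta \;\approx\; J\,B\,R,
      \qquad
      P_{\mathrm{mag}} \;=\; \frac{B^2}{2\mu_0},

    where J is the winding-average current density, B the local axial field, R the winding radius, and μ0 the vacuum permeability; the thesis's analytical and finite element models refine this picture for the layered, anisotropic coil.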

  13. Physics-based and human-derived information fusion for analysts

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael

    2017-05-01

    Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.

  14. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  15. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology.

    PubMed

    Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S

    2015-10-01

    The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  16. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  17. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  18. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  19. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

Large-scale ad hoc analytics of genomic data is popular using the R-programming language supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores not only how analytical jobs can be executed on the cloud with minimum effort, but also how both the resources and the data required by the job can be managed. An open-source light-weight framework for executing R-scripts using Bioconductor packages, referred to as 'RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  20. Health informatics and analytics - building a program to integrate business analytics across clinical and administrative disciplines.

    PubMed

    Tremblay, Monica Chiarini; Deckard, Gloria J; Klein, Richard

    2016-07-01

    Health care organizations must develop integrated health information systems to respond to the numerous government mandates driving the movement toward reimbursement models emphasizing value-based and accountable care. Success in this transition requires integrated data analytics, supported by the combination of health informatics, interoperability, business process design, and advanced decision support tools. This case study presents the development of a master's level cross- and multidisciplinary informatics program offered through a business school. The program provides students from diverse backgrounds with the knowledge, leadership, and practical application skills of health informatics, information systems, and data analytics that bridge the interests of clinical and nonclinical professionals. This case presents the actions taken and challenges encountered in navigating intra-university politics, specifying curriculum, recruiting the requisite interdisciplinary faculty, innovating the educational format, managing students with diverse educational and professional backgrounds, and balancing multiple accreditation agencies. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  2. New tools for investigating student learning in upper-division electrostatics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.

    Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.
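
    For readers outside physics, the first two mathematical tools named above can be illustrated with standard textbook expressions (these are generic electrostatics formulas, not material reproduced from the thesis):

      % Multivariable integration in the context of Coulomb's law (superposition integral):
      V(\vec{r}) = \frac{1}{4\pi\varepsilon_0}\int \frac{\rho(\vec{r}\,')}{\lvert \vec{r}-\vec{r}\,'\rvert}\, d\tau'
      % The Dirac delta function expressing the volume charge density of a point charge q at \vec{r}_0:
      \rho(\vec{r}) = q\,\delta^{3}(\vec{r}-\vec{r}_0)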

  3. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
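
    The coefficient of variation quoted above is simply the relative spread of replicate measurements; a minimal sketch of that comparison, with invented replicate values, is:

      # Hypothetical replicate peak areas for one metabolite extracted with the
      # automated device versus manually; the numbers are invented for illustration.
      import numpy as np

      automated = np.array([10.2, 9.8, 10.5, 10.1, 9.9])
      manual = np.array([10.4, 9.1, 11.0, 9.6, 10.8])

      def cv_percent(x):
          """Coefficient of variation: sample SD divided by mean, in percent."""
          return 100.0 * x.std(ddof=1) / x.mean()

      print(f"automated CV: {cv_percent(automated):.1f}%")
      print(f"manual CV:    {cv_percent(manual):.1f}%")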

  4. Design and analysis of seals for extended service life

    NASA Technical Reports Server (NTRS)

    Bower, Mark V.

    1992-01-01

Space Station Freedom is being developed for a service life of up to thirty years. As a consequence, the design requirements for the seals to be used are unprecedented. Full-scale testing to assure that the selected seals can satisfy the design requirements is not feasible. As an alternative, a sub-scale test program has been developed by MSFC to calibrate the analysis tools to be used to certify the proposed design. This research has been conducted in support of the MSFC Integrated Seal Test Program. The ultimate objective of this research is to correlate analysis and test results to qualify the analytical tools, which, in turn, are to be used to qualify the flight hardware. This research is totally focused on O-rings that are compressed by perpendicular clamping forces. In this type of seal the O-ring is clamped between the sealing surfaces by loads perpendicular to the circular cross section.

  5. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

Petroleum and organic matter from which the petroleum is derived are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring these geochemical data. Due to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods, as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.

    PubMed

    Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio

    2009-12-01

    Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
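
    A minimal sketch of the image-based quantification step described above, assuming a single-channel 2-D fluorescence image and using generic scikit-image routines (not the toolchain of any particular database), might look like:

      # Minimal organelle segmentation/quantification sketch using scikit-image.
      # Assumes a 2-D single-channel fluorescence image; data are synthetic.
      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      rng = np.random.default_rng(0)
      image = rng.random((128, 128))           # stand-in for a real micrograph
      image[30:40, 30:40] += 2.0               # two bright "organelles"
      image[80:95, 60:70] += 2.0

      mask = image > threshold_otsu(image)     # global Otsu threshold
      labels = label(mask)                     # connected-component labelling

      # Extract per-organelle parameters (number, size) from the label image.
      regions = regionprops(labels)
      print("count:", len(regions))
      print("areas:", [r.area for r in regions])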

  7. Population dynamics of Aphis gossypii Glover and in sole and intercropping systems of cotton and cowpea.

    PubMed

    Fernandes, Francisco S; Godoy, Wesley A C; Ramalho, Francisco S; Garcia, Adriano G; Santos, Bárbara D B; Malaquias, José B

    2018-01-01

Population dynamics of aphids have been studied in sole and intercropping systems. These studies have required the use of more precise analytical tools in order to better understand patterns in quantitative data. Mathematical models are among the most important tools to explain the dynamics of insect populations. This study investigated the population dynamics of the aphids Aphis gossypii and Aphis craccivora over time, using mathematical models composed of a set of differential equations as an analytical tool to understand aphid dynamics in arrangements of cotton and cowpea. The treatments were sole cotton, sole cowpea, and three arrangements of cotton intercropped with cowpea (t1, t2 and t3). The plants were infested with the two aphid species and were evaluated at 7, 14, 28, 35, 42, and 49 days after the infestations. The mathematical models were fitted to the population dynamics of the two aphid species and provided good fits over time. The highest population peak of both species A. gossypii and A. craccivora was found in the sole crops, and the lowest population peak was found in crop system t2. These results are important for integrated management programs of aphids in cotton and cowpea.
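
    The differential-equation approach referred to above can be illustrated, in greatly simplified form, by a single-species logistic model solved numerically; the parameters below are invented and do not reproduce the paper's fitted model:

      # Simplified single-species logistic growth model, dN/dt = r*N*(1 - N/K),
      # as a stand-in for the coupled aphid equations; parameters are illustrative.
      import numpy as np
      from scipy.integrate import odeint

      def logistic(N, t, r, K):
          return r * N * (1.0 - N / K)

      t = np.array([0, 7, 14, 28, 35, 42, 49], dtype=float)  # evaluation days
      r, K, N0 = 0.25, 500.0, 5.0                             # invented parameters
      N = odeint(logistic, N0, t, args=(r, K)).ravel()
      for day, n in zip(t, N):
          print(f"day {day:4.0f}: {n:7.1f} aphids")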

  8. Simulation supported POD for RT test case-concept and modeling

    NASA Astrophysics Data System (ADS)

    Gollwitzer, C.; Bellon, C.; Deresch, A.; Ewert, U.; Jaenisch, G.-R.; Zscherpel, U.; Mistral, Q.

    2012-05-01

Within the framework of the European project PICASSO, the radiographic simulator aRTist (analytical Radiographic Testing inspection simulation tool) developed by BAM has been extended for reliability assessment of film and digital radiography. NDT of safety-relevant aerospace components requires proof of the probability of detection (POD) of the inspection. Modeling tools can reduce the expense of such extended, time-consuming NDT trials, provided the simulation results fit the experiment. Our analytic simulation tool consists of three modules for the description of the radiation source, the interaction of radiation with test pieces and flaws, and the detection process with special focus on film and digital industrial radiography. It features high processing speed with near-interactive frame rates and a high level of realism. A concept has been developed as well as a software extension for reliability investigations, completed by a user interface for planning automatic simulations with varying parameters and defects. Furthermore, an automatic image analysis procedure is included to evaluate the defect visibility. Radiographic models generated from 3D CAD of aero engine components and quality test samples are compared as a precondition for real trials. This enables the evaluation and optimization of film replacement for application of modern digital equipment for economical NDT and defined POD.
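
    Probability of detection is commonly summarised by fitting a hit/miss curve against defect size; the generic sketch below (invented data, not the aRTist/PICASSO workflow) shows the idea with a logistic regression on log defect size:

      # Generic hit/miss POD curve: logistic regression of detection outcome on
      # log defect size. Data are invented; scikit-learn applies mild L2
      # regularisation by default, which is acceptable for this illustration.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      size = np.array([0.2, 0.3, 0.4, 0.5, 0.7, 1.0, 1.5, 2.0, 3.0, 4.0])  # mm
      hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])   # 1 = defect detected

      X = np.log(size).reshape(-1, 1)
      model = LogisticRegression().fit(X, hit)

      # POD(a): predicted detection probability as a function of defect size a.
      a = np.array([0.5, 1.0, 2.0])
      pod = model.predict_proba(np.log(a).reshape(-1, 1))[:, 1]
      print(dict(zip(a, pod.round(2))))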

  9. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.

  10. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.
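
    One common way to realise the network approach described above is to weight each edge by the negative log of the probability that the adversary traverses it undetected, so that the shortest path is the most critical one. The sketch below uses an invented toy facility graph and illustrative probabilities, not the paper's model:

      # Toy adversary-path model: edges carry the probability of non-detection,
      # converted to -log weights so the shortest path is the most vulnerable one.
      # The graph and probabilities are invented for illustration.
      import math
      import networkx as nx

      G = nx.DiGraph()
      edges = [                      # (from, to, probability adversary is NOT detected)
          ("offsite", "fence", 0.9),
          ("fence", "building", 0.5),
          ("fence", "loading_dock", 0.8),
          ("building", "vault", 0.3),
          ("loading_dock", "vault", 0.6),
      ]
      for u, v, p_miss in edges:
          G.add_edge(u, v, weight=-math.log(p_miss), p_miss=p_miss)

      path = nx.shortest_path(G, "offsite", "vault", weight="weight")
      p_undetected = math.exp(-nx.shortest_path_length(G, "offsite", "vault", weight="weight"))
      print("most critical path:", path)
      print("probability of reaching target undetected: %.2f" % p_undetected)
      print("detection effectiveness along that path:   %.2f" % (1 - p_undetected))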

  11. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  12. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  13. SAM Chemical Methods Query

    EPA Pesticide Factsheets

Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  14. Using Learning Analytics to Support Engagement in Collaborative Writing

    ERIC Educational Resources Information Center

    Liu, Ming; Pardo, Abelardo; Liu, Li

    2017-01-01

    Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…

  15. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  16. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  17. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  18. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

In the present paper, the apparatus and theory of surface analysis are introduced, and the progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis) and AES (Auger electron spectroscopy), the advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS will become a star in the elemental analysis field, just like LIBS (laser-induced breakdown spectroscopy).

  19. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

Conceptual design is a subset of concept art in which a new product idea is created, rather than a visual representation that would be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process (AHP) is employed to obtain the priority ratings of the CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps establish the importance ratings of the DRs.
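
    The AHP step mentioned above amounts to taking the normalised principal eigenvector of a pairwise-comparison matrix as the priority vector for the CRs; a minimal sketch with an invented 3x3 comparison matrix follows:

      # AHP priority ratings from a pairwise-comparison matrix (invented example):
      # the normalised principal eigenvector gives the weights of the CRs.
      import numpy as np

      # A[i, j] = how much more important CR_i is than CR_j (Saaty 1-9 scale).
      A = np.array([
          [1.0,   3.0, 5.0],
          [1/3.0, 1.0, 2.0],
          [1/5.0, 1/2.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                  # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                 # normalised priority vector

      # Consistency index (lambda_max - n)/(n - 1); small values indicate
      # internally consistent pairwise judgements.
      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)
      print("priorities:", w.round(3), " CI: %.3f" % ci)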

  20. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  1. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  2. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.
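
    The role of analytic sensitivity derivatives in a gradient-based optimizer can be seen in miniature in the sketch below, which minimises an invented weight-like objective subject to a margin-style inequality constraint; it is a schematic stand-in, not the aeroservoelastic formulation of the paper:

      # Schematic gradient-based optimization with analytic derivatives (SLSQP):
      # minimise a weight-like objective subject to a "stability margin" constraint.
      # Objective, constraint, and numbers are invented stand-ins.
      import numpy as np
      from scipy.optimize import minimize

      def weight(x):            # surrogate structural-weight objective
          return x[0]**2 + 2.0 * x[1]**2

      def weight_grad(x):       # analytic sensitivity derivatives of the objective
          return np.array([2.0 * x[0], 4.0 * x[1]])

      def margin(x):            # surrogate flutter/stability margin, must be >= 0
          return x[0] + x[1] - 1.0

      res = minimize(weight, x0=[1.0, 1.0], jac=weight_grad, method="SLSQP",
                     constraints=[{"type": "ineq", "fun": margin,
                                   "jac": lambda x: np.array([1.0, 1.0])}])
      print(res.x, res.fun)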

  3. Toward best practice: leveraging the electronic patient record as a clinical data warehouse.

    PubMed

    Ledbetter, C S; Morgan, M W

    2001-01-01

    Automating clinical and administrative processes via an electronic patient record (EPR) gives clinicians the point-of-care tools they need to deliver better patient care. However, to improve clinical practice as a whole and then evaluate it, healthcare must go beyond basic automation and convert EPR data into aggregated, multidimensional information. Unfortunately, few EPR systems have the established, powerful analytical clinical data warehouses (CDWs) required for this conversion. This article describes how an organization can support best practice by leveraging a CDW that is fully integrated into its EPR and clinical decision support (CDS) system. The article (1) discusses the requirements for comprehensive CDS, including on-line analytical processing (OLAP) of data at both transactional and aggregate levels, (2) suggests that the transactional data acquired by an OLTP EPR system must be remodeled to support retrospective, population-based, aggregate analysis of those data, and (3) concludes that this aggregate analysis is best provided by a separate CDW system.
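
    Point (2) above, remodelling transactional records for aggregate, population-based analysis, can be illustrated with a small pandas pivot over invented encounter-level rows (the schema and values are hypothetical):

      # Hypothetical illustration of remodelling transactional EPR rows into an
      # aggregate, population-level view; columns and values are invented.
      import pandas as pd

      transactions = pd.DataFrame([
          {"unit": "ICU",  "diagnosis": "sepsis",    "los_days": 7},
          {"unit": "ICU",  "diagnosis": "sepsis",    "los_days": 9},
          {"unit": "ICU",  "diagnosis": "pneumonia", "los_days": 4},
          {"unit": "Ward", "diagnosis": "pneumonia", "los_days": 3},
      ])

      # Aggregate (OLAP-style) view: mean length of stay and case counts
      # by unit and diagnosis, suitable for retrospective practice evaluation.
      cube = transactions.pivot_table(index="unit", columns="diagnosis",
                                      values="los_days", aggfunc=["mean", "count"])
      print(cube)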

  4. Current and future bioanalytical approaches for stroke assessment.

    PubMed

    Pullagurla, Swathi R; Baird, Alison E; Adamski, Mateusz G; Soper, Steven A

    2015-01-01

    Efforts are underway to develop novel platforms for stroke diagnosis to meet the criteria for effective treatment within the narrow time window mandated by the FDA-approved therapeutic (<3 h). Blood-based biomarkers could be used for rapid stroke diagnosis and coupled with new analytical tools, could serve as an attractive platform for managing stroke-related diseases. In this review, we will discuss the physiological processes associated with stroke and current diagnostic tools as well as their associated shortcomings. We will then review information on blood-based biomarkers and various detection technologies. In particular, point of care testing that permits small blood volumes required for the analysis and rapid turn-around time measurements of multiple markers will be presented.

  5. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
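
    The collocation idea behind the tool can be sketched with a simpler scheme than the one used in the paper: a trapezoidal defect constraint that the optimizer drives to zero on every segment. The dynamics and numbers below are illustrative only, not the Legendre-Gauss-Lobatto formulation:

      # Simplified direct-collocation defect: for dx/dt = f(x, u), the trapezoidal
      # defect x[k+1] - x[k] - h/2*(f_k + f_{k+1}) must be driven to zero by the
      # optimizer. (The tool in the abstract uses LGL collocation; this is a
      # deliberately simpler stand-in with invented dynamics.)
      import numpy as np

      def f(x, u):                      # toy dynamics: dx/dt = -x + u
          return -x + u

      t = np.linspace(0.0, 1.0, 11)     # discretisation nodes
      h = np.diff(t)
      x = np.exp(-t)                    # candidate state history (exact for u = 0)
      u = np.zeros_like(t)              # candidate control history

      defects = x[1:] - x[:-1] - 0.5 * h * (f(x[:-1], u[:-1]) + f(x[1:], u[1:]))
      print("max |defect|:", np.abs(defects).max())   # small, since x is near-feasible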

  6. Problem Solving in a Middle School Robotics Design Classroom

    NASA Astrophysics Data System (ADS)

    Norton, Stephen J.; McRobbie, Campbell J.; Ginns, Ian S.

    2007-07-01

    Little research has been conducted on how students work when they are required to plan, build and evaluate artefacts in technology rich learning environments such as those supported by tools including flow charts, Labview programming and Lego construction. In this study, activity theory was used as an analytic tool to examine the social construction of meaning. There was a focus on the effect of teachers’ goals and the rules they enacted upon student use of the flow chart planning tool, and the tools of the programming language Labview and Lego construction. It was found that the articulation of a teacher’s goals via rules and divisions of labour helped to form distinct communities of learning and influenced the development of different problem solving strategies. The use of the planning tool flow charting was associated with continuity of approach, integration of problem solutions including appreciation of the nexus between construction and programming, and greater educational transformation. Students who flow charted defined problems in a more holistic way and demonstrated more methodical, insightful and integrated approaches to their use of tools. The findings have implications for teaching in design dominated learning environments.

  7. Phase-0/microdosing studies using PET, AMS, and LC-MS/MS: a range of study methodologies and conduct considerations. Accelerating development of novel pharmaceuticals through safe testing in humans - a practical guide.

    PubMed

    Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T

    2017-05-01

    Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use and implied safety of limited exposures to test articles. Use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometer (AMS), Positron Emission Tomography (PET), and Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade demonstrated the reliability of extrapolation of sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilization of combinations of these analytic techniques increases the versatility of study designs and the power of data obtained.

  8. Programmable living material containing reporter micro-organisms permits quantitative detection of oligosaccharides.

    PubMed

    Mora, Carlos A; Herzog, Antoine F; Raso, Renzo A; Stark, Wendelin J

    2015-08-01

    The increasing molecular understanding of many diseases today permits the development of new diagnostic methods. However, few easy-to-handle and inexpensive tools exist for common diseases such as food disorders. Here we present a living material based analytical sensor (LiMBAS) containing genetically modified bacteria (Escherichia coli) immobilized and protected in a thin layer between a nanoporous and support polymer membrane for a facile quantification of disease-relevant oligosaccharides. The bacteria were engineered to fluoresce in response to the analyte to reveal its diffusion behavior when using a blue-light source and optical filter. We demonstrated that the diffusion zone diameter was related semi-logarithmically to the analyte concentration. LiMBAS could accurately quantify lactose or galactose in undiluted food samples and was able to measure food intolerance relevant concentrations in the range of 1-1000 mM requiring a sample volume of 1-10 μL. LiMBAS was storable for at least seven days without losing functionality at 4 °C. A wide range of genetic tools for E. coli are readily available thus allowing the reprogramming of the material to serve as biosensor for other molecules. In combination with smartphones, an automated diagnostic analysis becomes feasible which would also allow untrained people to use LiMBAS. Copyright © 2015 Elsevier Ltd. All rights reserved.
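
    The semi-logarithmic relationship reported above corresponds to a calibration of the form d = a + b*log10(c); fitting it to standards and inverting it for an unknown takes a few lines (all values invented):

      # Semi-log calibration, diameter = a + b*log10(concentration), and its
      # inversion to quantify an unknown sample; numbers are invented.
      import numpy as np

      conc_mM = np.array([1, 10, 100, 1000], dtype=float)      # standards
      diameter_mm = np.array([2.1, 4.0, 5.9, 8.1])             # measured zones

      b, a = np.polyfit(np.log10(conc_mM), diameter_mm, 1)     # slope, intercept

      def concentration_from_diameter(d):
          """Invert the calibration to estimate analyte concentration (mM)."""
          return 10 ** ((d - a) / b)

      print("fit: d = %.2f + %.2f*log10(c)" % (a, b))
      print("unknown with d = 5.0 mm -> %.0f mM" % concentration_from_diameter(5.0))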

  9. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections only rely on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; and iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.

  10. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, the TED analytical implementation is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point-of-care framework.

  11. Customisation of the exome data analysis pipeline using a combinatorial approach.

    PubMed

    Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay

    2012-01-01

The advent of next generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, the aforementioned approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the most suitable combination as determined by a simple framework of pre-existing metrics to create significant datasets.
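
    The consensus idea described above, retaining only variants reported by more than one caller/aligner combination, reduces to a vote over normalised variant keys; a hedged sketch with invented call sets follows:

      # Consensus filtering across variant callers: keep variants reported by at
      # least two tools. Call sets are invented; keys are (chrom, pos, ref, alt).
      from collections import Counter

      calls = {
          "caller_A": {("chr1", 10177, "A", "AC"), ("chr1", 10352, "T", "TA")},
          "caller_B": {("chr1", 10177, "A", "AC"), ("chr2", 45895, "G", "A")},
          "caller_C": {("chr1", 10177, "A", "AC"), ("chr1", 10352, "T", "TA")},
      }

      votes = Counter(v for callset in calls.values() for v in callset)
      consensus = {v for v, n in votes.items() if n >= 2}
      print(sorted(consensus))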

  12. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

In this short communication, UV-Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical system that could serve as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
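
    The absorbance-concentration correlation used for quantification is an ordinary linear (Beer-Lambert-type) calibration at a fixed wavelength; a minimal sketch with invented standards:

      # Linear calibration of absorbance vs. lignin concentration at one wavelength
      # (e.g., 280 nm), then prediction for an unknown; all values are invented.
      import numpy as np
      from scipy.stats import linregress

      conc_g_per_L = np.array([0.02, 0.05, 0.10, 0.20, 0.40])
      absorbance   = np.array([0.11, 0.27, 0.55, 1.08, 2.15])

      fit = linregress(conc_g_per_L, absorbance)
      unknown_abs = 0.80
      unknown_conc = (unknown_abs - fit.intercept) / fit.slope
      print("slope %.2f L/g, r^2 %.4f, unknown ~ %.3f g/L"
            % (fit.slope, fit.rvalue**2, unknown_conc))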

  13. Shuttle cryogenics supply system. Optimization study. Volume 5 B-4: Programmers manual for space shuttle orbit injection analysis (SOPSA)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A computer program for space shuttle orbit injection propulsion system analysis (SOPSA) is described to show the operational characteristics and the computer system requirements. The program was developed as an analytical tool to aid in the preliminary design of propellant feed systems for the space shuttle orbiter main engines. The primary purpose of the program is to evaluate the propellant tank ullage pressure requirements imposed by the need to accelerate propellants rapidly during the engine start sequence. The SOPSA program will generate parametric feed system pressure histories and weight data for a range of nominal feedline sizes.

  14. Energy-dispersive X-ray fluorescence systems as analytical tool for assessment of contaminated soils.

    PubMed

    Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof

    2004-04-01

To determine the heavy metal content in soil samples at contaminated locations, a static and time-consuming procedure is used in most cases. Soil samples are collected and analyzed in the laboratory at high quality and high analytical costs. The demand from government and consultants for a more dynamic approach, and from customers who require analyses to be performed in the field with immediate feedback of the analytical results, is growing. Especially during the follow-up of remediation projects or during the determination of the sampling strategy, field analyses are advisable. For this purpose four types of ED-XRF systems, ranging from portable up to high performance laboratory systems, have been evaluated. The evaluation criteria are based on the performance characteristics of all the ED-XRF systems, such as limit of detection, accuracy and measurement uncertainty on the one hand, and the influence of sample pretreatment on the obtained results on the other hand. The study proved that the field portable system and the bench top system, placed in a mobile van, can be applied as field techniques, resulting in semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system. The accuracy of the results and the measurement uncertainty also improved using the bench top system. Therefore, the selection criteria for applicability of both field systems are based on the required detection level and also the required accuracy of the results.
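
    Limits of detection of the kind compared above are typically estimated as three times the standard deviation of blank (or low-level) readings divided by the calibration slope; a generic sketch with invented numbers:

      # Generic 3-sigma limit-of-detection estimate for an XRF calibration;
      # blank readings and slope are invented for illustration.
      import numpy as np

      blank_counts = np.array([102, 98, 105, 99, 101, 103], dtype=float)
      slope_counts_per_mg_kg = 12.5          # calibration sensitivity

      lod = 3.0 * blank_counts.std(ddof=1) / slope_counts_per_mg_kg
      print("estimated LOD: %.2f mg/kg" % lod)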

  15. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  16. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of a sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, event-driven programming model, enterprise service bus, and adaptive user interfaces to provide a strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment in Oak Ridge National Laboratory (ORNL).

  17. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects underlying their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  18. Analytical Approaches to Verify Food Integrity: Needs and Challenges.

    PubMed

    Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M

    2016-09-01

A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing as well the food supply chain and the future requirements needed to mitigate food fraud more effectively. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood that fraudsters could penetrate the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and possibly sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is already evident today, for instance in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. The food industry has taken these many challenges on board, working closely with all stakeholders and continuously communicating on progress in a fully transparent manner.

  19. 7 CFR 400.172 - Qualifying with less than two of the required ratios or ten of the analytical ratios meeting the...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... or ten of the analytical ratios meeting the specified requirements. 400.172 Section 400.172... required ratios or ten of the analytical ratios meeting the specified requirements. An insurer with less than two of the required ratios or ten of the analytical ratios meeting the specified requirements in...

  20. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  1. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  2. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  3. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  4. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  5. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  6. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  7. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite these housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there has been a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease associations of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding the potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  8. Detection of nanomaterials in food and consumer products: bridging the gap from legislation to enforcement.

    PubMed

    Stamm, H; Gibson, N; Anklam, E

    2012-08-01

    This paper describes the requirements and resulting challenges for the implementation of current and upcoming European Union legislation referring to the use of nanomaterials in food, cosmetics and other consumer products. The European Commission has recently adopted a recommendation for the definition of nanomaterials. There is now an urgent need for appropriate and fit-for-purpose analytical methods in order to identify nanomaterials properly according to this definition and to assess whether or not a product contains nanomaterials. Considering the lack of such methods to date, this paper elaborates on the challenges of the legislative framework and the type of methods needed, not only to facilitate implementation of labelling requirements, but also to ensure the safety of products coming to the market. Considering the many challenges in the analytical process itself, such as interaction of nanoparticles with matrix constituents, potential agglomeration and aggregation due to matrix environment, broad variety of matrices, etc., there is a need for integrated analytical approaches, not only for sample preparation (e.g. separation from matrix), but also for the actual characterisation. Furthermore, there is an urgent need for quality assurance tools such as validated methods and (certified) reference materials, including materials containing nanoparticles in a realistic matrix (food products, cosmetics, etc.).

  9. A Liquid Chromatography – Tandem Mass Spectrometry Approach for the Identification of Mebendazole Residue in Pork, Chicken, and Horse

    PubMed Central

    Lee, Ji Sun; Cho, Soo Hee; Lim, Chae Mi; Chang, Moon Ik; Joo, Hyun Jin; Park, Hyun Jin

    2017-01-01

A confirmatory and quantitative liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the determination of mebendazole and its hydrolyzed and reduced metabolites in pork, chicken, and horse muscles was developed and validated in this study. Anthelmintic compounds were extracted with ethyl acetate after the sample mixture was made alkaline, followed by liquid chromatographic separation on a reversed-phase C18 column. Gradient elution was performed with a mobile phase consisting of water containing 10 mM ammonium formate and methanol. The confirmatory method was validated according to EU requirements. Evaluated validation parameters included specificity, accuracy, precision (repeatability and within-laboratory reproducibility), analytical limits (decision limit and detection capability), and applicability. Most parameters were shown to conform to the EU requirements. The decision limit (CCα) and detection capability (CCβ) for all analytes ranged from 15.84 to 17.96 μg kg-1. The limit of detection (LOD) and the limit of quantification (LOQ) for all analytes were 0.07 μg kg-1 and 0.2 μg kg-1, respectively. The developed method was successfully applied to monitoring samples collected from markets in major cities and showed great potential as a regulatory tool for determining mebendazole residues in animal-based foods. PMID:28085912
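
    The reported LOD and LOQ invite a brief illustration of how such analytical limits are commonly estimated from a calibration curve (the ICH-style 3.3σ/S and 10σ/S relations). The sketch below is a generic Python illustration with hypothetical calibration data; it is not the validation procedure used in the paper.

```python
import numpy as np

def lod_loq_from_calibration(conc, response):
    """Estimate LOD/LOQ from a linear calibration curve using the ICH-style
    relations LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where S is the slope
    and sigma the residual standard deviation of the fit.
    Generic illustration; not necessarily the method used in the paper."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # ddof=2: two fitted parameters
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration points (spiked concentration in ug/kg vs. peak area)
lod, loq = lod_loq_from_calibration(
    [0.1, 0.5, 1.0, 5.0, 10.0],
    [210, 1050, 2080, 10400, 20900],
)
print(f"LOD ~ {lod:.3f} ug/kg, LOQ ~ {loq:.3f} ug/kg")
```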

  10. A web-based data visualization tool for the MIMIC-II database.

    PubMed

    Lee, Joon; Ribey, Evan; Wallace, James R

    2016-02-04

    Although MIMIC-II, a public intensive care database, has been recognized as an invaluable resource for many medical researchers worldwide, becoming a proficient MIMIC-II researcher requires knowledge of SQL programming and an understanding of the MIMIC-II database schema. These are challenging requirements especially for health researchers and clinicians who may have limited computer proficiency. In order to overcome this challenge, our objective was to create an interactive, web-based MIMIC-II data visualization tool that first-time MIMIC-II users can easily use to explore the database. The tool offers two main features: Explore and Compare. The Explore feature enables the user to select a patient cohort within MIMIC-II and visualize the distributions of various administrative, demographic, and clinical variables within the selected cohort. The Compare feature enables the user to select two patient cohorts and visually compare them with respect to a variety of variables. The tool is also helpful to experienced MIMIC-II researchers who can use it to substantially accelerate the cumbersome and time-consuming steps of writing SQL queries and manually visualizing extracted data. Any interested researcher can use the MIMIC-II data visualization tool for free to quickly and conveniently conduct a preliminary investigation on MIMIC-II with a few mouse clicks. Researchers can also use the tool to learn the characteristics of the MIMIC-II patients. Since it is still impossible to conduct multivariable regression inside the tool, future work includes adding analytics capabilities. Also, the next version of the tool will aim to utilize MIMIC-III which contains more data.
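
    As a rough illustration of the kind of cohort-selection query the tool automates, the following Python sketch runs a parameterised SQL query and summarises the result. The schema, table names, and file path are hypothetical stand-ins; the real MIMIC-II schema differs and requires credentialed access.

```python
import sqlite3
from statistics import median

# Hypothetical, simplified schema standing in for MIMIC-II; the real database
# uses different table and column names and requires credentialed access.
COHORT_QUERY = """
SELECT p.sex, p.age, i.icu_los_hours
FROM patients AS p
JOIN icu_stays AS i ON i.patient_id = p.patient_id
WHERE p.age BETWEEN ? AND ?
"""

def explore_cohort(db_path, age_min, age_max):
    """Select a cohort by age range and summarise a few variables,
    mimicking the kind of query the tool's 'Explore' feature automates."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(COHORT_QUERY, (age_min, age_max)).fetchall()
    if not rows:
        return {"n_patients": 0}
    females = sum(1 for r in rows if r[0] == "F")
    return {
        "n_patients": len(rows),
        "pct_female": round(100.0 * females / len(rows), 1),
        "median_icu_los_hours": median(r[2] for r in rows),
    }

# Example (requires a local SQLite file with the hypothetical schema above):
# print(explore_cohort("icu_demo.db", 65, 80))
```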

  11. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  12. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  13. Analysis of laparoscopy in trauma.

    PubMed

    Villavicencio, R T; Aucar, J A

    1999-07-01

    The optimum roles for laparoscopy in trauma have yet to be established. To date, reviews of laparoscopy in trauma have been primarily descriptive rather than analytic. This article analyzes the results of laparoscopy in trauma. Outcome analysis was done by reviewing 37 studies with more than 1,900 trauma patients, and laparoscopy was analyzed as a screening, diagnostic, or therapeutic tool. Laparoscopy was regarded as a screening tool if it was used to detect or exclude a positive finding (eg, hemoperitoneum, organ injury, gastrointestinal spillage, peritoneal penetration) that required operative exploration or repair. Laparoscopy was regarded as a diagnostic tool when it was used to identify all injuries, rather than as a screening tool to identify the first indication for a laparotomy. It was regarded as a diagnostic tool only in studies that mandated a laparotomy (gold standard) after laparoscopy to confirm the diagnostic accuracy of laparoscopic findings. Costs and charges for using laparoscopy in trauma were analyzed when feasible. As a screening tool, laparoscopy missed 1% of injuries and helped prevent 63% of patients from having a trauma laparotomy. When used as a diagnostic tool, laparoscopy had a 41% to 77% missed injury rate per patient. Overall, laparoscopy carried a 1% procedure-related complication rate. Cost-effectiveness has not been uniformly proved in studies comparing laparoscopy and laparotomy. Laparoscopy has been applied safely and effectively as a screening tool in stable patients with acute trauma. Because of the large number of missed injuries when used as a diagnostic tool, its value in this context is limited. Laparoscopy has been reported infrequently as a therapeutic tool in selected patients, and its use in this context requires further study.

  14. Automated Deployment of Advanced Controls and Analytics in Buildings

    NASA Astrophysics Data System (ADS)

    Pritoni, Marco

    Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
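
    As one example of the "physics-inspired data-driven models" the abstract refers to, a gray-box first-order RC zone model can be fitted to logged indoor temperatures. The sketch below uses synthetic data and hypothetical parameter values; it is not the dissertation's actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

def rc_free_cooling(t, tau, T_out, T0):
    """First-order RC (gray-box) zone model: with HVAC off, indoor temperature
    relaxes toward outdoor temperature with time constant tau = R*C (hours).
    Illustrative only; not the dissertation's actual model."""
    return T_out + (T0 - T_out) * np.exp(-t / tau)

# Hypothetical hourly indoor temperatures logged after an HVAC shutdown
t_hours = np.arange(0.0, 12.0, 1.0)
true_temps = rc_free_cooling(t_hours, tau=6.0, T_out=30.0, T0=22.0)
measured = true_temps + np.random.default_rng(1).normal(0.0, 0.1, t_hours.size)

params, _ = curve_fit(rc_free_cooling, t_hours, measured, p0=[3.0, 28.0, 21.0])
print(f"Identified thermal time constant: tau = {params[0]:.1f} h")
```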

  15. Mars sample collection and preservation

    NASA Technical Reports Server (NTRS)

    Blanchard, Douglas P.

    1988-01-01

    The intensive exploration of Mars is a major step in the systematic exploration of the solar system. Mars, earth, and Venus provide valuable contrasts in planetary evolution. Mars exploration has progressed through the stages of exploration and is now ready for a sample-return mission. About 5 kg of intelligently selected samples will be returned from Mars. A variety of samples are wanted. This requires accurate landing in areas of high interest, surface mobility and analytical capability, a variety of sampling tools, and stringent preservation and isolation measures.

  16. MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*

    PubMed Central

    CHAHINE, Georges L.; HSIAO, Chao-Tsung

    2012-01-01

Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, and to avoid deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to study the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696

  17. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  18. Internal quality control: best practice.

    PubMed

    Kinns, Helen; Pitkin, Sarah; Housley, David; Freedman, Danielle B

    2013-12-01

    There is a wide variation in laboratory practice with regard to implementation and review of internal quality control (IQC). A poor approach can lead to a spectrum of scenarios from validation of incorrect patient results to over investigation of falsely rejected analytical runs. This article will provide a practical approach for the routine clinical biochemistry laboratory to introduce an efficient quality control system that will optimise error detection and reduce the rate of false rejection. Each stage of the IQC system is considered, from selection of IQC material to selection of IQC rules, and finally the appropriate action to follow when a rejection signal has been obtained. The main objective of IQC is to ensure day-to-day consistency of an analytical process and thus help to determine whether patient results are reliable enough to be released. The required quality and assay performance varies between analytes as does the definition of a clinically significant error. Unfortunately many laboratories currently decide what is clinically significant at the troubleshooting stage. Assay-specific IQC systems will reduce the number of inappropriate sample-run rejections compared with the blanket use of one IQC rule. In practice, only three or four different IQC rules are required for the whole of the routine biochemistry repertoire as assays are assigned into groups based on performance. The tools to categorise performance and assign IQC rules based on that performance are presented. Although significant investment of time and education is required prior to implementation, laboratories have shown that such systems achieve considerable reductions in cost and labour.
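
    To make the multirule idea concrete, the sketch below evaluates two commonly used Westgard-style IQC rules (1-3s and 2-2s) on a run of QC results. It is a generic illustration; the specific rules and assay groupings recommended in the article may differ.

```python
def qc_flags(values, mean, sd):
    """Evaluate two common Westgard-style IQC rules on a run of QC results.
    Illustrative only; the rule set assigned to a given assay group in the
    article may differ."""
    z = [(v - mean) / sd for v in values]
    rule_1_3s = any(abs(zi) > 3 for zi in z)          # one point beyond +/-3 SD
    rule_2_2s = any((z[i] > 2 and z[i + 1] > 2) or    # two consecutive points
                    (z[i] < -2 and z[i + 1] < -2)     # beyond 2 SD, same side
                    for i in range(len(z) - 1))
    return {"1_3s": rule_1_3s, "2_2s": rule_2_2s, "reject": rule_1_3s or rule_2_2s}

# Example: QC material with target mean 5.0 and SD 0.2 (hypothetical values)
print(qc_flags([5.1, 5.45, 5.5, 4.9], mean=5.0, sd=0.2))
```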

  19. What information on measurement uncertainty should be communicated to clinicians, and how?

    PubMed

    Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea

    2018-02-02

The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase in order to assure the right interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratory accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result which expresses the lack of knowledge of the true value and quantifies the uncertainty of a result, incorporating the factors known to influence it. Even if MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as information that should facilitate an appropriate interpretation of quantitative results (quantity values). Therefore, MU has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of compliance with analytical performance characteristics; for physicians (and patients), it may help in the interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, providing objective information. Here we describe how MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results, connecting the efforts made within the laboratory to provide more accurate and reliable results with a more objective tool for their interpretation by physicians. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
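
    As an illustration of how MU might be derived for reporting, the sketch below combines independent standard-uncertainty components in quadrature and applies a coverage factor of 2 (a GUM-style expanded uncertainty). The uncertainty budget and values are hypothetical and are not drawn from the article.

```python
import math

def expanded_uncertainty(u_components, k=2.0):
    """Combine independent standard uncertainty components in quadrature
    (GUM-style) and expand with coverage factor k (~95% coverage for k=2).
    Generic sketch; laboratories may use different uncertainty budgets."""
    u_combined = math.sqrt(sum(u ** 2 for u in u_components))
    return k * u_combined

# Hypothetical budget for a glucose result of 6.2 mmol/L:
# long-term imprecision 0.08 mmol/L, calibrator/bias-related 0.05 mmol/L
U = expanded_uncertainty([0.08, 0.05])
print(f"Reported result: 6.2 mmol/L, U = {U:.2f} mmol/L (k = 2)")
```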

  20. Development of the Veritas plot and its application in cardiac surgery: an evidence-synthesis graphic tool for the clinician to assess multiple meta-analyses reporting on a common outcome.

    PubMed

    Panesar, Sukhmeet S; Rao, Christopher; Vecht, Joshua A; Mirza, Saqeb B; Netuveli, Gopalakrishnan; Morris, Richard; Rosenthal, Joe; Darzi, Ara; Athanasiou, Thanos

    2009-10-01

    Meta-analyses may be prone to generating misleading results because of a paucity of experimental studies (especially in surgery); publication bias; and heterogeneity in study design, intervention and the patient population of included studies. When investigating a specific clinical or scientific question on which several relevant meta-analyses may have been published, value judgments must be applied to determine which analysis represents the most robust evidence. These value judgments should be specifically acknowledged. We designed the Veritas plot to explicitly explore important elements of quality and to facilitate decision-making by highlighting specific areas in which meta-analyses are found to be deficient. Furthermore, as a graphic tool, it may be more intuitive than when similar data are presented in a tabular or text format. The Veritas plot is an adaption of the radar plot, a graphic tool for the description of multiattribute data. Key elements of meta-analytical quality such as heterogeneity, publication bias and study design are assessed. Existing qualitative methods such as the Assessment of Multiple Systematic Reviews (AMSTAR) tool have been incorporated in addition to important considerations when interpreting surgical meta-analyses such as the year of publication and population characteristics. To demonstrate the potential of the Veritas plot to inform clinical practice, we apply the Veritas plot to the meta-analytical literature comparing the incidence of 30-day stroke in off-pump coronary artery bypass surgery and conventional coronary artery bypass surgery. We demonstrate that a visually-stimulating and practical evidence-synthesis tool can direct the clinician and scientist to a particular meta-analytical study to inform clinical practice. The Veritas plot is also cumulative and allowed us to assess the quality of evidence over time. We have presented a practical graphic application for scientists and clinicians to identify and interpret variability in meta-analyses. Although further validation of the Veritas plot is required, it may have the potential to contribute to the implementation of evidence-based practice.
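
    Since the Veritas plot is described as an adaptation of the radar plot across quality domains, a minimal radar-chart sketch conveys the idea. The domain list follows the abstract, but the scores, scaling, and styling below are hypothetical and do not reproduce the published Veritas plot.

```python
import numpy as np
import matplotlib.pyplot as plt

# Quality domains named in the abstract; the 0-5 scores here are hypothetical.
domains = ["Heterogeneity", "Publication bias", "Study design",
           "AMSTAR", "Year", "Population"]
scores = [3, 4, 2, 5, 4, 3]

angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
angles += angles[:1]                  # repeat the first angle to close the polygon
values = scores + scores[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains)
ax.set_title("Radar-style quality plot (illustrative, Veritas-like sketch)")
plt.show()
```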

  1. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
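
    A typical chemometric treatment of PAT spectra is partial least squares (PLS) regression of analyte concentration on spectral intensities. The sketch below fits a PLS model to synthetic spectra with scikit-learn; it is a generic illustration under stated assumptions, not a method taken from the article.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic NIR-like spectra: 200 samples x 500 wavelengths, with the analyte
# concentration encoded in two absorbance bands plus measurement noise.
rng = np.random.default_rng(0)
conc = rng.uniform(0.0, 1.0, size=200)
wavelengths = np.arange(500)
band = np.exp(-0.5 * ((wavelengths - 150) / 10) ** 2) \
     + 0.5 * np.exp(-0.5 * ((wavelengths - 320) / 15) ** 2)
X = conc[:, None] * band[None, :] + rng.normal(0, 0.02, size=(200, 500))

X_train, X_test, y_train, y_test = train_test_split(X, conc, random_state=0)
pls = PLSRegression(n_components=3).fit(X_train, y_train)
print("R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))
```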

  2. An Engineering Tool for the Prediction of Internal Dielectric Charging

    NASA Astrophysics Data System (ADS)

    Rodgers, D. J.; Ryden, K. A.; Wrenn, G. L.; Latham, P. M.; Sorensen, J.; Levy, L.

    1998-11-01

    A practical internal charging tool has been developed. It provides an easy-to-use means for satellite engineers to predict whether on-board dielectrics are vulnerable to electrostatic discharge in the outer radiation belt. The tool is designed to simulate irradiation of single-dielectric planar or cylindrical structures with or without shielding. Analytical equations are used to describe current deposition in the dielectric. This is fast and gives charging currents to sufficient accuracy given the uncertainties in other aspects of the problem - particularly material characteristics. Time-dependent internal electric fields are calculated, taking into account the effect on conductivity of electric field, dose rate and temperature. A worst-case model of electron fluxes in the outer belt has been created specifically for the internal charging problem and is built into the code. For output, the tool gives a YES or NO decision on the susceptibility of the structure to internal electrostatic breakdown and if necessary, calculates the required changes to bring the system below the breakdown threshold. A complementary programme of laboratory irradiations has been carried out to validate the tool. The results for Epoxy-fibreglass samples show that the code models electric field realistically for a wide variety of shields, dielectric thicknesses and electron spectra. Results for Teflon samples indicate that some further experimentation is required and the radiation-induced conductivity aspects of the code have not been validated.
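
    To convey the flavour of such a charging calculation, the sketch below integrates a heavily simplified planar-dielectric balance, dE/dt = (J_dep - sigma(E)*E) / (eps0*eps_r), with a crude field-enhanced conductivity. The equations, material constants, and fluxes are illustrative assumptions and are not the tool's actual deposition or conductivity models.

```python
def charge_dielectric(j_dep, sigma0, k_field, eps_r, t_end, dt=10.0):
    """Integrate a highly simplified planar-dielectric charging model:
        dE/dt = (J_dep - sigma(E) * E) / (eps0 * eps_r)
    with a crude field-enhanced conductivity sigma(E) = sigma0 * (1 + k_field*E).
    Illustrative physics only; the actual engineering tool uses its own
    analytical deposition and conductivity models."""
    eps0 = 8.854e-12                      # vacuum permittivity, F/m
    E, t, history = 0.0, 0.0, []
    while t < t_end:
        sigma = sigma0 * (1.0 + k_field * E)
        dEdt = (j_dep - sigma * E) / (eps0 * eps_r)
        E += dEdt * dt
        t += dt
        history.append((t, E))
    return history

# Hypothetical numbers: ~1 pA/cm^2 deposited current, epoxy-fibreglass-like eps_r
trace = charge_dielectric(j_dep=1e-8, sigma0=1e-16, k_field=1e-8,
                          eps_r=4.5, t_end=3600.0)
print(f"Internal field after 1 h: {trace[-1][1]:.2e} V/m")
```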

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so that users can further analyze their data in other analytic tools.

  4. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so that users can further analyze their data in other analytic tools.

  5. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  6. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for analysis and quantitation of some selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa), over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to the conventional chemical-analytical methodology, like gas/liquid chromatography-mass spectrometry (GC/LC-MS), and GC-LC/tandem mass spectrometry (MSMS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrix. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrix. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2, EE2, is, in decreasing order: LC-MSMS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MSMS method was found to lack the required sensitivity for meeting environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, in South Africa is required. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentration values of the steroid estrogens appears to be appropriate for use in their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, raw and treated wastewater, the use of bioassays, with trigger values, is a useful screening tool option to decide whether further examination of specific endocrine activity may be warranted, or whether concentrations of such activity are of low priority, with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods, like ELISA, used for compound quantitation, and standardization of the method for measuring E2 equivalents (EEQs) used for biological activity (endocrine: e.g., estrogenic) are some areas for future EDC research.
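
    ELISA quantitation of this kind typically relies on a four-parameter logistic (4PL) calibration fit, which is then inverted to read sample concentrations. The sketch below shows a generic 4PL fit and back-calculation with SciPy on hypothetical calibrator data; it is not the specific kit procedure used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = response at zero dose, d = response at
    infinite dose, c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical calibrator concentrations (ng/L) and absorbance readings
conc = np.array([1, 5, 25, 125, 625, 3125], dtype=float)
absorbance = np.array([1.95, 1.80, 1.35, 0.70, 0.30, 0.15])

params, _ = curve_fit(four_pl, conc, absorbance, p0=[2.0, 1.0, 100.0, 0.1])

def back_calculate(y, a, b, c, d):
    """Invert the 4PL curve to estimate concentration from a sample reading."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print("Estimated concentration for A = 1.0:",
      round(back_calculate(1.0, *params), 1), "ng/L")
```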

  7. Human factors issues and approaches in the spatial layout of a space station control room, including the use of virtual reality as a design analysis tool

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1994-01-01

    Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.

  8. Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive

    NASA Technical Reports Server (NTRS)

    Geller, Gary N.

    2004-01-01

Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it. Thus, access to it by the protected area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.

  9. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  10. Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.

    PubMed

    Mura, Paola

    2014-12-01

    Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from literature. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
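
    One widely used disproportionality statistic for such drug-event co-occurrence counts is the proportional reporting ratio (PRR). The sketch below computes a PRR with a rough confidence interval from a hypothetical 2x2 table of literature counts; the prototype tool may use a different signal score.

```python
import math

def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table:
        a = citations mentioning drug AND event, b = drug without event,
        c = event without drug, d = neither.
    Returns the PRR and a rough 95% confidence interval.
    One common disproportionality statistic; the prototype may use others."""
    prr_value = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(prr_value) - 1.96 * se_log)
    hi = math.exp(math.log(prr_value) + 1.96 * se_log)
    return prr_value, (lo, hi)

# Hypothetical MEDLINE-derived counts for one drug-adverse event pair
value, ci = prr(a=40, b=960, c=200, d=49800)
print(f"PRR = {value:.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f})")
```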

  12. National baselines for the Sustainable Development Goals assessed in the SDG Index and Dashboards

    NASA Astrophysics Data System (ADS)

    Schmidt-Traub, Guido; Kroll, Christian; Teksoz, Katerina; Durand-Delacre, David; Sachs, Jeffrey D.

    2017-08-01

    The Sustainable Development Goals (SDGs) -- agreed in 2015 by all 193 member states of the United Nations and complemented by commitments made in the Paris Agreement -- map out a broad spectrum of economic, social and environmental objectives to be achieved by 2030. Reaching these goals will require deep transformations in every country, as well as major efforts in monitoring and measuring progress. Here we introduce the SDG Index and Dashboards as analytical tools for assessing countries' baselines for the SDGs that can be applied by researchers in the cross-disciplinary analyses required for implementation. The Index and Dashboards synthesize available country-level data for all 17 goals, and for each country estimate the size of the gap towards achieving the SDGs. They will be updated annually. All 149 countries for which sufficient data is available face significant challenges in achieving the goals, and many countries' development strategies are imbalanced across the economic, social and environmental priorities. We illustrate the analytical value of the index by examining its relationship with other widely used development indices and by showing how it accounts for cross-national differences in subjective well-being. Given significant data gaps, scope and coverage of the Index and Dashboards are limited, but we suggest that these analyses represent a starting point for a comprehensive assessment of national SDG baselines and can help policymakers determine priorities for early action and monitor progress. The tools also identify data gaps that must be closed for SDG monitoring.
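
    The Index construction can be illustrated with a simplified sketch: each indicator is min-max rescaled onto a common 0-100 range and then averaged within and across goals. The bounds and values below are hypothetical, and the published methodology includes additional rules (for example, censoring extremes and handling missing data).

```python
import numpy as np

def rescale(values, worst, best):
    """Min-max rescale raw indicator values onto 0-100, where 100 = best bound.
    Simplified; the published SDG Index methodology also censors extremes and
    handles missing data."""
    scaled = 100.0 * (np.asarray(values, dtype=float) - worst) / (best - worst)
    return np.clip(scaled, 0.0, 100.0)

# Hypothetical country: two indicators per goal, three goals
goal_scores = [
    rescale([72.0, 55.0], worst=0.0,  best=100.0).mean(),  # "higher is better" indicators
    rescale([4.1, 6.5],   worst=10.0, best=0.0).mean(),    # "lower is better" indicators
    rescale([88.0, 91.0], worst=20.0, best=100.0).mean(),
]
index_score = float(np.mean(goal_scores))
print("Goal scores:", [round(g, 1) for g in goal_scores], "Index:", round(index_score, 1))
```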

  13. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  14. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  15. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  16. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  17. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

Topics include unit cost as a function of learning and rate (Womer) and learning with forgetting, in which learning depreciates over time (Benkard). David Tate, Cost Analysis and Research Division, Institute for Defense Analyses.

  18. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

The analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft is facilitated through the use of AGI's Systems Tool Kit (STK) software. An Analytic Hierarchy Process (AHP)-based...

  19. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  20. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  1. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to... Though this study took all three categories into account, most (90%) of the focus for the SRA team's effort was on identifying and analyzing...

  2. Near-Term Actions to Address Long-Term Climate Risk

    NASA Astrophysics Data System (ADS)

    Lempert, R. J.

    2014-12-01

    Addressing climate change requires effective long-term policy making, which occurs when reflecting on potential events decades or more in the future causes policy makers to choose near-term actions different than those they would otherwise pursue. Contrary to some expectations, policy makers do sometimes make such long-term decisions, but not as commonly and successfully as climate change may require. In recent years however, the new capabilities of analytic decision support tools, combined with improved understanding of cognitive and organizational behaviors, has significantly improved the methods available for organizations to manage longer-term climate risks. In particular, these tools allow decision makers to understand what near-term actions consistently contribute to achieving both short- and long-term societal goals, even in the face of deep uncertainty regarding the long-term future. This talk will describe applications of these approaches for infrastructure, water, and flood risk management planning, as well as studies of how near-term choices about policy architectures can affect long-term greenhouse gas emission reduction pathways.

  3. Microtube strip heat exchanger

    NASA Astrophysics Data System (ADS)

    Doty, F. D.

    1991-04-01

During the last quarter, Doty Scientific, Inc. (DSI) continued to make progress on the microtube strip (MTS) heat exchangers. The team has begun a heat exchanger stress analysis; however, they have been concentrating the bulk of their analytical energies on a computational fluid dynamics (CFD) model to determine the location and magnitude of shell-side flow maldistribution, which decreases heat exchanger effectiveness. DSI received 120 fineblanked tubestrips from Southern Fineblanking (SFB) for manufacturing process development. Both SFB and NIST provided inspection reports of the tubestrips. DSI completed the tooling required to encapsulate a tube array and press tubestrips onto the array. Pressing the tubestrips onto tube arrays revealed design deficiencies in both the tubestrip design and the tooling design. DSI has a number of revisions in process to correct these deficiencies. The research effort has identified a more economical fusible alloy for encapsulating the tube array, and determined the parameters required to successfully encapsulate the tube array with the new alloy. A more compact MTS heat exchanger bank was designed.

  4. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
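
    The sigma-metric described here is conventionally computed as (allowable total error - |bias|) / CV at a medical decision concentration. A minimal sketch with hypothetical performance figures:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric at a medical decision concentration:
        sigma = (allowable total error - |bias|) / CV, all in percent.
    Values near 6 indicate very high analytical quality; lower values call
    for more stringent SQC rules."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay: allowable total error 10%, bias 1.5%, imprecision (CV) 2%
print(f"Sigma = {sigma_metric(10.0, 1.5, 2.0):.2f}")
```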

  5. Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dang, L. D.; Coats, D. E.

    1985-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.

  6. Challenges and opportunities in analysing students modelling

    NASA Astrophysics Data System (ADS)

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-02-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them - the model of modelling diagram (MMD) - as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the modelling process experienced by students working in small groups aiming at creating and testing a model of a sedimentary basin from the information provided. The study was conducted in a regular Biology and Geology classroom (16-17 years old students). Data was collected through video recording of the classes, along with written reports and the material models made by each group. The results show the complexity of adapting MMD at two levels: the group modelling and the actual requirements for the activity. Our main challenges were to gather the modelling process of each individual and the group, as well as to identify, from students' speech, which stage of modelling they were performing at a given time. When facing such challenges, we propose some changes in the MMD so that it can be properly used to analyse students performing modelling activities in groups.

  7. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools and uncommon tools such as the Unity3d game engine. Game engines provide capabilities to not only explore data but to build and interact with dynamic models collaboratively. These virtual worlds can be used to display and illustrate data that is often more understandable and plausible to both stakeholders and policy makers than is achieved using traditional maps.Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competences. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  8. Quantitative imaging with fluorescent biosensors.

    PubMed

    Okumoto, Sakiko; Jones, Alexander; Frommer, Wolf B

    2012-01-01

Molecular activities are highly dynamic and can occur locally in subcellular domains or compartments. Neighboring cells in the same tissue can exist in different states. Therefore, quantitative information on the cellular and subcellular dynamics of ions, signaling molecules, and metabolites is critical for functional understanding of organisms. Mass spectrometry is generally used for monitoring ions and metabolites; however, its temporal and spatial resolution are limited. Fluorescent proteins have revolutionized many areas of biology (e.g., fluorescent proteins can report on gene expression or protein localization in real time), yet promoter-based reporters are often slow to report physiologically relevant changes such as calcium oscillations. Therefore, novel tools are required that can be deployed in specific cells and targeted to subcellular compartments in order to quantify target molecule dynamics directly. We require tools that can measure enzyme activities, protein dynamics, and biophysical processes (e.g., membrane potential or molecular tension) with subcellular resolution. Today, we have an extensive suite of tools at our disposal to address these challenges, including translocation sensors, fluorescence-intensity sensors, and Förster resonance energy transfer sensors. This review summarizes sensor design principles, provides a database of sensors for more than 70 different analytes/processes, and gives examples of applications in quantitative live cell imaging.

  9. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  10. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  11. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires more than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN, which provides a unified environment for structural and control design.

  12. Pods: a Powder Delivery System for Mars In-situ Organic, Mineralogic and Isotopic Analysis Instruments

    NASA Technical Reports Server (NTRS)

    Saha, C. P.; Bryson, C. E.; Sarrazin, P.; Blake, D. F.

    2005-01-01

    Many Mars in situ instruments require fine-grained high-fidelity samples of rocks or soil. Included are instruments for the determination of mineralogy as well as organic and isotopic chemistry. Powder can be obtained as a primary objective of a sample collection system (e.g., by collecting powder as a surface is abraded by a rotary abrasion tool (RAT)), or as a secondary objective (e.g., by collecting drill powder as a core is drilled). In the latter case, a properly designed system could be used to monitor drilling in real time as well as to deliver powder to analytical instruments which would perform complementary analyses to those later performed on the intact core. In addition, once a core or other sample is collected, a system that could transfer intelligently collected subsamples of powder from the intact core to a suite of analytical instruments would be highly desirable. We have conceptualized, developed and tested a breadboard Powder Delivery System (PoDS) intended to satisfy the collection, processing and distribution requirements of powder samples for Mars in-situ mineralogic, organic and isotopic measurement instruments.

  13. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    Accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
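
    As a minimal sketch of how calibration statistics such as RMSECV and R(2) quoted above are typically obtained, the following Python snippet cross-validates a partial least squares (PLS) regression; the choice of PLS, the synthetic data, and the variable names are illustrative assumptions, not details taken from the study.

      # Hypothetical calibration of salt content from CT-derived predictors.
      # PLS is an assumption; the paper does not state which regression method was used.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 5))  # placeholder CT features (e.g., mean attenuation per region)
      y = 2.0 + X @ np.array([0.5, -0.2, 0.1, 0.0, 0.3]) + rng.normal(scale=0.3, size=60)  # placeholder salt content (%)

      model = PLSRegression(n_components=3)
      y_cv = cross_val_predict(model, X, y, cv=10).ravel()   # cross-validated predictions
      rmsecv = float(np.sqrt(np.mean((y - y_cv) ** 2)))      # RMSECV, as reported in the abstract
      r2 = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
      print(f"RMSECV = {rmsecv:.3f}, R2 = {r2:.3f}")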

  14. Supporting cognition in systems biology analysis: findings on users' processes and design implications.

    PubMed

    Mirel, Barbara

    2009-02-13

    Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and reveal that the tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition for exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making a transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and now reduce current mismatches with several of scientists' higher-order analytical practices. The implications of the results for tool designs are discussed.

  15. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; therefore the representation of space through standard architectural drawings is sometimes not sufficient. Representing space as a series of slices, each with certain properties, becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises the data exported from a generic BIM modelling tool. The tool would be useful to assist the design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool in a BIM-based design process could thereby assist architects to generate better designs and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.
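
    The multi-slice idea can be illustrated with a small sketch: represent a space as a series of slices along its length, attach properties to each slice, and flag the slices that violate a design threshold. The data model, property names, and threshold below are hypothetical; the paper does not specify its internal representation or BIM export format.

      # Illustrative sketch only; not the paper's actual data model.
      from dataclasses import dataclass
      from typing import List

      @dataclass
      class Slice:
          position_m: float       # distance along the space
          width_m: float          # hypothetical spatial property per slice
          daylight_factor: float  # hypothetical spatial property per slice

      def flag_problem_slices(slices: List[Slice], min_width: float = 1.2) -> List[Slice]:
          """Return slices whose properties fall below a design threshold."""
          return [s for s in slices if s.width_m < min_width]

      corridor = [Slice(position_m=i * 0.5, width_m=1.5 - 0.05 * i, daylight_factor=2.0 - 0.1 * i)
                  for i in range(10)]
      print([s.position_m for s in flag_problem_slices(corridor)])  # positions needing attention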

  16. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  17. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  18. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, although disseminating the findings and ensuring usability might then be problematic, or utilize a widespread and well-known platform. The aim of this paper is to demonstrate the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation and analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of bubble charts; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics employed to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or relying on numerical statistics alone, we created a set of interactive visualisations to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We show that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, and offers space-time visualisation capabilities, animations, and easy communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
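
    Since the final data are delivered as Keyhole Markup Language (KML) files, a minimal sketch of that last step is shown below; the file name, attribute names, and example coordinates are illustrative assumptions, not values from the study.

      # Write time-stamped KML placemarks that Google Earth can animate with its time slider.
      # Municipality names, coordinates, and incidence values here are placeholders.
      cases = [
          {"name": "Olomouc", "lon": 17.2509, "lat": 49.5938, "week": "2008-01-07", "incidence": 12.3},
          {"name": "Brno",    "lon": 16.6068, "lat": 49.1951, "week": "2008-01-07", "incidence": 8.1},
      ]

      placemarks = "\n".join(
          f"""  <Placemark>
          <name>{c['name']}</name>
          <TimeStamp><when>{c['week']}</when></TimeStamp>
          <ExtendedData><Data name="incidence"><value>{c['incidence']}</value></Data></ExtendedData>
          <Point><coordinates>{c['lon']},{c['lat']},0</coordinates></Point>
        </Placemark>"""
          for c in cases
      )

      kml = f"""<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
      {placemarks}
      </Document>
      </kml>"""

      with open("campylobacteriosis_weekly.kml", "w", encoding="utf-8") as fh:
          fh.write(kml)  # open the resulting file in Google Earth to explore it over time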

  19. Development and validation of the first liquid chromatography-tandem mass spectrometry assay for simultaneous quantification of multiple antiretrovirals in meconium.

    PubMed

    Himes, Sarah K; Scheidweiler, Karl B; Tassiopoulos, Katherine; Kacanek, Deborah; Hazra, Rohan; Rich, Kenneth; Huestis, Marilyn A

    2013-02-05

    A novel method for the simultaneous quantification of 16 antiretroviral (ARV) drugs and 4 metabolites in meconium was developed and validated. Quantification of 6 nucleoside/nucleotide reverse transcriptase inhibitors, 2 non-nucleoside reverse transcriptase inhibitors, 7 protease inhibitors, and 1 integrase inhibitor was achieved in 0.25 g of meconium. Specimen preparation included methanol homogenization and solid-phase extraction. Separate positive and negative polarity multiple reaction monitoring mode injections were required to achieve sufficient sensitivity. Linearity ranged from 10 to 75 ng/g up to 2500 ng/g for most analytes and 100-500 ng/g up to 25,000 ng/g for some; all correlation coefficients were ≥0.99. Extraction efficiencies from meconium were 32.8-119.5% with analytical recovery of 80.3-108.3% and total imprecision of 2.2-11.0% for all quantitative analytes. Two analytes with analytical recovery (70.0-138.5%) falling outside the 80-120% criteria range were considered semiquantitative. Matrix effects were -98.3% to 47.0% and -98.0% to 67.2% for analytes and internal standards, respectively. Analytes were stable (>75%) at room temperature for 24 h, at 4 °C for 3 days, through 3 freeze-thaw cycles at -20 °C over 3 days, and on the autosampler. Method applicability was demonstrated by analyzing meconium from HIV-uninfected infants born to HIV-positive mothers on ARV therapy. This method can be used as a tool to investigate the potential effects of in utero ARV exposure on childhood health and neurodevelopmental outcomes.
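
    The figures of merit quoted above (matrix effect, extraction efficiency, analytical recovery) are conventionally derived from neat standards, post-extraction-spiked samples, and pre-extraction-spiked samples. The sketch below illustrates those standard calculations with made-up peak areas; it is a generic illustration, not the validation code used in the study.

      # Generic figures-of-merit calculations; all peak areas below are invented examples.
      def matrix_effect(area_post_spike: float, area_neat: float) -> float:
          """Matrix effect (%) relative to a neat standard; negative values indicate ion suppression."""
          return (area_post_spike / area_neat - 1.0) * 100.0

      def extraction_efficiency(area_pre_spike: float, area_post_spike: float) -> float:
          """Recovery of the extraction step (%): pre-extraction spike vs post-extraction spike."""
          return area_pre_spike / area_post_spike * 100.0

      def analytical_recovery(measured_conc: float, nominal_conc: float) -> float:
          """Overall accuracy (%): measured vs fortified concentration."""
          return measured_conc / nominal_conc * 100.0

      print(matrix_effect(area_post_spike=4200.0, area_neat=5000.0))                 # -16.0 (% suppression)
      print(extraction_efficiency(area_pre_spike=3600.0, area_post_spike=4200.0))    # ~85.7 %
      print(analytical_recovery(measured_conc=96.5, nominal_conc=100.0))             # 96.5 %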

  20. Learner Dashboards a Double-Edged Sword? Students' Sense-Making of a Collaborative Critical Reading and Learning Analytics Environment for Fostering 21st-Century Literacies

    ERIC Educational Resources Information Center

    Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon

    2017-01-01

    The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…

  1. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  2. The Benefits and Complexities of Operating Geographic Information Systems (GIS) in a High Performance Computing (HPC) Environment

    NASA Astrophysics Data System (ADS)

    Shute, J.; Carriere, L.; Duffy, D.; Hoy, E.; Peters, J.; Shen, Y.; Kirschbaum, D.

    2017-12-01

    The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center is building and maintaining an Enterprise GIS capability for its stakeholders, to include NASA scientists, industry partners, and the public. This platform is powered by three GIS subsystems operating in a highly-available, virtualized environment: 1) the Spatial Analytics Platform is the primary NCCS GIS and provides users discoverability of the vast DigitalGlobe/NGA raster assets within the NCCS environment; 2) the Disaster Mapping Platform provides mapping and analytics services to NASA's Disaster Response Group; and 3) the internal (Advanced Data Analytics Platform/ADAPT) enterprise GIS provides users with the full suite of Esri and open source GIS software applications and services. All systems benefit from NCCS's cutting edge infrastructure, to include an InfiniBand network for high speed data transfers; a mixed/heterogeneous environment featuring seamless sharing of information between Linux and Windows subsystems; and in-depth system monitoring and warning systems. Due to its co-location with the NCCS Discover High Performance Computing (HPC) environment and the Advanced Data Analytics Platform (ADAPT), the GIS platform has direct access to several large NCCS datasets including DigitalGlobe/NGA, Landsat, MERRA, and MERRA2. Additionally, the NCCS ArcGIS Desktop Windows virtual machines utilize existing NetCDF and OPeNDAP assets for visualization, modelling, and analysis - thus eliminating the need for data duplication. With the advent of this platform, Earth scientists have full access to vast data repositories and the industry-leading tools required for successful management and analysis of these multi-petabyte, global datasets. The full system architecture and integration with scientific datasets will be presented. Additionally, key applications and scientific analyses will be explained, to include the NASA Global Landslide Catalog (GLC) Reporter crowdsourcing application, the NASA GLC Viewer discovery and analysis tool, the DigitalGlobe/NGA Data Discovery Tool, the NASA Disaster Response Group Mapping Platform (https://maps.disasters.nasa.gov), and support for NASA's Arctic - Boreal Vulnerability Experiment (ABoVE).

  3. A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.

    2015-12-01

    Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, and compare multiple reanalysis datasets and their variability.
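
    Because CDAS exposes its analytics through a WPS API, a request can in principle be issued as a standard WPS Execute call. The sketch below shows such a call with the Python requests library; the endpoint URL, process identifier, and datainputs string are hypothetical placeholders, since the real CDAS service address and its process names are not given in the abstract.

      # Hedged sketch of a WPS-style Execute request against a CDAS-like service.
      import requests

      WPS_ENDPOINT = "https://example.nasa.gov/cdas/wps"  # placeholder, not the real service URL

      params = {
          "service": "WPS",
          "request": "Execute",
          "version": "1.0.0",
          "identifier": "CDAS.timeMean",                       # hypothetical process name
          "datainputs": "[domain=d0;variable=tas;axes=t]",     # hypothetical input string
      }

      response = requests.get(WPS_ENDPOINT, params=params, timeout=60)
      response.raise_for_status()
      print(response.text[:500])  # WPS responses are XML; parse status and output references from here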

  4. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
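
    A minimal sketch of the kind of open-source ingest step described above is given below, assuming the Python tika client and a local Solr core named "pubs"; the core name, field names, and file name are illustrative assumptions, not details of the project itself.

      # Extract text/metadata from a document with Apache Tika and index it into Solr.
      import requests
      from tika import parser  # Python client for Apache Tika

      def index_document(path: str, solr_url: str = "http://localhost:8983/solr/pubs") -> None:
          parsed = parser.from_file(path)                        # text + metadata extraction via Tika
          doc = {
              "id": path,
              "content": parsed.get("content") or "",
              "title": (parsed.get("metadata") or {}).get("title", ""),
          }
          # Post the extracted document to Solr for full-text search and downstream analytics.
          resp = requests.post(f"{solr_url}/update?commit=true", json=[doc], timeout=30)
          resp.raise_for_status()

      index_document("hyspiri_abstract.pdf")  # hypothetical file name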

  5. Kinematic synthesis of adjustable robotic mechanisms

    NASA Astrophysics Data System (ADS)

    Chuenchom, Thatchai

    1993-01-01

    Conventional hard automation, such as a linkage-based or a cam-driven system, provides high speed capability and repeatability but not the flexibility required in many industrial applications. The conventional mechanisms, which are typically single-degree-of-freedom systems, are being increasingly replaced by multi-degree-of-freedom multi-actuators driven by logic controllers. Although this new trend in sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools to design-in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARM's), or 'programmable mechanisms', as a middle ground between high speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms towards cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARM's and lays the theoretical foundation for synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and a computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defect, and mechanical errors. An efficient mathematical scheme for identification of adjustable members was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphic-intensive user-friendly computer program. A physical prototype of a general purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.

  6. Analysis of Advanced Rotorcraft Configurations

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2000-01-01

    Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior for the purpose of assessing vehicle potential and feasibility, as well as to establish the analytical models required to support the vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed by using the comprehensive analysis tool CAMRAD II (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).

  7. Trends in access of plant biodiversity data revealed by Google Analytics

    PubMed Central

    Baxter, David G.; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E.

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Usage was studied both over a single year and over all years. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) search and direct traffic dominates with minimal impact from social media, 4) mobile and new device types have doubled each year for the past three years, and 5) web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development. PMID:25425933

  8. Reporting to Improve Reproducibility and Facilitate Validity Assessment for Healthcare Database Studies V1.0.

    PubMed

    Wang, Shirley V; Schneeweiss, Sebastian; Berger, Marc L; Brown, Jeffrey; de Vries, Frank; Douglas, Ian; Gagne, Joshua J; Gini, Rosa; Klungel, Olaf; Mullins, C Daniel; Nguyen, Michael D; Rassen, Jeremy A; Smeeth, Liam; Sturkenboom, Miriam

    2017-09-01

    Defining a study population and creating an analytic dataset from longitudinal healthcare databases involves many decisions. Our objective was to catalogue scientific decisions underpinning study execution that should be reported to facilitate replication and enable assessment of validity of studies conducted in large healthcare databases. We reviewed key investigator decisions required to operate a sample of macros and software tools designed to create and analyze analytic cohorts from longitudinal streams of healthcare data. A panel of academic, regulatory, and industry experts in healthcare database analytics discussed and added to this list. Evidence generated from large healthcare encounter and reimbursement databases is increasingly being sought by decision-makers. Varied terminology is used around the world for the same concepts. Agreeing on terminology and which parameters from a large catalogue are the most essential to report for replicable research would improve transparency and facilitate assessment of validity. At a minimum, reporting for a database study should provide clarity regarding operational definitions for key temporal anchors and their relation to each other when creating the analytic dataset, accompanied by an attrition table and a design diagram. A substantial improvement in reproducibility, rigor and confidence in real world evidence generated from healthcare databases could be achieved with greater transparency about operational study parameters used to create analytic datasets from longitudinal healthcare databases. © 2017 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.

  9. A two-dimensional analytical model of vapor intrusion involving vertical heterogeneity.

    PubMed

    Yao, Yijun; Verginelli, Iason; Suuberg, Eric M

    2017-05-01

    In this work, we present an analytical chlorinated vapor intrusion (CVI) model that can estimate source-to-indoor air concentration attenuation by simulating a two-dimensional (2-D) vapor concentration profile in vertically heterogeneous soils overlying a homogenous vapor source. The analytical solution describing the 2-D soil gas transport was obtained by applying a modified Schwarz-Christoffel mapping method. A partial field validation showed that the developed model provides results (especially in terms of indoor emission rates) in line with the measured data from a case involving a building overlying a layered soil. In further testing, it was found that the new analytical model can very closely replicate the results of three-dimensional (3-D) numerical models at steady state in scenarios involving layered soils overlying homogenous groundwater sources. By contrast, by adopting a two-layer approach (capillary fringe and vadose zone) as employed in the EPA implementation of the Johnson and Ettinger model, the spatially and temporally averaged indoor concentrations in the case of groundwater sources can be up to two orders of magnitude higher than the ones estimated by the numerical model. In short, the model proposed in this work can represent an easy-to-use tool that can simulate the subsurface soil gas concentration in layered soils overlying a homogenous vapor source while keeping the simplicity of an analytical approach that requires much less computational effort.
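
    For context, the source-to-indoor-air attenuation that models of this kind estimate is conventionally expressed as an attenuation factor (a standard vapor-intrusion convention, not a formula reproduced from the paper):

      \alpha = \frac{C_{\text{indoor}}}{C_{\text{source}}}

    where C_indoor is the indoor air concentration and C_source is the soil-gas concentration at the vapor source; at steady state, C_indoor follows from dividing the modeled indoor emission rate by the building's ventilation (air exchange) rate.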

  10. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data and Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean out unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and preliminary use case analysis (a work in progress) will be presented.

  11. Trends in access of plant biodiversity data revealed by Google Analytics.

    PubMed

    Jones, Timothy Mark; Baxter, David G; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Usage was studied both over a single year and over all years. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) search and direct traffic dominates with minimal impact from social media, 4) mobile and new device types have doubled each year for the past three years, and 5) web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development.
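
    As a minimal sketch of the kind of cross-account aggregation described above, the snippet below sums sessions per institution and year from an exported table; the CSV file name and column names are hypothetical, since the paper does not describe its export format.

      # Aggregate hypothetical exported Google Analytics session counts across accounts.
      import pandas as pd

      ga = pd.read_csv("ga_sessions_export.csv")           # hypothetical export with columns: institution, year, sessions
      per_institution = (
          ga.groupby(["institution", "year"], as_index=False)["sessions"].sum()
            .sort_values(["institution", "year"])
      )
      total_sessions = int(per_institution["sessions"].sum())
      print(per_institution.head())
      print(f"Total web sessions across accounts: {total_sessions}")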

  12. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  13. Software Models Impact Stresses

    NASA Technical Reports Server (NTRS)

    Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark

    1991-01-01

    Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.

  14. [The analytical reliability of clinical laboratory information and role of the standards in its support].

    PubMed

    Men'shikov, V V

    2012-12-01

    The article deals with the factors impacting the reliability of clinical laboratory information. The differences in quality among laboratory analysis tools produced by various manufacturers are discussed. These differences are a cause of discrepancies between the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented that regulates the requirements for standards and calibrators used in the analysis of qualitative and non-metrical characteristics of components of biomaterials.

  15. Observations, Ideas, and Opinions: Systems Engineering and Integration for Return to Flight

    NASA Technical Reports Server (NTRS)

    Gafka, George K.

    2006-01-01

    This presentation addresses project management and systems engineering and integration challenges for return to flight, focusing on the Thermal Protection System Tile Repair Project (TRP). The program documentation philosophy, communication with the program, requirements flow and philosophy, and planned deliverables and documentation are outlined. The development of TRP 'use-as-is' analytical tools is also highlighted, and emphasis is placed on the use of flight history to assess pre-flight and real-time risk. Additionally, an overview is provided of the repair procedure, including an outline of the logistics deployment chart.

  16. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  17. Magic Angle Spinning NMR Metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi Hu, Jian

    Nuclear Magnetic Resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted and unbiased method that requires no or minimal sample preparation, and is one of the leading analytical tools for metabonomics research [1-3]. The ease of quantification and the fact that no prior knowledge of the compounds present in a sample is needed make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and its NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low molecular weight metabolites.

  18. Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Rallabhandi, Sriram K.

    2010-01-01

    A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.

  19. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  20. Microplastic Exposure Assessment in Aquatic Environments: Learning from Similarities and Differences to Engineered Nanoparticles.

    PubMed

    Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo

    2017-03-07

    Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments and research into their behavior and fate has been sharply increasing in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferrable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENP and MPs based on their similarities as particulate contaminants, whereas critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MPs environmental risk assessment.

  1. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
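
    The multi-level modularity described above can be illustrated with a small, language-agnostic sketch (written here in Python purely for illustration; SOCRAT itself is a JavaScript web platform and this is not its actual API): modules for data management, analysis, and visualization register against a core and are composed into pipelines, so that individual modules can be reused, swapped, or extended independently.

      # Conceptual module-registry sketch; names and interfaces are hypothetical.
      from typing import Any, Callable, Dict, List

      class Core:
          def __init__(self) -> None:
              self.modules: Dict[str, Callable[[Any], Any]] = {}

          def register(self, name: str, handler: Callable[[Any], Any]) -> None:
              """Attach a module; re-use and extension happen by swapping handlers."""
              self.modules[name] = handler

          def run(self, pipeline: List[str], data: Any) -> Any:
              """Pass data through a sequence of registered modules."""
              for name in pipeline:
                  data = self.modules[name](data)
              return data

      core = Core()
      core.register("ingest", lambda raw: [float(x) for x in raw])        # data input/representation
      core.register("analyze", lambda xs: {"mean": sum(xs) / len(xs)})    # simple analysis module
      core.register("visualize", lambda stats: f"bar chart of {stats}")   # stand-in for a chart module

      print(core.run(["ingest", "analyze", "visualize"], ["1", "2", "3"]))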

  2. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotrons) particle accelerators have been constructed worldwide to provide to the scientific community unprecedented analytical performances. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 μm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations on biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performances. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  3. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  4. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  6. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  7. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  8. Science meets regulation.

    PubMed

    Bilia, Anna Rita

    2014-12-02

    The European Pharmacopoeia (Ph. Eur.) is a standard reference for both European and non-European countries and defines requirements for the qualitative and quantitative composition of medicines. Herbal drug (HD) monographs state which aspects have to be considered for quality assurance through the relevant chapters "Definition", "Characters", "Identification", "Tests", and "Assay". Identification of botanical material is achieved by macroscopic and microscopic morphology, generally examined by a trained expert. Content or assay is the most difficult area of quality control to perform, since in most herbal drugs the active constituents are unknown and markers must be used, which cannot really be related to quality. The other critical points are the purity tests; in particular, tests such as those for heavy metals, aflatoxins and pesticides are laborious and time intensive, requiring a significant investment in equipment, materials, and maintenance. A literature survey concerning alternative and/or complementary tools for quality control of botanicals has been performed by searching the scientific databases Pubmed, SciFinder, Scopus and Web of Science. Diverse analytical methods including DNA fingerprinting, Nuclear Magnetic Resonance (NMR), Near Infra Red (NIR) and (bio)sensors have been reported in the literature to evaluate the quality of botanical products. Identification of plants at the species level can be successfully based on genome-based methods, using DNA barcodes, the nucleotide sequence of a short DNA fragment. NMR can provide direct NMR fingerprint determination (complete assignment of the signals by 1D and 2D experiments), quantitative NMR and chemometric analysis (the metabolite fingerprint is based on the distribution of intensity in the NMR spectrum to provide sample classification). NIR spectroscopy is a fast qualitative and quantitative analytical method, providing knowledge about plant species and/or their geographic origin. Finally, the development of chemical and biological sensors is currently one of the most active areas of analytical research. Immobilization of specific enzymes allows the recognition of specific classes of compounds such as cysteine sulfoxides, glucosinolates, cyanogenic glycosides, and polyphenols. Other recognition elements are nucleic acids to evaluate the ability of different molecules to bind DNA. Sensors have also been developed for the detection of heavy metals in botanicals. Moreover, the analysis of mycotoxins and pesticides could represent another field of possible application. These alternative/complementary analytical methods represent tools which appear to be an analyst's dream: they are able to give rapid analysis responses; to operate directly on complex matrices, in many cases; to be selective and sensitive enough for the required application; to be portable and sometimes also disposable; and to have fast analysis times. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. Space Station Freedom Data Assessment Study

    NASA Technical Reports Server (NTRS)

    Johnson, Anngienetta R.; Deskevich, Joseph

    1990-01-01

    The SSF Data Assessment Study was initiated to identify payload and operations data requirements to be supported in the Space Station era. To initiate the study, payload requirements from the projected SSF user community were obtained using an electronic questionnaire. The results of the questionnaire were incorporated in a personal-computer-compatible database used for mission scheduling and end-to-end communications analyses. This paper discusses data flow paths and associated latencies, communications bottlenecks, resource needs versus availability, payload scheduling 'warning flags', and payload data loading requirements for each major milestone in the Space Station buildup sequence. This paper also presents the statistical and analytical assessments produced using the database, an experiment scheduling program, and a Space Station unique end-to-end simulation model. The modeling concepts and simulation methodologies presented in this paper provide a foundation for forecasting communication requirements and identifying modeling tools to be used in the SSF Tactical Operations Planning (TOP) process.

  10. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).
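
    As a hedged sketch of how such "big data" tools are typically applied to a large signal archive, the PySpark fragment below counts waveform records per station. The HDFS path, record layout, and field order are hypothetical; the actual GMP/LC cluster configuration is not described in the record above.

      # Count waveform records per station across a large text archive with Spark.
      # Path and CSV layout (station,channel,start_time,...) are hypothetical.
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("seismic-archive-counts").getOrCreate()
      lines = spark.sparkContext.textFile("hdfs:///archives/seismic/*.csv")

      station_counts = (lines
                        .map(lambda line: (line.split(",")[0], 1))
                        .reduceByKey(lambda a, b: a + b))

      for station, count in station_counts.collect():
          print(station, count)
      spark.stop()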

  11. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    support of the fleet. The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd... inventory level of 1K Polysiloxane in support of the fleet... Chapter I. C. CONCLUSIONS: As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane

  12. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14... by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51)... SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology. Any opinions, findings and

  13. Visualization and Analytics Software Tools for Peregrine System |

    Science.gov Websites

    Visualization and analytics software tools available for the Peregrine system include R, a language and environment for statistical computing and graphics (see the R web site); FastX, for OpenGL-based applications (see the FastX page); and ParaView, an open-source visualization application.

  14. Dynamic Vision for Control

    DTIC Science & Technology

    2006-07-27

    The goal of this project was to develop analytical and computational tools to make vision a viable sensor for... (vision.ucla.edu, July 27, 2006) ...sensors. We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry

  15. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Study Team Working Paper 3: Research Methods Discussion for the Study Team. Methods: Generating Empirical Materials. In grounded theory ... research I have conducted using these methods. UNCLASSIFIED. Analytical Tools for the Application of Operational Culture: A Case Study in the... "Survey and a Case Study," Kjeller, Norway: FFI. Glaser, B. G. & Strauss, A. L. (1967). "The Discovery of Grounded Theory"

  16. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  17. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  18. Is food allergen analysis flawed? Health and supply chain risks and a proposed framework to address urgent analytical needs.

    PubMed

    Walker, M J; Burns, D T; Elliott, C T; Gowland, M H; Mills, E N Clare

    2016-01-07

    Food allergy is an increasing problem for those affected, their families or carers, the food industry and for regulators. The food supply chain is highly vulnerable to fraud involving food allergens, risking fatalities and severe reputational damage to the food industry. Many facets are being pursued to ameliorate the difficulties, including better food labelling and the concept of thresholds of elicitation of allergy symptoms as risk management tools. These efforts depend to a high degree on the ability to reliably detect and quantify food allergens; yet all current analytical approaches exhibit severe deficiencies that jeopardise accurate results being produced, particularly in terms of the risks of false positive and false negative reporting. If we fail to realise the promise of current risk assessment and risk management of food allergens through lack of the ability to measure food allergens reproducibly and with traceability to an international unit of measurement, the analytical community will have failed a significant societal challenge. Three distinct but interrelated areas of analytical work are urgently needed to address the substantial gaps identified: (a) a coordinated international programme for the production of properly characterised, clinically relevant reference materials and calibrants for food allergen analysis; (b) an international programme to widen the scope of proteomics and genomics bioinformatics for the genera containing the major allergens to address problems in ELISA, MS and DNA methods; (c) the initiation of a coordinated international programme leading to reference methods for allergen proteins that provide results traceable to the SI. This article describes in more detail food allergy, the risks of inapplicable or flawed allergen analyses with examples, and a proposed framework, including clinically relevant incurred allergen concentrations, to address the currently unmet and urgent analytical requirements. Support for the above recommendations from food authorities, business organisations and National Measurement Institutes is important; however, transparent international coordination is essential. Thus our recommendations are primarily addressed to the European Commission, the Health and Food Safety Directorate, DG Santé. A global multidisciplinary consortium is required to provide a curated suite of data, including genomic and proteomic data on key allergenic food sources, made publicly available online.

  19. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    ...reduce paperwork, and improve customer service. Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability

  20. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.

    2015-07-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a thorough exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.
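
    The country-by-year exploration described above can be pictured with a small, hedged example: the pandas sketch below pivots a tiny made-up attribute table into a per-country temporal comparison. It illustrates the analysis pattern only and is not WSTAMP code or World Bank data.

      # Toy spatiotemporal attribute table: one value per country per year.
      import pandas as pd

      records = pd.DataFrame({
          "country": ["A", "A", "B", "B", "C", "C"],
          "year":    [2000, 2010, 2000, 2010, 2000, 2010],
          "life_expectancy": [68.0, 72.5, 55.1, 60.3, 74.2, 78.8],
      })

      # Temporal view: change in the attribute between the two years, per country.
      wide = records.pivot(index="country", columns="year", values="life_expectancy")
      wide["change_2000_2010"] = wide[2010] - wide[2000]
      print(wide.sort_values("change_2000_2010", ascending=False))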

  1. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. First, collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Second, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and has been redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.

  2. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers have independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
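
    For concreteness, the three performance measures named above can be computed in a few lines; the Python sketch below uses made-up daily discharge values rather than the Iowa gauging records.

      # Performance measures for an estimated ice-affected streamflow record:
      # average estimated discharge, mean daily error, and its standard deviation.
      import numpy as np

      baseline = np.array([12.0, 11.5, 10.8, 10.2, 9.9, 9.7])   # baseline record (m^3/s)
      estimate = np.array([11.6, 11.9, 10.1, 10.5, 9.4, 9.9])   # estimated record (m^3/s)

      daily_errors = estimate - baseline
      print("average estimated discharge:", estimate.mean())
      print("mean daily error:           ", daily_errors.mean())
      print("std of daily errors:        ", daily_errors.std(ddof=1))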

  3. Carbon Nanomaterial Based Biosensors for Non-Invasive Detection of Cancer and Disease Biomarkers for Clinical Diagnosis

    PubMed Central

    Tung, Thanh Tran

    2017-01-01

    The early diagnosis of diseases, e.g., Parkinson’s and Alzheimer’s disease, diabetes, and various types of cancer, and the monitoring of patients’ response to therapy play a critical role in clinical treatment; therefore, there is intensive research into the determination of many clinical analytes. In order to achieve point-of-care sensing in clinical practice, sensitive, selective, cost-effective, simple, reliable, and rapid analytical methods are required. Biosensors have become essential tools in biomarker sensing, in which electrode material and architecture play critical roles in achieving sensitive and stable detection. Carbon nanomaterials in the form of particles/dots, tubes/wires, and sheets have recently become indispensable elements of biosensor platforms due to their excellent mechanical, electronic, and optical properties. This review summarizes developments in this lucrative field by presenting major biosensor types and the variability of sensor platforms in biomedical applications. PMID:28825646

  4. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1997-01-01

    Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp Zeta-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
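
    As a hedged illustration of the approach described above (interpolate the sampled data, then evaluate the finite Fourier transform at arbitrary frequencies), the Python sketch below resamples a signal with a cubic spline and evaluates the transform on a user-chosen frequency grid. It is not Morelli's implementation and it uses a direct sum rather than the chirp z-transform, purely to make the arbitrary-resolution idea explicit.

      # Finite Fourier transform of sampled data via cubic-spline resampling and
      # direct evaluation at arbitrary frequencies (illustrative only).
      import numpy as np
      from scipy.interpolate import CubicSpline

      def finite_fourier_transform(t, x, freqs_hz, oversample=8):
          """Approximate the integral of x(t) * exp(-j*2*pi*f*t) dt over [t0, tN]."""
          spline = CubicSpline(t, x)
          t_fine = np.linspace(t[0], t[-1], oversample * len(t))
          x_fine = spline(t_fine)
          dt = t_fine[1] - t_fine[0]
          return np.array([np.sum(x_fine * np.exp(-2j * np.pi * f * t_fine)) * dt
                           for f in freqs_hz])

      t = np.arange(0, 2.0, 0.02)              # 50 Hz sampling for 2 s
      x = np.sin(2 * np.pi * 3.3 * t)          # 3.3 Hz tone, between FFT bin centers
      freqs = np.arange(2.0, 5.0, 0.05)        # arbitrary frequency resolution
      X = finite_fourier_transform(t, x, freqs)
      print("spectral peak near (Hz):", freqs[np.argmax(np.abs(X))])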

  5. Design, fabrication and test of graphite/epoxy metering truss structure components, phase 3

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design, materials, tooling, manufacturing processes, quality control, test procedures, and results associated with the fabrication and test of graphite/epoxy metering truss structure components exhibiting a near zero coefficient of thermal expansion are described. Analytical methods were utilized, with the aid of a computer program, to define the most efficient laminate configurations in terms of thermal behavior and structural requirements. This was followed by an extensive material characterization and selection program, conducted for several graphite/graphite/hybrid laminate systems to obtain experimental data in support of the analytical predictions. Mechanical property tests as well as the coefficient of thermal expansion tests were run on each laminate under study, the results of which were used as the selection criteria for the single most promising laminate. Further coefficient of thermal expansion measurement was successfully performed on three subcomponent tubes utilizing the selected laminate.

  6. The sweet tooth of biopharmaceuticals: importance of recombinant protein glycosylation analysis.

    PubMed

    Lingg, Nico; Zhang, Peiqing; Song, Zhiwei; Bardor, Muriel

    2012-12-01

    Biopharmaceuticals currently represent the fastest growing sector of the pharmaceutical industry, mainly driven by a rapid expansion in the manufacture of recombinant protein-based drugs. Glycosylation is the most prominent post-translational modification occurring on these protein drugs. It constitutes one of the critical quality attributes that requires thorough analysis for optimal efficacy and safety. This review examines the functional importance of glycosylation of recombinant protein drugs, illustrated using three examples of protein biopharmaceuticals: IgG antibodies, erythropoietin and glucocerebrosidase. Current analytical methods are reviewed as solutions for qualitative and quantitative measurements of glycosylation to monitor quality target product profiles of recombinant glycoprotein drugs. Finally, we propose a framework for designing the quality target product profile of recombinant glycoproteins and planning workflow for glycosylation analysis with the selection of available analytical methods and tools. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Progress in development of coated indexable cemented carbide inserts for machining of iron based work piece materials

    NASA Astrophysics Data System (ADS)

    Czettl, C.; Pohler, M.

    2016-03-01

    Increasing demands on the material properties of iron-based workpiece materials, e.g. for the turbine industry, complicate the machining process and reduce the lifetime of the cutting tools. Therefore, improved tool solutions, adapted to the requirements of the desired application, have to be developed. In particular, the interplay of macro- and micro-geometry, substrate material, coating and post-treatment processes is crucial for the durability of modern high-performance tool solutions. Improved and novel analytical methods allow a detailed understanding of the material properties responsible for the wear behaviour of the tools. These support the knowledge-based development of tailored cutting materials for selected applications. One important factor for such a solution is the proper choice of coating material, which can be synthesized by physical or chemical vapor deposition techniques. Within this work an overview of state-of-the-art coated carbide grades is presented, and application examples are shown to demonstrate their high efficiency. Machining processes for a material range from cast iron and low-carbon steels to high-alloyed steels are covered.

  8. Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft

    NASA Technical Reports Server (NTRS)

    Matusow, Carla

    1999-01-01

    As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphics' Satellite Tool Kit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) Routine product generation requires knowledge of multiple applications executing on a variety of hardware platforms. (2) Generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product. (3) Routine product generation requires several hours to complete. (4) User interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence. (5) Generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts autonomously retrieves necessary files, sequences and executes applications with correct input parameters, and delivers the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products. Additionally, AutoProducts has been designed as a mission-independent tool, and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals. The AutoProducts tool reduces many of the concerns associated with flight dynamics product generation. Although AutoProducts required a significant effort to develop because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and maximum product reliability. In addition, user satisfaction is significantly improved and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool, including its graphical interface and operational capabilities.

  9. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    PubMed

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges according to associated data values. We demonstrated the advantages of these new capabilities through three biological network visualization case studies: human disease association network, drug-target interaction network and protein-peptide mapping network. The architectural design of ProteoLens makes it suitable for bioinformatics expert data analysts who are experienced with relational database management to perform large-scale integrated network visual explorations. ProteoLens is a promising visual analytic platform that will facilitate knowledge discoveries in future network and systems biology studies.
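
    The "data association rule" concept described above (mapping node or edge IDs to data attributes and painting those values onto the network) can be illustrated in a few lines. The sketch below uses Python and networkx purely as a stand-in; ProteoLens itself is a JAVA tool driven by Oracle/PostgreSQL, and the genes and values here are hypothetical.

      # Annotate a small interaction network with expression values keyed by node ID.
      import networkx as nx

      edges = [("TP53", "MDM2"), ("MDM2", "MDM4"), ("TP53", "EP300")]   # hypothetical edges
      expression = {"TP53": 2.4, "MDM2": -1.1, "MDM4": 0.3, "EP300": 1.7}

      g = nx.Graph()
      g.add_edges_from(edges)

      # Association rule: node ID -> attribute value, applied as a node annotation.
      nx.set_node_attributes(g, expression, name="log2_fold_change")

      for node, data in g.nodes(data=True):
          print(node, data["log2_fold_change"])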

  10. Accessibility and usability OCW data: The UTPL OCW.

    PubMed

    Rodríguez, Germania; Perez, Jennifer; Cueva, Samanta; Torres, Rommel

    2017-08-01

    This data article provides a description of the data for the article entitled "A framework for improving web accessibility and usability of Open Course Ware sites" [3]. This Data in Brief article presents the data obtained from the accessibility and usability evaluation of the UTPL OCW. The data obtained from the framework evaluation consist of the manual evaluation of the standards criteria and the automatic evaluation with the tools Google PageSpeed and Google Analytics. In addition, this article presents the synthesized tables of the standards that are used by the framework to evaluate the accessibility and usability of OCW, and the questionnaires required to extract the data. As a result, the article also provides the data required to reproduce the evaluation of other OCW sites.

  11. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  12. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  13. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    three possibilities: AKR, B6, and BALB_B) and MUP Protein (containing two possibilities: Intact and Denatured), then you can view a plot of the Strain... the tags for the last two labels. Again, if the attribute Strain has three tags: AKR, B6, ... Distribution A. Approved for public release... AFRL-RH-WP-TR-2014-0131 A COMPREHENSIVE TOOL AND ANALYTICAL PATHWAY FOR DIFFERENTIAL MOLECULAR PROFILING AND BIOMARKER DISCOVERY

  14. Inducer analysis/pump model development

    NASA Astrophysics Data System (ADS)

    Cheng, Gary C.

    1994-03-01

    Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.

  15. Inducer analysis/pump model development

    NASA Technical Reports Server (NTRS)

    Cheng, Gary C.

    1994-01-01

    Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.

  16. HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals

    NASA Astrophysics Data System (ADS)

    Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar

    Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize TIM and TCHM. The complexities of TIM and TCHM challenge the current official quality control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. The chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in herbal drugs and to preserve such a "database" for further multifaceted, sustainable studies. Analytical separation techniques, for example high-performance liquid chromatography (HPLC), gas chromatography (GC) and mass spectrometry (MS), are among the most popular methods of choice used for quality control of raw materials and finished herbal products. The fingerprint analysis approach using high-performance thin-layer chromatography (HPTLC) has become the most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and at the same time create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication of a wide variety of herbal samples. Some examples demonstrate the role of fingerprinting in quality control and assessment.

  17. Modeling of the Global Water Cycle - Analytical Models

    Treesearch

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of the coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions that are often difficult to interpret, analytical...

  18. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  19. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  20. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.

  1. 40 CFR 161.180 - Enforcement analytical method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40 (Protection of Environment), Part 161 — Data Requirements for Registration of Antimicrobial Pesticides, Product Chemistry Data Requirements, § 161.180 Enforcement analytical method: An analytical method suitable for enforcement purposes must be...

  2. Composing Data Parallel Code for a SPARQL Graph Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste

    Big data analytics process large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility to perform in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 Shared-memory Multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
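
    For readers unfamiliar with SPARQL graph matching, the hedged Python sketch below builds a tiny RDF graph with rdflib and runs a query using a property path, one of the constructs that goes beyond basic triple matching. It only illustrates the query language; the tool described above compiles such queries into parallel OpenMP code rather than executing them this way.

      # Build a small RDF graph and run a SPARQL query with a property path.
      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import FOAF

      g = Graph()
      ex = Namespace("http://example.org/")
      g.add((ex.alice, FOAF.name, Literal("Alice")))
      g.add((ex.alice, FOAF.knows, ex.bob))
      g.add((ex.bob, FOAF.knows, ex.carol))

      query = """
      PREFIX foaf: <http://xmlns.com/foaf/0.1/>
      SELECT DISTINCT ?friend WHERE {
          ?person foaf:name "Alice" .
          ?person foaf:knows+ ?friend .      # one or more foaf:knows hops
      }
      """
      for row in g.query(query):
          print(row.friend)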

  3. Analytical considerations for mass spectrometry profiling in serum biomarker discovery.

    PubMed

    Whiteley, Gordon R; Colantonio, Simona; Sacconi, Andrea; Saul, Richard G

    2009-03-01

    The potential of using mass spectrometry profiling as a diagnostic tool has been demonstrated for a wide variety of diseases. Various cancers and cancer-related diseases have been the focus of much of this work because of both the paucity of good diagnostic markers and the knowledge that early diagnosis is the most powerful weapon in treating cancer. The implementation of mass spectrometry as a routine diagnostic tool has proved to be difficult, however, primarily because of the stringent controls that are required for the method to be reproducible. The method is evolving as a powerful guide to the discovery of biomarkers that could, in turn, be used either individually or in an array or panel of tests for early disease detection. Using proteomic patterns to guide biomarker discovery and the possibility of deployment in the clinical laboratory environment on current instrumentation or in a hybrid technology has the possibility of being the early diagnosis tool that is needed.

  4. Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications

    PubMed Central

    Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui

    2017-01-01

    Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to address biological or clinical problems that fail to be addressed by traditional population-based methods. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require developing a new analytical framework to extract new biology. In this review article, we highlight a few biological and clinical applications in which the microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interaction by creating a well-controlled on-chip microenvironment, capturing high-resolution snapshots of immune system functions in patients for better immunotherapy, and elucidating phosphoprotein signaling networks in cancer cells for guiding effective molecularly targeted therapies. PMID:28280819

  5. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  6. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  7. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  8. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition is a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  9. Using In Silico Fragmentation to Improve Routine Residue Screening in Complex Matrices.

    PubMed

    Kaufmann, Anton; Butcher, Patrick; Maden, Kathryn; Walker, Stephan; Widmer, Mirjam

    2017-12-01

    Targeted residue screening requires the use of reference substances in order to identify potential residues. This becomes a difficult issue when using multi-residue methods capable of analyzing several hundreds of analytes. Therefore, the capability of in silico fragmentation based on a structure database ("suspect screening") instead of physical reference substances for routine targeted residue screening was investigated. The detection of fragment ions that can be predicted or explained by in silico software was utilized to reduce the number of false positives. These "proof of principle" experiments were done with a tool that is integrated into a commercial MS vendor instrument operating software (UNIFI) as well as with a platform-independent MS tool (Mass Frontier). A total of 97 analytes belonging to different chemical families were separated by reversed-phase liquid chromatography and detected in a data-independent acquisition (DIA) mode using ion mobility hyphenated with quadrupole time-of-flight mass spectrometry. The instrument was operated in the MSE mode with alternating low and high energy traces. The fragments observed in product ion spectra were investigated using a "chopping" bond disconnection algorithm and a rule-based algorithm. The bond disconnection algorithm clearly explained more analyte product ions and a greater percentage of the spectral abundance than the rule-based software (92 out of the 97 compounds produced ≥1 explainable fragment ions). On the other hand, tests with a complex blank matrix (bovine liver extract) indicated that the chopping algorithm reports significantly more false positive fragments than the rule-based software.
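
    To make the "chopping" bond disconnection idea concrete, the hedged Python sketch below enumerates single, non-ring bond cleavages of a small molecule with RDKit and collects neutral fragment masses that could be compared against observed product ions. It is an illustration only, not the UNIFI or Mass Frontier implementation, and it ignores charge states, hydrogen transfers, and multi-bond cleavages.

      # Enumerate single, acyclic bond cleavages and collect fragment masses.
      from rdkit import Chem
      from rdkit.Chem import Descriptors

      mol = Chem.MolFromSmiles("CC(=O)Nc1ccc(O)cc1")   # paracetamol as a toy example

      fragment_masses = set()
      for bond in mol.GetBonds():
          if bond.IsInRing() or bond.GetBondType() != Chem.BondType.SINGLE:
              continue
          # Break one bond at a time; a real tool would also track H/charge bookkeeping.
          pieces = Chem.FragmentOnBonds(mol, [bond.GetIdx()], addDummies=False)
          for frag in Chem.GetMolFrags(pieces, asMols=True):
              fragment_masses.add(round(Descriptors.ExactMolWt(frag), 4))

      print(sorted(fragment_masses))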

  10. Bacterial discrimination by means of a universal array approach mediated by LDR (ligase detection reaction)

    PubMed Central

    Busti, Elena; Bordoni, Roberta; Castiglioni, Bianca; Monciardini, Paolo; Sosio, Margherita; Donadio, Stefano; Consolandi, Clarissa; Rossi Bernardi, Luigi; Battaglia, Cristina; De Bellis, Gianluca

    2002-01-01

    Background: PCR amplification of bacterial 16S rRNA genes provides the most comprehensive and flexible means of sampling bacterial communities. Sequence analysis of these cloned fragments can provide qualitative and quantitative insight into the microbial population under scrutiny, although this approach is not suited to large-scale screenings. Other methods, such as denaturing gradient gel electrophoresis, heteroduplex or terminal restriction fragment analysis, are rapid and therefore amenable to field-scale experiments. A very recent addition to these analytical tools is represented by microarray technology. Results: Here we present our results using a Universal DNA Microarray approach as an analytical tool for bacterial discrimination. The proposed procedure is based on the properties of the DNA ligation reaction and requires the design of two probes specific for each target sequence. One oligo carries a fluorescent label and the other a unique sequence (cZipCode or complementary ZipCode) which identifies a ligation product. Ligated fragments, obtained in the presence of a proper template (a PCR-amplified fragment of the 16S rRNA gene), contain either the fluorescent label or the unique sequence and therefore are addressed to the location on the microarray where the ZipCode sequence has been spotted. Such an array is therefore "Universal", being unrelated to a specific molecular analysis. Here we present the design of probes specific for some groups of bacteria and their application to bacterial diagnostics. Conclusions: The combined use of selective probes, ligation reaction and the Universal Array approach yielded an analytical procedure with a good power of discrimination among bacteria. PMID:12243651

  11. Using In Silico Fragmentation to Improve Routine Residue Screening in Complex Matrices

    NASA Astrophysics Data System (ADS)

    Kaufmann, Anton; Butcher, Patrick; Maden, Kathryn; Walker, Stephan; Widmer, Mirjam

    2017-12-01

    Targeted residue screening requires the use of reference substances in order to identify potential residues. This becomes a difficult issue when using multi-residue methods capable of analyzing several hundreds of analytes. Therefore, the capability of in silico fragmentation based on a structure database ("suspect screening") instead of physical reference substances for routine targeted residue screening was investigated. The detection of fragment ions that can be predicted or explained by in silico software was utilized to reduce the number of false positives. These "proof of principle" experiments were done with a tool that is integrated into a commercial MS vendor instrument operating software (UNIFI) as well as with a platform-independent MS tool (Mass Frontier). A total of 97 analytes belonging to different chemical families were separated by reversed phase liquid chromatography and detected in a data-independent acquisition (DIA) mode using ion mobility hyphenated with quadrupole time of flight mass spectrometry. The instrument was operated in the MSE mode with alternating low and high energy traces. The fragments observed from product ion spectra were investigated using a "chopping" bond disconnection algorithm and a rule-based algorithm. The bond disconnection algorithm clearly explained more analyte product ions and a greater percentage of the spectral abundance than the rule-based software (92 out of the 97 compounds produced ≥1 explainable fragment ions). On the other hand, tests with a complex blank matrix (bovine liver extract) indicated that the chopping algorithm reports significantly more false positive fragments than the rule based software.
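
    As a rough illustration of the "chopping" idea, the following sketch (assuming RDKit is installed; this is not the UNIFI or Mass Frontier implementation) breaks each acyclic single bond of a candidate structure and checks whether any fragment mass matches an observed product ion within a ppm tolerance. Charge-state handling, hydrogen rearrangements and ring cleavages are deliberately ignored, and the example SMILES and m/z values are illustrative only.

      # Simplified bond-disconnection ("chopping") check using RDKit.
      from rdkit import Chem
      from rdkit.Chem import Descriptors

      PROTON = 1.007276  # crude [fragment + H]+ estimate

      def explainable_ions(smiles, observed_mz, tol_ppm=10.0):
          mol = Chem.MolFromSmiles(smiles)
          candidate_masses = set()
          for bond in mol.GetBonds():
              if bond.IsInRing():
                  continue                      # ring cleavages not considered here
              pieces = Chem.FragmentOnBonds(mol, [bond.GetIdx()], addDummies=False)
              for frag in Chem.GetMolFrags(pieces, asMols=True, sanitizeFrags=False):
                  candidate_masses.add(Descriptors.ExactMolWt(frag) + PROTON)
          return [mz for mz in observed_mz
                  if any(abs(mz - m) / mz * 1e6 <= tol_ppm for m in candidate_masses)]

      # Illustrative call: sulfamethazine with two made-up product ion m/z values.
      print(explainable_ions("Cc1cc(C)nc(NS(=O)(=O)c2ccc(N)cc2)n1", [186.03, 124.09]))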

  12. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform

    PubMed Central

    Poucke, Sven Van; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; Deyne, Cathy De

    2016-01-01

    With the accumulation of large amounts of health-related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting edge predictive methods and data manipulation require substantial programming skills, limiting their direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner’s Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As a use case, correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research. PMID:26731286
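
    For readers who prefer code to a visual environment, the underlying use case can be approximated in a few lines of Python. The file name and column names below are assumptions for illustration, not the MIMIC-II schema, and the study itself used RapidMiner/Radoop rather than scikit-learn.

      # Hedged analogue of the platelet-count vs. ICU-survival showcase.
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      df = pd.read_csv("platelet_counts.csv")          # hypothetical extract
      df = df.dropna(subset=["min_platelet_count", "icu_survival"])
      X = df[["min_platelet_count"]].values
      y = df["icu_survival"].values                    # 1 = survived, 0 = died

      auc = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc")
      print("mean cross-validated AUC:", auc.mean())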

  13. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform.

    PubMed

    Van Poucke, Sven; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; De Deyne, Cathy

    2016-01-01

    With the accumulation of large amounts of health-related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting edge predictive methods and data manipulation require substantial programming skills, limiting their direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As a use case, correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research.

  14. Application of the Life Cycle Analysis and the Building Information Modelling Software in the Architectural Climate Change-Oriented Design Process

    NASA Astrophysics Data System (ADS)

    Gradziński, Piotr

    2017-10-01

    As the world's climate is changing (inter alia, under the influence of architectural activity), the author attempts to reorient design practice primarily towards the use of, and adaptation to, climatic conditions. Applying, in the early stages of the architectural design process, Life Cycle Analysis (LCA) and digital BIM (Building Information Modelling) analytical tools defines the overriding requirements which the designer/architect should meet. The first part of the text characterizes the influence of architectural activity (consumption, pollution, waste, etc.) and of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) in the sense of a direct negative environmental impact. The second part reviews the methods and analytical techniques that prevent these negative influences: firstly, studying the building through a Life Cycle Analysis of its structure (e.g. materials) and functioning (e.g. energy consumption) across the stages before use, use, and after use; secondly, using digital analytical tools to determine the benefits of running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building form. In conclusion, the author's results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the process of shaping the architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimension, orienting the building design process with respect to broadly understood climatic change.
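
    The embodied-carbon side of such an early-stage LCA reduces to multiplying each material quantity by an emission factor and summing; a minimal sketch follows, with placeholder quantities and factors rather than values from a recognised LCA database.

      # Early-stage embodied carbon estimate (illustrative numbers only).
      bill_of_materials = {            # material: (mass in kg, kg CO2e per kg)
          "concrete": (250_000, 0.13),
          "steel":    (18_000, 1.85),
          "timber":   (12_000, 0.44),
      }
      embodied_carbon = sum(mass * factor for mass, factor in bill_of_materials.values())
      print(f"embodied carbon: {embodied_carbon / 1000:.1f} t CO2e")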

  15. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design, track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, or capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  16. New software solutions for analytical spectroscopists

    NASA Astrophysics Data System (ADS)

    Davies, Antony N.

    1999-05-01

    Analytical spectroscopists must be computer literate to effectively carry out the tasks assigned to them. This has often been resisted within organizations with insufficient funds to equip their staff properly, a lack of desire to deliver the essential training and a basic resistance amongst staff to learn the new techniques required for computer assisted analysis. In the past these problems were compounded by seriously flawed software which was being sold for spectroscopic applications. Owing to the limited market for such complex products the analytical spectroscopist often was faced with buying incomplete and unstable tools if the price was to remain reasonable. Long product lead times meant spectrometer manufacturers often ended up offering systems running under outdated and sometimes obscure operating systems. Not only did this mean special staff training for each instrument where the knowledge gained on one system could not be transferred to the neighbouring system but these spectrometers were often only capable of running in a stand-alone mode, cut-off from the rest of the laboratory environment. Fortunately a number of developments in recent years have substantially changed this depressing picture. A true multi-tasking operating system with a simple graphical user interface, Microsoft Windows NT4, has now been widely introduced into the spectroscopic computing environment which has provided a desktop operating system which has proved to be more stable and robust as well as requiring better programming techniques of software vendors. The opening up of the Internet has provided an easy way to access new tools for data handling and has forced a substantial re-think about results delivery (for example Chemical MIME types, IUPAC spectroscopic data exchange standards). Improved computing power and cheaper hardware now allows large spectroscopic data sets to be handled without too many problems. This includes the ability to carry out chemometric operations in minutes rather than hours. Fast networks now enable data analysis of even multi-dimensional spectroscopic data sets remote from the measuring instrument. A strong tendency to opt for a more unified graphical user interface which is substantially more user friendly allows even inexperienced users to rapidly get acquainted with even the complex mathematical analyses. Some examples of new spectroscopic software products will be given to demonstrate the aforesaid points and highlight the ease of integration into a modern analytical spectroscopy workplace.

  17. Combined sensing platform for advanced diagnostics in exhaled mouse breath

    NASA Astrophysics Data System (ADS)

    Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris

    2013-03-01

    Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for therapeutic progression monitoring, as quantitative compositional analysis of breath can be related to biomarker panels provided by a specific physiological condition invoked by e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on e.g., the metabolic state, and since in particular volatile organic constituents (VOCs) in exhaled breath may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as e.g., mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow to simultaneously analyze total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR) and O2 (via luminescence) in exhaled breath, yet, establishing a single diagnostic platform as both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors for establishing a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at potentially continuously analyzing the TTR via iHWGs and LS flow-through sensors requiring only minute (< 1 mL) sample volumes. Furthermore, this study explores non-linearities observed for the calibration functions of 12CO2 and 13CO2 potentially resulting from effects related to optical collision diameters e.g., in presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.

  18. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    PubMed

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role to improve the productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. Presentation of the proposed model application is provided by a numerical example based on the collection of data by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating the machine tool in the uncertain environment.
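
    The crisp COPRAS ranking step that the AHP-derived weights feed into can be sketched as follows; the decision matrix, weights and criterion types are invented, and the paper's fuzzy extensions are not reproduced here.

      # Crisp COPRAS ranking sketch (illustrative data).
      import numpy as np

      X = np.array([                    # rows: machine tools, columns: criteria
          [3200.0, 0.012, 40.0],
          [2500.0, 0.015, 35.0],
          [2900.0, 0.010, 55.0],
      ])
      w = np.array([0.5, 0.3, 0.2])             # weights, e.g. from (fuzzy) AHP
      benefit = np.array([True, False, False])  # True = larger is better

      D = w * (X / X.sum(axis=0))               # normalised, weighted matrix
      S_plus = D[:, benefit].sum(axis=1)        # benefit-criteria contribution
      S_minus = D[:, ~benefit].sum(axis=1)      # cost-criteria contribution
      Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
      print("ranking (best first):", np.argsort(-Q), "utility %:", (100 * Q / Q.max()).round(1))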

  19. Effect-directed analysis supporting monitoring of aquatic ...

    EPA Pesticide Factsheets

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxi

  20. A collision scheme for hybrid fluid-particle simulation of plasmas

    NASA Astrophysics Data System (ADS)

    Nguyen, Christine; Lim, Chul-Hyun; Verboncoeur, John

    2006-10-01

    Desorption phenomena at the wall of a tokamak can lead to the introduction of impurities at the edge of a thermonuclear plasma. In particular, the use of carbon as a constituent of the tokamak wall, as planned for ITER, requires the study of carbon and hydrocarbon transport in the plasma, including understanding of collisional interaction with the plasma. These collisions can result in new hydrocarbons, hydrogen, secondary electrons and so on. Computational modeling is a primary tool for studying these phenomena. XOOPIC [1] and OOPD1 are widely used computer modeling tools for the simulation of plasmas. Both are particle type codes. Particle simulation gives more kinetic information than fluid simulation, but more computation time is required. In order to reduce this disadvantage, hybrid simulation has been developed, and applied to the modeling of collisions. Present particle simulation tools such as XOOPIC and OOPD1 employ a Monte Carlo model for the collisions between particle species and a neutral background gas defined by its temperature and pressure. In fluid-particle hybrid plasma models, collisions include combinations of particle and fluid interactions categorized by projectile-target pairing: particle-particle, particle-fluid, and fluid-fluid. For verification of this hybrid collision scheme, we compare simulation results to analytic solutions for classical plasma models. [1] Verboncoeur et al. Comput. Phys. Comm. 87, 199 (1995).

  1. Medical Applications of Laser Induced Breakdown Spectroscopy

    NASA Astrophysics Data System (ADS)

    Pathak, A. K.; Rai, N. K.; Singh, Ankita; Rai, A. K.; Rai, Pradeep K.; Rai, Pramod K.

    2014-11-01

    The sedentary lifestyle of human beings has resulted in various diseases, and in turn we require a potential tool that can be used to address various issues related to human health. Laser Induced Breakdown Spectroscopy (LIBS) is one such potential optical analytical tool that has become quite popular because of its distinctive features, which include applicability to any type/phase of sample with almost no sample preparation. Several reports discuss the capabilities of LIBS for various applications in different branches of science that cannot be addressed by traditional analytical methods, but only a few reports are available on the medical applications of LIBS. In the present work, LIBS has been applied to understand the role of various elements in the formation of gallstones (formed under the empyema and mucocele states of the gallbladder), using samples and patient histories collected from the Purvanchal region of Uttar Pradesh, India. The occurrence statistics of gallstones in the present study reveal a higher occurrence of gallstones in female patients. Gallstone occurrence was found to be more prevalent among male patients with a habit of tobacco chewing, smoking, or drinking alcohol. This work further reports an in-situ LIBS study of a deciduous tooth and an in-vivo LIBS study of human nail.

  2. Interactive entity resolution in relational data: a visual analytic tool and its evaluation.

    PubMed

    Kang, Hyunmo; Getoor, Lise; Shneiderman, Ben; Bilgic, Mustafa; Licamele, Louis

    2008-01-01

    Databases often contain uncertain and imprecise references to real-world entities. Entity resolution, the process of reconciling multiple references to underlying real-world entities, is an important data cleaning process required before accurate visualization or analysis of the data is possible. In many cases, in addition to noisy data describing entities, there is data describing the relationships among the entities. This relational data is important during the entity resolution process; it is useful both for the algorithms which determine likely database references to be resolved and for visual analytic tools which support the entity resolution process. In this paper, we introduce a novel user interface, D-Dupe, for interactive entity resolution in relational data. D-Dupe effectively combines relational entity resolution algorithms with a novel network visualization that enables users to make use of an entity's relational context for making resolution decisions. Since resolution decisions often are interdependent, D-Dupe facilitates understanding this complex process through animations which highlight combined inferences and a history mechanism which allows users to inspect chains of resolution decisions. An empirical study with 12 users confirmed the benefits of the relational context visualization on the performance of entity resolution tasks in relational data in terms of time as well as users' confidence and satisfaction.
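
    A toy illustration (not D-Dupe's algorithm) of why relational context helps: blend a string similarity of the entity labels with the Jaccard overlap of their relational neighbourhoods, such as co-author sets. The names and neighbour sets below are hypothetical.

      # Relational entity-resolution score: string similarity + neighbourhood overlap.
      from difflib import SequenceMatcher

      def pair_score(name_a, name_b, neighbours_a, neighbours_b, alpha=0.6):
          string_sim = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
          union = neighbours_a | neighbours_b
          relational_sim = len(neighbours_a & neighbours_b) / len(union) if union else 0.0
          return alpha * string_sim + (1 - alpha) * relational_sim

      score = pair_score("J. A. Smith", "Smith, John A.",
                         {"Lee", "Garcia", "Chen"}, {"Lee", "Garcia", "Patel"})
      print(round(score, 3))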

  3. Cumulative biological impacts framework for solar energy projects in the California Desert

    USGS Publications Warehouse

    Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John

    2013-01-01

    This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.

  4. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to the definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve the manufacture of existing commercial products.
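
    The two workhorse methods named above are available in standard libraries; a minimal, non-pharma-specific sketch on synthetic "spectra" (assuming scikit-learn) shows PCA used for an exploratory overview and PLS regression used to predict a quality attribute.

      # PCA for exploration, PLS for prediction, on synthetic spectral data.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(50, 200))            # 50 batches x 200 wavelengths
      assay = spectra[:, 10] * 2.0 + rng.normal(scale=0.1, size=50)

      scores = PCA(n_components=2).fit_transform(spectra)   # scores for a PCA plot
      pls = PLSRegression(n_components=3).fit(spectra, assay)
      print(scores.shape, round(pls.score(spectra, assay), 3))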

  5. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    PubMed

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

    The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. Understanding drug penetration into skin, and the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow visualization of the skin and its morphology, the drug carriers and drugs, their transport across the skin and possible interactions, as well as the effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely Fluorescence Lifetime Imaging Microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
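
    At its core, FLIM recovers a fluorescence lifetime from a photon-arrival histogram; a minimal sketch fitting a mono-exponential decay I(t) = A*exp(-t/tau) + B to synthetic data follows. Real FLIM analysis additionally handles instrument-response convolution, multi-exponential decays and per-pixel fitting, which are omitted here.

      # Mono-exponential lifetime fit to a synthetic TCSPC histogram.
      import numpy as np
      from scipy.optimize import curve_fit

      def decay(t, amplitude, tau, offset):
          return amplitude * np.exp(-t / tau) + offset

      t = np.linspace(0, 12.5, 256)                       # ns
      counts = np.random.default_rng(1).poisson(decay(t, 1000.0, 2.3, 20.0))
      popt, _ = curve_fit(decay, t, counts, p0=(800.0, 1.0, 10.0))
      print(f"fitted lifetime: {popt[1]:.2f} ns")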

  6. Clinical validation of the 50 gene AmpliSeq Cancer Panel V2 for use on a next generation sequencing platform using formalin fixed, paraffin embedded and fine needle aspiration tumour specimens.

    PubMed

    Rathi, Vivek; Wright, Gavin; Constantin, Diana; Chang, Siok; Pham, Huong; Jones, Kerryn; Palios, Atha; Mclachlan, Sue-Anne; Conron, Matthew; McKelvie, Penny; Williams, Richard

    2017-01-01

    The advent of massively parallel sequencing has caused a paradigm shift in the ways cancer is treated, as personalised therapy becomes a reality. More and more laboratories are looking to introduce next generation sequencing (NGS) as a tool for mutational analysis, as this technology has many advantages compared to conventional platforms like Sanger sequencing. In Australia all massively parallel sequencing platforms are still considered in-house in vitro diagnostic tools by the National Association of Testing Authorities (NATA) and a comprehensive analytical validation of all assays, and not just mere verification, is a strict requirement before accreditation can be granted for clinical testing on these platforms. Analytical validation of assays on NGS platforms can prove to be extremely challenging for pathology laboratories. Although there are many affordable and easily accessible NGS instruments available, there are no standardised guidelines as yet for clinical validation of NGS assays. We present an accreditation development procedure that was both comprehensive and applicable in a setting of hospital laboratory for NGS services. This approach may also be applied to other NGS applications in service laboratories. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  7. Modeling the Multi-Body System Dynamics of a Flexible Solar Sail Spacecraft

    NASA Technical Reports Server (NTRS)

    Kim, Young; Stough, Robert; Whorton, Mark

    2005-01-01

    Solar sail propulsion systems enable a wide range of space missions that are not feasible with current propulsion technology. Hardware concepts and analytical methods have matured through ground development to the point that a flight validation mission is now realizable. Much attention has been given to modeling the structural dynamics of the constituent elements, but to date an integrated system level dynamics analysis has been lacking. Using a multi-body dynamics and control analysis tool called TREETOPS, the coupled dynamics of the sailcraft bus, sail membranes, flexible booms, and control system sensors and actuators of a representative solar sail spacecraft are investigated to assess system level dynamics and control issues. With this tool, scaling issues and parametric trade studies can be performed to study achievable performance, control authority requirements, and control/structure interaction assessments.

  8. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  9. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  10. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
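
    Before such networks reach a visual analytics tool like CGV or GTX, they are typically constructed by correlating grid-point time series and thresholding the correlation matrix; a sketch on synthetic data (assuming numpy and networkx) is shown below, with the threshold chosen arbitrarily.

      # Build a toy climate network from a thresholded correlation matrix.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(42)
      series = rng.normal(size=(100, 60))        # 100 grid points x 60 time steps
      corr = np.corrcoef(series)

      adjacency = (np.abs(corr) > 0.35) & ~np.eye(len(corr), dtype=bool)
      G = nx.from_numpy_array(adjacency.astype(int))
      degrees = dict(G.degree())                 # a basic network measure
      print(G.number_of_nodes(), G.number_of_edges(), max(degrees.values()))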

  11. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  12. Large scale and cloud-based multi-model analytics experiments on climate change data in the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni

    2017-04-01

    In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate models intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amount of data (multi-terabyte order) related to the output of several climate models simulations as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets, in the context of a large scale distributed testbed across EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general "environment" of the case study relates to: (i) multi-model data analysis inter-comparison challenges; (ii) addressed on CMIP5 data; and (iii) which are made available through the IS-ENES/ESGF infrastructure. The added value of the solution proposed in the INDIGO-DataCloud project are summarized in the following: (i) it implements a different paradigm (from client- to server-side); (ii) it intrinsically reduces data movement; (iii) it makes lightweight the end-user setup; (iv) it fosters re-usability (of data, final/intermediate products, workflows, sessions, etc.) since everything is managed on the server-side; (v) it complements, extends and interoperates with the ESGF stack; (vi) it provides a "tool" for scientists to run multi-model experiments, and finally; and (vii) it can drastically reduce the time-to-solution for these experiments from weeks to hours. At the time the contribution is being written, the proposed testbed represents the first concrete implementation of a distributed multi-model experiment in the ESGF/CMIP context joining server-side and parallel processing, end-to-end workflow management and cloud computing. As opposed to the current scenario based on search & discovery, data download, and client-based data analysis, the INDIGO-DataCloud architectural solution described in this contribution addresses the scientific computing & analytics requirements by providing a paradigm shift based on server-side and high performance big data frameworks jointly with two-level workflow management systems realized at the PaaS level via a cloud infrastructure.
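
    The precipitation trend analysis at the heart of the use case amounts to a per-grid-cell least-squares trend over time; a local, synthetic-data sketch follows, standing in for what the described server-side workflow computes on CMIP5 output without moving the data (this is not the INDIGO-DataCloud implementation).

      # Per-grid-cell precipitation trend on a synthetic time x lat x lon cube.
      import numpy as np

      years = np.arange(1950, 2006)
      rng = np.random.default_rng(7)
      precip = rng.gamma(shape=2.0, scale=400.0, size=(len(years), 18, 36))

      flat = precip.reshape(len(years), -1)
      slopes = np.polyfit(years, flat, deg=1)[0].reshape(18, 36)  # mm per year
      print(slopes.shape, float(slopes.mean()))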

  13. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  14. Environmental screening tools for assessment of infrastructure plans based on biodiversity preservation and global warming (PEIT, Spain)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Montero, Luis G., E-mail: luisgonzaga.garcia@upm.e; Lopez, Elena, E-mail: elopez@caminos.upm.e; Monzon, Andres, E-mail: amonzon@caminos.upm.e

    Most Strategic Environmental Assessment (SEA) research has been concerned with SEA as a procedure, and there have been relatively few developments and tests of analytical methodologies. The first stage of the SEA is the 'screening', which is the process whereby a decision is taken on whether or not SEA is required for a particular programme or plan. The effectiveness of screening and SEA procedures will depend on how well the assessment fits into the planning from the early stages of the decision-making process. However, it is difficult to prepare the environmental screening for an infrastructure plan involving a whole country. To be useful, such methodologies must be fast and simple. We have developed two screening tools which would make it possible to estimate promptly the overall impact an infrastructure plan might have on biodiversity and global warming for a whole country, in order to generate planning alternatives, and to determine whether or not SEA is required for a particular infrastructure plan.

  15. Experiences with the Twitter Health Surveillance (THS) System

    PubMed Central

    Rodríguez-Martínez, Manuel

    2018-01-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that often times are tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype. PMID:29607412

  16. Beam position monitor engineering

    NASA Astrophysics Data System (ADS)

    Smith, Stephen R.

    1997-01-01

    The design of beam position monitors often involves challenging system design choices. Position transducers must be robust, accurate, and generate adequate position signal without unduly disturbing the beam. Electronics must be reliable and affordable, usually while meeting tough requirements on precision, accuracy, and dynamic range. These requirements may be difficult to achieve simultaneously, leading the designer into interesting opportunities for optimization or compromise. Some useful techniques and tools are shown. Both finite element analysis and analytic techniques will be used to investigate quasi-static aspects of electromagnetic fields such as the impedance of and the coupling of beam to striplines or buttons. Finite-element tools will be used to understand dynamic aspects of the electromagnetic fields of beams, such as wake fields and transmission-line and cavity effects in vacuum-to-air feedthroughs. Mathematical modeling of electrical signals through a processing chain will be demonstrated, in particular to illuminate areas where neither a pure time-domain nor a pure frequency-domain analysis is obviously advantageous. Emphasis will be on calculational techniques, in particular on using both time domain and frequency domain approaches to the applicable parts of interesting problems.
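
    The position estimate that this engineering ultimately serves is often the classic difference-over-sum of opposing electrode signals, scaled by a sensitivity factor obtained from the kind of field modelling discussed above; a sketch with illustrative numbers:

      # Difference-over-sum position estimate for a stripline/button BPM.
      def bpm_position(v_right, v_left, k_mm=10.0):
          """Approximate horizontal beam offset in mm from two electrode amplitudes."""
          return k_mm * (v_right - v_left) / (v_right + v_left)

      print(f"{bpm_position(1.05, 0.95):.3f} mm off-centre")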

  17. Experiences with the Twitter Health Surveillance (THS) System.

    PubMed

    Rodríguez-Martínez, Manuel

    2017-06-01

    Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that often times are tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype.

  18. Amperometric Enzyme-Based Biosensors for Application in Food and Beverage Industry

    NASA Astrophysics Data System (ADS)

    Csöregi, Elisabeth; Gáspár, Szilveszter; Niculescu, Mihaela; Mattiasson, Bo; Schuhmann, Wolfgang

    Continuous, sensitive, selective, and reliable monitoring of a large variety of different compounds in various food and beverage samples is of increasing importance to assure high quality and to trace any possible source of contamination of food and beverages. Most of the presently used classical analytical methods often require expensive instrumentation, long analysis times and well-trained staff. Amperometric enzyme-based biosensors, on the other hand, have emerged in the last decade from basic science to useful tools with very promising application possibilities in the food and beverage industry. Amperometric biosensors are in general highly selective, sensitive, relatively cheap, and easy to integrate into continuous analysis systems. A successful application of such sensors for industrial purposes, however, requires a sensor design which satisfies the specific needs of monitoring the targeted analyte in the particular application. Since each individual application needs different operational conditions and sensor characteristics, it is obvious that biosensors have to be tailored for the particular case. The characteristics of the biosensors depend on the biorecognition element used (enzyme), the nature of the signal transducer (electrode material) and the communication between these two elements (electron-transfer pathway).

  19. High frequency flow-structural interaction in dense subsonic fluids

    NASA Technical Reports Server (NTRS)

    Liu, Baw-Lin; Ofarrell, J. M.

    1995-01-01

    Prediction of the detailed dynamic behavior in rocket propellant feed systems and engines and other such high-energy fluid systems requires precise analysis to assure structural performance. Designs sometimes require placement of bluff bodies in a flow passage. Additionally, there are flexibilities in ducts, liners, and piping systems. A design handbook and interactive data base have been developed for assessing flow/structural interactions to be used as a tool in design and development, to evaluate applicable geometries before problems develop, or to eliminate or minimize problems with existing hardware. This is a compilation of analytical/empirical data and techniques to evaluate detailed dynamic characteristics of both the fluid and structures. These techniques have direct applicability to rocket engine internal flow passages, hot gas drive systems, and vehicle propellant feed systems. Organization of the handbook is by basic geometries for estimating Strouhal numbers, added mass effects, mode shapes for various end constraints, critical onset flow conditions, and possible structural response amplitudes. Emphasis is on dense fluids and high structural loading potential for fatigue at low subsonic flow speeds where high-frequency excitations are possible. Avoidance and corrective measure illustrations are presented together with analytical curve fits for predictions compiled from a comprehensive data base.
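
    A first screening step of the kind the handbook supports is to estimate the vortex shedding frequency f = St * U / D for a bluff body and compare it with a structural natural frequency to flag possible lock-in; the values below are illustrative only.

      # Strouhal-number screening for flow-induced vibration.
      def shedding_frequency(strouhal, velocity_m_s, diameter_m):
          return strouhal * velocity_m_s / diameter_m

      f_shed = shedding_frequency(strouhal=0.2, velocity_m_s=12.0, diameter_m=0.05)
      f_structure = 55.0                    # Hz, e.g. first bending mode
      separation = abs(f_shed - f_structure) / f_structure
      print(f"shedding {f_shed:.1f} Hz, separation from structural mode {separation:.0%}")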

  20. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  1. Semi-Analytical Models of CO2 Injection into Deep Saline Aquifers: Evaluation of the Area of Review and Leakage through Abandoned Wells

    EPA Science Inventory

    This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...

  2. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  3. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    NASA Astrophysics Data System (ADS)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Nebraska, Omaha). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily facilitate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, recovery, with low blank and carry-over for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Nebraska, Omaha) as a powerful tool in radiogenic and non-traditional isotope research.

  4. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge the metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch leveraging resist shrinkage patterns. These patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict the electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes its associated cost multiplies. It may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely actionable decisions such as rework, scrap, feedforward, feedback of predicted information or information derived from prediction to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
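
    The "predictive metrology" idea is, in essence, supervised regression from indirect measurements to the quantity of interest; a hedged sketch on synthetic data follows (the features, target and model choice are illustrative, and the paper does not disclose its exact algorithm).

      # Regression from resist-shrinkage features to a critical dimension (synthetic).
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      features = rng.normal(size=(300, 5))                 # e.g. CD-SEM shrinkage features
      cd = 18.0 + features @ np.array([0.8, -0.3, 0.5, 0.0, 0.1])
      cd += rng.normal(scale=0.2, size=300)                # measurement noise

      r2 = cross_val_score(GradientBoostingRegressor(), features, cd, cv=5, scoring="r2")
      print("cross-validated R^2:", r2.mean().round(3))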

  5. Mining patterns in persistent surveillance systems with smart query and visual analytics

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.; Shirkhodaie, Amir

    2013-05-01

    In Persistent Surveillance Systems (PSS) the ability to detect and characterize events geospatially helps take pre-emptive steps to counter an adversary's actions. An interactive Visual Analytic (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. The need for identifying and offsetting these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require a method for their filtration before being processed further. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of an array of attributes extracted from the semantically annotated messages generated by sensors. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a Temporal-Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. In this paper, we present a new visual analytic tool for testing and evaluating group activities detected under this scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
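
    The temporal alignment step named above is plain dynamic time warping; a minimal dynamic-programming implementation is sketched below on stand-in feature sequences (the GMM/EM scoring of matched patterns is not reproduced).

      # Classic DTW distance between two sequences of feature vectors.
      import numpy as np

      def dtw_distance(a, b):
          a, b = np.atleast_2d(a), np.atleast_2d(b)
          n, m = len(a), len(b)
          cost = np.full((n + 1, m + 1), np.inf)
          cost[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = np.linalg.norm(a[i - 1] - b[j - 1])
                  cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
          return cost[n, m]

      query = np.array([[0.0], [1.0], [2.0], [1.0]])
      reference = np.array([[0.0], [0.9], [2.1], [2.0], [1.1]])
      print(round(dtw_distance(query, reference), 3))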

  6. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  7. Topochemical Analysis of Cell Wall Components by TOF-SIMS.

    PubMed

    Aoki, Dan; Fukushima, Kazuhiko

    2017-01-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is a developing analytical tool and a type of imaging mass spectrometry. TOF-SIMS provides mass spectral information with a lateral resolution on the order of submicrons and has widespread applicability. It is sometimes described as a surface analysis method that requires no sample pretreatment; however, several points need to be taken into account to fully exploit the capabilities of TOF-SIMS. In this chapter, we introduce methods for TOF-SIMS sample treatment, as well as basic knowledge for the analysis of TOF-SIMS spectral and image data from wood samples.

  8. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of computational tools that are complex and robust yet sufficiently simple and inexpensive, supporting design and the optimization of energy consumption. This paper demonstrates how such seemingly contradictory requirements can be reconciled, using a simplified non-stationary thermal model of a building motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
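
    A minimal sketch of the electric-circuit analogy mentioned above: a one-node resistance-capacitance (RC) model of a building interior integrated in time. All parameter values and forcing terms are illustrative assumptions, not taken from the paper.

    ```python
    # Hedged sketch: lumped 1R1C building thermal model, C dT/dt = (T_ext - T)/R + Q.
    import numpy as np
    from scipy.integrate import solve_ivp

    R = 0.005   # K/W  effective envelope resistance (assumed)
    C = 2.0e7   # J/K  effective thermal capacitance (assumed)

    def exterior_temp(t):            # daily sinusoidal outdoor temperature [deg C]
        return 5.0 + 5.0 * np.sin(2 * np.pi * t / 86400.0)

    def heating_power(t):            # constant heating input [W] (assumed)
        return 2000.0

    def dTdt(t, T):                  # right-hand side of the circuit-analogy ODE
        return [((exterior_temp(t) - T[0]) / R + heating_power(t)) / C]

    sol = solve_ivp(dTdt, [0, 3 * 86400], [20.0], max_step=600.0)
    print("interior temperature after 3 days: %.2f deg C" % sol.y[0, -1])
    ```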

  9. Protein-centric N-glycoproteomics analysis of membrane and plasma membrane proteins.

    PubMed

    Sun, Bingyun; Hood, Leroy

    2014-06-06

    The advent of proteomics technology has transformed our understanding of biological membranes. The challenges for studying membrane proteins have inspired the development of many analytical and bioanalytical tools, and the techniques of glycoproteomics have emerged as an effective means to enrich and characterize membrane and plasma-membrane proteomes. This Review summarizes the development of various glycoproteomics techniques to overcome the hurdles formed by the unique structures and behaviors of membrane proteins with a focus on N-glycoproteomics. Example contributions of N-glycoproteomics to the understanding of membrane biology are provided, and the areas that require future technical breakthroughs are discussed.

  10. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software packages fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support the multidimensionality of the data. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, and (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, and climate change and biodiversity will be discussed in detail.
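
    A minimal sketch of the array-based sub-setting ("slicing and dicing") and aggregation operations listed above, using plain NumPy on a synthetic time x lat x lon cube. This illustrates the operator semantics only and is not Ophidia's actual API.

    ```python
    # Hedged sketch: slicing, dicing and aggregating a synthetic n-dimensional data cube.
    import numpy as np

    time, lat, lon = 365, 90, 180
    cube = 15 + 10 * np.random.rand(time, lat, lon)      # synthetic temperature cube

    subset = cube[0:90, 30:60, 0:90]                     # slice a season / region
    monthly_max = cube[:30].max(axis=0)                  # reduce over time (max of first 30 days)
    zonal_mean = cube.mean(axis=2)                       # aggregate over longitude

    print(subset.shape, monthly_max.shape, zonal_mean.shape)
    ```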

  11. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    PubMed

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe, the maximum allowable concentration for tributyltin (TBT) compounds in surface water has been regulated by the water framework directive (WFD) and its daughter directive, which impose a limit of 0.2 ng L(-1) in whole water (as tributyltin cation). Despite the large number of methodologies for the quantification of organotin species developed in the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at the picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of three qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. The C18 phase was found to be the best stationary phase for the SPE experiments. The four solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decision tool for planning experiments, because it can be applied to predict the TBT mass fraction recovery for a given set of experimental conditions. This work shows that SPE is a convenient and robust technique for TBT pre-concentration at pico-trace levels: (i) a number of different experimental conditions led to satisfactory results, and (ii) the participation of two institutes in the experimental work did not impact the developed model. Copyright © 2014 Elsevier B.V. All rights reserved.
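
    As a hedged illustration of fitting a model to a factorial design of experiments, the sketch below builds a two-level fractional factorial plan and estimates main effects by least squares. The factors, coded levels and simulated recovery values are assumptions and do not reproduce the 13-level, three-factor design described in the abstract.

    ```python
    # Hedged sketch: main-effects fit on a 2^(3-1) fractional factorial design.
    import numpy as np

    # Coded factor levels (-1/+1) with generator C = A*B
    A = np.array([-1, +1, -1, +1])
    B = np.array([-1, -1, +1, +1])
    C = A * B
    X = np.column_stack([np.ones(4), A, B, C])            # design matrix with intercept

    recovery = np.array([82.0, 91.0, 88.0, 95.0])         # hypothetical TBT recoveries (%)

    coef, *_ = np.linalg.lstsq(X, recovery, rcond=None)   # least-squares effect estimates
    print(dict(zip(["intercept", "A", "B", "C"], np.round(coef, 2))))
    ```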

  12. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    DOT National Transportation Integrated Search

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
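
    A minimal, generic AHP sketch (not the FHWA tool itself): criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The criteria and comparison values are illustrative assumptions.

    ```python
    # Hedged sketch: AHP priority weights from a pairwise comparison matrix.
    import numpy as np

    # Pairwise comparisons for three hypothetical criteria: cost, schedule, safety
    M = np.array([[1.0, 1/3, 1/5],
                  [3.0, 1.0, 1/2],
                  [5.0, 2.0, 1.0]])

    eigvals, eigvecs = np.linalg.eig(M)
    k = np.argmax(eigvals.real)                 # principal eigenvalue index
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                    # normalized priority vector

    # Consistency index and ratio (RI = 0.58 is Saaty's tabulated value for n = 3)
    CI = (eigvals[k].real - 3) / (3 - 1)
    print("weights:", np.round(weights, 3), "CR:", round(CI / 0.58, 3))
    ```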

  13. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  14. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and the application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases that combine well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  15. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  16. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  17. 76 FR 70517 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...

  18. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  19. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error, as a basis of the total error. The generation modelling allows highlighting of the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. An approach developed in CATIA is proposed, based on a new method, namely the method of "relative generating trajectories". The analytical foundation is presented, as well as some applications to known models of rack-gear-type tools used on Maag teething machines.

  20. Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.

  1. SU-C-204-01: A Fast Analytical Approach for Prompt Gamma and PET Predictions in a TPS for Proton Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, K; Herzog, M; Landry, G

    2015-06-15

    Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms along with patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
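
    A minimal sketch of the general idea of convolving a depth-dose profile with a filter function to obtain a secondary-emission profile. The toy dose curve and the exponential kernel below are illustrative assumptions, not the paper's fitted element-wise filter functions.

    ```python
    # Hedged sketch: predicted emission profile = depth dose convolved with a filter kernel.
    import numpy as np

    z = np.linspace(0, 200, 401)                          # depth [mm]
    dose = np.exp(-((z - 150) ** 2) / (2 * 4.0 ** 2))     # toy Bragg-peak-like dose
    dose += 0.3 * (z < 150)                               # plus an entrance plateau

    kernel = np.exp(-np.abs(np.linspace(-20, 20, 81)) / 5.0)   # assumed filter shape
    kernel /= kernel.sum()                                      # normalize the filter

    gamma_profile = np.convolve(dose, kernel, mode="same")      # predicted emission profile
    print(gamma_profile.argmax() * (z[1] - z[0]), "mm (depth of predicted emission peak)")
    ```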

  2. Development and application of accurate analytical models for single active electron potentials

    NASA Astrophysics Data System (ADS)

    Miller, Michelle; Jaron-Becker, Agnieszka; Becker, Andreas

    2015-05-01

    The single active electron (SAE) approximation is a theoretical model frequently employed to study scenarios in which inner-shell electrons may productively be treated as frozen spectators to a physical process of interest, and accurate analytical approximations for these potentials are sought as a useful simulation tool. Density functional theory is often used to construct an SAE potential, requiring that a further approximation for the exchange-correlation functional be adopted. In this study, we employ the Krieger, Li, and Iafrate (KLI) modification to the optimized-effective-potential (OEP) method to reduce the complexity of the problem to the straightforward solution of a system of linear equations, through simple arguments regarding the behavior of the exchange-correlation potential in regions where a single orbital dominates. We employ this method for the solution of atomic and molecular potentials, and use the resultant curve to devise a systematic construction of highly accurate and useful analytical approximations for several systems. Supported by the U.S. Department of Energy (Grant No. DE-FG02-09ER16103), and the U.S. National Science Foundation (Graduate Research Fellowship, Grants No. PHY-1125844 and No. PHY-1068706).

  3. Fiber optic evanescent wave biosensor

    NASA Astrophysics Data System (ADS)

    Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.

    1991-09-01

    The role of modern analytical chemistry is not restricted to quality control and environmental surveillance, but has been extended to process control using on-line analytical techniques. Besides industrial applications, highly specific, ultra-sensitive biochemical analysis becomes increasingly important as a diagnostic tool, both in central clinical laboratories and in the doctor's office. Fiber optic sensor technology can fulfill many of the requirements for both types of applications. As an example, the experimental arrangement of a fiber optic sensor for biochemical affinity assays is presented. The evanescent electromagnetic field, associated with a light ray guided in an optical fiber, is used for the excitation of luminescence labels attached to the biomolecules in solution to be analyzed. Due to the small penetration depth of the evanescent field into the medium, the generation of luminescence is restricted to the close proximity of the fiber, where, e.g., the luminescent analyte molecules combine with their affinity partners, which are immobilized on the fiber. Both cw- and pulsed light excitation can be used in evanescent wave sensor technology, enabling the on-line observation of an affinity assay on a macroscopic time scale (seconds and minutes), as well as on a microscopic, molecular time scale (nanoseconds or microseconds).

  4. Development of a new semi-analytical model for cross-borehole flow experiments in fractured media

    USGS Publications Warehouse

    Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.

    2015-01-01

    Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.

  5. Long story short: an introduction to the short-term and long-term Six Sigma quality and its importance in the laboratory medicine for the management of extra-analytical processes.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2018-06-18

    There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma seems to offer a valid methodological and conceptual opportunity, and in recent times the International Federation of Clinical Chemistry and Laboratory Medicine has adopted it for indicating the performance requirements for non-analytical laboratory processes. However, Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widely known or applied in the field of laboratory medicine, although they are of fundamental importance for exploiting the full potential of the methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, with respect to which they are not an alternative but a completion. It also discusses the dynamic nature of processes and how it arises, concerning particularly the long-term dynamic mean variation, and explains why this leads to the fundamental distinction of quality mentioned above.
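
    A short worked example of the short-term versus long-term convention discussed above, using the customary 1.5-sigma long-term mean shift. The defect counts are illustrative, e.g. mislabelled samples in a pre-analytical step, not figures from the paper.

    ```python
    # Hedged sketch: sigma level from a defect rate, with the conventional 1.5-sigma shift.
    from scipy.stats import norm

    defects, opportunities = 37, 100_000
    dpmo = defects / opportunities * 1_000_000

    z_long_term = norm.ppf(1 - dpmo / 1_000_000)   # sigma of the observed (long-term) data
    z_short_term = z_long_term + 1.5               # customary 1.5-sigma shift correction

    print(f"DPMO = {dpmo:.0f}, long-term Z = {z_long_term:.2f}, "
          f"short-term sigma = {z_short_term:.2f}")
    ```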

  6. ExaSAT: An exascale co-design tool for performance modeling

    DOE PAGES

    Unat, Didem; Chan, Cy; Zhang, Weiqun; ...

    2015-02-09

    One of the emerging challenges to designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework that automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework's ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.

  7. Using learning analytics to evaluate a video-based lecture series.

    PubMed

    Lau, K H Vincent; Farooque, Pue; Leydon, Gary; Schwartz, Michael L; Sadler, R Mark; Moeller, Jeremy J

    2018-01-01

    The video-based lecture (VBL), an important component of the flipped classroom (FC) and massive open online course (MOOC) approaches to medical education, has primarily been evaluated through direct learner feedback. Evaluation may be enhanced through learner analytics (LA): analysis of quantitative audience usage data generated by video-sharing platforms. We applied LA to an experimental series of ten VBLs on electroencephalography (EEG) interpretation, uploaded to YouTube in the model of a publicly accessible MOOC. Trends in view count, total percentage of video viewed, and audience retention (AR) (the percentage of viewers watching at a time point compared to the initial total) were examined. The pattern of average AR decline was characterized using regression analysis, revealing a uniform linear decline in viewership for each video, with no evidence of an optimal VBL length. Segments with transient increases in AR corresponded to those focused on core concepts, indicative of content requiring more detailed evaluation. We propose a model for applying LA at four levels: global, series, video, and feedback. LA may be a useful tool in evaluating a VBL series. Our proposed model combines analytics data and learner self-report for comprehensive evaluation.
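
    A minimal sketch of the retention-decline analysis described above: fitting a straight line to audience retention versus time. The retention values are illustrative assumptions, not the study's actual YouTube analytics export.

    ```python
    # Hedged sketch: linear regression of audience retention (AR) against time into the video.
    import numpy as np

    minutes = np.arange(0, 10)                                        # time into the video [min]
    retention = np.array([100, 92, 85, 80, 73, 68, 61, 55, 50, 44])   # % of initial viewers (assumed)

    slope, intercept = np.polyfit(minutes, retention, 1)
    predicted = np.polyval([slope, intercept], minutes)
    r2 = 1 - np.sum((retention - predicted) ** 2) / np.sum((retention - retention.mean()) ** 2)

    print(f"decline = {slope:.1f} % per minute, R^2 = {r2:.3f}")
    ```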

  8. Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques

    NASA Astrophysics Data System (ADS)

    Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman

    2011-06-01

    This research deals with the synthesis of control laws for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression, analytic programming, is used for this kind of difficult problem. The paper consists of descriptions of analytic programming as well as of the chaotic systems and the cost function used. For the experiments, the Self-Organizing Migrating Algorithm (SOMA) with analytic programming was used.

  9. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  10. Feasibility study on the use of groupware support for NASA source evaluation boards

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Yoes, Cissy

    1991-01-01

    Groupware is a class of computer based systems that support groups engaged in a common task (or goal) and that provide an interface to a shared environment. A potential application for groupware is the source evaluation board (SEB) process used in the procurement of government contracts. This study was undertaken to (1) identify parts of the SEB process which are candidates for groupware supports; and (2) identify tools which could be used to support the candidate process. Two processes of the SEB were identified as good candidates for groupware support: (1) document generation - a coordination and communication process required to present and document the findings of an SEB; and (2) group decision making - a highly analytical and integrative decision process requiring a clear and supportable outcome.

  11. Process analytical technology (PAT) in insect and mammalian cell culture processes: dielectric spectroscopy and focused beam reflectance measurement (FBRM).

    PubMed

    Druzinec, Damir; Weiss, Katja; Elseberg, Christiane; Salzig, Denise; Kraume, Matthias; Pörtner, Ralf; Czermak, Peter

    2014-01-01

    Modern bioprocesses demand a careful definition of the critical process parameters (CPPs) already during the early stages of process development in order to ensure high-quality products and satisfactory yields. In this context, online monitoring tools can be applied to recognize unfavorable changes of CPPs during the production process and to allow for early interventions in order to prevent losses of production batches due to quality issues. Process analytical technologies such as dielectric spectroscopy or focused beam reflectance measurement (FBRM) are possible online monitoring tools, which can be applied to monitor cell growth as well as morphological changes. Since dielectric spectroscopy only captures cells with intact cell membranes, even information about dead cells with ruptured or leaking cell membranes can be derived. The following chapter describes the application of dielectric spectroscopy to various virus-infected and non-infected cell lines, with respect to both adherent and suspension cultures in common stirred tank reactors. The adherent mammalian cell lines Vero (African green monkey kidney cells) and hMSC-TERT (telomerase-immortalized human mesenchymal stem cells) are thereby cultured on microcarriers, which provide the required growth surface and allow the cultivation of these cells even in dynamic culture systems. In turn, the insect-derived cell lines S2 and Sf21 are used as examples of cells typically cultured in suspension. Moreover, the FBRM technology as a further monitoring tool for cell culture applications has been included in this chapter, using the example of Drosophila S2 insect cells.

  12. Options to improve energy efficiency for educational building

    NASA Astrophysics Data System (ADS)

    Jahan, Mafruha

    The cost of energy is a major factor that must be considered for educational facility budget planning purposes. The analysis of energy-related issues and options can be complex and requires significant time and detailed effort. One way to facilitate the inclusion of energy option planning in facility planning efforts is to utilize a tool that allows for quick appraisal of the facility energy profile. Once such an appraisal is accomplished, it is then possible to rank energy improvement options consistently with other facility needs and requirements. After an energy efficiency option has been determined to have meaningful value in comparison with other facility planning options, it is then possible to utilize the initial appraisal as the basis for an expanded consideration of additional facility and energy use detail using the same analytic system used for the initial appraisal. This thesis has developed a methodology and an associated analytic model to assist in these tasks and thereby improve the energy efficiency of educational facilities. A detailed energy efficiency analysis tool is described that utilizes specific university building characteristics such as size, architecture, envelope, lighting, occupancy, and thermal design, allowing the annual energy consumption to be reduced. Improving the energy efficiency of various aspects of an educational building's energy performance can be complex and can require significant time and experience to make decisions. The approach developed in this thesis initially assesses the energy design for a university building. This initial appraisal is intended to assist administrators in assessing the potential value of energy efficiency options for their particular facility. Subsequently this scoping design can be extended, as another stage of the model, by local facility or planning personnel to add more detail and engineering aspects to the initial screening model. This approach can assist university planning efforts to identify the most cost-effective combinations of energy efficiency strategies. The model analyzes and compares the payback periods of all proposed Energy Performance Measures (EPMs) to determine which has the greatest potential value.
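
    A minimal sketch of the payback-period comparison mentioned at the end of the abstract (capital cost divided by annual savings). The measures and figures below are illustrative assumptions, not results from the thesis.

    ```python
    # Hedged sketch: rank hypothetical energy performance measures (EPMs) by simple payback.
    epms = {
        "LED lighting retrofit":        {"cost": 40_000, "annual_savings": 12_000},
        "Roof insulation upgrade":      {"cost": 90_000, "annual_savings": 10_000},
        "Occupancy-based HVAC setback": {"cost": 25_000, "annual_savings": 9_000},
    }

    ranked = sorted(epms.items(), key=lambda kv: kv[1]["cost"] / kv[1]["annual_savings"])
    for name, d in ranked:
        print(f"{name}: payback = {d['cost'] / d['annual_savings']:.1f} years")
    ```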

  13. Modelling turbulent boundary layer flow over fractal-like multiscale terrain using large-eddy simulations and analytical tools.

    PubMed

    Yang, X I A; Meneveau, C

    2017-04-13

    In recent years, there has been growing interest in large-eddy simulation (LES) modelling of atmospheric boundary layers interacting with arrays of wind turbines on complex terrain. However, such terrain typically contains geometric features and roughness elements reaching down to small scales that cannot be resolved numerically. Thus subgrid-scale models for the unresolved features of the bottom roughness are needed for LES. Such knowledge is also required to model the effects of the ground surface 'underneath' a wind farm. Here we adapt a dynamic approach to determine subgrid-scale roughness parametrizations and apply it for the case of rough surfaces composed of cuboidal elements with broad size distributions, containing many scales. We first investigate the flow response to ground roughness of a few scales. LES with the dynamic roughness model which accounts for the drag of unresolved roughness is shown to provide resolution-independent results for the mean velocity distribution. Moreover, we develop an analytical roughness model that accounts for the sheltering effects of large-scale on small-scale roughness elements. Taking into account the shading effect, constraints from fundamental conservation laws, and assumptions of geometric self-similarity, the analytical roughness model is shown to provide analytical predictions that agree well with roughness parameters determined from LES. This article is part of the themed issue 'Wind energy in complex terrains'. © 2017 The Author(s).

  14. Analytical characterization of wine and its precursors by capillary electrophoresis.

    PubMed

    Gomez, Federico J V; Monasterio, Romina P; Vargas, Verónica Carolina Soto; Silva, María F

    2012-08-01

    The accurate determination of marker chemical species in grape, musts, and wines presents a unique analytical challenge with high impact on diverse areas of knowledge such as health, plant physiology, and economy. Capillary electromigration techniques have emerged as a powerful tool, allowing the separation and identification of highly polar compounds that cannot be easily separated by traditional HPLC methods, providing complementary information and permitting the simultaneous analysis of analytes with different nature in a single run. The main advantage of CE over traditional methods for wine analysis is that in most cases samples require no treatment other than filtration. The purpose of this article is to present a revision on capillary electromigration methods applied to the analysis of wine and its precursors over the last decade. The current state of the art of the topic is evaluated, with special emphasis on the natural compounds that have allowed wine to be considered as a functional food. The most representative revised compounds are phenolic compounds, amino acids, proteins, elemental species, mycotoxins, and organic acids. Finally, a discussion on future trends of the role of capillary electrophoresis in the field of analytical characterization of wines for routine analysis, wine classification, as well as multidisciplinary aspects of the so-called "from soil to glass" chain is presented. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  16. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL has a variety of software tools on the Peregrine system, including debuggers and performance-analysis tools such as Intel VTune, a tool for understanding the behavior of MPI applications, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for visualization and remote analytics.

  17. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  18. Microfabricated Genomic Analysis System

    NASA Technical Reports Server (NTRS)

    Gonda, Steve; Elms, Rene

    2005-01-01

    Genetic sequencing and many genetic tests and assays require electrophoretic separation of DNA. In this technique, DNA fragments are separated by size as they migrate through a sieving gel under the influence of an applied electric field. In order to conduct these analyses on-orbit, it is essential to acquire the capability to efficiently perform electrophoresis in a microgravity environment. Conventional bench-top electrophoresis equipment is large and cumbersome and does not lend itself to on-orbit utilization. Much of the previous research regarding on-orbit electrophoresis involved altering conventional electrophoresis equipment for bioprocessing, purification, and/or separation technology applications. A new and more efficient approach to on-orbit electrophoresis is the use of a microfabricated electrophoresis platform. These platforms are much smaller, less expensive to produce and operate, use less power, require smaller sample sizes (nanoliters), and achieve separation over a much shorter distance (a few centimeters instead of tens or hundreds of centimeters). In contrast to previous applications, this platform would be utilized as an analytical tool for life science/medical research, environmental monitoring, and medical diagnoses. Identification of infectious agents as well as radiation-related damage is significant to NASA's efforts to maintain, study, and monitor crew health during and in support of near-Earth and interplanetary missions. The capability to perform genetic assays on-orbit is imperative to conduct relevant and insightful biological and medical research, as well as to continue NASA's search for life elsewhere. This technology would provide an essential analytical tool for research conducted in a microgravity environment (Shuttle, ISS, long duration/interplanetary missions). In addition, this technology could serve as a critical and invaluable component of a biosentinel system to monitor space environment genotoxic insults, including radiation.

  19. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

    The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.

  20. Coping with Volume and Variety in Temporal Event Sequences: Strategies for Sharpening Analytic Focus.

    PubMed

    Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam

    2017-06-01

    The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics it is the number of events in the data and variety of temporal sequence patterns that challenges users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from either our own work, the literature, or based on email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.

  1. 3D-MICE: integration of cross-sectional and longitudinal imputation for multi-analyte longitudinal clinical data.

    PubMed

    Luo, Yuan; Szolovits, Peter; Dighe, Anand S; Baron, Jason M

    2018-06-01

    A key challenge in clinical data mining is that most clinical datasets contain missing data. Since many commonly used machine learning algorithms require complete datasets (no missing data), clinical analytic approaches often entail an imputation procedure to "fill in" missing data. However, although most clinical datasets contain a temporal component, most commonly used imputation methods do not adequately accommodate longitudinal time-based data. We sought to develop a new imputation algorithm, 3-dimensional multiple imputation with chained equations (3D-MICE), that can perform accurate imputation of missing clinical time series data. We extracted clinical laboratory test results for 13 commonly measured analytes (clinical laboratory tests). We imputed missing test results for the 13 analytes using 3 imputation methods: multiple imputation with chained equations (MICE), Gaussian process (GP), and 3D-MICE. 3D-MICE utilizes both MICE and GP imputation to integrate cross-sectional and longitudinal information. To evaluate imputation method performance, we randomly masked selected test results and imputed these masked results alongside results missing from our original data. We compared predicted results to measured results for masked data points. 3D-MICE performed significantly better than MICE and GP-based imputation in a composite of all 13 analytes, predicting missing results with a normalized root-mean-square error of 0.342, compared to 0.373 for MICE alone and 0.358 for GP alone. 3D-MICE offers a novel and practical approach to imputing clinical laboratory time series data. 3D-MICE may provide an additional tool for use as a foundation in clinical predictive analytics and intelligent clinical decision support.
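
    As a hedged illustration of the cross-sectional component only, the sketch below runs a chained-equations (MICE-style) imputation on a small analyte matrix with scikit-learn's IterativeImputer. The lab values are illustrative, and this is not the authors' 3D-MICE algorithm, which additionally integrates a longitudinal Gaussian-process model.

    ```python
    # Hedged sketch: MICE-style imputation of missing lab values (cross-sectional only).
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    # Rows = patient time points, columns = lab analytes (values are illustrative)
    labs = np.array([
        [140.0, 4.1, np.nan, 98.0],
        [138.0, np.nan, 1.0, 101.0],
        [np.nan, 4.4, 1.1, 99.0],
        [142.0, 4.0, 0.9, np.nan],
    ])

    imputed = IterativeImputer(random_state=0).fit_transform(labs)
    print(np.round(imputed, 2))
    ```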

  2. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  3. Structuring modeling and simulation analysis for evacuation planning and operations.

    DOT National Transportation Integrated Search

    2009-06-01

    This document is intended to provide guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in evacuation planning and operations. It is often unclear what kind of analytical approach may be of most value, ...

  4. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  5. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
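
    A minimal worked example of the vortex-shedding concern described above: the shedding frequency from the Strouhal relation f = St*U/D is compared with an assumed structural mode frequency. All numbers are illustrative assumptions, not values from the NASA tool or wind-tunnel data.

    ```python
    # Hedged sketch: check for vortex-shedding lock-in against a structural mode.
    St = 0.2          # Strouhal number typical of a circular cylinder at subcritical Reynolds number
    D = 5.0           # vehicle diameter [m] (assumed)
    f_mode = 0.75     # first bending-mode frequency [Hz] (assumed)

    for U in (10.0, 18.75, 30.0):                 # ground wind speeds [m/s]
        f_shed = St * U / D                       # vortex-shedding frequency [Hz]
        flag = "  <-- near structural mode" if abs(f_shed - f_mode) < 0.1 else ""
        print(f"U = {U:5.2f} m/s -> f_shed = {f_shed:.2f} Hz{flag}")
    ```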

  6. Assessment of tools for protection of quality of water: Uncontrollable discharges of pollutants.

    PubMed

    Dehghani Darmian, Mohsen; Hashemi Monfared, Seyed Arman; Azizyan, Gholamreza; Snyder, Shane A; Giesy, John P

    2018-06-06

    Selecting an appropriate crisis management plan during uncontrollable loading of pollution to water systems is crucial. In this research, the quality of water resources is protected against uncontrollable pollution by the use of suitable tools. The case study chosen in this investigation was a river-reservoir system. Analytical and numerical solutions of the pollutant transport equation were considered as the simulation strategy to calculate the efficient tools to protect water quality. These practical instruments are dilution flow and a new tool called detention time, which is proposed and simulated for the first time in this study. For an uncontrollable pollution discharge of approximately 130% of the river's assimilation capacity, with the duration of contact (Tc) considered as a constraint, the unallowable pollution could be treated by releasing 30% of the base flow of the river from the upstream dilution reservoir. Moreover, when the affected distance (Xc) was selected as a constraint, the required detention time for which the rubber dam should detain the water to be treated was equal to 187% of the initial duration of contact. Copyright © 2018 Elsevier Inc. All rights reserved.
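
    As a hedged illustration of the type of analytical solution the abstract refers to, the sketch below evaluates the classical one-dimensional advection-dispersion solution (with first-order decay) for an instantaneous release in a river. All parameter values are illustrative assumptions, not the study's river-reservoir data.

    ```python
    # Hedged sketch: 1-D advection-dispersion-decay solution for an instantaneous spill.
    import numpy as np

    M = 50.0      # released mass [kg]
    A = 20.0      # river cross-section [m^2]
    u = 0.5       # mean velocity [m/s]
    D = 5.0       # longitudinal dispersion coefficient [m^2/s]
    k = 1e-5      # first-order decay rate [1/s]

    def concentration(x, t):
        """C(x, t) for an instantaneous release at x = 0, t = 0 [kg/m^3]."""
        return (M / (A * np.sqrt(4 * np.pi * D * t))
                * np.exp(-(x - u * t) ** 2 / (4 * D * t))
                * np.exp(-k * t))

    x = 2000.0                             # observation point 2 km downstream [m]
    for t in (1800.0, 3600.0, 7200.0):     # times after the spill [s]
        print(f"t = {t/3600:.1f} h -> C = {concentration(x, t) * 1000:.3f} mg/L")
    ```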

  7. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etmektzoglou, A; Mishra, P; Svatos, M

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or a combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD "virtual isocenter" trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments.

  8. Nuclear magnetic resonance magnet actively cooled by pulse tube refrigerator

    NASA Astrophysics Data System (ADS)

    Kirichek, Oleg; Carr, Philip; Johnson, Chris; Atrey, Milind

    2005-05-01

    High field NMR spectrometers have been an essential tool for biomolecular scientists for many years. They have been instrumental in the pursuit of understanding of the structure, function and dynamics of proteins and other biological molecules. In addition, NMR is increasingly used for small molecule applications such as metabonomics, providing capabilities that aid drug discovery, as well as general organic and inorganic chemistry [M. Pellecchia et al., Nature Reviews Drug Discovery 1, 211 (2002)]. However, access to these systems is restricted due to the requirement to periodically refill them with liquid cryogens. This is both logistically demanding and expensive. A new system combining NMR spectrometry and Pulse Tube Refrigeration (PTR) has been developed and successfully tested. This approach eliminates the dependence on liquid cryogens, reduces spectrometer downtime, and also significantly reduces the size of the system. In the near future this new type of analytical tool may become ubiquitous in biomedical and chemical laboratories.

  9. Bioinformatics tools in predictive ecology: applications to fisheries

    PubMed Central

    Tucker, Allan; Duplisea, Daniel

    2012-01-01

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their ‘crossover potential’ with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse. PMID:22144390

  10. Fabrication de couches minces a memoire de forme et effets de l'irradiation ionique

    NASA Astrophysics Data System (ADS)

    Goldberg, Florent

    1998-09-01

    Nickel and titanium when combined in the right stoichiometric proportion (1:1) can form alloys showing the shape memory effect. Within the scope of this thesis, thin films of such alloys have been successfully produced by sputtering. Precise control of composition is crucial in order to obtain the shape memory effect. A combination of analytical tools which can accurately determine the behavior of such materials is also required (calorimetric analysis, crystallography, composition analysis, etc.). Rutherford backscattering spectrometry has been used for quantitative composition analysis. Thereafter irradiation of films with light ions (He+) of few MeV was shown to allow lowering of the characteristic premartensitic transformation temperatures while preserving the shape memory effect. Those results open the door to a new field of research, particularly for ion irradiation and its potential use as a tool to modify the thermomechanical behavior of shape memory thin film actuators.

  11. Causal inference and the data-fusion problem

    PubMed Central

    Bareinboim, Elias; Pearl, Judea

    2016-01-01

    We review concepts, principles, and tools that unify current approaches to causal analysis and attend to new challenges presented by big data. In particular, we address the problem of data fusion—piecing together multiple datasets collected under heterogeneous conditions (i.e., different populations, regimes, and sampling methods) to obtain valid answers to queries of interest. The availability of multiple heterogeneous datasets presents new opportunities to big data analysts, because the knowledge that can be acquired from combined data would not be possible from any individual source alone. However, the biases that emerge in heterogeneous environments require new analytical tools. Some of these biases, including confounding, sampling selection, and cross-population biases, have been addressed in isolation, largely in restricted parametric models. We here present a general, nonparametric framework for handling these biases and, ultimately, a theoretical solution to the problem of data fusion in causal inference tasks. PMID:27382148
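    As a minimal illustration of the kind of bias-removal result such a framework unifies, the classical back-door adjustment eliminates confounding by a measured covariate set Z (generic notation; the paper's full data-fusion calculus also covers selection and transportability biases):

```latex
P(y \mid do(x)) \;=\; \sum_{z} P(y \mid x, z)\, P(z)
```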

  12. Dispersion relations with crossing symmetry for ππ D- and F-wave amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaminski, R.

    A set of once subtracted dispersion relations with imposed crossing symmetry condition for the ππ D- and F-wave amplitudes is derived and analyzed. An example of numerical calculations in the effective two-pion mass range from the threshold to 1.1 GeV is presented. It is shown that these new dispersion relations impose quite strong constraints on the analyzed ππ interactions and are very useful tools to test the ππ amplitudes. One of the goals of this work is to provide a complete set of equations required for easy use. Full analytical expressions are presented. Along with the well-known dispersion relations successful in testing the ππ S- and P-wave amplitudes, those presented here for the D and F waves give a complete set of tools for analyses of the ππ interactions.
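    For orientation, a generic once-subtracted dispersion relation for an amplitude A(s) with a single right-hand cut starting at s_th has the form below; the relations derived in the paper additionally impose crossing symmetry and include left-hand-cut contributions, which are not reproduced here.

```latex
\operatorname{Re} A(s) \;=\; \operatorname{Re} A(s_{0})
  + \frac{s - s_{0}}{\pi}\, \mathrm{P}\!\int_{s_{th}}^{\infty}
    \frac{\operatorname{Im} A(s')}{(s' - s_{0})(s' - s)}\, \mathrm{d}s'
```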

  13. A digital future for the history of psychology?

    PubMed

    Green, Christopher D

    2016-08-01

    This article discusses the role that digital approaches to the history of psychology are likely to play in the near future. A tentative hierarchy of digital methods is proposed. A few examples are briefly described: a digital repository, a simple visualization using ready-made online database and tools, and more complex visualizations requiring the assembly of the database and, possibly, the analytic tools by the researcher. The relationship of digital history to the old "New Economic History" (Cliometrics) is considered. The question of whether digital history and traditional history need be at odds or, instead, might complement each other is woven throughout. The rapidly expanding territory of digital humanistic research outside of psychology is briefly discussed. Finally, the challenging current employment trends in history and the humanities more broadly are considered, along with the role that digital skills might play in mitigating those factors for prospective academic workers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  14. Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography.

    PubMed

    Yu, Young-Sang; Farmand, Maryam; Kim, Chunjoong; Liu, Yijin; Grey, Clare P; Strobridge, Fiona C; Tyliszczak, Tolek; Celestre, Rich; Denes, Peter; Joseph, John; Krishnan, Harinarayan; Maia, Filipe R N C; Kilcoyne, A L David; Marchesini, Stefano; Leite, Talita Perciano Costa; Warwick, Tony; Padmore, Howard; Cabana, Jordi; Shapiro, David A

    2018-03-02

    Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes. The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. Here, we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight to the design of the next generation of high-performance devices.

  15. Bioinformatics tools in predictive ecology: applications to fisheries.

    PubMed

    Tucker, Allan; Duplisea, Daniel

    2012-01-19

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their 'crossover potential' with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse.

  16. Simultaneous Voltammetric Determination of Acetaminophen and Isoniazid (Hepatotoxicity-Related Drugs) Utilizing Bismuth Oxide Nanorod Modified Screen-Printed Electrochemical Sensing Platforms.

    PubMed

    Mahmoud, Bahaa G; Khairy, Mohamed; Rashwan, Farouk A; Banks, Craig E

    2017-02-07

    To address recent outbreaks of hepatotoxicity caused by drugs, a new analytical tool for the continuous determination of these drugs in human fluids is required. Electrochemical-based analytical methods offer an effective, rapid, and simple tool for on-site determination of various organic and inorganic species. However, the design of a sensitive, selective, stable, and reproducible sensor is still a major challenge. In the present manuscript, a facile, one-pot hydrothermal synthesis of bismuth oxide (Bi2O2.33) nanostructures (nanorods) was developed. These BiO nanorods were cast onto disposable graphite screen-printed electrodes (BiO-SPEs), allowing the ultrasensitive determination of acetaminophen (APAP) in the presence of its common interference isoniazid (INH), both of which are found in drug samples. The simultaneous electroanalytical sensing using BiO-SPEs exhibited strong electrocatalytic activity toward the sensing of APAP and INH, with an enhanced analytical signal (voltammetric peak) over that achievable at unmodified (bare) SPEs. The electroanalytical sensing of APAP and INH is possible with accessible linear ranges from 0.5 to 1250 μM and 5 to 1760 μM, with limits of detection (3σ) of 30 nM and 1.85 μM, respectively. The stability, reproducibility, and repeatability of the BiO-SPEs were also investigated. The BiO-SPEs were evaluated toward the sensing of APAP and INH in human serum, urine, saliva, and tablet samples. The results presented in this paper demonstrate that BiO-SPE sensing platforms provide a potential candidate for the accurate determination of APAP and INH within human fluids and pharmaceutical formulations.
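    The 3σ limit-of-detection convention cited above is straightforward to reproduce from calibration data; the sketch below uses hypothetical concentration-current pairs and estimates the noise from the calibration residuals, which is one common (assumed) choice for σ.

```python
import numpy as np

# Hypothetical calibration data: analyte concentration (µM) vs. peak current (µA).
conc = np.array([0.5, 5, 25, 100, 400, 1250], dtype=float)
current = np.array([0.012, 0.11, 0.54, 2.1, 8.3, 25.9])

slope, intercept = np.polyfit(conc, current, 1)   # linear calibration fit
residuals = current - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                     # residual std. dev. as noise estimate

lod = 3 * sigma / slope                           # 3-sigma limit of detection
loq = 10 * sigma / slope                          # 10-sigma limit of quantification
print(f"LOD ≈ {lod:.3g} µM, LOQ ≈ {loq:.3g} µM")
```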

  17. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  18. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing (SolTrace) core for more detailed simulations.

  19. Next-Generation Tools For Next-Generation Surveys

    NASA Astrophysics Data System (ADS)

    Murray, S. G.

    2017-04-01

    The next generation of large-scale galaxy surveys, across the electromagnetic spectrum, loom on the horizon as explosively game-changing datasets, in terms of our understanding of cosmology and structure formation. We are on the brink of a torrent of data that is set to both confirm and constrain current theories to an unprecedented level, and potentially overturn many of our conceptions. One of the great challenges of this forthcoming deluge is to extract maximal scientific content from the vast array of raw data. This challenge requires not only well-understood and robust physical models, but a commensurate network of software implementations with which to efficiently apply them. The halo model, a semi-analytic treatment of cosmological spatial statistics down to nonlinear scales, provides an excellent mathematical framework for exploring the nature of dark matter. This thesis presents a next-generation toolkit based on the halo model formalism, intended to fulfil the requirements of next-generation surveys. Our toolkit comprises three tools: (i) hmf, a comprehensive and flexible calculator for halo mass functions (HMFs) within extended Press-Schechter theory, (ii) the MRP distribution for extremely efficient analytic characterisation of HMFs, and (iii) halomod, an extension of hmf which provides support for the full range of halo model components. In addition to the development and technical presentation of these tools, we apply each to the task of physical modelling. With hmf, we determine the precision of our knowledge of the HMF, due to uncertainty in our knowledge of the cosmological parameters, over the past decade of cosmic microwave background (CMB) experiments. We place rule-of-thumb uncertainties on the predicted HMF for the Planck cosmology, and find that current limits on the precision are driven by modeling uncertainties rather than those from cosmological parameters. With the MRP, we create and test a method for robustly fitting the HMF to observed masses with arbitrary measurement uncertainties on a per-object basis. We find that our method reduces estimation uncertainty on parameters by over 50%, and correctly accounts for Eddington bias even in extremely poorly measured data. Additionally, we use the analytical properties of the MRP to obtain asymptotically correct forms for the stellar-mass halo-mass relation, in the subhalo abundance matching scheme. Finally, with halomod, we explore the viability of the halo model as a test of warm dark matter (WDM) via galaxy clustering. Examining three distinct scale regimes, we find that the clustering of galaxies at the smallest resolvable scales may provide a valuable independent probe in the coming era.
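    As a rough illustration of what a parameterized HMF fit of the MRP type involves, the sketch below fits a power law with a stretched-exponential cutoff to synthetic halo-abundance data in log space; the functional form, normalisation, and parameter names are simplifying assumptions and do not reproduce the exact MRP conventions of the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_hmf_form(m, log_a, alpha, log_ms, beta):
    """log10 of dn/dlnm = A (m/m_s)^alpha exp[-(m/m_s)^beta] (MRP-like shape)."""
    x = m / 10**log_ms
    return log_a + alpha * np.log10(x) - x**beta / np.log(10)

# Synthetic "observed" mass function on a halo-mass grid (Msun/h), for illustration.
m = np.logspace(11, 15, 30)
rng = np.random.default_rng(1)
log_obs = log_hmf_form(m, -3.0, -1.8, 14.0, 0.7) + rng.normal(0, 0.02, m.size)

popt, _ = curve_fit(log_hmf_form, m, log_obs, p0=(-3.0, -1.5, 13.7, 0.8))
print("best fit (log10 A, alpha, log10 m_s, beta):", np.round(popt, 3))
```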

  20. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation

    PubMed Central

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role to improve the productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. Presentation of the proposed model application is provided by a numerical example based on the collection of data by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating the machine tool in the uncertain environment. PMID:26368541
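    To make the weighting step concrete, the sketch below derives crisp AHP priority weights from a hypothetical pairwise-comparison matrix (geometric-mean method) and ranks three alternatives with a simple weighted sum; the fuzzy linguistic preference relations and the fuzzy COPRAS utility used in the paper add fuzzification and defuzzification steps that are not reproduced here.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three attributes
# (e.g. cost, precision, flexibility); entry [i, j] = importance of i over j.
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Geometric-mean (row) method for AHP priority weights.
geo_mean = pairwise.prod(axis=1) ** (1.0 / pairwise.shape[1])
weights = geo_mean / geo_mean.sum()

# Hypothetical scores of three machine-tool alternatives (rows) on the
# same attributes (columns), already normalised to [0, 1].
alternatives = np.array([
    [0.6, 0.9, 0.7],
    [0.8, 0.6, 0.5],
    [0.7, 0.7, 0.9],
])
scores = alternatives @ weights
print("attribute weights:", np.round(weights, 3))
print("alternatives, best first:", np.argsort(-scores))
```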

  1. Nondestructive quantification of analyte diffusion in cornea and sclera using optical coherence tomography.

    PubMed

    Ghosn, Mohamad G; Tuchin, Valery V; Larin, Kirill V

    2007-06-01

    Noninvasive functional imaging, monitoring, and quantification of analyte transport in epithelial ocular tissues are extremely important for therapy and diagnostics of many eye diseases. In this study the authors investigated the capability of optical coherence tomography (OCT) for noninvasive monitoring and quantification of diffusion of different analytes in the sclera and cornea of rabbit eyes. A portable time-domain OCT system with a wavelength of 1310 ± 15 nm, output power of 3.5 mW, and resolution of 25 μm was used in this study. Diffusion of different analytes was monitored and quantified in rabbit cornea and sclera of whole eyeballs. Diffusion of water, metronidazole (0.5%), dexamethasone (0.2%), ciprofloxacin (0.3%), mannitol (20%), and glucose solution (20%) was examined, and their permeability coefficients were calculated by using OCT signal slope and depth-resolved amplitude methods. Permeability coefficients were calculated as a function of time and tissue depth. For instance, mannitol was found to have a permeability coefficient of (8.99 ± 1.43) × 10⁻⁶ cm/s in cornea and (6.18 ± 1.08) × 10⁻⁶ cm/s in sclera. The permeability coefficient of drugs with small concentrations (where water was the major solvent) was found to be in the range of that of water in the same tissue type, whereas permeability coefficients of more highly concentrated solutions varied significantly. Results suggest that the OCT technique might be a powerful tool for noninvasive diffusion studies of different analytes in ocular tissues. However, additional methods of OCT signal acquisition and processing are required to study the diffusion of agents at small concentrations.
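    A minimal sketch, assuming (as a simplified reading of OCT signal-slope analysis) that the permeability coefficient can be approximated as the thickness of the monitored tissue region divided by the time the agent takes to diffuse across it; both the numbers and the simplification are illustrative assumptions, not the authors' exact procedure.

```python
# Hypothetical values: a ~105 µm monitored region crossed in about 17 minutes.
region_thickness_cm = 105e-4      # monitored tissue region thickness (cm)
diffusion_time_s = 17 * 60        # time for the agent front to cross it (s)

permeability_cm_per_s = region_thickness_cm / diffusion_time_s
print(f"P ≈ {permeability_cm_per_s:.2e} cm/s")
```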

  2. A Bayesian network meta-analysis for binary outcome: how to do it.

    PubMed

    Greco, Teresa; Landoni, Giovanni; Biondi-Zoccai, Giuseppe; D'Ascenzo, Fabrizio; Zangrillo, Alberto

    2016-10-01

    This study presents an overview of conceptual and practical issues of a network meta-analysis (NMA), particularly focusing on its application to randomised controlled trials with a binary outcome of interest. We start from general considerations on NMA to specifically appraise how to collect study data, structure the analytical network and specify the requirements for different models and parameter interpretations, with the ultimate goal of providing physicians and clinician-investigators a practical tool to understand pros and cons of NMA. Specifically, we outline the key steps, from the literature search to sensitivity analysis, necessary to perform a valid NMA of binomial data, exploiting Markov Chain Monte Carlo approaches. We also apply this analytical approach to a case study on the beneficial effects of volatile agents compared to total intravenous anaesthetics for surgery to further clarify the statistical details of the models, diagnostics and computations. Finally, datasets and models for the freeware WinBUGS package are presented for the anaesthetic agent example. © The Author(s) 2013.
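    For reference, a minimal sketch of the random-effects binomial-logit model commonly fitted (for example in WinBUGS) for such NMAs, in generic notation with r_ik events out of n_ik in arm k of trial i, treatment indicator t_ik, baseline effects μ_i, and basic treatment parameters d:

```latex
r_{ik} \sim \mathrm{Binomial}(p_{ik}, n_{ik}), \qquad
\operatorname{logit}(p_{ik}) = \mu_i + \delta_{ik}, \qquad \delta_{i1} = 0, \qquad
\delta_{ik} \sim \mathcal{N}\!\left(d_{t_{ik}} - d_{t_{i1}},\, \sigma^{2}\right) \ (k > 1)
```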

  3. A centrifugation-based physicochemical characterization method for the interaction between proteins and nanoparticles

    NASA Astrophysics Data System (ADS)

    Bekdemir, Ahmet; Stellacci, Francesco

    2016-10-01

    Nanomedicine requires in-depth knowledge of nanoparticle-protein interactions. These interactions are studied with methods limited to large or fluorescently labelled nanoparticles as they rely on scattering or fluorescence-correlation signals. Here, we have developed a method based on analytical ultracentrifugation (AUC) as an absorbance-based, label-free tool to determine dissociation constants (KD), stoichiometry (Nmax), and Hill coefficient (n), for the association of bovine serum albumin (BSA) with gold nanoparticles. Absorption at 520 nm in AUC renders the measurements insensitive to unbound and aggregated proteins. Measurements remain accurate and do not become more challenging for small (sub-10 nm) nanoparticles. In AUC, frictional ratio analysis allows for the qualitative assessment of the shape of the analyte. Data suggests that small-nanoparticles/protein complexes significantly deviate from a spherical shape even at maximum coverage. We believe that this method could become one of the established approaches for the characterization of the interaction of (small) nanoparticles with proteins.
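    The binding parameters named above (KD, Nmax, Hill coefficient n) are typically extracted by fitting a Hill-type isotherm to bound-versus-free data; the sketch below does this with hypothetical numbers and makes no claim about the authors' actual AUC data processing.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill_binding(protein_conc, n_max, k_d, n_hill):
    """Hill isotherm: proteins bound per nanoparticle vs. free protein concentration."""
    return n_max * protein_conc**n_hill / (k_d**n_hill + protein_conc**n_hill)

# Hypothetical data: free BSA concentration (µM) vs. bound BSA per nanoparticle.
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100], dtype=float)
bound = np.array([0.4, 1.4, 4.5, 11.0, 18.5, 22.0, 23.5])

(n_max, k_d, n_hill), _ = curve_fit(hill_binding, conc, bound, p0=(25, 5, 1))
print(f"Nmax ≈ {n_max:.1f}, KD ≈ {k_d:.2f} µM, n ≈ {n_hill:.2f}")
```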

  4. Application of Laser Mass Spectrometry to Art and Archaeology

    NASA Technical Reports Server (NTRS)

    Gulian, Lase Lisa E.; Callahan, Michael P.; Muliadi, Sarah; Owens, Shawn; McGovern, Patrick E.; Schmidt, Catherine M.; Trentelman, Karen A.; deVries, Mattanjah S.

    2011-01-01

    REMPI laser mass spectrometry is a combination of resonance enhanced multiphoton ionization spectroscopy and time of flight mass spectrometry. This technique enables the collection of mass-specific optical spectra as well as of optically selected mass spectra. Analytes are jet-cooled by entrainment in a molecular beam, and this low-temperature gas-phase analysis has the benefit of excellent vibronic resolution. Utilizing this method, mass spectrometric analysis of historically relevant samples can be simplified and improved; optical selection of targets eliminates the need for chromatography, while knowledge of a target's gas-phase spectroscopy allows for facile differentiation of molecules that are considered spectroscopically indistinguishable in the aqueous phase. These two factors allow smaller sample sizes than commercial MS instruments, which in turn means less damage to objects of antiquity. We have explored methods to optimize REMPI laser mass spectrometry as an analytical tool for archaeology, using theobromine and caffeine as molecular markers in Mesoamerican pottery, and are expanding this approach to the field of art to examine laccaic acid in shellacs.

  5. Creating value in health care through big data: opportunities and policy implications.

    PubMed

    Roski, Joachim; Bo-Linn, George W; Andrews, Timothy A

    2014-07-01

    Big data has the potential to create significant value in health care by improving outcomes while lowering costs. Big data's defining features include the ability to handle massive data volume and variety at high velocity. New, flexible, and easily expandable information technology (IT) infrastructure, including so-called data lakes and cloud data storage and management solutions, make big-data analytics possible. However, most health IT systems still rely on data warehouse structures. Without the right IT infrastructure, analytic tools, visualization approaches, work flows, and interfaces, the insights provided by big data are likely to be limited. Big data's success in creating value in the health care sector may require changes in current polices to balance the potential societal benefits of big-data approaches and the protection of patients' confidentiality. Other policy implications of using big data are that many current practices and policies related to data use, access, sharing, privacy, and stewardship need to be revised. Project HOPE—The People-to-People Health Foundation, Inc.

  6. RapidIO as a multi-purpose interconnect

    NASA Astrophysics Data System (ADS)

    Baymani, Simaolhoda; Alexopoulos, Konstantinos; Valat, Sébastien

    2017-10-01

    RapidIO (http://rapidio.org/) technology is a packet-switched high-performance fabric, which has been under active development since 1997. Originally meant to be a front-side bus, it developed into a system-level interconnect which is today used in all 4G/LTE base stations worldwide. RapidIO is often used in embedded systems that require high reliability, low latency and scalability in a heterogeneous environment - features that are highly interesting for several use cases, such as data analytics and data acquisition (DAQ) networks. We will present the results of evaluating RapidIO in a data analytics environment, from setup to benchmark. Specifically, we will share the experience of running ROOT and Hadoop on top of RapidIO. To demonstrate the multi-purpose characteristics of RapidIO, we will also present the results of investigating RapidIO as a technology for high-speed DAQ networks using a generic multi-protocol event-building emulation tool. In addition, we will present lessons learned from implementing native ports of CERN applications to RapidIO.

  7. MALDI mass spectrometry imaging, from its origins up to today: the state of the art.

    PubMed

    Francese, Simona; Dani, Francesca R; Traldi, Pietro; Mastrobuoni, Guido; Pieraccini, Giuseppe; Moneti, Gloriano

    2009-02-01

    Mass Spectrometry (MS) has a number of features, namely sensitivity, high dynamic range, high resolution, and versatility, which make it a very powerful analytical tool for a wide spectrum of applications spanning all the life science fields. Among all the MS techniques, MALDI imaging mass spectrometry (MALDI MSI) is currently one of the most exciting, both for its rapid technological improvements and for its great potential in high-impact bioscience fields. Here, MALDI MSI general principles are described along with technical and instrumental details as well as application examples. Imaging MS instruments and imaging mass spectrometric techniques other than MALDI are presented along with examples of their use. As well as reporting MSI successes in several bioscience fields, an attempt is made to take stock of what has been achieved so far with this technology and to discuss the analytical and technological advances required for MSI to be applied as a routine technique in clinical diagnostics, clinical monitoring and in drug discovery.

  8. Design Considerations of ISTAR Hydrocarbon Fueled Combustor Operating in Air Augmented Rocket, Ramjet and Scramjet Modes

    NASA Technical Reports Server (NTRS)

    Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.

    2003-01-01

    The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.

  9. Design Considerations of Istar Hydrocarbon Fueled Combustor Operating in Air Augmented Rocket, Ramjet and Scramjet Modes

    NASA Technical Reports Server (NTRS)

    Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.

    2002-01-01

    The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.

  10. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  11. Design optimization of piezoresistive cantilevers for force sensing in air and water

    PubMed Central

    Doll, Joseph C.; Park, Sung-Jin; Pruitt, Beth L.

    2009-01-01

    Piezoresistive cantilevers fabricated from doped silicon or metal films are commonly used for force, topography, and chemical sensing at the micro- and macroscales. Proper design is required to optimize the achievable resolution by maximizing sensitivity while simultaneously minimizing the integrated noise over the bandwidth of interest. Existing analytical design methods are insufficient for modeling complex dopant profiles, design constraints, and nonlinear phenomena such as damping in fluid. Here we present an optimization method based on an analytical piezoresistive cantilever model. We use an existing iterative optimizer to minimize a performance goal, such as minimum detectable force. The design tool is available as open source software. Optimal cantilever design and performance are found to strongly depend on the measurement bandwidth and the constraints applied. We discuss results for silicon piezoresistors fabricated by epitaxy and diffusion, but the method can be applied to any dopant profile or material which can be modeled in a similar fashion or extended to other microelectromechanical systems. PMID:19865512
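    The sketch below shows the shape of such a constrained design optimization with an off-the-shelf iterative optimizer; the objective is a deliberately toy stand-in for a noise-over-sensitivity figure of merit and does not represent the authors' physical cantilever model, dopant profiles, or fluid damping.

```python
import numpy as np
from scipy.optimize import minimize

def toy_force_resolution(x):
    """Toy objective: integrated noise divided by deflection sensitivity.

    x = (length_um, thickness_um). Sensitivity is taken to scale as l / t**2 and
    noise to grow mildly with length, purely to illustrate the trade-off.
    """
    length, thickness = x
    sensitivity = length / thickness**2
    noise = 1.0 + 0.05 * length
    return noise / sensitivity            # smaller value = finer detectable force

result = minimize(
    toy_force_resolution,
    x0=np.array([300.0, 2.0]),
    bounds=[(50.0, 1000.0), (0.3, 10.0)],  # hypothetical fabrication limits (µm)
    method="L-BFGS-B",
)
print("optimal (length, thickness) in µm:", np.round(result.x, 2))
```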

  12. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
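    Since POI is simply the proportion of replicates identified, a binomial interval makes the reporting concrete; the sketch below uses a Wilson score interval on hypothetical single-collaborator counts (the interval choice is an assumption, not necessarily the one prescribed in the report).

```python
import math

def poi_wilson(identified, replicates, z=1.96):
    """Probability of identification with an approximate 95% Wilson score interval."""
    p_hat = identified / replicates
    denom = 1 + z**2 / replicates
    centre = (p_hat + z**2 / (2 * replicates)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / replicates
                                   + z**2 / (4 * replicates**2))
    return p_hat, max(0.0, centre - half), min(1.0, centre + half)

# Hypothetical single-collaborator result: 11 of 12 replicates identified.
poi, lo, hi = poi_wilson(11, 12)
print(f"POI = {poi:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```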

  13. Transient well flow in leaky multiple-aquifer systems

    NASA Astrophysics Data System (ADS)

    Hemker, C. J.

    1985-10-01

    A previously developed eigenvalue analysis approach to groundwater flow in leaky multiple aquifers is used to derive exact solutions for transient well flow problems in leaky and confined systems comprising any number of aquifers. Equations are presented for the drawdown distribution in systems of infinite extent, caused by wells penetrating one or more of the aquifers completely and discharging each layer at a constant rate. Since the solution obtained may be regarded as a combined analytical-numerical technique, a type of one-dimensional modelling can be applied to find approximate solutions for several complicating conditions. Numerical evaluations are presented as time-drawdown curves and include effects of storage in the aquitard, unconfined conditions, partially penetrating wells and stratified aquifers. The outcome of calculations for relatively simple systems compares very well with published corresponding results. The proposed multilayer solution can be a valuable tool in aquifer test evaluation, as it provides the analytical expression required to enable the application of existing computer methods to the determination of aquifer characteristics.
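    For orientation, the classical single-aquifer confined-flow solution that such multilayer eigenvalue solutions generalize is the Theis equation below (drawdown s at radius r and time t for pumping rate Q, transmissivity T, storativity S); the coupled multilayer expressions of the paper are not reproduced here.

```latex
s(r, t) = \frac{Q}{4 \pi T}\, W(u), \qquad
u = \frac{r^{2} S}{4 T t}, \qquad
W(u) = \int_{u}^{\infty} \frac{e^{-y}}{y}\, \mathrm{d}y
```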

  14. Using business intelligence for efficient inter-facility patient transfer.

    PubMed

    Haque, Waqar; Derksen, Beth Ann; Calado, Devin; Foster, Lee

    2015-01-01

    In the context of inter-facility patient transfer, a transfer operator must be able to objectively identify a destination which meets the needs of a patient, while keeping in mind each facility's limitations. We propose a solution which uses Business Intelligence (BI) techniques to analyze data related to healthcare infrastructure and services, and provides a web based system to identify optimal destination(s). The proposed inter-facility transfer system uses a single data warehouse with an Online Analytical Processing (OLAP) cube built on top that supplies analytical data to multiple reports embedded in web pages. The data visualization tool includes map based navigation of the health authority as well as an interactive filtering mechanism which finds facilities meeting the selected criteria. The data visualization is backed by an intuitive data entry web form which safely constrains the data, ensuring consistency and a single version of truth. The overall time required to identify the destination for inter-facility transfers is reduced from hours to a few minutes with this interactive solution.

  15. Lagrangian based methods for coherent structure detection

    NASA Astrophysics Data System (ADS)

    Allshouse, Michael R.; Peacock, Thomas

    2015-09-01

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
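    The sample analytic model mentioned above, the canonical double-gyre flow, has a standard closed-form velocity field; the sketch below evaluates it and advects a single tracer with a crude Euler step (parameter values are the commonly used ones, and any Lagrangian diagnostic would then be built from many such trajectories).

```python
import numpy as np

A, EPS, OMEGA = 0.1, 0.25, 2 * np.pi / 10   # commonly used double-gyre parameters

def double_gyre_velocity(x, y, t):
    """Velocity (u, v) of the time-dependent double gyre on the domain [0,2]x[0,1]."""
    a = EPS * np.sin(OMEGA * t)
    b = 1 - 2 * EPS * np.sin(OMEGA * t)
    f = a * x**2 + b * x
    dfdx = 2 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

# Advect one tracer with a simple forward-Euler step (illustration only).
x, y, dt = 1.0, 0.5, 0.01
for step in range(1000):
    u, v = double_gyre_velocity(x, y, step * dt)
    x, y = x + u * dt, y + v * dt
print(f"tracer position after 10 time units: ({x:.3f}, {y:.3f})")
```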

  16. Manipulability, force, and compliance analysis for planar continuum manipulators

    NASA Technical Reports Server (NTRS)

    Gravagne, Ian A.; Walker, Ian D.

    2002-01-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
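    For a rigid-link analogue of the velocity manipulability ellipsoid discussed here, the principal directions and semi-axis lengths follow from the singular value decomposition of the Jacobian; the sketch below uses a hypothetical planar Jacobian, whereas continuum-robot Jacobians in the paper are derived from curvature parameters. Along the same principal directions, the force ellipsoid semi-axes are the reciprocals of the velocity ones.

```python
import numpy as np

def manipulability_ellipsoid(jacobian):
    """Principal directions, semi-axis lengths, and Yoshikawa measure for v = J qdot.

    For unit-norm joint velocities, end-effector velocities lie in an ellipsoid
    whose axes are given by the singular vectors/values of the Jacobian.
    """
    directions, semi_axes, _ = np.linalg.svd(jacobian)
    measure = np.sqrt(np.linalg.det(jacobian @ jacobian.T))
    return directions, semi_axes, measure

# Hypothetical 2x3 planar Jacobian (rigid-link stand-in for illustration).
J = np.array([[-0.8, -0.5, -0.2],
              [0.6, 0.4, 0.3]])
dirs, axes, w = manipulability_ellipsoid(J)
print("semi-axis lengths:", np.round(axes, 3))
print("manipulability measure:", round(float(w), 3))
```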

  17. Manipulability, force, and compliance analysis for planar continuum manipulators.

    PubMed

    Gravagne, Ian A; Walker, Ian D

    2002-06-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  18. Data and Tools | Concentrating Solar Power | NREL

    Science.gov Websites

    Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT(tm)): the SolarPILOT code combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy of a ray-tracing engine.

  19. A Clinical Reasoning Tool for Virtual Patients: Design-Based Research Study.

    PubMed

    Hege, Inga; Kononowicz, Andrzej A; Adler, Martin

    2017-11-02

    Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VP) are a technology-enhanced learning method to teach clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome and less on the process of clinical reasoning. Keeping our concept grounded in an earlier qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback, which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map when they are working on a VP scenario. The student's map is compared with the experts' reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings, a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create fewer nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports the acquisition of clinical reasoning skills. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new feedback mechanisms and learning analytics algorithms. ©Inga Hege, Andrzej A Kononowicz, Martin Adler. Originally published in JMIR Medical Education (http://mededu.jmir.org), 02.11.2017.
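    One simple way to quantify agreement between a student map and the expert map, once both have been reduced to MeSH identifiers, is a set-overlap score; the sketch below is purely illustrative and is not the scoring actually implemented in the tool (the MeSH IDs shown are hypothetical).

```python
def map_overlap(student_terms, expert_terms):
    """Jaccard overlap between student and expert concept-map node sets."""
    student, expert = set(student_terms), set(expert_terms)
    shared = student & expert
    return len(shared) / len(student | expert), shared

student_map = {"D003920", "D007333", "D011507"}            # hypothetical MeSH IDs
expert_map = {"D003920", "D011507", "D006973", "D008659"}  # hypothetical MeSH IDs
score, shared = map_overlap(student_map, expert_map)
print(f"overlap = {score:.2f}, shared nodes: {sorted(shared)}")
```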

  20. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years, from both a theoretical and a methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.
